Any Internet-connected device is, in fact, a server, and must be seen and managed as one. This means strict control of installed services and, first and foremost, regular updates of all its software components (including firmware). If you acquire and install such a server which either can’t be updated or one which you know, realistically, won’t get any updates six months after installation, that’s asking to lose.
Yes, and that is really bad... hopefully someone, one day, will solve this problem. I don't auto-update the Android apps that I need, because as a rule of thumb, app quality declines over time...
Then by definition you are not receiving security updates. Android sandboxing obviously helps here, but it’s still not a sensible position to take in the general case and definitely not an idea you want to give to your less technical friends or family.
Then it is unfortunate that the general behavior of (some big?) app developers has made it so necessary to eschew updates by default to avoid change-for-the-sake-of-change workflow breakages (e.g. how Mozilla went about rolling out Fenix) and to dodge the threat of user hostile changes (e.g. the ad and tracking nonsense we see so often on Windows updates).
This is a problem of their own creation; trying to tell users to accept it anyway is a non-starter for many.
> Then by definition you are not receiving security updates.
There are certain apps that I will update, though, like Element, but I don't even update Firefox because the new engine breaks too many things for me, like extensions and bookmarklets. E.g., last time I checked you could only choose between 12 extensions on the latest Firefox Mobile.
> definitely not an idea you want to give to your less technical friends or family.
That's a load of garbage.
I don't do updates and zapped all Google services except the Play Store. Been looking good since. WhatsApp is the only culprit that forces me to update.
Of course… IT DEPENDS on the type of app, if the app doesn’t share user input with other users and doesn’t download or run untrusted code, then you probably don’t need to update it.
If security-patches-only (a.k.a. “stable”) is an available update channel, feel free to use that. I always do.
But if you don’t update at all, you will be subject to an exploit, and you and your devices will then possibly be unwitting members of (possibly multiple) botnets.
Nearly every phone today is sold with a USB C port.
"But this phone is older than..."
My Moto Z Play was released in 2016. That phone is now 5 years old. It came with Nougat... and I can mirror the screen with a USB-C hub with built-in HDMI. Plug in a mouse and you're done.
No need to leave adb enabled, which is a security hole that can be exploited by unsafe USB ports and bad actors. (Also, police.)
Pixel 5 does. I tried with one recently and noted it only does mirroring, unlike Samsung and Huawei which launch some pseudo desktop OS when plugged in.
Got out my old Moto Z Play which was retired. Restarted the phone so I couldn't use fingerprint to unlock.
Plugged in a mouse, clicked pin. Unlocked.
Moto Z Play runs Nougat, stock but rooted. You don't need root to be able to do this nor do you need ADB to be enabled.
Also confirmed to be working on the Razer Phone (stock, 9/Pie) and Essential Phone (stock, 10). Also works with a keyboard instead of a mouse: just type in the PIN and hit Enter.
Pattern unlock will have to be done with the mouse for obvious reasons.
So if your digitizer is completely busted, you can still use a mouse and keyboard. If your display is completely busted, you likely will be able to get video alt mode with USB C out.
I tried it on a few other devices that I had in my pile. Chances are, if it's Micro USB, you won't get video on boot, even with a proper MHL adapter.
* Sony Xperia J (Jelly Bean): Nothing, Micro USB
* Sony Xperia Play (Gingerbread): Nothing, Micro USB
* Marshall London (Lollipop): Mouse and Keyboard Works, Micro USB
* Asus ZenFone 3 Zoom (Nougat): Mouse and keyboard work, USB C
IIRC, DisplayLink and OTG support was officially added in Lollipop, so that kind of jibes with my experience here. As always, YMMV.
On my Android I can enter the PIN/password via keyboard, or the swipe pattern via mouse. Some early Android phones had slide-out keyboards (like a BlackBerry), so good support for physical keyboards isn't that surprising. I wasn't expecting the mouse support, though.
MHL was largely for Micro USB. Chances are what you need is a USB-C to DP or USB-C to HDMI adapter. Some work with one but not the other, and vice versa. In the end, however, Moto Es ARE designed to be the most cut-down, least-featured phones in the Moto line.
You would still be able to use a mouse or keyboard to input a pin or passphrase if the screen was still semi readable.
My friend's Moto Z Play 2 screen cracked and stopped working after a hard fall. The phone still powers on and accepts touch input but has no display. Two different hubs don't push video at all, so we're probably looking at replacing the screen to recover her pics.
With a USB C port, there's a near absolute certainty that you can simply plug in a USB hub with HDMI built in to hook your phone up to a mouse and monitor to get around the cracked screen issue.
The recent versions of Android seem to have gotten way better on ADB security. It seems to generate a keypair on every computer with ADB and require a user with the unlocked phone to authorize the keypair for each device manually. It also auto-revokes authorization if you haven't connected that device with ADB in 7 days by default.
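For the curious, that key-based authorization can be inspected from a shell. The paths below are the usual defaults but may vary by Android version and host OS:

```shell
# On the host, adb generates an RSA keypair on first run
# (default location; may differ per platform):
ls -l ~/.android/adbkey ~/.android/adbkey.pub

# On the device, the public keys of authorized hosts live here
# (readable only with root on most builds):
adb shell su -c 'cat /data/misc/adb/adb_keys'
```

On recent Android versions the 7-day auto-revocation can also be turned off under Developer options ("Disable adb authorization timeout") if you have a single trusted build machine.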
I had a double whammy of a corroded display connection and a loose usb-c port, meaning recovering data from the phone was a very sensitive and frustrating affair.
See I did that, but got the one-two punch of death.
Cracked screen AND USB-C port failure. No charging or data transfer possible.
Apparently at one time Google opted me in to Google Photos, so I have a backup of those, but recent contacts were totally lost.
Still have the board lying around and am tempted to continue tinkering with it... Or would be if I had any clue where to get insight on how to do advanced debugging on why a mobile phone wouldn't even register as connected over USB-C. Assuming some sort of handshake failure.
At the level of running apps, yes. But unless I'm way behind on my rooting tech, all of the usual methods leave the phone in a state where anyone who connects it to a computer via USB can access everything on it. AFAIK, Android has gotten way better at having phones with the stock OS locked down hard, with signed bootloaders, OS level encryption keys stored in secure media, etc, and rooting blows that all away.
Not sure what you mean there. I'm saying that one is a necessary consequence of the other. If you want to have your phone really locked down tight such that it won't give up everything to anyone who connects a USB cable to it and has a few clever tools, it's inevitably going to be tough to do things like load third-party ROMs and root the phone. And it's going to be really tough to do any of those things and also maintain that security against hostile physical access.
Did you install a custom bootloader to load your ROM and root apps? Cool, but as a consequence of that, anyone who connects a USB cable to your phone can get into anything in it. Does it supposedly have security? Wanna bet the quality of any security in a homemade app vs Google's best efforts?
This is over-engineering, IMO. If you want to use an old phone, simply set up Syncthing and add the folder from the connected hard disk. Instantly available synced folders from other places.
For those looking for backup servers on Linux, using the backup tool built into many Linux distros (Deja-Dup or Duplicity) you can make file level backups without setting up backup software at the server side by using SFTP. Backups are encrypted and through a few (admittedly hidden) settings you can enforce a period after which full backups are made rather than diffed backups.
That's the system I'm currently using, at least. I'd be happy to read about other open source solutions if anyone has something better. My backup system is geared towards a service that exposes nothing more than SFTP or WebDAV as a backup location, because of a cheap cloud storage subscription I've managed to get.
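For reference, a rough sketch of the Duplicity invocation that Déjà Dup drives under the hood (host, user, and paths here are made up); `--full-if-older-than` is the "hidden" setting that forces a fresh full backup after a period:

```shell
# Duplicity encrypts with GPG symmetric encryption by default,
# taking the passphrase from this environment variable:
export PASSPHRASE='correct horse battery staple'

# Incremental backup over SFTP, with a forced full backup once
# the last full one is more than a month old:
duplicity --full-if-older-than 1M \
    "$HOME/Documents" \
    sftp://backupuser@nas.example//srv/backups/documents

# Restore is the same command with source and destination swapped:
duplicity restore \
    sftp://backupuser@nas.example//srv/backups/documents \
    /tmp/restored-documents
```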
I have two gripes with Déjà Dup (or rather had, the last time I reviewed my backup setup several years ago):
- It can only do one set of backup settings and consequently only one backup destination without additional tooling, which is not Right™.
- It cannot add different prefixes to the names of index and data files, making it impossible to set up S3 lifecycle rules for dumping the latter to Glacier. (Duplicity requires the former to be hot for some reason, I don’t know why.)
Both of those are things you can do with the underlying Duplicity tool and a scheduler, but the UI does not expose them. Thus I sadly had to discard Déjà Dup’s shiny GNOME UI, and I know of no other backup tool with a shiny GNOME UI (and am too lazy to write one so far).
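For anyone dropping down to raw Duplicity plus a scheduler: the prefix options do exist at that layer. A sketch (bucket name is a placeholder, and the `boto3+s3://` scheme depends on your Duplicity version) that lets an S3 lifecycle rule move only the bulky data volumes to Glacier:

```shell
# crontab entry driving this nightly at 03:00:
# 0 3 * * * /usr/local/bin/backup.sh
duplicity --full-if-older-than 1M \
    --file-prefix-archive   archive-   \
    --file-prefix-manifest  manifest-  \
    --file-prefix-signature signature- \
    "$HOME" boto3+s3://my-backup-bucket/laptop
# An S3 lifecycle rule matching only the "archive-" prefix then
# sends the data volumes to Glacier, while the manifests and
# signatures Duplicity reads on every run stay in hot storage.
```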
While, from a technical POV, this sounds fun, I'd really recommend against using an old and cracked phone (which, to add insult to injury, is probably running very outdated and vulnerable software) for backup purposes.
Please, run your backup servers on machines designed to do so, capable of receiving regular software updates, etc...
If you don't open any port on your phone to the public internet, I guess it's fine for this use case? Most (if not all) home routers have a firewall enabled by default, so your home devices are protected from inbound connections.
Root required, darn. I have 2 or 3 android phones with bad screens sitting in a drawer I'd like to do something with. Unfortunately they a.) don't have USB debugging enabled and b.) don't support any sort of external display.
If anyone had some tips on how I could get into these machines it would be much appreciated.
You're better off getting a single-board computer for this. You'll be able to install your own mSATA drive, and it'll run most distros without any fuss. That said, this quickly escalates to just setting up a real NAS box. You don't really want to use an SSD for backup because they can lose data when not used. Alternatively, just go with a cloud provider.
As funny as this is, I never really got the whole concept of a "real NAS box" — do people usually mean specialized hardware in combination with something like the FreeNAS OS?
Because right now I'm just using consumer hardware (an Athlon 200GE for its low 35 W TDP, some cheap RAM, and a number of Seagate BarraCuda HDDs) in combination with Debian, which also runs on my other cloud servers. It's all mounted on my other devices either through SFTP or a Nextcloud instance that also runs on them for easier file replication. I don't even have RAID or ZFS/XFS/Btrfs file systems, just ext4, because any sort of clustering would introduce unneeded complexity to the setup; instead, a cron job with rsync backs up the data from the "primary" HDDs to "secondary" ones every day in an incremental manner (there's also BackupPC, which does network rsync backups across servers).
What purpose would running a NAS-oriented OS distro even serve for simple use cases like that? It feels like the current "ad hoc" setup is really affordable and easy to operate.
I think an SBC would also be nicely suited for this, as long as there is sufficient I/O capability. With phones, however, I feel like drivers for obscure devices could quickly become problematic.
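For anyone wanting to replicate that kind of setup, here is a minimal sketch of a cron-driven incremental rsync snapshot script (all paths are placeholders); `--link-dest` hard-links unchanged files against the previous snapshot, so each day only costs the delta:

```shell
#!/bin/sh
# Daily incremental snapshot from the "primary" disk to the
# "secondary" one; unchanged files become hard links into the
# previous day's snapshot instead of new copies.
SRC=/mnt/primary/data
DST=/mnt/secondary/snapshots
TODAY=$(date +%Y-%m-%d)

mkdir -p "$DST"
rsync -a --delete \
    --link-dest="$DST/latest" \
    "$SRC/" "$DST/$TODAY/"

# Point "latest" at the new snapshot for tomorrow's run.
ln -sfn "$DST/$TODAY" "$DST/latest"
```

A crontab line like `30 2 * * * /usr/local/bin/snapshot.sh` runs it nightly; each dated directory is a full-looking tree you can browse, while disk usage only grows by what changed.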
> Software RAID, to avoid data loss due to drive failure
Ok it's old but: raid is not backup.
I've personally worked for a small online shop that was selling digital items and shut down when their raid failed. Of course, they had ignored my warnings to build a mirror system or do copies on removable drives or ... anything resembling an actual backup.
Edit: that said, I do have a box that's pretty similar to what you're describing.
Yes, for actual backups I have M-DISC BluRays [0] which I keep in a fireproof box. They're limited in capacity compared to the full NAS, but big enough for really important stuff like photos and scanned documents (which conveniently are intrinsically write-once).
It probably depends on how much you need to store. But if you need ongoing backup that will get larger and larger then the answer is probably "No".
I back up my family photos and other important files to Amazon S3 using Restic. My 150GB of data ends up costing me about $1.50/month. I could get the price down lower if I use, say, the Infrequent Access storage tier, but at that price point I just can't be bothered to deal with it.
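Roughly, that setup looks like this (bucket name and retention policy are illustrative); restic encrypts everything client-side before it hits S3:

```shell
# Credentials come from the usual AWS environment variables
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY).
export RESTIC_REPOSITORY=s3:s3.amazonaws.com/my-photo-backups
export RESTIC_PASSWORD='a long passphrase kept somewhere safe'

restic init        # run once to create the encrypted repository
restic backup ~/Pictures ~/Documents

# Thin out old snapshots and reclaim space:
restic forget --keep-daily 7 --keep-monthly 12 --prune
```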
A free tier often isn't worth it if you have to put any time into thinking about whether you'll exceed the limit. I'd just go with a cheap pay-as-you-go service and not worry about it.
To me choosing a specialized hardware was motivated by these two factors:
- power management
- semi-closed source applications
On power management, I don't use the NAS box continuously, and there can be long gaps between accesses. Having the box "sleep" for 95% of the day, and wake up the disks only for one or two bursts of activity is interesting to me.
I tried doing that with DIY solutions, including Raspberry Pi+SATA disk type of arrangements. Overall it didn't work great and was a PITA.
On the semi-proprietary apps, I am thinking about Synology's apps. I could totally live without them, but it's a nice addition to the package, they're easy to install, seem to be well maintained and work decently well.
> Having the box "sleep" for 95% of the day, and wake up the disks only for one or two bursts of activity is interesting to me.
Have you really managed to get that to work? I got a QNAP TS-230 a few months ago hoping I could do just that, but it turns out that even this smallest NAS box they are offering isn't really designed with this "occasional use" in mind. There are so many user questions about this (https://www.qnap.com/en-us/how-to/faq/article/why-are-my-nas...) that they even built a half-baked utility that's supposed to give you a hint which process is preventing the drives from going into sleep mode, but it's not a great help. I have disabled all services (photo indexing with face recognition and other such crap) except for the most basic ones, but still the drives are happily chugging along, flashing their LED every few minutes, not spinning down. I know that it's basically a Linux computer, and there are probably utilities I can install to diagnose the issue better than with the above-mentioned software, but isn't this something that's supposed to work out of the box, or at least be easy for an average user to accomplish?!
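Since it is basically a Linux box, one generic way to hunt the culprit is `fatrace`, which logs system-wide file accesses with the responsible process; this assumes you can get the package onto QNAP's OS (e.g. via Entware), and `/share` is QNAP's usual data mount:

```shell
# Log every file access for two minutes, keep only those touching
# the data volume, then count which processes show up most often:
sudo timeout 120 fatrace --timestamp | grep '/share' > /tmp/access.log
awk '{print $2}' /tmp/access.log | sort | uniq -c | sort -rn | head
```

Whatever process tops that list is usually what keeps resetting the drives' spin-down timer.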
I run relatively little stuff on it too, with no client that consistently accesses the shares through Samba/NFS (I cluster the scripts that run automatically around the same time range, and they unmount after they're done).
I wonder if your system is not writing logs from a monitoring service (something checking your dynamic DNS, or waiting for remote connections, or pinging the internal servers to see if they’re alive).
I stopped most default services, including sync (it’s done more rarely via a user script instead), and yes, it can be a PITA to understand what’s running out of the box.
As someone who normally rolls their own and has a half dozen raspberry pis around the house, I have a Synology NAS.
As you said, it's very low energy and zero noise (rubber feet and quiet fan).
It's also effortless to set up and run with their RAID system and syncing etc is very good. It'll let me know if there is a problem with a drive, it keeps itself updated and I've never had to reboot it.
I do not use it for Plex as it wouldn't be able to handle it. They are a bit underpowered, but they're very optimised for their main purpose, network-attached storage, and that's the most important thing for me.
It is also very neat and tidy, unlike most home made solutions.
Would OpenMediaVault on a Raspberry Pi have met your RAID and sync needs?
I keep eyeing OMV but haven't splurged on running a NAS with multiply-redundant drives separate from my application server; currently it's just an RPi4 as a docker host, with rsync keeping two USB drives in sync with each other.
I’m obviously not very good at this, but my issue with OpenMediaVault was less the software side than finding hardware to reliably work with it.
For instance trying with an old low profile PC, wake on lan didn’t work half of the time so I gave up, and just let the drives sleep. But they would also fail to wake up sometimes.
All in all a NAS is not that expensive (I have one I bought for 400 or so, and it’s lasting for 15 years now…), so it’s hard to justify the endless tweaking of a solution that could work the same if done well.
BTW, even with a commercial NAS, SSH access is configurable, major scripting languages are there, and recent ones have Docker, I think. For the really custom stuff, like you I use Raspberry Pis that mount the NAS files as needed.
Yeah, normies do just buy a ready-built NAS appliance; if you know what you are doing or like to tinker, use old consumer hardware.
Just one nitpick with your setup:
If your primary HDD corrupts data, rsync will not notice/care and will corrupt your backup as well. ZFS is all the rage because it aims to have two sources for the data as well as a checksum, so it can tell which data source is flawed AND fix it.
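Concretely, that self-healing property looks like this with a two-disk mirror (device names are placeholders; in practice you'd reference disks by ID):

```shell
# Two copies of every block, each verified by a checksum:
zpool create tank mirror /dev/sdb /dev/sdc

# A scrub reads every block, checks it against its checksum, and
# rewrites any bad copy from the healthy side of the mirror:
zpool scrub tank
zpool status tank   # reports repaired data and unrecoverable errors
```

rsync on ext4 has no equivalent: a silently flipped bit on the source is faithfully copied over the good backup on the next run.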
That software solution sort of dances around the issue: if any files were to become corrupted, I could just go back in time to their older versions. That approach isn't necessarily good either, since it means storing the original "full" backups for a long time and takes up more storage; I do that for the essential data, but not for the stuff I'm willing to lose. And since the stored data can be arbitrary, checking for corruption would have to be done manually, which is a serious drawback when it comes to even knowing that things have gone wrong.
Of course, there's also the possibility of additional sources of redundancy, like versions within Nextcloud should the actual file contents of a particular version become corrupt. For certain scenarios, though, file systems like the one you've described do indeed become the way to go; maybe not for every homelab out there.
Yeah, error-correcting memory is a second step towards data integrity, but it is of little use if your at-rest storage has no means to detect errors (like a normal filesystem or RAID 1).
> You don't really want to use an SSD for backup because they can lose data when not used
Not (really) true. There is not much hard data backing this up as far as I can see, and for what there is, the retention would be 1+ year. The thread is not about offline backups but about an always-powered backup server anyway. And an SSD does make sense for a lot of reasons: HDD speeds are likely slower than the local Ethernet link, the higher IOPS helps with parallel workloads, not to mention the much lower power draw and zero noise (no moving parts for the rest of the setup), compared to a chirping HDD.
Once you have USB debugging, I can recommend Vysor; it's super easy to use to control your phone from the PC.
I'd also recommend turning on TalkBack (assistive technology) and Voice Access, as they can help you use the device even if you can't see anything on the screen. That helped me a lot with my phone that has a broken screen where nothing but the touchscreen works.
I have a rooted phone with a broken touchscreen that I use to get information I need root for. The other day I was stupid enough to try to reset it to see if I could get part of the screen working. Obviously that didn't work.
Re-enabling USB debugging was the following process:
1. Use USB OTG with a mouse to go into the Bluetooth menu and pair a Bluetooth keyboard.
2. Enable USB debugging using the mouse.
3. Connect to the PC and use the Bluetooth keyboard to allow USB debugging.
4. Use scrcpy to control the device.
Obviously this only works reasonably well if you can see at least some portion of the screen.
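Once ADB is authorized, scrcpy gives full remote control without root, and input events can also be injected blindly over adb. The coordinates and PIN below are of course made up:

```shell
# Mirror and control the phone, keeping its broken panel dark:
scrcpy --turn-screen-off --stay-awake

# Or drive it without any mirroring at all:
adb shell input tap 540 1200            # tap at x=540, y=1200
adb shell input text 1234               # type the PIN
adb shell input keyevent KEYCODE_ENTER  # confirm
```

The blind `adb shell input` route is handy when the display is completely dead and you know the lock screen layout from an identical working device.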
Not sure about this guide, but you should be able to install a Linux ROM through fastboot instead of through a recovery. You won't need a working screen to do that.
I had a similar problem, but I could see part of the screen. So I used a USB OTG cable with a mouse to enable Bluetooth and connect a second mouse over BT. Then I was able to enable USB debugging to use with scrcpy. Now I use the phone as a "security" cam.
If they have their bootloaders unlocked, I'd look into postmarketOS. If not... well, I guess you really need to fix the screen up first, unless you had enabled adb. It could even be a temporary fix (for instance, a single screen if the 3 devices are the same).
The earliest version of Termux on F-Droid in the Internet Archive requires Android 5.0 or later. Current versions require Android 7.0 or later. How does a user with an old Android phone, e.g. 4.x, use Termux? Current versions of primitive ftpd will work with Android 4.0.3 or later.
If your old phone has no unofficial support for a more or less modern Android version, either forget about it or go all in on the native Linux route (postmarketOS etc.).
This is a bit of a tangent, but is there a way to get low-level access to an Android phone's radio without installing Android? I have an old OnePlus One and I'd really like to use it as an SDR for weekend projects. Ideally, I'd have it run some sort of minimal linux installation. If such a thing exists, I'm betting someone on HN knows about it.
I think PostmarketOS intends to be that linux installation: “a sustainable, privacy and security focused free software mobile OS that is modeled after traditional Linux distributions.” https://postmarketos.org/
No idea how easy it would be to access that radio though.
Unfortunately this is also a recipe for dropping an unpatched, unmaintainable Linux machine on the network to provide interesting security challenges in the future.
The first thing that comes to my mind is: "How will this fare as a long-term daily backup?"
A few of my old phones and SD cards died due to the flash being worn out (can't write new data, read-only).
That said, with some magic Linux command sauce, external drives can be used (part 2 of the link), then software RAID, offloading data to them to prolong the flash's write lifetime; then more tools like AnLinux can be used to add functionality to it.
Somewhat off-topic, but the article uses UrBackup. I came across UrBackup a couple years ago when I was looking for a free workstation backup solution with centralized control.
It's pretty neat! Check it out if you have a need for it.
Step 1 of TFA is basically "install Debian using Linux Deploy on a rooted phone", I was asking whether I can use Debian installed via Termux, on a non-rooted phone.
You don't need to keep charging the battery past a given percentage. It just happens that by default batteries charge all the way to 100%; you can tell the firmware to stop at e.g. 40%, which is a good compromise between having a bit of backup power and considerably extending the battery's lifespan.
Is there a standardised way to do that on old Android devices? I've got a tablet that I'd like to run continuously at some point as a touch interface to my house, but I'm wary of hooking up the battery 24/7 after seeing what happened to other people's devices when they did that.
I had my phone plugged in for almost 3 years, and the battery still lasts for 3 days on its own. Highly recommended.
As a side note, can someone tell me if something like this can be made for postmarketOS? I installed it a couple of months back and this is really bugging me, apart from the constant vibrations.
> As a side note, can someone tell me if something like this can be made for postmarketOS? I installed it couple months back and this is really bugging me.
In theory, if changing charge thresholds is supported by the PMIC (power-management IC) driver, it should be exposed in the sysfs. It is device-specific though, and might not be exposed by the driver.
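When the driver does expose it, it shows up as a standard power_supply attribute; the supply name "battery" below is an assumption and varies per device:

```shell
# Check whether the driver exposes a charge limit at all:
cat /sys/class/power_supply/battery/charge_control_end_threshold

# Stop charging at 40% (root required); this typically resets on
# reboot unless reapplied from an init script:
echo 40 > /sys/class/power_supply/battery/charge_control_end_threshold
```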
Depends. If the power source is decent enough, it's fine. If not, you'll end up with a swollen battery, like I did, and you'll get rid of it as fast as possible.
— Me, 2½ years ago: https://news.ycombinator.com/item?id=18019343