If only there were some alternative other than throwing my old stuff in the bin.
I kinda-sorta finalized my migration to a smaller setup for my mail+web server. I’ve been running a small MSP business for several years, and as customers flee left and right, mostly to Microsoft (due to 365 bundle pricing), it’s been in decline for quite a while. So I finally pulled the plug, shut down the business side of things, and downscaled to a single VPS with a handful of domains, email service and a few simple WordPress sites.
Also, I kinda-sorta moved my entire photo archive of 20+ years to Immich and set up a backup scheme for it, which is now (only) 2-1-1. I also need more storage for that thing, but it has to wait a few days until payday, and after that migration I can finish importing all the photos I have lying around. That also requires some reconfiguration of my disk arrays, copying a couple of terabytes from one system to another and back again. That’s a relatively easy thing to do, but it takes “a while” to accomplish.
After that there’s a long list of things to do, but mostly I’ll spend my free time and money to improve the current setup as quickly as possible in the immediate future.
True. And there’s also a ton of devices around which don’t trust LetsEncrypt either. There are always edge cases. Take a slightly older photocopier, for example: it’s more than likely that it doesn’t trust anything on this planet anymore, and there’s no easy way to update its CA list even if the hardware itself is still perfectly functional.
That doesn’t mean that your self-signed CA, in itself, is technically any less secure than the most expensive Verisign certificate you can find. And yes, there are a ton of details and nuances here and there, but I’m not going to go through every technical detail of how certificates work. I’m not an expert in that field by any stretch, even if I do know a thing or two, and there’s plenty of material online to dig deep into the topic if you want to.
A valid certificate is anything you trust. Any CA you trust is no more or less secure than the one you get from LE, so for a private network you can happily sign your own certificates and just distribute the CA certificate to your devices.
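As a sketch of what that looks like with openssl (every file and host name below is a made-up placeholder):

```shell
# 1. Create the CA key and a long-lived, self-signed CA certificate
openssl genrsa -out my-ca.key 4096
openssl req -x509 -new -key my-ca.key -sha256 -days 3650 \
    -subj "/CN=My Private CA" -out my-ca.crt

# 2. Create a key and signing request for an internal host
openssl genrsa -out nas.lan.key 2048
openssl req -new -key nas.lan.key -subj "/CN=nas.lan" -out nas.lan.csr

# 3. Sign the request with the CA, adding the SAN modern clients require
printf 'subjectAltName=DNS:nas.lan' > san.ext
openssl x509 -req -in nas.lan.csr -CA my-ca.crt -CAkey my-ca.key \
    -CAcreateserial -days 825 -sha256 -extfile san.ext -out nas.lan.crt

# 4. Check the chain; my-ca.crt is what you distribute to your devices
openssl verify -CAfile my-ca.crt nas.lan.crt
```

The private keys stay with you; only `my-ca.crt` needs to be imported into each device’s trust store.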
Ubuntu or something based on it
I would not recommend Ubuntu, especially in this case. System updates, snapd mostly, have gone downhill, and it’s nearly impossible to avoid reboots for extended periods. Debian still seems as solid as it’s always been.
Laptops use lithium-ion batteries, while UPSes (at least your average Joe’s, and the majority of commercial units too) use sealed lead-acid. If a lithium-ion battery goes belly up, it can burn your house down. If a lead-acid battery does the same, at worst it’ll leak a bit of corrosive fluid onto whatever it’s sitting on.
There are commercial-scale li-ion UPSes too, but they require quite a lot of hardware around them to be used safely. Search YouTube (or wherever you like) for a cell phone battery explosion, then scale that up to a fridge-sized cell phone. It takes quite a bit of steel and concrete to contain that amount of energy. And the nasty thing about li-ion fires is that lithium reacts quite violently with water, and the battery contains all the chemicals needed to keep the fire going, oxygen included.
So, yeah, a UPS is a whole different thing to manage than a laptop battery.
If you can’t access the hardware physically and you don’t have someone on site who can work on it, just drop the idea and get a VPS or something cloud-based, no matter what hardware you plan to use. Anything and everything can happen: a broken memory module, an odd power surge, rodents or bugs messing with the system, moisture or a straight-up water leak corroding something, a fan failure overheating the thing, and so on.
There’s only one fact in this business that I’ve learned over the 20-something years I’ve been working in IT: all hardware fails. No exceptions. The only question is ‘when’. And when the time comes, you need someone with physical access to the stuff.
I mean, sure, your laptop might run just fine for several years without problems, or it might have picked up shipping damage over those 3000 km and break within a week. In either case, unless you have someone hands-on with the machine, it’s not going to do much.
It would be difficult to recommend Immich as a gallery app to someone who doesn’t have experience in selfhosting.
You already have plenty of responses, but Immich is not a gallery app. I’m in the process of migrating my photo libraries to Immich, and it’s 20+ years of memories. Some were originally taken on a film camera and then scanned; others are old enough that camera phones just didn’t exist and we had “compact” digital cameras. Then there are photos taken with a DSLR and a drone, and obviously all of these devices have changed multiple times over the years, so relying on a single device just isn’t going to work over time.
All of those require some system other than the device itself to store, organize, back up and enjoy them. And, as I have a family, storing them on just my desktop would mean that no one else around here has easy access to them. With Immich I can easily share photos around when I carry the DSLR to a family gathering or whatever.
And then there’s the obvious matter of having enough storage. Even my desktop doesn’t have a spare terabyte right now to store everything. I need the hardware anyway, so it just makes sense to keep the photos separate from my workstation, which I can now do whatever I want with, without worrying I’d lose any of those precious memories. As for the server part, I have one around anyway for Pi-hole, Home Assistant, Nextcloud (to store and back up other data) and so on, so for me it’s most convenient to run the Immich server there too.
As for the backup side of things: I’ve tried manual backups with various tools over the years. It’s just not going to work for me. I either forget, or life gets in the way, or something else happens, and then I’m several days or weeks behind ‘schedule’. With a dedicated server I don’t have to do anything; everything runs automatically in the background while I’m sleeping or doing something more interesting than copying over a bunch of files.
Oh, no. The data is on my local server; the VPS is a whole different thing. I just brought up that I already use Hetzner’s services, so it would be convenient to stick with a single provider.
You are absolutely correct. I don’t mind the few GB of data for the operating system; a single video from my drone is likely more than that, and video isn’t something you can deduplicate or compress very well. If I really wanted to, I think it would be possible to squeeze the operating system below 2 GB, but it’s just not worth the effort. I just want the 20+ years of memories I have on the thing to remain.
Proxmox Backup Server (at least from Proxmox) is way more expensive than any raw storage option. As for external drives, I won’t do that. The server has RAID on its disks, and adding another local disk wouldn’t achieve anything in my situation, as I need an off-site copy.
I just use xargs -n1. Or -exec with find.
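For example, both ways of running a command once per file (the file names here are just placeholders):

```shell
# xargs -n1 feeds the arguments to the command one at a time
printf '%s\n' a.txt b.txt c.txt | xargs -n1 echo processing

# Same idea with find: -exec runs the command once per match
# (use -print0 | xargs -0, or -exec ... +, if names may contain spaces)
find . -name '*.txt' -exec echo processing {} \;
```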
https://www.offlineimap.org/ should do the trick
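A minimal `~/.offlineimaprc` sketch, assuming a local Maildir and a generic IMAP host (all names here are placeholders, and you’d still need to supply the password, e.g. via `remotepass` or `remotepasseval`):

```ini
[general]
accounts = Personal

[Account Personal]
localrepository = Local
remoterepository = Remote

[Repository Local]
type = Maildir
localfolders = ~/Maildir

[Repository Remote]
type = IMAP
remotehost = imap.example.com
remoteuser = user@example.com
ssl = yes
```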
Well, that’s an interesting approach.
First, you would need either shared storage, like a NAS, for all your devices, or for them all to have an equal amount of storage so you can just copy everything everywhere locally. Personally I would go with a NAS, but storage in general has quite a few considerations, so depending on the size of your data, bandwidth, hardware and everything else, something else might suit your needs better.
For the operating system, you would of course need the same OS installed on each device, and they would all need to run the same architecture (x86, most likely). With Linux you can just copy your home directory over via the shared storage, and that takes care of most things, like app settings and preferences. But keeping the installed software in sync and updated is a bit trickier. You could enable automatic updates and create a script to match installed packages between systems (Debian-based distros can use dpkg --get-selections and --set-selections; others have similar tools), so you’d have pretty closely matching environments everywhere.
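A minimal sketch of that selections-based sync between two Debian-based machines (the file name is arbitrary, and applying the selections needs root):

```shell
# On the reference machine: dump the set of installed packages
dpkg --get-selections > selections.txt

# On the target machine (after copying selections.txt over):
# mark the same packages for installation and let apt do the rest
sudo dpkg --set-selections < selections.txt
sudo apt-get dselect-upgrade
```

This only matches package names; it won’t reconcile packages installed from outside the configured repositories.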
Or if you really want to keep everything exactly the same you could use Puppet or similar to force your machines into the same mold and manage software installations, configuration, updates and everything via that. It has a pretty steep learning curve, but it’s possible.
But if you want to match x86 workstations with handheld ARM devices it’s not going to work very well. Usage patterns are wildly different, software availability is hit or miss and the hardware in general differs enough that you can’t use the same configs for everything.
Maybe the closest thing would be to host web-based applications for everything and use only those, but that heavily limits what you can actually do and doesn’t give you much flexibility with hardware requirements, meaning either your slower devices crawl to a halt or your powerful workstation sits idle no matter what you do.
Maybe a better approach would be to set up a remote desktop environment on your desktop and just hop onto that remotely whenever needed. That way you’d have the power on demand but still get the benefits of the portable devices.
Better internet connection - a lot of hosts have 40Gbps connections now, and it’s a data center grade connection with a lower contention ratio.
And also better infrastructure in general. VPSes run in a datacenter with (most likely) failsafes for everything: multiple internet connections, a pretty beefy setup for power redundancy with big battery banks and generators, multiple servers to take over your stuff in case a single unit fails, climate control with multiple units, and so on.
I could get a 10 Gbps connection (or theoretically even more) to my home, but if I want all the toys the big players work with, that would mean investing at least several tens of thousands of euros to get anywhere, and more likely a hundred or two hundred thousand to build anything even near the same level. And that doesn’t include things like mechanics to maintain the generators, security staff to guarantee physical safety and so on, so even if I had a few million to throw at a project like this, it wouldn’t last too long.
So, instead of all that, I have a VPS from Hetzner (I’ve been a happy customer of theirs for a long time) for less than a hamburger and fries per month. And that keeps my stuff running just fine. Obviously there are caveats to watch for, like backups in case Hetzner suddenly ceases to exist for whatever reason, but the alternative might as well be setting up a server farm on the Moon, as that’s about as hard to reach as matching their reliability for ~100€/year.
Well, sure, I could leave just the Z-Wave endpoint at home and move the server to the cloud, but that would mean none of my automations would work if the network happens to be down. My ISP is pretty damn good at keeping me online, but this is the one part of my home automation I’m not willing to compromise on: everything has to be local and not dependent on any kind of outside connectivity.
Sure, these things rely on the infrastructure (networking very much included) I have in place in my house, which isn’t perfect by any stretch, and my HA server itself would most likely be ‘safer’ in the cloud, but it’s still my home automation and I want to keep it local to avoid connectivity issues, latency and other stuff beyond my control.
And sure, should my server PSU die tomorrow, it would bring the whole system down. As I mentioned, the setup is far from perfect, but it’s built the way I like it and, for me, this is the best approach. You may weigh pros/cons differently, and that’s perfectly fine. I have my reasons and you have yours, both equally valid.
But I’d still rather not mess with hardware, I just need at least one physical server and other stuff around to keep things running the way I like them.
Without a doubt a lot do, but I personally couldn’t care less. I have a server at home, but that’s just a necessary evil. If I could, I’d just rent hardware for everything, but there are technical and, obviously, financial limitations to that.
And hosting pretty much anything is practically identical regardless of the platform. Sure, there are exceptions, like my Home Assistant server with Z-Wave, which needs to be physically near my other stuff, but things like Fediverse instances and other browser-based stuff are exactly the same to maintain regardless of the underlying platform.
I don’t bother taking out the screws. I just drill a handful of holes through the whole thing. Or, if you’re really paranoid, a MAP torch is enough to melt the whole thing (don’t breathe the smoke).
My personal opinions, not facts:
For hdd’s to be used as long term storage, what is usually the rule of thumb? Are there any recommendations on what drives are usually better for this?
Anything with a long track record, like HGST or WD (Red series preferably). Backblaze, among others, publishes data on drive longevity, so look at what they’re running. On eBay (and elsewhere) there are refurbished drives available which look pretty promising, but I have no personal experience with those.
Considering this is going to store personal documents and photos, is RAID a must in your opinion? And if so, which configuration?
Depends heavily on your backup scheme, the amount of data and the available bandwidth (among other things). RAID protects you against a single point of failure in storage. Without RAID, you need to replace the drive and pull the data back from backups, and while that’s happening you don’t have access to the things stored on the failed disk. With RAID you can keep using the environment without interruption while waiting a day or two for a replacement. If you have a fast connection that can download your backups in less than 24 hours, it might be worth the money to skip RAID, but if it takes a week or two to pull the data back, the additional cost of RAID might be worth it. Also, if you change a lot of data during the day, it’s possible a drive fails before the backup finishes, in which case some data is potentially lost.
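As a rough sanity check of that 24-hour threshold (2 TB and 100 Mbit/s are just example figures; plug in your own):

```shell
# Estimate how long a full restore takes: size (TB) vs. downlink (Mbit/s)
awk 'BEGIN {
  tb   = 2      # backup size in terabytes
  mbit = 100    # usable download speed in Mbit/s
  hours = tb * 8e6 / mbit / 3600   # TB -> Mbit, then seconds -> hours
  printf "restoring %.0f TB at %d Mbit/s takes ~%.1f hours\n", tb, mbit, hours
}'
# ~44.4 hours, i.e. well past the 24-hour mark
```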
As for which RAID level you should use, it’s a balancing act. Personally I like to run things with RAID 5 or 6 even though I have a pretty decent uplink. You also need to consider what the acceptable downtime for your services is. If you can’t access your photos for 48 hours, it’s not the end of the world, but if your home automation is offline, it can at least increase your electric bill somewhat and cause some inconvenience, depending on how your setup is built.
And in case RAID would be required, is ubuntu server good enough for this? or using something such as unraid is a must?
Ubuntu Server is good enough. You can do either software RAID (mdadm) or LVM for a traditional RAID setup, or opt for a more modern approach like ZFS.
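A minimal sketch of the mdadm route on Ubuntu Server (RAID 5 over three disks; the device names are deliberate placeholders, and the create step destroys whatever is on them):

```shell
# Create a RAID5 array from three disks, then put a filesystem on it
sudo apt install mdadm
sudo mdadm --create /dev/md0 --level=5 --raid-devices=3 \
    /dev/sdb /dev/sdc /dev/sdd
sudo mkfs.ext4 /dev/md0

# Persist the array config so it assembles on boot
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
sudo update-initramfs -u
```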
I was thinking of probably trying to sell the 1660 super while it has some market value. However, I was never able to have the server completely headless. Is there a way to make this happen with a msi tomahawk b450? Or is only possible with an APU (such as 5600g)?
No idea. My server has on-board graphics, but I haven’t used it in years. It’s a nice option to have in case something goes really wrong, though. You can still sell your 1660 and replace it with the cheapest GPU you can find on eBay or wherever; as long as you’re comfortable with the console, you can fix things with anything that can output plain text. If your motherboard has separate remote management (generally not available in consumer-grade stuff), it might be enough to skip a GPU entirely, but personally I would not run that kind of setup even if remote management/console were available.
If you guys find any glaring issues with my setup
I don’t know about actual issues, but I have spinning hard drives a lot older than my kids which still run just fine. Spinning rust is pretty robust (at least at sub-4TB capacities), so unless you really need the speed, traditional hard drives still have their place. Sure, far more spinning drives have failed on me than SSDs, but I have working hard drives older than SSDs (at least in the sense of what we have now) have existed as a technology, so claiming that SSDs are more robust is, at least in my experience, just misunderstood statistics.
But it’s not obvious either. When I say ‘apt install firefox’, especially after adding their repository to sources.list, I’d expect to get a .deb from Mozilla. Silently overriding my commands rubs me the wrong way.
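For reference, the way around that on the apt side is pinning, roughly along the lines of Mozilla’s own repository instructions (origin and file path quoted from memory, so double-check against their docs):

```
# /etc/apt/preferences.d/mozilla
Package: *
Pin: origin packages.mozilla.org
Pin-Priority: 1000
```

With a priority above 1000-ish, apt prefers Mozilla’s .deb over Ubuntu’s snap-transition package instead of silently swapping it out.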