Jesus Christ, the usability nightmare of this website is worse than the goofy animated GIF they apparently think is an exaggeration.

www.wired.com, to get rid of the autoplaying video: go fuck yourselves. www.wired.com, to get rid of the assorted gigantic flyover bullshit.

I forgot how bad the default site was; I'm a regular Wired reader and have blocked things on at least a weekly basis.
Us self-hosters? Sure.
The average person, who values convenience over privacy and cost? No. They'll continue to pay and be imprisoned by the cloud.
Some might say they're freeing themselves in a way, though. Self-hosting requires dedicating time you could spend doing other things, especially when things break. People pay for convenience and to save time. When we simplify self-hosting and updating to the point where people can just download apps and press go, it will make sense for the average person.
This. Self-hosting doesn’t need to be a nerd thing.
Docker kinda does it by being like an app store but for servers. It’s not very flexible but everyone using a particular image gets the same experience.
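To make that concrete, here's roughly what the whole "install" looks like for, say, a self-hosted Gitea via Docker Compose (a minimal sketch - the image is Gitea's official one, but the port and volume path here are just my own choices):

    # docker-compose.yml - everyone running this image gets the same app
    services:
      gitea:
        image: gitea/gitea:latest
        ports:
          - "3000:3000"        # web UI at http://localhost:3000
        volumes:
          - ./gitea-data:/data # all state lives in one folder you own
        restart: unless-stopped

Run "docker compose up -d" and it's up. That's about as close to "download and press go" as servers currently get.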
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
Fewer Letters  More Letters
Git            Popular version control system, primarily for code
NAS            Network-Attached Storage
RPi            Raspberry Pi brand of SBC
SBC            Single-Board Computer
[Thread #17 for this sub, first seen 10th Aug 2023, 17:55] [FAQ] [Full list] [Contact] [Source code]
Good bot.
I am so happy to see some of the useful bots make a reappearance here. And you got yourself quite the nifty name, too.
Use open-source software! Do not rely on "someone else's computer". Build your own locally hosted cloud! If you can use open-source hardware when doing so: awesome. If not, at least make sure that everything needed to run the system is open.
Build your own locally hosted cloud!
This is the hard part to sell people. I feel like for self-hosting to become popular, there would need to be a “plug ‘n’ play” device that essentially has everything you need to set up a small server on your home network. If you could set up a home server as easily as you can set up a Google Home device, that would be amazing.
I run a bunch of stuff on Docker on my Synology NAS. It's not quite plug and play, but at its best it's quite within the realm of someone who's got some computer skills. At its worst, though, it can suck up a lot of time. I enjoy that kind of stuff when it's not mission critical, but I use paid cloud services at work for things that I run for free at home - precisely because I don't want to be the one dealing with downtime in an emergency situation.
“Quite within the realm of someone who’s got some computer skills” means “inaccessible to most people”. I don’t mean to sound like an ass about it, but most people just don’t care enough about this stuff to invest even a bit of time in it (nevermind the upfront cost for a Synology or Qnap NAS).
Sure - but you've got to start somewhere. There are a lot of people who aren't experienced sysadmins buying Raspberry Pis or Arduinos, and they're probably really good candidates for self-hosting some of their services. I was surprised to find my neighbor (who's a PM with a physical security system company) trying to do something with ChatGPT. At first I was a little dismissive because I figured she was just typing prompts into the website, but in reality she was having issues with the Python bindings and getting her virtual environments straight. If you can get to that point, you can surely self-host stuff.
I run Git locally for some of my projects, and that was trivial to set up - I think anyone who's used GitHub would have comparable skills to self-host Gogs or Gitea.
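For anyone curious, "running Git locally" can be as small as a bare repo on any box you can SSH into. A sketch (the "homeserver" hostname and the paths are made-up placeholders):

    # create a bare repo on the server, then, from inside an existing
    # local project, point it there and push
    ssh homeserver 'git init --bare ~/repos/notes.git'
    git remote add origin homeserver:repos/notes.git
    git push -u origin main

Gogs and Gitea essentially just add a web UI on top of this.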
Certainly it's somewhat expensive, but people spend a lot on cloud-hosted services too. I'm sure in my house we're dropping over $100/month on Dropbox, ChatGPT, Google, Adobe, and probably a half-dozen smaller ones.
Sure, a local backup is great but unsafe, considering your homelab is still in the same geographic area as the things you're backing up. In the event of ecological disasters (local or otherwise), storing all data locally is keeping your eggs in one basket. Idk much about Cryptomator (or other such software), but encrypted automatic backups sound perfect.
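restic is one tool that does exactly that (a sketch - the bucket name and source path are made up; you'd set AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and RESTIC_PASSWORD in the environment first):

    # everything is encrypted client-side before it leaves the house
    restic -r s3:s3.wasabisys.com/my-backups init
    restic -r s3:s3.wasabisys.com/my-backups backup /srv/data

Put the backup line on a cron job and you've got encrypted automatic offsite backups.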
Some sort of cloud backup is always good in case of physical damage to your hardware.
I really like to use open source, and most of the software on my computer is open source, but it's not always possible.
I back up my Docker configs to Google Drive, but that's about it in terms of self-hosted software.
It's funny how the pendulum swings: first people would never let other people have their files, then they invested wholesale in cloud computing, and now they're seeing the downtime and expense and are backing off.
Same thing with client/server: we had it in the mainframe days, then got away from it with PCs; now we have Chromebooks, and Microsoft wants Windows to run from the cloud, which is basically back to client/server again.
We have so much computing power at home, and the chances you have good, reliable Internet at home are better than before. I revived five-year-old PCs and it's way too much computing power for my self-hosting needs. I'd have to pay $200+ a month for the same compute power in the cloud. Even a Raspberry Pi with 8GB is capable of running quite a bit, for fractions of a penny per hour in electricity.
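(Back-of-envelope, assuming a ~5 W average draw and $0.15/kWh: 5 W × 24 h is about 0.12 kWh a day, so under two cents daily - roughly fifty cents a month.)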
My parents have a plethora of old office PCs, so I can't even imagine how people pay those prices to not even own the hardware.
An RPi can even fairly easily be solar-powered, if that's ever a concern.
This article, as much as I agree with it, conflates cloud hosting and remote-only software design. Cloud hosting really is a prison, but mostly for developers that are lured by its convenience and then become dependent on its abstractions. What we experience today in most mainstream software isn't necessarily coupled to cloud hosting, but is instead a conscious product design choice and business strategy to deny users power and control of their data. In short, cloud providers like AWS, Azure, and GCP are doing to software companies what those companies are doing to us.

There is a way to use shared data centers without this kind of software design philosophy. As mobile continues to dominate, the solution we need likely involves remote servers, but with a model that treats them with skepticism and caution, allowing data portability and redundancy across a variety of vendors. I should be able to attach a few hosting services to a software experience I use and transfer my data between them easily.

The idea that local-first software is "freed from worrying about backends, servers, and [hosting costs]" is misleading, since my local device has to become the client and/or server if there is any connectivity happening over the internet. Wresting control of our data from the dominant software companies will require creating experiences that are not only different, but better, and doing that with a mobile phone passing between cell towers functioning as the server is a tall order. We have grown to expect more than intermittent connectivity with conflict resolution.

Nonetheless, we absolutely should not accept the current remote-only software paradigm, but instead need to devise better ways to abstract how remote hosts are inhabited and create a simple multi-host option that is intuitive for consumers.
Hey, you make a great point. There's a false dichotomy being presented here. As you see it, local-first is a bit of a misnomer when you're already expecting your device to join a remote environment.
Yes, makes sense that we’re being lured by the so-called cloud hosting. Following a business model that sells convenience in lieu of data control, cloud providers are distorting our current understanding of remote hosting. They’re breaking the free flow of information by siloing user data.
Now, with that being said, I’d like to add something about your presentation. I’d suggest you avoid walls of text. Use paragraph breaks. They’re like resting areas for the eyes. They allow the brain to catch up and gather momentum for the next stretch of text.
Regardless. You brought light to this conversation. For that, thank you.
I’m glad you found my take engaging!
Paragraph breaks now enabled.
Great points. It’s the proprietary nature and lack of interoperability of “the cloud” that causes problems. My email is hosted on a remote server but I have control over my data. There’s no algorithm controlling what order I see my mail in or who I can forward stuff to. There are many different tools and clients available to me and to everyone else to work with their data.
Imagine if publishing a photo from my phone to Instagram meant copying a file from one folder to another. Or if I want to create an automatically translated voiceover from the captions of all my old Facebook photos in a video editor. Right now these operations require complex software. But the technology is all there and has been for a long time.
I often think about https://upspin.io
Exactly - interoperability is key, and is intentionally removed from many software platforms once they become big enough. Cory Doctorow writes about this here.
Companies have a funny relationship with interop. When companies are small and trying to build up their customer-base, they love interop, love the idea of selling ink for someone else’s printer or a way to read your waiting messages on someone else’s social media giant. Facebook once had a whole suite of interoperability tools to make it easy to plug Facebook into other services, but it has whittled these away over the years and today it routinely threatens and even sues rivals that try to interoperate with it.
A trend that I actually like is more software supporting using a user’s own iCloud or Google Drive as a data store rather than using the company’s own servers. The step that needs to take place is a way to use many storage providers simultaneously (including home server) with syncing behavior abstracted away. The software would essentially be a database cluster with a variety of heterogeneous nodes supported. A library that abstracts this multi-host pattern for use in both Android and iOS apps would go a long way. There is still the problem of the controller orchestrating uploads and syncs, though, which for most users would be their phone.
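In rough Python, the shape I'm imagining (entirely hypothetical - none of these class names belong to a real library):

    from abc import ABC, abstractmethod

    class StorageNode(ABC):
        """One attached backend: iCloud, Google Drive, a home server, etc."""
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...
        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class MultiHostStore:
        """Fans writes out to every attached node and falls back across
        nodes on reads, so no single vendor can hold the data hostage."""
        def __init__(self, nodes: list[StorageNode]):
            self.nodes = nodes

        def put(self, key: str, data: bytes) -> None:
            for node in self.nodes:   # redundancy across heterogeneous vendors
                node.put(key, data)

        def get(self, key: str) -> bytes:
            for node in self.nodes:   # first node that has the key wins
                try:
                    return node.get(key)
                except KeyError:
                    continue
            raise KeyError(key)

The real work, of course, is in the syncing and conflict resolution this sketch hand-waves away.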
Upspin is new to me but looks like it's right up this alley. Making the whole thing work for non-technical users will be one of the hard parts, I imagine.
That graphic is dope!
Thanks! Good read
Not with closed-source OSes it won't.
I felt this "prison" very strongly with iCloud. Don't get me wrong, I think iCloud functions exceptionally well. It's an extremely well-integrated cloud and works seamlessly with all Apple products. It's just that after a while I started to realize just how much of my life was sitting on Apple servers and what a dependency I had on Apple, hoping they were the good guy (narrator: they were not, in fact, the good guy) or at least not as bad as the next best option (I feel Google has legitimately become evil at this point). I was constantly reading about security and getting myself worried, etc.
Finally I just bought a NAS. Synology is my current choice, but use whatever you prefer. A NAS can replicate anything the “cloud” can do, it’s faster, it’s safer, it doesn’t rely on the good graces of any cloud provider. YOU hold the access to your data. As it should be. I still use the “cloud” for my backups with HyperBackup sending encrypted backups to Wasabi, but that is a different matter. Even if Wasabi decided to be evil, my data is encrypted before it ever leaves the NAS and Wasabi could never see my raw data like Apple/Google can.
The only thing holding people back from this, I guess, is price. Apple charges $0.99/month for 50 gigs, while just the NAS itself with no drives will cost you several hundred. But man, not being worried about the latest cloud drama, government overreach, privacy scandals, etc. is worth every cent. A Synology NAS with Tailscale is just about the safest place to put my data. All the Synology mobile apps even pass the gf test for features and ease of use. I recommend a small 2-bay NAS to everyone I can.
Turn off the cloud, and take your data back.
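For the curious, the Tailscale part is close to a one-liner on the server. A sketch (the "mynas" device name and "admin" user are made up):

    # join the NAS to your tailnet; authenticate via the printed URL
    sudo tailscale up
    # then, from any device on the same tailnet:
    ssh admin@mynas

No ports get opened to the public internet, which is most of the security win.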
But the NAS is in your house… which basically means if it gets flooded or burns down, all your data is gone too.
I already have my data on my PC, a second backup inside the same house isn’t worth that much. But instead of relying on a cloud service I just rent a virtual server (for various things) and use Seafile to keep my data in sync.
PC breaks? House burns down? My data is on my own server in a datacenter. My server gets cancelled? My data is on my PCs.
So even with your NAS you're still 100% reliant on a cloud backup - why did you get the NAS when you already have a copy of your data on your devices?
I don’t really understand your comment.
PC breaks? House burns down? My data is encrypted in a datacenter. My account gets cancelled? My data is on my NAS.
I don’t store much data on my PCs or devices at all. Any data that is there I treat as transient. The NAS acts as permanent storage. So if the devices die, I can quite literally restore them to the state they were in within hours of their death from the NAS. If my house is hit by a tornado and my NAS dies, my data is safely encrypted in an external location. I’ve lost nothing. If my NAS, devices, and Wasabi’s data center are all hit by tornadoes at the same time we have bigger problems to worry about. If that ridiculous scenario happened your server would not be immune either.
I’m not seeing the advantage of your rented server vs having backups in the cloud. Is it because the server will keep running? But if you’ve lost your devices in a fire you still can’t access it whether it’s running or not. When you replace your device you can then connect to your server, but I can simply download my data again. HyperBackup Explorer is available for every platform and can do a full restore back to a NAS, or individual file downloads for anything else.
Ah, I see - ignore my other comment. I didn't realize Synology did remote backup as well as storage.
Sorry if I wasn’t clear about that. My essential thinking with the NAS was: Cloud is nice, but how vulnerable are you if the Cloud provider turns evil?
With Apple and Google, you’re basically screwed and there is nothing you can do.
With a NAS, you own the server. You don’t rent it. You own it. You can hold the thing that stores all your private data in your own two hands.
So what if the data center I host my backups on becomes evil? Well, then they find a bunch of encrypted blobs they can’t access while I move my backups to a different host. I’m not sure even the server hosting you’re talking about is as secure as that. What if they become evil? How much access do they have to your data? All “evil” takes is a single policy change from a suit who has no idea about actual tech. It happens all the time.
Maybe that comes off as paranoid, but with all the data breaches and enshittification happening lately, I feel much more secure having my data literally in my own two hands, with a built-in defense against evil policy changes/government overreach for anything that must be hosted externally. Coupled with Tailscale for remote access, I believe this is as secure as you can get.
And again, Synology was my choice for ease of use, but you can build a capable NAS from an old OptiPlex on eBay for 200 bucks plus drives.
Looking at old defunct forums and blogs on the Wayback Machine, spam and security problems are frequently-cited reasons for shutting down or going read-only.
Over time, the internet has gotten more hostile.