Ubuntu’s popularity often makes it the default choice for new Linux users. But there are tons of other Linux operating systems that deserve your attention. As such, I’ve highlighted some Ubuntu alternatives so you can choose based on your needs and requirements—because conformity is boring.
From an engineering perspective, I prefer Debian distros. Apt is the greatest package manager ever built. For a production server, I’d choose Debian or maybe Ubuntu if I needed to pay someone for support.
But for a desktop, Ubuntu kinda sucks. These days, I think I’d recommend Fedora to Linux noobs.
And for my toys at home, I run Arch btw.
Heard. Debian in the streets, Arch in the sheets!
What about Ubuntu derivatives for desktop? My go-to recommendations are Pop!_OS and Linux Mint (which I use).
Linux Mint Debian Edition is my standard recommendation for desktop for those newer folks.
Straight up Debian for everything else. Debian is my desktop, and all of my servers (aside from a few things I’m testing for work, or where I need to test against RHEL).
And Proxmox for VMs.
Pop!_OS user going on a year now and I can’t recommend it enough, at least as a first distro.
The DE matters more for new users honestly, so I usually recommend:
GNOME: Ubuntu
KDE: KDE Neon
Cinnamon: Mint
COSMIC: Pop!_OS
and just let them choose what they want.
What’s your rationale for making that claim?
See the other thread.
TL;DR: Useful abstractions and a hell of a dependency solver.
How do you compare it with Pacman?
I was fighting RPM hell on Red Hat for the third or fourth time (upgrading Red Hat Linux 5 to 6, or perhaps 6 to 7) when I first installed Debian Potato on my daily driver. We had 20-ish servers, but the constant hunt for the right combo of RPMs made me distro-jump my own machine. A while later I was floored when I could `apt-get full-upgrade` to the next Debian version without RPM hell and almost everything just worked. Never installed another Red Hat machine and have been using Debian + KDE ever since. And 99.3% of all servers I maintain are now Debian. A few odd Ubuntu machines for $$reasons.
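For anyone who hasn’t done it, a rough sketch of what that release-to-release upgrade looks like on a current Debian box (the codenames are just an example; adjust for your releases):

```
# Point the sources at the next release (example: bullseye -> bookworm)
sudo sed -i 's/bullseye/bookworm/g' /etc/apt/sources.list
sudo apt-get update
sudo apt-get upgrade        # upgrade what can be upgraded in place
sudo apt-get dist-upgrade   # then allow installs/removals for the new release
                            # (the newer `apt` front-end calls this full-upgrade)
```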
Is rpm hell gone for good nowadays?
I think yum does a better job, but I never installed another Red Hat machine, so who knows. It’s been thousands of Debian machines over the years though. Luckily now it’s right click -> VM from template, or terraform apply, and not hours of swapping floppy disks ;)
Urgh, no, it’s not. Everything about it is super crusty if you go beyond simply installing packages and adding others’ PPAs IMO.
I could probably list more but I haven’t had to touch apt in a while, thankfully. But it is probably the #1 reason I avoid anything Debian-based. #2 is probably their Frankenstein sysvinit/systemd setup.
I do have to say that apt remove vs purge is pretty cool though.
What do you like about it?
That’s a problem of the package, not the package manager.
Generally this fits with Debian’s philosophy. But regardless I think it’s out-of-scope for why Apt is good. You could make a distro with Apt and not have your packages do this.
I’m not talking about `apt` the CLI tool, but the actual package manager. The plain `apt` tool is only designed to be a convenience wrapper for common workflows implemented in other tools.

As you correctly pointed out, Apt has the distinction between packages installed as a dependency (“auto installed”) versus packages installed directly (“manually installed”). This is precisely one of the reasons why I consider Apt the best package manager. (Yes, I know other package managers can do this, not all though.)

If you want to install a package as manual, then later mark it as auto, you can do that with `apt-mark`.
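For anyone unfamiliar, a minimal sketch of what that looks like in practice (`foo` is a placeholder package name):

```
sudo apt-get install foo   # recorded as manually installed
sudo apt-mark auto foo     # later: mark it as auto-installed (a dependency)
apt-mark showmanual        # list packages currently marked manual
sudo apt-get autoremove    # drops auto-installed packages nothing depends on
```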
Are you maintaining a PPA for others?
Frankly, I’ve never run into this problem.
`dh_make` helps you create a package that adheres to Debian policy, and there is good reason for Debian to have those policies. But if you’re just packaging something yourself, you don’t have to use it. It’s just a template for new packages.

At the end of the day, all you really need to create a deb is two files, `debian/control` and `debian/rules`. These are the equivalent of a PKGBUILD. The control file specifies all of the dependency metadata, and the `rules` file contains the install script.

The difference in packaging philosophy is that PKGBUILDs are external and they download the upstream sources. In Debian, on the other hand, they rehost the upstream package and add the `debian` directory. This means that building Debian packages is mostly hermetic: you don’t need access to the network.

Mostly that it makes super useful distinctions between concepts. But there are other goodies.
I also do appreciate that Debian pre-configures packages to work together with the same set of conventions out of the box. But again, that’s a property of the packages, not of Apt.
Sure, but the interface is probably just as important as the actual logic behind it, isn’t it?
Honestly I would consider that one of the fundamental things a package manager must do, I didn’t think it was a special thing haha
Yeah, I know. But if you want to manually install a package like that, you have to remember the extra step after it’s finished installing instead of before the install. It’s just unergonomic, for something that could be a flag (e.g. in `emerge -1`) and that I at least use fairly often.

Another problem with it being a two-step thing is that if you do it unconditionally in a script, it doesn’t retain the flag from before the previous installation command; you need a third step, i.e. checking if the package was installed before. My use case for this was installing dependencies for a package build which should be able to be removed again afterward, while not affecting the subset that were already installed explicitly.
Now that I think about it, it’s probably a good idea to always check if a package needs to be installed before installing it if you script it, though, because otherwise it might be unnecessarily reinstalled. Fair enough.
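A sketch of that scripted three-step workflow, assuming the check-before-install approach described above (the package names are placeholders):

```
#!/bin/sh
# Hypothetical build dependencies for the package being built
deps="libfoo-dev libbar-dev"

# Figure out which of them are *not* installed yet, so we only touch those
new=""
for p in $deps; do
    if ! dpkg-query -W -f='${Status}\n' "$p" 2>/dev/null | grep -q "install ok installed"; then
        new="$new $p"
    fi
done

if [ -n "$new" ]; then
    sudo apt-get install -y $new
    # Mark only the newly pulled-in packages as auto, so a later
    # `apt-get autoremove` can drop them without touching anything
    # that was already installed explicitly.
    sudo apt-mark auto $new
fi
```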
Yeah, I maintain some software/config/meta packages for the computers at the uni I study at. Before, I’m pretty sure the packages were manually packaged with every update and I wanted to automate it a bit and also make clear how to get from the source tarballs to the final build.
Ahh, the way it’s structured makes a lot more sense knowing that. Coming from packaging stuff for Arch, Gentoo and NixOS, where the packaging process is essentially the same for all three, with you usually supplying source download URLs, I had absolutely no idea how debian/rules would allow me to do anything and felt like I was missing a big thing. I guess it really is just a Makefile that you run directly, and that makes sense if you already have the sources in your tree?
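For reference, if the upstream build already behaves, the whole `debian/rules` file can be the stock debhelper boilerplate; a minimal sketch (it assumes the `dh` sequencer from debhelper, and the recipe line must be a literal tab):

```
cat > debian/rules <<'EOF'
#!/usr/bin/make -f
%:
	dh $@
EOF
chmod +x debian/rules

# With debian/control filled in, this builds the .deb from the sources
# already in the tree, no network access needed:
dpkg-buildpackage -us -uc
```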
This, at least version constraints, is another one I’d consider essential tbh. The rest are great though, I agree.
The logic is why I love Apt. It’s the most robust dependency resolution I’ve used.
But also, I don’t have any issues with the CLI. Having a distinction between `apt-get` and `apt-cache` and `apt-mark` doesn’t feel weird to me. You’re practically just separating the top-level subcommands by a dash instead of a space. The `apt` command is really just a convenience thing, and there are specialized tools for the more advanced things. Which is fine by me.

Also, the top-level `apt` command doesn’t guarantee a stable CLI, so for scripting you’re supposed to use `apt-get` and friends anyway.

You’d be surprised. Homebrew (the de facto standard package manager for macOS) doesn’t do this. Though, you can at least look up the “leaf” packages which are not dependencies of any other package.
And most language-specific package managers can’t do this, e.g. if you install software with `pip` or `cargo`.

If the package is in use, it shouldn’t be an orphan.
For example, what if you race with a cleanup job that is removing orphans? (Debian is hyper stable, so I often enable unattended upgrades with autoremove. I’m not so comfortable doing that on Arch ;)
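If anyone wants to replicate that setup: as far as I remember, the autoremove part is a single option for unattended-upgrades (double-check the option name against the comments in your `/etc/apt/apt.conf.d/50unattended-upgrades`; the drop-in file name below is arbitrary):

```
sudo apt-get install unattended-upgrades
# Let the unattended run also clean up no-longer-needed auto-installed packages
echo 'Unattended-Upgrade::Remove-Unused-Dependencies "true";' | \
    sudo tee /etc/apt/apt.conf.d/51-autoremove-unused
```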
What you’ve described is just an `apt-get install` when you start and an `apt-get remove` when you’re done. Or, more properly, setting it as a build dependency in your source package, to let Apt handle it.

But also, why uninstall build tools?
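Concretely, something like this (package names are placeholders):

```
sudo apt-get install --no-install-recommends debhelper libfoo-dev   # when you start
# ... build ...
sudo apt-get purge debhelper libfoo-dev                             # when you're done

# Or, with the Build-Depends declared in debian/control, let Apt do it
# (needs deb-src entries in your sources; 'sourcepkg' is a placeholder):
sudo apt-get build-dep sourcepkg
```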
Yeah, version constraints are common. But most other package managers bail with an error when they encounter a conflict. Apt is really good about solving conflicts and proposing solutions. Often it will propose multiple solutions to your conflict for you to choose from.
Again, it’s the solver part of Apt that makes it the best IMO.
Bit of a noob here, but what are the practical differences between Apt and the others? I use Fedora and the only difference I notice is that instead of typing apt update and apt upgrade, I just type dnf update.
Practical difference: Both dnf and apt are slow as hell. Pacman is flying compared to them.
In terms of practical differences to normal people, there aren’t many, and it pretty much comes down to the syntax of using them and the speed at which they work.
Personally I like the syntax of using dnf, even if it is kinda slow, especially compared to the likes of pacman.
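For the day-to-day stuff they map almost one-to-one; a rough cheat sheet (`foo` is a placeholder, and dnf refreshes its metadata automatically, so there’s no separate update step):

```
sudo apt update && sudo apt upgrade    # Debian/Ubuntu
sudo dnf upgrade                       # Fedora

sudo apt install foo                   # vs: sudo dnf install foo
apt search foo                         # vs: dnf search foo
apt show foo                           # vs: dnf info foo
```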
Arch on my PCs and Debian on my servers and VMs.
Fedora’s near-daily update and restart cycle is so annoying, especially when you have an encrypted hard drive. I know it’s part of the deal and I’m lazy, but all I’m using it for is a Jellyfin client.
What do you mean restart cycle? You only have to restart if you want to load the new kernel (there’s technically a way to avoid even that). If you don’t feel like installing a better tool for the job like Debian, just update less; most of your packages will still be newer than on most distros. Also not sure why you would encrypt if it’s just a Jellyfin client.
In the Software Manager, whenever there is an update you must press “Restart & Install” in order to update. I’ve never seen a restart not be required. Why would I not update, when I would potentially miss important security patches?
Also I typically encrypt during install for enhanced privacy. Probably overkill but yeah. I don’t really have a specific reason other than that.
My other system is Linux Mint 21.3 and restarts are very infrequent.
Ah, I’m not familiar with the software store; you don’t have to do that from the command line. And that’s true, I’m not suggesting you never update, just less often. Also, if there’s not much to steal on your computer, safety is a little less important. I would personally feel comfortable updating once per month, but that’s up to each user. I sat on Fedora 37 for way too long because Ubuntu made me afraid of major upgrades.
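For the command-line route, something like this is all it takes (a reboot is then only really needed when a new kernel comes in):

```
sudo dnf upgrade --refresh   # force a metadata refresh, then update everything
dnf needs-restarting -r      # (if the plugin is installed) check whether a reboot is actually needed
```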
Same. Albeit I’m on Manjaro, which suffers from the same issue. Distro hopping on an encrypted drive with no separate home partition is a huge pain in the butt.
deleted by creator
Yep. From an engineering perspective I prefer Debian distros. Ubuntu is a Debian distro. I said I would consider using Ubuntu in prod, and this is the reason.
deleted by creator
What can Debian’s apt do that Ubuntu’s can’t?
Nothing. They’re mostly the same thing.
The Ubuntu version will sometimes print “ads” to your terminal :P.
For a prod server, I’d choose Debian over Ubuntu if I didn’t have paid support, because I’m not a fan of Canonical. If I needed paid support, I’d choose Ubuntu, because Debian is strictly a community distro. (That community happens to include major companies, like Google.)
Last time I used latest Ubuntu:
Default scaling on login screen and desktop sucked. If I had vision problems it would be unusable.
Settings application crashed after trying to open half of the menus.
Despite the user interface looking like it’s made for tablets, the actual touch usability was horrible. I couldn’t even resize windows without being precise as fuck, and there was no window snapping despite it being a feature on Windows for more than a decade.
Couldn’t double-click a Windows program to run it in Wine, despite that being possible 10 years ago.
Reliance on snaps, even though installing software from 3rd-party sources is still horrible.
I was a longtime Debian/apt diehard but I’m coming down on the same side of late. My homelab runs Proxmox (Debian-based) with Ubuntu 22.04 LTS containers for more up-to-date packages, but my attempt to use KDE Neon (Ubuntu-based) for my desktop PC was a disaster. I’ve switched to Nobara (Fedora-based), and other than having to switch from Wayland back to X11 (because Wayland on NVIDIA breaks a bunch of things I need for work), it’s been relatively smooth sailing.