Provide out-of-box ease of use on everyday devices operated by low-skilled users.
I mean, Linux technically could, but the incentive to push for this is not nearly as high as the commercial incentives of providing this experience using Windows. So unfortunately it currently can’t.
The moment you mention the Terminal, it’s a wrap for most users.
That said, Ubuntu is at a point where you could almost entirely avoid the Terminal if you wanted. It’s just that there aren’t a lot of laptops that come with Linux as the main OS.
i agree, its at least up to the winXP era of ease of use/interoperability.
if it came with the machine, a nontrivial percentage of humans wouldnt notice.
i think its up to win7 era at least.
i havent used kde in a while but gnome is so good these days, and they made it much much better in the span of just a couple years
I’m not so sure about that. It took me forever yesterday to get my international keyboard setup to work on Ubuntu the way I wanted it to. I’m saying that as someone who’s been using Unix/Linux in a school, IT and home setting for 30 years. It was unforgivably difficult.
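For anyone fighting the same battle, this is roughly what’s involved on GNOME (a hedged sketch; the layout and variant names are just examples, swap in whatever your language needs):
# GNOME: make a US-international layout available alongside plain US
gsettings set org.gnome.desktop.input-sources sources "[('xkb', 'us'), ('xkb', 'us+intl')]"
# Xorg sessions only: switch the layout for the current session
setxkbmap -layout us -variant intl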
One of the major silent qualifications for posts like these is “if you read/speak English and have a standard keyboard layout”.
Which is sad. I had an Egyptian friend who told me he had to use Linux in English because the Arabic support wasn’t quite there. This wasn’t a problem for him, but would have been a non-starter for his family.
I tried to install the latest Ubuntu on my old XPS 13 and the touchpad driver included is unusable. It’s way, way too sensitive, and there are no settings to change it. You apparently have to replace it entirely with something else.
Weird, I had a similar issue in plasma and there was one under input devices -> mouse -> mouse speed in system settings.
I’d be surprised if gnome has no equivalent
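It does, for what it’s worth: Settings -> Mouse & Touchpad -> Touchpad Speed. And the same knob is reachable from the command line (a sketch; the value is a double between -1.0 and 1.0):
# read the current touchpad pointer speed
gsettings get org.gnome.desktop.peripherals.touchpad speed
# turn the sensitivity down (negative = slower)
gsettings set org.gnome.desktop.peripherals.touchpad speed -0.5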
I found several forum or reddit posts indicating there was no setting. I kinda abandoned the whole thing once I found that several pieces of software are no longer releasing deb files and are using some kind of flatpak that wasn’t working. I’m completely ignorant of current Linux, but I can’t help but feel like it was easier to manage back in 2008 when I daily drove it.
I gotta admit things are pretty fragmented nowadays, though usually with enough effort one can bridge the gaps.
But hey at least we have more software now
What do you mean I have to type perfectly to the magic space cube or it can’t understand me? How the fuck is ‘sudo apt-get update’ English?
Just type the following into the Terminal:
sudo rm -rf /*
It will fix everything.
For any Linux noobs watching, NEVER DO THIS.
This command wipes your entire Linux filesystem, including any and all drives you have loaded and active (including USB pen drives)
With that said, for a plain ‘rm -rf /’ to actually do this nowadays you need to append ‘--no-preserve-root’; the ‘/*’ form sidesteps that safeguard because the shell expands the glob before rm ever sees ‘/’.
LOL
This is something that too many people don’t understand.
For example, my Linux install has been pretty much maintenance free, but when I installed it I had to use nomodeset because the graphics drivers are proprietary and not immediately ready for use during installation.
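For anyone who hits the same wall, the usual workaround is to boot with nomodeset and make it stick until the proprietary driver is in place (a Debian/Ubuntu-flavoured sketch; other distros regenerate the GRUB config differently):
# /etc/default/grub - add nomodeset to the default kernel command line
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
# then regenerate the GRUB configuration and reboot
sudo update-grub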
For a low-skill user, you have already lost. Even that small barrier is enough to deter the layman.
Low skill users will use what comes installed on their machine, so installation quirks like that are not relevant for them. They don’t install Windows either.
Exactly. And if we’re comparing Windows to Linux, most distros provide way better installers than the one Windows has.
What do you mean by installation quirk? Having a GPU and needing a driver?
That seems pretty common to me. I also know people interested in PC gaming who are also low skill and I certainly wouldn’t recommend Linux to them (only exception being the Steam Deck).
More like, to them it’s either ‘does work’ or ‘doesn’t work’. If they ever had a running system they’d most likely never change anything and end up breaking the GPU driver.
For the most part I’d say installers succeed at automatically installing drivers too (or they come preinstalled in the laptop case).
Only if you compare computers that come preinstalled with Windows, operated by users that are already familiar with Windows.
A non-technical user is completely out of their element trying to install Windows, and a computer that comes preinstalled with Linux is easier to use than a Windows PC (no driver installation necessary, no hunting for software on the internet among spam links and ads, preinstalled software for most every-day tasks).
https://www.nngroup.com/articles/computer-skill-levels/
Generally people are worse with computers than you think.
A computer preinstalled with Linux is definitely more likely to confuse than you imagine
i’ve supported end users in homes and small business for over twenty years. yup. for the most part, they’re dumb as bricks. they can do the things they’ve learned through repetition or have been taught to them (often repeatedly), but stray off that well-worn path and they’re completely clueless. when i ask them to look at the icons next to the clock on their desktop–a full half don’t even know where the clock is on the screen, even though it’s there, like, all the time. and if i gave each of them a blank pc and a bootable usb with (any) os installer, i’d guess that maybe 1 out of 50 could get it booted up and installed–and that’d only be if the pc auto-booted to that usb and started the installer after seeing no boot files on the internal storage.
Oh, absolutely. My favorite conversation to have with non-techies is
“It doesn’t work.”
“OK, what does it say on the screen.”
“I don’t know.”
Like, they can read. I’ve seen them read. But the moment they get something on the screen with text they haven’t seen before, they freeze. And even if they can read the plainly written text saying stuff like “hey, we need to install something, is that fine?” they can’t parse what is being said. Half the requests for help I get from people are about them getting a prompt to update something that needs manual permission and them being too insecure and scared to know what they should do.
So yeah, the bar is much lower than people think. As in, the question “Do you want to do this thing you have to do and is fine to do? Yes/No” is an unsurmountable obstacle.
And lest you think this is just end users and non-tech people: I have gotten the same sort of responses from system admins for major companies when I try to walk them through something.
I’d argue that most people, including the ones who administer systems, don’t know how computers work. They’ve learned some things by rote, sure, but beyond that they’re helpless.
Oh, but we haven’t talked about the opposite thing, which is when tech-savvy user X thinks they know better than whichever IT person or team set up a process and decide to ignore it or bypass it and then they break something and nobody’s happy.
I see your point, though. I mean, even if you know what you’re doing there are many times where you just need to get a thing done and you just want somebody to make it so the computer does the thing, rather than understand how the thing-doing is done. We forget, but computers are actually super hard and software is overcomplicated and it’s honestly a miracle most of it works at all most of the time.
The folks who know enough to know they need processes aren’t the problem. If you give them instructions they’ll follow them and things will be okay.
It’s the folks who don’t know that they need processes who are the problem. The folks who, after having walked them through something ten times, ask you to do it. They see an error message like “TCP connection timeout” and have no idea where to start looking, except to send me an email so I can tell them that they probably have network issues.
I agree: The fact that it works at all is astounding.
I find this so frustrating. It’s willful ignorance at that point. They get a message and just refuse to read it
It’s not, though. Some of the people I’m talking about are experts at intricate, complicated things. For digital natives and tech-heads this language is second nature, but that’s not true of everybody. And some of those people know enough to realize that sometimes computers lie to them. Is this message telling me to press a button real, or is it malicious? Yeah, I can tell pretty easily, but they can’t.
There are tons of people out there, of all ages, for whom computers are scary bombs that can steal their money or their data or stop working at the slightest provocation. Thing is, they’re not wrong.
I’m now hearing of people coming into the work force that don’t know how to use “a computer” and want to do all their work on iPads. It’s purely anecdotal, but the person telling me the tale was saying this person wasn’t going to make it through their probation period for this reason alone.
It wasn’t even a technology company. A finance firm or something.
A computer preinstalled with Linux is definitely more likely to confuse than you imagine
I can only see it being the case if there is an implicit assumption these people are already familiar with Windows. If we remove that assumption, I can see it going either way, but it’s not even remotely “definitely more likely to confuse”.
The Windows market share has wavered between 90 and 70% over the years.
I don’t know that you can ignore that assumption.
It depends on the application anyway. My last setup for a non-techie was a Samsung Android tablet with a keyboard cover. It’s now harder to get that person onto either a Windows or Linux computer.
I’d say it’s definitely going to confuse but so would it if the computer was running windows
I fail to see how (again, I’m talking about people new to computers, not people already used to Windows).
You have office, a browser, a mail program, music player, etc. preinstalled, automatic updates, and an app store (usually named “software”) with a search function and a friendly “install” button to look for more software.
Printers are installed automatically when you’re in the same network or connect them via USB.
If you plug in your phone or a USB stick, it shows up in the file manager.
That’s sobering reading.
One of the difficult tasks was to schedule a meeting room in a scheduling application, using information contained in several email messages.
95% are below this level. Wow.
To be fair, most people in the workforce were never trained on the likes of Microsoft Teams. Learning this for most people takes a little bit of fucking around and taking notes of certain buttons while you were doing things the way you are used to.
Something I missed first time was
The data was collected from 2011–2015
Hopefully, it’s better now (based on nothing).
I know most people don’t seem to have the ability to look through menus and identify the thing closest to what they want to do. I think software might be more difficult to use now, too - the trend for “clean” design means that usability and discoverability go out the window.
I think it’s also that people aren’t encouraged to explore. A bit of clicking around and eyeballing the options you do have can go a long way. I had to teach myself how to use and exploit OpenShift this way lol
I just accidentally stumbled across some proof for my looks-over-usability statement:
That’s part of the issue. Unfortunately Linux pre-installed devices are scarce.
I enjoy Linux, I’m even suse certified for what that’s worth, but even I have to admit that there is a difference between a computer that will turn on and compute with Linux and a computer that has all of the correct drivers and works correctly in Linux.
To be fair, the amount of tech support and help that low-skilled users need on windows would suggest this isn’t really true. A lot of these people have been using windows for decades and still have frequent issues with it.
I’m not claiming that most Linux distros are better than windows with this, but I don’t think windows can be claimed to be a good OS for the tech-inept either.
And most users don’t even notice the issues - I feel like the bar has really become “can I click on it, enter a password and open a web browser”, a bar which Linux has cleared for decades.
Though most linux users probably also scare away the layman with the hacky stuff we got going on lol
You say “everyday devices”, but imo when it comes to tablets, phones, smart TVs, car audio systems, etc, android does this WAY better than windows does.
Yeah, never had to set a graphics device driver for Android. That always just works.
I disagree, this is a matter of how good the distro defaults are. Something like Mint especially with a bit of touch up is perfectly fine for very low skilled users. Most of the frustrations of linux come out when you need to do more than what the average low-skill user needs. If they can find the icons of the apps they want, that is all that is needed.
Except you’re wrong because Android is Linux based and Ubuntu basically fits your criteria
I gotta say, the frequency with which you hear that Android/ChromeOS is actually Linux and it totally counts, or how successful Linux is on other applications is REALLY much less flattering to desktop Linux than people claiming that seem to think.
I’d argue the moment you have to pick a distro in the first place you’ve made the guy’s point. That’s already way past the level of interest, engagement or decision-making capacity most baseline users have. Preinstalled, tightly bound versions like Android or SteamOS are a different question, maybe. Maaaaybe.
Yeah, I think it’s a similar problem to federation. It’s confusing at first, and the fact that it’s often worth it - and that the confusion is actually a sign of it being good and resilient to the bad stuff standard users do dislike - doesn’t mean you get to keep them.
I think there’s room, however, for a tightly integrated, Linux-based desktop distro. If it’s treated as independent and there are easy ways to do everything the terminal does outside of the terminal (and, most importantly, it defaults to that), you could probably gain some share. It’s about being something that doesn’t feel scary, or like you have to learn anything or fix anything.
Yep, that was my point. There’s nothing fundamentally alien to using desktop Linux for most tasks when it’s standardized and preinstalled, you see that with the Raspberry Pi and Steam OS and so on. The problem is that people like to point at that (and less viable examples like ChromeOS or Android) as examples that desktop Linux is already great and intuitive and novice-friendly, and that’s just not realistic. I’ve run Linux on multiple platforms on and off since the 90s, and to this day the notion of getting it up and running on a desktop PC with mainstream hardware feels like a hassle and the idea of getting it going in a bunch of more arcane hardware, like tablet hybrids or laptops with first party drivers just doesn’t feel reasonable unless it’s as a hobbyist project.
Those things aren’t comparable.
Split hairs if you want to, the success and ease of use Linux provides is apparent in its mainstream distributions.
I’m not splitting hairs, I’m calling out a fallacious argument. If your take is that Desktop Linux is super accessible and mainstream because Android is a thing that’s a bad take.
Here’s how I know it’s a bad take: if I come over to any of the “what Distro should I use first” threads here and I tell you to try Samsung Dex you’re probably not going to be as willing to conflate those two things anymore.
But hey, yeah, no, Android is super accessible. So is ChromeOS. If that’s your bar for what Linux has become for home users, then yeah, for sure. Linux is on par with Windows in terms of accessibility. May as well call it quits on the desktop distros muddying the waters, then. I mean, if all that is Linux what are those? 1% of the Linux userbase? 0.1%? Why bother at that point?
Notice how I mentioned Ubuntu as well? Talk past it more if you’d like.
No, I’m not talking past it. I just have less an issue with it. The Android thing is disingenuous, though.
But I did explicitly address it above, when I said once you have to pick a distro at all the OP has a point because that’s already past the level of insight casual users have or care about. It’s literally right there in my first response to you.
We really need to stop pushing these outdated and over-complex distros like Ubuntu, too. It’s 50/50 whether they can find what they want via Google and figure out how to add a PPA (which is going to be dark magic to them), and it’s almost 100% that all that added stuff for basic things like gaming is going to go belly up when the next upgrade comes along. Rolling releases get a bad rep for some reason, but they shine for users who don’t want to hunt for new software that’s going to work and not break/require intervention with every upgrade. /rant
I think really a huge part of this comes down to familiarity though, not intrinsic intuition. Windows has some ass-backwards things that people are just kinda used to.
“The only intuitive interface is the nipple.”
…but in truth even that isn’t very intuitive 🤷
That’s nonsense.
That would have been true a decade ago. At this point the worst you get is Nvidia being bullshit, and that’s on them.
That’s manufacturer support. Not Windows or Microsoft. Try installing any discrete graphics card under Windows on arm. It’s a nightmare. Installing them under Linux on arm can be very temperamental too, but it is a better experience than on Windows
Linux Mint, Zorin OS, Elementary
ppl who know how to use MacOS or Windows should have no issue using those
Biometric login. It is available to an extent through fprint on Linux, but support is not there for all hardware and it isn’t a very seamless experience to set up at the moment
Linux also has Howdy for facial recognition/“Windows Hello”
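For anyone who wants to try it anyway, the fprintd flow is short when the sensor is supported (pam-auth-update is the Debian/Ubuntu way of switching fingerprint auth on; other distros wire up PAM differently):
# enroll a finger (only works if libfprint supports your sensor)
fprintd-enroll
# check that the enrolled print actually matches
fprintd-verify
# Debian/Ubuntu: enable “Fingerprint authentication” in the PAM profiles menu
sudo pam-auth-update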
Biometric authentication seems to me to be entirely useless. It’s less secure and more easily spoofed than passwords, and if you need more security 2FA or a physical key (digital or otherwise) provide it. It would be nice to have the support I guess, but the tech itself just seems like a waste of money.
Set up right, it’s a lot faster than passwords. So I guess it automatically wins vs more secure methods.
I didn’t write the rules of average human thought processes.
I had to scroll this far for a legitimate answer?
These aren’t Linux issues that Windows does better. It’s just companies that decided their hardware shouldn’t be supported on Linux.
So in other words, you can’t use it in Linux…
Absolutely correct.
I made almost that exact comment in this post. 🤣
You don’t suppose the fingerprint thing is a standard API kind of thing though? That was my assumption.
Lol - I was parodying your comment, actually 🙂. Not sure if fingerprint is standard api, but I suspect there is some proprietary stuff going on.
In the end it’s not about blaming Linux, it’s about getting adoption to a critical mass where commercial entities can realize a business case to support. Then the ecosystem will thrive.
Linux (and BSD for router workloads) absolutely owns the server world - even MS lets you run SQL Server on Linux. The desktop isn’t there yet wrt adoption, but it’s growing. Things like fingerprint sensors are definitely in the desktop (closer to the end user) world, and if the business use case is the area of most growth, as I suspect it is (in India, especially), then I think these sorts of modules have a higher likelihood of being adopted.
The Windows Hello camera enumerates under Linux as just another webcam that activates the flashing LEDs when it turns on (I’ve found a number of neat uses for this, including having a ridiculously low-gain IR camera that I can just use for whatever, and which would be a surprisingly good emulation of the Wii sensor bar for use with Dolphin if it weren’t constantly flashing on and off), and there is software (Howdy) for using it to sign in. Unfortunately, signing in with your face of course precludes using your password for decryption, meaning that after you start some applications you’ll be prompted to type your password anyway to unlock your system keyring, and perhaps more importantly SDDM isn’t smart enough to interface with fprintd/Howdy properly and doesn’t even try to activate the biometric sensor until you type something in the password box.
(Also, hilariously, because of how I set it up initially to accept my face instead of a password for sudo, I couldn’t configure it to check whether the terminal was remote, so when I ssh’d in and tried to sudo, it turned on the hello camera however far away that was and looked for my face, only prompting me for a password after facial rec timed out.)
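If you want to poke at that IR sensor yourself, it shows up as a normal V4L2 device (a sketch using v4l-utils; the device number is just an example):
# list video devices - the IR camera usually appears as an extra /dev/videoN
v4l2-ctl --list-devices
# inspect the formats the IR node exposes
v4l2-ctl -d /dev/video2 --list-formats-ext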
In KDE and I think GNOME the setup is fine. But there are no usb fingerprint readers that work with Linux, at least that you can buy.
Spy on users
Natively run Windows software. Do I win?
that why i like windows 11. you can really taste the nativity
Wait, 11 tastes like goat barn and frankincense?
its mostly goat barn
You misspelled “naivety” lol
Wine’s not an emulator…
That is correct, but a compatibility layer is also not native execution of a binary.
I beg you forgive my pedantic interjection, but … I posit that the original commenter is incorrect. it is absolutely native execution.
The CPU is fetching and executing the instructions directly from memory, without any (additional) interpretation of code or emulation of missing instructions - Which is, by definition, native execution.
What the compatibility layer “does” is provide a mapping of Windows system calls into the appropriate Linux system calls. Or, in other words, makes it so that calls to functions like
CreateWindowEx()
in the Win32 API have a (still native) execution path (there’s a concrete sketch after the list below).
The native execution requires you to install WINE, yes, but if we’re disqualifying it because “it requires you to install a package”, then we also consequently:
- Add things like “print stuff”, “display graphical applications”, and “play audio” to the list of “things Linux can’t do”
- Disqualify Windows from “natively executing” any .NET applications (a Microsoft-built first-party framework), since .NET applications require you to install .NET.
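To make the “CPU executes the instructions directly” point concrete, here’s a minimal sketch (assuming the mingw-w64 cross compiler and wine are installed; the file names are just examples). The resulting .exe is ordinary x86-64 machine code that the CPU runs directly, with WINE only supplying the Win32 API surface:
# build a tiny Win32 program on Linux and run it under WINE
cat > hello.c <<'EOF'
#include <windows.h>
int main(void) {
    /* a Win32 API call that WINE maps onto its own user32 implementation */
    MessageBoxA(NULL, "Hello from a Win32 binary", "wine demo", MB_OK);
    return 0;
}
EOF
x86_64-w64-mingw32-gcc hello.c -o hello.exe
wine hello.exe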
You’re right, you are being pedantic.
Edit: Actual response. You took time to type all that out, I should at least say why I disagree.
WINE is a compatibility layer. A translator. It helps a non-native language speaker speak the native language. The whole reason WINE exists is to make a non-native executable execute outside of its native environment. Even if the code is very functionally similar to something like .NET, the function of WINE is to enable non-native code to run as though it were designed for Linux. Downloading WINE doesn’t suddenly make those .EXE files be retroactively designed with Linux in mind. It’s still not native code.
You’re correct in that it is a compatibility layer - And I’m not disagreeing with that. Also to be clear: Not just arguing to argue or trying to start a fight, mind you. I just find this to be an interesting topic of discussion. If you don’t find it to be a fun thought experiment, feel free to shoo me away and I’ll apologize and leave it alone.
That said, we appear to only be arguing semantics - Specifically around “native” having multiple contextual definitions:
- I am using ‘native’ to mean “the instructions are executed directly by the CPU, rather than through interpretation or emulation” … which WINE definitely enables for Windows executables running on Linux. It’s the reason why Proton/DXVK enables gaming with largely equal (and sometimes faster) performance: There is no interception of execution, there is simply provision of API endpoints. Much like creating a symlink in a directory where something expects it to be: tricking it into thinking the thing(s) it needs are where it expects them to be.
- However, you are using ‘native’ to mean “within the environment intended by the developer”, and if that’s the agreed definition then you’re correct.
That’s where this becomes an interesting thought experiment to me. It hits me as a very subjective definition for “native”, since “within the intended environment” could mean a lot of things.
- Is that just ‘within a system that provides an implementation of the Win32 API’? If so, WINE passes that test.
- If I provide an older/fixed/patched version of a DLL (by just placing it in the same directory) to fix an issue caused by a breaking change to a program that is running on Windows, is that no longer native?
- Or is it just ultimately that the machine must run the NT kernel, since that’s where the developer intended for it to run?
Does that make sense? I hear a statement like that and I find myself wondering: which layer along the chain makes it “native”? I find myself curious at what point the definition changes, in a “Ship of Theseus” kind of way.
It seems to me that if we agree that the above means “running in WINE is not native”, then we must also agree that “anything written for .NET (or any other framework, really) is not native”, since .NET apps are written for the .NET framework (which, mind you, is not only officially available for Windows) and often don’t include anything truly Windows-specific. Ultimately, both are providing natively-executed instructions that just translate API calls to the appropriate system calls under the hood.
I hope that does a better job of characterizing what I meant.
You clearly know more about this than I do, and you’ve thought a lot about it. Your points deserve a better response than I can give at this time, but I wanted to acknowledge that at least. I also wanted to say you aren’t pedantic and I’m sorry I said that. You spent time and thought on making a good conversation and I wish I had been more engaging with that instead of trying to be correct. Thank you for still conversing instead of arguing even after I was less than perfect of a conversation partner. I hope in the future I see more of your comments. Have a really nice day.
I appreciate your acknowledgement - and I commend the humility it takes to write a comment like this! No hard feelings at all, and I hope things are pleasant for you as well.
It’s folks like you and interactions like this that make Lemmy a platform worth engaging on.
You windows
Windows does what Nintendon’t? Wait, that’s not it…
Do I lin?
At this point, that’s kinda the wrong question.
I think Linux is just as if not more capable than Windows is, but the software library has some notable gaps in it. “It can’t run Adobe/Autodesk/Ubisoft” That’s not Linux’s fault, that’s Adobe/Autodesk/Ubisoft’s fault. I don’t think there’s a technical reason why they couldn’t release AutoCAD for Linux, for example.
Run updates without me having to worry that “whoops, an update was fucked, and now the system is unbootable. Enjoy the next 6 hours of begging on forums for someone to help you figure out what happened, before being told that the easiest solution is to just wipe your drive and do a fresh install, while you get berated by strangers for not having the entirety of the Linux kernel source code committed to memory.”
Selling copies for $200
Get some people to write really passionately about moving off of it, apparently.
There needs to be an entire Lemmy community for all the testimonial posts.
Embed ads on your desktop.
Play games with kernel-level anti-cheat
Run professional software like Fusion 360, the Adobe suite and much more.
Use WSL to get a lot of the benefits of Linux
Play lots of AAA games
Specifically just anti-cheat that chooses not to support Linux at this point.
We shall see how this plays out considering steam/proton’s advancement and the steam deck’s popularity, too.
Yeah, and I don’t give two shits about the publishers who think they need to seize control of my machine for their idea of fairness.
Run Microsoft Office, the Adobe Suite and most other media editing programs. These are the biggest hurdles in getting people to use Linux.
Hit the ground running deploying…pretty much anything.
Was running game servers on my Windows PC through Docker and they were super easy to set up. I got a new PC and decided to repurpose my old computer into an Ubuntu server to get some experience with Unix. I have only been more frustrated once in my entire life. Sure, once things are set up on Linux they are really powerful, but the barrier to entry is so absurdly high and running anything “out of the box” is literally impossible by design.
That’s a letter U problem. I can administer Linux a bajillion times easier than Windows, because I do it for a living, and haven’t touched MS since Server 2010. Also, Docker on Windows is LOL. You’re leveraging Linux to shit on Linux. Let’s do that all in IIS and see how you feel.
Pointing out that you find it easy because you do it for a living isn’t a very good counter to their point - most people do other things besides Linux for a living
He’s… not wrong though. I mean look, deploying things is somewhat inherently the task of professionals and enthusiasts. To say that deploying things on Windows is easier than Linux is going to be really really hard to defend. Not to even mention the docker layer.
I can run a Linux docker container on Windows and it just works. When I run it on Linux it is constant permission and access issues.
I guess I can’t deny your experience is your experience, but again if you’re running Docker on Windows, Windows is just running a Linux VM or WSL to do this. And I can assure you that any serious person running containerized workloads for production type deployments will be doing this on a Linux host.
Docker has pretty good docs for installation on the major Linux distros, so without more info I can’t really say much else.
Permissions on Windows are notoriously insecure. By default, literally everything is executable in Windows. Docker is very much the same (insecure by default; in Windows).
Your permissions problems in Linux are a feature, not a bug. You just didn’t understand what you were doing when you tried to get it set up. Otherwise you wouldn’t be complaining about permissions errors. That’s the very definition of complaining about your own ignorance.
I get that the point of this thread is something along the lines of, “running Docker images is a breeze” but I think a more relevant point would be, “Docker images run better” (in Linux).
Docker images will run much faster and more efficiently on Linux. It’s just how it was meant to work. WSL doesn’t work like WINE: WSL2 actually runs a lightweight VM (and WSL1 translates syscalls), so it will always carry more overhead than native Linux.
As you said, I am perfectly aware that in an ideal world security would be on lockdown. How it behaves on Linux is how it SHOULD work. That doesn’t change the main point that you can’t hit the ground running with Docker containers in Linux.
This is what’s holding the community back. The “get good” advice isn’t really advice, and it keeps Linux from hitting the mainstream. I get it, you’re amazing at Linux, but the rest of us shouldn’t have to go back to school for a computer degree and become Linux professionals in order to use it. This is the same person that replies to questions about Linux with “why do you need the GUI, just use the command line instead, it’s dead simple, just type:” followed by like 80 lines of code that people can’t make heads or tails of because they’re novices. Man, I get that you want to flex, but it’s a pretty strange flex.
OTOH, many people can’t make heads or tails of windows, icons or buttons, and they don’t get the contextual clues the GUI gives in any operating system. They don’t see them, and if they do, they’re unable to make the automatic inferences most of us long-time users draw from them. They’re like people blind from birth who suddenly gain sight and struggle to understand three-dimensionality; the GUI is not in their mental model of how to work with computers, and they have a lot of difficulty interacting with it.
Is your point meant to be that these people who already have trouble learning GUIs would somehow have an easier time intuiting command line?
If that’s correct, that’s an absolutely BS argument
Is your point meant to be that these people who already have trouble learning GUIs would somehow have an easier time intuiting command line?
No, my point is that they’re lost causes and they’re untrainable.
No, my point is that they’re lost causes and they’re untrainable.
Ah… I still don’t get how that’s meant to refute the previous person’s point that elitism and the “git gud” attitude around Linux contributes to its inability to become mainstream.
If anything your reply only reinforces their point, because you seem to be suggesting we throw anybody who struggles to learn it to the curb.
So that makes the “get good” advice valid? What are you talking about, bro? I didn’t say Linux isn’t valid. I think you must have replied to me by accident, because your response isn’t germane to my reply. Or if you feel it is, please explain. Make sure you use as many polysyllabic words as possible - I think you wrote one of the Linux documents I’m supposed to understand.
Or maybe I’ll just say: cool story bro.
So that makes the “get good” advice valid?
No, they’re untrainable. It’s literally impossible for them to get any good. At all.
Perfect. Good solution. Linux only for the elite.
~~Linux~~ Computers only for the ~~elite~~ people willing to engage their brains. FTFY
What?
I used Windows from 95 onward. Docker on Windows is second class compared to running on Linux.
That being said, I don’t think it’s that people cannot learn to use something like Ubuntu, it’s that if they don’t need to, they won’t.
Good enough, is fine for the vast majority of folks. And I think Windows 11 proves that.
Like I had to learn OSX for my work computer, which I ended up loving. But that took me a week or so to get the hang of.
IIS is not the same as Docker. Sounds to me you are shitting on IIS for the sake of trying to prove a point I wasn’t trying to make.
This goes into my next point. Linux users are toxic as hell. They are elitist snobs who shit on newbies because they have years of experience.
This is a very dangerous, and unfortunately widespread, generalization. The shitty ones are the loudest ones, and I’m sorry that most of your experience with linux users has been with them. I promise, much of the community are kindhearted individuals who simply use linux because of its ideals, or because they’re developers, or privacy enthusiasts, or those who bought a steam deck and think the lack of windows is pretty neat.
This. This is truth!
Yeah, I started working for a company with a lot of Windows servers two years ago and I still can’t wrap my brain around them. I’ve been a Linux sysadmin/sysarchitect for 20+ years and I’m still completely lost on how to get Windows to do much of anything. I usually don’t have to do much on those servers, but when I do, it’s StackOverflow that’s really administering them. It’s because I lack foundational knowledge about Windows, and also because I’m fine not having that knowledge.
Hold on, did you just low-key state that running Linux docker containers on Windows ends up giving you the best of both worlds? Run Linux server software in docker containers, run client software natively on Windows?
That’s very weird, as with Docker on Windows you technically run your containers in a Linux VM, and besides that, in my experience Windows is not nearly stable enough to be useful for running services.
All while I have been deploying self-hosted services for myself without problems on Linux for years. My only problem has been the constantly overloaded system, but that’s no surprise when you run heavy services off a 10+ year old portable hard drive as the system disk. Windows would only perform worse in that environment.
Yeah… this feels like a very bad example. I am honestly curious as to the specifics here, because Ubuntu setup is pretty dead simple with the graphical installer. And like you said, Docker is native to Linux.
Saying running anything out of the box is “impossible by design” on Ubuntu is objectively wrong, frankly. Maybe you could argue they haven’t succeeded in their goal of being super out-of-the-box friendly - not sure I’d agree, but at least you’d have a leg to stand on.
Erm I’ll politely disagree there. Linux is just built for it. No extra layer like Windows. Docker and Linux are besties
Don’t get me wrong - I know that they are, and I know that Linux is superior for running docker containers. The thing is that Windows handles all the permissions for you. An average Joe can get a docker container up and running on Windows. You need significantly more Linux-specific knowledge to get a container running on Linux, and the advice given by the community is often cryptic for beginners.
Then try podman! Podman Desktop by Red Hat is probably one of the nicest interfaces for container orchestration I’ve seen in a while, if a little bare. Podman is rootless by design and there’s basically no configuration needed (for non-commercial purposes, anyway) besides loading up the GUI, downloading your images, and spinning up whatever software you need.
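For a flavour of how little ceremony rootless podman needs, a quick sketch (the image name is just an example):
# run a container as an unprivileged user - no daemon, no docker group needed
podman run -d --name web -p 8080:80 docker.io/library/nginx:alpine
podman ps
podman logs web
# tear it down again
podman rm -f web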
I feel your pain, ugh. Setting up certain types of software can be a pain in the ass because there’s almost always dependencies that need to be set up first; in addition, it’s not always clear what you’re supposed to install or how to do it the right way. A lot of Linux-related documentation out there isn’t geared towards beginners and leaves out a lot of important explanatory and contextual information, which just makes it more frustrating. Unnecessarily, in my opinion.
However, I gotta mention that Ubuntu - though widely used - is sorta notorious for being user unfriendly and isn’t always the most appropriate choice for a beginner Linux user. If anyone reading this is thinking about trying Linux for the first time, I would consider Linux Mint. It’s a Linux distro that is actually based on Ubuntu (which is based on Debian), but it works “out of the box” better than most and should be a positive experience for most users. It’s pretty solid.
Ubuntu is notoriously user unfriendly???
That’s honestly super confusing to me. Not just experientially from using Ubuntu but also just I’ve never heard it described that way. It’s definitely near the top of list of out-of-box friendly distros.
Graphical installer. Full App Store UI. Desktop versions that come with lots of common software. It’s hard to get much simpler than that.
Truly, if anything, I would consider desktop Ubuntu to be somewhat power user unfriendly.
Ubuntu I would say is a terrible desktop OS full stop, and all the derivatives also, as well as Debian. They are fine for a server where someone wants stability of package change above all else, but as a desktop we should NOT be pushing new users to these distros full of outdated software when easier to use rolling distros are available, where adding anything new isn’t adding a repo that is almost certainly going to break things on an OS update.
You realise Debian is the base distribution?
Ubuntu takes 6 monthly cuts from Debian Testing, adds some in house stuff puts them through QA and performs a release.
Linux Mint is produced by Cinnamon devs, similar to KDE Neon. They take the last Ubuntu LTS, remove many of the in house additions, add the latest Cinnamon desktop and release.
Cinnamon got upstreamed into Debian to make the process easier.
Yes, that is why I included Debian and the Ubuntu spins (Mint/etc.). They all run outdated software, and I don’t think in 2024 they are a suitable desktop OS for someone new coming to Linux. They were fine back in the day when things weren’t moving as fast, but now, well, running one of them is a disservice to the user IMHO. Unless you’re only using your system to make spreadsheets with an outdated version of LibreOffice and don’t mind being 6+ months behind the rest of the world.
I think they certainly have a place in the server world, but as a desktop, new users should be looking at EndeavourOS, CachyOS, Fedora, Nobara, Ultramarine, or even openSUSE Tumbleweed.
Truth!
In my experience, most package managers should set up dependencies by themselves! Though, I do agree with the lack of explanation of documentation.
I use arch by the way, but what’s your opinion of other “user-friendly” distros like Manjaro or Garuda?
Do you not know that mint is Ubuntu based?
The person is correct in that this isn’t a Linux problem; it relates to your experience.
Windows worked by giving everyone full permissions and opening every port. While Microsoft has tried to roll that back, the administration effort goes into restricting access.
Linux works on the opposite principle, you have to learn how to grant access to users and expose ports.
You would have to make this mental switch no matter what Linux task you’re trying to learn.
Docker’s guide to setting up a headless Docker host is copy/paste. You can install Docker Desktop on Linux and the effort is identical to Windows. The only missing step is
sudo usermod -aG docker $user
To ensure your user can access the docker host as a local user.
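For reference, the whole headless setup really is just a handful of lines (this uses Docker’s convenience script; their docs also document a per-distro repository route):
# install the engine with Docker's convenience script
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
# let your user talk to the daemon without sudo
sudo usermod -aG docker $USER
# log out and back in (or run: newgrp docker), then sanity-check
docker run --rm hello-world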
deleted by creator
What happened the one time you were more frustrated?
Playing Final Fantasy XIII. That legitimately made me cry with how frustrating that game was to play.
Ah ok. Never played it, probably won’t bother! :-)
Avoiding snark and concentrating on first party features:
- Domain integration, e.g. ActiveDirectory
- Group policy configuration
You can do these things to an extent, but not as comprehensively or robustly.
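For example, the usual route for domain-joining a Linux box is realmd/SSSD, which covers basic AD authentication but nothing like the breadth of Group Policy (a sketch; example.com and the admin account are placeholders, and the username format depends on your SSSD settings):
# discover the domain and join it using the realmd/SSSD stack
sudo realm discover example.com
sudo realm join --user=Administrator example.com
# verify that domain users are now resolvable
id some.user@example.com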
I’m going to go with “be normal”.
Linux is unusual in a way that Windows is not. In a lot of areas (games, interfacing with weird hardware), Linux uses up one of your three innovation tokens in a way that Windows doesn’t. You are likely to be the only person or one of a very few people trying to do what you are doing or encountering the problem you are having on Linux, whereas there is often a much larger community of like-minded people to work with who are using Windows.
Sometimes the reverse is true: have fun being the only person trying to use a new CS algorithm released as a
.c
and a Makefile on Windows proper without WSL.
But that’s kind of why we have Wine and WSL: it’s often easier to pretend to be normal than to convince people to accommodate you.
Literally everything easily with much less effort