I read somewhere (I think the Deloitte tech survey from a few years ago) that many people have replaced their PC with a smartphone and use their phone as their primary tech device. It would be interesting to see whether any of these low-skill folks are actually high (or higher) on mobile skills.
From what I recall, the younger generations that exclusively use mobile devices in particular (though of course this is not limited to them) actually have terrible tech literacy across the board, mostly because they spend all of their time in apps that basically spoon-feed functionality inside a closed ecosystem. Notably, these groups are especially vulnerable to very basic scams and phishing attacks.
I work in tech at a credit union, and we’ve hit a weird full-circle point where the new folks entering the job market need a lot of training on using a computer for this reason. It’s been very bizarre to be back at a point where I have to explain things like how to right-click, because a lot of people have grown up only using phones and tablets.
I’m in IT. There was a time when I was sure the younger generations would eclipse my technical skills. I knew where I came from and what I was exposed to, and I assumed the younger generations would have everything I had, and even MORE technical exposure, given the continually falling cost of technology. For about a decade that was true; then it plateaued, and then, as you experienced, I saw the younger generations regressing in technology skills.
There was a time when I was actually worried about job security due to an overabundance of young people wanting to enter the field. Nowadays, not so much.
On the other hand, I’m now instead worried that younger generations might become even less able to understand the importance of digital rights if they don’t understand even the basics of the technology.
Think back to when we were kids. Remember that period when not everyone owned a computer? Or if they owned one, it wasn’t necessarily used much? There were “computer people,” who used computers daily for entertainment, tinkering, or socializing (once the consumer internet took off), and there were people who didn’t need or care about them outside of their workstation at the office.
Even after the Internet arrived, this dynamic persisted. You had the enthusiasts who really spent time on their computers and got good at using them, and you had people who simply owned them and checked email or browsed the Internet from time to time.
The enthusiast/non-enthusiast dynamic has always existed. There’s always a gap. It just takes different shapes.
Now, everyone owns a smartphone and uses it for everything. They’re critical to life, enthusiast or not. That’s the baseline now. The gap is entirely in skill and usage, not so much in hardware or time spent on it.
Before computers and the internet, no tech skill was needed to interact with our modern world.
After them, and for a few decades, the skill floor rose. You needed to learn technology to participate in the modern world.
Now technology has reached a point where the skill floor has dropped down to where it was before.
The mistake we made was thinking that our generation learned to use technology because we wanted to. It was incidental. Skill with technology comes from the desire to obtain it, not from simply using technology a lot.
We learned the technology to accomplish specific mundane goals, and along the way we learned the inner workings of the technology, which became applicable to the working world. Now, to accomplish those same rather mundane tasks, there is very little to learn, and very little ancillary learning benefit derived from doing them.
Yes, when you were learning those things, you had an outside viewpoint: you knew there were things like discrete math, electricity, magnetism, transistors, ones and zeroes, and so on, and that, level upon level, a machine had been built from them. You wouldn’t know exactly how, but you would understand how complex it is and how important it is to approach it with logic.
Now these people don’t think. At all. Ads yell at them with music, pictures, and colors; the computers they use are about poking screens with fingers; and it takes a lot of courage to abandon that context.
Like in that conformity experiment where a group is shown a white ball: some members have been instructed beforehand to call it black, and the rest start doing so because the others do.
I keep hearing this, but it’s perplexing. Students have been using laptops in school and college for a long time now, no matter how much time they spend on their phones.
What I’ve encountered in IT isn’t people who have no idea how to use a computer; it’s people who have very little idea how to use Windows as opposed to a Mac or, occasionally, a Chromebook. But even then, they usually still know Windows from needing to use it at some point in school. It’s the settings and other little things they struggle with, not the basics.
I have to explain things like how to right click because a lot of people have grown up only using phone/tablets
Or they come from iMacs or MacBooks, where right-clicking is less emphasized than it is on Windows.
Could be, could be. This is just anecdotal on my part, from helping people get up to speed who’ve told me they basically never used a computer growing up. Maybe they don’t count Chromebooks as part of that group, dunno.
Shit, I’m a pretty high-level electrical engineer, and even I loathe having to use Windows these days. I actually used to be team Microsoft, but I’ve been primarily on Linux for more than a decade now, and I vastly prefer macOS as an SSH client, which is mostly what I need a laptop for.
They’re also market-locked. If you have so little ability to function outside of an app, you become incredibly resistant to moving from one app to another unless it’s identical, and you’re incapable of using anything marginally more complex.
It also gives immense market control to the app stores, which have been allowed to exist mostly unregulated. Thankfully, that might be changing.
When everyone must be spoon-fed, that makes the only company selling the spoons insanely wealthy and powerful.
It’s also going to have a degrading effect on popular software over time. When the only financially viable thing is to make apps for the masses, you aren’t incentivized to make something extraordinary.
Compare Apple Music to iTunes, just on a software level. Just the sheer number of things you can do with iTunes, all the knobs and levers, all the abilities it grants a user willing to use it to its full potential. At some point it was no longer viable to create an excellent piece of software, because most people have no skills, patience, or desire to use it.
So you start making things that don’t empower the user; instead you make things that treat them like children, and your products get stupid.
Would be interesting to see if any of these low-level skill folks are actually high (or higher) on mobile skills.
“Mobile skills” are likely still lower-level skills. Tablets and phones are mostly content-consumption devices rather than content-creation ones (photos/videos excluded). Does anyone do serious software development on a phone? How often are mobile users writing papers or prose using only their touchscreen? How many people are doing complicated video edits on an iPhone? Can those tasks be done on a tablet/phone? Sure, but I don’t think it’s common.
The reason this is a problem is that it puts a barrier between deeper computer skills and the devices and environment people use daily. The reason many of us became computer-savvy on a desktop wasn’t because we wanted to; it’s because we had to, to get the game we wanted running or to write the paper we were required to. Once you’re familiar with the other uses of a computer, it’s only a very mild extension to write a script if the need arises. The only “new” or “foreign” part is the script, not the environment or interaction where you’re creating it.
With a tablet/phone as your primary device, it means learning not just scripting but all the skills necessary to use a computer. It’s a high barrier.
However, with their examples you don’t need to write a script; you can solve them that way, but you really don’t need to. They take some basic search-refinement skills (Outlook will even help you build the query, unlike, say, a Google search with refinement filters) and either a small spreadsheet or a calculator app to max out at their level 3.
Scripting this I would put at a level 4, but I’d be interested in where the paper’s authors would fit that, as it’s their research, and in what percentage of people would fall into that skill set.
I think scripting is certainly a level-4 activity, since to even get started on the problem you would need to navigate an IDE and have basic knowledge of a scripting language. Most people wouldn’t even know where to start.
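For a sense of what the scripting route looks like in practice, here is a minimal Python sketch of an email-percentage task of that general shape. To be clear, the task, sender addresses, and data are all invented for illustration; the paper’s actual exercise isn’t quoted in this thread. The point is only that even this much requires an environment and a language, where a search filter plus a calculator does not.

```python
# Hypothetical sketch of the "level 4" scripting approach.
# The inbox data and sender names are made up for illustration.
from datetime import date

# Stand-in inbox: (sender, date_sent) pairs.
inbox = [
    ("alice@example.com", date(2023, 5, 2)),
    ("bob@example.com",   date(2023, 5, 3)),
    ("alice@example.com", date(2023, 6, 1)),
    ("carol@example.com", date(2023, 6, 4)),
]

# The same refinement a search filter would do, then the same
# arithmetic a calculator or small spreadsheet would do.
from_alice = [d for s, d in inbox if s == "alice@example.com"]
share = len(from_alice) / len(inbox)
print(f"{share:.0%} of messages are from alice")  # 50% of messages are from alice
```

A spreadsheet user gets the same answer with a COUNTIF and a division; the script only wins once the mailbox is large or the question has to be answered repeatedly.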
I always wonder how it’s even possible. I can’t do half the shit I do on my computer on my phone. And even when I can, I need to spend 3x-4x more time because of how inefficient a touchscreen is.
I recently lived like that for a while, until a laptop replacement part arrived, and now I’m skeptical. I refuse to believe someone would go mobile-only willingly and long-term. Maybe if they can’t afford a computer, at most.
What do you think all those “hacker” scarecrow movies, alarmist articles, and laws were aimed at, and caused by?
Modern computers allow one person to do the monthly work of the Soviet Genplan on their home machine: in a day if they’re smart, in a month if they’re average. And the Soviet Genplan employed more than one person.
Together with the Internet, they make power over the masses a much less certain thing.
Except that if you poison both, you can not just neuter the effects but invert them.
We still have more and less powerful people in our world.
What I mean is that I don’t think it’s a coincidence that “user-friendly” computing, the bloating of the Web, and the rise of authoritarianism happened with the same intensity over the same ~20 years.
I am primarily mobile-only, but I’m also a Linux user on the desktop. I just don’t use the desktop very often, because it’s less convenient to sit down in front of a desktop or laptop than to just pull out your phone and check something. It’s more that the phone is the device I have: it’s always on, and I don’t have to go anywhere to get it. As I said, though, I’m high in both mobile and desktop skills: I run Linux and know how to use the command line, I flash custom ROMs on my phone, and I use primarily open-source software. I also submit bug reports to many open-source desktop and mobile applications.
Wow, this is bleak.
I doubt that they’d score any higher on mobile skills. Phones hide even more of the internal infrastructure than PCs do.