As adults experiencing a technological problem with some gadget, we've probably all heard the cliché to "just ask a ten-year-old" to fix it. It's true that kids are remarkable little machines when it comes to learning new things that might confound their parents. But that doesn't mean they really understand the gadget you're having trouble with.
The majority of kids today -- the so-called digital natives -- and adults too, for that matter, are citizens of the app economy. In other words, kids may know how to install and run an app, but they have little to no understanding of the device itself (phone, tablet, computer, etc.), what makes it tick, or anything about it that isn't made visible within the bounds of the device or the apps being used.
Starting in the late '70s and into the '80s, the new-fangled microcomputers were used mainly by hobbyists, like myself. We could tear them apart, reload operating systems, repair or replace parts that failed, and make them do almost anything we wanted, including experimenting in ways not envisioned by the manufacturer.

A World of Apps
The few users that existed -- people who used a computer as a tool to get things done, not as an end unto itself -- were generally pretty savvy in their own right. Although not experimenters or ones to enjoy pure tinkering, users could still do their jobs without asking for much help. They had to learn and figure things out for themselves; help was scarce and there was no Google to turn to.
Things started changing in the early '90s as computers made their way into many more offices and even a few homes. They were still very manual in that one had to understand a number of things not directly related to the task at hand in order to use them. For example, users were offered a set of tools, and they needed to understand how each tool worked and how its proper use contributed to the overall thing that needed to be accomplished.
But those users generally could not diagnose problems on their own or fix any kind of hardware problem like their '70s and '80s brethren could. In other words, they could operate the thing all day long with little help, even if they could actually fix or repair very little.
The Internet really started taking off in the mid-to-late '90s (I know, right? Has it been that long?) and that's what really kick-started the PC revolution. As more people bought computers and the operating systems became more polished and sophisticated, new users didn't become as well-versed and well-rounded in even using their computers.
I mean, it makes sense, right? Early adopters of an idea are generally its most ardent practitioners. People who come later do so because they recognize the utility but aren't usually interested in how it all works.
The automobile saw a similar evolutionary arc with its "users". Early adopters had to be skilled mechanics, prepared to perform roadside repairs at any time, and often carried a toolbox with them. Today, there's a fair number of people who could not even change a flat tire, let alone perform any other maintenance or repair. How many people do you know who check their tire pressure, let alone add air? Do you?
Mind you, this is not to say that people have become stupider. It's to say there's a sharp correlation between mass adoption and product or idea maturity: as a product or idea matures, the occasion for more people to delve into deeper understanding wanes. An unfortunate side effect is the reduced ability of a product's user base to use said product in any way not specifically envisioned by the manufacturer, or to fix it in any way whatsoever.
Computers and related devices are everywhere. Our personal and business lives depend on these computers and devices and the internet they access. Yet most of us are clueless about how it all works internally (even roughly) or how to get the most benefit from it.
Even today, in the mid-2020s, people continue to fall prey to malware and online scams; don't back up their data or even understand why it's necessary; don't understand the file system; don't really know what a browser is or what it does; have no clue what the various hardware bits of their devices are for (CPU, RAM, storage, PSU, LAN, USB, HDMI, etc.); and have no sense of security, passwords, phishing schemes, or data compromise. Yet being at least passingly familiar with all these things is critical to living in the 21st century.
"Appification" is a newish proto-word that generally refers to the growing trend of using premade "apps" to perform an ever-increasing number of tasks.
But I define appification differently: the phenomenon whereby people are unable to operate a device or solve a problem outside the narrow, specifically intended use of an app, and are lost when faced with any exception to the status quo.
Today's smartphones have only accelerated the appification of the populace, further reducing people's understanding of technology.
Apps prescribe not only what you do but how you do it, with little to no ad-hoc inventing or deviating by the user.
Apple's onetime tagline "There's an app for that" demonstrates that mindset even more -- that there's a (dedicated) app for every need.
This sets up an environment where we no longer understand general-purpose computing tools. Instead of having a toolbox full of multi-purpose, versatile tools that enable us to solve a problem, we have a steamer trunk full of highly specialized, single-tasker apps that do it for us. If there's no app to meet a need we might have then, well, that need just goes unmet.
In other words, by doing only what the app or device allows from the outset, we develop no curiosity or drive to push the boundaries or to learn how things work.
And make no mistake, this trend is promulgated and welcomed by Big Tech. The more that Big Tech can prescribe additional aspects of how users engage with their apps, the better.
This doesn't apply just to smartphones, although that is the genesis of the term. It could apply, for example, to someone incapable of using a printed map because they've only ever used a turn-by-turn GPS navigation app. Or someone who's never been to a library and wouldn't know how to find a book using the card catalog (which is mostly dead today anyway). Or someone who can't add air to a tire because they've always called AAA for help.
In my observation, especially among younger people, there is a declining competence in using a "regular" computer. That is, a laptop or desktop -- not a phone or tablet. Not all people, but a lot of them.
We're way past "peak PC", when regular computers were our main platform for doing stuff. Computer sales peaked around 2010 and have been on a slide ever since.
Many people today, again tending toward the younger, are less likely to even own a computer. Mobile phones are the lingua franca of this newer generation. It shows up in reduced ad-hoc savvy, reduced problem solving, and poorer performance in job interviews. Jobs, I might stress, that still require traditional computer savvy. Businesses as a rule don't conduct business or run their offices on mobile phones.
The decline in computer savvy that I've observed spans these important skills:
I suspect these are a big reason why people are eschewing regular computers today in favor of mobile devices. It's true the mobile app economy mostly doesn't require these skills. But by abandoning that knowledge, people are willfully turning over all aspects of their experiences -- and all control over them -- to money-first Big Tech firms.
Believe me, I'm not trying to be an elitist snob here. These are real concerns about the disconnect between people and their understanding of critical, ubiquitous technology -- concerns voiced by many experts on the matter.
There's a growing gap between casual digital fluency and deep technical understanding, with fewer people accidentally acquiring foundational computing skills.
Many younger people today have limited understanding of how the apps and devices they use actually work, and less experience building, modifying, or troubleshooting systems outside of guided environments. This bodes poorly for a future where society will need ever more designers and creators.
There are, of course, exceptions to the foregoing. There are brilliant kids today with STEM interest that will develop tomorrow's technology. My son is one of them, having earned a computing engineering degree and is a competent developer.
What can you do to help? Parents would do well to involve their kids thus...
There are probably 37 more things you could do to kick-start your kid's interest in how and why things work instead of them obsessing over optimizing their TikTok feed or grinding for skins in Fortnite.
Need help on how to proceed? Google or ChatGPT can give you some pointers.
Here are some well-written articles on the sad state of computer savviness today.
Marc Scott: Kids can't use computers
Dave Echols: Many People Can’t Use Computers