The Deceptive Nature of the “Digital Native”

This is, for the most part, an indirect response to IBM’s leaked internal emails, in which “dinobabies” – employees above the age of 40 – were considered “a threat” to the company’s economic future, yet it applies to any situation in which the phrase “digital native” is brought up.

It sounds like a buzzword, it is a buzzword, and those attempting to define it often do not even know what they are trying to describe. Although it is easy to end any debate by saying “buzzword” and calling it a day, that has not stopped tech companies and think tanks from throwing the idea around to make older people working in tech feel worthless or “too stupid” for current tech hypes. And since very few have made the effort to at least summarize some of the many layers of deception this phrase carries, a somewhat deeper analysis of its current usages for quick reference is long overdue – even the respective Wikipedia entry is quite chaotic.

Etymological issues

The phrase “digital native” was first used by John Perry Barlow during an interview for Australian Personal Computer in 1995:

“I would say that, generally speaking, at this stage, if you're over 25, you're an immigrant. If you're under 25 you're closer to being a native, in terms of understanding what it is and having a real basic sense of it.”

He would later repeat this description in his manifesto “A Declaration of the Independence of Cyberspace”, treating government officials as the contemporary equivalent of “boomers”. One thing to keep in mind is that this manifesto was written in 1996 by someone born in 1947. The “baby boomer” generation is often considered to include people born between 1946 and 1964, which means Barlow was in his late 40s when he complained about bureaucrats knowing nothing about the rising technology of the 90s. If you are currently above the age of 30, you are more likely to remember floppy disks and clunky monitors that easily strained your eyes; maybe you even remember modems and how one had to rely on services like America Online (AOL) to temporarily access the World Wide Web, because data was very limited and thus quite costly.

Yet the political references did not carry over into its most common definition. The phrase was popularized by education consultant Marc Prensky in 2001. Prensky claimed that students had changed radically, calling it a “singularity”:

Today’s students – K through college – represent the first generations to grow up with this new technology. They have spent their entire lives surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age. Today’s average college grads have spent less than 5,000 hours of their lives reading, but over 10,000 hours playing video games (not to mention 20,000 hours watching TV). Computer games, email, the Internet, cell phones and instant messaging are integral parts of their lives.

There is not a single reference indicating where Prensky got his numbers from. A manual search with any preferred search engine leads to a variety of articles and studies citing Prensky, or to others that merely reference his piece. It is therefore very likely that he made those numbers up.

That would not be all too surprising, given how the same article introduces him:

Marc Prensky is an internationally acclaimed thought leader, speaker, writer, consultant, and game designer in the critical areas of education and learning. He is the author of Digital Game-Based Learning (McGraw-Hill, 2001), founder and CEO of Games2train, a game-based learning company, and founder of The Digital Multiplier, an organization dedicated to eliminating the digital divide in learning worldwide.

(I do not really have to explicitly say what I think of “thought leaders”, do I?)

Researchers at Georgia Tech, among others, debunked the myth that sheer exposure to digital technology fundamentally changes the way people process information, noting that “only 30 percent of the world’s youth population between the ages of 15 and 24 years old have been active online for at least five years”. In other words, during the phrase’s initial popularization the majority of young people were not even exposed to most digital tech, and even those whose environment included tech were likely not allowed to use it as they pleased (at least I was not allowed to do much with my grandparents’ PC running both Windows 95 and 98; I was not granted internet access until I turned 12 and it became a requirement for some schoolwork). The phrase “digital native” was not openly treated as a marketing phrase until 2014, when David Kipke wrote:

When I started at Adknowledge a few months ago, I have to admit that I knew next-to-nothing about digital marketing. To me at the time, online marketers were able to track you with cookies whenever you’re online, or something like that, and this was why display ads for the retro Kansas City Chiefs jersey I was coveting/stalking on eBay would show up on my Facebook page. As a Millennial, I did not realize that my exposure to the digital marketing industry was going to present me with a fascinatingly new perspective on my own generation.

This reveals three things:

  1. The phrase was first used in 1995 in reference to dumb – but not necessarily old – politicians,
  2. it did not gain significant traction until it was picked up by a self-proclaimed “thought leader” in 2001,
  3. and it would not become one of the most widely used buzzwords until giants like Facebook and Amazon had already reduced the popular perception of the WWW to nothing but a handful of social media sites owned by Meta plus online shopping on Amazon.

Current usage(s) and how none matches

Indeed, with increased usage, several new definitions and understandings have emerged. Some have attempted to divide “digital natives” into three somewhat distinct categories – “avoiders, minimalists, and enthusiastic participants” – but that is really it. The common belief behind the phrase remains unchanged: since the 90s, the younger people of any given period are always the digital natives, always fundamentally different from their preceding generation(s), even though the tech they were exposed to as kids is now largely considered “old and outdated”.

I myself grew up with CDs, MP3 players, flip phones and very limited access to the internet – though still enough to eventually fall for the (among Germans well-known) “Jamba Sparabo” and to once cause my parents’ phone bill to exceed €200, just because I often got bored in Latin class and checked Twitter instead of translating texts about Romans and their geese. Within my grade at grammar school, I was the first to even learn about Twitter in 2010, shortly after my parents finally decided to tell our landline provider to enable internet access. Some knew about Facebook, yet would not start using it regularly until a year or more later, when “schülerVZ” – our ripoff of Facebook that was exclusive to students younger than 21 – was no longer the social media site to be active on. I was likely also the last of the younger ones to briefly use the old MySpace, and I still remember the “good ol’ days” when YouTube users were able to fully customize their profiles. Twitter, especially, was something of a special case, as many vocal internet users considered it “too difficult to figure out” in its early years (I still do not know what the “difficult aspects” really were; I genuinely appreciated the variety of third-party clients and the lack of drama owed to the stricter character and tweet limits).

By today’s standards, I would barely fit the definition of “digital native”. Wikipedia, for instance, proposes one possible contemporary definition:

  1. They feel familiar with digital devices. 54% of them have a smartphone as a first personal mobile phone. These devices are used for entertainment and as a requirement in educational endeavors.
  2. They tend to be individualistic.
  3. They are able to multitask or focus on a single medium when needed.
  4. They are realistic. They are usually raised in an affluent environment, but due to the prolonged economic recession and the Fourth Industrial Revolution, they think their future is not clear. This kind of thinking makes them focus more on their reality.
Going through that list point by point:

  1. I did not receive my first smartphone until I was considered old enough to access the internet from my grandparents’ computer, and I solely used it to send SMS, to share some songs and silly videos via Bluetooth, and later to drive up my parents’ phone bill by occasionally tweeting during class. Those tiny screens with their poor touchscreens were never adequate for any kind of schoolwork, and phones have been prohibited in class at my grammar school since 2012 (at least according to the public “house rules”).
  2. Another buzzword that deserves its own post.
  3. That makes no sense.
  4. This mentality is not exclusive to “young people”, and “reality”, in this case, depends heavily on whom you ask. For many “digital natives”, the economic recession of 2008 had little to no influence; hell, even I did not know what a recession was, let alone that one was occurring – I was a 5th grader at the time.

Going by Prensky’s definition, I would be considered a “digital native” by a tiny margin, having missed out on floppy disks and “the internet of the 90s”. The contemporary definition quoted above partially excludes me as well, as I would not consider myself truly individualistic – whatever that is even supposed to mean in this context – and I used three regular cell phones before eventually relying on a cheap Samsung smartphone with a terrible touchscreen during my early years as a teenager. I also do not remember the recession of 2008 having any significant effect on my personal life, besides my dad’s complaints about rising petrol prices during the “Euro crisis”, which itself was only partially a result of the chain reaction set off by said recession.

Other understandings, such as limiting the core competences of “digital natives” to simply knowing how to use Microsoft Office products, do not make things any clearer, either. Not knowing how to use Excel as intended and actively despising any kind of “office software” makes me either an “avoidant native” or an incompetent one for refusing to learn Excel formulas ever since my first business studies lesson – it depends on which think tank or other group is trying to sell you an inefficient way of doing things. My friends and some CS graduates only a little older than me, on the other hand, would consider me a genuine “techie” simply because I can more or less install Arch Linux and know the difference between RAM and ROM.

TL;DR

The “digital native” is a marketing myth. Anybody who appears to place high value on young people and their supposed “digital skills” is fully aware that the vast majority of them are no different from their parents and grandparents. Just like older adults, they usually do not read Terms of Service, overshare private aspects of their lives on social media, and could not care less about Big Tech making big profits off their desire for quick dopamine kicks or five minutes of fame. The myth is largely spread by those with the fewest “digital skills”, such as marketing consultants and “opinion leaders” (“influencers”), and it targets those just as incompetent as they are – academics and the (over-)educated are the easiest to fool, after all.

Okay, there is one difference between us younger people and boomers: boomers – or, closer to reality, members of the Silent and Greatest Generations – were able to send people to space relying on computers with as little as 4 KB of RAM; Millennials and Gen Z’ers whine about top graphics cards getting more expensive whilst playing freemium-laden “AAA games”.