What we can learn from minimal edtech: the persistent myth of digital natives
I recently found myself reading this 2014 piece again, a good entry in the annals of dispelling the still-too-ubiquitous myth of young people as “digital natives.”
One choice bit:
Siva Vaidhyanathan, chair of the media-studies department at the University of Virginia, describes Ms. Hargittai as a “pioneer of empirical Internet studies.” It is “absolutely untrue” that young people understand how the Internet works when they enroll in college, he says. “That myth is in the direct interest of education-technology companies and Silicon Valley itself. If we all decide that young people have some sort of savantlike talent with digital technology, then we’re easily led to policies and buying decisions and pedagogical decisions that pander to Silicon Valley.”
sidebar: For more on digital natives, see most recently this good overview from 2020, especially its handy summary chart, and, from 2008, this. As these and many other studies make clear, the injection of “digital natives” into the educational bloodstream resulted in a panic about how to meet the needs of these supposedly tech-savvy students. Edtech of recent decades has been built, in part, on educator and administrator fear that they are not meeting the needs of so-called “digital natives.” That this group of mythical “natives” (or “immigrants,” choose your nuance) does not exist, or may not have the characteristics ascribed to it, demands more critical approaches to the familiar educational technologies that have become so widespread in so short a period of time.
Similar to the course and methods described in the article, I too have tried to teach students about online identity and the amount of information they leave behind. One shocking thing in the years since 2014, anecdotally at least, has been watching students grow increasingly indifferent to the amount of information they bleed out to big tech online. Even when I run similar exercises, revealing everything about them through Google or genealogical databases or the like, most don't really care. (Or at least they say it doesn't matter.) At the same time, public discourse around big tech has become far more prominent. Yet the reaction I get from 90% of my class is that they really don't care that much. They see tech as inevitable and themselves as relatively powerless to do anything about it.
I suspect that much of this resignation to the status quo comes from having used technology as entertainment and communication for as long as they can remember. There was no shift to a new way of doing things, no sudden sense of being watched; they have always been recorded and observed and tracked.
Educational technologies bear significant blame for this normalization of surveillance, but I'm more interested in the way that, by not adequately confronting the myth of the digital native, we've robbed students of the possibility of taking control of their own technology. The myth of the digital native is one of effortless mastery, in which the youthful savant commands tools that remain mysterious and magical to their elders. In practice, quite the contrary: these lines from that 2014 article ring even truer today:
The findings paint a picture not of an army of app-building, HTML-typing twenty-somethings, but of a stratified landscape in which some, mostly privileged, young people use their skills constructively, while others lack even basic Internet knowledge.
I've spent much of the past five years teaching groups that are, to say the least, widely divergent in their technological background and training. The gaps between the most adept (and, yes, often privileged) and the least are huge. Nowadays we are more likely to attribute this to a “digital divide” driven by external factors, but that framing is still far too binary for the wide range of understandings and experiences young people have of technology.
Educational technology demands more training in technology itself, both a form of “data literacy” and a form of technological skill-building. This runs counter to the way edtech is typically marketed: there it is a “solution” that replaces some otherwise pedestrian function, or fosters engagement, or delivers some other promised benefit. Edtech does everything possible to pretend that it is not in fact a technology that requires training, skill, and reflection. This is simply mistaken. All the talk about digital natives and generational divides feeds false narratives of technological inevitability in education.
To use educational and academic technologies better, we need to start with the fact that they are not seamless, frictionless, or magic solutions to what ails education. And we need to start with the fact that students are, as in all subjects, a diverse mix of interests, experience, skills, capacities, and understandings. Technology is a thing, complex perhaps, but something made and something to be studied, inspected, questioned. That isn't just a concern for school IT or teachers; it should be front and center as a matter of discussion, debate, and education for students. This may seem strange at first (how often do we contemplate the nature of the humble pencil, or the whiteboard?), but I think it would be valuable. X-ray and dissect the edtech tools you use together with students, not apart from them outside of class. Expose how a tool works. Own, and pause on, where it fails. Let students think about what difference it makes that you chose one tool over another, or that you use it in the way that you do.
Digital natives don't exist. All students need training from the ground up. The fact that a few can do amazing things at a young age only shows what is true in any field: there are always the hyper-interested, the advanced, the prodigies. All of them can benefit from critical engagement with technology. And if that engagement means we end up realizing that some of this technological stuff just isn't necessary, all the better.