For almost two decades, people have been saying that eventually all or most education would be online. But no one meant it like this.
That’s the trouble with predictions – even when they’re right, you have to read the fine print. That’s also a crucial lesson for us as almost every classroom across the country, at all levels, strives to figure out how to create a meaningful education for its students on the fly: we need to read the fine print.
Because while educational technology has produced some remarkable advances, it has also tried to sell a lot of snake oil, making big promises that never panned out. When times are desperate you can expect the amount of snake oil in a system to rise like a flood tide.
It will be tempting for schools, school systems, and states alike to invest in whichever technological “solution” promises the biggest miracle. But much like “miracle drugs,” educational technology should be viewed skeptically, tested, and vetted carefully before being heavily invested in. Outcomes matter far more than promises.
Some technologies, we are coming to realize, simply do not work as advertised. As part of the New York Times “Privacy Project,” writer Charlie Warzel pointed out that perhaps the most depressing thing about so many privacy-stealing technologies, like facial recognition, is that they work well enough to steal our privacy on the promise of a better tomorrow, but not well enough to actually deliver it.
“In the year I’ve been writing this column, and voraciously reading articles about digital privacy, an unsettling theme has emerged. A report introduces a piece of technology with terrifying, privacy-eroding implications. The technology — facial recognition, digital ad tracking, spyware, you name it — is being rapidly deployed by companies that aren’t considering the potential societal harms. The report produces understandable frustration and concern. Then, upon further examination, the claims regarding the technology break down. That groundbreaking piece of technology, it turns out, is deeply flawed. Instead of a perfect panopticon, you have a surveillance-state equivalent of a lemon, or worse yet, total snake oil.”
The same dynamic has played out in educational technology.
As Ulrik Juul Christensen wrote in Forbes in 2019:
“(T)oo much unproven EdTech has been rolled out without scientific evidence of what it can do, to the frustration of teachers, parents, and students alike. Even for helpful technology, the expectations are often inflated, becoming false promises that lead to disappointment.”
“The majority of this technology over promises and under delivers because it is technology first and education later – or never.”
There is perhaps no better example of this than 2012, which the New York Times declared “The Year of the MOOC” (Massive Open Online Course). Amid massive hype, it was the year that hundreds of millions of dollars were invested in Udacity, Coursera, and EdX. Hundreds of thousands of students enrolled, and it was widely thought that open online courses would be the asteroid that ended the existence of most brick-and-mortar colleges.
But virtually none of those hundreds of thousands of students completed their courses of study, while traditional universities continued to graduate students. Less than eight years later, those companies – though worthy – have become very niche businesses, while it took a global pandemic to put conventional universities online.
The point isn’t that new EdTech is bad – Calbright College is an online community college; we obviously believe in the power of online education to change lives for the better, on a massive scale. The point is that simply rushing to implement the latest technology is not effective teaching. If we care about student outcomes, we must understand what technology can and cannot accomplish, and how it enhances effective teaching, rather than trying to use it to replace effective teaching.
Whatever companies promise us, we simply cannot know what these technologies can accomplish until they’ve been tested.
Even then, we often have to ask hard questions. Sometimes things go wrong for reasons that have nothing to do with the tool. A great example is Ning, a “make your own social network” site that was very popular with colleges and universities early this century. They used it, they liked it, and they felt it enhanced their operations. But when Ning discontinued its free version, it caused enormous headaches for every institution that depended on it, and those headaches became migraines after Ning was acquired. The software did what it promised – but educators trusted in its continuity in a way that we now recognize to be naive in the world of technology.
We are in an educational crisis, and it is appealing to think that a big investment in new technology will be a lifeboat that can carry us to shore. But massive investments in new systems are more likely to lead to further mass confusion than to solutions that are good for students.
EdTech is a good thing. It is a big part of the solution. But take it from an online college that is responsible to its students and stakeholders rather than to investors: taking the time to test new technologies and technological approaches carefully, so that you know and understand the impact they will really have on students and systems, is vital to doing better tomorrow than we were yesterday. California schools and colleges need digital think tanks, like Calbright, that conduct these experiments before they invest in what they don’t know.