Determining what to teach at schools, and how to teach it, is critical as technology and education become inseparably interwoven.
When we speak of disrupting education, it often means replacing human teachers with technology.
As a technologist at one of Western Australia’s universities shared with me last year, one of the things they’re working on is the ability to put one lecturer into a classroom of 10,000 students using virtual reality technology.
It might sound great for the university to deliver courses to larger numbers of students with fewer teaching staff, but is this teaching model actually good for learning outcomes? Would students pay for this type of experience?
With education currently Australia’s third-largest export, these are very important questions.
MOOCs
While these VR classrooms don’t yet exist, the push towards more technology and fewer humans can be seen in the recent trend towards massive open online courses (MOOCs), which were heralded as the future of education. In these courses, all the material is put online, for free, and students can study at their own pace.
Beyond the creation of the course material, no actual humans are involved in the course delivery. The courses are free, but if you want the credentials at the end of the day, that’s when you pay.
They’re delivered by organisations like edX, which was created by MIT and Harvard in the US.
In a world where everything is online, on demand and automated, replacing teachers with massive online teacherless courses might make sense. It might seem like the logical next step for how we deliver education.
MOOCs were expected to disrupt education. But as Anton Crace, who covers technology for The PIE News (Professionals in International Education) shared with me, MOOCs have largely failed to live up to the hype.
It might be because teachers are actually needed now more than ever.
Disrupting disruption
I was in Sydney earlier this month to deliver a keynote address on this topic at NEAS 2018, a conference for English language teachers and managers.
The theme of the conference was, appropriately, ‘Beyond the digital revolution’; appropriately because I and several other speakers were calling for a rethink of the way we’re using technology.
The first thing to consider is what we’re teaching. With the Committee for Economic Development of Australia predicting that 40 per cent of Australian jobs will be lost to automation in the next 15 years, the skills we teach students now need to ensure we’re actually preparing them for that future.
Human skills
As Alibaba founder Jack Ma told the World Economic Forum in January of this year, trying to teach today’s students to compete with computers is destined for failure.
Instead of trying to compete, he said, we need to be teaching human skills – things AI still can’t do. For Mr Ma, this includes things such as art, sport, compassion and critical thinking.
While online courses can effectively impart information and knowledge, teaching skills – especially these very human ones – is still much better done by a human.
Thinking back to my own time as a student, all the transformative moments I had were thanks to mentors and teachers. This was especially true for breakthroughs I had in being a better writer or critical thinker. Most importantly, it was having a teacher believe in me that helped me to believe in myself – an experience that could not have been delivered by a MOOC.
Experience first
The second thing we need to consider is how we’re using technology.
A few years ago, I was asked to create content for some newly purchased smart tables, which had been bought without anyone really knowing how they’d be used, or what problem they solved.
This is unfortunately not an uncommon way for technology to be integrated into the classroom. Sure, it’s cool to have a smart table, but perhaps chalk and paper and paint and clay might have done a better job.
The technology becomes the outcome, rather than enabling the outcome.
I learned this lesson the hard way with the first game I ever created, Ghost Town (which turned out to be Perth’s first alternate reality game). The client wanted a Bluetooth game, because at the time (2007) Bluetooth was the latest and greatest new technology.
It turned out to be the worst game I ever created.
Thankfully, this failure greatly informed the creation of the game I’m most proud of – Gentrification: The Game, a chalk-based, live-action, Monopoly-style game.
The design rationale was to be guided first by the experience, and then to use whatever technology (old or new, analogue or digital) was most appropriate. While it used only a small bit of digital technology for scorekeeping, it ended up winning the ‘best use of technology’ award (and best game) at a games festival in New York City. That spoke volumes to me about the best way to use technology.
Lead with the experience, not with the technology.