Edtech Enters Its Golden Age

My son Leo was born to ride subways. His bountiful love of public transit has resulted in way more time than I ever imagined spent on trains and buses across the U.S. and around the world. Over Christmas, he badgered me to allow him to fly to Newark so he could spend the week riding New Jersey Transit in lieu of vacationing with the rest of us. So when Leo began sending poetry about Toronto’s subway system (the TTC) to my parents and siblings in Toronto, it seemed as natural as a chime followed by “stand clear of the closing doors.”

Leo began with an ode to Line 4, which runs along Sheppard Avenue:

Line 4 Sheppard, oh how grand
A subway line that spans this land
From Don Mills to Downsview Park
It takes us where we need to start
Along the way, so much to see
Line 4 Sheppard, a path to our dreams

It went on and on from there.

This happened to be the same weekend ChatGPT was released – OpenAI’s chatbot built on its third-generation large language model (LLM). So my brother and sister correctly guessed Leo was playing with AI trained on an unfathomable amount of text data, including books, articles, and social media posts, and asked him to “write” poetry about other TTC lines. My sister requested a poem about the new light rail line being built along Eglinton Ave. (“The Eglinton Crosstown LRT, a glistening strand / Of steel and glass, stretching across the land”). My brother commented that the Sheppard poem was a fair depiction… of the Spadina Line, prompting:

The Spadina line, a serpentine beast
Winding its way through Toronto's concrete feast

So let us ride the Spadina line
And witness the city in all its shine

You get the idea. But you know who didn’t get the idea? My mother, copied on the bad subway poetry, was impressed as only a grandparent can be (“you have clearly inherited some major writing talent… It is delightful to have such multi-talented grandchildren”) yet baffled (“perhaps you should send a copy to the TTC head office”).

***

There are productive uses of GPT like fooling grandparents and dealing with bureaucracy; to satisfy the requirements of his student visa, one of our summer associates has already used GPT to produce a “2,500-word reflection on differences between the work culture of the U.S. and the country I come from” – an essay no human will ever read, let alone grade. But, as GPT itself advises, there are highly unproductive uses to beware of, like cheating.

For the past month, cheating doomsayers have dominated the education discourse around GPT. For teachers who’ve toyed with the tool, panic is the order of the day: “What GPT can produce right now is better than the large majority of writing seen by your average teacher or professor.” On Tuesday, New York City schools blocked access to GPT, citing “negative impacts on student learning.” One Wharton professor demonstrated GPT could not only write an essay, but come up with the prompt, create a grading rubric, and grade its own answer (it received 80/100). How could writing assignments not be headed for the dustbin of history (or at least the history department)?

With the availability of GPT, every student is capable of producing passable paragraphs. So spring semester 2023 marks the start of a new era for education. Instructors will no longer be able to assign generic topics – “read the book and write about it” = malpractice. As if helping students become employable weren’t incentive enough, GPT adds urgency to project-based learning or learning-by-doing. And teachers will have to do more to help students make a connection between each assignment and specific critical thinking and communication skills they might wish to acquire someday.

But education’s immediate response will be to meet technology with technology – in our case, edtech. There are three ways schools will use edtech to maintain academic integrity:

1. Write in class
Use class time to ensure students complete assignments unAIssisted. To free up time and provide students with pathways beyond burger flipping, teachers will need to do some classroom flipping (i.e., have students watch pre-recorded lectures). Students perform better when teachers use technology to transform the classroom into the locus of assessment – not only summative writing, but frequent formative assessments administered in class via apps from providers like iClicker, Echo360, and CourseKey. Doing so allows teachers to ascertain whether students understand key concepts while encouraging peer learning, group problem solving, and project-based learning.

Unfortunately, many people flip out when you ask them to change how they do their job. In teaching, it’s not merely obstinacy: flipping the classroom is as much work as developing a brand-new course, and delivering it is more work than bulldozing through a course outline; teachers need to be on their game every class. And why bother when schools have no mechanisms to measure outcomes or reward change?

2. Write at home while we watch
During our year of remote learning, online proctoring became commonplace. Tools like Examity, Proctorio, and Meazure Learning lock screens and watch students while they write. But while online proctoring has become the norm for high-stakes tests, the cost of even AI-assisted proctoring puts it outside the bounds of regular assignments. More problematic: no one loves a panopticon, as seen in anti-proctoring petitions and protests. Some students claim online proctoring has harmed their mental health, while others have successfully sued schools for invasion of privacy.

3. Cheat at your peril
Schools have been playing an extended game of whack-a-mole with technology-assisted cheating. When “model” term papers were first posted online – first in the hundreds, then in the hundreds of thousands – Turnitin was the first tech fix, detecting plagiarism by matching submitted papers against a database that absorbed all newly submitted work.
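
Turnitin’s matching technology is proprietary, but the general technique – comparing overlapping word sequences between a new submission and a growing corpus – can be sketched simply. The snippet below is an illustration of that idea, not Turnitin’s algorithm; the shingle size and flagging threshold are made up for the example.

```python
# Minimal sketch of overlap-based plagiarism matching (an illustration of the
# general technique, not Turnitin's actual algorithm).

def shingles(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """Break text into overlapping n-word sequences ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's shingles that also appear in the source."""
    sub = shingles(submission, n)
    if not sub:
        return 0.0
    return len(sub & shingles(source, n)) / len(sub)

# Each submission is checked against the database, then added to it.
database: list[str] = []

def check_and_store(submission: str, threshold: float = 0.3) -> bool:
    """Flag the submission if it overlaps heavily with any prior paper."""
    flagged = any(overlap_score(submission, prior) > threshold for prior in database)
    database.append(submission)  # the corpus absorbs all newly submitted work
    return flagged
```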

Can technology detect work created by AI? It’s not as though GPT doesn’t leave clues: no typos, and a greater likelihood of using common words (the, it, is). According to Packback, a leading student writing company, the answer is yes. Right before the holiday, Packback announced that, for spring semester, AI-content detection and flagging has been built into its discussion platform, Packback Questions, and will be released in the next few weeks for its new writing platform, Deep Dives. According to Packback co-founder and Chief Product Officer Jessica Tenuta, “because large language models like GPT generate content based on the statistical probability of what word most likely follows the prior word in a sentence, this same approach can be used in reverse to detect the ‘signature’ of content generated by AI.” Packback flags text over a specific level of statistical expectedness and under a specific level of “entropy” (unexpected words and patterns). As of Christmas Day, early results suggest Packback will detect and flag the majority of AI-generated content posted on the platform with a very low false positive rate (0.04% – i.e., cases where a student happens to write like a boring machine). Packback continues to fine-tune the technology.
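
Packback hasn’t published its implementation, but the core idea Tenuta describes – scoring how statistically “expected” each word in a passage is to a language model – can be sketched in a few lines. The snippet below is a minimal illustration of that general approach, not Packback’s code: it uses the open-source GPT-2 model from Hugging Face as the scoring model, the threshold is invented for the example, and the “entropy” check Packback also describes is left out.

```python
# Minimal sketch of "expectedness"-based AI-text flagging (an illustration of the
# general approach, not Packback's actual system).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Per-token perplexity: lower means the text is more statistically 'expected'."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing the tokens as labels makes the model return mean cross-entropy loss.
        loss = model(**enc, labels=enc["input_ids"]).loss
    return torch.exp(loss).item()

PERPLEXITY_THRESHOLD = 20.0  # illustrative cutoff, not a published Packback value

def flag_for_review(text: str) -> bool:
    # Text that reads as too predictable to the model gets flagged for human review.
    return perplexity(text) < PERPLEXITY_THRESHOLD
```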

Packback won’t be the only new edtech tool for catching AI content. Turnitin already has one product that can detect some forms of AI-assisted writing and will be incorporating new AI-writing-detection capabilities across its portfolio later this year. Before long, schools will have a range of options, and students will be required to submit even simple assignments through one of these platforms.

As schools allocate more class time to assessment, adopt online proctoring, and integrate AI-detection technology, GPT is bound to usher a lot more tech into the classroom. So the edtech-in-trouble narrative driven by last year’s decline in B2C edtech stock prices couldn’t be more wrong. Schools and colleges need help, and 2023 is the first year it will be impossible to imagine education without education technology.

***

Thrilled at what GPT can produce – clearly they haven’t seen Leo’s subway poetry – some boosters are touting AI as “the biggest driver of productivity since the steam engine.” Imagine what GPT will do to improve customer service – it can’t make it much worse, right, Southwest? But that will only happen if we get the education part right.

With edtech safeguards in place, the next step is for students to build the necessary skills to improve upon the new AI baseline. Taking advantage of GPT and successor technologies to produce superior work will require new technical skills like optimizing instructions for AI tools and changing parameters.
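
What those skills look like in practice can be sketched with a short, hypothetical example. The snippet below is illustrative only: it assumes OpenAI’s text-completion API as it existed at the time of writing, and the prompts, temperature settings, and token limit are invented for the example – the point is that wording the instruction well and adjusting parameters are themselves learnable skills.

```python
# Illustrative sketch of prompt and parameter tuning (not a prescribed workflow).
# Assumes the OpenAI Python library and its text-completion endpoint.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def draft(prompt: str, temperature: float) -> str:
    """Request a completion; temperature controls how predictable the wording is."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=temperature,  # lower = more predictable, higher = more varied
        max_tokens=300,
    )
    return response.choices[0].text.strip()

# A vague instruction at high temperature vs. a structured one at low temperature:
generic = draft("Write about the TTC.", temperature=1.0)
targeted = draft(
    "Write a four-line rhyming poem about Line 4 Sheppard of the Toronto subway, "
    "naming its terminal stations and keeping a light, playful tone.",
    temperature=0.4,
)
print(targeted)
```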

Just as the emergence of low-code/no-code platforms has eliminated coding as a prerequisite for tech jobs, GPT signals that leveraging AI won’t require a data science background, but rather an ability to utilize AI tools backed by a basic understanding of how LLMs work. It will be up to schools and colleges to ensure graduating students can use them effectively.

And so GPT should set off alarm bells not only for English teachers wedded to five-paragraph essays, but also for teachers of scientific and technical subjects. While our current approach is largely to teach the history of science – witness the physics grad who relayed to me her professor’s words upon graduation: “congratulations, you’re now caught up to 1940” – more time needs to be spent applying current and emerging science and technology. Much of what is taught in STEM could be moved to history of science and profitably traded for applied technical learning. If we can do this, we’ll have all the ingredients for AI to become a useful tool for school work, but – thanks to edtech – not a shortcut (like Wikipedia).

Beyond tech skills, taking advantage of AI also demands new cognitive skills, including selecting and structuring AI output and determining when it’s wrong. As early adopters of GPT have already recognized, big data isn’t always right. In the 1948 presidential election, all available data projected Thomas Dewey would deny President Truman reelection. But young James Snyder (later known as Jimmy the Greek) had a hunch. Dewey had a mustache while Truman did not. So Snyder surveyed Ohioans and found 4 out of 5 said they didn’t trust politicians with mustaches. He bet $10K on Truman at 17:1 and won. Jimmy the Greek’s first big score came from knowing when to go against the data. Students must be taught to look for mustaches.

As edtech enters its golden age, one thing is certain: when it comes to developing the new technical and cognitive skills students need to harness powerful AI, there will be new edtech tools to help.