When London’s Hunterian Museum reopened last month after a five-year renovation, its most famous exhibit was missing. The skeleton of Charles Byrne, likely the 18th century’s tallest man, had been removed from display. Byrne was born in Northern Ireland and grew to at least 7’7” before moving to London at the age of 20 to appear in shows as The Irish Giant. Before he died only two years later, he’d made it known that he wanted to be buried at sea so as to avoid being put on display for eternity. Sadly, his friends sold his body to the wealthy surgeon and anatomist John Hunter, who made Byrne’s skeleton the centerpiece of his collection.
In an era when average male stature was only 5’5”, no one knows why Byrne was so tall. Modern explanations include tumors and genetic mutations. One popular theory in his day was that Byrne was conceived on top of a haystack. So if the Museum’s decision is one more example of the arc of the moral universe being (very) long but still bending toward justice, the haystack hoopla reminds us that at any given moment, we are inexorably and suffocatingly trapped by the ideas and references of our own time: in the 18th century, the tallest thing most people could think of was a haystack.
A new giant (technology) has emerged in the past few months: ChatGPT and generative AI produced by large language models. And while it’s risky business for anyone haystack-bound in 2023 to foretell its impact, this hasn’t stopped hundreds of commentators from jabbering on about how it’ll change how we live, work, and learn (including how it could result in a “nuclear-level catastrophe”). We’re told AI will make essays and homework redundant. ChatGPT has been compared to Covid-19 (a plague upon education) and called a sign of the education apocalypse: “in less than a decade we will see the new iteration of ChatGPT writing a nice obituary for decrepit institutions of education that have lost any relevance.” Meanwhile, more perspicacious observers say only that it’s now impossible to imagine education without education technology.
But with schools increasingly focused on – but still not accountable for – employment and economic outcomes, what about AI’s impact on the ultimate consumers of postsecondary education: employers? After two decades of digital transformation, the demands of employment and employers have shifted away from traditional higher education, frustrating career launch for Millennials and Gen Z and now, belatedly, placing unprecedented enrollment pressure on non-selective colleges – witness private colleges closing at record rates and community college enrollment down 37% since 2010 (because even free doesn’t make sense if programs don’t lead to good jobs). What further changes will generative AI produce? How will the good entry-level jobs for which colleges and universities ostensibly prepare students get sideswiped?
In considering these questions, it’s helpful to distinguish between the skills gap and experience gap (and where better than the Gap Letter?). The skills gap reflects discrete abilities employers expect of (but aren’t seeing in) candidates, primarily digital or platform skills along with soft skills employers love to complain about. While the two gaps aren’t mutually exclusive, the experience gap is the mortar around these skill bricks. Do candidates have enough experience to know what to do with their skills – how to apply them in a specific job function and industry? For most employers, the only way to assess this is demonstrated experience. Otherwise, it could be months or even years before candidates know what they’re doing.
The good news is that AI appears likely to help close some skills gaps. Digital platforms will soon be equipped with functionality so users can explain in natural language what they want systems to do; while building automations in Salesforce currently requires months of training and a whole lot of trial and error, AI will make this skill much more accessible. And as ChatGPT is already producing functional code, everyone can be a software developer without paying $15K to attend a coding bootcamp. AI will take over the drudge work of drafting not only code but also text, images, and presentations, and even of coming up with questions for interviews and depositions. Professionals who have already begun leveraging ChatGPT for their work report it’s incredibly useful and satisfying; MIT researchers found that by relieving employees of many mundane duties, ChatGPT significantly improves job satisfaction. But what’s troubling is that these automatable tasks constitute much of what we now know as good entry-level jobs across many industries.
My first good job involved research, writing, and building presentations in an industry I knew nothing about. I’d estimate that at least 50% of my time could have been saved by the current version of ChatGPT, which would have allowed me to progress and extend the work much further (i.e., getting something done rather than seeing my report end up on a shelf). What will that job description look like in a year or two? Employers might reasonably expect entry-level workers to be conversant with AI and something like 50% more productive. And this means while the skills gap may narrow, the experience gap could become a chasm.
As AI makes skills more accessible, employers will place a higher premium on knowing what to do with skills. Think about going to work in the claims department of a health insurer. With AI, not every claim will be reviewed by an entry-level worker. Instead, only claims that trip one or more flags will warrant human intervention, and such work is more likely to involve problem solving and troubleshooting – skills requiring some level of insurance claims experience to be effective. Or the tens of thousands of new college graduates who toil away in investment banks building presentations to sell clients on one transaction or another. AI will save much of this time, allowing new analysts to perform higher order, higher value work like developing real sector expertise, networking, and business development. Or new digital marketers who currently divide their time between writing social media posts and performance marketing (bidding on and placing ads). As AI takes on social media posts and optimizes Google and Facebook ad spending, that digital marketing job is going to be more about strategy and results (e.g., new clients, incremental revenue). Accounting? The same drill: grunt work completed by AI, leaving even the greenest green eyeshades to focus on judgment calls and higher-level work. In all cases, AI-inflected entry-level jobs look a lot like today’s (mid-level) jobs that demand years of experience. As a result, years of experience – or the equivalent in demonstrable skills and certifications – is what employers are likely to add to “entry-level” job descriptions.
We’ve seen this play out in cybersecurity, a sector already worked-over by AI. It’s not uncommon to see position descriptions for entry-level security operations center (SOC) analyst roles demanding “at least four years of experience, including time doing penetration testing, digital forensics and vulnerability assessments; and professional certificates.” One way to think about cybersecurity: Tier I detection and response has been largely automated, so junior jobs are tier II and above. As one college senior posted on LinkedIn, “I’ve lost count of the number of ‘junior’ cybersecurity role advertisements I’ve seen that want 1-3 years of experience and a CISSP. Anyone who knows anything about the CISSP knows you need minimum five years of full-time experience.” Olivia Rose, former Chief Information Security Officer at Mailchimp, wrote “it breaks my heart to see all these young, driven, hard-working young people trying so very hard to get their first job, but . . . we cannot just get our %^)-(#% together and give them a chance.”
The problem is that while career launchers can theoretically solve a skills gap via last-mile training, an experience gap is a tougher nut to crack. Digital transformation made employers more selective and gun-shy about entry-level hiring, but generative AI will take it to a new level. More than ever, employers will only want to onboard entry-level workers with industry experience who are capable of higher-level work. Career ladders will be cruelly yanked up and away so few will be able to reach the first rung.
Last week, the New York Times joined the club of AI prognosticators with an article about the likely impact on the legal profession. AI, the Times concluded, will “force everyone in the profession, from paralegals to $1,000-an-hour partners, to move up the skills ladder to stay ahead of the technology. The work of humans… will increasingly be to focus on developing industry expertise, exercising judgment in complex legal matters, and offering strategic guidance and building trusted relationships with clients.”
Even though I spent ample time and treasure on law school, one reason I didn’t want to become a lawyer is that it seemed like a profession in which experience matters much more than talent and effort. Because the content of law is so vast, if you haven’t encountered an issue before, you’re much better off talking to someone who has rather than trying to reason or work through it. Developing competence can easily take a decade. And that’s frustrating for a 20-something in a hurry.
The problem will become even more acute with the emergence of industry-specific large language models. Bloomberg has already developed BloombergGPT, “a 50 billion parameter language model that is trained on a wide range of financial data.” It’s generative AI for the finance industry that will only be useful to workers with finance experience. We’ll see similar specialized AI emerge for every major industry: insurance, healthcare, logistics, cybersecurity, etc. After this comes function-specific AI for sales, marketing, product management, purchasing, customer support, HR, and IT. The upshot is more jobs will look law-like: entry-level workers without requisite domain knowledge won’t know what to ask AI for. And no one is suggesting that some new form of AI will help them figure out what to ask AI.
On a recent text chain, several of my college roommates used (abused?) ChatGPT to share movie scenes and poetry about drunken nights of yore. One noticed that the AI consistently produced text ending with a moral about the evils of excessive drinking, prompting another to suggest: “Tell it to rewrite, but be less judgmental.” The chain concluded with this comment: “Someone said that a billion hours will be wasted by AI before a billion hours are saved. I’m doing my part to achieve the wasting target.”
Given the amount of wasted time so far, we may not be far from saving our first billion hours. And there will be billions and billions after that. With this level of savings within reach, profit-maximizing employers won’t stand pat on their biggest expense: people. Expectations and job descriptions will look very different, and that means big changes for career launch and the schools that produce talent for entry-level jobs.
In the AI era, the future of career launch and socioeconomic mobility will depend on scaling pathways that not only teach, but also provide relevant work experience. Schools will scramble. Career and technical education and youth apprenticeships will become core at every high school. In higher education, it means: reforming Federal Work-Study to prioritize meaningful off-campus work over menial on-campus work; integrating real work projects into coursework via experiential learning marketplaces like Riipen; and doing much more to provide every student with multiple (paid) internships (likely by abolishing career services). Even more important, we need a revolution in how we think about (and the importance of) apprenticeships: how apprenticeship jobs are actually created, and how governments fund and support them. As generative AI transforms entry-level jobs and puts a premium on experience, “earn and learn” apprenticeships are likely to be the best bet for helping millions of young people launch careers.
The alternative is bleak: a dramatic decline in good jobs that are truly entry-level, massive youth unemployment and underemployment, even higher inequality with tech elites getting even richer, and political instability. States and nations that fail to prepare for the impact of generative AI on career launch are likely to end up on the haystacks of history.