So Much for ‘Learn to Code’

By Kelli María Korducki

The quickest way to second-guess a decision to major in English is this: have an extended family full of Salvadoran immigrants and pragmatic midwesterners. The ability to recite Chaucer in the original Middle English was unlikely to land me a job that would pay off my student loans and help me save for retirement, they suggested when I was a college freshman still figuring out my future. I stuck with English, but when my B.A. eventually spat me out into the thick of the Great Recession, I worried that they’d been right.

Computer science, after all, and certainly not English, has long been sold to college students as among the safest paths toward 21st-century job security. Coding jobs are plentiful across industries, and the pay is good—even after the tech layoffs of the past year. The average starting salary for someone with a computer-science degree is significantly higher than that of a mid-career English graduate, according to the Federal Reserve; at Google, an entry-level software engineer reportedly makes $184,000, and that doesn’t include the free meals, massages, and other perks. Perhaps nothing has defined higher education over the past two decades more than the rise of computer science and STEM. Since 2016, enrollment in undergraduate computer-science programs has increased nearly 49 percent. Meanwhile, humanities enrollments across the United States have withered at a rapid clip—in some cases, shrinking entire departments to nonexistence.

But that was before the age of generative AI. ChatGPT and other chatbots can do more than compose full essays in an instant; they can also write lines of code in any number of programming languages. You can’t just type make me a video game into ChatGPT and get something that’s playable on the other end, but many programmers have now developed rudimentary smartphone apps coded by AI. In the ultimate irony, software engineers helped create AI, and now they are the American workers who think it will have the biggest impact on their livelihoods, according to a new survey from Pew Research Center. So much for learning to code.

ChatGPT cannot yet write a better essay than a human author can, nor can it code better than a garden-variety developer, but something has changed even in the 10 months since its introduction. Coders are now using AI as a sort of souped-up Clippy to accelerate the more routine parts of their job, such as debugging lines of code. In one study, software developers with access to GitHub’s Copilot coding assistant were able to finish a coding task 56 percent faster than those who did it solo. In 10 years, or maybe five, coding bots may be able to do so much more.

People will still get jobs, though they may not be as lucrative, says Matt Welsh, a former Harvard computer-science professor and entrepreneur. He hypothesizes that automation will lower the barrier to entry into the field: More people might get more jobs in software, guiding the machines toward ever-faster production. This development could make highly skilled developers even more essential in the tech ecosystem. But Welsh also says that an expanded talent pool “may change the economics of the situation,” possibly leading to lower pay and diminished job security.

If mid-career developers have to fret about what automation might soon do to their jobs, students are in the especially tough spot of anticipating the long-term implications before they even start their careers. “The question of what it will look like for a student to go through an undergraduate program in computer science, graduate with that degree, and go on into the industry … That is something I do worry about,” Timothy Richards, a computer-science professor at the University of Massachusetts at Amherst, told me. Not only do teachers like Richards have to wrestle with how worthwhile learning to code still is; teaching students to code has itself become a tougher task. ChatGPT and other chatbots can handle some of the basic tasks in any introductory class, such as finding problems with blocks of code. Some students might habitually use ChatGPT to cheat on their assignments, eventually collecting their diploma without having learned how to do the work themselves.

Richards has already started to tweak his approach. He now tells his introductory-programming students to use AI the way a math student would use a calculator, asking that they disclose the exact prompts they fed into the machine, and explain their reasoning. Instead of taking assignments home, Richards’s students now do the bulk of their work in the classroom, under his supervision. “I don’t think we can really teach students in the way that we’ve been teaching them for a long time, at least not in computer science,” he said.

Fiddling with the computer-science curriculum still might not be enough to maintain coding’s spot at the top of the higher-education hierarchy. “Prompt engineering,” which entails crafting the text fed to large language models to coax more useful responses out of them, has already surfaced as a lucrative job option—and one perhaps better suited to English majors than computer-science grads. “Machines can’t be creative; at best, they’re very elaborate derivatives,” says Ben Royce, an AI lecturer at Columbia University. Chatbots don’t know what to do with a novel coding problem. They sputter and choke. They make stuff up. As AI becomes more sophisticated and better able to code, programmers may be tasked with leaning into the parts of their job that draw on conceptual ingenuity as opposed to sheer technical know-how. Those who are able to think more entrepreneurially—the tinkerers and the question-askers—will be the ones best positioned to stay ahead of automation in the workforce.

The potential decline of “learn to code” doesn’t mean that the technologists are doomed to become the authors of their own obsolescence, nor that the English majors were right all along (I wish). Rather, the turmoil presented by AI could signal that exactly what students decide to major in is less important than an ability to think conceptually about the various problems that technology could help us solve. The next great Silicon Valley juggernaut might be seeded by a humanities grad with no coding expertise or a computer-science grad with lots of it. After all, the discipline has always been about more than just learning the ropes of Python and C++. Identifying patterns and piecing them together is its essence.

In that way, the answer to the question of what happens next in higher education may lie in what the machines can’t do. Royce pointed me toward Moravec’s paradox, the observation that AI shines at high-level reasoning and the kinds of skills that are generally considered to reflect cognitive aptitude (think: playing chess), but fumbles with basic ones, such as the perception and motor skills a toddler picks up without effort. The curiosity-driven instincts that have always been at the root of how humans create things are not just sticking around in an AI world; they are now more important than ever. Thankfully, students have plenty of ways to cultivate them.
