
AI: revenge of the English majors?

George Nicholas

Creative Strategist

Cognitively, I’m a one-trick pony. I look for connections, analogs, precedents in everything I’m studying or evaluating—even enjoying. It is how I am now wired. Movies. Novels. Comedians. Cereal boxes. Where are the antecedents? Who are the influences?
Is this being done the way someone has done it before?
Or is this being done very differently from that competitor?
These people are really working off the foundation laid by X.
This reminds me of Y, but updated to reflect Z.

This is not confined to business. And I must admit to experiencing a pleasant frisson when recognizing that the golden patterns of the robe in Klimt’s “The Kiss” are pulled straight from Byzantine iconography.

Of course, to do this on the fly, you need to keep a fair amount of stuff in your head, much of it useless. We’re talking fairly shallow knowledge here. Once a connection is made, a shading noted, a resonance recognized, you can go back and drill down for something more meaningful and, one hopes, useful.

I credit years of writing critical papers as an English major (more on English majors later) for this wiring, this constant subconscious cross-referencing of my collection of seemingly random data.

But what happens when that internal database is no longer needed? When you know that anything in the world is just a search (and now a prompt) away, why make your brain a database? It can’t possibly house what the cloud can.

If you don’t bother making your brain a database (because there’s a much larger one available) do you use your brain differently? Have you freed up more brainpower for logic, reasoning, interpretation? I’m talking about a re-wiring very much like the evolutionary leap implied in Arthur C. Clarke’s “2001.”

But there’s a Catch-22 (yes, it’s the title of a Joseph Heller novel, from which the term sprang fully formed like Athena from the head of Zeus—see what I mean?): if you don’t have that baseline of “fairly shallow” knowledge, you don’t have the cues to lead you to the precedents that provide the depth and, if you’re lucky, a deeper understanding.

Recently, the CEO of NVIDIA, Jensen Huang, called AI “the revenge of the English majors.” He was referring to AI’s enhanced ability to generate code, which, combined with well-written prompts, overturns the coders’ perceived primacy atop the tech world. Clear, simple language—the kind we ideally use with each other—becomes more valuable than the ability to write code. (While Huang’s statement was tongue in cheek, a more nuanced, reality-based version was presented by Reid Hoffman in a recent interview.)

Upon hearing that, I was transported back to 1978 and a job posting in U.C. Irvine’s English department. A computer science professor needed someone to grade papers for the only CS class that wasn’t coding. Called Ethics in Computer Science, its sole text was Joseph Weizenbaum’s seminal “Computer Power and Human Reason,” and essays were required.

I took the job, graded the papers, and charitably gave one student a C for a really horrible paper. He asked for a meeting and told me he should have a better grade.

When I wouldn’t change his grade, he said that writing didn’t matter. “In twenty years, writing in English won’t matter. You’ll be obsolete. Code will be what matters.”

I don’t think of AI as my revenge, but it was nice to hear.
