The Hottest New Programming Language Is... English?
Prompt Engineering is natural language programming. You don't need to code to command AI.
"Prompt Engineering is natural language programming."
— Elon Musk
In 1941, a brash 25-year-old named Orson Welles released his first film. Having never worked in Hollywood or studied moviemaking, he approached the project with sheer ignorance. He knew little to nothing of the standard conventions of lighting, camera angles, or story structure.
"There's no confidence to equal it," he later remarked of his novice mindset during an interview in which he reflected on his early career.
Unbound by theoretical impossibilities, Welles embodied the essence of an "auteur" — a filmmaker whose personal influence and artistic control over his movie were so profound that he was regarded as the author of the film — not just its writer, director, and producer. In the process of making his first film, he pioneered techniques like deep-focus cinematography and non-linear storytelling. Citizen Kane is now regarded as one of the most influential films ever made. But in 1941, it was a shock to the system.
"It's only when you know something about a profession that I think you're timid or careful," he observed. "I thought you could do anything with a camera that the eye... or the imagination could do."
Welles was also lucky to have a cameraman who told him at the outset that there was nothing special about camera work — nothing, that is, that he "couldn't learn in half a day."
"The Great Mystery that requires 20 years doesn't exist in any field," Welles added, "and certainly not in the camera."
To some, artificial intelligence may seem like a great mystery. But the basics can be learned in much less than half a day (more like half an hour). Writers and content creators who choose to work with artificial intelligence find themselves in a similar position to Welles as he embarked on his masterwork. Just as Welles pioneered new filmmaking techniques, unbound by convention, today's writers can explore the potential of AI models without being constrained by preconceptions of what is possible.
If you worry that you lack the expertise to use this technology, you are in a perfect position to learn. Like Welles, your ignorance may be your greatest asset.
The Language Basics of AI
There is much hype (and confusion) around the coveted role of "prompt engineer." San Francisco-based AI company Anthropic famously advertised salaries of up to $335,000 for top prompt engineering talent. Such specialized roles demand coding skills and a deep understanding of AI systems. But the skill level needed to command AI proficiently has dropped sharply, and now requires far less experience.
This lowered barrier to entry came about with the advent of Large Language Models (LLMs) — the breakthrough technology behind AI assistants like ChatGPT.
Although the inner workings of LLMs remain opaque to most, these AI architectures enabled a paradigm shift. For the first time, anyone fluent in a natural language could easily interact with advanced intelligence. In other words, where operating AI systems once required specialized coding skills, plain-language prompts can now command powerful capabilities.
Former Tesla AI Director Andrej Karpathy has labelled this innovation "Software 3.0," comparing it to the revolution that came with graphical user interfaces (GUIs). Those intuitive windows and menus opened personal computing to a far wider population by replacing arcane code terminals. Similarly, today's AI systems allow anyone to accomplish complex tasks just by expressing goals conversationally instead of demanding coding skills.
Hence, Karpathy's now-viral tweet: "The hottest new programming language is English."
Despite the technical-sounding name, prompt engineering is less about technical mastery and more about the artful application of language when communicating with AI. "Prompt engineering" is a misnomer that suggests a high level of technical understanding when, in fact, that's unnecessary outside of the upper echelons of the AI industry.
I prefer the simpler term prompt writing over prompt engineering, because you don't need advanced knowledge of the underlying mechanics of large language models to engage in the subtle art of prompting AI. All you need is a good command of the English language and a few core principles.
Will Prompt Engineering Become Redundant?
The introduction of chat windows where plain English is used to interact with AI has vastly improved on the archaic "command line" interface. And for those seeking an even more straightforward experience than conversational chat, hundreds of helpful new applications have been customized for specific recurring situations or common use cases.
Based on this trend, OpenAI CEO Sam Altman speculates that the skill of prompt engineering will soon be integrated into mass-market AI products, such that average users will not need to know how to write and design effective prompts.
At the opposite end of the spectrum, others predict that nearly 50% of all jobs in the future will be "prompt engineering" jobs on some level.
More likely than either extreme is a middle path. Even as custom GPTs and AI-powered tools with preset prompts proliferate, basic prompt literacy will come to be assumed. In some professions, it will become a core competency, just as digital proficiency rose in importance with the spread of computers.
There will be some percentage of specialized AI engineers who understand the underlying code and architecture. But there will be many more natural language programmers who write their own prompts or even create their own chatbots for their specific use cases. For the latter, deep intuitive knowledge of prompting will be essential.
Learning prompt design yourself is more empowering than relying on overpaid engineers to make one-size-fits-all tools. When relying on prepackaged prompts and AI writing tools, you face a trade-off between their ease of use and the limitations they impose.
Writers, or anyone with strong language abilities, have an advantage in this emerging field of natural language programming. Investing further in prompting skills lets you unlock the full potential of LLMs. Don't settle for presets — instead, craft your own prompts tailored to your own goals.
A Peek Under the Hood
So, what does the aspiring independent prompt writer need to know about the underlying technology?
Large language models (LLMs) like GPT-3 generate text by predicting the likelihood of the next word in a sequence, considering both the preceding words and the broader context of the prompt. These models develop their predictive capabilities through a process known as "training," wherein they analyze vast bodies of textual data. For instance, GPT-3 was trained on hundreds of billions of words of text drawn largely from the internet.
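To make "predicting the next word" concrete, here is a minimal sketch in Python. It uses a toy bigram model — counting which word follows which in a tiny made-up corpus — which is a drastic simplification of a real LLM (no neural network, no context beyond the previous word), but it illustrates the same underlying idea: generate text by asking "given what came before, what word is statistically most likely next?"

```python
from collections import Counter, defaultdict

# A tiny toy corpus standing in for the vast training data of a real LLM.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# "Training": count how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most likely next word and its estimated probability."""
    counts = following[word]
    total = sum(counts.values())
    best, n = counts.most_common(1)[0]
    return best, n / total

print(predict_next("sat"))  # → ('on', 1.0): "sat" is always followed by "on"
```

A real LLM replaces these raw counts with a neural network conditioned on thousands of preceding words, but the output is the same in kind: a probability distribution over what comes next.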
There is vigorous debate around the question of whether LLMs understand language or are just advanced pattern recognizers — a glorified "auto-complete" function. Some downplay the capabilities of AI by noting that, at the root, these tools are "just predicting the next most likely word." But this criticism falls flat.
Ilya Sutskever, one of the co-founders of OpenAI, explains why this statistical learning of LLMs equates to a deeper understanding:
"When we train a large neural network to accurately predict the next word in lots of different texts from the Internet, what we are doing is that we are learning a world model. It may look on the surface like learning correlations in text, but it turns out that to 'just learn' the statistical correlations in text, to compress them really well, what the neural network learns is some representation of the process that produced the text."
Large language models have begun to outperform humans in specific areas, notably in tasks such as translation. However, these models still have limitations. They are good at discrete tasks but tend to go off the rails when given complex or multi-step jobs. They have limited long-term memory and primitive deductive reasoning. Thus they cannot connect concepts over time without human supervision. Finally, their expertise is limited to the "congealed knowledge" of their training data.
As a result, they need humans to oversee them — to "seed" them with specific, contextualized information, and guide them toward useful responses.
Whether you call it prompt engineering or prompt writing, working with AI requires you to be able to express your aims and objectives in English. This, in turn, requires little more than being able to think for yourself — unconstrained by what others say is possible.
As we turn the page to the practical steps of AI-assisted writing, begin to think of yourself as an "AI auteur." Become a prompt writer, director, editor, and producer, all bundled into one.
This post is adapted from "Commanding the Page" (2023).


