The most impressive language generator yet

OpenAI’s GPT-3 language generator is lighting up the internet.

OpenAI’s new machine learning language generator, GPT-3, is currently an internet darling. With a catholic knowledge of the English canon, drawn from the vast corners of the internet, the AI can generate a piece of writing that, at times, reads as well as anything a human could compose.

It’s a considerable step beyond AI’s usual toying with language, and perhaps a hint that machine learning could soon assume powerful roles in writing, much like those it is already assuming in medicine and robotics.

GPT-3 is the most powerful language generator yet made, according to MIT’s Technology Review. Inside any neural network’s black box are parameters, the values the network tunes during training: GPT-2, released last year, had a massive 1.5 billion parameters; GPT-3, in comparison, has 175 billion — one of those hard-to-comprehend numbers.
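For a rough sense of what a parameter is, here is a toy calculation in Python (a deliberately tiny, made-up layer, nothing like GPT-3’s actual architecture): every weight and bias a network tunes during training counts as one.

    # Toy illustration: one fully connected layer mapping 4 inputs to 3
    # outputs. Each weight and each bias is one trainable parameter.
    n_inputs, n_outputs = 4, 3
    weight_params = n_inputs * n_outputs   # 12 weights
    bias_params = n_outputs                # 3 biases
    print(weight_params + bias_params)     # 15 parameters in this layer

    # GPT-2 had roughly 1,500,000,000 such values; GPT-3 has 175,000,000,000.

Stack enough layers like that one on top of each other and the totals balloon into the billions.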

Sucking its training data, Charybdis-like, into its maw, GPT-3 can now spit out, based on all it has “read,” whatever words should follow a given prompt.

Styles and Shortcomings

GPT-3 uses its vast dataset to predict, mathematically, which words, in which order, will best complete a given prompt. Some of the results have been deeply impressive, despite the ghostly quality that AI writing always seems to carry.
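To see the flavor of next-word prediction, here is a minimal Python sketch (a hand-built word-frequency table over a ten-word corpus; GPT-3 does something conceptually similar, but with a 175-billion-parameter neural network and a huge slice of the internet):

    import random

    # Count which word follows which in a tiny corpus, then sample a
    # continuation one word at a time, the way a language model does.
    corpus = "the clocks were striking thirteen and the clocks were loud".split()

    follows = {}
    for prev, nxt in zip(corpus, corpus[1:]):
        follows.setdefault(prev, []).append(nxt)

    def continue_text(word, length=5):
        words = [word]
        for _ in range(length):
            options = follows.get(words[-1])
            if not options:
                break
            words.append(random.choice(options))  # pick a plausible next word
        return " ".join(words)

    print(continue_text("the"))  # e.g. "the clocks were striking thirteen and"

Scale that word-frequency intuition up by eleven orders of magnitude and you get something like GPT-3’s party trick.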

If given a prompt with a sci-fi bent — say, the opening line of George Orwell’s 1984, from the Guardian’s Alex Hern — it will return a suitably sci-fi result.

“It was a bright cold day in April, and the clocks were striking thirteen,” Orwell begins.

“I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run,” the language generator continued. “I just imagined what the day would be like. A hundred years from now. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science.”

It’s a hair disjointed, but read as a whole, it does feel like a particularly promising freshman comp assignment.

People playing with GPT-3 have prompted it to produce poetry and prose in the styles of certain authors, compose music, and write code. It’s even written an article on itself. (Your correspondent will fight an AI to keep his job.)

Little wonder that the Twitter cage is rattling.

GPT-3 is not a perfect mimic, or an artificial general intelligence, however. Kevin Lacker’s Turing test of GPT-3 reveals some of its shortcomings.

The language generator has little trouble answering trivia-style questions; it knows who won the 1995 World Series, how many eyes a giraffe has, and the human life expectancy in the United States.

It proved pretty good at answering common-sense questions, too, knowing that an elephant is heavier than a mouse and that a pop can is heavier than a paperclip. But give it a prompt that’s a bit odd — Lacker asked, is a pencil heavier than a toaster? — and you can trip it up.

(GPT-3’s response: the pencil is heavier. This is likely because the literature on comparing pencils and toasters is probably pretty … light.)

Interestingly, it won’t admit when it is wrong, and it won’t call out a question as nonsensical. When asked how to sporgle a morgle, it replied “with a sporgle” (uh, duh).

But even OpenAI’s co-founder Sam Altman tried to temper things a bit.

“The GPT-3 hype is way too much,” he tweeted. “It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out.”

Still, while it’s true that GPT-3 is not an artificial general intelligence — or even intelligent, in, like, a philosophical sense (“Music is the most advanced form of mathematics” is a sophomoric, one-blunt insight, GPT-3) — that does not mean it’s merely a chatbot toy.

With the ability to write, with varying levels of convincingness, everything from fiction to music, poetry to technical information, journalism to code, GPT-3 could prove to be a powerful tool. Any task that requires the written word could be augmented — or even automated — with a high-quality language generator.

And GPT-3 might just be the lede.
