This buzzy new AI can write recipes that sound human, but they still taste unpleasant


The result was barely edible. It looked more like a watermelon omelette than a cookie and tasted like a sugary, slimy nightmare. My four-year-old daughter was the closest thing the cookies had to a fan in our house: she said they tasted “weird” but still protested when I threw them into the compost.

If a friend had sent me this recipe, I would have been disappointed and wondered what had possessed them to come up with such a horrible concoction. But since it was produced by artificial intelligence, I was impressed: sure, the food tasted terrible, but the recipe was so fluidly written that it could easily have been penned by a watermelon-loving human.

Watermelon cookie recipes are just one of countless text-based things that can be churned out by new AI technology from the research company OpenAI, which was co-founded by Elon Musk and counts Microsoft among its backers.

Feed it a few words or sentences (the beginning of a recipe, a snippet of a poem, the first line of a story) and it will do its best to expand on the prompt, generating text that matches your style as closely as it can.

The results can be outrageous, bizarre, disturbing, or something else entirely, and can continue for several paragraphs while sticking to the same topic. They show how skilled computers are becoming at producing coherent-sounding text, even though they don’t understand the meaning behind the words they string together. They also highlight OpenAI’s goal of building AI that can be used for many different purposes, as well as the potential dangers of training AI on a vast collection of Internet text.

An AI-generated recipe for watermelon cookies looked like it might work, with ingredients including watermelon, an egg white, and half a cup of sugar. However, it didn't taste good.

The AI software, known as GPT-3, is accessed through what is called the OpenAI API. Both launched in June and are currently available only to a select group of companies and software developers (for now it’s free to use, but OpenAI eventually plans to charge for it). Despite its limited availability, it is getting a lot of attention across the tech community, particularly on Twitter, as data scientists, venture capitalists, and others post about its implications.
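
For the developers who do have access, the interaction is essentially prompt in, continuation out. Here is a minimal sketch of what such a call might look like, assuming the 2020-era openai Python client and a placeholder API key; the prompt text and parameters are illustrative, not taken from the article.

```python
# Minimal sketch of a GPT-3 completion request, assuming the 2020-era
# "openai" Python client and beta API access. The prompt and parameters
# are illustrative, not drawn from the article.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

response = openai.Completion.create(
    engine="davinci",  # the largest GPT-3 engine available at launch
    prompt="Watermelon Cookie Recipe\n\nIngredients:",
    max_tokens=200,    # how much text to generate
    temperature=0.7,   # higher values produce more surprising output
)

# The model's continuation of the prompt
print(response["choices"][0]["text"])
```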

GPT-3 was trained on a vastly larger dataset than its predecessor, GPT-2 (which itself stirred controversy in 2019, when OpenAI initially said it would not release the full model for fear that it was so good at composing text that it could be misused, for example, to create convincing but fake news articles). That dataset is huge: 570 gigabytes of text drawn from what is known as the Common Crawl dataset, which is made up of billions of web pages.

The OpenAI API is already being used for a variety of applications, from improving a text-based adventure game and helping build better customer service chatbots to letting people generate website code simply by describing what they want (like “a button for all the colors of the rainbow”). An online “playground”, essentially a window where you can type in a prompt you want the AI to expand on, such as “Peanut Butter Jelly Pie Recipe”, allows for simple experimentation. (OpenAI declined to give CNN Business access to it, but one developer showed us how it works.)
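
The website-design demos that circulated generally worked the same way: a plain-English description goes into the prompt, often preceded by a couple of worked examples so the model knows what format to imitate (so-called few-shot prompting). Below is a rough sketch of that pattern, again assuming the 2020-era openai client; the example prompt is invented for illustration and is not any developer’s actual demo code.

```python
# Sketch of the "describe it, get code" pattern using few-shot prompting:
# two invented description-to-HTML examples teach the model the format,
# then the real request follows. Assumes the 2020-era "openai" client.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

prompt = (
    "description: a large red heading that says Welcome\n"
    'code: <h1 style="color:red">Welcome</h1>\n'
    "\n"
    "description: a button for all the colors of the rainbow\n"
    "code:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=120,
    temperature=0.3,   # lower temperature keeps output close to the examples
    stop=["\n\n"],     # stop before the model invents another example
)

print(response["choices"][0]["text"])  # ideally, HTML for rainbow-colored buttons
```
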
Janelle Shane, an optics research scientist who runs the popular blog AI Weirdness, has prodded the AI into generating outlandish content, such as frequently inaccurate “facts” about horses and whales. These include claims that a horse “has two eyes on the outside and two eyes on the inside” and is “about five times the size of an elephant, three times the size of a cow, and almost the size of a giraffe.” The AI also determined that whales, “and especially baleen whales, are well known for their enormous size, but most types of whales are no bigger than an adult adult human.”

Shane, who has had access to the AI system for several months, sees GPT-3 as a leap forward in the coherence of the text that can be generated, going beyond GPT-2, which could already produce entire paragraphs that more or less stayed on topic, she said.

An AI-generated recipe for watermelon cookies was created with the help of GPT-3, a new AI system that can produce remarkably human-sounding text.

Still, it doesn’t know things that even a very young child would know, making it unlikely that it will replace most of us human writers anytime soon.

“It’s interesting to see how these very advanced language models can produce convincing technical text and code, but they will still flub the question of how many eyes a horse has,” she said.

However, it is extremely good at imitation. Joshua Schachter, who generated and tweeted the watermelon cookie recipe I tried, has been exploring how well the AI can mimic everything from fine art to job advertisements. Schachter, who founded the now-defunct social bookmarking site del.icio.us, has gotten the AI to produce drawing instructions in the style of conceptual artist Sol LeWitt (“a shape like an octagon with eight sides, all half a circle, connected by vertical line segments, each attached in a different place”), quirky cookie ideas (“Clam Cookies,” for example, which came with the claim, “Although these clams might not make you happy as a clam, this recipe is a delicious dessert”), and more.

He showed me, among other things, how it can even generate its own detailed recipe blog posts, complete with a meandering story at the top about how meaningful the recipe is to the writer. For example, given the prompts “Peanut Butter Jelly Pop-Tarts Recipe” and “When I was a little kid,” the AI followed up with, “I couldn’t wait for my school’s bake sale. They would sell different kinds of baked goods. And every year they sold these rare homemade Pop-Tarts. Of course, I would always be very excited and buy a package, just to grab a bite and be left with a face full of disappointment.”

“It’s a lot of fun to play with it,” said Schachter.

Of course, while this AI can produce human-sounding copy, it still has many shortcomings. Shreya Shankar, a machine learning engineer at Viaduct, which makes AI for vehicles, found that GPT-3 is very good at stitching together patterns of text, which makes sense given how much of the Internet it was trained on. However, as she and others have noted, the sheer volume of training data means it can also produce toxic results.

“If you memorize the entire Internet, for example, you memorize the good and the ugly,” she said.

And as I saw firsthand in my kitchen, just because it can generate text that sounds like it was written by a human being, like the recipe for watermelon cookies, doesn’t mean this AI really understands anything about cooking or baking. At one point, the recipe for watermelon cookies told me to add an egg white to a pot of sugar water and hot watermelon, resulting in watermelon scrambled eggs.
