The AI research company OpenAI recently released the latest version of its GPT-3 general-purpose natural language processing model in private beta, and early testers have found its capabilities staggering.
GPT-3 is the third generation of OpenAI’s Generative Pre-trained Transformer, a general-purpose language model that uses machine learning to translate text, answer questions, and write text predictively. It works by analyzing a sequence of words or other text, then extending that sequence to produce entirely original output, such as a complete article.
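The core mechanism behind this is autoregressive prediction: at each step, the model looks at the text so far and predicts the most likely next token, then appends it and repeats. The following toy sketch illustrates that loop with a trivial bigram counter rather than a neural network; it is a conceptual illustration only, not how GPT-3 is actually implemented.

```python
from collections import defaultdict, Counter

def train_bigrams(text):
    """Count, for each word, which words followed it in the training text."""
    words = text.split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def complete(follows, prompt, max_new_words=10):
    """Autoregressively extend the prompt: at each step, append the word
    that most often followed the current last word during training."""
    words = prompt.split()
    for _ in range(max_new_words):
        options = follows.get(words[-1])
        if not options:
            break  # never saw this word during training; stop generating
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

corpus = "the model reads text and the model writes text and the model answers questions"
follows = train_bigrams(corpus)
print(complete(follows, "the model", max_new_words=3))
```

GPT-3 does the same kind of step-by-step continuation, but its next-token predictions come from a transformer network with billions of parameters instead of simple word-pair counts, which is what lets it sustain coherent paragraphs rather than short word chains.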
The algorithm’s predecessor, GPT-2, had already proven somewhat controversial because of its ability to create extremely realistic and coherent “fake news” articles from something as simple as an opening sentence. The potential for misuse was such that OpenAI initially declined to make the full model publicly available. Now, with the release of GPT-3, the model has become far more powerful.
After originally publishing its GPT-3 research in May, OpenAI gave selected members of the public access to the model last week through an API. And in recent days, a series of text samples generated by GPT-3 have begun to circulate widely on social media.
One of the most interesting examples comes from Founders Fund director Delian Asparouhov, a former partner at Khosla Ventures, who fed the GPT-3 algorithm half of an investment memo he had written and posted on his company’s website.
holy shit
I gave GPT-3 the first half of my Sword Health note that I have on my website …
And it actually generated some relatively coherent follow-on paragraphs … including a section on risk and long-term strategy
WE ARE SO FUCKED YOOOOOOOOO https://t.co/4HGQeya6pS pic.twitter.com/jzQwdbektd
– delian (@zebulgar) July 17, 2020
Asparouhov then went ahead and gave GPT-3 half of an essay on how to conduct effective board meetings:
Omfg, ok, so I fed GPT-3 the first half of my
“How to run an effective board meeting” post (first screenshot)
AND IT FUCKING WROTE A 3-STEP PROCESS ON HOW TO RECRUIT BOARD MEMBERS THAT I HONESTLY SHOULD NOW PUT ON MY DAMN SITE (second / third screenshot)
I’m losing my mind pic.twitter.com/BE3GUEVlfi
– delian (@zebulgar) July 17, 2020
In both examples, GPT-3 not only generated coherent additional paragraphs of text, but also followed the formatting of the preceding text so closely that the result was hardly distinguishable from the original human-written material.
GPT-3 is so good at what it does that it can fool people on almost any topic, even when that topic is GPT-3 itself. Take the example of Zeppelin Solutions GmbH Chief Technology Officer Manuel Araoz, who used GPT-3 to create a complex article about a bogus experiment on the popular Bitcointalk forum, using only a basic prompt as a guide.
The article, “OpenAI’s GPT-3 may be the most important thing since bitcoin,” describes how GPT-3 tricked members of the Bitcointalk forum into believing that its comments were genuine. At various points in the text, GPT-3 also lists possible use cases for language prediction models, noting that they could be used for “fake news,” investigative journalism, “publicity, politics, and propaganda.”
The text was nearly flawless; the only giveaways were a missing table and several screenshots that were referenced but absent. Araoz said the text was generated using just a title, a handful of tags, and this short summary:
“I share my first experiments with OpenAI’s new beta language prediction model (GPT-3). I explain why I think GPT-3 has disruptive potential comparable to that of blockchain technology.”
Araoz tested GPT-3 in several other ways, using it to make complex texts more understandable, to write Borges-style poetry in Spanish, and to compose music in ABC notation.
Another tester, Debuild.co founder Sharif Shameem, used GPT-3 to write JSX code from a basic description of a website design:
This is amazing.
With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you.
WHAT pic.twitter.com/w8JkrZO4lk
– Sharif Shameem (@sharifshameem) July 13, 2020
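Shameem has not published his exact setup, but demos like this typically work through few-shot prompting: the prompt shows the model a few description-to-code pairs and ends with a new description, and the model completes the pattern. The sketch below assembles such a prompt; the example pairs and the helper name are illustrative assumptions, not Shameem’s actual prompt.

```python
def build_layout_prompt(description):
    """Assemble a few-shot prompt: example description -> JSX pairs,
    ending with the new description left open for the model to complete."""
    examples = [
        ("a button that says Subscribe",
         "<button>Subscribe</button>"),
        ("a red heading that says Welcome",
         '<h1 style={{color: "red"}}>Welcome</h1>'),
    ]
    parts = []
    for desc, jsx in examples:
        parts.append(f"description: {desc}\nJSX: {jsx}")
    # The trailing "JSX:" invites the model to continue the pattern.
    parts.append(f"description: {description}\nJSX:")
    return "\n\n".join(parts)

prompt = build_layout_prompt("a green button that says Buy Now")
print(prompt)
```

Sent to a completion model such as GPT-3, a prompt like this would be continued with JSX matching the new description, which is the behavior Shameem’s screenshots show.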
GPT-3 appears to far exceed its predecessor’s capabilities, thanks in part to its more than 175 billion learning parameters, which allow it to perform virtually any task assigned to it. That makes it an order of magnitude larger than the second most powerful language model, Microsoft Corp.’s Turing-NLG algorithm, which has a mere 17 billion parameters.
OpenAI provides access to the GPT-3 API by invitation only, and there is a long waiting list for the paid version, which will launch in approximately two months.