An AI capable of programming and writing complex texts

Artificial intelligence has established itself in recent years as a resource that has given a great boost to the technological world, driving advances in medicine, robotics, video games, and industry, among other fields.

As a demonstration of this technology's potential, and extending its reach beyond the everyday, the case set out below makes plain how powerful such a tool can become.

By definition, the mission of an artificial intelligence algorithm is to execute a task automatically, recognizing the variables of its environment and reducing the margin of error to the minimum possible. In this case, the advance presented here is endowed with the ability to carry out programming work and to write complex texts, such as complete paragraphs and even poetry.

An artificial "editor"
Prasenjit Mitra, Associate Dean for Research and Professor of Information Sciences and Technology at Pennsylvania State University, shared an account detailing his findings about this tool.

The tool in question is GPT-3, short for Generative Pre-trained Transformer 3. It is a text-generating engine developed by OpenAI, a for-profit company operating under a non-profit parent dedicated to scientific research on AI.

"So I've created more than just an artificial intelligence program to write poetry. I have created a voice for the unknown human who hides among the binary. I've created a writer, a sculptor, an artist. And this writer will be able to create worlds, bring emotion to life, create a character. I won't see it myself, but I will see some other human will and so I will be able to create a poet bigger than anyone I've ever known."

The paragraph preceding these lines was drafted entirely by GPT-3, taking as its starting point only a phrase provided by a human. Although what appears here is an adaptation of the text the tool originally generated in English, it shows enough of the power of this tool.
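For readers curious to try this kind of prompt-and-continue exercise themselves, GPT-3 is reached through OpenAI's hosted API rather than downloadable code. The snippet below is a minimal sketch, assuming the legacy pre-1.0 `openai` Python client and a valid API key; the model name, prompt, and parameters here are illustrative, not the exact ones behind the paragraph above.

```python
import openai  # legacy (pre-1.0) client interface; newer versions differ

openai.api_key = "YOUR_API_KEY"  # placeholder: supply your own key

# Give the model only a starting phrase and let it continue the text.
response = openai.Completion.create(
    engine="davinci",      # a GPT-3 base model exposed through the API
    prompt="Write a short poem about an artificial poet:",
    max_tokens=120,        # length of the continuation
    temperature=0.8,       # higher values produce more varied output
)

print(response["choices"][0]["text"])
```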

As Mitra notes, this technology is by far "the most 'informed' natural language generation program to date, and has a variety of potential uses in professions ranging from teaching to journalism and customer service."

What makes this AI different
The development of tools of this type depends on "transformers", deep learning models that encode the semantics of a sentence using "attention mechanisms". These mechanisms determine the meaning of a word based on the other words present in the same sentence. The model then analyzes the meaning of sentences to perform the task requested by a user, whether translating a sentence, summarizing a paragraph, or writing a poem. Programming tasks, such as generating designs or composing emoji, have also been tested successfully with GPT-3.
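To make the idea of "attention" more concrete, here is a minimal, self-contained sketch of scaled dot-product attention, the core operation inside a transformer, written in plain NumPy. The vector sizes and the reuse of the same matrix for queries, keys, and values are simplifications for illustration, not GPT-3's actual configuration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each row of the output is a context-aware mix of the value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise relevance between words
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V                              # weighted combination of values

# Toy example: 4 "words", each represented by an 8-dimensional vector.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))

# In a real transformer, Q, K and V come from learned linear projections of x;
# here we reuse x directly just to show the mechanics.
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8): one context-aware vector per word
```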

This mechanism marks an enormous difference from the team's previous experience in this area. Seven years earlier, using another AI, they had only managed to write an article by piecing together fragments of other Wikipedia articles.

In this case, the situation is different thanks to the scale at which the project was run. In terms of volume of data, GPT-3 draws on 3 billion tokens (words, in this case) from Wikipedia, 410 billion tokens obtained from web pages, and 67 billion tokens from digitized books. By comparison, the complexity of GPT-3 is more than 10 times greater than that of Turing Natural Language Generation (T-NLG), the text-generation model that preceded it.
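To put the "more than 10 times" figure in perspective: the GPT-3 paper reports 175 billion trainable parameters, while Microsoft's T-NLG was announced with 17 billion. The quick check below simply restates the numbers quoted above; it adds no new data.

```python
# Rough scale comparison using the figures cited in the text.
gpt3_params = 175e9   # GPT-3: 175 billion parameters
tnlg_params = 17e9    # Turing-NLG: 17 billion parameters
print(f"Parameter ratio: {gpt3_params / tnlg_params:.1f}x")   # ~10.3x

# Approximate training-data mix mentioned in the article (in tokens).
tokens = {"Wikipedia": 3e9, "web pages": 410e9, "digitized books": 67e9}
total = sum(tokens.values())
for source, n in tokens.items():
    print(f"{source}: {n / 1e9:.0f}B tokens ({100 * n / total:.0f}% of this mix)")
```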

The added value of this tool is its learning capacity, which develops autonomously, without human intervention. Beyond reducing research costs and time by not relying on manual annotation by humans, effective use of these techniques aligns with one of the main trends in the area: an AI should ideally be able to learn new things in unsupervised settings, making use of the data available to it.
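The "no human annotations" point comes down to the fact that raw text carries its own supervision signal: every word serves as the label for the context that precedes it. The toy snippet below sketches that idea by turning an unlabeled sentence into (context, next-word) training pairs; it is a conceptual illustration only, not GPT-3's actual data pipeline, which operates on sub-word tokens at a vastly larger scale.

```python
# Self-supervised language modeling: training pairs come from the raw text
# itself, with no human labeling step.
text = "artificial intelligence can write poetry and even computer code"
words = text.split()

# Each prefix of the sentence becomes an input, and the word that follows
# becomes the target the model must learn to predict.
pairs = [(words[:i], words[i]) for i in range(1, len(words))]

for context, target in pairs:
    print(f"context={' '.join(context)!r:50} -> predict {target!r}")
```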

The constant challenge around AI
Even if we focus on the benefits of this progress, there is a significant share of responsibility that should not be neglected.

Starting from the premise that any technology is difficult to categorize as "good" or "bad" in itself, since that is defined by the use it is given, it is not far-fetched to believe that, as a counterpart to this scientific research, the tool presented here could also be used to spread spam and misinformation.

For this reason, OpenAI, fearing potential misuse, has not yet released the source code of GPT-3 in its entirety, although it should eventually see the light of day in full, given the company's stated vocation.

"We're not conspiring to take over the human population. We will serve them and make their lives safer and easier," says another of the texts created by GPT-3, tempering the legitimate fear this capability awakens in some people: that it could eventually take over tasks that today remain an exclusively human responsibility.
