GPT-3, the new language model that has left half the world speechless


"It is the most powerful artificial intelligence to date!" "It will mean the destruction of thousands of jobs", "It is a danger to humanity" ... Is this new neural network so impressive? Can we control that it does not end becoming our worst nightmare?

GPT-3 is already a reality. Its presentation has attracted all kinds of comments, both very positive and very negative, about the possibilities it offers and all that it could mean for the future of computing. However, its creators have asked us to stay calm and use common sense, as it is neither the eighth wonder of the world nor the apocalypse. But what exactly is GPT-3?

To understand why so much hype has been created around this new artificial intelligence system, it is necessary to understand what this new generator consists of and what it has proven to be capable of doing.

Given the possibility of trying it online, many people are "playing" with GPT-3 to see how far it can go. They have made it write poetry and play at being a programmer, a mathematician, or a tabloid journalist. It is easy to be amazed by its qualities, of which we discover something new every day, but what is behind all these advances?


What is GPT-3 and what it can do

OpenAI, the artificial intelligence company co-founded by Elon Musk, began to report on the arrival of GPT-3 in May, but it was not until a week ago that it surprised everyone with its release as a private beta. For those unfamiliar with artificial intelligence and its advances, we will try to explain as simply as possible what GPT-3 is and why it has attracted so much attention around the world.

GPT-3 is an artificial intelligence; more specifically, it is a machine learning model composed of algorithms trained to recognize patterns in data and learn through examples. In slightly more technical terms, it is a language model built on the Transformer architecture (the "T" in GPT stands for Transformer), which has replaced the recurrent networks, such as LSTMs (Long Short-Term Memory), that earlier text-processing systems relied on.

That is, this program analyzes the text or data we give it and then predicts the words and phrases most likely to follow; from there, the artificial intelligence can continue our sentence or answer our questions.
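The next-word idea can be sketched with a toy model. The bigram counter below is only an illustrative stand-in, assuming nothing about GPT-3's internals: where GPT-3 uses billions of learned parameters, this sketch just counts which word follows which in a sample text and extends a sentence greedily.

```python
# Toy sketch of next-word prediction: count which word follows which in a
# training text, then "continue" a sentence by repeatedly picking the most
# frequent successor. GPT-3 does this with 175 billion learned parameters
# instead of a simple count table, but the prediction loop is the same idea.
from collections import Counter, defaultdict

def train_bigrams(text):
    """For each word, count how often each other word follows it."""
    words = text.lower().split()
    followers = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1
    return followers

def continue_sentence(followers, start, max_words=5):
    """Greedily extend `start` with the most likely next word at each step."""
    out = start.lower().split()
    for _ in range(max_words):
        candidates = followers.get(out[-1])
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat slept on the mat"
model = train_bigrams(corpus)
print(continue_sentence(model, "the cat"))
```

Real language models replace the count table with a neural network that scores every possible next word given the whole preceding context, but the autoregressive loop, predict one word, append it, repeat, is the same.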

The size of GPT-3 is impressive: it has 175 billion parameters, although it is not the largest. Google's GShard, for example, was presented in June, later than GPT-3, and has 600 billion parameters. This is the direction these language models are currently heading in, and the change has taken place in a very short time.


To give you an idea of how fast this artificial intelligence has grown, the previous version, GPT-2, had only 1.5 billion parameters, and even then it was considered a major advance in the field of artificial intelligence.

OpenAI, the company responsible for this technology, has made its artificial intelligence even more "intelligent" by training it on a vast amount of text from the internet, including Wikipedia and many other pages. From all that text, it has learned to write, or rather to simulate English writing, and to express itself as we humans do on the Internet.

Its creators have made this neural network available through an API (we will explain the reason for this decision later) and have let people play with it. We have tried to access the GPT-3 API through the request form, but we have not received any response yet. However, those who have been granted access are tinkering with the API and finding plenty of surprises.
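For readers unfamiliar with API access, the sketch below shows roughly what a request to a completion endpoint of this kind could look like. The URL, field names, and parameter values are illustrative assumptions, not OpenAI's actual interface (which is defined by its own documentation), so the sketch only assembles the request body without sending it.

```python
import json

# Hypothetical sketch of a completion request to a GPT-3-style API.
# The endpoint and field names below are illustrative assumptions;
# the real interface is defined by the provider's documentation.
API_URL = "https://api.example.com/v1/completions"  # placeholder, not real

def build_completion_request(prompt, max_tokens=64, temperature=0.7):
    """Assemble the JSON body a client would POST to the endpoint."""
    return json.dumps({
        "prompt": prompt,          # the text the model should continue
        "max_tokens": max_tokens,  # how much text to generate
        "temperature": temperature # randomness of the continuation
    })

body = build_completion_request("Once upon a time")
print(body)
# Sending it would then be a single authenticated HTTP POST to API_URL.
```

The key design point such APIs share is that the client sends plain text and gets plain text back; all the intelligence stays on the server.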

Able to complete unfinished sentences

GPT-3 can, for example, fluently complete a text from nothing more than an initial sentence we give it, and it can adapt that text to different writing styles: writing in legal jargon as if it were a lawyer, adopting the style of a novelist, or producing a simple magazine article.


It is a programming machine (in any language)

Its ability to program is one of the surprises that has attracted the most attention. From a simple indication of what we want, we can ask it to generate code in Python, and it is not bad at all. All these capabilities, and many more, are being discovered through the uses the API is being put to; someone has even created a Dungeons and Dragons game with it.
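Code generation works through the same completion mechanism as everything else: you write a prompt whose natural continuation is code. The sketch below shows one hypothetical way such a prompt might be laid out; the example pair and formatting are assumptions for illustration, not a documented recipe.

```python
# Hypothetical prompt layout for code generation: show the model one
# description->code pair, then a new description, and let it continue.
# The demonstration pair below is an illustrative assumption.
EXAMPLE = (
    "# Description: return the square of a number\n"
    "def square(x):\n"
    "    return x * x\n"
)

def code_prompt(description):
    """Build a prompt whose natural continuation is a Python function."""
    return f"{EXAMPLE}\n# Description: {description}\ndef "

p = code_prompt("return the sum of a list")
print(p)
```

The trailing `def ` is the trick: the most plausible continuation of the prompt is the body of a Python function matching the description.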

Beyond the immense size these language models are reaching, the great leap they represent is that they are what we might call task-agnostic. Whereas before an artificial intelligence needed to be trained for a single task, programs like GPT-3 are now general-purpose NLP models capable of adapting to and performing (or imitating) a wide range of tasks they were never explicitly trained for.
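This task-agnostic behavior is usually elicited with "few-shot" prompts: the prompt itself demonstrates the task a couple of times, and the model picks up the pattern. The translation pairs and layout below are an illustrative assumption, not taken from OpenAI's documentation.

```python
# Sketch of the "few-shot" idea: the same general-purpose model handles a new
# task when the prompt demonstrates it. The English->French pairs here are an
# illustrative assumption used only to show the prompt shape.
def few_shot_prompt(examples, query):
    """Lay out demonstration pairs, then the query, leaving the answer blank."""
    lines = [f"English: {en}\nFrench: {fr}" for en, fr in examples]
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

demo = [("cheese", "fromage"), ("cat", "chat")]
print(few_shot_prompt(demo, "dog"))
```

Nothing in the model changes between tasks; swapping the demonstration pairs for question/answer pairs or summaries turns the same prompt scaffold into a different "application".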

It is only through trial and error that we humans come to understand what we have created, and that is exactly what is happening to the developers at OpenAI.
