GPT-3 demystified – scary AI or overrated text generator?


GPT-3’s ability to write code, letters, and even novels, often from a prompt of just one or a few words, is downright frightening. But there are two things you need to know:

1. It has no idea what you are saying (or typing).

2. It has no idea what its output means.

It’s just math: trained neural networks.

Before digging into GPT-3, a quick review of NLP is in order:

By now everyone knows conversational assistants like Siri, Alexa or Cortana. They are built on natural language processing (NLP).

There are a few sub-disciplines in NLP, such as:

  • Optical Character Recognition – Conversion of written or printed text into data.
  • Speech Recognition – Converting spoken words into data or commands to follow.
  • Machine Translation – Converting your spoken or written language into another person’s language and vice versa.
  • Natural Language Generation – The machine producing meaningful speech in your language.
  • Sentiment Analysis – Determination of emotions expressed through language.

For NLP-enhanced business analytics, the conversation might be: “Download the latest pricing analysis to my phone.” The key thing to remember is that the computer does not understand what you are saying. It can process and respond, but make no mistake – it’s all done with math.

Organizations that offer NLP capabilities don’t have to start from scratch. There are open source Python libraries that software can integrate with, such as spaCy, textacy, or neuralcoref, and a few in other languages such as CoreNLP in Java. John Snow Labs develops and maintains an open source NLP library, Spark NLP.

The steps a natural language processor takes to answer your question (a short spaCy sketch after the list illustrates most of them):

  1. Sentence segmentation: split the text into sentences.
  2. Word tokenization: split each sentence into words (tokens).
  3. Predict the part of speech for each token: feed the token, along with some surrounding tokens for context, into a trained part-of-speech classifier.
  4. Lemmatization: reduce each word and its inflections to its base form.
  5. Identify the “stop” words (such as “a,” “an,” “the,” …) and filter them out.
  6. Dependency parsing: determine how each word relates grammatically to the others in the sentence.
  7. Find noun phrases: group the words that together refer to a single thing.
  8. NER (Named Entity Recognition): detect and tag mentions of real-world entities – names of people, companies, locations, dates and times, sums of money, events, etc.
  9. Coreference resolution: work out which entity words like pronouns or “that” refer back to.
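Here is a minimal sketch of most of these steps using spaCy, one of the open source libraries mentioned above (the example sentence, model name, and printed fields are my own illustrative choices, not from the article):

```python
# Minimal sketch of an NLP pipeline with spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Download the latest pricing analysis to my phone.")

# Steps 1-2: sentence segmentation and word tokenization
for sent in doc.sents:
    print([token.text for token in sent])

# Steps 3-6: part of speech, lemma, stop-word flag, and dependency label per token
for token in doc:
    print(token.text, token.pos_, token.lemma_, token.is_stop, token.dep_)

# Step 7: noun phrases (spaCy calls them noun chunks)
print([chunk.text for chunk in doc.noun_chunks])

# Step 8: named entity recognition
print([(ent.text, ent.label_) for ent in doc.ents])

# Step 9 (coreference resolution) is not built in; add-ons such as neuralcoref provide it.
```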

The above steps are used to understand your written, typed, spoken or even machine-generated request. The underlying implementation of the technology is machine learning, usually various types of neural networks.

A bit of background on NLP models

Google developed the BERT model, trained with 340 million parameters on a large corpus of books and Wikipedia. It was designed to handle simple question-and-answer queries, and its accuracy was quite good. Facebook and Microsoft have since developed models based on BERT, such as Facebook’s RoBERTa and Microsoft’s CodeBERT. The industry concluded that larger natural language models improve accuracy. Microsoft, through its Project Turing, launched Turing Natural Language Generation (T-NLG), a 17-billion-parameter model that was the most massive ever trained as of the start of 2020.
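To get a feel for what a BERT-style question-and-answer model does, here is a minimal sketch using the Hugging Face transformers library (the library, the default model it downloads, and the example text are my assumptions; the article itself names no code):

```python
# Minimal extractive question-answering sketch with a BERT-family model,
# via the Hugging Face transformers pipeline (assumed here, not named in the article).
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default pretrained QA model

context = (
    "Microsoft's Turing Natural Language Generation (T-NLG) model was "
    "announced in early 2020 with 17 billion parameters."
)
result = qa(question="How many parameters does T-NLG have?", context=context)
print(result["answer"], result["score"])  # the extracted answer span and a confidence score
```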

About six months later, OpenAI launched the beta version of GPT-3, based on an unprecedented 175 billion parameters. To put that into perspective, the training data behind those parameters drew on almost the entire public internet.

For all of you, like me, who have always found SQL tedious and nuanced (I never get it right the first time), GPT-3 can accept as input “Show total number of employees in HR department” and instantly construct a well-formed SQL statement: SELECT COUNT(*) FROM Emp WHERE dept_id = ‘HR’.
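For readers curious what such a call looks like in practice, here is a rough sketch against OpenAI’s beta completions API as it existed around this time (the prompt wording, engine name, and parameters are my own illustrative choices, and an API key is required):

```python
# Rough sketch: asking GPT-3 for SQL via OpenAI's beta completions API (circa 2020).
# Assumes: pip install openai, and an API key in the OPENAI_API_KEY environment variable.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = (
    "Translate the request into SQL.\n"
    "Request: Show total number of employees in HR department\n"
    "SQL:"
)

response = openai.Completion.create(
    engine="davinci",   # one of the beta-era GPT-3 engines
    prompt=prompt,
    max_tokens=64,
    temperature=0,
    stop=["\n"],
)
print(response["choices"][0]["text"].strip())
```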

Don’t have time to write an email? GPT-3 can help you:

“Write an email from Neil to Mark at abchotels to book a suite in Menlo Park for three nights with my usual preferences”

To Mark at abc hotels, from Neil Raden, subject: Reservation request at Menlo Park location.

Greetings Mark. I am writing to request a reservation for a suite at your Menlo Park location for three nights. I am currently in the bay area on business and would like to make the reservation for April 1, 2, and 3. I would also like to request a room with a view of the Golden Gate Bridge (OK, I made that up. You can’t see the Golden Gate Bridge from Menlo Park, I don’t think). Please let me know if you have availability, and I’ll be happy to provide my credit card information over the phone. Thank you.

Training data for GPT-3 is controversial. Much of it is openly available, such as “tweets” from Twitter, a repository known for its violent, racist and misogynistic language. Although the model works remarkably well, researchers are concerned that it poses a serious misinformation threat: bad actors can use it to create endless amounts of fake news, spread disinformation, and more.


My take

OpenAI is already working on GPT-4, which is said to have ONE HUNDRED TRILLION parameters. This approach has its detractors. Stuart Russell, Berkeley computer science professor and AI pioneer, argues that “focusing on raw computing power completely misses the point… We don’t know how to make a machine truly intelligent – even if it were the size of the universe.” There is another issue: GPT-3 cost about $4.6 million in compute to train, which would put the price of training GPT-4 at around $8.6 billion. It is hard not to conclude that these monstrous models are getting out of control.

There is also another problem. Sam Altman believes that each iteration of GPT will come closer to the inevitable AGI (Artificial General Intelligence), but there is an equally believable case that this is a fallacy. “Why AI is harder than we think” is the title of a recent article by Melanie Mitchell of the Santa Fe Institute. Her assertion is that the prevailing attitude, and certainly that of OpenAI, is that narrow intelligence is on a continuum with general intelligence. Mitchell, however, argues that advances in narrow AI are not “first steps” toward AGI, because they still lack common-sense knowledge.

The implication is that the path to truly thinking machines is not through ever bigger computers, but through better theories leading to better and more economical algorithms. This fits perfectly with my background in topology, where I had a professor who wouldn’t accept a proof longer than two pages.
