GPT..! GPT..! GPT..! (GPT3)

Generative Pre-trained Transformer, popularly called GPT. Another drizzle in the world of Artificial Intelligence in May 2020. GPT3 is one of the most powerful language models ever designed, built by OpenAI, an artificial intelligence lab in San Francisco co-founded by Elon Musk in 2015. It can generate its own patterns of text that go even beyond human imagination, and can write parodies, songs, poems, and essays. Given only a few prompts, people say chatting with it feels very similar to chatting with a human. It not only translates between languages but does so with remarkable accuracy. Let's compare its size with earlier models:

  1. GPT2 -> 1.5 Billion Parameters

  2. NVIDIA's Megatron -> 8 Billion Parameters

  3. Microsoft's Turing NLG -> 17 Billion Parameters

  4. GPT3 -> 175 Billion Parameters
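To put that jump in perspective, here is a quick calculation from the parameter counts listed above:

```python
# Parameter counts (in billions) from the comparison above.
models = {
    "GPT2": 1.5,
    "NVIDIA Megatron": 8,
    "Microsoft Turing NLG": 17,
    "GPT3": 175,
}

# How many times larger GPT3 is than each of its predecessors.
for name, billions in models.items():
    print(f"{name}: {billions}B -> GPT3 is {175 / billions:.1f}x its size")
```

GPT3 is more than 100 times the size of GPT2 and still about 10 times the size of Turing NLG, the previous record holder.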

It stands above models like BERT, GPT2, RoBERTa, T5, and various other variants. It is undoubtedly a remarkable technical achievement, and it has significantly advanced the state of Natural Language Processing (NLP) with great creativity.

It takes only a few words to describe but much more to show.

1. CONVERSATION: It chats in a very interactive manner and never lets you feel bored!

It answers each and every question in a very disciplined manner.


2. CODE CREATION: It can write code and create any structure

Writing the code for a website or a design is difficult: it takes a lot of time, and one has to handle many errors before it gets completed. Let's look at how GPT3 performs it. One simply types, in plain language, what one wants to create. Suppose I need to create a website: I describe its features in ordinary sentences, and it produces the page itself, which often even looks better. It can create videos, songs, and much more.
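As a rough sketch of how this works in practice, the description is sent as a prompt to OpenAI's Completions API, which returns the generated code. The endpoint path and fields below follow OpenAI's public API of that era, but the prompt wording and the `build_request` helper are illustrative assumptions, and a real API key is required to actually call it:

```python
import json
import urllib.request

# OpenAI's completions endpoint for the "davinci" engine (2020-era API).
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

def build_request(description, max_tokens=256):
    """Package a plain-English description as a completion request.

    The prompt format here is a hypothetical example, not an official one.
    """
    return {
        "prompt": f"Description: {description}\nHTML:",
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }

def generate_html(description, api_key):
    """Send the request to the API; requires a valid OpenAI API key."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(description)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]

# Build (but do not send) a request for a simple landing page.
payload = build_request("a landing page with a large title and a sign-up button")
```

The key point is that the "program" here is just English text: the model fills in the HTML the same way it fills in the next words of a sentence.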

CODE CREATION

MUSIC CREATION

Experiment: Researchers conducted an experiment in which they gave it an input asking how to conduct a board meeting. Within a few seconds, it gave a 3-step process for conducting one.


The output it generates is language that it calculates to be a statistically likely response to the input it is given, based on whatever humans have published online.
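A toy model makes this idea of a "statistically likely response" concrete. The sketch below (a deliberately tiny bigram model over a made-up corpus, nothing like GPT3's real architecture) picks each next word based on how often it followed the previous word in the text it has seen:

```python
import random
from collections import defaultdict

# Tiny made-up "training corpus", split into words.
corpus = (
    "the model writes essays . the model writes poems . "
    "the model answers questions ."
).split()

# For each word, record every word that followed it in the corpus.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length=6):
    """Walk the bigram table, sampling a statistically likely next word."""
    words = [start]
    for _ in range(length):
        options = follows.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the"))
```

GPT3 works on the same statistical principle, only with 175 billion parameters and a corpus of much of the public internet instead of three sentences.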

It also has a limitation: it somewhat lacks true common sense. But it's not bad!


Sam Altman: The GPT-3 hype is way too much. It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out.

#artificialintelligence #machinelearning #aiml #models #aitechnology #advancedai #bigdata #languagemodel #GPT3 #humanimagination #nlp #naturallanguageprocessing #ai #ml #researchers #musiccreation #website #code #roberta #GPT2 #microsoftturingnlg #2020blog #codecreation #ainews #innovations #openai #BERT #datascience #dataanalytics #elonmusk #programming #computerscience #engineering #technews #google #microsoft #technology #crazzylearners

  • CREATED BY ANMOL VARSHNEY & PALAK GUPTA