Straight from the official Microsoft Blog:
One of the most gratifying parts of my job at Microsoft is being able to witness and influence the intersection of technological progress and impact: harnessing the big trends in computing that have the opportunity to benefit everybody on the planet. Frank’s post this morning from Ignite shows just how much progress is happening in many of these areas.
Today, the foremost computing trend is undoubtedly artificial intelligence (AI). As we increasingly develop the ability to deploy huge AI models at scale in a way that can be leveraged by all developers and businesses, AI is becoming a platform – an environment upon which folks can build amazing new experiences, just like we’ve seen happen before with personal computers, mobile devices or the internet.
Getting this AI platform off the ground requires unprecedented computing horsepower. So, this May, we expanded upon our ongoing partnership with the world-leading AI research organization OpenAI to announce one of the world’s most powerful supercomputers – a custom-designed, Azure-hosted home for training OpenAI’s equally massive AI models.
What is GPT-3?
GPT-3 is an AI language model, meaning that (in very general terms) its objective is to predict what comes next based on the preceding text. It is like the kind of “autocomplete” we see in search engines like Google, but of course at a much higher level. You can, for example, write two or three sentences of an article and GPT-3 will take care of writing the rest of it. You can also generate conversations, and the answers will be based on the context of the preceding questions and answers.
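The “predict what comes next” idea can be sketched with a toy bigram model. This is a minimal illustration, not how GPT-3 actually works: the corpus, the `autocomplete` helper, and the greedy word-by-word strategy are all simplifications invented here, whereas GPT-3 uses a large neural network trained on vastly more data.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; a real model like GPT-3 learns from
# millions of web pages, books, and Wikipedia articles.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count, for each word, which words tend to follow it (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(prompt, n_words=3):
    """Greedily extend the prompt with the most frequent next word."""
    words = prompt.split()
    for _ in range(n_words):
        candidates = following.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(autocomplete("the"))  # → "the cat sat on"
```

The same principle scales up: instead of counting word pairs, GPT-3 learns a far richer statistical picture of language, which is why its continuations read like coherent paragraphs rather than word salad.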
It is important to understand that each response offered by GPT-3 is only one possibility; it is not the only one, and the same request can always produce a different or even contradictory response. The model returns answers by relating what has been said so far to everything it has learned, in order to produce the most meaningful possible continuation. And when what has been learned is millions of web pages, books and Wikipedia articles, the results are amazing.
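Why can the same prompt yield different answers? Because the model assigns probabilities to many possible next words and *samples* from them rather than always picking the single top choice. A minimal sketch of that idea, reusing the same toy bigram counts as above (again an invented simplification, not GPT-3’s real sampling machinery):

```python
import random
from collections import Counter, defaultdict

# Same hypothetical toy corpus and bigram counts as before.
corpus = "the cat sat on the mat the cat ate the fish".split()
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def complete(prompt, n_words=3, seed=None):
    """Extend the prompt by sampling each next word in proportion
    to how often it followed the previous word in the corpus."""
    rng = random.Random(seed)
    words = prompt.split()
    for _ in range(n_words):
        candidates = following.get(words[-1])
        if not candidates:
            break
        options, weights = zip(*candidates.items())
        words.append(rng.choices(options, weights=weights, k=1)[0])
    return " ".join(words)

# The same prompt can produce different continuations on different runs:
print(complete("the", seed=1))
print(complete("the", seed=2))
```

Each call rolls the dice again, which is exactly why GPT-3 can answer the same question two different ways.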