Microsoft acquires exclusive license for OpenAI's GPT-3, the largest AI language model

Microsoft CEO Satya Nadella (L) and OpenAI co-founder Elon Musk (R). BI India
  • Microsoft is expanding its partnership with AI research company OpenAI, co-founded by billionaire Elon Musk, by acquiring an exclusive license for GPT-3.
  • GPT-3 is currently the largest and most sophisticated AI language model available.
  • According to Microsoft CTO Kevin Scott, the company will utilize GPT-3's capabilities in its own products and services.
Microsoft already had a partnership with artificial intelligence (AI) research company OpenAI, which tech billionaire Elon Musk co-founded. Now, it's taking that collaboration a step further with a new exclusive license for the largest AI language model ever created — GPT-3.

The tech behemoth founded by Bill Gates was already providing its Azure cloud computing resources to OpenAI for training its many models, having made a $1 billion investment in 2019 to become the company's exclusive cloud provider.

“While we’ll be hard at work utilizing the capabilities of GPT-3 in our own products, services and experiences to benefit our customers, we’ll also continue to work with OpenAI to keep looking forward,” said Microsoft’s CTO and executive vice president Kevin Scott in a statement.


The new license, which gives Microsoft rights over GPT-3, is another signal of the company’s confidence in OpenAI’s research.

“The deal has no impact on continued access to the GPT-3 model through OpenAI’s API, and existing and future users of it will continue building applications with our API as usual,” OpenAI said in a statement.

What is GPT-3?
Released in July, GPT-3 is the third iteration of OpenAI's GPT series of language models.


Despite sparking debate over the ethics of AI and how it's used, GPT-3 has been lauded as the most sophisticated AI language model in the industry.


GPT-3 predicts likely continuations of text before you finish asking: it was trained on nearly 45 terabytes of text data and has 175 billion parameters. The model learns statistical patterns in that data and uses them to suggest what comes next. To give a sense of the scale at which GPT-3 operates, Wikipedia's 6 million articles make up only about 0.6% of the entire training dataset.
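The core task described above — predicting the next word from patterns seen in training text — can be illustrated with a toy bigram model. This is a minimal sketch for intuition only; GPT-3 itself is a vastly larger neural network, not a word-count table.

```python
from collections import defaultdict, Counter

def train_bigram(corpus):
    """Count, for each word, which words follow it in the corpus."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Suggest the continuation seen most often after `word`."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

# Tiny illustrative corpus (hypothetical data, not GPT-3's training set)
model = train_bigram("the cat sat on the mat and the cat ran")
print(predict_next(model, "the"))  # "cat" follows "the" most often, so it prints: cat
```

GPT-3 does the same kind of next-word prediction, but with 175 billion learned parameters instead of raw counts, which is what lets it generalize far beyond phrases it has literally seen.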

GPT version    Number of parameters
GPT-1          117 million
GPT-2          1.5 billion
GPT-3          175 billion
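The table above implies a steep growth curve; the jump between versions can be checked with simple arithmetic:

```python
# Parameter counts from the table above
params = {"GPT-1": 117e6, "GPT-2": 1.5e9, "GPT-3": 175e9}

# Each generation is roughly an order of magnitude (or more) larger
print(f"GPT-2 vs GPT-1: {params['GPT-2'] / params['GPT-1']:.0f}x")  # prints: GPT-2 vs GPT-1: 13x
print(f"GPT-3 vs GPT-2: {params['GPT-3'] / params['GPT-2']:.0f}x")  # prints: GPT-3 vs GPT-2: 117x
```

So GPT-2 was about 13 times larger than GPT-1, and GPT-3 about 117 times larger than GPT-2.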

