
Microsoft acquires exclusive license for OpenAI's GPT-3, the largest AI language model

  • Microsoft is expanding its partnership with OpenAI, the AI research company co-founded by billionaire Elon Musk, as it acquires an exclusive license for GPT-3.
  • GPT-3 is the largest and most sophisticated AI language model on the market right now.
  • According to Microsoft CTO Kevin Scott, the company will be utilising the capabilities of GPT-3 in its own products and services.
Microsoft already had a partnership with OpenAI, the artificial intelligence (AI) research company co-founded by tech billionaire Elon Musk. Now, it's taking that collaboration a step further with an exclusive license for OpenAI's largest language model, GPT-3.

The tech behemoth co-founded by Bill Gates was already lending its Azure cloud computing resources to OpenAI for training its many models. In 2019, Microsoft made a $1 billion investment to become the company's exclusive cloud provider.

“While we’ll be hard at work utilizing the capabilities of GPT-3 in our own products, services and experiences to benefit our customers, we’ll also continue to work with OpenAI to keep looking forward,” said Microsoft’s CTO and executive vice president Kevin Scott in a blog post announcing the deal.

The new license, which gives Microsoft rights over GPT-3, is another signal of the company’s confidence in OpenAI’s research.

“The deal has no impact on continued access to the GPT-3 model through OpenAI’s API, and existing and future users of it will continue building applications with our API as usual,” said OpenAI.
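For context, "building applications with our API" meant sending prompts to OpenAI's hosted completion endpoint. The sketch below only assembles a request body of the kind a developer might construct; the endpoint path, engine name, and field names are assumptions based on OpenAI's public API documentation of the time, and no request is actually sent:

```python
import json

# Hypothetical completion request an API user might build.
# The engine name "davinci" and the URL below are assumptions for
# illustration; field names follow OpenAI's public completion API.
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

def build_completion_request(prompt, max_tokens=32, temperature=0.7):
    """Assemble the JSON body for a text-completion call (not sent here)."""
    return json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    })

body = build_completion_request("Microsoft's new license for GPT-3 means")
print(body)
```

In practice a developer would POST this body to the API with an authorization header; the deal meant that this access path stayed open even after Microsoft licensed the underlying model.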

What is GPT-3?
Released in July 2020, GPT-3 is the third iteration of OpenAI's language model series.

Despite ongoing concerns over the ethics of AI, GPT-3 has been lauded as the most sophisticated AI language model in the industry.


GPT-3 can suggest what you are likely to say next: trained on nearly 45 terabytes of text data, it uses 175 billion parameters to find patterns in that data and generate predictions. To give a sense of the scale at which GPT-3 operates, Wikipedia's 6 million articles make up only 0.6% of the entire training dataset.
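Taking the article's round figures at face value, a quick back-of-the-envelope check shows what that 0.6% share amounts to (these are the article's approximate numbers, not exact dataset sizes):

```python
# Article's figures: ~45 TB of training text, of which Wikipedia is ~0.6%.
total_training_bytes = 45e12   # ~45 terabytes, per the article
wikipedia_share = 0.006        # 0.6% of the dataset

wikipedia_bytes = total_training_bytes * wikipedia_share
print(f"Wikipedia's slice: ~{wikipedia_bytes / 1e9:.0f} GB of text")
# -> Wikipedia's slice: ~270 GB of text
```

In other words, by these numbers the entirety of Wikipedia would account for roughly 270 GB of the training text, dwarfed by the rest of the corpus.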

GPT version    Number of parameters
GPT-1          117 million
GPT-2          1.5 billion
GPT-3          175 billion
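"Looking at patterns in data to make suggestions" can be illustrated with a toy model. The sketch below is not how GPT-3 works internally (GPT-3 is a large transformer neural network); it is a deliberately tiny bigram counter, on a made-up corpus, that captures the same basic idea of predicting the next word from patterns observed in text:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words follow it in the text."""
    tokens = text.split()
    follow_counts = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        follow_counts[current][nxt] += 1
    return follow_counts

def suggest_next(model, word):
    """Suggest the most frequently observed follower of `word`."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Tiny invented corpus for illustration only.
corpus = "the cat sat on the mat and the cat chased the dog"
model = train_bigrams(corpus)
print(suggest_next(model, "the"))  # -> cat ("cat" follows "the" most often)
```

Where this toy model counts word pairs, GPT-3's 175 billion parameters encode far richer statistical patterns over whole passages, which is what lets it produce fluent multi-sentence suggestions rather than single-word guesses.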

