From bgr.com

Microsoft spent a ton of money to make ChatGPT possible.

As reported by Bloomberg, when Microsoft invested $1 billion into OpenAI, part of that money was spent on building a supercomputer to power what we now know as ChatGPT. According to the report, the company connected tens of thousands of Nvidia’s A100 graphics cards in order to enable the processing power that ChatGPT needed, a move that cost somewhere in the range of “several hundred million dollars.”

Nidhi Chappell, Microsoft general manager of Azure AI infrastructure, said that ChatGPT is just the beginning and that there will be many more models that come out of the project:

“We built a system architecture that could operate and be reliable at a very large scale. That’s what resulted in ChatGPT being possible. That’s one model that came out of it. There’s going to be many, many others.”

Scott Guthrie, the Microsoft executive vice president responsible for cloud and AI, says that although ChatGPT is the supercomputer’s most popular use case so far, the system can be adapted to many others:

“We didn’t build them a custom thing — it started off as a custom thing, but we always built it in a way to generalize it so that anyone that wants to train a large language model can leverage the same improvements. That’s really helped us become a better cloud for AI broadly.”

Guthrie also teased that the model everyone is using right now runs on a supercomputer that is now a couple of years old. The team is already training its next-generation supercomputer, which the executive says “is much bigger and will enable even more sophistication.”

Microsoft is hosting another AI event on March 16th where it and OpenAI are expected to unveil GPT-4, the next generation of the technology that powers ChatGPT. According to a recent report, GPT-4 will support not only text but also audio, video, and images as input, unlocking even more capability for AI.

The post Microsoft spent ‘several hundred million dollars’ to build a ChatGPT supercomputer first appeared on bgr.com
