Microsoft is testing a new version of ChatGPT designed specifically for industries whose data security is of utmost importance.
A report by The Information claims that Microsoft, which holds the rights to resell ChatGPT technology following its multibillion-dollar investment in OpenAI, hopes a new, more secure version will appeal to industries that have so far been reluctant to share their information with the AI writer.
An announcement could come as soon as “later this quarter,” making the business version available to banks, medical establishments, and more.
ChatGPT for banks, healthcare, and more
The plan is for the company to run this version on different servers from those used for the ChatGPT instances plugged into Microsoft’s other products, including Edge and Bing, Microsoft 365, and LinkedIn.
Doing this would keep sensitive data away from public-facing language models, preventing it from being shared outside of organizations.
Despite the company’s apparent plans to democratize AI and put it in the hands of as many people as possible for as little as possible, running dedicated servers would likely incur much greater costs; the report suggests as much as “10 times what customers currently pay to use the regular version of ChatGPT.”
So far, no other company has come close to offering such a tool, though OpenAI has indicated that it is considering its own business version.
Google – whose Bard chatbot received some negative press after an inaccuracy in the results shown during its launch event – has not announced any such plans yet. Even so, the AI race between Google and Microsoft has been a tit-for-tat battle, with each company mirroring the other’s offers in reasonably short order.
The post Microsoft is testing a private business version of ChatGPT to prevent any data leaks first appeared on www.techradar.com