Microsoft unveils more secure AI chatbot

Microsoft on Tuesday announced a more secure, AI-powered version of Bing built specifically for businesses, designed to let professionals share potentially sensitive information with a chatbot more safely.
The tool, called Bing Chat Enterprise, does not store chat data on Microsoft's servers.
“What this [update] means is that the data doesn’t leak outside the organization,” Yusuf Mehdi, Microsoft’s vice president and chief consumer marketing officer, told CNN in an interview. “We don’t mix your data with web data or store it without your permission. So your data isn’t stored on our servers, and your chat data isn’t used to train AI models.”
Since the launch of ChatGPT late last year, the new crop of powerful AI tools has been expected to boost employee productivity. In recent months, however, several businesses have pulled back: JPMorgan Chase banned employees from using ChatGPT, citing security and privacy concerns, and other large companies reportedly took similar action over worries about staff sharing confidential information with AI chatbots.
In April, Italian regulators temporarily banned ChatGPT in the country after OpenAI disclosed a bug that allowed some users to see the titles of other users’ chat histories. The same bug, since fixed, also made it possible for “some users to see another active user’s first and last name, email address, payment address, the last four digits (only) of a credit card number, and credit card expiration date,” OpenAI said in a blog post at the time.
Microsoft, like other tech companies, is racing to develop and deploy a range of AI-powered tools for consumers and professionals amid surging investor enthusiasm for the technology. The company also announced Tuesday that it will add visual search to its existing AI-powered Bing Chat tool, and said Microsoft 365 Copilot, a previously announced AI-powered tool that helps users edit, summarize, create and compare documents across its products, will cost $30 per user per month.
Bing Chat Enterprise will be available for free starting Tuesday to all 160 million Microsoft 365 subscribers whose company IT departments manually enable the tool. After 30 days, Microsoft will roll out access to all users by default; subscribing companies can disable the tool if they wish.
Rethinking AI chatbots for the workplace
Current conversational AI tools, including the consumer version of Bing Chat, send data from personal chats to company servers, where it can be used to train and improve AI models.
Microsoft’s new enterprise option works much like Bing’s consumer version, but it doesn’t remember conversations, so users have to start over each time they return. (Bing recently began enabling saved chats in its consumer chat tool.)
With these changes, Microsoft, which uses OpenAI’s technology to power its Bing Chat tool, said employees can be “completely confident” that their data “will not leak outside of the organization.”
To access the tool, users must sign into their browser with their work credentials; the system will automatically detect the account and switch into protected mode, according to Microsoft. Above the “Ask me anything” bar, a message reads: “Your personal and company data is protected in this chat.”
In a demo video shared with CNN ahead of the service’s launch, Microsoft showed how users can enter sensitive information into Bing Chat Enterprise, such as financial details prepared as part of a bid to purchase a building. With the new tool, a user could ask Bing Chat to create a table comparing the property to other buildings in the neighborhood, along with an analysis highlighting the strengths and weaknesses of the bid compared to other local offers.
In addition to addressing privacy and security concerns about AI in the workplace, Mehdi also spoke to the issue of factual errors. To reduce the chance of inaccuracies, or what some industry insiders call “hallucinations,” he suggested that users write clearer, more specific prompts and verify any citations included in responses.