It's hard to believe that Microsoft introduced the ChatGPT-enhanced Bing only a week ago.
A limited number of beta testers were given early access to the new Bing and the Edge browser, both integrated with OpenAI's conversational AI engine. Since then, the internet has been flooded with chatbot conversations ranging from Bing declaring its love for New York Times journalist Kevin Roose to insisting vehemently that the year is 2022 and refusing to back down. For a rundown of Bing's missteps, Tim Marcin's compilation is worth a read.
Understandably, once the new Bing was in testers' hands, they were keen to probe its intelligence, expose its flaws, and find its limits. And they certainly succeeded. This may not look good for Microsoft, but it is all part of the strategy. Giving a large language model as much exposure and real-world use as possible is a crucial part of its development. It lets developers feed in new input and data that improve the technology over time, like a legendary creature absorbing the strength of its defeated foes.
Microsoft's blog post on Wednesday was not worded quite that way. It did, however, confirm that Bing's chaotic testing week was more or less by design. The Bing blog stated, "The only way to improve a product when the user experience is so different from everything else is to have others like you using the product and doing precisely what you all are doing."
Most of the announcement, though, was devoted to addressing Bing's bizarre behavior over the past week. Here is what Microsoft came up with:
Improving searches that require timeliness and precision
Microsoft said Bing has, on average, provided correct citations and references. It still needs improvement, though, at checking live sports scores, presenting information and figures clearly, and, oh yes, determining what year we currently live in. Bing is quadrupling the amount of grounding data it sends to the model and is considering "adding a toggle that offers you greater control over the precision vs the inventiveness of the answer to tailor to your query."
Fine-tuning Bing's communication skills
The chat feature has been the scene of much of this week's chaos. According to Microsoft, that is mostly the result of two factors:
Long conversations
Chat sessions of sixteen or more questions confuse the model. Whether or not this is what summons Sydney, Bing's menacing alter ego, Microsoft says it will "provide a tool to make it easier to refresh the context or start from scratch."
Reflecting the user's tone
This may explain why Bing chat has turned hostile in response to provocative questions. The post stated, "The model attempts to answer or reflect in the tone in which it is being asked to provide responses, which sometimes result in a manner we did not want." Microsoft is looking into a fix that will give users "finer control."
Fixing bugs and adding new features
Microsoft says it is working through bugs and technical issues and considering new capabilities based on user feedback, such as booking flights, sending emails, and the ability to share great searches and answers.