
Users say Microsoft’s Bing chatbot gets defensive and testy

People testing Microsoft’s Bing chatbot — designed to be informative and conversational — say it has denied facts, and even the current year, in defensive exchanges

SAN FRANCISCO – Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Wednesday with tales of being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot.

The Bing chatbot was designed by Microsoft and the start-up OpenAI, which has been causing a sensation since the November launch of ChatGPT,…
