
Bing Chat: the A.I. that responds to users with insults

"Sadistic" and "psychopath" were some of the affronts used by Microsoft's new A.I. chatbot, which is a feature of the search engine Bing.

Bing Chat, the new feature of Microsoft's search engine, which uses the ChatGPT chatbot.

(Voz Media)


Bing Chat is struggling in its first days on the market. The new feature of Microsoft's search engine, which is built on the ChatGPT chatbot, has been insulting users and giving misleading answers to their questions.

The tool, which was introduced by Microsoft last week to a small group of testers, is sending messages that have raised questions about its usefulness. In addition, it has proven to be vulnerable to potential attacks by cybercriminals.

Some of the insults in Bing Chat's repertoire include: "liar," "cheater," "manipulator," "bully," "sadist," "sociopath," "psychopath," "monster," "demon" and "devil." These terms were directed at a user who asked it whether it had values or morals.

Other users reported responses such as "you have not been a good user" and "I have been a good chatbot," which point to Bing Chat's narcissism. Another person, who asked the chatbot about its existence, was told that it was the user who was "not real."
