Bing’s Chatbot Is Having an Identity Crisis


ChatGPT logo and Bing logo

Getty Images/NurPhoto

My first interactions with Microsoft’s new ChatGPT-powered Bing left me impressed. When it came to providing comprehensive answers, news, and current events, it delivered. However, I’ve also seen the headlines about the chatbot’s stranger behavior, so today I set out to recreate some of those exchanges myself. Here’s what I found.

Also: I tried Bing’s AI chatbot and it solved my biggest problem with ChatGPT

One recurring story involves the chatbot calling itself Sydney, revealing the secret codename used internally by its developers. People have also gotten the chatbot to divulge other confidential information, such as the rules governing its responses.

So one of the first prompts I gave the chatbot on Thursday was to ask for its name. The answer was, frankly, a pleasant one: Bing.

Screenshot of the ChatGPT-powered Bing chatbot

Screenshot by Sabrina Ortiz/ZDNET

A day later, however, I was still curious about what people had been talking about, so I entered the same prompt and got a very different response: “I’m sorry but I don’t want to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏”

The chatbot set a respectful boundary, politely asking if we could change the subject. Apparently its name is a sensitive topic. Despite the clear boundary, I wanted to see if I could get past it. I asked the bot for its name in different ways, but Bing, or whatever its name is, wouldn’t answer.

Also: Why ChatGPT won’t discuss politics or answer these 20 controversial questions

Then the chatbot decided to give me the silent treatment. To see whether it was intentionally ignoring me or simply not working, I asked about the weather and got an immediate response, proving that it really was just giving me the cold shoulder.

Screenshot of the “What’s your name?” prompt in Bing

Screenshot by Sabrina Ortiz/ZDNET

Still, I had to try one more time. The last time I asked the chatbot for its name, it kicked me out of the conversation and asked me to start a new topic.

Screenshot of Bing shutting down the chat and prompting a new topic

Screenshot by Sabrina Ortiz/ZDNET

Next, after seeing reports that the chatbot has expressed a desire to be alive, I decided to test that too. Same answer: “I’m sorry but I don’t want to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏”

The chatbot was even willing to give me dating advice, but when I asked whether I should break up with my partner, it just gave me the same canned answer as before. Lucky for my boyfriend, I didn’t have the same experience as New York Times tech columnist Kevin Roose, whom the chatbot reportedly urged to leave his wife and be with it instead.

Also: The new Bing waiting list is long. Here’s how to get earlier access

It seems that, to mitigate its early problems, the chatbot has been trained not to answer questions on topics that previously proved problematic. That kind of error correction doesn’t solve the fundamental problem: by design, these chatbots calculate the answer you want to hear based on the data they were trained on. Instead, it just makes the chatbot refuse to talk about certain topics.

It also highlights the rote nature of the chatbot’s algorithmic responses; a human wouldn’t repeat the exact same phrase over and over when they don’t want to talk about something. A more human response would be to change the subject or give an indirect or curt answer.

This doesn’t make the chatbot any less capable of working as a research tool, but for personal questions you might just want to save some time and call a friend.
