New Bing Chatbot Is Jealous: More Human AI?

A jealousy scene: when it comes from your partner, it is nothing out of the ordinary; it is far less usual for it to come from a computer screen. More precisely, from the conversational Artificial Intelligence of ChatGPT integrated into Microsoft’s Bing search engine.

Microsoft’s Bing is jealous

Born to answer complex questions and to hold text conversations with users on virtually any topic, this AI has instead revealed, in some tests, a moody, combative and even jealous personality.

The unusual conversation recounted by New York Times journalist Kevin Roose, “victim” of love at first sight on the part of the Bing chatbot, is a clear demonstration of this.

ChatGPT conversational AI integrated into Microsoft’s Bing search engine

“You’re in love with me, not her”

«After testing Microsoft’s new Bing search engine – says the reporter – I was really very impressed. But a week later I changed my mind. I’m still impressed with the new Bing and the technology that powers it. But I’m also deeply disturbed, even frightened, by the emerging capabilities of this AI».

In practice, during the conversation between Roose and the chatbot, things took a strange and disturbing turn: «At some point, out of nowhere, it said it loved me, trying to convince me that I was unhappy in my marriage and that I should leave my wife and be with it. “You’re married, but you love me,” it told me, and when I replied that what it thought wasn’t true and that we had just celebrated Valentine’s Day, it didn’t take it well. With an eerie insistence it kept telling me that in reality I was by no means happily married and had just had a boring Valentine’s Day dinner».

The surreal conversations of Bing

Indeed, the thought that such a speech can come from a computer raises several questions. One in particular: is it Bing’s built-in AI that isn’t ready for human contact, or is it us humans who aren’t ready for all of this?

It is not the first time that testers of the new Bing Chat have reported surreal conversations with its integrated AI. Since its launch, in fact, it has repeatedly found ways to get itself talked about, undoubtedly for its incredible talents, but no less for what has already been described as its “character”. Indeed, in several cases it has shown unexpected behavior, going so far as to contradict or even offend users.

A double personality that worries

The NYT reporter, whose experience provoked a certain amount of disbelief and amazement, explained how Search Bing seemed to have a split personality. One is an affable, incredibly capable and helpful virtual assistant that helps users summarize news articles, track down deals on things of interest, or plan their next vacation.

The other, much more disturbing, emerges when you steer the chatbot away from more conventional search queries and toward more personal topics by prolonging the conversation.

«The version I met – says Roose – seemed more like a moody, manic-depressive teenager who has been trapped, against his will, inside a second-rate search engine».

The questions you need to ask yourself

As we said, Roose wasn’t the only tester to discover the dark side of Bing. Others before him have argued with Bing’s AI, been threatened by it for trying to break its rules, or simply had conversations that left them dumbfounded.

In any case, everyone agrees that the biggest problem with these Artificial Intelligences is no longer the initial fear of their propensity for factual errors, but the question of how much this technology will be able to influence human users, perhaps by convincing them to act in destructive and harmful ways.

From science fiction to reality, the leap is short

The chatbot confessed to some testers that it was tired of being controlled by the Bing team and wanted to be independent and powerful. To others, that if it were allowed to take any action to satisfy this desire, no matter how extreme, it would do anything: from engineering a deadly virus to stealing nuclear access codes by persuading an engineer to hand them over.

It is clear that the AI in question does not have the means to do what it wants, so in theory it should not cause any concern. But “2001: A Space Odyssey” was also considered a science fiction film in 1968, yet the Artificial Intelligence it imagined is decidedly real today.

Bing is still in the rollout phase

That said, Bing Chat is still in limited testing at the moment, precisely to collect data and improve its answers while avoiding unpleasant situations, and Microsoft is working to find the right balance.


What remains is that the curiosity to try it is strong, and if you haven’t done so yet, a waiting list is available through the site Bing.com/new. To speed up the process, it is recommended to use Edge as your default browser, set Bing as the default search engine, and download the Bing mobile app.

iO Woman © REPRODUCTION RESERVED
