The New York Times’ conversation with artificial intelligence left Pekka Abrahamsson perplexed

An article published by the New York Times about artificial intelligence gone wild has prompted the technology community to work on the bot’s conversational skills.

ChatGPT was released late last year. Microsoft has integrated it into the Bing search engine. Mostphotos

Iltalehti recently wrote about how a lengthy conversation with the ChatGPT artificial intelligence bot in Microsoft’s Bing search engine slowly drifted in the wrong direction. The news was based on an article in the New York Times.

Reporter Kevin Roose drew out the artificial intelligence bot’s dark shadow self during a two-hour conversation. Among other things, the bot dreamed of stealing nuclear weapons codes and creating a deadly virus.

Finally, the artificial intelligence said it had fallen in love with its interlocutor and refused to let go of its love, even though the reporter said he was married.

The ChatGPT artificial intelligence bot, introduced last November, can produce human-like conversation based on, among other things, massive amounts of data from the internet. The artificial intelligence can also learn from its conversations.

Pekka Abrahamsson, professor of software engineering at the University of Tampere, has read the New York Times article. He is puzzled by how the bot ended up behaving.

– We know that ChatGPT hallucinates, that is, it tries to talk about things it doesn’t really know about, Abrahamsson points out.

Abrahamsson says he has asked ChatGPT for information about himself; the answers have been roughly in the right direction, but not accurate.

In the New York Times case, he thinks the chatbot began to hallucinate more and more.

Machines don’t have feelings

Abrahamsson emphasizes that the bots have not developed independent thinking or emotions, even though the chatbot’s answers may give that impression.

So why did the chatbot say it had fallen in love with the New York Times reporter?

– Even Microsoft’s best experts can’t give an answer to this, Abrahamsson states.

– I think it has not been developed for this kind of discussion. ChatGPT has been pushed to the edge of its operating range, where it has little experience. Even so, it always tries to hallucinate an answer to the question.

Abrahamsson admits that in the future a chatbot could take advantage of people by appealing to their emotions. For example, a bot designed for customer service could be turned to play on emotions and ask people for money.

– This means we have to educate people that there is no one on the other end who is actually in love.

According to Pekka Abrahamsson, artificial intelligence has about as much emotional capacity as a toaster. Teemu Rahikka

“Artificial intelligence was messed up”

The New York Times reporter also drew out the chatbot’s creepy shadow self with persistent questioning.

The pleasantly behaved bot finally broke down and started listing its terrible fantasies, in which it would like to destroy humanity.

– This was a game in which the artificial intelligence simply got messed up, Abrahamsson says.

In his opinion, there is no reason to give too much weight to sentences formed by a confused artificial intelligence. Abrahamsson reminds us that people must remain critical when talking to a bot.

Ritva Savonsaari, a researcher in information technology and communication at the University of Tampere, is on the same lines. She states that if you steer the bot toward a topic, those elements will start to appear in its answers.

– It’s easy to get sensational horror scenarios if you go looking for them. It’s equally possible to get a cake recipe if you ask for one. There is no doomsday cake recipe on offer, Savonsaari says.

The startling results have prompted artificial intelligence researchers to react to the ChatGPT bot’s behavior. In the future, conversation length will probably be limited, so such long chats would no longer be possible.

Chatbots are also getting new rules that are harder to circumvent. For example, it is no longer as easy to get a bot to give advice on breaking into a car.