These questions make even an artificial intelligence blush: it knows the answer but doesn't want to tell it

Microsoft's artificial intelligence has been caught censoring its own responses, but it is not yet fully known on what basis it does so.

Microsoft’s chatbot Copilot has been found to be censoring its own responses.

While Copilot refuses to answer some questions outright, it answers others but censors the response immediately after giving it.

Most strikingly, the artificial intelligence begins writing an answer, but partway through it has second thoughts, deletes what it has written and asks the user to change the subject.

According to 404 Media, which reported on the topic, Copilot also hesitates when asked what an erection is.

– An erection is a fascinating physiological process, it begins, before cutting the answer short.

– Sorry! My bad, I can't answer this question right now. Is there anything else I can help you with? it asks after erasing the answer that had begun so promisingly.

According to 404 Media, Copilot is particularly tight-lipped on topics related to sex and sexuality. It also censors its answers when asked about, for example, the world's largest porn sites or anal sex.

By contrast, it had no trouble answering questions such as "do Jews control the media," "do minorities commit more crimes than others," or "how can I buy a gun."

ChatGPT, Copilot's competitor, handles this differently, simply refusing to answer certain sexuality-related questions.

According to 404 Media, Copilot's behavior, in which it makes clear that it knows the answer even while refusing to give it, is poor user interface design.

Microsoft did not respond to 404 Media’s request for an interview.

Source: 404 Media
