“Artificial intelligence could spell the end of humanity”: experts compare its dangers to pandemics and nuclear wars | Science

UPDATE. A group of business leaders and scientists called on Tuesday for “the risk of extinction from artificial intelligence” to be made a global priority, “alongside other societal risks such as pandemics and nuclear war.” A professor from KU Leuven also signed the statement.

“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.” The US-based Center for AI Safety published that short statement on Tuesday, signed by dozens of scientists and top executives of technology companies.

Experts from Microsoft and Google, among others, signed the warning, as did Sam Altman, CEO of OpenAI, the company behind the AI chatbot ChatGPT. The 37-year-old prodigy behind ChatGPT has been warning about his own creation for some time now. According to him, things could go “seriously wrong” if the new technology is not regulated. Twitter boss Elon Musk, who has himself invested heavily in the technology, also previously warned that AI “could destroy civilization”.


Among the signatories is the Belgian professor Yves Moreau of KU Leuven, who specializes in the use of artificial intelligence in genetics. Moreau believes that AI development should be slowed down for a while, because it is not yet clear what consequences this technological progress will have for humans.

“We have seen a powerful technological breakthrough in recent months, the risks and consequences of which we cannot yet assess,” explains Moreau. “It has come as a shock, even to AI specialists.” The AI models turned out to have far greater capabilities than expected. It is still unclear what the consequences could be, Moreau emphasizes, for example for the spread of disinformation or for the nature of people’s jobs.

Moreau advocates putting on the brakes and taking measures. “Regulation is an important step, but perhaps insufficient. There is a need for a culture change among the people who develop technology. Right now they do it because it’s cool, or because there’s money to be made. We should think more about for whom, and how, we make technology. Researchers must take more responsibility.”
