Brussels asks Meta and TikTok for explanations over disinformation in the war between Israel and Hamas

The European Commission asked X (the social network formerly called Twitter) for explanations a week ago over the use of billionaire Elon Musk's platform to spread illegal content and disinformation in the European Union, and opened a formal investigation under the Digital Services Act. A week later, Brussels is now turning to two other large platforms, Meta and TikTok, asking them to also provide details on the measures they have taken to combat the spread of terrorist and violent content.

Brussels calls on both companies to provide more information on the measures they have taken to fulfill their obligations regarding risk assessments and mitigation measures to protect the integrity of elections, following the Hamas terrorist attacks across Israel, in particular regarding the dissemination and amplification of illegal content and disinformation. “Meta must provide the requested information to the Commission before October 25, 2023 for issues related to crisis response and before November 8, 2023 on protecting the integrity of elections,” the Commission says of Mark Zuckerberg’s company.

The European Commission sets the same deadlines for TikTok, although the Chinese company will also have to send information on the “provisions relating to the protection of minors online” it has adopted. Once the information has been received, Brussels will carry out an assessment and decide on “the next steps.” Among the possible options is the opening of a sanctioning procedure that could lead to fines in the case of “incorrect, incomplete or misleading information”. Furthermore, if a response is not received within the given period, both platforms could be subject to periodic penalty payments.

Digital Services Act

Following their designation as very large online platforms, Facebook, Instagram and TikTok are required to comply with the DSA’s rules, including the assessment and mitigation of risks related to the dissemination of illegal content, disinformation and any negative effects on the exercise of fundamental rights. A week ago it was the social network X that received the same request from the Commission’s services.

“The Digital Services Act (DSA) is here to protect both freedom of expression and our democracies, even in times of crisis. We have sent X a formal request for information, a first step in our investigation to determine compliance with the DSA,” said Internal Market Commissioner Thierry Breton. The request for information, as the EU executive explained at the time, arose from indications of the alleged dissemination of illegal content and disinformation, in particular terrorist and violent content and hate speech.

The EU executive then reminded Elon Musk’s company that, following the designation of X as a very large online platform, the network must comply with the entire set of provisions introduced by the regulation since the end of August 2023, including the assessment and mitigation of risks related to the dissemination of illegal content, disinformation and gender-based violence, as well as any negative effects on the exercise of fundamental rights, children’s rights, public safety and mental well-being.

In this particular case, the Commission’s services are investigating. Brussels gave the company until October 18 for questions related to the activation and operation of X’s crisis response protocol and until October 31 for the rest.
