Is the cure worse than the disease? A panel of experts that briefed the House of Representatives on Wednesday evening about a European bill to combat the spread of child pornography and digital grooming is unanimous: chat control goes much too far.
“An absurd idea for a fantastic surveillance system,” is how Bert Hubert, former AIVD member and former member of a supervisory committee on the intelligence services, described the controversial European bill. It is as if, said privacy expert and associate professor Jaap-Henk Hoepman (Radboud University Nijmegen), every home were fitted with a camera that switches on automatically when domestic violence is detected. “And that is a very big if, because how do you detect that reliably?”
While the spread of CSAM – Child Sexual Abuse Material, photos or videos in which minors are sexually exploited – and grooming are major problems, the European Union is taking the wrong path with this proposal, according to all four experts. That puts them in a growing group of critics: hundreds of leading scientists, European privacy watchdogs, the National Rapporteur on Human Trafficking and Sexual Violence against Children, and organizations fighting child abuse themselves, such as Offlimits, the organization behind the Dutch child pornography hotline. Last year the hotline took action against nine thousand images of minors.
The Dutch cabinet is also critical of large parts of the bill, but wants to “take another step against the spread of online child sexual abuse material,” outgoing Minister of Justice and Security Dilan Yesilgöz (VVD) wrote last summer in a letter to Parliament. A motion from the House of Representatives calling on the cabinet not to support EU chat-control legislation was set aside by Yesilgöz.
Not in the best interest of the children
For Offlimits, the organization behind the Dutch child pornography hotline, the current proposal goes too far, says director Robbert Hoving in a written statement. The bill is not in the interests of the children it claims to protect. “Because young people’s world is online, sexual experimentation such as sexting is part of that world too. Reading private communications therefore contributes to insecurity and amounts to a huge invasion of users’ privacy.”
According to European governments, more and more child pornographic material is being shared via chat services such as WhatsApp, Signal and Telegram. This message traffic is largely hidden from the view of investigative services, because the chats are exchanged privately and are strongly encrypted: not even the chat service itself can read them. Governments fear losing sight of criminals who hide behind encrypted communications.
The European Commission therefore presented a bill last year that forces chat services to let investigative services read along. ‘Communications services’ such as WhatsApp can receive a ‘detection order’, which obliges them to detect messages containing grooming or child pornographic material – in images, text or speech – and to report them to a competent EU authority.
Client-side scanning
The only way to read along with the messages without weakening the encryption by creating a special backdoor for law enforcement services is client-side scanning, explains researcher Jaap-Henk Hoepman. The chat service then reads the messages before they are encrypted, like a postman who takes note of the contents of a letter before it is sealed and sent.
“Technically speaking, client-side scanning does not touch the encryption,” says Hoepman. “But that is a very narrow interpretation: encryption is a means, not an end in itself. The goal is to protect the confidentiality of communications. With client-side scanning, someone simply reads along over your shoulder. I feel this bill really makes a fundamentally different kind of detection possible.”
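To make that point concrete, here is a minimal sketch, in Python, of the order of operations that client-side scanning implies. The scanner, the toy ‘encryption’ and the reporting step are placeholders invented for this illustration; they do not describe any real chat app or the exact mechanism in the bill.

```python
def scan_for_prohibited_content(plaintext: bytes):
    """Placeholder detector: under the proposal this would be an image, text or
    speech classifier, or a match against hashes of known material."""
    return None  # None means: nothing was detected


def toy_encrypt(plaintext: bytes) -> bytes:
    """Stand-in for real end-to-end encryption, purely for illustration."""
    return bytes(b ^ 0x5A for b in plaintext)


def report_to_authority(hit) -> None:
    """Stand-in for reporting a hit to the competent EU authority."""
    print("report sent:", hit)


def send_message(plaintext: bytes) -> bytes:
    hit = scan_for_prohibited_content(plaintext)  # 1. read the message *before* encryption
    if hit is not None:
        report_to_authority(hit)                  # 2. report anything that is flagged
    return toy_encrypt(plaintext)                 # 3. only then seal the 'letter' and send it


ciphertext = send_message("innocent holiday snap".encode())
```

The objection is visible in step 1: the message is inspected while it is still readable, no matter how strong the encryption applied in step 3.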
The other main point of criticism is the major role the European Commission envisages for artificial intelligence. The scale at which grooming and child pornographic material would have to be detected is enormous: WhatsApp has an estimated 2.8 billion users worldwide who send dozens of messages per day. Artificial intelligence would have to distinguish innocent photos of toddlers in the bath and flirty messages between teenagers on the one hand from grooming and child pornographic material on the other.
The technology is nowhere near that point, professor and internet security expert Michel van Eeten (TU Delft) warned MPs. “For detecting unknown child pornographic material and grooming, this proposal is actually madness.” Even the best algorithms, former AIVD member Bert Hubert predicted, will flood investigative services with false positives.
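A back-of-the-envelope calculation illustrates Hubert’s warning. The 2.8 billion users come from the article; the 20 messages per user per day and the 0.1 percent error rate are invented round numbers for the sake of the example, not figures from the hearing.

```python
users = 2_800_000_000                # from the article
messages_per_user_per_day = 20       # assumption ("dozens of messages per day")
false_positive_rate = 0.001          # assumption: the classifier is wrong 1 in 1,000 times

daily_messages = users * messages_per_user_per_day
daily_false_positives = daily_messages * false_positive_rate

print(f"{daily_messages:,} messages scanned per day")
print(f"{daily_false_positives:,.0f} innocent messages flagged per day")
# With these assumptions: 56,000,000,000 messages scanned and roughly
# 56 million wrongly flagged messages per day, each one a potential report.
```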
Van Eeten believes recognizing already known child pornographic material is more feasible, for example images that were seized during criminal investigations. In the Netherlands, such technology is already used successfully by many hosting providers, which scan and clean their servers on a large scale.
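Matching against already known images is a far more constrained task than detecting new material. The sketch below shows the idea using exact SHA-256 hashes; in practice, systems such as PhotoDNA use perceptual hashes that also survive resizing and re-compression. The hash value and directory name are made-up placeholders.

```python
import hashlib
from pathlib import Path

# Placeholder blacklist: SHA-256 hashes of previously seized, known images.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder value
}


def find_known_material(directory: str) -> list[Path]:
    """Return stored files whose SHA-256 hash appears on the blacklist."""
    hits = []
    for path in Path(directory).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_HASHES:
                hits.append(path)
    return hits


if __name__ == "__main__":
    for hit in find_known_material("/srv/uploads"):  # hypothetical upload directory
        print("known material found:", hit)
```

Exact hashing only catches byte-identical copies, which is why real deployments rely on perceptual hashing, but the principle of comparing against a fixed blacklist is the same.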
‘Foot in the door’
Outgoing minister Yesilgöz also thinks that detecting existing material on the basis of such a ‘blacklist’ can be done reliably enough. In negotiations with other countries, she is trying to steer the bill in that direction, she said at the end of June.
On Wednesday evening a new version of the bill leaked in which, according to broadcaster NOS, the detection of grooming has been scrapped. However, the possibility of also detecting new material with artificial intelligence remains in the law and can later be activated by the EU government leaders without the proposal having to pass through the European Parliament again. “The latest version of the proposal tries to keep a small foot in the door for detecting new material,” Van Eeten said on Wednesday evening.
Hubert: “This is not a law that needs a small adjustment: we are deciding, for five hundred million Europeans at once, to use those scanners. If we agree to this, we will be crossing a line we have never crossed before, namely that every European must be monitored.”