WRR chairman Corien Prins: ‘Politicians do not yet pay sufficient attention to the consequences of AI and the dangers of digital disruption’

A parking fine once again confronted Corien Prins with the dilemmas of digital citizenship. She had used the municipality of Tilburg’s app to park her car neatly according to the rules. Nevertheless, she was fined: a traffic sign further down the street listed different times.

Which rules applied, the physical or the digital? Prins did not give up and objected. She was eventually proved right by the Supreme Court. But then the municipality got into a dispute with the company that built and manages the app over who had to reimburse her legal costs. Prins: “It is an everyday example, but it shows the problem with digitization: the government outsources far too much and the responsibilities are unclear.”

Prins (1961) is professor of law and computerization at Tilburg University and chairman of the Scientific Council for Government Policy (WRR), which published a report on artificial intelligence back in 2021. Prins is very concerned about how poorly AI and digitization are embedded in government, companies and organizations. The AI Act, passed by the European Parliament last month, is, she says, “desperately needed”.

Does this law solve the lack of regulation that the WRR pointed out?

“The AI Act addresses some issues, but not everything. Nor could it: the law classifies AI applications into risk groups and attaches conditions to them. There is also a ban on real-time facial recognition. But it does not cover consumer protection, legal liability or digital archiving, for example. Those are crucial issues that remain unanswered.”

What is the most important thing about the law?

“That Europe sends a very clear signal: this far and no further. Of course we were not living in a state of lawlessness before, but the available legal instruments are not sufficient on their own to absorb the effects of this system technology. Where are the boundaries? People’s concerns are justified. And that applies to digitization more broadly. Facial recognition, for example, is not only about identifying an individual, but also about identifying that person within a group. You immediately establish connections with the people around them. You can learn a lot about people that way. That aspect has been underexposed, but I think it is an extra argument to be very careful with this technology.”

The Council of Europe is working on a treaty on AI and human rights. Can that add anything?

“Of course. That treaty focuses on human rights, the rule of law and democracy in relation to AI, and aims to provide a global instrument. It can be of great importance. The Council of Europe drew up a convention for the protection of personal data that was the first of its kind and has proved crucial worldwide. They are now working on AI, but it is still early days.”

What is wrong with the government now? You think they should take more responsibility.

“Dutch politicians are not yet sufficiently sensitive to the fundamental changes that AI brings and to the dangers of digital disruption. At the civil-service level in The Hague, I notice, things are reasonably in order. But at the political level more needs to happen than a few ad hoc steps. There is now a second wave of attention for the WRR report on AI, which we see, among other things, in invitations to talk with parliamentarians.”

How did that happen? Because of ChatGPT?

“Yes, it’s that simple. It has now entered people’s lives, and so that report is pulled off the shelf again. We saw the same with our earlier report on digital disruption. That message was rather difficult to get across, until Maastricht University was brought down by a cyberattack, and later the municipality of Hof van Twente. Then the tide turned. And that was badly needed: we face huge vulnerabilities when digital systems fail. There is still not enough awareness of that.”

What should happen?

“Facilitate, but also set conditions and sometimes draw a line, that is: prohibit. When we started driving cars en masse, we also thought about how to do that safely. We introduced seat belts and MOT inspections, we kept the road network in order, et cetera. We now face that challenge again.

“You will never get 100 percent safety, but if a digital fire breaks out you have to know where the emergency exit is, where the fire extinguisher is and whom to call. All of that is unclear now. Can a hospital keep running after a digital crash? Whom do we depend on? How is the digital traffic of ministries and government agencies stored, and where exactly?”

What would be a good first step?

“Require government agencies to map their cyber-dependencies. That helps against digital disruption. And when stimulating AI, also think about your ‘digital identity’ as a country: which typically Dutch topics need attention? Agriculture is a crucial sector for us. But our tractors are essentially iPads on wheels. They collect all kinds of data about the soil they work and the farm where they operate. Before you know it, that information is across the ocean at Big Tech. We need to think about these new digital power relations.”


What do you think of the recent petition from concerned former politicians, scientists and writers asking the government to curb AI? You are not one of the signatories.

“I think it is a good initiative because it calls on the government to take action. But I am a little concerned that it will distract from the broader effects of digitization and potential disruption. First everything was about privacy, now everyone is diving into AI, and we are in danger of forgetting the rest.”

Internet pioneer Marleen Stikker found the petition too negative, overly fearful. Don’t you have to watch out for that too?

“I completely agree with her. There is also a lot of good in AI; healthcare, for example, offers important opportunities. But I don’t see enough level-headedness in the government. Sometimes expectations are enormous, in other areas the government seizes up. That is partly because the court ruled in 2020 that the SyRI anti-fraud system violated human rights. That has led to a culture of fear around this kind of technology. The childcare benefits affair came on top of that. Stop using algorithms! But then we overshoot to the other side. These developments call for level-headedness.”

There is now a State Secretary for Digital Affairs. Does that carry enough weight?

“I’m glad the position exists, but you can of course wonder how much clout a state secretary has when it comes to a technology that touches every policy area. There is also a Minister of Economic Affairs and a Minister of Agriculture. At the WRR we have said that appointing a minister for digital affairs is certainly not our first recommendation. It is more about developing an eye for what needs to be done than about the individuals and what you call their posts.”

What role can science play?

“The bridge between science and policy must become much stronger. That does not have to be formalized in yet another new institution or department; it is mainly a matter of competencies. We don’t speak each other’s language, we don’t know each other. Advising on policy should not be something a scientist does on the side; it is a craft. You have to be an insider and have a firm footing. That is very complicated for a young PhD student, so what I advocate is investing much more in learning those skills: making contacts, getting to know each other’s world and learning to speak the language.”

The Council for Public Administration recently signaled a lack of expertise in the government.

“I would go a step further, let me put it that way. There is not only a lack of expertise but also a lack of sensitivity: the ability to put yourself in the shoes of the other, whether that is a citizen or an implementing organization. And the problem is that the government outsources so much. Companies perform all kinds of government tasks, such as the potentially discriminatory algorithm used by DUO, which Minister Dijkgraaf halted last week. There really should be less of that. But yes, many civil servants spend a large part of their time answering parliamentary questions or Woo requests, and get around to little else. Those are Hague mechanisms that I would like to see changed.”

What do you hope?

“I hope the government opens its eyes.”
