To measure the activity of the human brain, researchers mainly rely on scans made from outside the skull, and those methods show far less detail than they would like. Researcher Iris Green measured inside people's brains via implanted electrodes and so gained new insight into how we see.
How do you measure brain activity in living people?
‘Usually you use scanners that measure from outside the skull, such as fMRI or EEG. We measured inside people’s brains with electrodes, and that had never been done with such precision before. Of course you can’t simply do that: it requires opening the skull and exposing the brain. In this case we worked with people who already had electrodes in their brain, namely epilepsy patients. They had been given these implants for clinical reasons, and we were able to use them for our research.’
What did you look at?
‘We studied how the brain works in the part responsible for visual perception: how do we see images and how do we process them? It is usually difficult not only to get inside the brain, but also to measure in that specific brain area. Our brain activity comes from the nerve cells in our brain, the neurons, which communicate with each other by sending signals, or firing. How what we see in the outside world translates into those patterns of firing neurons is actually not yet clear. That is the relationship we have tried to map.’
So how we see things has never been studied before?
‘It certainly has: a lot of research is being done on visual perception, and you can already look inside the brains of laboratory animals, but never before in humans. Based on previous experiments in animals, we created mathematical models that describe how neurons respond to visual input, but we didn’t know whether it works the same way in the human brain. With this research we show that the visual part of the brain has one mechanism that can be captured in a simple mathematical model.’
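To give a concrete sense of what a ‘simple mathematical model’ of a visual neuron can look like, here is a minimal sketch of a generic linear-nonlinear response model, a standard textbook description in visual neuroscience: a weighted sum of the input followed by a nonlinearity that turns it into a firing rate. The filter weights, gain and threshold below are invented for illustration; the interview does not specify the exact model used in the study.

    import numpy as np

    # Illustrative sketch only: a generic linear-nonlinear ("LN") response model,
    # one common way to describe how a visual neuron's firing rate follows a stimulus.
    # All numbers are made up for demonstration; this is not the study's actual model.

    def neuron_response(stimulus, weights, gain=1.0, threshold=0.5):
        """Predict a firing rate from a small patch of visual input."""
        drive = np.dot(weights, stimulus)                  # linear step: weighted sum of pixel intensities
        return gain * np.maximum(drive - threshold, 0.0)   # nonlinear step: no negative firing rates

    # Toy neuron tuned to a bright centre with a darker surround.
    weights = np.array([-0.5, 1.0, 1.0, -0.5])
    bright_centre = np.array([0.1, 0.9, 0.8, 0.2])         # pattern this neuron 'likes'
    uniform_light = np.array([0.6, 0.6, 0.6, 0.6])         # flat, featureless input

    print(neuron_response(bright_centre, weights))         # strong response (1.05)
    print(neuron_response(uniform_light, weights))         # weak response (0.1)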
What can you do with that model?
‘I personally want to see how we can use this knowledge to improve computer algorithms. It could help artificial intelligence (AI) to ‘see’ better. Much of current AI is already inspired by what we know about the brain. These kinds of principles are built into so-called neural networks, which have already made possible, for example, the facial recognition on your phone.
But AI can still learn from how humans do it. So far it can only perform very specific tasks tied to a particular kind of input. If that input changes, the system has to be completely retrained; it cannot yet adapt well enough to changing images. Our mathematical model shows how the human brain manages that.’
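One brain-inspired idea often mentioned in this context is divisive normalization, in which a neuron’s response is scaled by the pooled activity of its neighbours, so the response pattern stays in a usable range when the overall input level changes. The sketch below only illustrates that general principle; the interview does not state which mechanism the study’s model actually uses.

    import numpy as np

    # Divisive normalization, sketched: each response is divided by the summed
    # activity of the population, so responses stay within a similar range even
    # when the input strength (e.g. image brightness) changes dramatically.
    # Constants are invented for demonstration purposes.

    def normalize(responses, sigma=0.1):
        pooled = np.sum(responses)            # total activity of neighbouring neurons
        return responses / (sigma + pooled)   # divisive normalization

    scene_dim = np.array([0.02, 0.06, 0.02])  # a scene viewed in low light
    scene_bright = scene_dim * 50             # the same scene, fifty times brighter

    print(normalize(scene_dim))               # [0.1, 0.3, 0.1]
    print(normalize(scene_bright))            # [0.196..., 0.588..., 0.196...]
    # The relative pattern (the middle element stands out) is preserved, and the
    # outputs change by roughly a factor of two rather than fifty: no retraining needed.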
Can the research also help people see better?
‘My research is not directly concerned with this, but other researchers are working on implants for people who are blind because of a problem with the connection between their eyes and their brain. In those cases the brain itself is fine, but the information from the eye can’t reach it.
In that case, all you have to do is place a device where visual information normally comes in and stimulate the neurons there. A number of patients are already trying out these implants. That is truly pioneering work. Our research may help with that.’