Images of Human Faces Reassembled from Monkey Brain Signals


Precise images of real faces have been recreated simply by monitoring the activity of certain cells in the brains of macaque monkeys as they looked at photographs of people.

A group of researchers published a study this week in which they demonstrated the ability to reconstruct a photo of a human face purely from recordings of the activity of neurons in the brain of a monkey that was viewing the photo. It represents a huge leap in our understanding of how the brain recognises faces and a potential window into recording what the brain is seeing.

The study, published in Cell, does not mean that we’ve cracked open the monkey brain and will soon be viewing images straight out of screwed-up monkey dreams. It does, however, demonstrate a breakthrough in research on a very specific function of the brain. You can see the photo of a human face that was shown to the test monkeys (“actual face”) and the image the algorithm produced from their neural activity (“predicted face”) in the image below:

Professor Rodrigo Quian Quiroga, a neuroscientist at the University of Leicester, tells The Guardian that this is “quite a revolution in neuroscience.” Aside from just being cool as hell, this research inches closer to settling the debate over how the brain recognises faces. For a time, it was believed that the brain relies on so-called “grandmother cells”: individual neurons that store information about individual objects. Later, clusters of neurons referred to as “face patches” were thought to be dedicated to recognising faces; these regions respond most strongly to the sight of human faces. One theory held that individual cells in these regions encode specific facial identities. “This paper completely kills that,” Quiroga says.

Basically, the visual information in a face can be broken down into a set of numeric values and coordinates. The firing of face-patch neurons tracks those values closely enough that, by recording the firing, the researchers can recover the numbers and convert them back into visual information.
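
To make that pipeline concrete, here is a minimal sketch of the general idea, not the authors’ actual code: a face is summarised as a vector of numbers, and each recorded cell’s firing rate is modelled as, roughly, a weighted combination of those numbers. All dimensions, weights and noise levels below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only; the article does not give the study's real numbers.
n_features = 50   # numeric values describing one face
n_cells = 200     # face-patch neurons being recorded

# A face reduced to a vector of numbers (landmark positions, tone, colour, ...).
face_features = rng.normal(size=n_features)

# Hypothetical tuning weights: each cell cares about its own mix of features.
tuning = rng.normal(size=(n_cells, n_features))

# Forward model: each cell's firing rate is (roughly) a weighted sum of the
# face's feature values, plus a baseline and some trial-to-trial noise.
baseline = 5.0
firing_rates = baseline + tuning @ face_features + rng.normal(scale=0.1, size=n_cells)

print(firing_rates[:5])  # what the electrodes would "see" for this face
```

Because a mapping like this is close to linear, it can be inverted; a decoding sketch follows the description of the experiment below.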

The researchers describe the accuracy of the process as almost accidental. First, they mapped out 25 landmark positions on human faces and tested how those numbers could be translated back into an image. These were broad strokes, like the distance between the eyes and the top of the forehead. After comparing the images produced from the numbers with the actual photographs of faces, they settled on the set of measurements that appeared most accurate while still being extremely abstract. Then they chose 25 more measurements that filled in finer details such as skin tone and eye colour.
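
One common way to turn landmark positions and finer appearance details into this kind of compact numeric description is to take the top principal components of the landmark coordinates (shape) and of the image pixels (appearance). The sketch below is an illustration under that assumption, not the authors’ pipeline, and the face data are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in training set: landmark coordinates and pixel values for many face photos.
n_faces = 1000
n_landmarks = 25
landmarks = rng.normal(size=(n_faces, n_landmarks * 2))   # (x, y) per landmark
pixels = rng.normal(size=(n_faces, 64 * 64))              # appearance: image pixels

def top_components(data, k):
    """Top-k principal-component scores of `data`, via mean-centred SVD."""
    centred = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:k].T

shape_features = top_components(landmarks, 25)       # broad strokes: geometry
appearance_features = top_components(pixels, 25)     # finer detail: tone, colour

# Each face is now summarised by 50 numbers.
face_code = np.hstack([shape_features, appearance_features])
print(face_code.shape)  # (1000, 50)
```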

The researchers identified the “face patch” regions in the brains of two male rhesus macaques by finding the areas that responded most strongly to the visual stimulus of faces. Then the monkeys were shown photos of human faces. When the scientists first ran their custom algorithm for decoding the neuron data, they thought that some sort of mistake had occurred: the images it produced almost perfectly mirrored the photographs the subjects had been shown. Again and again, the scientists were able to show a subject a photograph, record its brain activity, decode that information, and output an image that was almost exactly the same as the one the subject had originally been shown.
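
One plausible reading of that decoding step, sketched here under the assumption of a roughly linear relationship between face features and firing rates: fit a least-squares map from recorded firing rates back to the numeric face description using a set of training photos, then apply it to the responses evoked by a new photo. Everything below, data included, is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

n_train, n_cells, n_features = 500, 200, 50

# Synthetic stand-ins: the 50-number codes of the training faces and the firing
# rates they evoke, generated here from a noisy linear model purely for illustration.
train_features = rng.normal(size=(n_train, n_features))
true_tuning = rng.normal(size=(n_features, n_cells))
train_rates = train_features @ true_tuning + rng.normal(scale=0.5, size=(n_train, n_cells))

# Decoder: a least-squares map from firing rates back to face features.
decoder, *_ = np.linalg.lstsq(train_rates, train_features, rcond=None)

# Show a held-out face, record the responses it evokes, and decode them.
test_features = rng.normal(size=n_features)
test_rates = test_features @ true_tuning + rng.normal(scale=0.5, size=n_cells)
predicted_features = test_rates @ decoder

# How well does the decoded description match the face that was actually shown?
corr = np.corrcoef(test_features, predicted_features)[0, 1]
print(f"correlation between actual and predicted face code: {corr:.2f}")
```

In this toy setup the decoded feature vector lines up almost perfectly with the true one, which gives a feel for why the reconstructions described above could look so uncannily accurate.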

Again, this research only covers photos of human faces. We’re still a long way from having mapped all of the objects that would be necessary to see through the eyes of a monkey, or a human for that matter. The scientists believe that this research puts to bed the idea that individual cells are hard-coded with a particular facial identity, and they’re hopeful that it will lead to further research and the mapping of other objects.

Source: Cell, DOI: 10.1016/j.cell.2017.05.011
