
Understanding Face/Place Memory

Apr 30, 2014

Memory is a tricky thing, and apparently trying to understand just how it works is even more so. There are still a great many mysteries about how the brain functions and how we are able to process data within our own heads. How are we able to recall things? How are we able to pick out a single face from a crowd of people? How are we able to distinguish one place from another based on thought and memory alone? These are all questions begging to be answered, and though we still do not possess the answers in full, we have taken another great step forward in our understanding of just how memory works.

A new study by researchers at the Massachusetts Institute of Technology (MIT) has shown how the brain achieves the sort of attention required to pick a face out of a crowd. Despite its apparent simplicity, the task is actually very complicated. It involves our brains retrieving the memory of the face we are looking for and holding it in mind while we scan the crowd, identifying similarities and comparing them to what we have stored in our biological archive. Our brains do this using a part of the prefrontal cortex known as the inferior frontal junction, or IFJ. At present, the scientific community knows much less about this sort of attention, known as object-based attention, than it does about spatial attention, which is what we use to focus on what is happening in a particular location. What the researchers have found is that these two types of attention share many of the same mechanisms and involve related regions of the brain. In both, the prefrontal cortex, the control center for most of our cognitive functions, takes charge of the brain’s attention and directs the relevant parts of the visual cortex, which receives sensory input.

What this new study found is that the IFJ works with a region of the brain that processes faces, known as the fusiform face area or FFA, as well as another region that interprets information about places, known as the parahippocampal place area or PPA, both of which were first identified in humans by Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience at MIT. Using magnetoencephalography, or MEG, the researchers found that when they asked subjects to look for faces – showing them overlapping images of faces and houses so that the brain could not use spatial information to distinguish them – activity in the FFA and IFJ became synchronized, which suggests a level of communication. When subjects were asked to look for houses instead, the IFJ synchronized with the PPA rather than the FFA. The researchers also found that this communication was initiated by the IFJ and lagged by about the time it takes neurons to convey information from one region to another – roughly 20 milliseconds. This has led to the conclusion that the IFJ holds onto the idea of the object the brain is scanning for and directs the other relevant part of the brain – either the FFA or the PPA – to look for it.

A truly remarkable and fascinating discovery.

Image Credit: Thinkstock


About 

Joshua is a freelance writer, aspiring novelist, and avid table-top gamer who has been in love with the hobby ever since it was first introduced to him by a friend in 1996. Currently he acts as the Gamemaster in three separate games and is also a player in a fourth. When he is not busy rolling dice to save the world or destroying the hopes and dreams of his players, he is usually found either with his nose in a book or working on his own. He has degrees in English, Creative Writing, and Economics.