Author Interviews, NYU, PLoS / 15.02.2018
Modeling Intelligence As Ability To Access Multiple Brain States
MedicalResearch.com Interview with:
Glenn N. Saxe, MD
Professor of Child & Adolescent Psychiatry
Hassenfeld Children’s Hospital at NYU Langone
Department of Child and Adolescent Psychiatry
Child Study Center, One Park Avenue
New York, NY 10016
MedicalResearch.com: What is the background for this study? Would you briefly explain what is meant by brain entropy and how it relates to intelligence?
Response: Think of human intelligence as the capacity of a human being to understand their complex and ever-changing world. A person's world is complex and constantly in flux, so the brain must be ready to understand whatever may come, even when there is no way to predict beforehand what that will be. How does the brain understand its world? It creates models of the information it receives through specific patterns of neuronal connection. These are called brain states. The brain understands its world largely by using such models, or brain states, to accurately predict what comes next. So you can see that for an intelligent brain to properly understand and predict events in the world, it will need access to a very, very large number of brain states. And this is precisely what entropy captures.
Entropy is a very old and very powerful concept in the history of science. Not only is it fundamental to thermodynamics, which we learned in high school physics, but it is also fundamental to the nature of information and its processing. Entropy is defined as the number of states, or distinct configurations, that a system has access to at any point in time. High entropy means access to a very large number of states. Low entropy means access to a very small number of states. A solid is a phenomenon with very low entropy. A gas is a phenomenon with very high entropy. Life, and the brain, are somewhere in between.
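For reference, the two textbook formalizations of this idea are Boltzmann's entropy, which counts equally likely microstates, and Shannon's entropy, which weights states by their probabilities. These standard formulas are included here only as a reference point; they are not notation taken from the study itself.

```latex
% Boltzmann entropy: \Omega is the number of microstates the system can access.
S = k_B \ln \Omega

% Shannon entropy: states occur with probabilities p_i; access to more
% (roughly equally likely) states means higher H.
H = -\sum_i p_i \log p_i
```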
Although it is impossible to measure precisely the number of states a brain has access to at any one moment, there is a closely related quantity that can be measured. A system with access to a very large number of possible states (like a gas) has components whose behavior is highly unpredictable. A system with access to very few possible states (like a solid) has components whose behavior is highly predictable. We measured brain entropy through the predictability of the brain's components at the smallest scale we had access to: what are called voxels in an fMRI scan. These are 3 mm cubes of brain tissue in a functional MRI scan; there are many thousands of them in our measurement, and each contains information on the activity of hundreds of thousands of neurons. We measured the predictability of each voxel's activity over time and then found clusters of voxels whose predictability, or entropy, was related to intelligence.
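The interview does not spell out the exact estimator used, but a common way to quantify the predictability of a short physiological time series such as a voxel's fMRI signal is approximate entropy (ApEn): low values indicate a regular, predictable signal, high values an unpredictable one. The sketch below is a minimal illustration of that general technique, not the study's own pipeline; the parameter choices (embedding dimension m and tolerance factor) are illustrative assumptions.

```python
import numpy as np

def approximate_entropy(signal, m=2, r_factor=0.25):
    """Approximate entropy (ApEn) of a 1-D time series.

    Lower values -> more regular, predictable signal (lower entropy);
    higher values -> less predictable signal (higher entropy).
    m is the embedding dimension; r_factor scales the similarity tolerance.
    """
    u = np.asarray(signal, dtype=float)
    N = len(u)
    r = r_factor * u.std()  # tolerance relative to the signal's variability

    def phi(mm):
        # Embed the series into overlapping vectors of length mm.
        n_vec = N - mm + 1
        x = np.array([u[i:i + mm] for i in range(n_vec)])
        # Chebyshev (max-coordinate) distance between every pair of vectors.
        dist = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # Fraction of vectors within tolerance r of each vector (self-matches included).
        C = (dist <= r).mean(axis=1)
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

# Toy usage: a regular sine wave should score lower (more predictable)
# than white noise of the same length.
t = np.linspace(0, 8 * np.pi, 200)
print(approximate_entropy(np.sin(t)))             # relatively low ApEn
print(approximate_entropy(np.random.randn(200)))  # relatively high ApEn
```

Applied voxel by voxel across a resting-state scan, a measure of this kind yields an entropy value per voxel, which can then be related to a behavioral measure such as intelligence.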