Characterising our emotions is the subject of much debate, as is the identification of their neural substrates. A team from the University of Geneva has been examining the neural components of emotions, confirming that an emotional state arises from the brain’s synchronised response to events.
Emotions are complex phenomena that influence our minds, bodies and behaviour. A number of studies have sought to link given emotions, such as fear or pleasure, to specific areas of the brain, but without success. Some theoretical models suggest instead that emotions emerge through the coordination of multiple mental processes triggered by an event: the brain orchestrates an adapted emotional response by synchronising motivational, expressive and visceral mechanisms. To investigate this hypothesis, a research team from the University of Geneva studied brain activity using functional MRI, analysing the feelings, expressions and physiological responses of volunteers while they played a video game specially developed to arouse different emotions depending on the progress of the game. The results, published in the journal PLOS Biology, show that the different emotional components recruit, in parallel, several neural networks distributed throughout the brain, and that their transient synchronisation generates an emotional state. The somatosensory and motor pathways are two of the areas involved in this synchronisation, supporting the idea that emotion is grounded in action-oriented functions so as to allow an adapted response to events.
Most studies use passive stimulation to understand the emergence of emotions: they typically present volunteers with photos, videos or images evoking fear, anger, joy or sadness while recording the cerebral response using electroencephalography or imaging. The goal is to pinpoint the specific neural networks for each emotion. «The problem is, these regions overlap for different emotions, so they’re not specific», begins Joana Leitão, a post-doctoral fellow in the Department of Fundamental Neurosciences (NEUFO) in UNIGE’s Faculty of Medicine and at the Swiss Centre for Affective Sciences (CISA). «What’s more, it’s likely that, although these images represent emotions well, they don’t evoke them».
A question of perspective
Several neuroscientific theories have attempted to model the emergence of an emotion, although none has so far been proven experimentally. The UNIGE research team subscribes to the postulate that emotions are «subjective»: two individuals faced with the same situation may experience different emotions. «A given event is not assessed in the same way by each person because their perspectives differ», continues Dr Leitão.
In a theoretical model known as the component process model (CPM), devised by Professor Klaus Scherer, the retired founding director of CISA, an event will generate multiple responses in the organism. These relate to components of cognitive assessment (novelty or concordance with a goal or norms), motivation, physiological processes (sweating or heart rate), and expression (smiling or shouting). In a situation that sets off an emotional response, these different components influence each other dynamically. It is their transitory synchronisation that might correspond to an emotional state.
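To make the idea of transient synchronisation concrete, the toy sketch below simulates four CPM-style components in Python. It is purely illustrative and is not the authors' model: the component names, coupling strengths and event timing are all invented. An event perturbs the appraisal component, weak mutual coupling propagates the response to the other components, and their pairwise correlation rises for a short while before the system drifts back to baseline.

```python
# Purely illustrative toy of the CPM idea; not the authors' actual model.
import numpy as np

rng = np.random.default_rng(1)
names = ["appraisal", "motivation", "physiology", "expression"]
steps, coupling, decay, noise = 400, 0.2, 0.05, 0.1
x = np.zeros((steps, len(names)))        # one column per component

for t in range(1, steps):
    event = 1.0 if 100 <= t < 120 else 0.0          # hypothetical emotion-eliciting event
    pull = coupling * (x[t - 1].mean() - x[t - 1])  # components influence each other
    x[t] = x[t - 1] + pull - decay * x[t - 1] + noise * rng.standard_normal(len(names))
    x[t, 0] += event                                # only appraisal sees the event directly

def mean_pairwise_corr(window):
    """Average correlation between all component pairs within a time window."""
    r = np.corrcoef(window.T)
    return r[np.triu_indices(len(names), 1)].mean()

print("synchronisation at baseline :", round(mean_pairwise_corr(x[300:360]), 2))
print("synchronisation around event:", round(mean_pairwise_corr(x[90:160]), 2))
```

In this toy version, the components evolve together only around the event; the CPM's claim is that such a transient episode of coordination is what an emotional state corresponds to.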
Emotional about Pacman
The Geneva neuroscientists devised a video game to evaluate the applicability of this model. «The aim is to evoke emotions that correspond to different forms of evaluation», explains Dr Leitão. «Rather than viewing simple images, participants play a video game that puts them in situations they’ll have to evaluate so they can advance and win rewards». It is an arcade game similar to the famous Pacman: players have to grab coins, touch the «nice monsters», ignore the «neutral monsters» and avoid the «bad guys» to win points and pass to the next level.
The scenario involves situations that trigger the four components of the CPM model differently. At the same time, the researchers were able to measure brain activity via imaging; facial expression by analysing the zygomatic muscles; feelings via questions; and physiology by skin and cardiorespiratory measurements. «All of these components involve different circuits distributed throughout the brain», says the Geneva-based researcher. «By cross-referencing the imaging data with computational modelling, we were able to determine how these components interact over time and at what point they synchronise to generate an emotion».
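The paper's actual analysis pipeline is not detailed here. As a rough illustration only, one common way to ask when several recorded signals «come together» is a sliding-window synchronisation index, sketched below on made-up component time series; the signal names, sampling rate and window length are hypothetical, not taken from the study.

```python
# Illustrative only: made-up signals, not the study's data or method.
import numpy as np

rng = np.random.default_rng(0)
n = 600                                   # hypothetical samples, e.g. ~10 min at 1 Hz
t = np.arange(n)

# Four simulated component recordings sharing one brief "emotional episode".
episode = np.exp(-((t - 300) ** 2) / (2 * 20 ** 2))
signals = {name: episode + 0.5 * rng.standard_normal(n)
           for name in ["brain", "expression", "feeling", "physiology"]}

def sync_index(signals, window=30):
    """Mean pairwise correlation of all signals within each sliding window."""
    data = np.vstack(list(signals.values()))
    pairs = np.triu_indices(data.shape[0], 1)
    out = np.full(n, np.nan)
    for start in range(n - window):
        r = np.corrcoef(data[:, start:start + window])
        out[start + window // 2] = r[pairs].mean()
    return out

sync = sync_index(signals)
print("synchronisation peaks near sample", np.nanargmax(sync), "(episode centred at 300)")
```

The study itself combined fMRI with computational modelling of the CPM, which goes well beyond this toy index; the sketch is only meant to show what «determining at what point the components synchronise» can look like in principle.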
A made-to-measure emotional response
The results also indicate that a region deep in the brain called the basal ganglia is involved in this synchronisation. This structure is known as a convergence point between multiple cortical regions, each equipped with specialised affective, cognitive or sensorimotor processes. The other regions involved include the sensorimotor network, the posterior insula and the prefrontal cortex. «The involvement of the somatosensory and motor zones accords with theories that consider emotion a preparatory mechanism for action, enabling the body to produce an adaptive response to events», concludes Patrik Vuilleumier, full professor at NEUFO and senior author of the study.
December 4, 2020