International Journal of Clinical and Health Psychology
Vol. 23. Issue 4.
(October - December 2023)
Original article
Dimensional emotions are represented by distinct topographical brain networks
Yoonsang Lee1, Yeji Seo1, Youngju Lee, Dongha Lee
Corresponding author: donghalee@kbri.re.kr
Cognitive Science Research Group, Korea Brain Research Institute, 61 Cheomdan-ro, Dong-gu, Daegu 41062, Republic of Korea
Abstract

The ability to recognize others’ facial emotions has become increasingly important since the COVID-19 pandemic, which created stressful situations for emotion regulation. Given the importance of emotion in maintaining a social life, emotion knowledge, the ability to perceive and label the emotions of oneself and others, requires an understanding of affective dimensions such as emotional valence and emotional arousal. However, limited information is available about whether the behavioral representation of affective dimensions resembles their neural representation. To explore the relationship between the brain and behavior in the representational geometries of affective dimensions, we constructed a behavioral paradigm in which emotional faces were categorized into geometric spaces along the valence, arousal, and combined valence and arousal dimensions. We then compared these representations to neural representations of the same faces acquired by functional magnetic resonance imaging. We found that affective dimensions were similarly represented in behavior and the brain. Specifically, behavioral and neural representations of valence were less similar to those of arousal. We also found that valence was represented in the dorsolateral prefrontal cortex, frontal eye fields, precuneus, and early visual cortex, whereas arousal was represented in the cingulate gyrus, middle frontal gyrus, orbitofrontal cortex, fusiform gyrus, and early visual cortex. In conclusion, this study suggests that dimensional emotions are represented similarly in behavior and the brain but with distinct topographical organizations in the brain.

Keywords:
Emotion representation
Valence
Arousal
Functional magnetic resonance imaging
Representational similarity analysis
Introduction

Emotions affect our daily lives, especially how we decide, remember, learn, and perceive the world (Levine & Burgess, 1997). How well we understand our own emotions and those of others is related to our mental health (Gyurak et al., 2011; Lerner et al., 2015; Okon-Singer et al., 2015; Schlegel et al., 2017), and appropriate emotion processing can influence our physiological health because emotional experiences impact our physiological responses (DeSteno et al., 2013). Knowledge of how people recognize others’ facial emotions has become increasingly important since the COVID-19 pandemic, which created stressful situations for emotion regulation (Hamilton et al., 2021; Schelhorn et al., 2022; Zochowska et al., 2022). Emotion knowledge is defined as one's ability to perceive and label the emotions of oneself and others (Izard et al., 2001; Trentacosta & Fine, 2010). Emotion perception based on this knowledge is used to maintain successful social relationships (Keltner & Haidt, 1999; Oatley et al., 2006). To recognize the emotions of others easily and efficiently, it is necessary to appropriately detect and interpret relevant sensory stimuli carrying emotional cues (Damasio, 1999; Peelen et al., 2010). Facial expressions are among the most common and important emotional cues (Ekman, 1976; Little et al., 2011). In addition, a widely accepted theory of emotional states holds that the basic emotions include happiness, anger, sadness, fear, disgust, and surprise (Brooks et al., 2019; Ekman, 1992; Gu et al., 2019; Smith et al., 2005).

Further, according to the dimensional theory of emotion (Hamann, 2012; Russell, 1980, 2003), emotional states can be represented by affective dimensions in a bipolar space, such as valence and arousal (core affects). The valence dimension runs from pleasant (positive) to unpleasant (negative), whereas the arousal dimension runs from excited to calm (Hamann, 2012; Mehrabian & Russell, 1974; Russell, 1980). This dimensional decomposition of emotion has been widely used in emotion studies because of its parsimony and applicability, and it has proven empirically accountable for a wide range of effects (Kensinger & Schacter, 2006; Lang & Bradley, 2010; Lang & Davis, 2006). Although emotion is well represented in our physiological system and can be measured using biomarkers (Egger et al., 2019; Izard et al., 2001), the two dimensions of emotional state may be differentially reflected in peripheral and physiological processes (Anders et al., 2004; Bernat et al., 2006; Codispoti et al., 2006; Cook III et al., 1992; Cuthbert et al., 1996; Vrana et al., 1988). Based on this dissociation of dimensional emotions, several studies have attempted to characterize distinct neural responses to emotional valence and arousal (e.g., Blair et al., 1999; Ethofer et al., 2009; Gerdes et al., 2010; Nielen et al., 2009; Posner et al., 2009). Moreover, a previous study reported that basic emotions do not conflict with the affective dimensions but can be represented by core affects (Gu et al., 2019). For example, happiness represents positive valence and high arousal, whereas sadness represents negative valence and low arousal. Uncovering these dimensions of emotion is crucial, as they are the cornerstone of understanding the emotion perception system.

Multivariate pattern analysis (MVPA) is becoming increasingly popular in functional magnetic resonance imaging (fMRI) studies. Mass-univariate analysis focuses on localizing activation responses to stimuli, whereas MVPA focuses on representational content carried by pattern information (Carlson et al., 2003; Haxby et al., 2014, 2001; Lewis-Peacock & Norman, 2014). Similarity-based MVPA, known as representational similarity analysis (RSA) (Kriegeskorte et al., 2008), enables exploration of the relationship between multi-voxel patterns and stimuli by mapping them onto higher-order representational spaces (Haxby et al., 2014; Kragel & LaBar, 2016; Kriegeskorte & Kievit, 2013; Popal et al., 2019). Furthermore, with RSA, representational geometries can be compared across different sources, such as measurement modalities (e.g., fMRI, electroencephalography, magnetoencephalography, and functional near-infrared spectroscopy), brain regions, behavioral responses, and computational models (Kriegeskorte et al., 2008; Lee & Almeida, 2021; Nili et al., 2014). Combining data obtained from various modalities is necessary in social neuroscience (Popal et al., 2019) and emotion research (Brooks et al., 2019; Chavez & Heatherton, 2015; Chen et al., 2020; Grootswagers et al., 2020; Li et al., 2022; Sievers et al., 2021; Skerry & Saxe, 2015; Ventura-Bort et al., 2022; Wegrzyn et al., 2017). In particular, RSA has been used to predict neural patterns from both dimensional (valence and arousal) and discrete emotions (see review in Kragel & LaBar, 2016; as demonstrated in Saarimaki et al., 2016). However, these studies did not consider whether the dimensional emotions represented in the brain are similar to those expressed in behavior, even though emotion perception occurs in both conscious (behavioral) and unconscious (neural) manners (Gyurak et al., 2011).

Here, we tested this broader issue by comparing the representation of emotion in behavior, brain, and computational models. Specifically, we focused on the similarity of behavioral patterns for emotion perception in affective dimensions. To this end, we constructed a simple but intuitive behavioral paradigm inspired by the construction of representational spaces, wherein participants categorized emotional faces according to each emotional dimension by clustering the face images with the same emotion and physically spacing the clusters in accordance with perceived valence or arousal. Importantly, we included a condition in which the participants categorized emotional face images by applying valence and arousal dimension information simultaneously, reproducing the process of emotion perception more similar to real-life circumstances. This behavioral paradigm differs from those in previous studies on emotion, wherein each dimension was rated separately (e.g., Chavez & Heatherton, 2015; Grootswagers et al., 2020; Li et al., 2022; Skerry & Saxe, 2015; Ventura-Bort et al., 2022).

Although efforts have been made to explore both behavioral and neural correlates of emotional responses (e.g., Dore et al., 2017) or to use neural patterns as predictors of emotion regulation functions (Lee et al., 2008), these studies focused only on certain emotion categories, emotional functions, or brain regions. To better understand how emotions are processed at explicit (conscious) and implicit (unconscious) levels, we considered both behavioral and neural methods for estimating affective emotions, on which psychologists typically depend for dimensional measurements of continuous valence and arousal levels (Toisoul et al., 2021). Therefore, in this study, we aimed to investigate how the dynamics between behavioral and neural correlates across all emotion categories are reflected in the brain as a topographical distribution. To explore how affective dimensions are projected onto the brain, we constructed topographic maps of their distribution by comparing representational geometries derived from behavioral and neural patterns in the 246 regions of the human Brainnetome Atlas (http://atlas.brainnetome.org) (Fan et al., 2016). Finally, using emotional dimension topography, we tested whether emotion knowledge is derived from perceived or semantic information. We predicted that emotion representation in the brain and behavior differs according to affective dimensions. We also predicted that emotion-processing brain regions would rely predominantly on perceived rather than semantic information, i.e., on experiencing an emotion rather than recognizing it through knowledge. These findings suggest that emotion knowledge is represented differently in both the brain and behavior according to affective dimensions.

Materials and methods

Participants

In total, 31 healthy participants (9 males and 22 females, aged 20–35 years, mean = 25.0, standard deviation [SD] = 3.9) participated in this study. All participants were right-handed and had normal or corrected-to-normal vision. None of the participants reported any history of neurological or psychiatric disorders. The current study was approved by the Institutional Review Board of the Korea Brain Research Institute, and all participants provided written informed consent.

Experimental design and procedures

All participants completed two tasks: a behavioral task (affective dimension judgment) on Day 1 and an fMRI task (emotion perception) on Day 2, separated by approximately 1 week (mean = 6.1 days, SD = 2.0). For both tasks, we used a set of seven facial-emotion expressions (happy, sad, angry, surprise, fear, disgust, and neutral) from 4 individuals (28 images in total; 7 emotions × 4 individuals [2 males and 2 females]), randomly selected from the Yonsei Face Database (Chung et al., 2019), which consists of 518 emotional face images of 74 individuals (37 males and 37 females) (see Fig. 1).

Fig. 1.

Overview of the experimental procedures and analysis process. Emotion representations were constructed from behavioral responses, neural activity patterns, and computational visual features. (A) Thirty-one participants performed an affective dimension judgment task on Day 1. Behavioral representational dissimilarity matrices (RDMs) were calculated from the Euclidean distances between all pairs of face-stimulus clusters arranged on the screen. (B) On Day 2, the participants performed an emotion perception task during functional magnetic resonance imaging. Neural RDMs were calculated using the dissimilarity (1 − correlation) of all pairs of neural activity patterns. (C) The computational RDM was calculated from the dissimilarity between visual features extracted from the highest layer of a deep learning model. Behavioral RDMs were correlated with neural RDMs to compare behavioral and neural representations, and the computational RDM was correlated with neural RDMs to compare perceived and semantic representations.

Affective dimension judgment task

On the first visit, participants performed an affective dimension judgment task comprising three sessions. Participants were instructed to arrange facial-emotion images on the computer screen according to the dimensions of emotion. In the first trial, the full set of facial-emotion images (7 emotions × 4 people) was presented around a circle (Fig. 2A). Participants then placed the facial-emotion images on the screen according to the given affective dimension by clicking and dragging each image. From the second trial onward, subsets of the facial-emotion images were presented randomly to estimate an optimal similarity between images. In the valence session, participants placed each facial-emotion image in the left (negative) or right (positive) direction; in the arousal session, at the top (high) or bottom (low) level; and in the valence & arousal session, participants considered both the negative or positive direction and the high or low level. Each session lasted 20 min, and participants completed 26 trials on average (mean = 25.7, SD = 10.5).

Fig. 2.

Experimental design for behavioral and neural representations of dimensions of emotion. (A) Example of an emotion-similarity judgment for valence, arousal, and valence & arousal. (B) Example of an emotion perception functional magnetic resonance imaging task.

Emotion perception task

After approximately 1 week, participants performed an emotion perception task comprising five runs during fMRI. Participants were presented, one after another, with the same emotional face images (7 emotions × 4 people; two males and two females) used in the behavioral task. Each face image was presented for 1 s, separated by 5 s of fixation. In total, 96 volumes per run were obtained, including 12 s of initial fixation and 12 s of closing fixation (Fig. 2B). All emotional face images (1000 × 600 pixels; approximately 10° of visual angle) were presented using Psychtoolbox-3 (http://psychtoolbox.org).

Magnetic resonance imaging data acquisition and processing

All structural and functional MRI data were obtained using a Siemens Magnetom Skyra 3.0T scanner (Erlangen, Germany) with a 20-channel head coil. A high-resolution structural T1-weighted image was acquired using the following parameters: magnetization-prepared rapid gradient echo sequence; acquisition matrix, 256 × 256; field of view, 230 mm; voxel size, 0.9 × 0.9 × 0.9 mm3; repetition time (TR), 2400 ms; and echo time (TE), 2.3 ms. fMRI data were acquired axially using a T2*-weighted single-shot echo-planar imaging (EPI) sequence with the following parameters: generalized auto-calibrating partially parallel acquisitions; acquisition matrix, 64 × 64; field of view, 192 mm; 36 interleaved slices; voxel size, 3.0 × 3.0 × 3.0 mm3; TR, 2000 ms; TE, 28 ms; flip angle, 90°; and gap, 0.9 mm.

Standard preprocessing of fMRI data was conducted using statistical parametric mapping (SPM12, http://www.fil.ion.ucl.ac.uk/spm/, Wellcome Trust Centre for Neuroimaging, London, UK) (Friston et al., 1995) and an in-house MATLAB toolbox (MathWorks, Inc.). All fMRI data (EPIs) underwent slice-timing correction, head-motion correction (realignment), co-registration of T1-weighted images to the mean EPI, and spatial normalization into the Montreal Neurological Institute template space using nonlinear transformation in SPM12. The normalized EPIs were interpolated to 2.0 × 2.0 × 2.0 mm3 voxels, and spatial smoothing was not applied, to prevent spill-over effects between voxels (Todd et al., 2013).

Behavioral representation of emotion perception

To construct the geometrical models of behavioral, neural, and computational representations of emotions, we performed RSA (Kriegeskorte et al., 2008).

For behavioral representations, we used a multi-arrangement method (Kriegeskorte & Mur, 2012; Mur et al., 2013) in MATLAB (MathWorks, Inc.). The participants completed an emotion-similarity judgment task composed of 28 emotional facial expression images. The images were arranged according to the dimensions of emotion (i.e., valence, arousal, and valence & arousal) on the two-dimensional (2D) screen. The distance between the images was calculated using the mean of the Euclidean distances between each pair of emotional face clusters in each trial. The estimated dissimilarities were used to construct the representational dissimilarity matrices (RDMs) for behavioral emotion representation. The group-averaged behavioral RDM was constructed by averaging the behavioral RDMs of all participants.
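As a rough sketch of this step (in Python with NumPy, although the original analysis used MATLAB; the data layout and function name here are ours, not the authors'), the trial-averaged Euclidean-distance RDM from the on-screen arrangements could be computed as follows:

```python
import numpy as np

def behavioral_rdm(arrangements):
    """Average pairwise Euclidean distances over arrangement trials.

    arrangements: list of (n_items, 2) arrays of on-screen (x, y)
    coordinates, one array per trial; NaN rows mark items absent from
    a trial (only subsets were shown after the first trial).
    Returns an (n_items, n_items) representational dissimilarity matrix.
    """
    n = arrangements[0].shape[0]
    dist_sum = np.zeros((n, n))
    count = np.zeros((n, n))
    for xy in arrangements:
        present = ~np.isnan(xy).any(axis=1)
        # All pairwise Euclidean distances for this trial.
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        mask = np.outer(present, present)      # pairs shown together
        dist_sum += np.where(mask, np.nan_to_num(d), 0.0)
        count += mask
    # Mean distance over the trials in which both items appeared.
    rdm = np.divide(dist_sum, count, out=np.zeros((n, n)), where=count > 0)
    np.fill_diagonal(rdm, 0.0)
    return rdm
```

Averaging the behavioral RDMs of all participants with `np.mean` over a stacked array would then yield the group-averaged RDM.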

Neural representation for emotion perception

For neural representations, we concatenated the five runs and performed a general linear model (GLM) analysis, which explains a response variable with a linear combination of categorical covariates. In the GLM, each of the 28 conditions (one per face stimulus) was modeled separately. The GLM takes the form y = Xβ + e, where y is the fMRI signal, X is the design matrix, β is the vector of coefficients, and e is noise. The β estimates were converted into t-values, which were used to construct neural RDMs in each of the 246 brain regions by calculating the correlation distance (i.e., 1 − Pearson's r) between all pairs of the 28 conditions. The group-averaged neural RDM was constructed by averaging the neural RDMs of all participants.
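In outline (a NumPy sketch with illustrative variable names, not the authors' code), the correlation-distance RDM for one region is:

```python
import numpy as np

def neural_rdm(patterns):
    """Correlation-distance RDM from condition-wise activity patterns.

    patterns: (n_conditions, n_voxels) array, e.g. 28 t-value patterns
    over the voxels of one parcel. Returns an
    (n_conditions, n_conditions) matrix of 1 - Pearson's r between all
    pairs of patterns.
    """
    return 1.0 - np.corrcoef(patterns)
```

Identical patterns thus get distance 0, uncorrelated patterns distance 1, and anticorrelated patterns distance 2.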

Neural encoding of dimensions of emotion derived from behavioral judgment

Following a previous study (Parkinson et al., 2017), we used GLM decomposition to encode the dimensions of emotion in the brain. Each neural RDM can be modeled as a linear combination of behavioral RDMs. To remove from each behavioral RDM the variance accounted for by the remaining RDMs, we modeled each RDM with the other RDMs as predictor variables in an ordinary least squares regression and took the residuals. This calculation was repeated for all 246 regions.
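A minimal sketch of this residualization and encoding step (NumPy; the RDMs are assumed to be vectorized lower triangles, and the helper names are ours, not the authors'):

```python
import numpy as np

def residualize(target, others):
    """Remove variance explained by the other RDMs from a target RDM.

    target: 1-D array (vectorized RDM); others: list of 1-D arrays.
    Returns the OLS residuals of target regressed on the others.
    """
    X = np.column_stack([np.ones_like(target)] + list(others))
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta

def encode(neural_vec, predictors):
    """OLS betas of a (regional) neural RDM on behavioral RDMs.

    Returns one beta per predictor (intercept dropped); repeating this
    per region yields the topographic maps.
    """
    X = np.column_stack([np.ones_like(neural_vec)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, neural_vec, rcond=None)
    return beta[1:]
```

By construction the residuals are orthogonal to the competing RDMs, so each beta reflects variance unique to its dimension.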

Emotion representation of semantic visual information

To extract semantic visual features from the emotional face images, we conducted transfer learning using ResNet152, a deep residual network with 152 layers (He et al., 2016). The 518 emotional face images (7 emotions × 74 individuals) were divided into a training dataset (N = 511) and a testing dataset (N = 7). Because ResNet152 was pretrained on over a million ImageNet images to classify 1000 object categories, we replaced the last fully connected (FC) layer with a new FC layer with seven output nodes and applied a soft-max function to classify the seven emotion categories. Classification accuracy was calculated on the testing dataset using 74-fold cross-validation; the mean classification accuracy was 0.9575 (SD = 0.0876). Using this trained model, the visual features of each face image were extracted from the last FC layer. All transfer learning was accelerated on graphics processing units (NVIDIA GeForce RTX 3090) using Python 3.9.7, PyTorch 1.11.0, and CUDA 11.5.

Statistical evaluation of representational geometries

We performed a correlation analysis to examine the relationship between representations of dimensional emotions in the brain and behavior. For statistical inference, a permutation test was conducted by first calculating the actual correlation between two RDMs (e.g., neural RDM vs. behavioral RDM, or neural RDM vs. computational RDM). The labels of one RDM were then randomized and the correlation recalculated; this process was repeated 10,000 times to generate a null distribution of correlation coefficients. If the actual correlation coefficient fell within the top 5% of the null distribution, we rejected the null hypothesis of no correspondence between the two RDMs. To test which representational geometries or topographies of affective dimensions are more similar to each other, we compared the dependent correlations using the Meng–Rosenthal–Rubin method (Meng et al., 1992).
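The label-permutation scheme can be sketched as follows (NumPy; function and variable names are illustrative, and a smaller permutation count can be passed for speed, whereas the study used 10,000):

```python
import numpy as np

def rdm_permutation_test(rdm_a, rdm_b, n_perm=10000, seed=0):
    """Permutation p-value for the correlation between two RDMs.

    Condition labels of one RDM are shuffled (rows and columns
    together, preserving symmetry) to build the null distribution.
    Returns (actual correlation, p-value).
    """
    rng = np.random.default_rng(seed)
    n = rdm_a.shape[0]
    iu = np.triu_indices(n, k=1)                  # unique off-diagonal pairs
    actual = np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(n)                     # shuffle condition labels
        null[i] = np.corrcoef(rdm_a[p][:, p][iu], rdm_b[iu])[0, 1]
    # One-sided p-value with the +1 correction for the observed statistic.
    pval = (np.sum(null >= actual) + 1) / (n_perm + 1)
    return actual, pval
```

Only the upper triangle enters the correlation, since an RDM is symmetric with a zero diagonal.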

Results

Representational geometry of affective dimensions in behavior

Fig. 3 shows the behavioral representational geometries and multidimensional scaling plots of the affective dimensions. In the valence RDM, happy was dissimilar to angry, sad, disgust, fear, and neutral, but similar to surprise (Fig. 3A). In the arousal RDM, angry was dissimilar to neutral, happy, and surprise, but similar to disgust, fear, and sad. The valence & arousal RDM reproduced the patterns of both the valence and arousal RDMs. Emotions were distributed differently in 2D space according to affective dimension (Fig. 3B). The valence RDM was highly similar to the valence & arousal RDM, as was the arousal RDM (Fig. 3C); however, the valence and arousal RDMs were less similar to each other. The correlation distance between valence and arousal was significantly higher than that between valence and valence & arousal (Meng's Z = 12.99, p < 0.0001) and between arousal and valence & arousal (Z = 12.72, p < 0.0001). However, there was no significant difference in correlation distance between valence–valence & arousal and arousal–valence & arousal (Z = 0.39, p = 0.6961).

Fig. 3.

Representational geometry of affective dimensions derived from behavioral judgment. (A) Representational dissimilarity matrices for valence, arousal, and valence & arousal. (B) Multidimensional scaling plots. (C) Statistical comparison of behavioral representational dissimilarity matrices for valence, arousal, and valence and arousal. ⁎⁎⁎p < 0.0001. n.s. non-significant.

Topography of affective dimensions in the brain

Fig. 4 shows the whole-brain topography of the affective dimensions. Topographic distributions of the behavioral dimensions of emotion were calculated by encoding the behavioral RDMs into each of the local neural RDMs using GLM decomposition (Fig. 4A). Behavioral representations of emotion were distributed differently across the whole brain according to affective dimension (Fig. 4B). Valence topography was highly similar to valence & arousal topography, as was arousal topography (Fig. 4C); however, the valence and arousal topographies were less similar to each other. The correlation distance between the valence and arousal topographies was significantly higher than that between the valence and valence & arousal topographies (Z = 10.87, p < 0.0001) and between the arousal and valence & arousal topographies (Z = 12.35, p < 0.0001). The difference in correlation distance between valence–valence & arousal and arousal–valence & arousal was much smaller (Z = 2.46, p = 0.0141).

Fig. 4.

Topography of affective dimensions derived from neural encoding. (A) General linear model decomposition of affective dimensions. (B) Whole-brain neural encoding of affective dimensions. (C) Statistical comparison of topographies for valence, arousal, and valence & arousal. ⁎⁎⁎p < 0.0001, ⁎p < 0.05.

Differential distributions of perceived emotion information in the brain

Fig. 5 presents the brain regions with significant correlations between neural and behavioral RDMs in each affective dimension. Among the 246 regions of interest, regions were selected at a false discovery rate threshold of < 0.05 and the top 5% of beta values (Fig. 5).

Fig. 5.

Differential brain representation of perceived emotion information according to emotional valence, arousal, and valence & arousal. Significant brain nodes are displayed (false discovery rate < 0.05).


Perceived similarity for the valence dimension was significantly distributed in regions including the left dorsolateral prefrontal cortex (A9/46d, A9m), precuneus (dmPOS), right frontal eye fields (A8m), supramarginal gyrus (A40c), fusiform gyrus (A37mv), anterior superior temporal sulcus (aSTS), early visual cortex (OPC, rLinG, rCunG), and angular gyrus (A39rv, A39c) bilaterally.

Significantly higher correlations of emotion similarity for arousal were distributed in the left dorsolateral prefrontal cortex (A9/46d), precentral gyrus (A6cdl), cingulate area (A24cd), right middle frontal gyrus (IFJ), orbitofrontal cortex (A12/47o), primary somatosensory cortex (A2), angular gyrus (A39rv), right fusiform gyrus (A37lv), aSTS, early visual cortex (rCunG), and bilateral supramarginal gyrus (A40c).

For the combined valence & arousal dimension, significantly higher correlations of emotion similarity were distributed in the left dorsolateral prefrontal cortex (A9/46d), orbitofrontal cortex (A14m), frontal eye field (A8vl), precentral gyrus (A6cdl), posterior superior temporal sulcus (cpSTS), right supramarginal gyrus (A40c), primary somatosensory cortex (A2), fusiform gyrus (A37mv), aSTS, and angular gyrus (A39rv, A39c) bilaterally. Detailed information on the perceived similarity of the affective dimensions is summarized for valence (Table 1), arousal (Table 2), and valence & arousal (Table 3).

Table 1.

Brain regions showing high behavioral representations of emotion on valence topography (a threshold of the top 5% of correlation coefficients, total 12 regions).

Lobe  ROIs  ROIs (subdivision)  Anatomical and modified cytoarchitectonic descriptions from the Brainnetome Atlas  Representational similarity - Behavior (Pearson r)  Representational similarity - DNN (Pearson r)  MNI (X, Y, Z) 
Frontal lobe  Middle frontal gyrus  Dorsolateral prefrontal cortex  L.A9/46d (dorsal area 9/46)  0.1753  0.1034  −41, 41, 16 
  Superior frontal gyrus  Dorsolateral prefrontal cortex  L.A9m (medial area 9)  0.1718  0.0193  −5, 36, 38 
  Superior frontal gyrus  Frontal eye field  R.A8m (medial area 8)  0.1590  0.4304  7, 16, 54 
Occipital lobe  Lateral occipital cortex  Early visual cortex  L.OPC (occipital polar cortex)  0.1553  0.0717  −18, −99, 2 
  MedioVentral occipital cortex  Early visual cortex  R.rLinG (rostral lingual gyrus)  0.1341  0.0550  −17, −60, −6 
  MedioVentral occipital cortex  Early visual cortex  L.rCunG (rostral cuneus gyrus)  0.1455  0.1524  −5, −81, 10 
Parietal lobe  Inferior parietal lobule  Angular gyrus  R.A39rv (rostroventral area 39(PGa))  0.2887  0.2006  53, −54, 25 
  Inferior parietal lobule  Angular gyrus  L.A39c (caudal area 39, PGp)  0.1818  0.2087  −34, −80, 29 
  Inferior parietal lobule  Supramarginal gyrus  R.A40c (caudal area 40, PFm)  0.1540  0.0245  57, −44, 38 
  Precuneus  Precuneus  L.dmPOS (dorsomedial parietooccipital sulcus, PEr)  0.1458  0.092  −12, −67, 25 
Temporal lobe  Fusiform gyrus  Fusiform gyrus  R.A37mv (medioventral area 37)  0.1292  0.3125  31, −62, −14 
  Middle temporal gyrus  Anterior STS  R.aSTS (anterior superior temporal sulcus)  0.2149  0.1493  58, −16, −10 
Table 2.

Brain regions showing high behavioral representations of emotion on arousal topography (a threshold of the top 5% of correlation coefficients, total 12 regions).

Lobe  ROIs  ROIs (subdivision)  Anatomical and modified cytoarchitectonic descriptions from the Brainnetome Atlas  Representational similarity - Behavior (Pearson r)  Representational similarity - DNN (Pearson r)  MNI (X, Y, Z) 
Frontal lobe  Middle frontal gyrus  Premotor cortex  R.IFJ (inferior frontal junction)  0.1386  0.0684  42, 11, 39 
  Middle frontal gyrus  Dorsolateral prefrontal cortex  L.A9/46d (dorsal area 9/46)  0.1027  0.1034  −27, 43, 31 
  Orbital gyrus  Orbitofrontal cortex  R.A12/47o (orbital area 12/47)  0.2038  −0.0785  40, 39, −14 
  Precentral gyrus  Premotor cortex  L.A6cdl (caudal dorsolateral area 6)  0.1618  0.1399  −32, −9, 58 
Limbic lobe  Cingulate gyrus  Caudodorsal cingulate cortex  L.A24cd (caudodorsal area 24)  0.261  −0.1022  −5, 7, 37 
Occipital lobe  MedioVentral occipital cortex  Early visual cortex  R.rCunG (rostral cuneus gyrus)  0.1291  0.0871  7, −76, 11 
Parietal lobe  Inferior parietal lobule  Supramarginal gyrus  L.A40c (caudal area 40, PFm)  0.2027  −0.0139  −56, −49, 38 
  Inferior parietal lobule  Supramarginal gyrus  R.A40c (caudal area 40, PFm)  0.1162  0.0245  57, −44, 38 
  Inferior parietal lobule  Angular gyrus  R.A39rv (rostroventral area 39(PGa))  0.1072  0.2006  53, −54, 25 
  Postcentral gyrus  Primary somatosensory cortex  R.A2 (area 2)  0.1469  0.2646  48, −24, 48 
Temporal lobe  Fusiform gyrus  Fusiform gyrus  R.A37lv (lateroventral area 37)  0.1091  0.1998  43, −49, −19 
  Middle temporal gyrus  Anterior STS  R.aSTS (anterior superior temporal sulcus)  0.1746  0.1493  58, −16, −10 
Table 3.

Brain regions showing high behavioral representations of emotion on valence & arousal topography (a threshold of the top 5% of correlation coefficients, total 12 regions).

Lobe  ROIs  ROIs (subdivision)  Anatomical and modified cytoarchitectonic descriptions from the Brainnetome Atlas  Representational similarity - Behavior (Pearson r)  Representational similarity - DNN (Pearson r)  MNI (X, Y, Z) 
Frontal lobe  Middle frontal gyrus  Dorsolateral prefrontal cortex  L.A9/46d (dorsal area 9/46)  0.1199  0.1034  −41, 41, 16 
  Middle frontal gyrus  Frontal eye field  L.A8vl (ventrolateral area 8)  0.1123  0.2070  −33, 23, 45 
  Orbital gyrus  Orbitofrontal cortex  L.A14m (medial area 14)  0.2098  0.0833  −7, 54, −7 
  Precentral gyrus  Premotor cortex  L.A6cdl (caudal dorsolateral area 6)  0.2067  0.1399  −32, −9, 58 
Parietal lobe  Inferior parietal lobule  Angular gyrus  R.A39rv (rostroventral area 39(PGa))  0.2356  0.2006  53, −54, 25 
  Inferior parietal lobule  Angular gyrus  L.A39c (caudal area 39, PGp)  0.1751  0.2087  −34, −80, 29 
  Inferior parietal lobule  Angular gyrus  R.A39c (caudal area 39, PGp)  0.1265  0.1215  45, −71, 20 
  Inferior parietal lobule  Supramarginal gyrus  R.A40c (caudal area 40, PFm)  0.1424  0.0245  57, −44, 38 
  Postcentral gyrus  Primary somatosensory cortex  R.A2 (area 2)  0.1615  0.2646  48, −24, 48 
Temporal lobe  Fusiform gyrus  Fusiform gyrus  R.A37mv (medioventral area 37)  0.1046  0.3125  31, −62, −14 
  Middle temporal gyrus  Anterior STS  R.aSTS (anterior superior temporal sulcus)  0.2155  0.1493  58, −16, −10 
  Posterior superior temporal sulcus  Posterior STS  L.cpSTS (caudoposterior superior temporal sulcus)  0.1957  −0.0266  −52, −50, 11 
Predominance of perceived emotion information over semantic visual information

We further tested whether emotion representation in the above-mentioned brain regions is driven by perceived emotion or by semantic visual information of the stimuli. In the valence dimension, perceived emotion similarity was higher than semantic visual similarity in the left dorsolateral prefrontal cortex (A9/46d, A9m), precuneus (dmPOS), right angular gyrus (A39rv), supramarginal gyrus (A40c), and aSTS (Fig. 6). In contrast, semantic visual similarity was higher than perceived emotion similarity in the left angular gyrus (A39c), right frontal eye fields (A8m), fusiform gyrus (A37mv), and early visual cortex (OPC, rLinG, rCunG) bilaterally.

Fig. 6.

Comparison of perceived emotion and semantic visual similarities in the brain representation of affective dimensions.


For the arousal dimension, a higher perceived similarity than semantic similarity was observed in the left precentral gyrus (A6cdl), cingulate area (A24cd), right orbitofrontal cortex (A12/47o), premotor cortex (IFJ), early visual cortex (rCunG), aSTS, and bilateral supramarginal gyrus (A40c). A higher semantic similarity than perceived similarity was observed in the left dorsolateral prefrontal cortex (A9/46d), right primary somatosensory cortex (A2), angular gyrus (A39rv), and fusiform gyrus (A37lv).

For the valence and arousal dimensions, a higher perceived similarity than semantic similarity was found in the left dorsolateral prefrontal cortex (A9/46d), orbitofrontal cortex (A14m), precentral gyrus (A6cdl), cpSTS, right angular gyrus (A39rv, A39c), supramarginal gyrus (A40c), and aSTS. Semantic visual similarity was higher in the left frontal eye field (A8vl), angular gyrus (A39c), right primary somatosensory cortex (A2), and fusiform gyrus (A37mv) than perceived emotion similarity.

Topographical organization of emotion knowledge

Fig. 7 summarizes the topographical organization of emotion knowledge. Emotion representations were situated differently in the brain according to affective dimensions. Valence-specific regions included the dorsolateral prefrontal cortex (A9m), frontal eye fields (A8m), precuneus (dmPOS), and early visual cortex (rCunG, rLinG, OPC). In contrast, arousal-specific regions included the cingulate gyrus (A24cd), middle frontal gyrus (IFJ), orbitofrontal cortex (A12/47o), fusiform gyrus (A37lv), and early visual cortex (rCunG). The valence & arousal regions were the frontal eye field (A8vl), orbitofrontal cortex (A14m), and cpSTS. Notably, the dorsolateral prefrontal cortex (A9/46d), aSTS, angular gyrus (A39rv), and supramarginal gyrus (A40c) were common to the valence, arousal, and valence & arousal dimensions.

Fig. 7.

Summary of shared representation of dimensional emotions on the brain.

Discussion

Knowledge of emotion is reflected in judgment and thought. Understanding how emotion knowledge is transferred between behavior and the brain is critical for harmonious social interaction. This study aimed to investigate the representational geometry of affective dimensions in behavior and their topography in the brain. Using behavioral patterns of emotion judgment and topographic mapping of the brain, we showed that affective dimensions were similarly represented in behavior and the brain. Although we could not directly compare the affective dimensions of behavior and the brain, our findings demonstrated similar directions in the dimensional emotions of both. Remarkably, the dynamics between affective dimensions that contributed to neural patterns closely reflected those between the behavioral dimensions of emotion. For example, the behavioral representational geometries and brain topographies of valence, arousal, and valence & arousal were significantly similar (Figs. 3 and 4). However, when the correlations of topographies were compared, we found that valence–arousal was significantly dissimilar to both valence–valence & arousal and arousal–valence & arousal, whereas valence–valence & arousal was similar (behavior) or less similar (brain) to arousal–valence & arousal. These findings suggest that behavioral decisions on emotion perception along each dimension of emotion are reflected in the brain through these dynamics. Emerging evidence on the topographic organization of dimensional emotion enables the study of emotion knowledge transfer from an emotional response in the brain to a change in behavior.
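
As a concrete illustration of how the similarity between a behavioral representational geometry and a brain topography can be quantified, the sketch below builds representational dissimilarity matrices (RDMs) from simulated behavioral stimulus placements and simulated voxel patterns, then correlates them (cf. Kriegeskorte et al., 2008; Nili et al., 2014). All data, sizes, and distance metrics here are invented for illustration; this is not the authors' analysis pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Simulated data (hypothetical): rows = emotional face stimuli.
rng = np.random.default_rng(0)
n_stimuli = 24
behavior = rng.standard_normal((n_stimuli, 2))   # e.g., 2D valence/arousal placements
neural = rng.standard_normal((n_stimuli, 100))   # e.g., voxel patterns of one ROI

# Representational dissimilarity matrices (condensed upper triangles).
behav_rdm = pdist(behavior, metric="euclidean")     # behavioral geometry
neural_rdm = pdist(neural, metric="correlation")    # neural geometry

# Spearman correlation between the two geometries: a higher rho means
# the region orders stimulus pairs similarly to the behavioral dimension.
rho, p = spearmanr(behav_rdm, neural_rdm)
print(f"behavior-brain representational similarity: rho = {rho:.3f}")
```

With real data, this correlation would be computed per region and per affective dimension, yielding the topographies compared above.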

We further reported that the valence and arousal dimensions represented perceived emotions differently within the brain. We investigated which brain regions best reflected each behavioral emotional dimension and found different topographical distributions of these regions. First, we observed brain regions showing similar emotional neural representations for all behavioral dimensions of emotion (valence, arousal, and valence & arousal) (Fig. 5). For example, the fusiform gyrus (A37mv, A37lv) is involved in the processing of emotional facial expressions (Dricu & Fruhholz, 2020; Harry et al., 2013; Petro et al., 2013; Said et al., 2010), and the right aSTS plays an important role in emotion perception when processing facial expressions (Narumoto et al., 2001). Intuitively, regions with more general executive functions involved in emotion regulation, such as the angular gyrus (A39c, A39rv), also shared similar emotion representations with all three behavioral dimensions (Kim et al., 2015). In particular, the dorsolateral prefrontal cortex (A9/46d, A9m) and supramarginal gyrus (A40c) have been suggested to support emotion regulation through attention deployment (Morawetz et al., 2016). Our findings suggest that common neural representations exist across emotional dimensions.

More importantly, emotional dimension-specific brain regions were observed. For instance, the left dorsomedial parietooccipital sulcus (dmPOS) and right frontal eye field (A8m) shared similar emotion representation with the valence dimension. The frontal eye field modulates visual attention (Corbetta, 1998; Munoz & Everling, 2004; Schafer & Moore, 2007), and, like the dorsolateral prefrontal cortex and supramarginal gyrus, it (A8vl) also showed similar emotion representation with the valence & arousal dimension. Interestingly, the left medial parietooccipital sulcus, which includes the dmPOS that shares similar emotion representation with the valence dimension only, is related to self-projection (Buckner & Carroll, 2007; Chrastil, 2018; Spreng et al., 2009). In other words, a region more specifically involved in social functioning is selectively correlated with the valence dimension in terms of emotion representation. Similarly, the left cingulate gyrus (A24cd) showed similar emotion representation only for the arousal dimension, in line with previous findings on the assessment of internal emotional states (Grabenhorst et al., 2008; Rolls, 2015). The orbitofrontal regions showed high emotion representation for the arousal dimension (A12/47o) and the valence & arousal dimension (A14m); both subregions are engaged in emotion regulation mainly through reward evaluation and learning (Elliott et al., 2000; Gourley et al., 2016; Hooker & Knight, 2006; Noonan et al., 2010; Plassmann et al., 2010). Other regions highly correlated with the emotion representation of both the arousal and valence & arousal dimensions are the premotor cortex (A6cdl, IFJ) and primary somatosensory cortex (A2), which contribute to emotion perception through the mirror neuron system (Gallaher, 2001) and body representation (Critchley et al., 2004; Gallo et al., 2018), respectively. Overall, this discriminative distribution of regions indicates that emotion is represented in the brain with a topography unique to each emotional dimension.

The early visual cortex (rLinG, rCunG, OPC) shared similar emotion representation with both the valence and arousal dimensions, but not with the valence & arousal dimension. In contrast, the cpSTS shared similar emotion representation with the valence & arousal dimension only. These findings show that the neural representation of perceived emotion for the sum of the individual dimensions of emotion is not equal to that for the combined dimensions. Thus, we believe that our study provides evidence that behavioral emotion representation can be predicted from neural emotion representation derived from the different topographical distributions of regions relevant to the different dimensions of emotion and their combinations.

To further test whether the relationship between behavioral and neural emotion representations is derived from perceived or semantic information, we compared perceived emotion and semantic visual similarities in the brain representation of affective dimensions. Perceived information was calculated from the similarity between neural and behavioral representations, whereas semantic information was extracted from the visual features of the facial stimuli in the last layer of a deep neural network (DNN) (Dharmaretnam & Fyshe, 2018; Lee & Almeida, 2021). This comparison rests on the assumption that emotion-processing regions preferentially represent perceived emotional information rather than semantic information, compared with other regions that might “know” rather than “feel” emotional cues. In our results, the emotion similarity representation of regions with more cognitive functions, such as the dorsolateral prefrontal cortex (A9/46d, A9m), supramarginal gyrus (A40c), dmPOS, aSTS, and cpSTS, tended to correlate more strongly with the behavioral dimensions than with the DNN. In contrast, regions more relevant to the visual processing of the stimuli, such as the frontal eye field (A8m, A8vl), early visual cortex (rLinG, rCunG, OPC), and fusiform gyrus (A37mv, A37lv), tended to correlate more strongly with the DNN than with the behavioral dimensions. This can be interpreted as indicating that regions with a more regulative role in emotion processing represent emotion from the emotional information perceived from the stimuli, whereas areas more relevant to visual perception rely on high-level visual information of the stimuli.
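
Comparing two such correlations that share the same neural RDM (one with the behavioral RDM, one with the DNN RDM) is a comparison of correlated correlation coefficients; one standard choice, cited in the reference list, is the z-test of Meng, Rosenthal and Rubin (1992). The sketch below is a hypothetical illustration of that test, not the authors' actual analysis code, and the correlation values plugged in at the end are invented.

```python
import numpy as np
from scipy.stats import norm

def meng_z_test(r1, r2, rx, n):
    """Meng, Rosenthal & Rubin (1992) test for two correlated correlations.

    r1, r2: correlations of two predictors (e.g., behavioral RDM and DNN RDM)
            with a common criterion (e.g., a region's neural RDM).
    rx:     correlation between the two predictors.
    n:      number of observations (here, stimulus pairs)."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)        # Fisher z-transform
    r2bar = (r1 ** 2 + r2 ** 2) / 2.0              # mean squared correlation
    f = min((1.0 - rx) / (2.0 * (1.0 - r2bar)), 1.0)
    h = (1.0 - f * r2bar) / (1.0 - r2bar)
    z = (z1 - z2) * np.sqrt((n - 3) / (2.0 * (1.0 - rx) * h))
    p = 2.0 * norm.sf(abs(z))                      # two-tailed p value
    return z, p

# Hypothetical numbers: a region whose RDM correlates 0.31 with the
# perceived (behavioral) RDM and 0.12 with the semantic (DNN) RDM,
# with the two predictor RDMs correlating 0.25 over 276 stimulus pairs.
z, p = meng_z_test(0.31, 0.12, 0.25, 276)
```

A significantly positive z would indicate that the region tracks perceived emotional information over and above the DNN's semantic visual features.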

One notable limitation of our study is the range of emotions employed for the affective dimensions. Although this study focused on the representations of affective dimensions, the valence dimension was sampled unevenly, with four negative-valence emotions (sadness, anger, fear, and disgust) but only two positive-valence emotions (happiness and surprise). This imbalance should be considered in future studies of affective dimensions.

In conclusion, we found that affective dimensions were similarly represented in both the brain and behavior: behavioral response patterns to emotions along each dimension were reflected in the brain with a similar representational geometry. In addition, valence and arousal were represented in distinct brain networks.

Funding

This research was supported by the KBRI basic research program through the Korea Brain Research Institute, funded by the Ministry of Science and ICT (23-BR-05-01, 23-BR-05-02).

CRediT authorship contribution statement

Yoonsang Lee: Investigation, Formal analysis, Data curation, Visualization, Writing – original draft, Writing – review & editing. Yeji Seo: Investigation, Data curation, Formal analysis, Writing – original draft, Writing – review & editing. Youngju Lee: Writing – review & editing. Dongha Lee: Conceptualization, Formal analysis, Visualization, Funding acquisition, Supervision, Methodology, Writing – original draft, Writing – review & editing.

References
[Anders et al., 2004]
S. Anders, M. Lotze, M. Erb, W. Grodd, N. Birbaumer.
Brain activity underlying emotional valence and arousal: A response-related fMRI study.
Human Brain Mapping, 23 (2004), pp. 200-209
[Bernat et al., 2006]
E. Bernat, C.J. Patrick, S.D. Benning, A. Tellegen.
Effects of picture content and intensity on affective physiological response.
Psychophysiology, 43 (2006), pp. 93-103
[Blair et al., 1999]
R.J.R. Blair, J.S. Morris, C.D. Frith, D.I. Perrett, R.J. Dolan.
Dissociable neural responses to facial expressions of sadness and anger.
Brain : A Journal of Neurology, 122 (1999), pp. 883-893
[Brooks et al., 2019]
J.A. Brooks, J. Chikazoe, N. Sadato, J.B. Freeman.
The neural representation of facial-emotion categories reflects conceptual structure.
Proceedings of the National Academy of Sciences of the United States of America, 116 (2019), pp. 15861-15870
[Buckner and Carroll, 2007]
R.L. Buckner, D.C. Carroll.
Self-projection and the brain.
Trends in Cognitive Sciences, 11 (2007), pp. 49-57
[Carlson et al., 2003]
T.A. Carlson, P. Schrater, S. He.
Patterns of activity in the categorical representations of objects.
Journal of Cognitive Neuroscience, 15 (2003), pp. 704-717
[Chavez and Heatherton, 2015]
R.S. Chavez, T.F. Heatherton.
Representational similarity of social and valence information in the medial pFC.
Journal of Cognitive Neuroscience, 27 (2015), pp. 73-82
[Chen et al., 2020]
P.H.A. Chen, E. Jolly, J.H. Cheong, L.J. Chang.
Intersubject representational similarity analysis reveals individual variations in affective experience when watching erotic movies.
[Chrastil, 2018]
E.R. Chrastil.
Heterogeneity in human retrosplenial cortex: A review of function and connectivity.
Behavioural Neuroscience, 132 (2018), pp. 317-338
[Chung et al., 2019]
K.M. Chung, S. Kim, W.H. Jung, Y. Kim.
Development and validation of the Yonsei Face Database (YFace DB).
Frontiers in Psychology, 10 (2019), pp. 2626
[Codispoti et al., 2006]
M. Codispoti, V. Ferrari, M.M. Bradley.
Repetitive picture processing: Autonomic and cortical correlates.
Brain Research, 1068 (2006), pp. 213-220
[Cook et al., 1992]
E.W. Cook III, T.L. Davis, L.W. Hawk, E.L. Spence, C.H. Gautier.
Fearfulness and startle potentiation during aversive visual stimuli.
Psychophysiology, 29 (1992), pp. 633-645
[Corbetta, 1998]
M. Corbetta.
Frontoparietal cortical networks for directing attention and the eye to visual locations: Identical, independent, or overlapping neural systems?.
Proceedings of the National Academy of Sciences of the United States of America, 95 (1998), pp. 831-838
[Critchley et al., 2004]
H.D. Critchley, S. Wiens, P. Rotshtein, A. Ohman, R.J. Dolan.
Neural systems supporting interoceptive awareness.
Nature Neuroscience, 7 (2004), pp. 189-195
[Cuthbert et al., 1996]
B.N. Cuthbert, M.M. Bradley, P.J. Lang.
Probing picture perception: Activation and emotion.
Psychophysiology, 33 (1996), pp. 103-111
[Damasio, 1999]
A.R. Damasio.
The feeling of what happens: Body and emotion in the making of consciousness.
Houghton Mifflin Harcourt, (1999),
[DeSteno et al., 2013]
D. DeSteno, J.J. Gross, L. Kubzansky.
Affective science and health: The importance of emotion and emotion regulation.
Health Psychology : Official Journal of the Division of Health Psychology, American Psychological Association, 32 (2013), pp. 474-486
[Dharmaretnam and Fyshe, 2018]
D. Dharmaretnam, A. Fyshe.
The emergence of semantics in neural network representations of visual information.
Proceedings of the conference of the North American chapter of the association for computational linguistics: Human language technologies,
[Dore et al., 2017]
B.P. Dore, J. Weber, K.N. Ochsner.
Neural predictors of decisions to cognitively control emotion.
Journal of Neuroscience, 37 (2017), pp. 2580-2588
[Dricu and Fruhholz, 2020]
M. Dricu, S. Fruhholz.
A neurocognitive model of perceptual decision-making on emotional signals.
Human Brain Mapping, 41 (2020), pp. 1532-1556
[Egger et al., 2019]
M. Egger, M. Ley, S. Hanke.
Emotion recognition from physiological signal analysis: A review.
Electronic Notes in Theoretical Computer Science, 343 (2019), pp. 35-55
[Ekman, 1976]
P. Ekman.
Pictures of facial affect.
Consulting Psychologists Press, (1976),
[Ekman, 1992]
P. Ekman.
An argument for basic emotions.
Cognition & Emotion, 6 (1992), pp. 169-200
[Elliott et al., 2000]
R. Elliott, R.J. Dolan, C.D. Frith.
Dissociable functions in the medial and lateral orbitofrontal cortex: Evidence from human neuroimaging studies.
Cerebral Cortex, 10 (2000), pp. 308-317
[Ethofer et al., 2009]
T. Ethofer, B. Kreifelts, S. Wiethoff, J. Wolf, W. Grodd, P. Vuilleumier, et al.
Differential influences of emotion, task, and novelty on brain regions underlying the processing of speech melody.
Journal of Cognitive Neuroscience, 21 (2009), pp. 1255-1268
[Fan et al., 2016]
L.Z. Fan, H. Li, J.J. Zhuo, Y. Zhang, J.J. Wang, L.F. Chen, et al.
The human Brainnetome atlas: A new brain atlas based on connectional architecture.
Cerebral Cortex, 26 (2016), pp. 3508-3526
[Friston et al., 1995]
K.J. Friston, A.P. Holmes, J.B. Poline, P.J. Grasby, S.C. Williams, R.S. Frackowiak, et al.
Analysis of fMRI time-series revisited.
Neuroimage, 2 (1995), pp. 45-53
[Gallaher, 2001]
S. Gallaher.
Emotion and intersubjective perception: A speculative account.
Emotions, qualia, and consciousness, World Scientific, (2001), pp. 95-100
[Gallo et al., 2018]
S. Gallo, R. Paracampo, L. Muller-Pinzler, M.C. Severo, L. Blomer, C. Fernandes-Henriques, et al.
The causal role of the somatosensory cortex in prosocial behaviour.
[Gerdes et al., 2010]
A.B.M. Gerdes, M.J. Wieser, A. Muhlberger, P. Weyers, G.W. Alpers, M.M. Plichta, et al.
Brain activations to emotional pictures are differentially associated with valence and arousal ratings.
Frontiers in Human Neuroscience, 4 (2010),
[Gourley et al., 2016]
S.L. Gourley, K.S. Zimmermann, A.G. Allen, J.R. Taylor.
The medial orbitofrontal cortex regulates sensitivity to outcome value.
Journal of Neuroscience, 36 (2016), pp. 4600-4613
[Grabenhorst et al., 2008]
F. Grabenhorst, E.T. Rolls, B.A. Parris.
From affective value to decision-making in the prefrontal cortex.
European Journal of Neuroscience, 28 (2008), pp. 1930-1939
[Grootswagers et al., 2020]
T. Grootswagers, B.L. Kennedy, S.B. Most, T.A. Carlson.
Neural signatures of dynamic emotion constructs in the human brain.
[Gu et al., 2019]
S. Gu, F. Wang, C. Cao, E. Wu, Y.Y. Tang, J.H. Huang.
An integrative way for studying neural basis of basic emotions with fMRI.
Frontiers in Neuroscience, 13 (2019), pp. 628
[Gyurak et al., 2011]
A. Gyurak, J.J. Gross, A. Etkin.
Explicit and implicit emotion regulation: A dual-process framework.
Cognition & Emotion, 25 (2011), pp. 400-412
[Hamann, 2012]
S. Hamann.
Mapping discrete and dimensional emotions onto the brain: Controversies and consensus.
Trends in Cognitive Sciences, 16 (2012), pp. 458-466
[Hamilton et al., 2021]
O.S. Hamilton, D. Cadar, A. Steptoe.
Systemic inflammation and emotional responses during the COVID-19 pandemic.
Translational Psychiatry, 11 (2021), pp. 626
[Harry et al., 2013]
B. Harry, M.A. Williams, C. Davis, J. Kim.
Emotional expressions evoke a differential response in the fusiform face area.
Frontiers in Human Neuroscience, 7 (2013), pp. 692
[Haxby et al., 2014]
J.V. Haxby, A.C. Connolly, J.S. Guntupalli.
Decoding neural representational spaces using multivariate pattern analysis.
Annual Review of Neuroscience, 37 (2014), pp. 435-456
[Haxby et al., 2001]
J.V. Haxby, M.I. Gobbini, M.L. Furey, A. Ishai, J.L. Schouten, P. Pietrini.
Distributed and overlapping representations of faces and objects in ventral temporal cortex.
Science, 293 (2001), pp. 2425-2430
[He et al., 2016]
K. He, X. Zhang, S. Ren, J. Sun.
Deep residual learning for image recognition.
Proceedings of the IEEE conference on computer vision and pattern recognition,
[Hooker and Knight, 2006]
C.I. Hooker, R.T. Knight.
The role of lateral orbitofrontal cortex in the inhibitory control of emotion.
The Orbitofrontal Cortex, 307 (2006), pp. 1-18
[Izard et al., 2001]
C. Izard, S. Fine, D. Schultz, A. Mostow, B. Ackerman, E. Youngstrom.
Emotion knowledge as a predictor of social behavior and academic competence in children at risk.
Psychological Science, 12 (2001), pp. 18-23
[Keltner and Haidt, 1999]
D. Keltner, J. Haidt.
Social functions of emotions at four levels of analysis.
Cognition & Emotion, 13 (1999), pp. 505-521
[Kensinger and Schacter, 2006]
E.A. Kensinger, D.L. Schacter.
Processing emotional pictures and words: Effects of valence and arousal.
Cognitive, Affective & Behavioral Neuroscience, 6 (2006), pp. 110-126
[Kim et al., 2015]
J. Kim, J. Schultz, T. Rohe, C. Wallraven, S.W. Lee, H.H. Bülthoff.
Abstract representations of associated emotions in the human brain.
Journal of Neuroscience, 35 (2015), pp. 5655-5663
[Kragel and LaBar, 2016]
P.A. Kragel, K.S. LaBar.
Decoding the nature of emotion in the brain.
Trends in Cognitive Sciences, 20 (2016), pp. 444-455
[Kriegeskorte and Kievit, 2013]
N. Kriegeskorte, R.A. Kievit.
Representational geometry: Integrating cognition, computation, and the brain.
Trends in Cognitive Sciences, 17 (2013), pp. 401-412
[Kriegeskorte and Mur, 2012]
N. Kriegeskorte, M. Mur.
Inverse MDS: Inferring dissimilarity structure from multiple item arrangements.
Frontiers in Psychology, 3 (2012), pp. 245
[Kriegeskorte et al., 2008]
N. Kriegeskorte, M. Mur, P. Bandettini.
Representational similarity analysis - connecting the branches of systems neuroscience.
Frontiers in Systems Neuroscience, 2 (2008), pp. 4
[Lang and Bradley, 2010]
P.J. Lang, M.M. Bradley.
Emotion and the motivational brain.
Biological Psychology, 84 (2010), pp. 437-450
[Lang and Davis, 2006]
P.J. Lang, M. Davis.
Emotion, motivation, and the brain: Reflex foundations in animal and human research.
Progress in Brain Research, 156 (2006), pp. 3-29
[Lee and Almeida, 2021]
D. Lee, J. Almeida.
Within-category representational stability through the lens of manipulable objects.
Cortex; a Journal Devoted to the Study of the Nervous System and Behavior, 137 (2021), pp. 282-291
[Lee et al., 2008]
T.W. Lee, R.J. Dolan, H.D. Critchley.
Controlling emotional expression: Behavioral and neural correlates of nonimitative emotional responses.
Cerebral Cortex, 18 (2008), pp. 104-113
[Lerner et al., 2015]
J.S. Lerner, Y. Li, P. Valdesolo, K.S. Kassam.
Emotion and decision making.
Annual Review of Psychology, 66 (2015), pp. 799-823
[Levine and Burgess, 1997]
L.J. Levine, S.L. Burgess.
Beyond general arousal: Effects of specific emotions on memory.
Social Cognition, 15 (1997), pp. 157-181
[Lewis-Peacock and Norman, 2014]
J.A. Lewis-Peacock, K.A. Norman.
Multi-voxel pattern analysis of fMRI data.
The Cognitive Neurosciences, 512 (2014), pp. 911-920
[Li et al., 2022]
Y.W. Li, M.M. Zhang, S.C. Liu, W.B. Luo.
EEG decoding of multidimensional information from emotional faces.
[Little et al., 2011]
A.C. Little, B.C. Jones, L.M. DeBruine.
The many faces of research on face perception.
Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 366 (2011), pp. 1634-1637
[Mehrabian and Russell, 1974]
A. Mehrabian, J.A. Russell.
An approach to environmental psychology.
M.I.T. Press, (1974),
[Meng et al., 1992]
X.L. Meng, R. Rosenthal, D.B. Rubin.
Comparing correlated correlation coefficients.
Psychological Bulletin, 111 (1992), pp. 172
[Morawetz et al., 2016]
C. Morawetz, S. Bode, J. Baudewig, E. Kirilina, H.R. Heekeren.
Changes in effective connectivity between dorsal and ventral prefrontal regions moderate emotion regulation.
Cerebral Cortex, 26 (2016), pp. 1923-1937
[Munoz and Everling, 2004]
D.P. Munoz, S. Everling.
Look away: The anti-saccade task and the voluntary control of eye movement.
Nature Reviews Neuroscience, 5 (2004), pp. 218-228
[Mur et al., 2013]
M. Mur, M. Meys, J. Bodurka, R. Goebel, P.A. Bandettini, N. Kriegeskorte.
Human object-similarity judgments reflect and transcend the primate-IT object representation.
Frontiers in Psychology, 4 (2013), pp. 128
[Narumoto et al., 2001]
J. Narumoto, T. Okada, N. Sadato, K. Fukui, Y. Yonekura.
Attention to emotion modulates fMRI activity in human right superior temporal sulcus.
Brain Research Cognitive Brain Research, 12 (2001), pp. 225-231
[Nielen et al., 2009]
M.M. Nielen, D.J. Heslenfeld, K. Heinen, J.W. Van Strien, M.P. Witter, C. Jonker, et al.
Distinct brain systems underlie the processing of valence and arousal of affective pictures.
Brain and Cognition, 71 (2009), pp. 387-396
[Nili et al., 2014]
H. Nili, C. Wingfield, A. Walther, L. Su, W. Marslen-Wilson, N. Kriegeskorte.
A toolbox for representational similarity analysis.
Plos Computational Biology, 10 (2014),
[Noonan et al., 2010]
M.P. Noonan, M.E. Walton, T.E. Behrens, J. Sallet, M.J. Buckley, M.F. Rushworth.
Separate value comparison and learning mechanisms in macaque medial and lateral orbitofrontal cortex.
PNAS, 107 (2010), pp. 20547-20552
[Oatley et al., 2006]
K. Oatley, D. Keltner, J.M. Jenkins.
Understanding emotions.
Blackwell Publishing, (2006),
[Okon-Singer et al., 2015]
H. Okon-Singer, T. Hendler, L. Pessoa, A.J. Shackman.
The neurobiology of emotion-cognition interactions: Fundamental questions and strategies for future research.
Frontiers in Human Neuroscience, 9 (2015), pp. 58
[Parkinson et al., 2017]
C. Parkinson, A.M. Kleinbaum, T. Wheatley.
Spontaneous neural encoding of social network position.
Nature Human Behaviour, 1 (2017), pp. 1-7
[Peelen et al., 2010]
M.V. Peelen, A.P. Atkinson, P. Vuilleumier.
Supramodal representations of perceived emotions in the human brain.
Journal of Neuroscience, 30 (2010), pp. 10127-10134
[Petro et al., 2013]
L.S. Petro, F.W. Smith, P.G. Schyns, L. Muckli.
Decoding face categories in diagnostic subregions of primary visual cortex.
European Journal of Neuroscience, 37 (2013), pp. 1130-1139
[Plassmann et al., 2010]
H. Plassmann, J.P. O'Doherty, A. Rangel.
Appetitive and aversive goal values are encoded in the medial orbitofrontal cortex at the time of decision making.
Journal of Neuroscience, 30 (2010), pp. 10799-10808
[Popal et al., 2019]
H. Popal, Y. Wang, I.R. Olson.
A guide to representational similarity analysis for social neuroscience.
Social Cognitive and Affective Neuroscience, 14 (2019), pp. 1243-1253
[Posner et al., 2009]
J. Posner, J.A. Russell, A. Gerber, D. Gorman, T. Colibazzi, S. Yu, et al.
The neurophysiological bases of emotion: An fMRI study of the affective circumplex using emotion-denoting words.
Human Brain Mapping, 30 (2009), pp. 883-895
[Rolls, 2015]
E.T. Rolls.
Emotion and decision-making explained: Response to commentators.
Cortex; a Journal Devoted to the Study of the Nervous System and Behavior, 62 (2015), pp. 203-210
[Russell, 1980]
J.A. Russell.
A circumplex model of affect.
Journal of Personality and Social Psychology, 39 (1980), pp. 1161
[Russell, 2003]
J.A. Russell.
Core affect and the psychological construction of emotion.
Psychological Review, 110 (2003), pp. 145-172
[Saarimaki et al., 2016]
H. Saarimaki, A. Gotsopoulos, I.P. Jaaskelainen, J. Lampinen, P. Vuilleumier, R. Hari, et al.
Discrete neural signatures of basic emotions.
Cerebral Cortex, 26 (2016), pp. 2563-2573
[Said et al., 2010]
C.P. Said, C.D. Moore, A.D. Engell, A. Todorov, J.V. Haxby.
Distributed representations of dynamic facial expressions in the superior temporal sulcus.
Journal of Vision, 10 (2010), pp. 11
[Schafer and Moore, 2007]
R.J. Schafer, T. Moore.
Attention governs action in the primate frontal eye field.
[Schelhorn et al., 2022]
I. Schelhorn, S. Schluter, K. Paintner, Y. Shiban, R. Lugo, M. Meyer, et al.
Emotions and emotion up-regulation during the COVID-19 pandemic in Germany.
[Schlegel et al., 2017]
K. Schlegel, J.R. Fontaine, K.R. Scherer.
The nomological network of emotion recognition ability.
European Journal of Psychological Assessment, (2017),
[Sievers et al., 2021]
B. Sievers, C. Parkinson, P.J. Kohler, J.M. Hughes, S.V. Fogelson, T. Wheatley.
Visual and auditory brain areas share a representational structure that supports emotion perception.
Current Biology, 31 (2021), pp. 5192
[Skerry and Saxe, 2015]
A.E. Skerry, R. Saxe.
Neural representations of emotion are organized around abstract event features.
Current Biology, 25 (2015), pp. 1945-1954
[Smith et al., 2005]
M.L. Smith, G.W. Cottrell, F. Gosselin, P.G. Schyns.
Transmitting and decoding facial expressions.
Psychological Science, 16 (2005), pp. 184-189
[Spreng et al., 2009]
R.N. Spreng, R.A. Mar, A.S. Kim.
The common neural basis of autobiographical memory, prospection, navigation, theory of mind, and the default mode: A quantitative meta-analysis.
Journal of Cognitive Neuroscience, 21 (2009), pp. 489-510
[Todd et al., 2013]
M.T. Todd, L.E. Nystrom, J.D. Cohen.
Confounds in multivariate pattern analysis: Theory and rule representation case study.
Neuroimage, 77 (2013), pp. 157-165
[Toisoul et al., 2021]
A. Toisoul, J. Kossaifi, A. Bulat, G. Tzimiropoulos, M. Pantic.
Estimation of continuous valence and arousal levels from faces in naturalistic conditions.
Nature Machine Intelligence, 3 (2021), pp. 42-50
[Trentacosta and Fine, 2010]
C.J. Trentacosta, S.E. Fine.
Emotion knowledge, social competence, and behavior problems in childhood and adolescence: A meta-analytic review.
Social Development, 19 (2010), pp. 1-29
[Ventura-Bort et al., 2022]
C. Ventura-Bort, J. Wendt, M. Weymar.
New insights on the correspondence between subjective affective experience and physiological responses from representational similarity analysis.
Psychophysiology, (2022),
[Vrana et al., 1988]
S.R. Vrana, E.L. Spence, P.J. Lang.
The startle probe response: A new measure of emotion?.
Journal of Abnormal Psychology, 97 (1988), pp. 487
[Wegrzyn et al., 2017]
M. Wegrzyn, M. Vogt, B. Kireclioglu, J. Schneider, J. Kissler.
Mapping the emotional face. How individual face parts contribute to successful emotion recognition.
[Zochowska et al., 2022]
A. Zochowska, P. Jakuszyk, M.M. Nowicka, A. Nowicka.
Are covered faces eye-catching for us? The impact of masks on attentional processing of self and other faces during the COVID-19 pandemic.
Cortex; a Journal Devoted to the Study of the Nervous System and Behavior, 149 (2022), pp. 173-187

These authors contributed equally to the manuscript and should be considered as first authors.

Copyright © 2023. The Author(s)