Le-Si Wang et al.

Unselfishness is one of the admired facilitators of human group endeavors, especially in times of urgent calls for global collaboration. Despite its importance, the neural dynamics behind its formation are scarcely understood. With 26 triads interacting as turn-taking pairs in a coordination game, we investigated reciprocal interactions in a tri-fMRI hyperscanning experiment. The critical role of the right temporal-parietal junction (rTPJ) was examined with both time- and frequency-domain analyses. For the former, the successful versus failed “reciprocity” contrast identified brain regions associated with the mirror neuron system (MNS) and the mentalizing system (MS). In addition, connectivity differences between the rTPJ (seed region) and the abovementioned network areas (e.g., the right inferior parietal lobule, rIPL) were negatively correlated with individual reward. These results both verified the experimental design, which favored ‘reciprocal’ participants/triads with larger gains, and supported an opposition between the rTPJ (other-concerned) and the rIPL (self-concerned) during successful social exchanges. Furthermore, cerebral synchronization of the rTPJs emerged between the interacting pairs, and coupling between the rTPJ and the right superior temporal gyrus (rSTG) was found between those interacting simultaneously with others of the same group. These coherence findings not only echoed our previous findings, but also reinforced the hypotheses that rTPJ-rTPJ coupling underpins simultaneous collaboration and that rTPJ-rSTG coupling subserves the emergence of decontextualized shared meaning. Taken together, these results support two of the multiple functions (other-concerning and decontextualizing) subserved by the rTPJ, and highlight its interaction with self-concerning brain areas in reaching co-benefits.
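The frequency-domain analyses above quantify inter-brain coupling between interacting partners' rTPJ signals. As a minimal illustrative sketch (not the authors' actual pipeline, which uses coherence analyses in the frequency domain), the idea of measuring moment-to-moment coupling between two simulated BOLD-like series can be conveyed with a sliding-window correlation; all signal parameters below are hypothetical:

```python
import numpy as np

def windowed_coupling(x, y, win=20, step=10):
    """Pearson correlation of two time series in sliding windows.

    A simplified stand-in for inter-brain coupling measures: real
    hyperscanning pipelines typically use (wavelet) coherence, but
    windowed correlation conveys the idea of tracking how strongly
    two participants' regional signals co-vary over time.
    """
    vals = []
    for start in range(0, len(x) - win + 1, step):
        a, b = x[start:start + win], y[start:start + win]
        vals.append(np.corrcoef(a, b)[0, 1])
    return np.array(vals)

rng = np.random.default_rng(0)
shared = np.sin(np.linspace(0, 8 * np.pi, 200))       # common task-driven component
rtpj_subj1 = shared + 0.5 * rng.standard_normal(200)  # simulated rTPJ series, subject 1
rtpj_subj2 = shared + 0.5 * rng.standard_normal(200)  # simulated rTPJ series, subject 2

coupling = windowed_coupling(rtpj_subj1, rtpj_subj2)
print(coupling.mean())  # positive on average for coupled series
```

Because both simulated series share a common component, the windowed correlations are positive on average, mimicking the elevated rTPJ-rTPJ coupling reported for interacting pairs.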

HanShin Jo et al.

The perception of two (or more) simultaneous musical notes can, depending on their pitch interval(s), be broadly categorized as consonant or dissonant. Previous literature has suggested that musicians and non-musicians adopt different strategies when discerning musical intervals: musicians rely on the frequency ratio between the two fundamental frequencies (e.g., the simple ratio 3:2 (C-G), a perfect fifth, is consonant, whereas the complex mixture of semitones 45:32 (C-#F), a tritone, is dissonant), while non-musicians rely on frequency differences (e.g., the presence of beats, perceived as roughness); the two groups also show separate ERP differences at N1 (~160 ms) and P2 (~250 ms) along the midline electrodes. To replicate and extend these findings, in this study we reran the previous experiment and separately collected fMRI data with the same protocol (with sparse-sampling modifications). The behavioral and EEG results to a large extent corresponded to our previous findings. The fMRI results, jointly analyzed with univariate, psycho-physiological interaction, multi-voxel pattern analysis, and representational similarity analysis (RSA) approaches, further reinforce the involvement of midline and related brain regions in consonance/dissonance judgments. The final spatio-temporal searchlight RSA provided convincing evidence that the medial prefrontal cortex, along with bilateral superior temporal cortex, serves as the joint locus of the midline N1 effect, and the dorsal anterior cingulate cortex as that of the P2 effect (for musicians). Together, these analyses not only reaffirm that musicians rely more on top-down knowledge for consonance/dissonance perception, but also demonstrate the advantages of multiple analyses in mutually constraining the findings from both EEG and fMRI.
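The two cues contrasted above, simple versus complex frequency ratios and the beating between nearby partials, can be made concrete with a small arithmetic sketch. This is an illustration of the acoustics only, not any analysis from the study; the root frequency (middle C, approx. 261.63 Hz), the just-intonation tuning, and the crude "minimum beat among low harmonics" roughness cue are all assumptions for the example:

```python
from fractions import Fraction

C4 = 261.63  # Hz, assumed root note (middle C, just intonation)

intervals = {
    "perfect fifth (C-G)": Fraction(3, 2),    # simple ratio -> consonant
    "tritone (C-F#)": Fraction(45, 32),       # complex ratio -> dissonant
}

results = {}
for name, ratio in intervals.items():
    f1, f2 = C4, C4 * float(ratio)
    # Crude roughness cue: the smallest mismatch |m*f1 - n*f2|
    # between low harmonics (m, n = 1..5) of the two tones.
    beats = min(abs(m * f1 - n * f2)
                for m in range(1, 6) for n in range(1, 6))
    results[name] = beats
    print(f"{name}: ratio {ratio}, f2 = {f2:.2f} Hz, min beat {beats:.2f} Hz")
```

For the 3:2 fifth, low harmonics coincide exactly (3*f1 = 2*f2), so the minimum beat is essentially zero; for the 45:32 tritone, no low harmonics align, leaving a sizable mismatch, which is the rough, beating percept the abstract attributes to non-musicians' frequency-difference strategy.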