Infos Autisme
This forum hosts information, article summaries, and event announcements concerning autism, as well as research on autism spectrum disorders.
Posted by: root
Date: February 04, 2016 07:38AM

Human sounds convey emotions more clearly and quickly than words
January 19, 2016


It takes just one-tenth of a second for our brains to begin to recognize emotions conveyed by vocalizations, according to researchers from McGill. It doesn't matter whether the non-verbal sounds are growls of anger, the laughter of happiness or cries of sadness. More importantly, the researchers have also discovered that we pay more attention when an emotion (such as happiness, sadness or anger) is expressed through vocalizations than we do when the same emotion is expressed in speech.

The researchers believe that the speed with which the brain 'tags' these vocalizations, and the preference given to them over language, is due to the potentially crucial role that decoding vocal sounds has played in human survival.

"The identification of emotional vocalizations depends on systems in the brain that are older in evolutionary terms," says Marc Pell, Director of McGill's School of Communication Sciences and Disorders and the lead author on the study that was recently published in Biological Psychology. "Understanding emotions expressed in spoken language, on the other hand, involves more recent brain systems that have evolved as human language developed."

Of nonsense speech and growls

The researchers were interested in finding out whether the brain responded differently when emotions were expressed through vocalizations (sounds such as growls, laughter or sobbing, where no words are used) or through language. They focused on three basic emotions: anger, sadness and happiness. They tested 24 participants by playing a random mix of vocalizations and nonsense speech (e.g., "The dirms are in the cindabal") spoken with different emotional intent. (The researchers used nonsense phrases in order to avoid any linguistic cues about emotions.) They asked participants to identify which emotions the speakers were trying to convey, and used an EEG to record how quickly and in what ways the brain responded as the participants heard the different types of emotional vocal sounds.

They were able to measure:

how the brain responds to emotions expressed through vocalizations compared to spoken language with millisecond precision;

whether certain emotions are recognized more quickly through vocalizations than others and produce larger brain responses; and

whether people who are anxious are particularly sensitive to emotional voices based on the strength of their brain response.

Anger leaves longer traces—especially for those who are anxious

The researchers found that the participants were able to detect vocalizations of happiness (i.e., laughter) more quickly than vocal sounds conveying either anger or sadness. Interestingly, however, angry sounds and angry speech both produced ongoing brain activity that lasted longer than that for either of the other two emotions, suggesting that the brain pays special attention to anger signals.

"Our data suggest that listeners engage in sustained monitoring of angry voices, irrespective of the form they take, to grasp the significance of potentially threatening events," says Pell.

The researchers also discovered that individuals who are more anxious have a faster and more heightened response to emotional voices in general than people who are less anxious.

"Vocalizations appear to have the advantage of conveying meaning in a more immediate way than speech," says Pell. "Our findings are consistent with studies of non-human primates which suggest that vocalizations that are specific to a species are treated preferentially by the neural system over other sounds."


Emotions : Negative emotional face perception is diminished on a very early level of processing in autism spectrum disorder.
Posted by: root
Date: May 04, 2020 03:14PM

Deficits in facial affect recognition (FAR) are often reported in autism spectrum disorder (ASD) and attributed to inappropriate visual search strategies. It is unclear, however, whether deficits in subliminal FAR are still present in autism when visual focus is controlled. Thirteen persons with ASD and 13 healthy participants took part in this experiment. Supraliminal FAR was assessed using a standardized, computer-aided test. Subliminal FAR was measured with an emotional face-priming paradigm. An eye-tracking technique ensured that the initial visual focus was on the eyes of the prime. Persons with ASD showed worse supraliminal FAR. Even with initial gaze direction controlled, participants with ASD also showed reduced negative face priming. These data confirm that FAR is already disturbed at a pre-attentive level in autism.

Keywords: Emotion; autism spectrum disorder; face; recognition; visual priming

Soc Neurosci. 2019 Apr;14(2):191-194. doi: 10.1080/17470919.2018.1441904. Epub 2018 Feb 22.
Prehn-Kristensen A, Lorenzen A, Grabe F, Baving L.
