Abstract: Current research on the acoustic encoding of emotional content suggests that there are universal cues allowing decoding within and across taxa. This is particularly important for human-animal relationships, in which domestic animals are thought to be particularly efficient at decoding human emotions. Here we investigated whether the decoding of emotional content in human voices relies on shared universal acoustic properties or can be influenced by experience. Emotional human voices were presented to two populations of horses while behavioral, cardiac, and brain responses were measured. The two populations differed in their living and working conditions: one lived in naturalistic conditions (stable social groups on pasture) and was ridden occasionally for outdoor trail riding by one to a few different riders, while the other was kept in more restricted conditions (individual stalls) and participated in riding lessons involving many different riders. Assessment of the horses' welfare state (animal-based measures) and their relationships with humans, performed independently of the playback experiments, revealed that the populations differed in both respects. Although both populations reacted to the angry human voice, the population with the better welfare state and relationship with humans showed little differentiation between the emotional voices and exhibited low behavioral reactions. In contrast, the other population showed strong behavioral and cardiac reactions to all negative voices. Brain responses also differed, with the first population showing higher responses (increased gamma, i.e., excitation) to the happy voice and the second to fear and anger (increased theta, i.e., alarm). Thus, animals' affective state and past experiences appear highly influential in their perception of (cross-taxa) acoustic emotional cues.
Research Overview
This study examined how horses decode emotional content in human voices and how this ability is influenced by their living conditions and relationships with humans.
Researchers measured behavioral, cardiac, and brain responses in two horse populations exposed to human vocalizations expressing different emotions.
Background and Purpose
Acoustic encoding of emotions is thought to contain universal cues that help decode emotions across different species.
Domestic animals like horses are thought to be efficient at decoding human emotions because of their close interactions with humans.
The study aimed to determine whether horses decode human emotional voices based on universal acoustic properties or whether their learning and experience influence this decoding.
Subjects and Experimental Design
Two populations of horses were studied:
Population 1: Lived in naturalistic conditions with stable social groups on pastures; these horses were ridden occasionally by one to a few familiar riders, mainly for trail riding.
Population 2: Kept in more restricted conditions in individual stalls; participated in riding lessons with many different riders.
Researchers independently assessed:
The horses’ welfare state using animal-based measures.
The quality of the horses’ relationships with humans.
Both populations were exposed to human voices expressing different emotions (anger, happiness, fear).
Behavioral changes, heart rate (cardiac response), and brain activity were recorded during the exposure.
Key Findings
Response to Angry Voices: Both populations reacted to angry human voices, indicating some universal recognition of negative emotions.
Differences in Emotional Differentiation and Reactivity:
Population 1 (better welfare and closer human relationship)
Showed little behavioral difference between different emotional voices.
Exhibited generally low behavioral reactivity.
Brain showed increased gamma waves, linked to excitation, in response to happy voices.
Population 2 (poorer welfare and varied human contact)
Exhibited strong behavioral and cardiac reactions to all negative voices (anger and fear).
Brain showed increased theta waves, indicative of alarm or heightened vigilance, especially for fear and anger voices.
Interpretation and Implications
The study shows that horses can decode some emotional cues in human voices, likely based on universal acoustic properties (e.g., recognizing anger).
However, this decoding is not purely innate or universal but is heavily influenced by the horses’ chronic affective state and their previous experiences with humans.
Horses in better welfare conditions and with stronger bonds to humans appear less reactive or more selective in their emotional responses, possibly due to habituation or better emotional regulation.
Conversely, horses in poorer welfare conditions and with more variable human contact display heightened sensitivity and alarm responses to negative human emotions.
This underscores the importance of welfare and human-animal relationships in shaping how domestic animals perceive and respond to human emotional signals.
Broader Context
Findings support the idea that while there are cross-species emotional vocal cues, experience and individual emotional states modulate perception.
Enhanced understanding of emotional decoding in horses can improve training, welfare, and human-animal communication strategies.
The study contributes to ethology and animal cognition by integrating behavioral, physiological, and neurological data to explore emotion perception in non-human animals.
Cite This Article
APA
d'Ingeo, S., Siniscalchi, M., Quaranta, A., Cousillas, H., & Hausberger, M. (2025). Chronic state and relationship to humans influence how horses decode emotions in human voices: A brain and behavior study. Animals (Basel), 15(21), 3217. https://doi.org/10.3390/ani15213217