PLoS One 2024; 19(7): e0302893; doi: 10.1371/journal.pone.0302893

Automated recognition of emotional states of horses from facial expressions.

Abstract: Animal affective computing is an emerging new field, which has so far mainly focused on pain, while other emotional states remain uncharted territories, especially in horses. This study is the first to develop AI models to automatically recognize horse emotional states from facial expressions using data collected in a controlled experiment. We explore two types of pipelines: a deep learning one which takes as input video footage, and a machine learning one which takes as input EquiFACS annotations. The former outperforms the latter, with 76% accuracy in separating between four emotional states: baseline, positive anticipation, disappointment and frustration. Anticipation and frustration were difficult to separate, with only 61% accuracy.
Publication Date: 2024-07-15 | PubMed ID: 39008504 | DOI: 10.1371/journal.pone.0302893
The Equine Research Bank provides access to a large database of publicly available scientific literature. Inclusion in the Research Bank does not imply endorsement of study methods or findings by Mad Barn.
  • Journal Article

Summary

This research summary has been generated with artificial intelligence and may contain errors and omissions. Refer to the original study to confirm details provided.

The research article discusses a groundbreaking study that uses artificial intelligence (AI) models to recognize and interpret the emotional states of horses through their facial expressions. This is the first of its kind in a growing field known as animal affective computing.

Overview of the Research

The researchers wanted to understand and interpret the emotional states of horses. Prior studies in the field of animal affective computing primarily focused on identifying pain in animals. This study, however, aimed to depart from that approach and explore emotions beyond pain in horses. The primary tools were AI models capable of interpreting horses’ facial expressions.

Research Method and Techniques

Two different AI approaches were implemented to recognize the horses’ emotional states:

  • Deep Learning Model: This model makes use of video footage as its data source.
  • Machine Learning Model: This model utilizes EquiFACS (Equine Facial Action Coding System) annotations for data.
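Conceptually, the EquiFACS-based pipeline treats each observation as a vector of facial action-unit annotations and feeds it to an ordinary classifier. The sketch below is a hypothetical illustration only: a simple nearest-centroid classifier over invented binary action-unit vectors, not the study's actual model, features, or data.

```python
# Hypothetical sketch of an EquiFACS-style pipeline: each sample is a
# binary vector recording presence/absence of facial action units, and
# a nearest-centroid classifier assigns one of the four states.
# All action-unit vectors below are invented for illustration.

STATES = ["baseline", "positive_anticipation", "disappointment", "frustration"]

def centroid(vectors):
    """Element-wise mean of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_centroid(sample, centroids):
    """Return the state whose centroid is closest in squared Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda state: dist(sample, centroids[state]))

# Invented training data: rows are presence/absence of a few action units.
train = {
    "baseline":              [[0, 0, 0, 0], [0, 0, 0, 1]],
    "positive_anticipation": [[1, 1, 0, 0], [1, 1, 0, 1]],
    "disappointment":        [[0, 0, 1, 0], [0, 0, 1, 1]],
    "frustration":           [[1, 0, 1, 1], [1, 1, 1, 1]],
}

centroids = {state: centroid(rows) for state, rows in train.items()}
prediction = nearest_centroid([1, 1, 0, 0], centroids)
```

The deep learning pipeline differs in that it learns its features directly from video frames instead of relying on manually coded action units, which may explain its edge in the comparison below.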

Comparison of the Two Pipeline Approaches

The two models were evaluated on the same classification task. The deep learning model, which used raw video footage, outperformed the machine learning model that used EquiFACS annotations.

Emotional States Detectable

The AI models aimed to distinguish four different emotional states:

  • Baseline
  • Positive Anticipation
  • Disappointment
  • Frustration

Results of the Study

The deep learning model classified the four emotional states with an overall accuracy of 76%. However, it struggled to distinguish ‘positive anticipation’ from ‘frustration’, achieving only 61% accuracy on that pair.
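To see how a strong overall accuracy can coexist with a much lower accuracy on one pair of classes, it helps to look at a confusion matrix and restrict it to the two confusable classes. The counts below are invented for illustration and are not the study's actual results.

```python
# Illustration of overall vs. pairwise accuracy from a confusion matrix.
# The confusion counts are invented; they are NOT the study's data.

labels = ["baseline", "anticipation", "disappointment", "frustration"]
# confusion[i][j] = number of clips of true class i predicted as class j
confusion = [
    [23, 1, 1, 0],    # baseline
    [0, 15, 2, 8],    # anticipation: often mistaken for frustration
    [1, 2, 21, 1],    # disappointment
    [0, 9, 1, 15],    # frustration: often mistaken for anticipation
]

def overall_accuracy(cm):
    """Fraction of all samples on the diagonal (correctly classified)."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

def pairwise_accuracy(cm, i, j):
    """Accuracy restricted to classes i and j, counting only
    predictions that fall within {i, j}."""
    correct = cm[i][i] + cm[j][j]
    total = cm[i][i] + cm[i][j] + cm[j][i] + cm[j][j]
    return correct / total

acc = overall_accuracy(confusion)          # high overall
pair = pairwise_accuracy(confusion,        # much lower on the hard pair
                         labels.index("anticipation"),
                         labels.index("frustration"))
```

In this invented example the overall accuracy is 74% while the anticipation/frustration pair drops to about 64%, mirroring the pattern the study reports (76% overall vs. 61% on that pair).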

Cite This Article

APA
Feighelstein M, Ricci-Bonot C, Hasan H, Weinberg H, Rettig T, Segal M, Distelfeld T, Shimshoni I, Mills DS, Zamansky A. (2024). Automated recognition of emotional states of horses from facial expressions. PLoS One, 19(7), e0302893. https://doi.org/10.1371/journal.pone.0302893

Publication

ISSN: 1932-6203
NlmUniqueID: 101285081
Country: United States
Language: English
Volume: 19
Issue: 7
Pages: e0302893

Researcher Affiliations

Feighelstein, Marcelo
  • Information Systems Department, University of Haifa, Haifa, Israel.
Ricci-Bonot, Claire
  • Computer Science Department, University of Haifa, Haifa, Israel.
Hasan, Hana
  • Information Systems Department, University of Haifa, Haifa, Israel.
Weinberg, Hallel
  • Information Systems Department, University of Haifa, Haifa, Israel.
Rettig, Tidhar
  • Information Systems Department, University of Haifa, Haifa, Israel.
Segal, Maya
  • Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel.
Distelfeld, Tomer
  • Faculty of Electrical Engineering, Technion, Israel Institute of Technology, Haifa, Israel.
Shimshoni, Ilan
  • Information Systems Department, University of Haifa, Haifa, Israel.
Mills, Daniel S
  • Department of Life Sciences, Joseph Banks Laboratories, University of Lincoln, Lincoln, United Kingdom.
Zamansky, Anna
  • Information Systems Department, University of Haifa, Haifa, Israel.

MeSH Terms

  • Horses / psychology
  • Animals
  • Facial Expression
  • Emotions / physiology
  • Machine Learning
  • Deep Learning
  • Male
  • Humans

Conflict of Interest Statement

The authors have declared that no competing interests exist.

Citations

This article has been cited 5 times.
  1. Guo X, Shi L, Ma B, Feng C, Liu Z. Research on improved models for facial expression recognition in mice with abnormal glucose metabolism. Sci Rep 2026 Feb 10;16(1).
    doi: 10.1038/s41598-026-38863-3 | pubmed: 41667608
  2. Bhave A, Kieson E, Hafner A, Gloor PA. Identifying Novel Emotions and Wellbeing of Horses from Videos Through Unsupervised Learning. Sensors (Basel) 2025 Jan 31;25(3).
    doi: 10.3390/s25030859 | pubmed: 39943498
  3. O'Connell E, Dyson S, McLean A, McGreevy P. No More Evasion: Redefining Conflict Behaviour in Human-Horse Interactions. Animals (Basel) 2025 Jan 31;15(3).
    doi: 10.3390/ani15030399 | pubmed: 39943169
  4. Feighelstein M, Ricci-Bonot C, Hasan H, Weinberg H, Rettig T, Segal M, Distelfeld T, Shimshoni I, Mills DS, Zamansky A. Correction: Automated recognition of emotional states of horses from facial expressions. PLoS One 2025;20(2):e0319501.
    doi: 10.1371/journal.pone.0319501 | pubmed: 39937778
  5. König von Borstel U, Kienapfel K, McLean A, Wilkins C, McGreevy P. Hyperflexing the horse's neck: a systematic review and meta-analysis. Sci Rep 2024 Oct 2;14(1):22886.
    doi: 10.1038/s41598-024-72766-5 | pubmed: 39358404