Detection of mare parturition through balanced multi-scale feature fusion based on improved Libra RCNN.
Abstract: Once a mare experiences parturition abnormalities, the outcome for the foal, live or stillborn, can change rapidly. Automated detection of mare parturition and timely human intervention are therefore crucial to reducing risks to both mare and foal. Because the timing of parturition is unpredictable, manual monitoring in large-scale equine facilities is impractical; this paper proposes an algorithm for detecting mare parturition through balanced multi-scale feature fusion based on an improved Libra RCNN. First, a ResNet101 backbone network incorporating the CBAM attention module was used to enhance parturition feature extraction; next, a balanced content-aware feature reassembly feature pyramid (CARAFE-BFP) was employed to mitigate the effects of data imbalance while improving the quality of feature-map upsampling; finally, the GRoIE module was used to merge CARAFE-BFP's multi-scale features, improving the model's perception of multi-scale objects and subtle feature changes. The model achieved a mean average precision of 86.26% under imbalanced positive and negative samples, subtle parturition feature differences, and a multi-scale data distribution, with a detection speed of 15.06 images per second and an average recall of 98.17%. Moreover, this study employed a statistical method combined with a sliding-window mechanism to assess the algorithm's performance in continuous video-stream monitoring, achieving an accuracy of 92.75% for mare parturition detection. The proposed algorithm achieves non-contact, stress-free, intensive, and automated detection of mare parturition, demonstrating the potential of artificial intelligence technology in animal production management.
Copyright: © 2025 Wang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Publication Date: 2025-03-04 | PubMed ID: 40036288 | PubMed Central: PMC11878943 | DOI: 10.1371/journal.pone.0318498
The Equine Research Bank provides access to a large database of publicly available scientific literature. Inclusion in the Research Bank does not imply endorsement of study methods or findings by Mad Barn.
- Journal Article
Summary
This research summary has been generated with artificial intelligence and may contain errors and omissions. Refer to the original study to confirm details provided.
Overview
- This research presents a new artificial intelligence algorithm designed to automatically detect when a mare (female horse) is about to give birth, improving monitoring and reducing risks associated with equine parturition.
- The proposed method enhances feature extraction and balances multi-scale data using an improved Libra RCNN framework, achieving high accuracy and real-time detection speeds.
Background and Motivation
- Parturition abnormalities in mares can critically influence whether a foal is born alive or stillborn.
- Timely detection of mare labor onset is essential for prompt human intervention to reduce health risks for both mare and foal.
- Current monitoring methods often rely on manual observation in large-scale equine facilities, which is labor-intensive and inefficient because the timing of parturition is unpredictable.
- Automated, non-contact, and real-time detection systems based on video data offer a solution to these challenges.
Research Problem and Challenges
- Detecting mare parturition from video involves several challenges:
- Imbalanced datasets: far fewer positive parturition samples versus negatives.
- Subtle visual differences indicating parturition onset.
- Multi-scale data distribution: features appear at different spatial scales in the images.
- These challenges complicate accurate detection of parturition events using existing computer vision models.
Methodology and Algorithm Design
- The authors proposed an improved Libra RCNN-based detection model incorporating several advanced components to address these challenges:
- Backbone Network – ResNet101 with CBAM Attention Module:
- ResNet101 serves as the feature extractor; it is a deep convolutional neural network known for strong representational power.
- The Convolutional Block Attention Module (CBAM) helps the network focus on important spatial and channel-wise features related to parturition cues.
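To make the attention mechanism concrete, here is a minimal NumPy sketch of CBAM's two sequential gates (channel attention, then spatial attention). The toy MLP weights and the two-scalar combination standing in for CBAM's 7x7 convolution are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x, w1, w2):
    """Channel gate: a shared two-layer MLP (w1, w2) is applied to global
    average- and max-pooled channel descriptors; the summed logits are
    squashed to a per-channel weight in (0, 1)."""
    avg = x.mean(axis=(1, 2))                      # (C,)
    mx = x.max(axis=(1, 2))                        # (C,)
    gate = sigmoid(w2 @ np.maximum(w1 @ avg, 0.0)
                   + w2 @ np.maximum(w1 @ mx, 0.0))
    return x * gate[:, None, None]

def spatial_attention(x, k):
    """Spatial gate: channel-wise average and max maps are combined (here
    by two scalar weights, a stand-in for CBAM's 7x7 conv) and squashed
    to a per-pixel weight in (0, 1)."""
    gate = sigmoid(k[0] * x.mean(axis=0) + k[1] * x.max(axis=0))
    return x * gate[None, :, :]

def cbam(x, w1, w2, k):
    """CBAM applies the channel gate first, then the spatial gate."""
    return spatial_attention(channel_attention(x, w1, w2), k)
```

In the full model these weights are learned end to end; the point of the sketch is that both stages produce multiplicative gates in (0, 1), letting the network suppress uninformative channels and image regions while emphasizing parturition cues.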
- Balanced Content-Aware Feature Reassembly Feature Pyramid (CARAFE-BFP):
- This component handles multi-scale features, improving upsampling quality in feature maps.
- It also balances the feature content to mitigate dataset imbalance problems, helping to detect both common and rare features effectively.
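CARAFE's core idea, content-aware reassembly of features, can be illustrated with a small NumPy sketch: each upsampled pixel is a weighted sum of the k x k input neighborhood around its source location, using a per-location softmax-normalized kernel. Taking the kernel logits as given is an assumption of this sketch; in CARAFE they are predicted from the feature content by a small convolutional module:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def carafe_upsample(x, kernel_logits, scale=2, k=3):
    """Content-aware reassembly: every output pixel is a weighted sum of
    the k x k input neighborhood around its source pixel, with a
    per-output-location softmax-normalized reassembly kernel."""
    C, H, W = x.shape
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)), mode="edge")
    out = np.zeros((C, H * scale, W * scale))
    for i in range(H * scale):
        for j in range(W * scale):
            si, sj = i // scale, j // scale              # source pixel
            w = softmax(kernel_logits[i, j].ravel()).reshape(k, k)
            patch = xp[:, si:si + k, sj:sj + k]          # (C, k, k)
            out[:, i, j] = (patch * w).sum(axis=(1, 2))
    return out
```

Because each kernel is non-negative and sums to one, the upsampled map stays within the value range of the input, unlike naive learned deconvolution, which is part of why CARAFE improves upsampling quality.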
- GRoIE Module (Generic RoI Extractor):
- Designed to merge multi-scale features from CARAFE-BFP.
- Enhances the model’s sensitivity to multi-scale objects and subtle changes related to parturition.
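The GRoIE idea of fusing RoI features from every pyramid level, rather than assigning each RoI to one "best" FPN level, can be sketched as follows. The single-vector average pooling and per-level linear transforms are simplifications of the real RoI-Align and convolutional post-processing stages:

```python
import numpy as np

def roi_pool(feat, box):
    """Crop a normalized box (x0, y0, x1, y1 in [0, 1]) from a (C, H, W)
    feature map and average-pool the crop to a single C-vector."""
    C, H, W = feat.shape
    x0, y0, x1, y1 = box
    r0, r1 = int(y0 * H), max(int(np.ceil(y1 * H)), int(y0 * H) + 1)
    c0, c1 = int(x0 * W), max(int(np.ceil(x1 * W)), int(x0 * W) + 1)
    return feat[:, r0:r1, c0:c1].mean(axis=(1, 2))

def groie_extract(pyramid, box, weights):
    """GRoIE-style extraction: pool the SAME box from every pyramid level,
    apply a per-level transform, and sum the results, instead of routing
    the RoI to a single level as standard FPN assignment does."""
    pooled = [w @ roi_pool(level, box) for level, w in zip(pyramid, weights)]
    return np.sum(pooled, axis=0)
```

Summing over all levels means fine-scale and coarse-scale evidence both reach the detection head, which is what makes the module sensitive to multi-scale objects and subtle feature changes.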
Performance and Results
- The algorithm achieved a mean average precision (mAP) of 86.26% despite difficult conditions of imbalanced datasets and subtle feature differences.
- Detection speed reached 15.06 images per second, suitable for near real-time monitoring.
- Recall rate was 98.17%, indicating very effective detection of true parturition events.
- Additionally, a statistical method combined with a sliding window mechanism was proposed to analyze continuous video streams, improving detection reliability over time.
- In video monitoring scenarios, the method achieved a 92.75% accuracy rate for mare parturition detection.
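The summary does not detail the statistical sliding-window rule, so the sketch below shows one plausible form such a mechanism could take: per-frame detector outputs are accumulated over a fixed window, and a parturition alarm is raised only when the fraction of positive frames crosses a threshold. The window size and threshold here are illustrative assumptions:

```python
from collections import deque

def windowed_alarm(frame_flags, window=30, threshold=0.8):
    """Smooth noisy per-frame detections: raise a parturition alarm only
    when the positive fraction over the last `window` frames reaches
    `threshold`. Returns the index of the first alarm frame, or None."""
    buf = deque(maxlen=window)
    for i, flag in enumerate(frame_flags):
        buf.append(flag)
        if len(buf) == window and sum(buf) / window >= threshold:
            return i
    return None
```

Requiring sustained agreement across a window suppresses isolated false positives from single frames, which is how a statistical smoothing step can lift per-frame detections into reliable event-level decisions on a continuous video stream.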
Significance and Implications
- The study introduces a non-contact and stress-free automated system for intensive monitoring of mare parturition, reducing reliance on laborious manual observation.
- Demonstrates the potential of AI-based object detection technologies in improving animal production management and welfare.
- The balanced multi-scale feature fusion approach addresses common challenges in medical and veterinary image-based detection tasks, suggesting applicability to other domains with imbalanced and subtle visual data.
Summary
- The research advances automated detection of critical animal physiological events utilizing deep learning and enhanced multi-scale feature fusion techniques.
- It offers a practical solution for monitoring mare labor with high accuracy and speed, supporting timely intervention and thus improved outcomes for mares and foals.
Cite This Article
APA
Wang B, Duan W, Zhao J, Bai D.
(2025).
Detection of mare parturition through balanced multi-scale feature fusion based on improved Libra RCNN.
PLoS One, 20(3), e0318498.
https://doi.org/10.1371/journal.pone.0318498
Researcher Affiliations
- College of Computer and Information Engineering, Inner Mongolia Agricultural University, Hohhot, Inner Mongolia, China.
- Key Laboratory of Smart Animal Husbandry at Universities of Inner Mongolia Autonomous Region, Inner Mongolia Agricultural University, Inner Mongolia, China.
- College of Animal Science, Inner Mongolia Agricultural University, Hohhot, Inner Mongolia, China.
MeSH Terms
- Animals
- Horses / physiology
- Female
- Parturition / physiology
- Pregnancy
- Algorithms
- Neural Networks, Computer
Conflict of Interest Statement
The authors have declared that no competing interests exist.
References
This article includes 47 references
- Lanci A, Perina F, Donadoni A, Castagnetti C, Mariella J. Dystocia in the standardbred mare: a retrospective study from 2004 to 2020. Animals (Basel) 2022;12(12):1486.
- Ille N, Aurich C, Aurich J. Physiological stress responses of mares to gynecologic examination in veterinary medicine. J Equine Veter Sci 2016;43:6–11.
- Hartmann C, Lidauer L, Aurich J, Aurich C, Nagel C. Detection of the time of foaling by accelerometer technique in horses (Equus caballus)-a pilot study. Reprod Domest Anim 2018;53(6):1279–86.
- Aoki T, Shibata M, Violin G, Higaki S, Yoshioka K. Detection of foaling using a tail-attached device with a thermistor and tri-axial accelerometer in pregnant mares. PLoS One 2023;18(6):e0286807.
- Müller A, Glüge S, Vidondo B, Wróbel A, Ott T, Sieme H. Increase of skin temperature prior to parturition in mares. Theriogenology 2022;190:46–51.
- Nagel C, Aurich C, Aurich J. Stress effects on the regulation of parturition in different domestic animal species. Animal Reprod Sci 2019;207:153–61.
- Nuñez CMV, Adelman JS, Smith J, Gesquiere LR, Rubenstein DI. Linking social environment and stress physiology in feral mares (Equus caballus): group transfers elevate fecal cortisol levels. Gen Comp Endocrinol 2014;196:26–33.
- Nagel C, Melchert M, Aurich J, Aurich C. Road transport of late pregnant mares advances the onset of foaling. J Equine Veter Sci 2018;66:252.
- Sharma VK, Mir RN. A comprehensive and systematic look up into deep learning based object detection techniques: a review. Comput Sci Rev 2020;38:100301.
- Mahmud MS, Zahid A, Das AK, Muzammil M, Khan MU. A systematic literature review on deep learning applications for precision cattle farming. Comput Electron Agricult 2021;187:106313.
- García R, Aguilar J, Toro M, Pinto A, Rodríguez P. A systematic literature review on the use of machine learning in precision livestock farming. Comput Electron Agricult 2020;179:105826.
- Li X, Xu F, Gao H, Liu F, Lyu X. A frequency domain feature-guided network for semantic segmentation of remote sensing images. IEEE Signal Process Lett 2024.
- Appe SN, G A, Gn B. CAM-YOLO: tomato detection and classification based on improved YOLOv5 using combining attention mechanism. PeerJ Comput Sci 2023;9:e1463.
- Balaji GN, Parthasarathy G. A modified convolutional neural network for tumor segmentation in multimodal brain magnetic resonance images. AIP Conf Proc 2024;2919:050008.
- Qiao Y, Guo Y, He D. Cattle body detection based on YOLOv5-ASFF for precision livestock farming. Comput Electron Agricult 2023;204:107579.
- Bhujel A, Arulmozhi E, Moon B-E, Kim H-T. Deep-learning-based automatic monitoring of pigs’ physico-temporal activities at different greenhouse gas concentrations. Animals (Basel) 2021;11(11):3089. doi: 10.3390/ani11113089
- Lei K, Zong C, Yang T, Peng S, Zhu P, Wang H, et al. Detection and analysis of sow targets based on image vision. Agriculture 2022;12(1):73. doi: 10.3390/agriculture12010073
- Peng J, Wang D, Liao X, Shao Q, Sun Z, Yue H, et al. Wild animal survey using UAS imagery and deep learning: modified Faster R-CNN for kiang detection in Tibetan Plateau. ISPRS J Photogram Remote Sens. 2020;169:364–76. doi: 10.1016/j.isprsjprs.2020.08.026
- Chen C, Zhu W, Norton T. Behaviour recognition of pigs and cattle: journey from computer vision to deep learning. Comput Electron Agricult. 2021;187:106255. doi: 10.1016/j.compag.2021.106255
- Gu Z, Zhang H, He Z, Niu K. A two-stage recognition method based on deep learning for sheep behavior. Comput Electron Agricult. 2023;212:108143. doi: 10.1016/j.compag.2023.108143
- Liu L, Zhou J, Zhang B, Dai S, Shen M. Visual detection on posture transformation characteristics of sows in late gestation based on Libra R-CNN. Biosyst Eng. 2022;223:219–31. doi: 10.1016/j.biosystemseng.2022.09.003
- Ji H, Yu J, Lao F, Zhuang Y, Wen Y, Teng G. Automatic position detection and posture recognition of grouped pigs based on deep learning. Agriculture 2022;12(9):1314. doi: 10.3390/agriculture12091314
- Niknejad N, Caro JL, Bidese-Puhl R, Bao Y, Staiger EA. Equine kinematic gait analysis using stereo videography and deep learning: stride length and stance duration estimation. J ASABE. 2023.
- Liu C, Su J, Wang L, Lu S, Li L. LA-DeepLab V3+: a novel counting network for pigs. Agriculture 2022;12(2):284. doi: 10.3390/agriculture12020284
- Zhou Z. Detection and counting method of pigs based on YOLOV5_Plus: a combination of YOLOV5 and attention mechanism. Math Prob Eng. 2022;2022:1–16. doi: 10.1155/2022/7078670
- Li X, Xu F, Li L, Xu N, Liu F, Yuan C, et al. AAFormer: attention-attended transformer for semantic segmentation of remote sensing images. IEEE Geosci Remote Sens Lett. 2024.
- Li X, Xu F, Liu F, Lyu X, Tong Y, Xu Z, et al. A synergistical attention model for semantic segmentation of remote sensing images. IEEE Trans Geosci Remote Sensing. 2023;61:1–16. doi: 10.1109/tgrs.2023.3243954
- Li X, Xu F, Yong X, Chen D, Xia R, Ye B, et al. SSCNet: a spectrum-space collaborative network for semantic segmentation of remote sensing images. Remote Sens 2023;15(23):5610. doi: 10.3390/rs15235610
- Pang J, Chen K, Shi J, Feng H, Ouyang W, Lin D. Libra r-cnn: towards balanced learning for object detection. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2019. p. 821–30.
- He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2016. p. 770–8.
- Woo S, Park J, Lee JY, Kweon IS. CBAM: convolutional block attention module. In: Proceedings of the European Conference on Computer Vision (ECCV); 2018. p. 3–19.
- Rossi L, Karimi A, Prati A. A novel region of interest extraction layer for instance segmentation. In: Proceedings of the 2020 25th International Conference on Pattern Recognition (ICPR). IEEE; 2021. p. 2203–9.
- Wang Z, Bovik AC, Sheikh HR, Simoncelli EP. Image quality assessment: from error visibility to structural similarity. IEEE Trans Image Process. 2004;13(4):600–12. doi: 10.1109/tip.2003.819861
- Lin TY, Maire M, Belongie S, Hays J, Perona P, Ramanan D, et al. Microsoft coco: common objects in context. In: Proceedings of the Computer Vision–ECCV 2014:13th European Conference, Zurich, Switzerland, 2014 Sept 6–12, Part V 13. Springer; 2014. p. 740–55.
- Hu J, Shen L, Sun G. Squeeze-and-excitation networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 2018 June 18–22. 2018. p. 7132–41.
- Cao Y, Xu J, Lin S, Wei F, Hu H. GCNet: non-local networks meet squeeze-excitation networks and beyond. In: Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCVW), Seoul, South Korea, 2019 Oct 27–Nov 2. 2019. p. 1971–80.
- Lin TY, Dollar P, Girshick R, He K, Hariharan B, Belongie S. Feature pyramid networks for object detection. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2017. pp. 2117–25.
- Wang J, Chen K, Xu R, Liu Z, Loy CC, Lin D. Carafe: content-aware reassembly of features. In: Proceedings of the IEEE/CVF International Conference On Computer Vision; 2019. p. 3007–16.
- He K, Gkioxari G, Dollar P, Girshick R. Mask R-CNN. In: Proceedings of the IEEE International Conference on Computer Vision; 2017. p. 2961–9.
- Ioffe S, Szegedy C. Batch normalization: accelerating deep network training by reducing internal covariate shift. In: Proceedings of the International Conference on Machine Learning. PMLR; 2015. p. 448–56.
- Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition. In: Proceedings of the 3rd International Conference on Learning Representations (ICLR); 2015.
- Zhang H, Wu C, Zhang Z, Zhu Y, Lin H, Zhang Z, et al. Resnest: split-attention networks. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2022. p. 2736–46.
- Ren S, He K, Girshick R, Sun J. Faster r-cnn: towards real-time object detection with region proposal networks. Adv Neural Inf Process Syst.2015;28.
- Tian Z, Shen C, Chen H, He T. Fcos: fully convolutional one-stage object detection. In: Proceedings of the IEEE/CVF International Conference on Computer Vision; 2019. p. 9627–36.
- Redmon J, Farhadi A. YOLOv3: an incremental improvement. In: Computer Vision and Pattern Recognition; 2018. p. 1–6.
- Zhou B, Khosla A, Lapedriza A, Oliva A, Torralba A. Learning deep features for discriminative localization. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2016. p. 2921–9.
- Zeng W, He M. Rice disease segmentation method based on CBAM-CARAFE-DeepLabv3+. Crop Protection. 2024;180:106665. doi: 10.1016/j.cropro.2024.106665
Citations
This article has been cited 0 times.