Cross-Modality Interaction Network for Equine Activity Recognition Using Imbalanced Multi-Modal Data
Abstract: With the recent advances in deep learning, wearable sensors have increasingly been used in automated animal activity recognition. However, there are two major challenges in improving recognition performance: multi-modal feature fusion and imbalanced data modeling. In this study, to improve classification performance for equine activities while tackling these two challenges, we developed a cross-modality interaction network (CMI-Net) involving a dual convolution neural network architecture and a cross-modality interaction module (CMIM). The CMIM adaptively recalibrated the temporal- and axis-wise features in each modality by leveraging multi-modal information to achieve deep intermodality interaction. A class-balanced (CB) focal loss was adopted to supervise the training of CMI-Net to alleviate the class imbalance problem. Motion data was acquired from six neck-attached inertial measurement units on six horses. The CMI-Net was trained and verified with leave-one-out cross-validation. The results demonstrated that our CMI-Net outperformed the existing algorithms with high precision (79.74%), recall (79.57%), F1-score (79.02%), and accuracy (93.37%). The adoption of CB focal loss improved the performance of CMI-Net, with increases of 2.76%, 4.16%, and 3.92% in precision, recall, and F1-score, respectively. In conclusion, CMI-Net and CB focal loss effectively enhanced the equine activity classification performance using imbalanced multi-modal sensor data.
Publication Date: 2021-08-29 | PubMed ID: 34502709 | PubMed Central: PMC8434387 | DOI: 10.3390/s21175818
The Equine Research Bank provides access to a large database of publicly available scientific literature. Inclusion in the Research Bank does not imply endorsement of study methods or findings by Mad Barn.
- Journal Article
Summary
This research summary has been generated with artificial intelligence and may contain errors and omissions. Refer to the original study to confirm details provided.
This research develops a new network, called Cross-Modality Interaction Network (CMI-Net), aimed at improving the recognition of activities in horses using data from wearable sensors. By addressing challenges with imbalanced data and multi-modal feature fusion, the network provides more accurate results than existing algorithms.
Introduction
- The article discusses the use of deep learning and wearable sensors in animal activity recognition, specifically equine activities. The focus is on overcoming two key challenges in this field: multi-modal feature fusion and handling imbalanced data.
Cross-Modality Interaction Network (CMI-Net)
- In order to improve the classification of equine activities, the researchers developed a Cross-Modality Interaction Network (CMI-Net). This network incorporates a dual convolution neural network architecture and a cross-modality interaction module (CMIM).
- The CMIM makes use of multi-modal information to recalibrate the temporal- and axis-wise features of each modality, enabling deeper intermodality interaction.
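The recalibration idea described above can be illustrated with a minimal squeeze-and-excitation-style sketch. This is not the authors' exact CMIM; it is a simplified numpy illustration in which a joint descriptor pooled from both modalities produces per-channel gates for each modality, and the mixing weights (here random for demonstration) would be learned in a trained network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cross_modal_recalibrate(feat_a, feat_b):
    """Illustrative cross-modal recalibration (not the published CMIM).

    feat_a, feat_b: feature maps of shape (channels, time) for two
    modalities (e.g. accelerometer and gyroscope streams).
    """
    # Squeeze: global average over time, concatenated across modalities
    desc = np.concatenate([feat_a.mean(axis=1), feat_b.mean(axis=1)])
    # Hypothetical mixing weights; a real network would learn these
    rng = np.random.default_rng(0)
    w_a = rng.normal(scale=0.1, size=(feat_a.shape[0], desc.size))
    w_b = rng.normal(scale=0.1, size=(feat_b.shape[0], desc.size))
    # Gates in (0, 1) computed from the *joint* descriptor, so each
    # modality's rescaling depends on information from the other
    gate_a = sigmoid(w_a @ desc)
    gate_b = sigmoid(w_b @ desc)
    # Excite: rescale each channel of each modality
    return feat_a * gate_a[:, None], feat_b * gate_b[:, None]

acc = np.ones((4, 8))   # toy accelerometer features (channels x time)
gyr = np.ones((4, 8))   # toy gyroscope features
out_a, out_b = cross_modal_recalibrate(acc, gyr)
print(out_a.shape, out_b.shape)  # (4, 8) (4, 8)
```

The key design point, mirrored here, is that each modality's gates are computed from a descriptor that pools both modalities, which is what makes the interaction cross-modal rather than per-stream.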
Class-Balanced Focal Loss
- The researchers used a class-balanced (CB) focal loss to supervise the training of the CMI-Net, in order to tackle the class imbalance problem. CB focal loss re-weights the importance of different classes based on their frequency, thus countering the negative effects of imbalanced class distribution.
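The weighting scheme can be sketched as follows, combining the effective-number-of-samples weighting of Cui et al. (reference 25) with the focal loss of Lin et al. (reference 26). The hyperparameter values (`beta`, `gamma`) are illustrative defaults, not the values used in this study.

```python
import numpy as np

def cb_focal_loss(probs, label, samples_per_class, beta=0.999, gamma=2.0):
    """Class-balanced focal loss for a single example.

    probs: predicted per-class probabilities; label: true class index;
    samples_per_class: training-set counts per class.
    """
    n = np.asarray(samples_per_class, dtype=float)
    # Effective number of samples per class: (1 - beta^n) / (1 - beta)
    effective_num = (1.0 - beta ** n) / (1.0 - beta)
    # Rare classes (small n) get proportionally larger weights
    weights = 1.0 / effective_num
    weights = weights / weights.sum() * len(n)
    # Focal term down-weights well-classified (easy) examples
    p_t = probs[label]
    focal = -((1.0 - p_t) ** gamma) * np.log(p_t)
    return weights[label] * focal
```

With equal predicted confidence, an example from a class seen 100 times in training incurs a larger loss than one from a class seen 1000 times, which is exactly the re-weighting behaviour described above.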
Experimental Setup
- The team collected motion data from six neck-attached inertial measurement units on six horses. The gathered data was used to train and verify the CMI-Net, using a well-known validation scheme called leave-one-out cross-validation.
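With six horses, leave-one-out cross-validation here means holding out each horse's data in turn for testing and training on the remaining five, so every fold tests generalisation to an unseen animal. A minimal sketch of the split generation (horse identifiers are placeholders):

```python
def leave_one_out_splits(subjects):
    """Yield (train, test) pairs, holding out each subject once."""
    for i in range(len(subjects)):
        train = subjects[:i] + subjects[i + 1:]
        test = [subjects[i]]
        yield train, test

horses = ["H1", "H2", "H3", "H4", "H5", "H6"]  # placeholder IDs
splits = list(leave_one_out_splits(horses))
# 6 folds; each horse appears exactly once as the held-out test subject
```

Splitting by subject rather than by random sample avoids leaking one animal's highly correlated sensor windows into both train and test sets.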
Results and Conclusion
- Compared with existing algorithms, CMI-Net achieved improved results, with high precision (79.74%), recall (79.57%), F1-score (79.02%), and overall accuracy (93.37%).
- The adoption of CB focal loss further improved CMI-Net's performance, increasing precision, recall, and F1-score by 2.76%, 4.16%, and 3.92%, respectively.
- In sum, the study concludes that the proposed CMI-Net and the use of CB focal loss effectively improved the classification of equine activities from imbalanced multi-modal sensor data.
Cite This Article
APA
Mao A, Huang E, Gan H, Parkes RSV, Xu W, Liu K.
(2021).
Cross-Modality Interaction Network for Equine Activity Recognition Using Imbalanced Multi-Modal Data.
Sensors (Basel), 21(17), 5818.
https://doi.org/10.3390/s21175818
Researcher Affiliations
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China.
- Department of Computer Science, City University of Hong Kong, Hong Kong, China.
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China.
- College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China.
- Department of Veterinary Clinical Sciences, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China.
- Centre for Companion Animal Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China.
- Department of Computer Science, City University of Hong Kong, Hong Kong, China.
- Department of Infectious Diseases and Public Health, Jockey Club College of Veterinary Medicine and Life Sciences, City University of Hong Kong, Hong Kong, China.
- Animal Health Research Centre, Chengdu Research Institute, City University of Hong Kong, Chengdu 610000, China.
MeSH Terms
- Algorithms
- Animals
- Horses
- Neural Networks, Computer
Grant Funding
- 9610450 / City University of Hong Kong
Conflict of Interest Statement
The authors declare no conflict of interest.
References
This article includes 46 references
- Eerdekens A, Deruyck M, Fontaine J, Martens L, De Poorter E, Plets D, Joseph W. A framework for energy-efficient equine activity recognition with leg accelerometers. Comput. Electron. Agric. 2021;183:106020.
- Parkes RSV, Weller R, Pfau T, Witte TH. The Effect of Training on Stride Duration in a Cohort of Two-Year-Old and Three-Year-Old Thoroughbred Racehorses. Animals (Basel) 2019 Jul 22;9(7).
- van Weeren PR, Pfau T, Rhodin M, Roepstorff L, Serra Bragança F, Weishaupt MA. Do we have to redefine lameness in the era of quantitative gait analysis? Equine Vet J 2017 Sep;49(5):567-569.
- Bosch S, Serra Bragança F, Marin-Perianu M, Marin-Perianu R, van der Zwaag BJ, Voskamp J, Back W, van Weeren R, Havinga P. EquiMoves: A Wireless Networked Inertial Measurement System for Objective Examination of Horse Gait. Sensors (Basel) 2018 Mar 13;18(3).
- Astill J, Dara RA, Fraser EDG, Roberts B, Sharif S. Smart poultry management: Smart sensors, big data, and the internet of things. Comput. Electron. Agric. 2020;170:105291.
- Rueß D, Rueß J, Hümmer C, Deckers N, Migal V, Kienapfel K, Wieckert A, Barnewitz D, Reulke R. Equine Welfare Assessment: Horse Motion Evaluation and Comparison to Manual Pain Measurements. Proceedings of the Pacific-Rim Symposium on Image and Video Technology, PSIVT 2019 Sydney, Australia. 18–22 November 2019; pp. 156–169.
- Kamminga JW, Meratnia N, Havinga PJM. Dataset: Horse Movement Data and Analysis of its Potential for Activity Recognition. Proceedings of the 2nd Workshop on Data Acquisition to Analysis, DATA 2019 Prague, Czech Republic. 26–28 July 2019; pp. 22–25.
- Kumpulainen P, Cardó AV, Somppi S, Törnqvist H, Väätäjä H, Majaranta P, Gizatdinova Y, Hoog Antink C, Surakka V, Kujala MV. Dog behaviour classification with movement sensors placed on the harness and the collar. Appl. Anim. Behav. Sci. 2021;241:105393.
- Tran DN, Nguyen TN, Khanh PCP, Trana DT. An IoT-based Design Using Accelerometers in Animal Behavior Recognition Systems. IEEE Sens. J. 2021.
- Maisonpierre IN, Sutton MA, Harris P, Menzies-Gow N, Weller R, Pfau T. Accelerometer activity tracking in horses and the effect of pasture management on time budget. Equine Vet J 2019 Nov;51(6):840-845.
- Nweke HF, Teh YW, Al-garadi MA, Alo UR. Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: State of the art and research challenges. Expert Syst. Appl. 2018;105:233–261.
- Noorbin SFH, Layeghy S, Kusy B, Jurdak R, Bishop-hurley G, Portmann M. Deep Learning-based Cattle Activity Classification Using Joint Time-frequency Data Representation. Comput. Electron. Agric. 2020;187:106241.
- Peng Y, Kondo N, Fujiura T, Suzuki T, Ouma S, Wulandari, Yoshioka H, Itoyama E. Dam behavior patterns in Japanese black beef cattle prior to calving: Automated detection using LSTM-RNN. Comput. Electron. Agric. 2020;169:105178.
- Bocaj E, Uzunidis D, Kasnesis P, Patrikakis CZ. On the Benefits of Deep Convolutional Neural Networks on Animal Activity Recognition. Proceedings of the 2020 International Conference on Smart Systems and Technologies (SST) Osijek, Croatia. 14–16 October 2020; pp. 83–88.
- Eerdekens A, Deruyck M, Fontaine J, Martens L, de Poorter E, Plets D, Joseph W. Resampling and Data Augmentation for Equines’ Behaviour Classification Based on Wearable Sensor Accelerometer Data Using a Convolutional Neural Network. Proceedings of the 2020 International Conference on Omni-layer Intelligent Systems (COINS) Barcelona, Spain. 31 August–2 September 2020; pp. 1–6.
- Chambers RD, Yoder NC, Carson AB, Junge C, Allen DE, Prescott LM, Bradley S, Wymore G, Lloyd K, Lyle S. Deep Learning Classification of Canine Behavior Using a Single Collar-Mounted Accelerometer: Real-World Validation. Animals (Basel) 2021 May 25;11(6).
- Liu N, Zhang N, Han J. Learning Selective Self-Mutual Attention for RGB-D Saliency Detection. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020 14–19 June 2020; pp. 13753–13762.
- Ha S, Choi S. Convolutional neural networks for human activity recognition using multiple accelerometer and gyroscope sensors. Proceedings of the 2016 International Joint Conference on Neural Networks (IJCNN) Vancouver, BC, Canada. 24–29 July 2016; pp. 381–388.
- Mustaqeem, Kwon S. MLT-DNet: Speech emotion recognition using 1D dilated CNN based on multi-learning trick approach. Expert Syst. Appl. 2021;167:114177.
- Mustaqeem, Kwon S. Optimal feature selection based speech emotion recognition using two-stream deep convolutional neural network. Int. J. Intell. Syst. 2021;36:5116–5135.
- Xu X, Li W, Duan Q. Transfer learning and SE-ResNet152 networks-based for small-scale unbalanced fish species identification. Comput. Electron. Agric. 2021;180:105878.
- Zhang S, Li Z, Yan S, He X, Sun J. Distribution Alignment: A Unified Framework for Long-tail Visual Recognition. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021 19–25 June 2021; pp. 2361–2370.
- Tan J, Wang C, Li B, Li Q, Ouyang W, Yin C, Yan J. Equalization loss for long-tailed object recognition. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020 14–19 June 2020; pp. 11659–11668.
- Khan SH, Hayat M, Bennamoun M, Sohel FA, Togneri R. Cost-Sensitive Learning of Deep Feature Representations From Imbalanced Data.. IEEE Trans Neural Netw Learn Syst 2018 Aug;29(8):3573-3587.
- Cui Y, Jia M, Lin TY, Song Y, Belongie S. Class-balanced loss based on effective number of samples. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019 Long Beach, CA, USA. 16–20 June 2019; pp. 9260–9269.
- Lin TY, Goyal P, Girshick R, He K, Dollar P. Focal Loss for Dense Object Detection. IEEE Trans Pattern Anal Mach Intell 2020 Feb;42(2):318-327.
- Wang T, Zhu Y, Zhao C, Zeng W, Wang J, Tang M. Adaptive Class Suppression Loss for Long-Tail Object Detection. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021 19–25 June 2021; pp. 3103–3112.
- Mao AX, Huang ED, Xu WT, Liu K. Cross-modality Interaction Network for Equine Activity Recognition Using Time-Series Motion Data. Proceedings of the 2021 International Symposium on Animal Environment and Welfare (ISAEW) Chongqing, China. 20–23 October 2021; in press.
- Zhang Z, Lin Z, Xu J, Jin WD, Lu SP, Fan DP. Bilateral Attention Network for RGB-D Salient Object Detection. IEEE Trans Image Process 2021;30:1949-1961.
- Woo S, Park J, Lee JY, Kweon IS. CBAM: Convolutional block attention module. Proceedings of the European Conference on Computer Vision, ECCV 2018 Munich, Germany. 8–14 September 2018; pp. 3–19.
- Mustaqeem, Kwon S. Att-Net: Enhanced emotion recognition system using lightweight self-attention module. Appl. Soft Comput. 2021;102:107101.
- Kamminga JW, Janßen LM, Meratnia N, Havinga PJM. Horsing around—A dataset comprising horse movement. Data 2019;4:131.
- He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2016 Las Vegas, NV, USA. 27–30 June 2016; pp. 770–778.
- Kamminga JW, Le DV, Havinga PJM. Towards deep unsupervised representation learning from accelerometer time series for animal activity recognition. Proceedings of the 6th Workshop on Mining and Learning from Time Series, MiLeTS 2020 San Diego, CA, USA. 24 August 2020.
- Nair V, Hinton GE. Rectified Linear Units Improve Restricted Boltzmann Machines. Proceedings of the 27th International Conference on Machine Learning, ICML 2010 Haifa, Israel. 21–24 June 2010.
- Joze HRV, Shaban A, Iuzzolino ML, Koishida K. MMTM: Multimodal transfer module for CNN fusion. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020 14–19 June 2020; pp. 13286–13296.
- Casella E, Khamesi AR, Silvestri S. A framework for the recognition of horse gaits through wearable devices. Pervasive Mob. Comput. 2020;67:101213.
- Zeng M, Nguyen LT, Yu B, Mengshoel OJ, Zhu J, Wu P, Zhang J. Convolutional Neural Networks for human activity recognition using mobile sensors. Proceedings of the 6th international conference on mobile computing, applications and services, MobiCASE 2014 Austin, TX, USA. 6–7 November 2014; pp. 197–205.
- Wolpert DH, Macready WG. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997;1:67–82.
- Wei J, Wang Q, Li Z, Wang S, Zhou SK, Cui S. Shallow Feature Matters for Weakly Supervised Object Localization. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021 19–25 June 2021; pp. 5993–6001.
- LeCun Y, Bengio Y, Hinton G. Deep learning. Nature 2015 May 28;521(7553):436-44.
- Van der Maaten L, Hinton G. Visualizing Data using t-SNE. J. Mach. Learn. Res. 2008;9:2579–2605.
- de Cocq P, van Weeren PR, Back W. Effects of girth, saddle and weight on movements of the horse. Equine Vet J 2004 Dec;36(8):758-63.
- Geng C, Huang SJ, Chen S. Recent Advances in Open Set Recognition: A Survey. IEEE Trans Pattern Anal Mach Intell 2021 Oct;43(10):3614-3631.
- Yoshihashi R, You S, Shao W, Iida M, Kawakami R, Naemura T. Classification-Reconstruction Learning for Open-Set Recognition. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019 Long Beach, CA, USA. 16–20 June 2019; pp. 4016–4025.
- Cardoso DO, Gama J, França FMG. Weightless neural networks for open set recognition. Mach. Learn. 2017;106:1547–1567.
Citations
This article has been cited 8 times.
- Aguilar-Lazcano CA, Espinosa-Curiel IE, Ríos-Martínez JA, Madera-Ramírez FA, Pérez-Espinosa H. Machine Learning-Based Sensor Data Fusion for Animal Monitoring: Scoping Review. Sensors (Basel) 2023 Jun 20;23(12).
- Mao A, Huang E, Gan H, Liu K. FedAAR: A Novel Federated Learning Framework for Animal Activity Recognition with Wearable Sensors. Animals (Basel) 2022 Aug 21;12(16).
- Mao A, Giraudet CSE, Liu K, De Almeida Nolasco I, Xie Z, Xie Z, Gao Y, Theobald J, Bhatta D, Stewart R, McElligott AG. Automated identification of chicken distress vocalizations using deep learning models. J R Soc Interface 2022 Jun;19(191):20210921.
- Wu Y, Liang W, Fang J, Zhou C, Sun X. VTC-Net: A Semantic Segmentation Network for Ore Particles Integrating Transformer and Convolutional Block Attention Module (CBAM). Sensors (Basel) 2026 Jan 24;26(3).
- Ding L, Zhang C, Yue Y, Yao C, Li Z, Hu Y, Yang B, Ma W, Yu L, Gao R, Li Q. Wearable Sensors-Based Intelligent Sensing and Application of Animal Behaviors: A Comprehensive Review. Sensors (Basel) 2025 Jul 21;25(14).
- Akbarein H, Taaghi MH, Mohebbi M, Soufizadeh P. Applications and Considerations of Artificial Intelligence in Veterinary Sciences: A Narrative Review. Vet Med Sci 2025 May;11(3):e70315.
- Lovatti JVR, Dijkinga KA, Aires JF, Garrido LFC, Costa JHC, Daros RR. Validation and interdevice reliability of a behavior monitoring collar to measure rumination, feeding activity, and idle time of lactating dairy cows. JDS Commun 2024 Nov;5(6):602-607.
- Ahn SH, Kim S, Jeong DH. Unsupervised Domain Adaptation for Mitigating Sensor Variability and Interspecies Heterogeneity in Animal Activity Recognition. Animals (Basel) 2023 Oct 20;13(20).