Sensors (Basel, Switzerland) 2021; 21(17): 5697; doi: 10.3390/s21175697

Improving Animal Monitoring Using Small Unmanned Aircraft Systems (sUAS) and Deep Learning Networks.

Abstract: In recent years, small unmanned aircraft systems (sUAS) have been used widely to monitor animals because of their customizability, ease of operation, ability to access difficult-to-navigate places, and potential to minimize disturbance to animals. Automatic identification and classification of animals in images acquired by a sUAS may solve critical problems such as monitoring large areas with high vehicle traffic for animals to prevent collisions, including animal-aircraft collisions at airports. In this research, we demonstrate automated identification of four animal species using deep learning classification models trained on sUAS-collected images. We used a sUAS mounted with visible-spectrum cameras to capture 1288 images of four different animal species: cattle (Bos taurus), horses (Equus caballus), Canada Geese (Branta canadensis), and white-tailed deer (Odocoileus virginianus). We chose these animals because they were readily accessible and easily identifiable within aerial imagery, and because white-tailed deer and Canada Geese are considered aviation hazards. A four-class classification problem involving these species was developed from the acquired data using deep learning neural networks. We studied the performance of two deep neural network models, convolutional neural networks (CNN) and deep residual networks (ResNet). Results indicate that the ResNet model with 18 layers, ResNet 18, may be an effective algorithm for classifying these animals while using a relatively small number of training samples. The best ResNet architecture produced a 99.18% overall accuracy (OA) in animal identification and a Kappa statistic of 0.98. The highest OA and Kappa produced by the CNN were 84.55% and 0.79, respectively. These findings suggest that ResNet is effective at distinguishing among the four species tested and shows promise for classifying larger datasets of more diverse animals.
Publication Date: 2021-08-24 | PubMed ID: 34502588 | PubMed Central: PMC8433839 | DOI: 10.3390/s21175697
The Equine Research Bank provides access to a large database of publicly available scientific literature. Inclusion in the Research Bank does not imply endorsement of study methods or findings by Mad Barn.
  • Journal Article

Summary

This research summary has been generated with artificial intelligence and may contain errors and omissions. Refer to the original study to confirm details provided.

This research paper discusses the integration of small unmanned aircraft systems (sUAS) with deep learning networks to improve animal monitoring. The study demonstrates the automated identification of four different animal species through images gathered by these systems. The results suggest that the ResNet deep learning model may be particularly effective for such classifications.

Use of Small Unmanned Aircraft Systems (sUAS)

  • The researchers used small unmanned aircraft systems (sUAS) to monitor animals. This technology was chosen because it is customizable, easy to operate, able to access hard-to-navigate areas, and has the potential to minimize disturbance to animals.
  • The sUAS were equipped with visible spectrum cameras to acquire images of four different animal species: cattle, horses, Canada geese, and white-tailed deer. These animals were chosen due to their availability and because Canada geese and white-tailed deer are known aviation hazards easily identifiable from the sky.

Using Deep Learning Networks for Animal Identification

  • The researchers used the captured images to train deep learning models for automated animal identification. The goal included preventing animal-vehicle collisions, such as those between animals and aircraft at airports, by improving large-area animal monitoring capabilities.
  • A four-class classification system with these four species was developed using deep learning neural networks.
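
A four-class setup like this can be sketched by mapping each species to an integer label and splitting the image paths into training and validation sets. The file names, flat layout, and 80/20 split below are illustrative assumptions for the sketch, not details taken from the paper.

```python
import random

# Map the four species studied in the paper to integer class indices.
CLASSES = ["cattle", "horse", "canada_goose", "white_tailed_deer"]
CLASS_TO_IDX = {name: i for i, name in enumerate(CLASSES)}

def make_dataset(paths_by_class):
    """Flatten {class_name: [image paths]} into (path, class_index) pairs."""
    samples = []
    for name, paths in paths_by_class.items():
        samples.extend((p, CLASS_TO_IDX[name]) for p in paths)
    return samples

def train_val_split(samples, val_frac=0.2, seed=42):
    """Shuffle deterministically and split into training/validation lists."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_frac)
    return shuffled[n_val:], shuffled[:n_val]

# Hypothetical file names; the study's actual image inventory differs.
paths = {name: [f"{name}_{k}.jpg" for k in range(10)] for name in CLASSES}
train, val = train_val_split(make_dataset(paths))
print(len(train), len(val))  # 32 8
```

Either network (CNN or ResNet 18) would then consume the `(path, class_index)` pairs through its usual image-loading pipeline.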

Comparison of Deep Neural Network Models

  • The researchers compared the performance of two deep neural network models, convolutional neural networks (CNN) and deep residual networks (ResNet).
  • The results showed that the ResNet model with 18 layers (ResNet 18) effectively classified animals, even with a relatively small number of training samples.
  • The best ResNet architecture produced a 99.18% overall accuracy (OA) in animal identification and a Kappa statistic of 0.98, while the CNN model's highest OA and Kappa were 84.55% and 0.79, respectively.
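
Both reported metrics can be reproduced from a classifier's confusion matrix: overall accuracy is the fraction of samples on the diagonal, and Cohen's Kappa corrects that agreement for chance using the row and column marginals. The matrix below is hypothetical (the study's per-class counts are not reproduced here); it is chosen only to land near the reported OA.

```python
def overall_accuracy(cm):
    """Fraction of samples on the diagonal of confusion matrix cm."""
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def cohens_kappa(cm):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    p_o = sum(cm[i][i] for i in range(n)) / total
    # Expected chance agreement from row/column marginals.
    p_e = sum(
        sum(cm[i]) * sum(cm[j][i] for j in range(n))
        for i in range(n)
    ) / (total * total)
    return (p_o - p_e) / (1 - p_e)

# Rows = true class, columns = predicted class.
# Classes: cattle, horse, Canada goose, white-tailed deer (hypothetical counts).
cm = [
    [60,  1,  0,  0],
    [ 1, 58,  0,  0],
    [ 0,  0, 62,  0],
    [ 0,  0,  0, 60],
]
print(round(overall_accuracy(cm), 4))
print(round(cohens_kappa(cm), 4))
```

With balanced classes, Kappa tracks OA closely, which is why the paper's 99.18% OA pairs with a Kappa near 0.99.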

Conclusions and Potential Future Applications

  • The study concluded that the ResNet deep learning model shows promising potential for classifying a wide range of animal species due to its high performance in this research.
  • This technology could be developed further to identify and track a wider variety of animals, further improving large-area animal monitoring and potentially reducing dangerous collisions.

Cite This Article

APA
Zhou M, Elmore JA, Samiappan S, Evans KO, Pfeiffer MB, Blackwell BF, Iglay RB. (2021). Improving Animal Monitoring Using Small Unmanned Aircraft Systems (sUAS) and Deep Learning Networks. Sensors (Basel), 21(17), 5697. https://doi.org/10.3390/s21175697

Publication

ISSN: 1424-8220
NlmUniqueID: 101204366
Country: Switzerland
Language: English
Volume: 21
Issue: 17
PII: 5697

Researcher Affiliations

Zhou, Meilun
  • Geosystems Research Institute, Mississippi State University, Oxford, MS 39762, USA.
Elmore, Jared A
  • Department of Wildlife, Fisheries and Aquaculture, Mississippi State University, Box 9690, Oxford, MS 39762, USA.
Samiappan, Sathishkumar
  • Geosystems Research Institute, Mississippi State University, Oxford, MS 39762, USA.
Evans, Kristine O
  • Department of Wildlife, Fisheries and Aquaculture, Mississippi State University, Box 9690, Oxford, MS 39762, USA.
Pfeiffer, Morgan B
  • U.S. Department of Agriculture, Animal and Plant Health Inspection Service, Wildlife Services, National Wildlife Research Center, Ohio Field Station, Sandusky, OH 44870, USA.
Blackwell, Bradley F
  • U.S. Department of Agriculture, Animal and Plant Health Inspection Service, Wildlife Services, National Wildlife Research Center, Ohio Field Station, Sandusky, OH 44870, USA.
Iglay, Raymond B
  • Department of Wildlife, Fisheries and Aquaculture, Mississippi State University, Box 9690, Oxford, MS 39762, USA.

MeSH Terms

  • Aircraft
  • Algorithms
  • Animals
  • Cattle
  • Deep Learning
  • Deer
  • Horses
  • Neural Networks, Computer

Grant Funding

  • AP20WSNWRC00C010 and AP20WSNWRC00C026 / U.S. Department of Agriculture

Conflict of Interest Statement

The authors declare no conflict of interest.

References

This article includes 58 references
  1. Dolbeer R.A., Begier M.J., Miller P.R., Weller J.R., Anderson A.L.. Wildlife Strikes to Civil Aircraft in the United States, 1990–2019.
  2. Pfeiffer MB, Blackwell BF, DeVault TL. Quantification of avian hazards to military aircraft and implications for wildlife management.. PLoS One 2018;13(11):e0206599.
  3. DeVault T.L., Blackwell B.F., Seamans T.W., Begier M.J., Kougher J.D., Washburn J.E., Miller P.R., Dolbeer R.A.. Estimating interspecific economic risk of bird strikes with aircraft. Wildl. Soc. Bull. 2018;42:94–101.
    doi: 10.1002/wsb.859
  4. Washburn B.E., Pullins C.K., Guerrant T.L., Martinelli G.J., Beckerman S.F.. Comparing Management Programs to Reduce Red–tailed Hawk Collisions with Aircraft. Wildl. Soc. Bull. 2021;45:237–243.
    doi: 10.1002/wsb.1177
  5. DeVault T., Blackwell B., Belant J., Begier M.. Wildlife at Airports. Wildlife Damage Management Technical Series.
  6. Blackwell B., Schmidt P., Martin J.. Avian Survey Methods for Use at Airports. USDA National Wildlife Research Center—Staff Publications.
  7. Hubbard S., Pak A., Gu Y., Jin Y.. UAS to support airport safety and operations: Opportunities and challenges. J. Unmanned Veh. Syst. 2018;6:1–17.
  8. Anderson K., Gaston K.J.. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013;11:138–146.
  9. Christie K.S., Gilbert S.L., Brown C.L., Hatfield M., Hanson L.. Unmanned aircraft systems in wildlife research: Current and future applications of a transformative technology. Front. Ecol. Environ. 2016;14:241–251.
    doi: 10.1002/fee.1281
  10. Hodgson J.C., Mott R., Baylis S.M., Pham T.T., Wotherspoon S., Kilpatrick A.D., Raja Segaran R., Reid I., Terauds A., Koh L.P.. Drones count wildlife more accurately and precisely than humans. Methods Ecol. Evol. 2018;9:1160–1167.
  11. Linchant J., Lisein J., Semeki J., Lejeune P., Vermeulen C.. Are unmanned aircraft systems (UASs) the future of wildlife monitoring? A review of accomplishments and challenges. Mammal. Rev. 2015;45:239–252.
    doi: 10.1111/mam.12046
  12. Frederick P.C., Hylton B., Heath J.A., Ruane M.. Accuracy and variation in estimates of large numbers of birds by individual observers using an aerial survey simulator. J. Field Ornithol. 2003.
  13. Sasse D.B.. Job-Related Mortality of Wildlife Workers in the United States. Wildl. Soc. Bull. 2003;31:1015–1020.
  14. Buckland S.T., Burt M.L., Rexstad E.A., Mellor M., Williams A.E., Woodward R.. Aerial surveys of seabirds: The advent of digital methods. J. Appl. Ecol. 2012;49:960–967.
  15. Chabot D., Bird D.M.. Wildlife research and management methods in the 21st century: Where do unmanned aircraft fit in?. J. Unmanned Veh. Syst. 2015;3:137–155.
    doi: 10.1139/juvs-2015-0021
  16. Pimm SL, Alibhai S, Bergl R, Dehgan A, Giri C, Jewell Z, Joppa L, Kays R, Loarie S. Emerging Technologies to Conserve Biodiversity.. Trends Ecol Evol 2015 Nov;30(11):685-696.
    doi: 10.1016/j.tree.2015.08.008; pubmed: 26437636
  17. Hodgson JC, Baylis SM, Mott R, Herrod A, Clarke RH. Precision wildlife monitoring using unmanned aerial vehicles.. Sci Rep 2016 Mar 17;6:22574.
    doi: 10.1038/srep22574; pmc: PMC4795075; pubmed: 26986721
  18. Weinstein BG. A computer vision for animal ecology.. J Anim Ecol 2018 May;87(3):533-545.
    doi: 10.1111/1365-2656.12780; pubmed: 29111567
  19. Reintsma K.M., McGowan P.C., Callahan C., Collier T., Gray D., Sullivan J.D., Prosser D.J.. Preliminary Evaluation of Behavioral Response of Nesting Waterbirds to Small Unmanned Aircraft Flight. Cowa. 2018;41:326–331.
    doi: 10.1675/063.041.0314
  20. Scholten C.N., Kamphuis A.J., Vredevoogd K.J., Lee-Strydhorst K.G., Atma J.L., Shea C.B., Lamberg O.N., Proppe D.S.. Real-time thermal imagery from an unmanned aerial vehicle can locate ground nests of a grassland songbird at rates similar to traditional methods. Biol. Conserv. 2019;233:241–246.
  21. Lyons M.B., Brandis K.J., Murray N.J., Wilshire J.H., McCann J.A., Kingsford R.T., Callaghan C.T.. Monitoring large and complex wildlife aggregations with drones. Methods Ecol. Evol. 2019;10:1024–1035.
    doi: 10.1111/2041-210X.13194
  22. Nguyen H., Maclagan S.J., Nguyen T.D., Nguyen T., Flemons P., Andrews K., Ritchie E.G., Phung D.. Animal Recognition and Identification with Deep Convolutional Neural Networks for Automated Wildlife Monitoring. Proceedings of the 2017 IEEE International Conference on Data Science and Advanced Analytics (DSAA) Tokyo, Japan. 19–21 October 2017; pp. 40–49.
  23. Tabak M.A., Norouzzadeh M.S., Wolfson D.W., Sweeney S.J., Vercauteren K.C., Snow N.P., Halseth J.M., Di Salvo P.A., Lewis J.S., White M.D.. Machine learning to classify animal species in camera trap images: Applications in ecology. Methods Ecol. Evol. 2019;10:585–590.
    doi: 10.1111/2041-210X.13120
  24. Rush GP, Clarke LE, Stone M, Wood MJ. Can drones count gulls? Minimal disturbance and semiautomated image processing with an unmanned aerial vehicle for colony-nesting seabirds.. Ecol Evol 2018 Dec;8(24):12322-12334.
    doi: 10.1002/ece3.4495; pmc: PMC6308878; pubmed: 30619548
  25. Chabot D., Francis C.M.. Computer-automated bird detection and counts in high-resolution aerial images: A review. J. Field Ornithol. 2016;87:343–359.
    doi: 10.1111/jofo.12171
  26. Hong SJ, Han Y, Kim SY, Lee AY, Kim G. Application of Deep-Learning Methods to Bird Detection Using Unmanned Aerial Vehicle Imagery.. Sensors (Basel) 2019 Apr 6;19(7).
    doi: 10.3390/s19071651; pmc: PMC6479331; pubmed: 30959913
  27. Ratcliffe N., Guihen D., Robst J., Crofts S., Stanworth A., Enderlein P.. A protocol for the aerial survey of penguin colonies using UAVs. J. Unmanned Veh. Syst. 2015;3:95–101.
    doi: 10.1139/juvs-2015-0006
  28. Hayes M.C., Gray P.C., Harris G., Sedgwick W.C., Crawford V.D., Chazal N., Crofts S., Johnston D.W.. Drones and deep learning produce accurate and efficient monitoring of large-scale seabird colonies. Ornithol. Appl. 2021;123.
    doi: 10.1093/ornithapp/duab022
  29. Manohar N., Sharath Kumar Y.H., Kumar G.H.. Supervised and unsupervised learning in animal classification. Proceedings of the 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI) Jaipur, India. 21–24 September 2016; pp. 156–161.
  30. Chaganti S.Y., Nanda I., Pandi K.R., Prudhvith T.G.N.R.S.N., Kumar N.. Image Classification using SVM and CNN. Proceedings of the 2020 International Conference on Computer Science, Engineering and Applications (ICCSEA) Sydney, Australia. 19–20 December 2020; pp. 1–5.
  31. He K., Zhang X., Ren S., Sun J.. Deep Residual Learning for Image Recognition. 2015.
  32. Han X., Jin R.. A Small Sample Image Recognition Method Based on ResNet and Transfer Learning. Proceedings of the 2020 5th International Conference on Computational Intelligence and Applications (ICCIA) Beijing, China. 19–21 June 2020; pp. 76–81.
  33. Marcon dos Santos G.A., Barnes Z., Lo E., Ritoper B., Nishizaki L., Tejeda X., Ke A., Lin H., Schurgers C., Lin A.. Small Unmanned Aerial Vehicle System for Wildlife Radio Collar Tracking. Proceedings of the 2014 IEEE 11th International Conference on Mobile Ad Hoc and Sensor Systems Philadelphia, PA, USA. 28–30 October 2014; 761p.
  34. McEvoy JF, Hall GP, McDonald PG. Evaluation of unmanned aerial vehicle shape, flight path and camera type for waterfowl surveys: disturbance effects and species recognition.. PeerJ 2016;4:e1831.
    doi: 10.7717/peerj.1831; pmc: PMC4806640; pubmed: 27020132
  35. Bennitt E, Bartlam-Brooks HLA, Hubel TY, Wilson AM. Terrestrial mammalian wildlife responses to Unmanned Aerial Systems approaches.. Sci Rep 2019 Feb 14;9(1):2142.
    doi: 10.1038/s41598-019-38610-x; pmc: PMC6375938; pubmed: 30765800
  36. Steele W.K., Weston M.A., Steele W.K., Weston M.A.. The assemblage of birds struck by aircraft differs among nearby airports in the same bioregion. Wildl Res. 2021;48:422–425.
    doi: 10.1071/WR20127
  37. Quenouille M.H.. Notes on Bias in Estimation. Biometrika 1956;43:353–360.
    doi: 10.1093/biomet/43.3-4.353
  38. Training a Classifier—PyTorch Tutorials 1.9.0+cu102 documentation.
  39. Wu H., Gu X.. Max-Pooling Dropout for Regularization of Convolutional Neural Networks. arXiv 2015.
  40. Liu T., Fang S., Zhao Y., Wang P., Zhang J.. Implementation of Training Convolutional Neural Networks. arXiv 2015.
  41. Zualkernan I.A., Dhou S., Judas J., Sajun A.R., Gomez B.R., Hussain L.A., Sakhnini D.. Towards an IoT-based Deep Learning Architecture for Camera Trap Image Classification. Proceedings of the 2020 IEEE Global Conference on Artificial Intelligence and Internet of Things (GCAIoT) Dubai, United Arab Emirates. 12–16 December 2020; pp. 1–6.
  42. Torchvision.transforms—Torchvision 0.10.0 documentation.
  43. Amari S.. Backpropagation and stochastic gradient descent method. Neurocomputing 1993;5:185–196.
  44. Attoh-Okine N.O.. Analysis of learning rate and momentum term in backpropagation neural network algorithm trained to predict pavement performance. Adv. Eng. Softw. 1999;30:291–302.
  45. Wilson D.R., Martinez T.R.. The need for small learning rates on large problems. Proceedings of the IJCNN'01 International Joint Conference on Neural Networks (Cat. No.01CH37222) Washington, DC, USA. 15–19 July 2001; pp. 115–119.
  46. Goodfellow I., Bengio Y., Courville A.. Deep Learning. MIT Press; Cambridge, MA, USA: 2016. 800p Adaptive Computation and Machine Learning Series.
  47. Cho K., Raiko T., Ilin A.. Enhanced gradient and adaptive learning rate for training restricted boltzmann machines. Proceedings of the 28th International Conference on International Conference on Machine Learning, ICML’11 Omnipress; Madison, WI, USA: 2011; pp. 105–112.
  48. Torch.optim—PyTorch 1.9.0 documentation.
  49. Viera AJ, Garrett JM. Understanding interobserver agreement: the kappa statistic.. Fam Med 2005 May;37(5):360-3.
    pubmed: 15883903
  50. Keshari R., Vatsa M., Singh R., Noore A.. Learning Structure and Strength of CNN Filters for Small Sample Size Training. arXiv 2018.
  51. Liu L., Jiang H., He P., Chen W., Liu X., Gao J., Han J.. On the Variance of the Adaptive Learning Rate and Beyond. arXiv 2020.
  52. Li Y., Wei C., Ma T.. Towards Explaining the Regularization Effect of Initial Large Learning Rate in Training Neural Networks. arXiv 2020.
  53. Mikołajczyk A., Grochowski M.. Data augmentation for improving deep learning in image classification problem. Proceedings of the 2018 International Interdisciplinary PhD Workshop (IIPhDW) Świnouście, Poland. 9–12 May 2018; pp. 117–122.
  54. Perez L., Wang J.. The Effectiveness of Data Augmentation in Image Classification using Deep Learning. arXiv 2017.
  55. Thanapol P., Lavangnananda K., Bouvry P., Pinel F., Leprévost F.. Reducing Overfitting and Improving Generalization in Training Convolutional Neural Network (CNN) under Limited Sample Sizes in Image Recognition. Proceedings of the 2020-5th International Conference on Information Technology (InCIT) Chonburi, Thailand. 21–22 October 2020; pp. 300–305.
  56. Brownlee J.. Difference Between a Batch and an Epoch in a Neural Network. Machine Learning Mastery 2018.
  57. Lawrence S., Giles C.L.. Overfitting and neural networks: Conjugate gradient and backpropagation. Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), Neural Computing: New Challenges and Perspectives for the New Millennium, Como, Italy. 27 July 2000; pp. 114–119.
  58. Seymour AC, Dale J, Hammill M, Halpin PN, Johnston DW. Automated detection and enumeration of marine wildlife using unmanned aircraft systems (UAS) and thermal imagery.. Sci Rep 2017 Mar 24;7:45127.
    doi: 10.1038/srep45127; pmc: PMC5364474; pubmed: 28338047

Citations

This article has been cited 8 times.
  1. Mancuso D, Castagnolo G, Porto SMC. Cow Behavioural Activities in Extensive Farms: Challenges of Adopting Automatic Monitoring Systems. Sensors (Basel) 2023 Apr 8;23(8).
    doi: 10.3390/s23083828; pubmed: 37112171
  2. Besson M, Alison J, Bjerge K, Gorochowski TE, Høye TT, Jucker T, Mann HMR, Clements CF. Towards the fully automated monitoring of ecological communities. Ecol Lett 2022 Dec;25(12):2753-2775.
    doi: 10.1111/ele.14123; pubmed: 36264848
  3. Ascagorta O, Pollicelli MD, Iaconis FR, Eder E, Vázquez-Sano M, Delrieux C. Large-Scale Coastal Marine Wildlife Monitoring with Aerial Imagery. J Imaging 2025 Mar 24;11(4).
    doi: 10.3390/jimaging11040094; pubmed: 40278010
  4. Gao G, Ma Y, Wang J, Li Z, Wang Y, Bai H. CFR-YOLO: A Novel Cow Face Detection Network Based on YOLOv7 Improvement. Sensors (Basel) 2025 Feb 11;25(4).
    doi: 10.3390/s25041084; pubmed: 40006313
  5. Elmore JA, Schultz EA, Jones LR, Evans KO, Samiappan S, Pfeiffer MB, Blackwell BF, Iglay RB. Evidence on the efficacy of small unoccupied aircraft systems (UAS) as a survey tool for North American terrestrial, vertebrate animals: a systematic map. Environ Evid 2023 Feb 13;12(1):3.
    doi: 10.1186/s13750-022-00294-8; pubmed: 39294790
  6. Samiappan S, Krishnan BS, Dehart D, Jones LR, Elmore JA, Evans KO, Iglay RB. Aerial Wildlife Image Repository for animal monitoring with drones in the age of artificial intelligence. Database (Oxford) 2024 Jul 23;2024.
    doi: 10.1093/database/baae070; pubmed: 39043628
  7. Luo W, Zhang G, Shao Q, Zhao Y, Wang D, Zhang X, Liu K, Li X, Liu J, Wang P, Li L, Wang G, Wang F, Yu Z. An efficient visual servo tracker for herd monitoring by UAV. Sci Rep 2024 May 7;14(1):10463.
    doi: 10.1038/s41598-024-60445-4; pubmed: 38714785
  8. Krishnan BS, Jones LR, Elmore JA, Samiappan S, Evans KO, Pfeiffer MB, Blackwell BF, Iglay RB. Fusion of visible and thermal images improves automated detection and classification of animals for drone surveys. Sci Rep 2023 Jun 27;13(1):10385.
    doi: 10.1038/s41598-023-37295-7; pubmed: 37369669