Acoustic sensors for detecting cow behaviour
Acoustic technologies provide a non-invasive method to generate information about the health, welfare, and environmental impact of livestock. This study demonstrated that rear leg attached acoustic sensors can be used to differentiate between seven different acoustic classes, with six based on cow behaviours (Grazing, Breathing, Walking, Lying Down, Dung, Vocalization, Other) that were obtained from more than 150 cows under grazing conditions.
Main Author: | P.R. Shorten |
Format: | Article |
Language: | English |
Published: | Elsevier, 2023-02-01 |
Series: | Smart Agricultural Technology |
Subjects: | Cow; Acoustic; Machine learning; Neural network; Behaviour; Livestock |
Online Access: | http://www.sciencedirect.com/science/article/pii/S2772375522000363 |
_version_ | 1811255056958226432 |
author | P.R. Shorten |
author_facet | P.R. Shorten |
author_sort | P.R. Shorten |
collection | DOAJ |
description | Acoustic technologies provide a non-invasive method to generate information about the health, welfare, and environmental impact of livestock. This study demonstrated that rear leg attached acoustic sensors can be used to differentiate between seven different acoustic classes, with six based on cow behaviours (Grazing, Breathing, Walking, Lying Down, Dung, Vocalization, Other) that were obtained from more than 150 cows under grazing conditions. The overall accuracy of the ensemble classification model was 96.2% based on a total of 700 acoustic recordings. The performance of the models for the duration, frequency (number of events per 10 seconds), or period (average time between events) of the six animal behaviours had an average coefficient of determination of R2 = 0.93. The model for respiration rate (Breathing class) while sleeping (R2 = 0.99) provides an alternative to more invasive differential pressure and thermistor-based methods. The model performance for bite rate (R2 = 0.91) was consistent with results obtained previously with collar and forehead attached microphones. There was ten-fold variation in the duration of dung events and the model for the duration of dung events (R2 = 0.96) allows for estimation of the total amount of dung deposited per day. The acoustic technology also provides (R2 = 0.91) an alternative to accelerometer-based methods for stepping frequency. Lying down events were characterised by scratching sounds generated by the microphone rubbing against the pasture that provided good prediction of duration of the time to lie down (R2 = 0.86). Models for vocalization duration (R2 = 0.92) and classification of the vocalization class (sensitivity 0.99; precision 0.95) demonstrate the feasibility of acoustic-based determination of vocalization traits, which provide information on the welfare and state of the animal. |
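The abstract defines the behavioural summary statistics it reports: frequency as the number of events per 10 seconds, period as the average time between events, and model fit via the coefficient of determination R2. A minimal sketch of these quantities is below; the function names, event timestamps, and data are illustrative assumptions, not taken from the paper.

```python
# Sketch of the summary statistics named in the abstract: event frequency
# (events per 10-second window), period (mean inter-event interval), and
# the coefficient of determination R^2. All names and data are illustrative.

def events_per_window(timestamps, duration, window=10.0):
    """Number of events per `window` seconds in a recording of length `duration` (s)."""
    return len(timestamps) * window / duration

def mean_period(timestamps):
    """Average time between consecutive events, in seconds."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(gaps) / len(gaps)

def r_squared(observed, predicted):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

# Example: four hypothetical step events in a 20-second recording.
steps = [2.0, 7.0, 12.0, 17.0]
print(events_per_window(steps, duration=20.0))  # 2.0 events per 10 s
print(mean_period(steps))                       # 5.0 s between steps
```

This is only a reading aid for the reported metrics; the study's actual pipeline (acoustic feature extraction and the ensemble classifier) is not described in enough detail here to reproduce.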
first_indexed | 2024-04-12T17:18:04Z |
format | Article |
id | doaj.art-ab09be2e36464ed4bf5d21cc91792d71 |
institution | Directory Open Access Journal |
issn | 2772-3755 |
language | English |
last_indexed | 2024-04-12T17:18:04Z |
publishDate | 2023-02-01 |
publisher | Elsevier |
record_format | Article |
series | Smart Agricultural Technology |
spelling | doaj.art-ab09be2e36464ed4bf5d21cc91792d71 | 2022-12-22T03:23:34Z | eng | Elsevier | Smart Agricultural Technology | 2772-3755 | 2023-02-01 | Volume 3, Article 100071 | Acoustic sensors for detecting cow behaviour | P.R. Shorten, AgResearch Limited, Ruakura Research Centre, Private Bag 3123, Hamilton, New Zealand | http://www.sciencedirect.com/science/article/pii/S2772375522000363 | Cow; Acoustic; Machine learning; Neural network; Behaviour; Livestock |
spellingShingle | P.R. Shorten | Acoustic sensors for detecting cow behaviour | Smart Agricultural Technology | Cow; Acoustic; Machine learning; Neural network; Behaviour; Livestock |
title | Acoustic sensors for detecting cow behaviour |
title_full | Acoustic sensors for detecting cow behaviour |
title_fullStr | Acoustic sensors for detecting cow behaviour |
title_full_unstemmed | Acoustic sensors for detecting cow behaviour |
title_short | Acoustic sensors for detecting cow behaviour |
title_sort | acoustic sensors for detecting cow behaviour |
topic | Cow; Acoustic; Machine learning; Neural network; Behaviour; Livestock
url | http://www.sciencedirect.com/science/article/pii/S2772375522000363 |
work_keys_str_mv | AT prshorten acousticsensorsfordetectingcowbehaviour |