Sex identification of ducklings based on acoustic signals.
Abstract
Sex identification of ducklings is a critical step in the poultry farming industry, and accurate sex identification is beneficial for precise breeding and cost savings. In this study, a method for identifying the sex of ducklings based on acoustic signals was proposed. First, duckling vocalizations were collected, and an improved spectral subtraction method combined with high-pass filtering was applied to reduce the influence of noise. Then, duckling vocalizations were automatically detected using a double-threshold endpoint detection method with 3 parameters: short-time energy (STE), short-time zero-crossing rate (ZCR), and duration (D). Following the extraction of Mel-spectrogram features from the detected vocalizations, an improved Res2Net deep learning algorithm was used for sex classification. The Squeeze-and-Excitation (SE) attention mechanism and the Ghost module were introduced into the Res2Net bottleneck to improve model accuracy and reduce the number of parameters. The ablation experiment results showed that the SE attention mechanism improved model accuracy by 2.01%, while the Ghost module reduced the number of parameters by 7.26M and the FLOPs by 0.85G. Moreover, the proposed algorithm was compared with 5 state-of-the-art (SOTA) algorithms and achieved the best cost-effectiveness, with accuracy, recall, and specificity of 94.80%, 94.92%, and 94.69%, respectively, 18.91M parameters, and 3.46G FLOPs. Finally, a vocalization detection score and an average-confidence strategy were used to predict the sex of individual ducklings, and the accuracy of the proposed model reached 96.67%. In conclusion, the method proposed in this study can effectively identify the sex of ducklings and serve as a reference for automated duckling sex identification.

Copyright © 2024 The Authors. Published by Elsevier Inc. All rights reserved.
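The following is a minimal Python sketch of the vocalization-detection and feature-extraction steps described in the abstract: double-threshold endpoint detection on short-time energy (STE) and zero-crossing rate (ZCR) with a duration (D) check, followed by Mel-spectrogram extraction with librosa. All frame settings, threshold values, and function names below are illustrative assumptions, not the authors' published parameters.

import numpy as np
import librosa


def detect_calls(y, sr, frame_len=512, hop=256,
                 ste_high=0.02, ste_low=0.005, zcr_max=0.35, min_dur_s=0.05):
    """Return (start_s, end_s) segments that look like duckling calls."""
    # Frame-wise short-time energy (squared RMS) and zero-crossing rate.
    ste = librosa.feature.rms(y=y, frame_length=frame_len, hop_length=hop)[0] ** 2
    zcr = librosa.feature.zero_crossing_rate(y, frame_length=frame_len,
                                             hop_length=hop)[0]

    # Double-threshold logic: a segment starts when STE clears the high
    # threshold and extends while STE stays above the low threshold and ZCR
    # stays below a noise-like ceiling; too-short segments are discarded.
    voiced = ste > ste_high
    active = (ste > ste_low) & (zcr < zcr_max)

    segments, in_seg, start = [], False, 0
    for i in range(len(ste)):
        if not in_seg and voiced[i]:
            in_seg, start = True, i
            while start > 0 and active[start - 1]:   # walk the start backwards
                start -= 1
        elif in_seg and not active[i]:
            if (i - start) * hop / sr >= min_dur_s:  # duration (D) check
                segments.append((start * hop / sr, i * hop / sr))
            in_seg = False
    if in_seg and (len(ste) - start) * hop / sr >= min_dur_s:
        segments.append((start * hop / sr, len(ste) * hop / sr))
    return segments


def mel_features(y, sr, start_s, end_s, n_mels=64):
    """Log Mel-spectrogram of one detected vocalization segment."""
    seg = y[int(start_s * sr):int(end_s * sr)]
    mel = librosa.feature.melspectrogram(y=seg, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)

Below is a similarly hedged PyTorch sketch of the two bottleneck modifications named in the abstract: a Squeeze-and-Excitation (SE) channel-attention block and a Ghost module that produces part of its output channels with cheap depthwise convolutions. How the authors wire these into the Res2Net bottleneck is not specified in the abstract, so this only illustrates the building blocks themselves.

import math
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: global pooling, bottleneck MLP, channel re-weighting."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)


class GhostModule(nn.Module):
    """Ghost module: a thin primary convolution plus cheap depthwise 'ghost' maps."""
    def __init__(self, in_ch, out_ch, ratio=2, kernel=1, dw_kernel=3):
        super().__init__()
        self.out_ch = out_ch
        init_ch = math.ceil(out_ch / ratio)
        cheap_ch = init_ch * (ratio - 1)
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, init_ch, kernel, padding=kernel // 2, bias=False),
            nn.BatchNorm2d(init_ch),
            nn.ReLU(inplace=True),
        )
        self.cheap = nn.Sequential(
            nn.Conv2d(init_ch, cheap_ch, dw_kernel, padding=dw_kernel // 2,
                      groups=init_ch, bias=False),
            nn.BatchNorm2d(cheap_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        y = self.primary(x)
        out = torch.cat([y, self.cheap(y)], dim=1)
        return out[:, :self.out_ch]


# Example: re-weight and expand a batch of Mel-spectrogram feature maps.
x = torch.randn(8, 64, 64, 100)          # (batch, channels, mel bins, frames)
x = GhostModule(64, 128)(SEBlock(64)(x))
print(x.shape)                           # torch.Size([8, 128, 64, 100])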