223 Machine learning-aided ultrasonography for assessing follicular status in an endangered anuran
L. Chen A, M. Caprio A, S. Lampert A, D. Barber B, A. Kouba C and C. Vance A
A Mississippi State University, Biochemistry, Molecular Biology, Entomology, and Plant Pathology, Mississippi State, MS, USA
B Fort Worth Zoo, Department of Ectotherms, Fort Worth, TX, USA
C Mississippi State University, Wildlife, Fisheries, and Aquaculture, Mississippi State, MS, USA
Reproduction, Fertility and Development 35(2) 240-241 https://doi.org/10.1071/RDv35n2Ab223
Published: 5 December 2022
© 2023 The Author(s) (or their employer(s)). Published by CSIRO Publishing on behalf of the IETS
Reproductive status is a crucial physiological parameter that informs management decisions surrounding animal breeding programs across several industries, including agriculture, medicine, and conservation. Traditional methods such as hormone monitoring and ultrasound imaging have been useful but remain limited for evaluating female reproductive status. For example, hormone monitoring requires ample time to conduct and validate assays, which may not be feasible when real-time diagnostics are needed to guide on-the-ground management decisions. Additionally, ultrasound interpretation is prone to observer subjectivity, which may result in variable assessments of follicular development across species, practitioners, and instrumentation. Machine learning, in which image data and artificial intelligence are used to develop prediction models, has proven useful for optimising processes in medicine and agriculture and may provide a means of standardising ultrasound imagery for conservation breeding initiatives. The goal of this study was to train image classification models to evaluate female follicular development using multiple convolutional neural networks (CNNs) with ultrasound images. Ultrasound images (n = 80) were collected from the ventral region of endangered Houston toad (Anaxyrus houstonensis) females at the Fort Worth Zoo during their breeding season. All images were categorised according to their reproductive status designation, ranging from 0 to 3 (with 0 indicating absence of follicular development and 3 indicating advanced folliculogenesis). These designations were split into a low group (grades 0–1) and a high group (grades 2–3). We used five-fold cross-validation, in which one independent fold was held out for validation and the remaining four were used to train the model for 10 iterations. Images were cropped to remove embedded text, and image standardisation was applied to improve algorithm generalisability.
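The grouping, cropping, standardisation, and five-fold split described above can be sketched as follows. This is a minimal NumPy-only illustration: the crop margin, image dimensions, and randomly generated grades are hypothetical stand-ins for the actual ultrasound data, not the study's pipeline.

```python
import numpy as np

def crop_margins(images, margin=20):
    # Crop a fixed border to remove burned-in annotation text
    # (the margin value is an assumed placeholder).
    return images[:, margin:-margin, margin:-margin, :]

def standardise(images):
    # Per-image standardisation (zero mean, unit variance) to aid generalisation.
    mean = images.mean(axis=(1, 2, 3), keepdims=True)
    std = images.std(axis=(1, 2, 3), keepdims=True)
    return (images - mean) / (std + 1e-8)

# Synthetic stand-ins for the 80 ultrasound images and their 0-3 grades.
rng = np.random.default_rng(0)
images = rng.random((80, 224, 224, 1)).astype("float32")
grades = rng.integers(0, 4, size=80)

# Collapse the four grades into the two classes used for modelling:
# low (grades 0-1) vs. high (grades 2-3).
labels = (grades >= 2).astype(int)

# Five-fold cross-validation: each fold serves once as the held-out
# validation set while the remaining four folds train the model.
indices = rng.permutation(len(images))
folds = np.array_split(indices, 5)
splits = [
    (np.concatenate([folds[j] for j in range(5) if j != k]), folds[k])
    for k in range(5)
]
```

Each entry of `splits` pairs the training indices (four folds) with the held-out validation indices (one fold), so every image is validated exactly once across the five runs.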
Four deep CNNs (VGG-16, VGG-19, Inception-v3, and Xception) were developed and tested to classify images as belonging to either a low or high reproductive stage. In this two-class problem, the four models varied significantly (generalised additive model, P < 0.001): both VGG-16 (87.8 ± 7.1%) and VGG-19 (87.1 ± 9.3%) yielded higher classification accuracies than Inception-v3 (83.4 ± 8.9%; P < 0.05), whereas Inception-v3 and Xception (83.5 ± 8.7%) did not differ in mean accuracy (P > 0.05). Given that algorithms may vary in their predictive outcomes, a multimodel comparison approach is proposed, particularly when investigating novel questions and datasets. We demonstrate deep learning-aided ultrasound as an effective method for evaluating female follicular stage, a critically important consideration when selecting candidates for in vitro fertilisation and artificial insemination trials. Thus, CNNs using ultrasound data present a novel approach to reliably assessing follicular status, with the potential to refine management strategies surrounding conservation breeding programs.
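A sketch of how one of the four backbones could be assembled for the low/high classification task, using the Keras applications API. The classification head, input size, and optimiser settings here are assumptions for illustration, not the configuration reported in the study; `weights=None` is used so the sketch runs offline, whereas transfer learning would pass `weights="imagenet"`.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_classifier(backbone_name="VGG16", input_shape=(184, 184, 3)):
    # The four backbones compared in the study, as provided by Keras.
    backbones = {
        "VGG16": tf.keras.applications.VGG16,
        "VGG19": tf.keras.applications.VGG19,
        "InceptionV3": tf.keras.applications.InceptionV3,
        "Xception": tf.keras.applications.Xception,
    }
    # Pretrained backbones expect 3-channel input, so grayscale ultrasound
    # frames would be replicated across channels before training.
    base = backbones[backbone_name](
        weights=None,          # "imagenet" for actual transfer learning
        include_top=False,
        input_shape=input_shape,
    )
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(128, activation="relu")(x)   # assumed head size
    out = layers.Dense(1, activation="sigmoid")(x)  # low vs. high stage
    model = models.Model(base.input, out)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

Building all four models from one factory function like this makes the multimodel comparison straightforward: each backbone is trained and evaluated on the same cross-validation splits, and the per-fold accuracies are then compared statistically.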