
Project

Quantifying earlier indicators of disease in group housed pigs using computer vision

Introduction

Diseases in pigs harm their wellbeing, increase the cost of production through the weight loss and mortality seen in affected animals, and affect public health through increased antimicrobial use and the associated development of antimicrobial resistance. Early detection of health and welfare compromises in commercial piggeries is essential for timely intervention to enhance treatment success, reduce the impact on welfare, and promote sustainable pig production. Although the importance of early disease detection is recognized, effective detection systems have been held back by the difficulty and high cost of large-scale clinical and serological testing. Clinical observation shows that the first signs of infection are usually fever and reduced movement, which lead to reduced water and feed intake. Novel non-invasive techniques are being investigated to monitor changes in these signs and to help stockpeople detect diseases earlier, thereby reducing medication use, limiting the spread of infection, and lowering the cost of pig production. Precision livestock farming (PLF) plays an important role here: advanced PLF technologies enable real-time monitoring systems that provide tailored suggestions for specific animal welfare problems based on detailed information.

Physiological parameters such as body temperature (BT), heart rate (HR) and respiration rate (RR) can be useful indicators when monitoring illness in pigs. The most common methods for measuring them require human-animal interaction, such as a stethoscope for HR and RR, and an invasive thermometer for rectal temperature (RT) as a proxy for BT. Most of these techniques are labour-intensive, time-consuming and stressful for the animals; consequently, they are not practical for continuous, large-scale monitoring.
Some studies therefore focus on automatic, non-invasive, computer-vision-based technologies to monitor these vital physiological parameters. For BT measurement, infrared temperature measurement equipment (IRTME) has gained popularity in recent years because it enables non-contact temperature measurement. A variety of less invasive techniques, such as attached monitors, has been developed for HR measurement, and researchers are now developing computer-vision-based techniques to monitor HR. The obtrusive nature of contact-based sensors for RR monitoring makes them uncomfortable for long-term use and vulnerable to movement-derived noise, so camera-based remote approaches such as remote photoplethysmography (rPPG) have attracted considerable attention. Another novel approach for animal RR is Sorption-Enhanced Infrared Thermography (SEIRT); its benefit springs from integrating infrared thermography (IRT) with chemical physics (the heat released or absorbed during a phase transition) in a single method, which can achieve higher sensitivity than the analysis of IRT images alone.

Apart from measuring physiological parameters, another route to early detection of health and welfare compromises is behavioural change. Behavioural changes that precede or accompany subclinical and clinical signs have diagnostic value. Often referred to as sickness behaviour, they encompass changes in feeding, drinking and elimination behaviours, social behaviours, and locomotion and posture. Automatic behavioural monitoring systems can overcome the drawbacks of human observation, which is time-consuming, subjective and impractical at large scale. Several pig behaviours, such as lying, standing, sitting and kneeling, can be classified directly from a single image.
Other behaviours require spatial context, such as feeding, drinking and enrichment activities, and more complex behaviours, such as running, playing, tail biting and ear biting, require a sequence of images taken over time. Although many studies have explored automatic monitoring of the daily activity budget (feeding, drinking, elimination, posture, locomotion, social behaviour) and of disease-specific behaviours (e.g. coughing and scratching), these systems require further development before they can provide constant, detailed assessment of large groups under farm conditions. Moreover, studies on automatic behaviour detection as a disease indicator remain limited. Building on this research, I will develop a computer vision (CV) early-warning system for continuous, real-time, automatic monitoring of individual physiological parameters and behaviours, and explore its feasibility for Australian commercial piggeries.

Aim and objectives

The main research objective is to develop computer vision algorithms that quantify early indicators of disease in commercial piggeries. Pigs' physiological parameters and behaviours carry information about disease that can be detected before clinical signs are observed. Combining this information can enhance treatment success, reduce the impact on welfare, help control the costs of commercial piggeries, and improve the safety of pork production. These insights give stockpeople, veterinarians, piggery managers and other advisors a more solid basis for decisions. To reach the main objective, this research investigates the use of IRT and RGB cameras in a commercial indoor piggery, organized into four work packages.

Methodology

Work Package 1: Build a public dataset of disease-related behaviours

Task 1.1: Selection of cameras and positions

BT and RR will be monitored with a thermal camera, while HR and behaviours will be monitored with an RGB camera.
The thermal camera will be mounted on the feeder and the RGB camera above the pen, and the video from the two cameras will be synchronized. FLIR Duo® Pro R (FLIR Systems, Wilsonville, OR, USA) cameras were used during this study; they combine a high-resolution radiometric thermal sensor with a 4K visible RGB sensor. The IRT sensor has a spectral range of 7.5–13.5 μm, a sensitivity below 50 mK, a resolution of 640 × 512 pixels, an emissivity setting of 0.985 and a frame rate of 30 Hz. The RGB sensor has a resolution of 4000 × 3000 pixels and a frame rate of 30 Hz. For continuous monitoring, the RGB camera will be a Raspberry Pi camera (module V2.1), an 8-megapixel sensor recording at 29 frames per second.

Task 1.2: Data collection and labelling

1.2.1 Physiological parameter measurement with CV approaches and gold standards

This research concentrates on detecting common disease-related behaviours. For physiological data collection, I will use the approaches previously developed at the University of Melbourne. To measure BT, the algorithm first extracts the radiometric information from the IRT image using the FLIR® Atlas SDK (FLIR Systems, Wilsonville, OR, USA). The eye area is then selected as the region of interest (ROI; chosen on the first frame and automatically tracked over the following frames), from which the maximum temperature is extracted. The eye area serves as a thermal window to body temperature because it is consistently well perfused with blood; the skin locations that correlate best with rectal temperature are such thermal windows, e.g. the ear base, eye region and udder. To remotely measure HR from the RGB images, two algorithms were integrated: one for tracking ROIs and one for measuring HR.
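The BT extraction just described, and the HR and RR measurements described next, share a simple computational core: BT is a per-frame maximum over a tracked ROI, while HR and RR amount to the dominant frequency of a per-frame ROI signal. A minimal numpy sketch under stated assumptions (radiometric frames already decoded to temperature arrays; the FLIR Atlas SDK calls and the ROI tracker are not shown, and the ROI is treated as fixed):

```python
import numpy as np

def max_roi_temperature(frames, roi):
    """Per-frame maximum temperature inside a region of interest.

    frames : sequence of 2-D arrays of per-pixel temperatures (deg C),
             e.g. decoded from radiometric IRT video (decoding not shown).
    roi    : (row, col, height, width) box; assumed fixed here, whereas
             the real pipeline tracks it frame by frame.
    """
    r, c, h, w = roi
    return np.array([f[r:r + h, c:c + w].max() for f in frames])

def dominant_rate(signal, fps, fmin, fmax):
    """Dominant oscillation rate (cycles per minute) of a 1-D ROI signal,
    restricted to a physiologically plausible frequency band.

    signal : per-frame ROI statistic, e.g. mean eye-area intensity
             (HR via rPPG) or maximum nose-area temperature (RR via IRT).
    fps    : camera frame rate in Hz; fmin, fmax : band limits in Hz.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                              # remove the DC offset
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= fmin) & (freqs <= fmax)
    return 60.0 * freqs[band][np.argmax(power[band])]
```

For a one-minute clip at 30 Hz, `dominant_rate(nose_temps, 30, 0.2, 2.0)` would return breaths per minute; the band limits here are illustrative placeholders, not measured pig ranges.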
The first algorithm uses computer vision techniques to recognize spatial patterns within the ROI (the eye area) and automatically track it through the video. The second algorithm applies photoplethysmography (PPG) principles, assessing HR from changes in the light reflected off and transmitted through body parts. The eye area was chosen as the ROI because its hair density is low and because it has proven useful in image-based measurement in both humans and animals. To remotely measure RR, IRT images were processed with the nose area as the ROI. As in the HR analysis, the ROI is first selected and tracked to improve accuracy; the algorithm then extracts the maximum temperature within it in each frame, and these values are used to calculate RR. The calculation exploits the temperature changes caused by airflow: exhaled air raises the temperature within the nose area, which falls again during inhalation. To label the physiological parameters (HR, RR, BT), each pig was recorded for one minute while each parameter was measured with a gold-standard method over the same period. Because pigs fear strangers, keeping a pig in one position for a minute is a major challenge, and the HR of a pig that is moving and vocalizing excessively cannot be auscultated. We therefore trained a technician who is familiar with the experimental pigs to conduct this series of gold-standard tests.

1.2.2 Behavioural data collection, labelling and organization of public datasets

Data is one of the most challenging issues in developing any machine learning / artificial intelligence pipeline in computer vision.
There are two aspects to this issue: data collection and data labelling (annotation). In supervised learning, the training data must be labelled, and the labels must be free of noise, in order to build a robust computer vision algorithm; correctly labelled data lets the model learn the information that matters for our research goals. Clinically observed disease-related behaviours (Figure 1) include scratching, reluctance to stand, vomiting, coughing, lameness, posterior paralysis, depraved appetite, lying down and getting up repeatedly, sitting like a dog, listlessness, irregular oestrus cycles, huddling, incoordination and many others. Almost every abnormal behaviour shows up in movement (including of specific body parts, e.g. the ears), lying pattern and posture, foraging and exploration, social interactions or vocalizations. Precise definition of the behaviour labels is the core of PLF system development: done incorrectly, the resulting PLF system will not function as expected. PLF experts recommend basing the labelling on an ethogram, a catalogue of the behaviours exhibited by an animal, and then testing observer reliability to ensure the ethogram is objective, correctly described and complete.

Milestone 1: A fully annotated PigDisease dataset that fills a gap in this multidisciplinary field. All algorithm development and comparative studies will use this dataset as a benchmark, and it will be made publicly available together with the scientific publications.

Work Package 2: Develop CV-based algorithms to track and detect disease-related behaviours

Existing CV algorithms will be developed further to provide a constant and detailed assessment of large groups under farm conditions, from pen level down to individual pig level.
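The observer-reliability check proposed in Task 1.2.2 is commonly quantified with Cohen's kappa, which corrects the raw agreement between two annotators for the agreement expected by chance. A minimal sketch (the behaviour labels below are illustrative, not our final ethogram):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for inter-observer agreement on ethogram labels.

    labels_a, labels_b : equal-length sequences of behaviour labels
    assigned by two observers to the same video segments.
    """
    n = len(labels_a)
    assert n == len(labels_b) and n > 0
    # Observed agreement: fraction of segments labelled identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement from each observer's marginal label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
    return (observed - expected) / (1.0 - expected)
```

A kappa near 1 indicates the ethogram definitions are applied consistently; low values signal that the behaviour definitions need to be tightened before large-scale labelling starts.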
Task 2.1: Calculate a group-level activity index

First, the activity index will be applied to the PigDisease dataset as the baseline algorithm for assessing overall activity at pen level.

Task 2.2: Develop dense object detection/segmentation algorithms

Second, we aim to quantify more complex behaviours using state-of-the-art detection and segmentation algorithms. We will focus on dense object detection/segmentation, because the growing scale of commercial herds is the main gap preventing existing algorithms from being commercialized. The homogeneous appearance of pigs makes it hard to (re-)identify individual animals. To achieve long-term tracking, each animal will wear a smart ear tag, and an ID reader in the feeder will correct missed and incorrect tracks whenever the animal visits it. As a backup plan, body markers serving as unique visual identifiers will be painted on and renewed twice a week. This point has not been researched before and is one of the highest-risk parts of the project.

Milestone 2: The behaviour budget of each individual animal will be calculated: lying, standing, sitting and kneeling via a pose estimator, and feeding and drinking via the interaction of each animal with the feeder/drinker region. For more complex social interactions, we aim to explore advanced temporal action localization (TAL) algorithms, which locate activities and their categories in untrimmed long video streams and output start and end timestamps (high risk).
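The pen-level activity index of Task 2.1 is typically computed by frame differencing, and the feeding/drinking detection in Milestone 2 reduces to a geometric test between each pig's detected position and the feeder/drinker region. A minimal sketch under simple assumptions (greyscale frames, detector-supplied bounding boxes; the threshold and region coordinates are placeholders, not calibrated values):

```python
import numpy as np

def activity_index(frames, threshold=15):
    """Fraction of pixels whose grey value changes by more than
    `threshold` between consecutive frames; one value per transition.

    frames : sequence of 2-D uint8 greyscale images of the whole pen.
    Averaging over a time window gives the pen's activity for that window.
    """
    out = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = np.abs(curr.astype(int) - prev.astype(int))
        out.append(float((diff > threshold).mean()))
    return out

def visits_region(box, region):
    """True if the centre of a pig's bounding box lies inside a fixed
    feeder/drinker region; both are (x, y, width, height) rectangles."""
    cx, cy = box[0] + box[2] / 2.0, box[1] + box[3] / 2.0
    x, y, w, h = region
    return x <= cx <= x + w and y <= cy <= y + h
```

Accumulating `visits_region` over the per-frame detections of one identified pig yields its feeding and drinking durations for the daily behaviour budget.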
Work Package 3: Quantification and automation of behavioural changes as pig disease indicators

According to swine ethology studies, a healthy pig usually maintains the same daily pattern of behaviour and remains alert and active. Typical changes in sick pigs include: being less active and spending more time resting; lying apart from other pigs; lying on the belly with drooping ears; huddling and shivering; walking with an arched back and drooping tail; sitting like a dog with laboured breathing; showing no interest in people, pen-mates, feed or water; and being less aggressive, with fewer social interactions. Instead of a qualitative description of sick pigs, a quantitative and automatic analysis of the changes between healthy and sick pigs will be performed: first, develop an algorithm that distinguishes the behavioural differences between healthy and sick pigs; second, analyse these differences statistically and turn them into practical information for advisors. If successful, this technology will extend the limited human capacity to measure subtle, easily overlooked early signs of sickness.

Milestone 3: Verify the feasibility of using CV technologies to quantify the behavioural changes of diseased pigs by comparing model outputs with the gold-standard annotations in our PigDisease dataset, and explore subtle signs of sickness that ethologists easily overlook.

Work Package 4: Commercial demonstration of disease outbreak prediction based on the integration of behavioural and physiological signals

To evaluate the smart health-monitoring technologies under commercial conditions, 4,200 pigs will be studied throughout their grower/finisher stage over 12 months. CCTV and IRT cameras in each pen will record behaviours and thermal imagery, and daily health assessments will identify sick pigs in each pen, so that a sufficient number of sick pigs, with their historical behavioural and physiological signals, will be collected.
The likelihood of a disease outbreak in the next day, two days or week will then be predicted from these historical data. A suitable data visualization method will be employed to give an intuitive view of the changes in behavioural and physiological signals. Bioprocess modelling methods (e.g. dynamic ARX models) and machine-learning time-series methods (e.g. recurrent neural networks, least-squares support vector machines) will then be employed and compared for predicting disease outbreaks.

Milestone 4: A disease prediction model will be developed, evaluated and finally demonstrated under commercial conditions.
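To illustrate the simplest end of the model family named above, next-day risk can be scored from lagged daily features with a logistic-regression baseline. This is a hedged numpy sketch, not the project's method: the features (e.g. yesterday's activity index and mean eye temperature) are hypothetical, and the real study would compare such a baseline against the ARX and recurrent models.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Minimal logistic-regression baseline for next-day outbreak risk.

    X : (days, features) matrix of lagged behavioural/physiological
        summaries; y : (days,) binary labels, 1 if disease followed.
    Returns the learned weights (bias term last).
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))           # predicted risk
        w -= lr * Xb.T @ (p - y) / len(y)           # gradient step
    return w

def predict_risk(X, w):
    """Outbreak probability for each day's feature vector."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))
```

Such a baseline gives the time-series models something to beat, and its weights are directly interpretable for the advisors the system is meant to serve.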

Date: 31 May 2022 → Today
Keywords: Computer vision, Precision Livestock Farming, Pig diseases behaviours
Disciplines: Animal health engineering, Computer vision, Image processing
Project type: PhD project