key: cord-0983196-ap1yegps
authors: Lowe, G.L.; Sutherland, M.A.; Waas, J.R.; Schaefer, A.L.; Cox, N.R.; Stewart, M.
title: Physiological and behavioral responses as indicators for early disease detection in dairy calves
date: 2019-04-17
journal: J Dairy Sci
DOI: 10.3168/jds.2018-15701
sha: ae86b35dbe14d158c5e9c3ea3014acc60bb9a9e5
doc_id: 983196
cord_uid: ap1yegps

This study investigated physiological and behavioral responses associated with the onset of neonatal calf diarrhea (NCD) in calves experimentally infected with rotavirus and assessed the suitability of these responses as early disease indicators. The suitability of infrared thermography (IRT) as a noninvasive, automated method for early disease detection was also assessed. Forty-three calves either (1) were experimentally infected with rotavirus (n = 20) or (2) acted as uninfected controls (n = 23). Health checks were conducted on a daily basis to identify when calves presented overt clinical signs of disease. In addition, fecal samples were collected to verify NCD as the cause of illness. Feeding behavior was recorded continuously as calves fed from an automated calf feeder, and IRT temperatures were recorded once per day across 5 anatomical locations using a hand-held IRT camera. Lying behavior was recorded continuously using accelerometers. Drinking behavior at the water trough was filmed continuously to determine the number and duration of visits. Respiration rate was recorded once per day by observing flank movements. The effectiveness of inoculating calves with rotavirus was limited because not all calves in the infected group contracted the virus; further, an unexpected outbreak of Salmonella during the trial led to all calves developing NCD, including those in the healthy control group. Therefore, treatment was ignored and instead each calf was analyzed as its own control, with data analyzed with respect to when each calf displayed clinical signs of disease regardless of the causative pathogen. Milk consumption decreased before clinical signs of disease appeared. The IRT temperatures were also found to change before clinical signs of disease appeared, with a decrease in shoulder temperature and an increase in side temperature. There were no changes in respiration rate or lying time before clinical signs of disease appeared. However, the number of lying bouts decreased and lying bout duration increased before and following clinical signs of disease. There was no change in the number of visits to the water trough, but visit duration increased before clinical signs of disease appeared. Results indicate that milk consumption, IRT temperatures of the side and shoulder, number and duration of lying bouts, and duration of time spent at the water trough show potential as suitable early indicators of disease.

Neonatal calf diarrhea (NCD), an enteric disease affecting beef and dairy industries worldwide, is a significant concern from both economic and animal welfare perspectives. Typically affecting calves during the first 28 d of life (Cho and Yoon, 2014), NCD causes severe diarrhea, resulting in dehydration, weight loss, and potentially death (Schroeder et al., 2012). Pathogens associated with NCD include rotavirus, coronavirus, Cryptosporidium, and Salmonella, which may act exclusively or in combination. The 2 most prevalent pathogens associated with NCD in New Zealand are rotavirus and Cryptosporidium (Al Mawly et al., 2015).
Neonatal calf diarrhea inflicts substantial damage on the intestines as a result of the pathogens causing villous atrophy and inflammation of the submucosa (Todd et al., 2010; Cho and Yoon, 2014). As a consequence of this damage, there is a reduction in the surface area over which fluid and nutrients can be absorbed across the intestinal epithelium; this results in a greater amount of fluid being lost, leading to severe diarrhea (Cho and Yoon, 2014). Neonatal calf diarrhea generates major economic losses, and reducing the prevalence of NCD is considered one of the biggest challenges affecting both beef and dairy industries worldwide (Meganck et al., 2015). In addition to the cost of mortalities, other economic losses associated with NCD include treatment costs (e.g., antibiotics and electrolytes) and time spent caring for the affected animals. In the longer term, significant losses are associated with growth and production as a result of reduced weight gain and stunted growth in affected animals (de Graaf et al., 1999). In addition, the extensive use of antibiotics to treat infected calves raises concerns regarding antibiotic resistance.

When experiencing illness, it is common for animals to display sickness behaviors (Millman, 2007); these are associated with physiological and behavioral changes, the purpose of which is to increase the animal's ability to mount an immune response and thus enable the animal to focus its energy on fighting off the disease. At present, to identify an animal as being affected by disease, farmers will observe animals for clinical signs of illness (e.g., depression or apathy, diarrhea, dehydration, and decreased feed intake; de Verdier Klingenberg, 2000). The issue with relying on clinical signs to identify a diseased animal is that many farm animals are prey species and as a result are typically stoic in nature. Being stoic, these animals have a tendency to mask signs of illness and vulnerability, a strategy that in the wild would potentially protect them against predation (Weary et al., 2009). As a result, those responsible for observing animals need to be able to reliably detect subtle changes in behavior that are indicative of illness. The challenge with NCD is that by the time a calf exhibits clinical signs (e.g., diarrhea and dehydration), much of the internal damage to the intestines has already occurred. Therefore, systems need to be developed that would enable disease to be detected earlier than is currently possible based on when an animal presents overt clinical signs of disease.

Over time, the desire to reduce labor costs has increased reliance on automated systems, leading to a less hands-on approach to farming, as illustrated, for example, by the shift from cows being milked by hand to cows being milked using robotic milking systems. Additionally, increasing herd sizes lead to less individual animal contact and, combined with fewer experienced stockpeople in the industry, can lead to an increase in the number of cases of NCD going undiagnosed; this results in reduced welfare and production and increased mortalities. Therefore, there is a need to develop measures that could be incorporated within automated systems on-farm to enable remote and reliable monitoring of animal health and welfare.
Automated monitoring systems have been found to be capable of detecting disease based on changes in infrared temperatures (Schaefer et al., 2004, 2007, 2012) and feeding behavior (Svensson and Jensen, 2007; Borderas et al., 2009). In contrast to the situation overseas, these automated systems have not been investigated under New Zealand conditions, where calves have different susceptibilities to certain diseases (e.g., prevalence of NCD vs. respiratory disease). Infrared thermography (IRT) is a noninvasive method of detecting radiated heat. Previous studies have investigated the use of IRT for detecting early signs of bovine respiratory disease (BRD) and bovine viral diarrhea (BVD; Schaefer et al., 2004, 2007, 2012). Infrared thermography has also been used to investigate how temperature changes at different anatomical areas relate to the onset of BVD (Schaefer et al., 2004) and differences in feed efficiency (Montanholi et al., 2009, 2010; Martello et al., 2016). Moreover, changes in feeding behaviors, measured using automated calf feeders, have been used to assess disease in calves (Svensson and Jensen, 2007; Borderas et al., 2009; Swartz et al., 2017; Sutherland et al., 2018). Automated calf feeders are able to record information, including milk consumption, visit duration, and the number of rewarded (calf receives an allocation of milk) and unrewarded (calf does not receive an allocation of milk) visits calves make to the feeder. The accelerometer is another automated device that has been used in previous studies to monitor lying (Mattachini et al., 2013; Sepúlveda-Varas et al., 2014) and feeding (Mattachini et al., 2016) behavior and for detecting estrus (Valenza et al., 2012) and disease (Sepúlveda-Varas et al., 2014). Previous studies have found that lying behaviors change in diseased calves (Hart, 1988; Johnson, 2002; Sutherland et al., 2018); therefore, there is potential for accelerometers to be used to measure lying behavior to detect the onset of NCD. Water intake, drinking behavior, and respiration rate (RR) also have the potential to be used as early indicators of disease. However, current methods for measuring these responses are often labor intensive and are not always suitable for monitoring individual animals. An automated water system (Insentec, Marknesse, the Netherlands) has been validated as a method of measuring drinking behavior and water intake in adult cattle (Chapinal et al., 2007). The development of a similar system for calves would provide insights into the drinking behavior of calves and how water intake influences their health, welfare, and productivity. In addition to the above, RR can provide valuable information regarding stress, pain, and overall cow welfare (Pastell et al., 2006).

The purpose of our study was to identify physiological and behavioral responses associated with the early onset of NCD in calves infected with rotavirus and to assess the suitability of those responses as indicators for early disease detection. In addition, by assessing the thermal responses of different anatomical regions in response to disease, we investigated the suitability of IRT as a noninvasive method for early disease detection.

All procedures involving animals in this study were approved by the University of Waikato Animal Ethics Committee (protocol no. 955) under the New Zealand Animal Welfare Act 1999.
The study was undertaken at a farm in the Waikato region of New Zealand (37°48′29.6″ S, 175°04′48.7″ E) between August and October 2015. Forty-three mixed-breed calves (20 Friesian males and 23 Hereford females) were sourced from commercial farms and transported to the facility for enrollment in the study at 4 d of age. Unfortunately, BW could not be reliably recorded upon enrollment due to an equipment fault. Within a single barn, calves were housed in 1 of 2 equal-sized (6.6 × 8.1 m) pens constructed on a solid concrete base with solid walls on all 4 sides. Within each pen, an area (30 m²) of wood chip (20 cm deep) was provided as bedding. Each pen contained a water trough (PT10; Stallion Limited, Palmerston North, New Zealand), hay feeder (purpose built, measuring 75 × 65 × 22 cm), meal feeder (purpose built using one half of a 210-L closed-head drum; ES Plastics, Hamilton, New Zealand), and automated calf feeder (RFID Calf Feeder; A. and D. Reid, Temuka, New Zealand) as used by Sutherland et al. (2018). Calves were fed as described below using the automated calf feeders in addition to being provided ad libitum access to meal that consisted of 18.0% CP, 10.0% crude fiber, and 5.0% crude fat (Moozlee; NRM, Auckland, New Zealand). Additionally, calves were given ad libitum access to water and meadow hay.

On arrival at the facility, calves were individually identified using numbered, colored collars (Calf Neck Bands; Shoof International Ltd., Cambridge, New Zealand). The trial consisted of 2 replicates of 2 treatments into which calves were randomly assigned upon arrival. The treatment groups were referred to as treatment group 1, in which calves were experimentally infected with rotavirus at 6 d of age (n = 20; replicate 1, n = 10 and replicate 2, n = 10), and treatment group 2, in which calves acted as uninfected controls (n = 23; replicate 1, n = 10 and replicate 2, n = 13). Calves in treatment group 1 were infected with rotavirus through an oral drench containing a mixture of 40 mL of water and 6 mL of feces collected from 2 calves known to be positive for rotavirus. Control and infected animals were housed separately using the 2 pens but were otherwise handled in the same manner. Health checks, IRT, and feeding, drinking, and lying behaviors were recorded daily as described below.

Health checks were carried out each morning to assess the calves' general well-being and to identify when each calf began to display clinical signs of illness. Health checks assessed calves on the basis of their general appearance, coat condition, gut fill, and fecal consistency, with definitions presented in Figure 1. Dehydration levels were assessed by monitoring calves for sunken eyes and by performing a tent test to measure skin elasticity, in which the skin of the neck was pinched and the time for the skin to return to its normal position was recorded (Ghanem et al., 2012). Health checks were also used to monitor signs of nasal and ocular discharge and navel infection (a disease resulting from bacterial infection via the umbilical cord soon after birth). Rectal temperatures were taken once per animal during each health check using a digital thermometer (MC-343; Omron, Kyoto, Japan). Temperatures were split into 3 categories: low (≤37.9°C), normal (38-39.5°C), and high (≥39.6°C). As part of the health checks, RR was measured by observing flank movements to record the time taken for each calf to complete 10 breaths; this was then used to calculate RR (breaths/min).
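As an illustration only (these measures were recorded manually in the study), the respiration rate calculation and rectal temperature categories described above can be expressed as a short Python sketch; the function names are ours, not from the study.

def respiration_rate(seconds_for_10_breaths):
    """Respiration rate (breaths/min) from the time (s) taken to complete 10 breaths."""
    return 10.0 / seconds_for_10_breaths * 60.0

def rectal_temp_category(temp_c):
    """Bin a rectal temperature (deg C) into the low/normal/high categories used in the health checks."""
    if temp_c <= 37.9:
        return "low"
    if temp_c <= 39.5:
        return "normal"
    return "high"

print(respiration_rate(15.0))      # 10 breaths in 15 s -> 40 breaths/min
print(rectal_temp_category(39.7))  # -> "high"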
Calves were defined as being clinically ill when they were observed as being diarrheic. To be considered diarrheic, a calf had to be witnessed passing malodorous feces with a loose to watery consistency, with the possibility of blood present in severe cases (a score of 2 or 3 for fecal consistency; Figure 1). For calves that had not been observed passing feces but were suspected of being diarrheic due to loose fecal matter present on the top of the tail or hind legs, a fecal sample was taken to confirm whether the calf was diarrheic. From all diarrheic calves a fecal sample was collected for analysis to confirm NCD as the cause of illness and to identify the specific pathogen responsible. Once deemed clinically ill, calves were treated accordingly with electrolytes (Dexolyte; Bayer New Zealand Ltd., Auckland, New Zealand) and antibiotics (Amphoprim; Virbac New Zealand Ltd., Hamilton, New Zealand) as needed to help them overcome the disease.

Figure 1. Daily health check definitions used to assess calves for the onset of neonatal calf diarrhea based on a scoring system of 0 to 3 for general appearance, sunken eyes, ocular discharge, nasal discharge, tent test, navel ill, joint ill, and fecal consistency. A scoring system of 0 or 1 was used to assess coat condition, rear end cleanliness, and gut fill. Measures for which scores were not applicable are denoted as N/A.

Fecal samples were analyzed (New Zealand Veterinary Pathology, Hamilton, New Zealand) to determine the presence of rotavirus, coronavirus, Cryptosporidium, and Salmonella to verify NCD as the cause of illness. The presence of Cryptosporidium was assessed by performing an acid-fast stain analysis, which caused oocysts to stain from light pink to red. Broth enrichment and selective plating were used to assess fecal samples for the presence of Salmonella. Samples that tested positive for Salmonella were then sent to the Institute of Environmental Science and Research (Auckland, New Zealand) for serotyping and phage typing to identify the individual strain of Salmonella present. A commercially available ELISA kit (Pourquier ELISA Calves Diarrhea; Institut Pourquier, Montpellier, France) was used to determine the presence of rotavirus and coronavirus.

Each pen was fitted with an automated calf feeder that individually identified calves as they approached the feeder based on the electronic identification in their ear tags. Upon arrival to the facility, calves were trained to use the automated feeder. This was done by handlers encouraging calves into the feeder and guiding them toward the teat, where they were trained to press down on a lever with their nose as they suckled in order for milk replacer (Brown Bag CMR; Fonterra Ltd., Auckland, New Zealand) to be dispensed. The milk replacer consisted of 21.0% protein, 21.0% fat, and 48.5% lactose and was provided at a mixing rate of 150 g/L. Initially, when calves were 4 to 7 d of age, they were given a total allowance of 4 L of milk replacer/d, which was increased at 8 d of age to 6 L/d. Each daily allowance was split into 2-L allocations. Between the full consumption of each 2-L allocation, a stand-down period of 6 h (when no milk was delivered) had to pass before the calf could receive the next allocation of milk (per standard farm practice in New Zealand). The automated calf feeders recorded milk intake, the number of both rewarded and unrewarded visits, and the total number of visits (sum of both unrewarded and rewarded visits) calves made to the feeder.

A rewarded visit was defined as one in which the calf visited the automated calf feeder and received an allocation of milk. There was no set minimum amount that had to be consumed; any visit in which the calf consumed milk, regardless of the amount, was regarded as a rewarded visit. An unrewarded visit was one in which the calf visited the feeder but did not receive an allocation of milk and hence no amount of milk was consumed. The automated feeders did not record visit duration or drinking speed. Milk was heated to approximately 26°C before being dispensed to the calf. Milk temperature was monitored using Thermochron iButton temperature data loggers (DS1922L-F5#; Maxim Integrated, San Jose, CA).
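The feeding rules described above can be summarized in a simplified sketch. This is an illustration of the protocol as stated, not the actual feeder firmware; the class and method names are ours, and for simplicity the 6-h stand-down is started when an allocation is dispensed rather than when it is fully consumed.

from dataclasses import dataclass

@dataclass
class CalfFeederState:
    age_days: int
    consumed_today_l: float = 0.0
    next_allocation_at_h: float = 0.0  # earliest time (h) the next 2-L allocation is available

    def daily_allowance_l(self) -> float:
        # 4 L/d for calves 4 to 7 d of age, increased to 6 L/d from 8 d of age
        return 4.0 if self.age_days <= 7 else 6.0

    def visit(self, time_h: float) -> str:
        """Classify a feeder visit as 'rewarded' (milk dispensed) or 'unrewarded'."""
        if time_h < self.next_allocation_at_h:
            return "unrewarded"          # still within the 6-h stand-down period
        if self.consumed_today_l >= self.daily_allowance_l():
            return "unrewarded"          # daily allowance already used up
        self.consumed_today_l += 2.0     # dispense the next 2-L allocation
        self.next_allocation_at_h = time_h + 6.0
        return "rewarded"

calf = CalfFeederState(age_days=9)
print(calf.visit(6.0))   # rewarded: first 2-L allocation of the day
print(calf.visit(8.0))   # unrewarded: within the 6-h stand-down
print(calf.visit(13.0))  # rewarded: second allocation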
Infrared thermography images of the animals' side (lateral), shoulder (proximal dorsal area over the trapezius muscle), back (dorsal), eye (orbital), and cheek (mandible area over the maxillary muscle) areas were collected every morning following health checks using a hand-held IRT camera (ThermaCAM S60; FLIR Systems AB, Danderyd, Sweden). The camera was calibrated daily for ambient temperature, humidity, and emissivity (0.98). Temperature and relative humidity were measured continuously using data loggers (EL-USB-2-LCD+; Lascar Electronics Ltd., Salisbury, UK) located next to each automated calf feeder. Images were collected by an observer as they walked around the pen, capturing images at a set distance from each anatomical area (i.e., 1 m from the eye, 2 m from the back and shoulder, and 3 m from the side) at an angle of 90° to the animal. With the exception of the dorsal view, all other images were collected from the animals' left side.

Images were analyzed using ThermaCAM Researcher Software (version 2.10; FLIR Systems AB) to calculate the maximum, minimum, and average temperatures for each area. This analysis involved tracing a circle over the area of interest as shown in Figure 2. For the eye, this circle was placed so it included the eyeball and area surrounding the eyelid. The cheek area was defined by placing the circle over the cheek muscle. The side was defined by placing the circle over the rumen fossa. Dorsal images were divided into 2 areas of interest, the back and shoulders, which were defined by placing one circle over the lower (distal) back across the hips, including the ilium, and another over the shoulders, including the scapula. Image analysis was carried out by a single observer, and intraobserver reliability was calculated based on the reanalysis of 10% of the images from each area. Initial observations were compared with secondary observations, revealing intraobserver reliability ranging from 96 to 99% for all areas measured, with a combined overall average of 98%. The level of intraobserver reliability was calculated in Excel (version 16.10; Microsoft Corp., Redmond, WA) using the correlation function.
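The region-of-interest analysis can be illustrated with a minimal sketch. This is not the ThermaCAM Researcher workflow; it simply shows how maximum, minimum, and average temperatures might be extracted from a circular region traced over a thermal image stored as a 2-dimensional array of temperatures (°C). The function name and example values are ours.

import numpy as np

def roi_temperatures(image_c, center_rc, radius_px):
    """Return (max, min, mean) temperature inside a circular region of interest."""
    rows, cols = np.indices(image_c.shape)
    mask = (rows - center_rc[0]) ** 2 + (cols - center_rc[1]) ** 2 <= radius_px ** 2
    roi = image_c[mask]
    return float(roi.max()), float(roi.min()), float(roi.mean())

# Example with a synthetic 240 x 320 frame of temperatures around 35°C (illustrative only)
rng = np.random.default_rng(0)
frame = 35.0 + rng.normal(0.0, 0.5, size=(240, 320))
print(roi_temperatures(frame, center_rc=(120, 160), radius_px=40))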
Lying and standing behavior were recorded continuously using Hobo Pendant G data loggers (64k; Onset Computer Corp., Bourne, MA) set at 1-min intervals using the y- and z-axes, as recommended in previous validation studies (Ledgerwood et al., 2010; Bonk et al., 2013). The Hobo loggers were fitted into purpose-made fabric pouches that were attached on the lateral side of the right hind leg above the metatarsophalangeal joint using Velcro and Kamar glue (Livestock Improvement Corp., Hamilton, New Zealand). Data loggers were placed horizontally on the leg such that the x- and z-axes ran parallel to the ground, with the x-axis pointing in the cranial direction and the z-axis pointing toward the mid plane of the calf. The y-axis ran perpendicular to the ground, pointing in the dorsal direction. Data loggers were initialized and downloaded using Onset HOBOware Pro software (version 3.7.2; Onset Computer Corp.). The output was converted into daily summaries of standing and lying behavior using SAS software (version 9.3; SAS Institute Inc., Cary, NC).

For the second replicate only (n = 23: 10 infected calves and 13 uninfected controls), drinking behavior at the water trough was recorded continuously using Panasonic video cameras (HC-V270; Panasonic, Osaka, Japan). Cameras were secured to camera stands at a height of 2 m from the ground, which were attached to the side of the pen. Three red lights (PAR38 80W, Red Globe; Philips, Shanghai, China) were placed in each pen 1.7 m above the ground to enable behavioral observations to be conducted at night with minimal disturbance to the calves' behavior. A drinking behavior was defined as "calf's head is over the water trough." Video footage was analyzed continuously for each animal to record the frequency and duration of visits to the trough. Video footage was analyzed using Adobe Premiere Pro CC (version 12.0 Haberdasher; Adobe Systems, San Jose, CA). Video analysis was carried out by a single observer, and intraobserver reliability was calculated based on the reanalysis of the video footage at 3 stages of the trial (beginning, middle, and end). Initial observations were compared with secondary observations, revealing that intraobserver reliability ranged from 95 to 99% with a combined overall average of 98%. The level of intraobserver reliability was calculated in Excel (version 16.10; Microsoft Corp.) using the correlation function.

The effectiveness of the rotavirus infection treatment was limited because not all calves infected with rotavirus contracted the virus (only 8 of 19 treated animals became infected). In addition, some calves in the control group also contracted rotavirus (8 of 24 control animals contracted rotavirus). Furthermore, an unexpected outbreak of Salmonella (an NCD-causing pathogen) during the trial led to all calves, including those in the healthy control group, developing NCD. Although all calves developed NCD, the dates on which they were identified as being clinically ill varied across individuals. Therefore, it was decided that treatment would be ignored and instead each calf would be analyzed as its own control, with data analyzed with respect to when each calf displayed clinical signs, comparing measures both pre- and post-clinical signs regardless of the causative pathogen. During analysis, 3 animals from the first replicate were excluded due to insufficient data.

Data for feeding behavior (milk consumption, total visits, and unrewarded visits), IRT temperatures (eye, cheek, back, shoulder, and side), lying behavior (lying time, number of lying bouts, and lying bout duration), drinking behavior (number of visits and visit duration), and RR were modeled with REML using Genstat (version 19; VSN International Ltd., Hemel Hempstead, UK). Data for each of these variables were fitted with splines for each animal to (1) account for some of the noise in the data, (2) account for some animals not having data on some of the days, and (3) model the correlations between the repeated observations on the animals.
Restricted maximum likelihood with smoothing splines was fitted to the data to enable the general trends for all variables to be observed across d −7 to 7 (relative to the clinical identification of disease, d 0). We then summarized and examined in more detail particular time frames pre- and postonset of disease based on trends found using the REML models. For all variables, comparisons of the mean values were made between d −7 to −4 and d −3 to 0 and between d −7 to −1 and d 0 to 6 using the 1-sample sign test (exact binomial probability; Dixon and Mood, 1946; Conover, 1980). The 1-sample sign test was used to assess, for each animal, the significance of the change in average daily values between the periods being compared; standard error of the means was used to measure the level of variability. Table 1 presents the number of positive changes detected from the total number of pairs tested using the sign test for each variable measured.

These time periods were chosen to enable a comparison to be made between 2 periods of time before the clinical identification of disease (d −7 to −4 and d −3 to 0) while also allowing for a comparison between the period of time before the clinical identification of disease (d −7 to −1) and the period of time post-clinical identification (d 0 to 6). The time periods before clinical identification of disease (d −7 to −4 and d −3 to 0) were chosen based on the earliest time at which a change in response to the onset of NCD could be detected because, with any disease, the sooner it can be detected the sooner animals can be isolated to prevent the spread of disease and enable treatments to be administered. The sign test was chosen over other analyses (e.g., ANOVA) because the data were not normally distributed.

Age relative to clinical diagnosis was included as a covariate in the analysis for all variables and showed a significant influence only on lying time (P < 0.001). For all IRT data, ambient temperature (°C) and relative humidity (%) were included as additional covariates in the REML model to adjust for changes in environmental conditions, which were highly significant for all anatomical locations (P < 0.001) with the exception of the eye, which is less sensitive to changes in environmental conditions (P = 0.288). The inclusion of temperature and humidity as covariates improved the fit of the REML model for all anatomical locations, including the eye. For the REML model, data on drinking behavior were square root transformed before analysis to meet the assumptions of homogeneous variance and normal distribution of the data.
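As an illustration of the period comparison described above (the study used Genstat; this sketch uses SciPy, and the values are invented), the exact 1-sample sign test on paired per-animal period means could be computed as follows:

from scipy.stats import binomtest

def sign_test(period_a_means, period_b_means):
    """Two-sided exact sign test on paired per-animal period means (ties dropped)."""
    diffs = [b - a for a, b in zip(period_a_means, period_b_means) if b != a]
    n_positive = sum(d > 0 for d in diffs)
    result = binomtest(n_positive, n=len(diffs), p=0.5, alternative="two-sided")
    return n_positive, len(diffs), result.pvalue

# Hypothetical milk consumption (L/d) averaged over d -7 to -4 and d -3 to 0 for 8 calves
early = [5.8, 6.0, 5.5, 5.9, 6.0, 5.7, 5.6, 5.9]
late = [5.1, 5.6, 5.4, 5.2, 6.1, 5.0, 5.3, 5.5]
print(sign_test(early, late))  # (positive changes, pairs tested, exact P-value)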
Figure 3 shows the changes in milk consumption, total number of visits, and percentage of unrewarded visits made to the automated calf feeder over the days before and after (d −7 to 7) clinical signs of disease were identified (d 0). Milk consumption was higher during d −7 to −4 compared with the 4 d (d −3 to 0) immediately before the clinical identification of disease (d 0; Table 1). No other significant changes in feeding behavior were found during d −7 to −4 compared with the 4 d (d −3 to 0) immediately before the clinical identification of disease (d 0). However, milk consumption, total number of visits, and percentage of unrewarded visits to the automated calf feeder were all lower in the 7 d after clinical signs occurred compared with the 7 d before (Table 1).

Figure 4 shows the changes in IRT temperature in response to the onset of disease for the different anatomical areas measured (eye, cheek, back, shoulder, and side) over the days before and after (d −7 to 7) clinical signs were identified (d 0). When comparing the 4 d immediately before the clinical identification of disease (d −3 to 0) and the previous 4 d (d −7 to −4), IRT temperatures changed significantly, with an increase in side temperature (Table 1) and a decrease in shoulder temperature (Table 1). No other significant changes in IRT temperature were found during d −7 to −4 compared with the 4 d (d −3 to 0) immediately before the clinical identification of disease (d 0). However, when comparing the 7 d after clinical signs occurred and the 7 d before, IRT temperatures changed significantly, with a decrease in eye, back, and shoulder temperatures and an increase in cheek and side temperatures (Table 1).

Figure 5 shows the changes in lying behavior in terms of lying time, number of lying bouts, and bout duration over the days before and after (d −7 to 7) clinical signs of disease were identified (d 0). When comparing the 4 d immediately before the clinical identification of disease (d −3 to 0) and the previous 4 d (d −7 to −4), there was a significant decrease in the number of lying bouts and an increase in bout duration (Table 1). When comparing the 7 d after clinical signs occurred and the 7 d before, significant changes were seen, with decreases in both lying time and the number of lying bouts and an increase in bout duration (Table 1).

Figure 6 shows changes in drinking behavior in terms of the number of visits and visit duration over the days before and after (d −7 to 7) clinical signs of disease were identified (d 0). There was no change in the number of visits to the water trough before or following clinical identification of disease. However, when comparing the 4 d immediately before the clinical identification of disease (d −3 to 0) with the previous 4 d (d −7 to −4), there was a significant increase in visit duration (Table 1).

Figure 7 shows the changes in RR over the days before and after (d −7 to 7) clinical signs of disease were identified (d 0). When comparing the 4 d immediately before the clinical identification of disease (d −3 to 0) and the previous 4 d (d −7 to −4), there was no significant change in RR. However, when comparing the 7 d after clinical signs with the 7 d before, RR was found to decrease (Table 1).

Little change in feeding behavior occurred in response to the onset of disease until after calves had displayed clinical signs of disease. Milk consumption was the only feeding behavior to show a significant change, with a decrease before clinical signs of disease were evident. This decrease in milk consumption is likely a result of the calves having a reduced appetite before the onset of disease. Following clinical signs, milk consumption, number of visits, and percentage of unrewarded visits to the automated calf feeder all decreased significantly in response to disease. In contrast to these findings, previous studies using automated calf feeding systems for detecting disease found no effect on milk consumption (Svensson and Jensen, 2007).
However, as in the current study, Borderas et al. (2009) found milk consumption to be influenced by the onset of disease: when fed high milk allowances (12 L/d), sick calves decreased milk consumption and visits to the feeder and increased visit duration (changes that were also most apparent following clinical diagnosis). In contrast, when sick calves were fed low milk allowances (4 L/d) following clinical diagnosis, Borderas et al. (2009) found no change in milk consumption, whereas visit duration was found to decrease. These changes in feeding behavior are indicative of disease: compared with healthy calves, diseased calves visit the feeder less often and the visits are of a longer duration (it takes a greater amount of time for the calf to consume its allowance). Borderas et al. (2009) highlighted the importance of considering an animal's milk allowance when using changes in feeding behavior to identify diseased animals. Although the current study provided calves with a daily milk allowance of 4 to 6 L/d (which follows standard farm practice in New Zealand), according to Borderas et al. (2009), this would be considered a low allowance. It is possible in the current study that changes in feeding behavior may have been greater had calves been provided with a greater milk allowance. Also, similar to our findings, previous studies (Svensson and Jensen, 2007; Sutherland et al., 2018) have reported that sick calves decrease the number of unrewarded visits in response to disease; Sutherland et al. (2018) also detected this decrease in the days before clinical diagnosis. Age and breed could be factors contributing to the differences in feeding behaviors across the studies. Compared with drinking rate and milk consumption, Svensson and Jensen (2007) suggested that the number of unrewarded visits was the most useful indicator of disease. Unrewarded visits represent calves testing the feeder to see whether milk is available, so a decrease in the number of unrewarded visits is suggestive of a reduction in appetite.

The IRT temperatures of all anatomical areas measured were found to change in response to disease. This is consistent with the known chronology for the onset of disease with thermal biometric values (Schaefer and Cook, 2013). However, before clinical signs of disease appeared, significant changes in IRT temperatures were found only for the shoulder and side. The decrease in shoulder temperature may be due to the animal generating a state of fever to fight against the infection by restricting blood flow to the skin and extremities. This reduces heat loss to the environment, which helps the animal maintain homeostasis. The decrease in temperature could also be a result of a reduction in feed intake and metabolic activity in response to the onset of disease. The contrasting increase in side temperature is likely a result of the area being situated over the rumen fossa, resulting in the side area being in close proximity to the site of infection and localized inflammation of the intestines. Previous studies investigating the use of IRT as a method for the early detection of BRD and BVD in beef cattle found IRT temperatures to increase before clinical signs of disease were evident (Schaefer et al., 2004, 2007). Inconsistencies across studies could be due to differences in pathogenesis associated with the modes of action corresponding to the different diseases as well as the specific anatomical area being measured and differences in environmental temperature and relative humidity.
Similar to the current study, Schaefer et al. (2004) collected images from various anatomical locations (side, back, eye, ear, and nose) and found that IRT temperatures of these areas increased by 1.5 to 4.0°C before clinical signs of BVD and that changes of <1°C were clinically significant. In the current study, changes in IRT of <0.3°C before clinical signs appeared were found to be significant (P < 0.001). Additionally, before clinical signs of disease, a significant (P < 0.001) change in IRT temperature was found for the side and shoulder but not for the eye, cheek, or back. This finding is consistent with Schaefer et al. (2004), who reported that different anatomical areas presented differing levels of sensitivity; for example, significant changes in eye temperature occurred as early as 1 d postinfection, whereas changes in nose, ear, side, or back temperatures were not found to be significant until 5 to 6 d postinfection (Schaefer et al., 2004).

Through measuring temperature changes at different anatomical locations, IRT has also been investigated in previous studies (Montanholi et al., 2009, 2010; Martello et al., 2016) for assessing feed efficiency. When comparing cattle with high residual feed intake (RFI) and those with low RFI, Martello et al. (2016) found that cattle with a low RFI had a higher IRT temperature on the front of the head, whereas other anatomical locations (i.e., eye, ribs, flank, rump, and feet) showed no significant difference in relation to RFI. However, other studies (Montanholi et al., 2009, 2010) found that temperatures of the snout, cheek, and hoof were more suitable indicators of feeding efficiency. Compared with core body areas, a consistent finding across all 3 studies (Montanholi et al., 2009, 2010; Martello et al., 2016) was that temperature differences measured at the extremities were the most useful indicators of feeding efficiency. In contrast, the current study found core body areas, rather than extremities, to be the most useful indicators of disease. This possibly suggests that the suitability of different areas depends on the specific application for which they are being measured and needs to be considered when determining which indicators will be the most suitable.

Lying time, number of lying bouts, and bout duration all changed significantly following clinical signs of disease, with decreases in both lying time and the number of lying bouts and an increase in bout duration. In addition, before clinical signs appeared, there was a decrease in the number of lying bouts and an increase in bout duration. Ollivett et al. (2014) investigated the effect of BRD on the lying behavior of dairy calves and found that lying time increased in response to the onset of disease. Similarly, in a study by Borderas et al. (2008), calves that were given an injection of bacterial LPS to stimulate a fever response also increased the amount of time they spent lying. Sutherland et al. (2018) reported that before clinical signs of NCD became evident, calves increased the amount of time spent lying and typically performed fewer lying bouts. In the current study, before clinical signs appeared, there was no change in lying time; however, similar to Sutherland et al. (2018), calves performed fewer lying bouts. Additionally, the current study found that the duration of lying bouts increased in response to the onset of disease. This change in the number of lying bouts and bout duration suggests that calves have fewer periods of activity.
This is perhaps due to a reduced appetite and calves becoming lethargic and attempting to conserve energy in response to the onset of disease.

In terms of drinking behavior, the only significant change observed in the current study was an increase in visit duration before clinical signs of disease appeared. Although water intake was not measured in the current study, this increase in visit duration may suggest an increase in the amount of water being consumed during each visit. Previous studies (Jenny et al., 1978; Wenge et al., 2014) found water intake to increase in diarrheic calves. Similar results have also been found in pigs, whereby water intake increased the day before overt clinical signs of an enteric disease were seen (Madsen and Kristensen, 2005). Neonatal calf diarrhea subjects the intestines to substantial damage, which results in severe diarrhea. Therefore, in the current study, increasing visit duration and potentially water intake could be a mechanism that restores hydration levels as calves attempt to overcome the disease.

Figure 7. Mean (±SEM, dashed lines) respiration rate during d −7 to 7 relative to the clinical identification of disease (d 0) based on outcomes of the REML model.

Respiration rate decreased only following clinical signs of disease, suggesting that it may not be a suitable early indicator for disease detection. The lack of change in RR before clinical signs were evident could be due to NCD being a gastrointestinal disease as opposed to a respiratory disease such as BRD, where RR would be expected to change more rapidly.

Milk consumption, IRT temperatures of the side and shoulder, number and duration of lying bouts, and duration of drinking visits all showed changes before the clinical identification of disease, therefore demonstrating potential suitability as early indicators of NCD. The integration of these measures into an automated on-farm system could alert farmers to diseased animals earlier than is currently possible based on overt clinical signs, enabling earlier treatment and isolation of diseased animals to prevent further spread of disease. In addition to supporting farmers' decision making, the development of such a system would reduce costs on-farm and to the industry as a whole and ultimately improve calf health and welfare.
REFERENCES

Prevalence of endemic enteropathogens of calves in New Zealand dairy farms
Technical note: Evaluation of data loggers for measuring lying behavior in dairy calves
Behaviour of dairy calves after a low dose of bacterial endotoxin
Automated measurement of changes in feeding behavior of milk-fed calves associated with illness
Technical note: Validation of a system for monitoring individual feeding and drinking behavior and intake in group-housed cattle
An overview of calf diarrhea-Infectious etiology, diagnosis and intervention
Practical Nonparametric Statistics
A review of the importance of cryptosporidiosis in farm animals
Enhancement of clinical signs in experimentally rotavirus infected calves by combined viral infections
The statistical sign test
Clinical and haematobiochemical evaluation of diarrheic neonatal buffalo calves (Bubalas bubalis) with reference to antioxidant changes
Biological basis of the behaviour of sick animals
Effect of fluid intake and dry matter concentration on scours and water intake in calves fed once daily
The concept of sickness behaviour: A brief chronological account of four key discoveries
Evaluation of data loggers, sampling intervals, and editing techniques for measuring the lying behavior of dairy cattle
A model for monitoring the condition of young pigs by their drinking behaviour
Infrared thermography as a tool to evaluate body surface temperature and its relationship with feed efficiency in Bos indicus cattle in tropical conditions
Automated measurement of lying behaviour for monitoring the comfort and welfare of lactating dairy cows
Monitoring feeding behaviour of dairy cows using accelerometers
Evaluation of a protocol to reduce the incidence of neonatal calf diarrhoea on dairy herds
Sickness behaviour and its relevance to animal welfare assessment at the group level
Assessing feed efficiency in beef steers through feeding behaviour, infrared thermography and glucocorticoids
On the determination of residual feed intake and associations of infrared thermography with efficiency and ultrasound traits in beef bulls
The effect of respiratory disease on lying behaviour in Holstein dairy calves. Page 34 in Proc. ADSA-ASAS-CSAS Joint Annual Meeting
Contactless measurement of cow behaviour in a milking robot
Heat generation and the role of infrared thermography in pathological conditions. Pages 69-78 in Thermography: Current Status and Advances in Livestock Animals and in Veterinary Medicine. Fondazione Iniziative Zooprofilattiche E Zootecniche
The noninvasive and automated detection of bovine respiratory disease onset in receiver calves using infrared thermography
The use of infrared thermography as an early indicator of bovine respiratory disease complex in calves
Early detection and prediction of infection using infrared thermography. Can.
Development and performance evaluation of calf diarrhoea pathogen nucleic acid purification and detection workflow
Lying behavior and postpartum health status in grazing dairy cows
Measurement of dairy calf behavior prior to onset of clinical disease and in response to disbudding using automated calf feeders and accelerometers
Identification of diseased calves by use of data from automatic milk feeders
Short communication: Automated detection of behavioral changes from respiratory disease in pre-weaned calves
Nonsteroidal anti-inflammatory drug therapy for neonatal calf diarrhea complex: Effects on calf performance
Assessment of an accelerometer system for detection of estrus and treatment with gonadotropin-releasing hormone at the time of insemination in lactating dairy cows
Using behaviour to predict and identify ill health in animals
Water and concentrate intake, weight gain and duration of diarrhea in young suckling calves on different diets

ACKNOWLEDGMENTS

The authors gratefully acknowledge assistance from Ariane Bright, Tania Blackmore, Melissa Hempstead, and Mark Wilson of InterAg (New Zealand) and Richard Laven (Massey University, New Zealand). We also thank Alvin Reid for the use of the automated calf feeders. The study was funded by DEC International NZ, Hamilton, New Zealand.