Oral-fluid sampling was attempted on 513 individually housed, mixed-parity sows. Younger sow age (P < .01) and re-sampling (P < .001) were associated with successful collection. Diagnostic results from samples collected on 2 successive days were correlated. Oral-fluid sampling in breeding herds would facilitate surveillance and improve animal welfare.
Distinct from tests used in diagnostics, tests used in surveillance must provide for detection while avoiding false alarms, i.e., acceptable diagnostic sensitivity but high diagnostic specificity. In the case of porcine reproductive and respiratory syndrome virus (PRRSV), RNA detection meets these requirements during the period of viremia, but antibody detection better meets them in the post-viremic stage of infection. Using the manufacturer's recommended cut-off (S/P ≥ 0.4), the diagnostic specificity of the PRRSV oral fluid antibody ELISA (IDEXX Laboratories, Inc., Westbrook, ME, USA) evaluated in this study was previously reported as ≥ 97%. The aim of this study was to improve its use in surveillance by identifying a cut-off that would increase diagnostic specificity while minimally affecting diagnostic sensitivity. Three sample sets were used to achieve this goal: oral fluids (n = 596) from pigs vaccinated with a modified live PRRSV vaccine under experimental conditions, field oral fluids (n = 1574) from 94 production sites of known PRRSV-negative status, and field oral fluids (n = 1380) from 211 sites of unknown PRRSV status. Based on the analysis of samples of known status (experimental samples and field samples from negative sites), a cut-off of S/P ≥ 1.0 resulted in a diagnostic specificity of 99.2% (95% CI: 98.8, 99.7) and a diagnostic sensitivity of 96.5% (95% CI: 85.2, 99.2). Among the 211 sites of unknown status, 81 were classified as antibody positive using the manufacturer's cut-off; 20 of these were reclassified as negative using a cut-off of S/P ≥ 1.0. Further analysis showed that these 20 sites had a small proportion of samples (18.0%) with S/P values just exceeding the manufacturer's cut-off (x̄ = 0.5), whereas the remaining positive sites (n = 61) had a high proportion of samples (76.3%) with high S/P values (x̄ = 6.6).
Thus, the manufacturer's cut-off (S/P ≥ 0.4) is appropriate for diagnostic applications, but a cut-off of S/P ≥ 1.0 provided the higher specificity required for surveillance. A previously unreported finding of this study was a statistically significant association between unexpected reactors and specific production sites and animal ages or production stages. Although beyond the scope of this study, these data suggest that certain animal husbandry or production practices may be associated with non-specific reactions.
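The trade-off behind the cut-off selection above is a simple calculation: at a given S/P cut-off, diagnostic sensitivity is the fraction of known-positive samples at or above the cut-off, and diagnostic specificity is the fraction of known-negative samples below it. A minimal sketch in Python, using hypothetical S/P values rather than the study's data:

```python
def diagnostic_performance(sp_positive, sp_negative, cutoff):
    """Return (sensitivity, specificity) for an S/P cut-off, given S/P
    ratios from samples of known-positive and known-negative status.
    A sample is called antibody positive when its S/P ratio >= cutoff."""
    tp = sum(sp >= cutoff for sp in sp_positive)   # true positives
    tn = sum(sp < cutoff for sp in sp_negative)    # true negatives
    sensitivity = tp / len(sp_positive)
    specificity = tn / len(sp_negative)
    return sensitivity, specificity

# Hypothetical S/P values for illustration only (not from the study):
known_positive = [1.5, 2.0, 0.8, 0.3]
known_negative = [0.1, 0.5, 0.2, 0.9]

sens_04, spec_04 = diagnostic_performance(known_positive, known_negative, 0.4)
sens_10, spec_10 = diagnostic_performance(known_positive, known_negative, 1.0)
```

Raising the cut-off from 0.4 to 1.0 trades sensitivity for specificity, which is the direction a surveillance application favors.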
This study aimed to utilize the only known pilot feed mill facility approved for pathogenic feed agent use in the United States to evaluate the effect of manufacturing Porcine Epidemic Diarrhea Virus (PEDV)-contaminated feed on subsequent feed mill environmental surface contamination. In this study, PEDV-inoculated feed was manufactured and conveyed on equipment along with four subsequent batches of PEDV-free feed. Equipment and environmental surfaces were sampled using swabs and analyzed for the presence of PEDV RNA by PCR. The experiment was replicated three times with decontamination of the feed mill and all equipment between replications. Overall, environmental swabs indicated widespread surface contamination of the equipment and work area after a PEDV-contaminated batch of feed was processed. There was little difference in environmental sample cycle threshold (Ct) values after manufacturing each of the subsequent PEDV-negative feed batches. In summary, introduction of PEDV-infected feed into a feed mill will likely result in widespread contamination of equipment and surfaces, even after several batches of PEDV-free feed are produced. Eliminating the PEDV RNA from the feed mill environment was challenging and required procedures that are not practical to apply on a regular basis in a feed mill. These data suggest that it is extremely important to prevent the introduction of PEDV-contaminated feed, ingredients, or other vectors of transmission to minimize PEDV risk. More research should be conducted to determine if contaminated surfaces can lead to PEDV infectivity and to determine the best feed mill PEDV-decontamination strategies.
Research has confirmed that chemical treatments, such as medium chain fatty acids (MCFA) and commercial formaldehyde, can be effective at reducing the risk of porcine epidemic diarrhea virus (PEDV) cross-contamination in feed. However, the efficacy of MCFA levels below 2% inclusion is unknown. The objective of this experiment was to evaluate whether a 1% inclusion of MCFA is as effective at PEDV mitigation as a 2% inclusion or formaldehyde in swine feed and spray-dried animal plasma (SDAP). Treatments were arranged in a 4 × 2 × 7 plus 2 factorial with 4 chemical treatments: 1) PEDV positive with no chemical treatment, 2) 0.325% commercial formaldehyde, 3) 1% MCFA, and 4) 2% MCFA. The 2 matrices were: 1) complete swine diet and 2) SDAP; with 7 analysis days: 0, 1, 3, 7, 14, 21, and 42 post inoculation; and 1 treatment each of PEDV-negative untreated feed and plasma. Matrices were first chemically treated, then inoculated with PEDV, and stored at room temperature until being analyzed by RT-qPCR. The analyzed values represent threshold cycle (CT), where a higher CT value represents less detectable RNA. All main effects and interactions were significant (P < 0.009). Feed treated with MCFA, regardless of inclusion level, had fewer (P < 0.05) detectable viral particles than feed treated with formaldehyde. However, the SDAP treated with either 1% or 2% MCFA had concentrations of detectable PEDV RNA similar (P > 0.05) to the untreated SDAP, while the SDAP treated with formaldehyde had fewer detectable viral particles (P < 0.05). The complete feed had a lower (P < 0.05) quantity of PEDV RNA than SDAP (CT of 39.5 vs. 35.0 for feed vs. SDAP, respectively). Analysis day also decreased (P < 0.05) the quantity of detectable viral particles from d 0 to d 42 (CT of 33.2 vs. 44.0, respectively). In summary, time, formaldehyde, and MCFA all appear to enhance RNA degradation of PEDV in swine feed and ingredients; however, their effectiveness varies by matrix.
The 1% inclusion level of MCFA was as effective as 2% in complete feed, but neither was effective at reducing the magnitude of PEDV RNA in SDAP.
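Because CT values are logarithmic, a difference in CT between two samples implies a multiplicative difference in detectable RNA. A minimal sketch of that conversion, assuming roughly 100% amplification efficiency (template doubling each cycle); this approximation is an illustration, not part of the study's analysis:

```python
def fold_less_rna(ct_a, ct_b):
    """Approximate fold difference in starting template implied by two CT
    values, assuming template doubles each PCR cycle (~100% efficiency).
    When ct_a > ct_b, sample A has roughly 2**(ct_a - ct_b) fold LESS RNA."""
    return 2 ** (ct_a - ct_b)

# Using the reported means for illustration: feed CT 39.5 vs. SDAP CT 35.0
feed_vs_sdap = fold_less_rna(39.5, 35.0)   # ~23-fold less RNA in feed
# d 42 (CT 44.0) vs. d 0 (CT 33.2)
day42_vs_day0 = fold_less_rna(44.0, 33.2)  # large decline over storage
```

Under this doubling assumption, the 4.5-cycle gap between feed and SDAP corresponds to roughly a 20-fold difference in detectable RNA, which illustrates why matrix effects matter even when CT differences look numerically small.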
All sectors of livestock production are in the process of shifting from small populations on many farms to large populations on fewer farms. A concurrent shift has occurred in the number of livestock moved across political boundaries. The unintended consequence of these changes has been the appearance of multifactorial diseases that are resistant to traditional methods of prevention and control. The need to understand complex animal health conditions mandates a shift toward the collection of longitudinal animal health data. Historically, collection of such data has frustrated and challenged animal health specialists. A promising trend in the evolution toward more efficient and effective livestock disease surveillance is the increased use of aggregate samples, e.g. bulk tank milk and oral fluid specimens. These sample types provide the means to monitor disease, estimate herd prevalence, and evaluate spatiotemporal trends in disease distribution. Thus, this article provides an overview of the use of bulk tank milk and pen-based oral fluids in the surveillance of livestock populations for infectious diseases.
The objective of this study was to evaluate targeted maternal weight gains in sows by parity group during gestation. Weight and backfat gains during gestation by parity, weight, and backfat groups were also analyzed. The data evaluated were a subset (374 sows) of a larger experiment that compared three methods of feeding sows during gestation on weight and backfat gains and subsequent reproductive performance. Feed allowances were based on modeled calculations of energy and nutrient requirements to achieve target sow maternal weight and backfat gains. Actual backfat gain for gilts and sows was regressed on maternal weight gain and estimated energy available for gain. The regression equations were then used to predict maternal weight gains for target backfat gains for three parity groups (gilts, Parity 1 and 2 sows, and Parity 3 and older sows). For gilts and Parity 1 and 2 sows, much greater target maternal weight gains are required to achieve 6 and 9 mm of backfat gain, whereas Parity 3 and older sows require maternal weight gains similar to those targeted to achieve the desired backfat gain. Given similar energy intake levels above maintenance, gilts gained more weight than multiparous sows, as gain was based more on protein and less on fat and thus was more efficient. Gilts required more maternal weight gain than sows to achieve similar backfat gains due to the higher protein and lower fat content of gain in younger, lighter sows compared with older-parity sows. Low-backfat sows that needed to gain large amounts of backfat failed to achieve these large gains. We speculate that this failure may be due to lower tissue insulation with low backfat levels and higher activity levels of these sows compared with high-backfat sows. It seems that both parity and weight are individually important factors influencing energy and nutrient requirements for gestation in the modern sow.
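The prediction step described above, regressing backfat gain on maternal weight gain and then inverting the fitted equation to find the weight gain needed for a target backfat gain, can be sketched with a single-predictor least-squares fit. The data and coefficients below are synthetic illustrations, not the study's equations (which also included energy available for gain as a predictor):

```python
def fit_ols(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def weight_gain_for_target_backfat(a, b, target_backfat):
    """Invert backfat_gain = a + b * weight_gain to predict the maternal
    weight gain (kg) needed to reach a target backfat gain (mm)."""
    return (target_backfat - a) / b

# Synthetic example: weight gains (kg) and observed backfat gains (mm)
weight_gain = [20.0, 30.0, 40.0, 50.0]
backfat_gain = [3.0, 4.0, 5.0, 6.0]

a, b = fit_ols(weight_gain, backfat_gain)
needed = weight_gain_for_target_backfat(a, b, 6.0)  # weight gain for 6 mm target
```

In the study's terms, a smaller slope b for gilts (less backfat deposited per kg of maternal gain, because their gain is richer in protein) yields a larger predicted weight gain for the same backfat target.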