Objectives To identify risk factors for culling of dairy cows from eight New South Wales dairy herds. Design A longitudinal population study of dairy cow culling in eight non-seasonally calving dairy herds in the Camden district of New South Wales. Cox's proportional hazards model was used to evaluate risk factors for culling for specific reasons (sales, deaths, reproductive failure, disorders of the udder and low milk production). Results Age at first calving was not a significant risk factor for culling. Milk production in the first lactation greater than the population mean did not influence length of productive life overall, but was associated with a greater hazard of removal for disorders of the udder. The risk of culling for reproductive failure differed significantly between farms and was not related to events in the previous lactation, such as the calving-to-first-service interval or the calving-to-conception interval. Shorter calving intervals were associated with an increased risk of removal for low milk production and disorders of the udder. Conclusion Longitudinal surveys that accurately identify reasons for removal across a wide range of herds, identification of herds with low culling rates (especially for reproductive failure and udder disorders), and identification of the practices associated with those low rates would be of value to the Australian dairy industry.
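As a rough illustration of the type of analysis described above, the sketch below fits a cause-specific Cox proportional hazards model for a single removal reason using the Python lifelines library. The data, column names and removal-reason category are entirely synthetic and are not taken from the study.

```python
# A minimal, illustrative sketch (not the authors' analysis) of a
# cause-specific Cox proportional hazards model for time to culling.
# All data and column names below are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "days_in_herd": rng.exponential(900, n).round(),      # time to removal or censoring
    "culled_for_udder_disorder": rng.integers(0, 2, n),   # 1 = removed for this reason, 0 = censored
    "age_at_first_calving_months": rng.normal(26, 2, n),
    "milk_above_herd_mean": rng.integers(0, 2, n),         # first-lactation yield above population mean
    "calving_interval_days": rng.normal(400, 40, n),
})

# Cows removed for any other reason are treated as censored for this cause.
cph = CoxPHFitter()
cph.fit(df, duration_col="days_in_herd", event_col="culled_for_udder_disorder")
cph.print_summary()   # hazard ratio, confidence interval and p-value per covariate
```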
Formal decision-analytic methods can be used to frame disease control problems, and the first step is to define a clear and specific objective. We demonstrate why clearly defined management objectives are essential when seeking optimal actions to control disease outbreaks. We illustrate an analysis that can be applied rapidly at the start of an outbreak, when multiple stakeholders may hold multiple objectives and when several disease models are available for comparing control actions. The output of our analysis frames subsequent discourse between policy-makers, modellers and other stakeholders by highlighting areas of discord among management objectives and among the models used in the analysis. We illustrate the approach in the context of a hypothetical foot-and-mouth disease (FMD) outbreak in Cumbria, UK, using outputs from five rigorously studied simulation models of FMD spread. We present both the relative rankings and the relative performance of control actions within each model and across a range of objectives. The results show how the preferred control action changes with both the base metric used to measure management success and the statistic used to rank control actions on that metric. This work represents a first step towards reconciling the extensive modelling work on disease control problems with frameworks for structured decision making.
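To make the ranking idea concrete, the following sketch uses synthetic numbers only (not outputs of the five FMD models) to rank a few hypothetical control actions within each of two models, once by the mean of a simulated outcome and once by its 95th percentile, showing how the preferred action can depend on the ranking statistic.

```python
# Illustrative sketch (not the published analysis): ranking candidate control
# actions across models and ranking statistics. All numbers are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
actions = ["IP culling only", "ring culling", "ring vaccination"]
models = ["model_A", "model_B"]

rows = []
for m in models:
    for a in actions:
        sims = rng.gamma(shape=3.0, scale=30.0, size=200)  # e.g. simulated epidemic duration (days)
        rows.append({"model": m, "action": a,
                     "mean": sims.mean(),
                     "p95": np.percentile(sims, 95)})
outcomes = pd.DataFrame(rows)

# Rank actions within each model, once by the mean and once by the 95th
# percentile: the "best" action can differ depending on the statistic chosen.
for stat in ["mean", "p95"]:
    ranks = outcomes.pivot(index="action", columns="model", values=stat).rank()
    print(f"\nRanking by {stat}:\n{ranks}")
```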
The number of horses leaving the Australian Thoroughbred (TB) racing industry each year is of concern to animal welfare advocates, the public and regulators. A horse's previous athletic performance is a significant driver of retirement from racing. Racehorse performance can be measured in terms of the total number of starts, duration of racing and prize money earned. This study investigated Australian racing records for the 2005 and 2010 Victorian TB foal crops to identify factors associated with total number of starts, racing career duration, prize money earned and age of last race start, up to the 10-year-old racing season. Racing Australia registered 4,577 TB horses born in Victoria in 2005 (n = 2,506) and 2010 (n = 2,071) that raced in Australia. Horses that started racing at 2 years of age had fewer race starts in their first racing season but a greater total number of starts, more prize money and a longer duration of racing. The median age of last start (LS) was five (Q1 4; Q3 7) years. Horses that had won a race, had a maximum handicap rating of 61 or above, or raced over distances of more than 2,400 m had a longer racing career and an age of last race start greater than 6 years. Horses participating in jumps races (n = 63) had the longest careers and the oldest age of LS. These horses were more likely to have had a handicap rating over 80 and were just as likely to have started their racing careers as 2-year-olds.
Objective To identify associations between the occurrence of sacrocaudal fusion and the morphology of selected hind limb bones in actively racing greyhounds. Methods The calcaneus, talus and patella from each hind limb were collected from 94 male and 77 female mature greyhound cadavers and allocated to four groups: right or left bones from greyhounds with a standard or a fused sacrum. The following parameters were recorded: body mass of the greyhound and the mass, length and width of the right and left calcanei, tali and patellae. Results A fused sacrum (4 sacral vertebrae) was present in 41% of specimens. The right and left calcanei, tali and patellae in greyhounds with a standard or fused sacrum were anatomically similar. Overall, left-to-right asymmetry was found in the width of the calcaneus (P < 0.01) and the talus (P < 0.05) and in the length of the calcaneus (P < 0.001), all of these being larger in bones from the left hind limbs. Comparing bones from dogs with a fused or standard sacrum showed that the right calcaneus length was significantly less than the left (P < 0.05) in greyhounds with a standard sacrum, and the right calcaneus width was significantly less than the left (P < 0.01) in those with a fused sacrum. There were no significant differences in the means of bone measurements between greyhounds with a standard sacrum and those with a fused sacrum, except for the mass of the right (95% CI 0.22 to 1.10, P < 0.01) and left (95% CI 0.18 to 1.04, P < 0.01) calcaneus, which were heavier in greyhounds with a fused sacrum than in those with a standard sacrum. Conclusion In a population of greyhounds that race on anticlockwise tracks, the left calcaneus was wider and longer than the right and the left talus was wider. This asymmetry was more pronounced in dogs with sacrocaudal fusion, and those dogs had heavier calcanei than dogs with a standard sacrum, suggesting a difference in the way these bones were loaded in dogs with sacrocaudal fusion compared with dogs with standard sacral anatomy.
Contents This was an observational study of 828 lactations in 542 mixed-age dairy cows that calved seasonally in a single, pasture-fed herd in New Zealand in 2008 and 2009. The study objectives were to: (i) document daily liveweight change (ΔLW) before and after observed oestrus for cows subsequently diagnosed pregnant or non-pregnant and (ii) quantify the sensitivity and specificity of ΔLW as a test for oestrus. The sensitivity and specificity of ΔLW when combined with other commonly used oestrous detection methods were also evaluated. In cows that conceived as a result of service at a detected oestrus, liveweight loss began 1 day before the day of detection and was greatest on the day of detection (−9.6 kg, 95% CI −11.3 kg to −7.8 kg; p < 0.01) compared with LW recorded 2 days before the day of detection. In cows that did not conceive as a result of service at a detected oestrus, the lowest liveweights were recorded 1 day before the day oestrus was detected (−4.3 kg, 95% CI −7.7 to −0.8 kg; p = 0.02) compared with LW recorded 4 days before the day of detection. The sensitivity and specificity of ΔLW as a means of oestrous detection were 0.42 (95% CI 0.40–0.45) and 0.96 (95% CI 0.95–0.97), respectively. When ΔLW was combined with tail paint and visual observation, oestrous detection sensitivity and specificity were 0.86 and 0.94, respectively. Monitoring LW change therefore holds promise for enhancing the sensitivity and specificity of oestrous detection when used in combination with other detection methods.
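The sketch below illustrates, with entirely synthetic data and a hypothetical liveweight-loss threshold, how sensitivity and specificity could be computed for ΔLW alone and for ΔLW interpreted in parallel with another detection method such as tail paint (the cow is called positive if either method flags her).

```python
# A minimal sketch of test sensitivity and specificity for a liveweight-change
# (delta LW) oestrus test, alone and combined in parallel with tail paint.
# Data, prevalence and the -8 kg threshold are hypothetical, not from the study.
import numpy as np

def sens_spec(test_positive, truly_in_oestrus):
    tp = np.sum(test_positive & truly_in_oestrus)
    fn = np.sum(~test_positive & truly_in_oestrus)
    tn = np.sum(~test_positive & ~truly_in_oestrus)
    fp = np.sum(test_positive & ~truly_in_oestrus)
    return tp / (tp + fn), tn / (tn + fp)

rng = np.random.default_rng(2)
n = 1000
in_oestrus = rng.random(n) < 0.05                              # true oestrus status
delta_lw = np.where(in_oestrus, rng.normal(-9.6, 5, n),        # cows in oestrus lose weight
                    rng.normal(0, 5, n))                       # others fluctuate around zero
lw_test = delta_lw < -8.0                                      # hypothetical threshold (kg)
tail_paint = (in_oestrus & (rng.random(n) < 0.8)) | (~in_oestrus & (rng.random(n) < 0.03))

print("delta LW alone:        sens=%.2f spec=%.2f" % sens_spec(lw_test, in_oestrus))
print("delta LW + tail paint: sens=%.2f spec=%.2f" % sens_spec(lw_test | tail_paint, in_oestrus))
```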
Researchers from Australia, New Zealand, Canada and the United States collaborated to validate their foot-and-mouth disease models (AusSpread, InterSpread Plus and the North American Animal Disease Spread Model) in an effort to build confidence in their use as decision-support tools. The final stage of this project involved using the three models to simulate a number of disease outbreak scenarios with data from the Republic of Ireland. The scenarios included an uncontrolled epidemic and epidemics managed by combinations of stamping out and vaccination. The predicted numbers of infected premises, the duration of each epidemic and the size of the predicted outbreak areas were compared. Relative within-model, between-scenario changes resulting from different control strategies or resource constraints were quantified and compared. Although the models differed in absolute outcomes, between-scenario comparisons within each model were similar. In all three models, early use of ring vaccination produced the largest reduction in the number of infected premises compared with the standard stamping-out regimen. This consistency implies that the assumptions made by each of the three modelling teams were appropriate, which in turn increases end-user confidence in predictions made by these models.
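A toy calculation of the relative within-model, between-scenario comparison is sketched below; the model names come from the abstract, but every number is invented for illustration only.

```python
# Illustrative only: express each scenario's predicted number of infected
# premises relative to the uncontrolled baseline within the same model,
# so that models with different absolute outputs can still be compared.
# The figures below are invented, not model outputs.
import pandas as pd

infected_premises = pd.DataFrame(
    {"AusSpread": [420, 180, 95],
     "InterSpreadPlus": [510, 230, 120],
     "NorthAmericanADSM": [390, 160, 90]},
    index=["uncontrolled", "stamping_out", "stamping_out_plus_ring_vaccination"],
)

relative = infected_premises / infected_premises.loc["uncontrolled"]
print(relative.round(2))   # within-model, between-scenario changes, comparable across models
```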
Few large studies exist on the outcome of patients treated for stage I/II mucosa-associated lymphoid tissue (MALT) lymphoma. We retrospectively reviewed the records of 77 patients consecutively treated for stage I (n = 66) or II (n = 11) MALT lymphoma at our institution. Progression-free survival (PFS), freedom from treatment failure (FFTF), and overall survival (OS) were calculated using the Kaplan-Meier method. The median follow-up time was 61 months (range 2-177 months). Fifty-two patients (68%) received local radiation therapy (RT) alone, 17 (22%) had surgery followed by adjuvant RT, five (6%) had surgery alone, two (3%) had surgery and chemotherapy, and one patient had chemotherapy alone. The median RT dose was 30 Gy (range 18-40 Gy). The 5-year PFS, FFTF, and OS rates were 76%, 78%, and 91%, respectively. The 5-year PFS (79% versus 50%; P = 0.002) and FFTF (81% versus 50%; P = 0.0004) rates were higher for patients who received RT as compared with patients who did not. The prognosis following treatment of stage I/II MALT lymphoma is excellent. RT improves PFS and FFTF and has an important role in the curative treatment of patients with localized disease.
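For readers unfamiliar with the method, the sketch below shows a Kaplan-Meier estimate of 5-year progression-free survival using the Python lifelines library; the follow-up times and event indicators are simulated, not the patient data from this study.

```python
# Illustrative sketch (synthetic data, not patient records) of estimating
# 5-year progression-free survival with the Kaplan-Meier method.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 77
followup_months = rng.exponential(120, n).clip(max=177)  # time to progression or censoring
progressed = rng.random(n) < 0.25                        # True = progression observed

kmf = KaplanMeierFitter()
kmf.fit(durations=followup_months, event_observed=progressed)
print(kmf.survival_function_at_times(60))  # estimated PFS at 5 years (60 months)
```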