Pathogens have the potential to maintain genetic polymorphisms by creating frequency-dependent selection on their host. This can occur when a rare host genotype is less likely to be attacked by a pathogen (frequency-dependent disease attack) and has higher fitness at low frequency (negative frequency-dependent selection). In this study, we used wheat genotypes that were susceptible to different races of the pathogen Puccinia striiformis to test whether disease created frequency-dependent selection on its host and whether such selection could maintain polymorphisms for resistance genes in the wheat populations. Four different two-way mixtures of wheat genotypes were planted at different frequencies in both the presence and absence of disease. Disease created frequency-dependent selection on its host in some populations. Unknown factors other than disease also created frequency-dependent selection in this system because, in some instances, rare-genotype advantage was observed in the absence of disease. Although the pathogen created frequency-dependent selection on its host, this selection was not sufficient to maintain genetic polymorphism in the host populations. In all cases where frequency-dependent selection occurred only in the diseased plots, one of the two genotypes was predicted to dominate the population, and the same genotype was predicted to dominate in both the presence and absence of disease. Only in cases where frequency-dependent selection was not caused by disease was there evidence that genetic polymorphisms would be maintained in the population. The frequency-dependent selection described in this study is a consequence of the epidemiological effects of disease and differs from the time-lagged frequency-dependent selection resulting from coevolution between hosts and parasites. The impact of this direct frequency-dependent selection on the maintenance of genetic polymorphisms in the host population is discussed.
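A minimal sketch (not the study's actual fitness model; all parameter values are hypothetical) can illustrate the key point above: rare-genotype advantage alone does not guarantee a maintained polymorphism. If each genotype's fitness declines linearly with its own frequency, a protected polymorphism requires the two fitness curves to cross; if one genotype is fitter at every frequency, it fixes despite the frequency dependence.

```python
# Minimal two-genotype model with negative frequency-dependent selection.
# Parameter values are hypothetical illustrations, not estimates from the study.

def next_freq(p, wA0, wB0, s=0.3):
    """One generation of selection; each genotype's fitness declines
    linearly with its own frequency (rare-genotype advantage)."""
    wA = wA0 - s * p          # fitness of A at its frequency p
    wB = wB0 - s * (1 - p)    # fitness of B at its frequency 1 - p
    return p * wA / (p * wA + (1 - p) * wB)

def equilibrium(wA0, wB0, p=0.5, gens=500):
    for _ in range(gens):
        p = next_freq(p, wA0, wB0)
    return p

# Fitness curves cross: each genotype is fitter when rare -> polymorphism.
print(f"crossing curves: p_A -> {equilibrium(1.0, 0.8):.3f}")
# A is fitter at *every* frequency: A fixes despite frequency dependence,
# mirroring the finding that disease-driven selection did not maintain
# the polymorphism.
print(f"A always fitter: p_A -> {equilibrium(1.0, 0.6):.3f}")
```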
1. Effects of disease and environment on competitive interactions among wheat genotypes were investigated. Five wheat genotypes were grown in up to six different two-way combinations and as pure stands in two or three locations during one to three growing seasons, in the presence and absence of wheat stripe rust (caused by Puccinia striiformis). 2. Overall yield of the mixtures relative to the means of the monocultures did not differ among locations and years. However, interactions between genotypes were often affected by location and, to a lesser degree, by year. Disease significantly affected seed weight and seed number of the two susceptible genotypes in pure stands and in mixtures. Disease also led to changes in competitive interactions between resistant and susceptible genotypes. 3. Competitive interactions among genotypes often changed from early in the season (as measured by the number of tillers) to late in the season (as measured by yield per tiller). In a few mixtures, negative correlations between early competitive ability (relative number of tillers) and components of late competition suggested that intra-genotypic competition might have been stronger than inter-genotypic competition.
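For readers unfamiliar with the mixture metric used above, the yield of a mixture relative to the mean of its component monocultures is a simple ratio; the sketch below (with invented yields, not the study's data) shows the computation.

```python
# Relative yield of a two-way mixture: mixture yield divided by the mean of
# the component pure stands. Values below are invented for illustration.
yield_mixture = 5.2                  # t/ha, genotypes A + B grown together
yield_pure = {"A": 5.0, "B": 4.6}    # t/ha in monoculture

relative_yield = yield_mixture / (sum(yield_pure.values()) / len(yield_pure))
print(f"relative yield: {relative_yield:.2f}")
# A value > 1 means the mixture outyielded the monoculture mean, which is
# possible when intra-genotypic competition is stronger than inter-genotypic
# competition.
```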
Total ring depopulation is sometimes used as a management strategy for emerging infectious diseases in livestock, which raises ethical concerns regarding the potential slaughter of large numbers of healthy animals. We evaluated a farm-density-based ring culling strategy to control foot-and-mouth disease (FMD) in the United Kingdom (UK), which may allow some farms within rings around infected premises (IPs) to escape depopulation. We simulated this reduced farm density, or "target density", strategy using a spatially explicit, stochastic, state-transition algorithm. We modeled FMD spread in four counties in the UK that have different farm demographics, using 740,000 simulations in a full-factorial analysis of epidemic impact measures (i.e., culled animals, culled farms, and epidemic length) and cull strategy parameters (i.e., target farm density, daily farm cull capacity, and cull radius). All of the cull strategy parameters listed above were drivers of epidemic impact. Our simulated target density strategy was usually more effective at combating FMD than traditional total ring depopulation when considering mean culled animals and culled farms, and it was especially effective when daily farm cull capacity was low. The differences in epidemic impact measures among the counties are likely driven by farm demography, especially differences in cattle and farm density. To prevent over-culling and the associated economic, organizational, ethical, and psychological impacts, the target density strategy may be worth considering in decision-making processes for future control of FMD and other diseases.
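A minimal sketch of how a target-density ring cull might select farms, based on our reading of the strategy rather than the authors' algorithm (farm coordinates, the random candidate selection, and all parameter values are hypothetical):

```python
import math
import random

def target_density_cull(farms, ip, radius_km, target_density):
    """Cull farms inside a ring around an infected premises (ip) only until
    the within-ring farm density falls to target_density (farms/km^2).
    Farms are (x, y) points in km. Candidates are chosen at random here;
    a real policy would presumably use risk-based prioritization."""
    in_ring = [f for f in farms if math.dist(f, ip) <= radius_km and f != ip]
    ring_area = math.pi * radius_km ** 2
    max_remaining = int(target_density * ring_area)
    n_to_cull = max(0, len(in_ring) - max_remaining)
    return random.sample(in_ring, n_to_cull)

random.seed(1)
farms = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(400)]
culled = target_density_cull(farms, ip=farms[0], radius_km=3.0,
                             target_density=0.5)
total_ring = sum(math.dist(f, farms[0]) <= 3.0 for f in farms) - 1
print(f"target-density cull: {len(culled)} farms "
      f"(total ring depopulation would cull {total_ring})")
```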
Including food production in non-food systems, such as rubber plantations and biofuel or bioenergy crops, may contribute to household food security. We evaluated the potential for planting rice, mungbean, rice cultivar mixtures, and rice intercropped with mungbean in young rubber plantations in experiments in the Arakan Valley of Mindanao in the Philippines. Rice mixtures consisted of two- or three-row strips of cultivar Dinorado, a cultivar with higher value but lower yield, and high-yielding cultivar UPL Ri-5. Rice and mungbean intercropping treatments consisted of different combinations of two- or three-row strips of rice and mungbean. We used generalized linear mixed models to evaluate the yield of each crop alone and in the mixture or intercropping treatments. We also evaluated a land equivalent ratio for yield, along with weed biomass (where Ageratum conyzoides was particularly abundant), the severity of disease caused by Magnaporthe oryzae and Cochliobolus miyabeanus, and rice bug (Leptocorisa acuta) abundance. We analyzed the yield ranking of each cropping system across site-year combinations to determine mean relative performance and yield stability. When weighted by their relative economic value, UPL Ri-5 had the highest mean performance, but with decreasing performance in low-yielding environments. A rice and mungbean intercropping system had the second highest performance, tied with high-value Dinorado but without decreasing performance in low-yielding environments. Rice and mungbean intercropped with rubber have been adopted by farmers in the Arakan Valley.
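The land equivalent ratio mentioned above is the sum, over component crops, of intercrop yield divided by sole-crop yield; the short sketch below uses invented yields (not the trial data) to show the computation.

```python
# Land equivalent ratio (LER): the land area under sole cropping needed to
# match the yields obtained from one unit of intercropped land.
# Yields below are invented for illustration.
sole = {"rice": 3.0, "mungbean": 0.9}     # t/ha grown alone
inter = {"rice": 2.1, "mungbean": 0.5}    # t/ha of each crop in the intercrop

ler = sum(inter[crop] / sole[crop] for crop in sole)
print(f"LER = {ler:.2f}")
# LER > 1 indicates a land-use advantage for intercropping: here
# 0.70 + 0.56 = 1.26, i.e., about 26% more land would be needed as
# monocultures to match the intercrop's output.
```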
Epidemic outbreak control often involves a spatially explicit treatment area (quarantine, inoculation, ring cull) which covers the outbreak area and adjacent regions where hosts are thought to be latently infected. This emphasis on space, however, neglects the influence of treatment timing on outbreak control. We conducted field and in silico experiments with wheat stripe rust (WSR), a long-distance dispersed plant disease, to understand how treatment timing and area interact to suppress an outbreak. Full-factorial field experiments with three different ring cull sizes (ranging from the outbreak area alone to a 25-fold increase in treatment area) at three different disease control timings (1.125, 1.25, and 1.5 latent periods after initial disease expression) indicated that earlier treatment timing had a conspicuously greater suppressive effect than the area treated. Computer simulations of disease spread over a broad range of influential epidemic parameter values (R0, outbreak disease prevalence, epidemic duration) suggested that potentially unrealistically large increases in treatment area would be required to compensate for even small delays in treatment timing. Although disease surveillance programs are costly, our results suggest that treatments early in an epidemic disease outbreak require smaller areas to be effective, which may ultimately compensate for the upfront costs of proactive disease surveillance programs.
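A back-of-the-envelope way to see why area compensates poorly for delay (our simplification with a hypothetical multiplication rate, not the field design): each additional latent period of delay multiplies the amount of latent infection a treatment must cover by roughly the pathogen's multiplication rate, and a fat-tailed dispersal kernel scatters some of that infection far beyond any feasible ring.

```python
# Growth-vs-delay arithmetic. R is an assumed multiplication rate per latent
# period, not a value measured in the study.
R = 10
timings = (1.125, 1.25, 1.5)   # treatment timings used in the field study
for delay in timings:
    growth = R ** (delay - timings[0])
    print(f"treat at {delay:5.3f} latent periods -> "
          f"~{growth:4.1f}x more latent infection than at the earliest timing")
# With R = 10, waiting 1.5 instead of 1.125 latent periods means covering
# ~2.4x more latent infection; because a fat-tailed kernel places some of it
# far from the focus, the treated *area* must grow far faster than this.
```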
Epidemics caused by long-distance dispersed pathogens result in some of the most explosive and difficult-to-control diseases of both plants and animals (including humans). Yet the factors influencing disease spread, especially in the early stages of an outbreak, are not well understood. We present scaling relationships, of potentially widespread relevance, that were developed from more than 15 years of field and in silico single-focus studies of wheat stripe rust spread. These relationships emerged as a consequence of accounting for a greater proportion of the fat-tailed disease gradient, which may be frequently underestimated in disease spread studies. Leptokurtic dispersal gradients (highly peaked and fat-tailed) are relatively common in nature, and they can be represented by power law functions. The scale-invariance properties of power laws generate patterns that repeat over multiple spatial scales, suggesting important and predictable scaling relationships between disease levels during the first generation of a disease outbreak and subsequent epidemic spread. Experimental wheat stripe rust outbreaks and disease spread simulations support the theoretical scaling relationships derived from power law properties and suggest that relatively straightforward scaling approximations may be useful for projecting the spread of disease caused by long-distance dispersed pathogens. Our results suggest that, when actual dispersal/disease data are lacking, an inverse power law with exponent = 2 may provide a reasonable approximation for modeling disease spread. Furthermore, our experiments and simulations strongly suggest that early control treatments with small spatial extent are likely to be more effective at suppressing an outbreak caused by a long-distance dispersed pathogen than delayed treatment of a larger area. The scaling relationships we detail, and the associated consequences for disease control, may be broadly applicable to plant and animal pathogens characterized by non-exponentially bounded, fat-tailed dispersal gradients.
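The suggested default can be written as a one-line dispersal kernel. The sketch below is our illustration, not the authors' simulation code; the inner cutoff x_min is an arbitrary choice needed to keep the density finite near zero.

```python
import random

def power_law_distance(x_min=0.1, exponent=2.0):
    """Sample a dispersal distance from an inverse power law
    f(x) ~ x**(-exponent) for x >= x_min, via the inverse-CDF method.
    exponent = 2 is the default approximation suggested when dispersal
    data are lacking; x_min is an arbitrary inner cutoff."""
    u = random.random()
    return x_min * (1 - u) ** (-1.0 / (exponent - 1.0))

random.seed(0)
dists = sorted(power_law_distance() for _ in range(10_000))
print(f"median: {dists[5_000]:.2f}   95th pct: {dists[9_500]:.2f}   "
      f"max: {dists[-1]:.1f}")
# The heavy tail is what makes such epidemics hard to contain: a few
# propagules travel orders of magnitude farther than the median distance,
# seeding new foci well outside any ring-shaped treatment area.
```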
Soft white club wheat (Triticum aestivum ssp. compactum) is a unique component of wheat production in the Pacific Northwest, comprising 10 to 12% of the wheat crop. It is valued for milling and baking functionality and marketed for export in a 10 to 30% blend with soft white wheat known as western white. Our goal was to develop a club wheat cultivar for the traditional club wheat-growing region of central Washington, with better soilborne disease resistance than currently grown cultivars. The bulk pedigree breeding method was used to select Pritchett (Reg. No. CV-1123, PI 678944) from the cross 'Chukar'/2*'Bruehl'. Pritchett has significantly better grain yield and grain volume weight than Bruehl, the cultivar it is targeted to replace, in environments receiving less than 30 cm annual precipitation. Pritchett has better milling quality, producing larger-diameter cookies and sponge cakes of greater volume. Pritchett has effective adult plant resistance to stripe rust, has moderate resistance to Cephalosporium stripe, and carries the Pch1 gene for moderate resistance to eyespot. Pritchett carries the Rht-B1b allele for reduced plant height but has excellent emergence from deep sowing. Pritchett was released because of its superior agronomic productivity in the targeted region, combined with resistance to multiple diseases and superior end-use quality.
The velocity of expansion of focal epidemics was studied using an updated version of the simulation model EPIMUL, with model parameters relevant to wheat stripe rust. The modified power law, the exponential model, and Lambert's general model were fit to primary disease gradient data from an artificially initiated field epidemic of stripe rust and employed to describe dispersal in simulations. The exponential model, which fit the field data poorly (R² = 0.728 to 0.776), yielded an epidemic that expanded as a traveling wave (i.e., at a constant velocity) after an initial buildup period. Both the modified power law and the Lambert model fit the field data well (R² = 0.962 to 0.988) and resulted in dispersive epidemic waves (velocities increased over time for the entire course of the epidemic). The field epidemic also expanded as a dispersive wave. Using parameters based on the field epidemic and modified power law dispersal as a baseline, life cycle components of the pathogen (lesion growth rate, latent period, infectious period, and multiplication rate) and dispersal gradient steepness were varied within biologically reasonable ranges for this disease to test their effect on dispersive wave epidemics. All components but the infectious period had a strong influence on epidemic velocity, but none changed the general pattern of velocity increasing over time.
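For concreteness, all three gradient models can be fit with standard nonlinear least squares. The sketch below uses common functional forms (modified power law y = a(x + c)^-b, exponential y = a·exp(-bx), and Lambert's general model y = a·exp(-b·x^c); check the paper for the exact parameterizations used with EPIMUL) on fabricated gradient data, since the field measurements are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

# Common forms of the three dispersal gradient models (assumed, see lead-in).
def modified_power_law(x, a, b, c):
    return a * (x + c) ** (-b)

def exponential(x, a, b):
    return a * np.exp(-b * x)

def lambert_general(x, a, b, c):
    return a * np.exp(-b * x ** c)

# Fabricated disease-gradient data (distance in m vs. disease severity),
# roughly power-law shaped; NOT the field data from the study.
x = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
y = 50.0 * (x + 0.5) ** -1.8 + np.random.default_rng(0).normal(0, 0.05, x.size)

for name, f, p0 in [("modified power law", modified_power_law, (50, 2, 1)),
                    ("exponential", exponential, (50, 0.5)),
                    ("Lambert general", lambert_general, (50, 1, 0.5))]:
    popt, _ = curve_fit(f, x, y, p0=p0, bounds=(0, np.inf), maxfev=10_000)
    r2 = 1 - np.sum((y - f(x, *popt)) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"{name:20s} R^2 = {r2:.3f}")
# A fat-tailed (power-law-like) gradient is fit poorly by the exponential
# model, which is why exponential dispersal yields constant-velocity rather
# than accelerating (dispersive) epidemic waves in simulation.
```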