Phosphorus (P) loss from agricultural fields and watersheds has been an important water quality issue for decades because of the critical role P plays in eutrophication. Historically, most research has focused on P losses by surface runoff and erosion because subsurface P losses were often deemed to be negligible. Perceptions of subsurface P transport, however, have evolved, and considerable work has been conducted to better understand the magnitude and importance of subsurface P transport and to identify practices and treatments that decrease subsurface P loads to surface waters. The objectives of this paper were (i) to critically review research on P transport in subsurface drainage, (ii) to determine factors that control P losses, and (iii) to identify gaps in the current scientific understanding of the role of subsurface drainage in P transport. Factors that affect subsurface P transport are discussed within the framework of intensively drained agricultural settings. These factors include soil characteristics (e.g., preferential flow, P sorption capacity, and redox conditions), drainage design (e.g., tile spacing, tile depth, and the installation of surface inlets), prevailing conditions and management (e.g., soil-test P levels, tillage, cropping system, and the source, rate, placement, and timing of P application), and hydrologic and climatic variables (e.g., baseflow, event flow, and seasonal differences). Structural, treatment, and management approaches to mitigate subsurface P transport are also discussed, including practices that disconnect flow pathways between surface soils and tile drains, drainage water management, in-stream or end-of-tile treatments, and ditch design and management. The review concludes by identifying gaps in the current understanding of P transport in subsurface drains and suggesting areas where future research is needed.
ABSTRACT Concern over eutrophication has directed attention to the effects of manure management on phosphorus (P) loss in runoff. This study evaluates the effects of manure application rate and type on runoff P concentrations from two acidic agricultural soils over successive runoff events. Soils were packed into 100- × 20- × 5-cm runoff boxes and broadcast with three manures (dairy, Bos taurus; layer poultry, Gallus gallus; swine, Sus scrofa) at six rates, from 0 to 150 kg total phosphorus (TP) ha⁻¹. Simulated rainfall (70 mm h⁻¹) was applied until 30 min of runoff was collected 3, 10, and 24 d after manure application. Application rate was related to runoff P (r² = 0.50–0.98), due to increased concentrations of dissolved reactive phosphorus (DRP) in runoff; as application rate increased, so did the contribution of DRP to runoff TP. Varied concentrations of water-extractable phosphorus (WEP) in the manures (2–8 g WEP kg⁻¹) resulted in significantly lower DRP concentrations in runoff from dairy manure treatments (0.4–2.2 mg DRP L⁻¹) than from poultry (0.3–32.5 mg DRP L⁻¹) and swine manure treatments (0.3–22.7 mg DRP L⁻¹). Differences in runoff DRP concentrations related to manure type and application rate were diminished by repeated rainfall events, probably as a result of manure P translocation into the soil and removal of applied P by runoff. Differential erosion of broadcast manure caused significant differences in runoff TP concentrations between soils. Results highlight the important, but transient, role of soluble P in manure on runoff P and point to the interactive effects of management and soils on runoff P losses.
Phosphorus (P) originating from agriculture has long been recognized as a surface water pollutant. Best management practices (BMPs) designed to prevent P loss must be site-specific to be effective for water quality protection. This chapter provides an overview of BMPs that are well established as tools for controlling point and nonpoint source P pollution from agriculture. BMPs are viewed primarily as practices that are implemented, sometimes involuntarily, to reduce nonpoint source pollution. Successful BMP adoption should result in cost-effective reductions in nutrient losses and prevent nutrient buildup in soils to values of environmental concern. The guiding principles for agricultural P management remain the same today as in the past: provide adequate P for economically optimum crop yields and prevent movement of P from land to water.
Phosphorus (P) loss from agricultural watersheds is generally greater in storm flow than in base flow. Although fundamental to P-based risk assessment tools, few studies have quantified the effect of storm size on P loss. Thus, the loss of P as a function of flow type (base and storm flow) and storm size was quantified for a mixed land use watershed (FD-36; 39.5 ha) from 1997 to 2006. Storm size was ranked by return period (<1, 1–3, 3–5, 5–10, and >10 yr), where increasing return period represents storms with greater peak and total flow. From 1997 to 2006, storm flow accounted for 32% of watershed discharge yet contributed 65% of dissolved reactive P (DP) (107 g ha⁻¹ yr⁻¹) and 80% of total P (TP) exported (515 g ha⁻¹ yr⁻¹). Of 248 storm flows during this period, 93% had a return period of <1 yr, contributing most of the 10-yr flow (6507 m³ ha⁻¹; 63%) and export of DP (574 g ha⁻¹; 54%) and TP (2423 g ha⁻¹; 47%). Two 10-yr storms contributed 23% of P exported between 1997 and 2006. A significant increase in storm flow DP concentration with storm size (0.09–0.16 mg L⁻¹) suggests that P release from soil and/or the area of the watershed producing runoff increases with storm size. Thus, implementation of P-based best management practices needs to consider what level of risk management is acceptable.
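The export shares reported above can be cross-checked with simple arithmetic. The sketch below uses only the figures given in the abstract (annual DP and TP export rates over the 10-yr record, and the cumulative exports attributed to <1-yr return period storms); variable names are ours.

```python
# Cross-check of flow-weighted export shares for watershed FD-36 (1997-2006).
# All input values are taken from the abstract.

annual_dp = 107   # g ha-1 yr-1, mean annual dissolved reactive P export
annual_tp = 515   # g ha-1 yr-1, mean annual total P export
years = 10

total_dp = annual_dp * years   # cumulative DP export over the decade
total_tp = annual_tp * years   # cumulative TP export over the decade

# Exports attributed to the 93% of storms with <1-yr return period
small_storm_dp = 574    # g ha-1
small_storm_tp = 2423   # g ha-1

dp_share = small_storm_dp / total_dp
tp_share = small_storm_tp / total_tp

print(f"DP share from <1-yr storms: {dp_share:.0%}")  # 54%
print(f"TP share from <1-yr storms: {tp_share:.0%}")  # 47%
```

Both ratios reproduce the percentages reported in the abstract, confirming that the stated annual rates and cumulative storm exports are internally consistent.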
Abstract A legacy of using P fertilizers on grazed pastures has been enhanced soil fertility and an associated increased risk of P loss in runoff. Rainfall simulation has been extensively used to develop relationships between soil test P (STP) and dissolved P (DP) in runoff as part of modeling efforts scrutinizing the impact of legacy P. This review examines the applicability of rainfall simulation to draw inferences related to legacy P. Using available literature, we propose a mixing layer model with chemical transfer to describe DP mobilization from pasture soils where readily available P in the mixing layer is rapidly exhausted and contact time controls DP concentrations responsible for subsequent DP mobilization. That conceptual model was shown to be consistent with field monitoring data and then used to assess the likely effect of rainfall simulation protocols on DP mobilization, highlighting the influence of soil preparation, scale and measurement duration, and, most important, hydrology that can facilitate the physical transport of P into and out of surface flow. We conclude that rainfall simulation experimental protocols can have severe limitations for developing relationships between DP in runoff and STP that are subsequently used to estimate legacy P contributions to downstream water resources.
Concern over the enrichment of agricultural runoff with phosphorus (P) from land applied livestock manures has prompted the development of manure amendments that minimize P solubility. In this study, we amended poultry, dairy, and swine manures with two rare earth chlorides, lanthanum chloride (LaCl₃·7H₂O) and ytterbium chloride (YbCl₃·6H₂O), to evaluate their effects on P solubility in the manure following incubation in the laboratory as well as on the fate of P and rare earth elements (REEs) when manures were surface-applied to packed soil boxes and subjected to simulated rainfall. In terms of manure P solubility, La:water-extractable P (WEP) ratios close to 1:1 resulted in maximum WEP reduction of 95% in dairy manure and 98% in dry poultry litter. Results from the runoff study showed that REE applications to dry manures such as poultry litter were less effective in reducing dissolved reactive phosphorus (DRP) in runoff than in liquid manures and slurries, which was likely due to mixing limitations. The most effective reductions of DRP in runoff by REEs were observed in the alkaline pH soil, although reductions of DRP in runoff from the acidic soil were still >50%. Particulate REEs were strongly associated with particulate P in runoff, suggesting a potentially useful role in tracking the fate of P and other manure constituents from manure-amended soils. Finally, REEs that remained in soil following runoff had a tendency to precipitate WEP, especially in soils receiving manure amendments. The findings have valuable applications in water quality protection and the evaluation of P site assessment indices.
Abstract Excessive phosphorus (P) concentrations can lead to conditions that limit the amenity of freshwater resources. This problem is particularly acute in agricultural catchments, where P fertilizer and manure amendments have been used to increase soil fertility and productivity. In these catchments, P indices are often used to help target critical source areas in order to reduce P exports. However, the overall impact of agricultural mitigation efforts on receiving waters has not always been consistent with declines in total P exports from catchments. In this paper we propose a model of dissolved P mobilization (i.e., entrainment) in surface runoff that accounts for this outcome and examine modifications to P indices that better accommodate dissolved P mobilization. We suggest that dissolved P mobilization commences near the soil surface and has two phases. When water is first applied, labile P is mostly mobilized by dissolution and advection. Subsequently, as the supply of readily accessible P is exhausted, diffusion and hydrodynamic dispersion mobilize P from other sources at a near constant rate for the remainder of the event. As most P exports occur in larger (i.e., longer) events, the second phase appears responsible for most dissolved P exports. Such a model of dissolved P mobilization is consistent with runoff monitoring data under natural and simulated rainfall, suggesting that on low (shallow) slopes where the interaction between surface soil and water may be prolonged, dissolved P concentrations are likely to be higher. Dissolved P mobilization from low‐slope areas is not well represented in P indices at present. We suggest that there needs to be a more complex, mechanistic structure to P indices that involves additional compartmentalization. Further, we suggest that this can be achieved without losing the simplicity of P indices or flexibility to integrate research data and experiential knowledge into tools that are relevant to specific regions.
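The two-phase mobilization concept described above can be sketched numerically. The code below is a rough illustration, not the authors' model: all parameter values (pulse rate, decay constant, background rate) are hypothetical and chosen only to show the qualitative behavior, an initial dissolution/advection pulse that decays as readily accessible P is exhausted, followed by a near-constant diffusion/dispersion-controlled phase that dominates cumulative export in longer events.

```python
# Illustrative sketch of a two-phase dissolved P mobilization model.
# Parameter values below are hypothetical, chosen only to show the
# qualitative shape of the response; they are not fitted to any data.

import math

def dp_mobilization_rate(t, pulse_rate=1.0, decay=0.2, background=0.05):
    """Dissolved P mobilization rate (arbitrary units) at time t (minutes).

    Phase 1: exponentially decaying release of readily accessible labile P.
    Phase 2: near-constant background release via diffusion and dispersion.
    """
    return pulse_rate * math.exp(-decay * t) + background

# Cumulative mobilization (per-minute sum) for a short and a long event:
short_event = sum(dp_mobilization_rate(t) for t in range(10))   # 10-min event
long_event = sum(dp_mobilization_rate(t) for t in range(120))   # 2-h event

# In the 2-h event, the constant background phase (0.05 * 120 = 6 units)
# exceeds the entire decaying pulse, so the second phase supplies the
# majority of cumulative mobilization.
print(f"short event total: {short_event:.2f}")
print(f"long event total: {long_event:.2f}")
```

Under these assumed parameters, the background phase contributes more than half of the cumulative mobilization in the longer event, consistent with the abstract's argument that the second phase appears responsible for most dissolved P exports because most exports occur in longer events.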
Riparian seepage zones in headwater agricultural watersheds represent important sources of nitrate nitrogen (NO₃-N) to surface waters, often connecting N-rich groundwater systems to streams. In this study, we examined how NO₃-N concentrations in seep and stream water were affected by NO₃-N processing along seep surface flow paths and by upslope applications of N from fertilizers and manures. The research was conducted in two headwater agricultural watersheds, FD36 (40 ha) and RS (45 ha), which are fed, in part, by a shallow fractured aquifer system possessing high (3–16 mg L⁻¹) NO₃-N concentrations. Data from in-seep monitoring showed that NO₃-N concentrations generally decreased downseep (top to bottom), indicating that most seeps retained or removed a fraction of delivered NO₃-N (16% in FD36 and 1% in RS). Annual mean N applications in upslope fields (as determined by yearly farmer surveys) were highly correlated with seep NO₃-N concentrations in both watersheds (slope: 0.06; r² = 0.79; p < 0.001). Strong positive relationships also existed between seep and stream NO₃-N concentrations in FD36 (slope: 1.01; r² = 0.79; p < 0.001) and in RS (slope: 0.64; r² = 0.80; p < 0.001), further indicating that N applications control NO₃-N concentrations at the watershed scale. Our findings clearly point to NO₃-N leaching from upslope agricultural fields as the primary driver of NO₃-N losses from seeps to streams in these watersheds and therefore suggest that appropriate management strategies (cover crops, limiting fall/winter nutrient applications, decision support tools) be targeted in these zones.