The limitations of the conventional visual rating system used to assess turfgrass quality include its subjective nature and the need for properly trained observers who can discern differences among treatments or turfgrass varieties. The objective of our study was to investigate whether digital image analysis (DIA) and spectral reflectance [normalized difference vegetative index (NDVI)] can be used to evaluate turfgrass varieties. Trials were established at New Mexico State University, and visual quality ratings, digital images, and NDVI were collected monthly on three warm-season and three cool-season variety trials and on one cool-season and one warm-season mixed-species trial. Correlations among quality, NDVI, dark green color index (DGCI), and percent green cover (PCov) were computed. Multiple regression was used to determine whether combining NDVI and DIA improved the association between visual turfgrass quality and the other variables. Quality was most strongly associated with NDVI (R² ranging from 0.37 to 0.65) for most datasets. Additionally, multiple linear regressions identified NDVI as the variable producing the greatest increase in R² when entered into the model, ahead of either DGCI or PCov. Visual quality had a weaker association with sampling date than did NDVI or DGCI, which indicates that NDVI may track quality changes more reliably over time. However, a stronger association between variety and visual quality than between variety and NDVI or DGCI indicates that visual assessment detects varietal differences better. Therefore, it is questionable whether visual assessments can be replaced by NDVI or DIA to characterize the aesthetic appeal of turfgrasses accurately.
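As background for the metrics above, the sketch below shows how NDVI, DGCI, and percent green cover are commonly computed. The DGCI expression is the widely used hue/saturation/brightness formulation; the green-hue band in `percent_green_cover` is an illustrative assumption, not the threshold used in this study.

```python
import colorsys
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetative index from near-infrared and red reflectance."""
    return (nir - red) / (nir + red)

def dgci(r, g, b):
    """Dark green color index of one pixel (RGB channels scaled to 0-1).

    Widely used form: DGCI = [(H - 60)/60 + (1 - S) + (1 - B)] / 3, hue in degrees.
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)  # h in [0, 1); v serves as brightness
    return ((h * 360.0 - 60.0) / 60.0 + (1.0 - s) + (1.0 - v)) / 3.0

def percent_green_cover(rgb_image, hue_band=(60.0, 120.0)):
    """Percent of pixels whose hue falls in an assumed 'green' band (uint8 image)."""
    pixels = rgb_image.reshape(-1, 3) / 255.0
    hues = np.array([colorsys.rgb_to_hsv(*p)[0] * 360.0 for p in pixels])
    return 100.0 * np.mean((hues >= hue_band[0]) & (hues <= hue_band[1]))
```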
A study was conducted in New Mexico from 2005 to 2007 to investigate the effects of two potable water-saving strategies, irrigating with saline water and using subsurface systems, on changes in rootzone salinity and quality of nine warm-season turfgrasses. Plots were irrigated using either sprinklers or subsurface drip with water at one of three salinity levels (0.6, 2.0, and 3.5 dS m−1). Plots were rated monthly for quality during the growing seasons and biannually for spring and fall color. Soil samples were collected biannually (June and November) and analyzed for electrical conductivity (EC), Na, and sodium adsorption ratio (SAR) at depths of 0 to 20 and 50 to 60 cm. Electrical conductivity and Na values at 0 to 20 cm peaked in June of 2005 and 2006 and dropped to lower levels after the summer rainy season. With the exception of plots irrigated with moderately saline water in 2005, summer EC did not differ between drip- and sprinkler-irrigated plots for any of the three water qualities. Electrical conductivity, Na, and SAR at a rootzone depth of 0 to 20 cm were highest in June 2006, reaching 4.7 dS m−1, 1024 mg L−1, and 16.1, respectively. For most of the grasses tested, EC, Na, or SAR values showed no significant relationship with turf quality. Drip irrigation resulted in earlier green-up than sprinkler irrigation but had no effect on summer quality or fall color retention. Most of the warm-season grasses included in this study maintained an acceptable quality level when drip-irrigated with saline water.
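For reference, the sodium adsorption ratio reported here has a standard definition; a minimal sketch, with concentrations in meq L−1 (the example values are hypothetical):

```python
from math import sqrt

def sar(na, ca, mg):
    """Sodium adsorption ratio from Na, Ca, and Mg concentrations in meq/L.

    SAR = Na / sqrt((Ca + Mg) / 2), expressed in mmol1/2 L−1/2.
    """
    return na / sqrt((ca + mg) / 2.0)

# Example: 12 meq/L Na, 3 meq/L Ca, 1 meq/L Mg -> SAR ≈ 8.5
print(round(sar(12.0, 3.0, 1.0), 1))
```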
Core Ideas:
- Changes in soil salinity after 46 yr of irrigation.
- Using electromagnetic induction to map soil salinity and sodicity.
- Correlating soil salinity with irrigation system distribution uniformity.

Chamizal National Park, located in El Paso, TX, extends over 140,000 m² and has been irrigated with saline water for 46 yr. In recent years, turf areas in the park have severely degraded and bare spots have developed. Root zone salinity and sodicity were suspected to be the main reasons for the turf conditions. Developing salinity management and remediation strategies to improve turf quality requires information on the distribution of salinity (ECe) within the turf root zone. Electromagnetic induction (EMI) uses apparent electrical conductivity (ECa) to delineate salinity distribution and is reportedly superior to traditional wet chemistry analyses. This study was conducted to investigate the spatial distribution of soil salinity and sodicity using the EMI technique. In addition, we assessed irrigation distribution uniformity and compared the findings with root zone salinity and sodicity. The EMI data correlated well with saturated paste results and indicated that root zone salinity ranged from <1 to 43 dS m−1. In several parts of the park, ECe exceeded the bermudagrass threshold value of 15 dS m−1. Root zone sodium adsorption ratio (SAR) values ranged from <1 to 21 mmol1/2 L−1/2, and in areas where increased runoff and surface ponding were observed, values exceeded the threshold level of 12 mmol1/2 L−1/2. Correlation analysis between irrigation uniformity parameters and the standard deviations of ECe and SAR values revealed that more than 90% of the variability of ECe and SAR in the top 30 cm of the root zone could be explained by irrigation uniformity.
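The abstract does not define its uniformity parameters; a common one is the low-quarter distribution uniformity from catch-can audits, sketched below with hypothetical catch depths.

```python
import numpy as np

def du_lq(catch_depths_mm):
    """Low-quarter distribution uniformity: mean of the lowest quarter of
    catch-can depths divided by the overall mean (1.0 = perfectly uniform)."""
    d = np.sort(np.asarray(catch_depths_mm, dtype=float))
    low_quarter = d[: max(1, d.size // 4)]
    return low_quarter.mean() / d.mean()

# Hypothetical depths (mm) from one audit of a sprinkler zone
print(round(du_lq([10, 12, 9, 14, 11, 8, 13, 12]), 2))  # -> 0.76
```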
A 3-yr study was conducted in New Mexico to investigate the effects of saline water on changes in quality, cover, and root zone salinity of seven cool-season turfgrasses. Plots were irrigated using either sprinklers or subsurface drip with water of 0.6, 2.0, or 3.5 dS m−1. From March to November, plots were rated monthly for quality, and green cover was determined using digital image analysis. Soil samples were collected at depths of 0 to 10, 10 to 20, and 50 to 60 cm in June and November and analyzed for electrical conductivity (EC), Na, and sodium adsorption ratio (SAR). Changes in soil EC, Na content, and SAR reflected seasonal changes in irrigation and natural precipitation, and EC and Na values were highest (6.1 dS m−1 and 943 mg L−1, respectively) in June 2006 on drip-irrigated plots at depths of 0 to 10 cm. Electrical conductivity was higher in drip-irrigated than in sprinkler-irrigated plots on four of the six sampling dates. Irrigation type and water quality did not affect EC and Na at soil depths of 50 to 60 cm. For four of the seven grasses tested, EC, Na, or SAR values showed a significant but weak relationship (0.18 < r² < 0.27) with turf quality, indicating that more than one stressor affected visual ratings. With the exception of tall fescue [Festuca arundinacea (Schreb.)], cool-season grasses could not be maintained at acceptable quality levels in the arid transitional climate when irrigated with saline water from either a sprinkler or a subsurface drip system.
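The reported r² values presumably come from simple regressions of visual quality on each salinity variable; a minimal sketch of that calculation with made-up numbers (illustrative only, not the study's data):

```python
from scipy.stats import linregress

# Hypothetical paired observations: soil EC (dS/m) and visual quality (1-9)
ec = [1.2, 2.5, 3.1, 4.0, 4.7, 5.5, 6.1]
quality = [7.5, 7.0, 6.8, 6.0, 6.3, 5.5, 5.8]

fit = linregress(ec, quality)
# A weak r-squared (well below 1) is consistent with other stressors also
# influencing the visual ratings.
print(f"r2 = {fit.rvalue ** 2:.2f}, slope = {fit.slope:.2f}")
```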
Limited information is available on the optimal frequency and amounts of irrigation water needed to establish cool-season turfgrasses from seed in arid environments. A 2 × 2 factorial study was conducted at New Mexico State University in 2012 and repeated in 2017 to investigate the effect of four irrigation treatments, combining two reference evapotranspiration (ETo) replacement rates (60 and 120%) and two frequencies (daily and every other day [EOD]), on the establishment of several varieties of Kentucky bluegrass (Poa pratensis L.), perennial ryegrass (Lolium perenne L.), and tall fescue [Schedonorus arundinaceus (Schreb.) Dumort.] (TF). Our hypothesis was that irrigation rates below 100% ETo replacement applied EOD would negatively affect turfgrass establishment. Turfgrasses were seeded in the fall, and digital image analysis was used to estimate percent coverage. The Area Under the Curve model was used to estimate the days after seeding needed to reach 50% (DAS50) and 90% (DAS90) coverage. Statistical analysis indicated a significant interaction (p < .0001) between ETo and species and that irrigation frequency affected establishment. Perennial ryegrass established the fastest, and its DAS90 was not affected by irrigation treatment. Kentucky bluegrass and TF established faster when irrigated daily at 120% ETo. Varieties within the same species all performed equally. In general, our study suggests that cool-season turfgrasses can be established in arid regions using water-conserving measures such as irrigating EOD at rates below 100% ETo replacement. Further studies are needed to determine the lowest ETo replacement level and irrigation frequency at which establishment can still be achieved.
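The Area Under the Curve model itself is not given in the abstract; as a simpler stand-in, linear interpolation along a hypothetical cover progress curve shows how DAS50 and DAS90 are read off:

```python
import numpy as np

# Hypothetical establishment curve: percent green cover by days after seeding
days = np.array([7, 14, 21, 28, 35, 42, 49])
cover = np.array([2, 10, 30, 55, 75, 88, 95])

def das(target_cover, days, cover):
    """Days after seeding to first reach a target percent cover,
    by linear interpolation along the (increasing) progress curve."""
    return float(np.interp(target_cover, cover, days))

print(f"DAS50 = {das(50, days, cover):.0f} d, DAS90 = {das(90, days, cover):.0f} d")
```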
Core Ideas:
- We describe the use of a plant growth regulator and a soil surfactant for water conservation.
- We investigated whether the beneficial effects of a surfactant and a plant growth regulator on drought-stressed bermudagrass were enhanced when applied in combination versus individually.
- We investigated whether both products affect soil moisture under reduced irrigation.

Soil surfactants and plant growth regulators (PGRs) have shown potential to lower irrigation requirements and increase turfgrass quality under drought conditions. A study was conducted from 2014 to 2016 to investigate the effects of the soil surfactant Revolution (a modified methyl-capped block copolymer; Aquatrols, Paulsboro, NJ), the plant growth regulator Primo Maxx (a.i. trinexapac-ethyl [4-(cyclopropylhydroxymethylene)-3,5-dioxocyclohexanecarboxylic acid]; Syngenta, Basel, Switzerland), and a combination of both on percent green coverage, turfgrass color, quality, and soil volumetric water content (VWC) and its uniformity on Princess 77 bermudagrass (Cynodon dactylon L.) grown on a loamy sand (mixed, thermic Typic Torripsamments) and irrigated at 80%, 65%, or 50% of reference evapotranspiration for short grass (ETos). With the exception of plots irrigated at 50% ETos in 2015, bermudagrass receiving trinexapac-ethyl (TE), either in combination with Revolution or alone, exhibited darker green color than untreated controls at all irrigation levels throughout the research period. At 50% ETos, plots treated with any of the three chemical treatments had greater quality (with 1 = worst, 9 = best) than control plots from July to September, with quality ratings of 6 or greater from June to August. Whereas VWC was not consistently enhanced by all treatment combinations, applications of Revolution, TE, and the combination of both resulted in increased VWC uniformity and greater irrigation use efficiency. Our results suggest that by using a surfactant, a PGR, or both, bermudagrass can be maintained with 15 to 30% less irrigation water than the optimal rate (80% ETos) without a reduction in color or quality.
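VWC uniformity is not defined in the abstract; one common summary is based on the coefficient of variation of soil moisture readings across a plot, sketched below with hypothetical values.

```python
import numpy as np

def vwc_uniformity(vwc_readings):
    """One common uniformity summary: 1 minus the coefficient of variation
    of volumetric water content readings (closer to 1 = more uniform)."""
    v = np.asarray(vwc_readings, dtype=float)
    return 1.0 - v.std(ddof=1) / v.mean()

# Hypothetical VWC readings (m3/m3) taken across one plot
print(round(vwc_uniformity([0.18, 0.21, 0.19, 0.22, 0.20]), 2))  # -> 0.92
```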
Deficit irrigation is a water-conserving practice that involves watering below an estimated evapotranspiration (ET) replacement level. Research comparing cool-season (CS) and warm-season (WS) turfgrass varieties grown in arid regions under varying deficit irrigation replacement levels is limited. This study investigated the effects of five levels of reference evapotranspiration for short grass (ETos) replacement (55%, 70%, 85%, 100%, and 115%) on the performance and fall recovery of several turfgrasses in the southwestern United States. Three years of field research evaluated green cover and visual quality of three CS turfgrasses, Kentucky bluegrass (Poa pratensis L.; four cultivars), tall fescue [Schedonorus arundinaceus (Schreb.); three cultivars], and perennial ryegrass (Lolium perenne L.; three cultivars), and two WS turfgrasses, bermudagrass (Cynodon dactylon L.; three cultivars) and buffalograss (Buchloe dactyloides; two cultivars). CS grasses required higher ETos replacement than WS grasses to maintain acceptable quality (rated 1–9, with ≥6 the minimum acceptable) and coverage. Among CS grasses, Barserati Kentucky bluegrass maintained the best quality and green cover under deficit irrigation and demonstrated the most consistent ability to recover. Notably, bermudagrass performed well under deficit irrigation, maintaining acceptable visual quality and better green cover than CS species such as Kentucky bluegrass and tall fescue at lower irrigation levels. Overall, there were significant differences among cultivars, demonstrating the importance of cultivar selection for drought tolerance. These findings support the promotion of drought-resistant WS grasses to conserve water in arid regions without compromising turfgrass functionality. Future research should focus on variable, seasonal ETos replacement for turfgrass irrigation and on estimating irrigation requirements.
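The replacement levels translate directly into applied water depths; a minimal sketch of a weekly deficit irrigation calculation, assuming rainfall is credited against the requirement (all values hypothetical):

```python
def irrigation_depth_mm(etos_mm, replacement_pct, rain_mm=0.0):
    """Net irrigation depth (mm) needed to supply a given percentage of
    reference ET for short grass, crediting effective rainfall."""
    return max(0.0, etos_mm * replacement_pct / 100.0 - rain_mm)

# Hypothetical week: 42 mm ETos and 5 mm rain, at the five replacement levels
for pct in (55, 70, 85, 100, 115):
    print(f"{pct}% ETos -> {irrigation_depth_mm(42.0, pct, rain_mm=5.0):.1f} mm")
```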