Modeling root water uptake of cotton (Gossypium hirsutum L.) under deficit subsurface drip irrigation in West Texas

Date

2019-08


Abstract

Water availability is one of the major constraints in most of the cotton (Gossypium hirsutum L.) producing areas of the Southern High Plains (SHP) of Texas, where erratic rainfall, high evaporation rates, and droughts are increasing pressure on the depleting Ogallala Aquifer to support the region's intensive, groundwater-dependent agriculture. The growing emphasis on agricultural water management has increasingly stressed the need to develop sustainable irrigation strategies, such as deficit irrigation, for supplemental irrigation. Actual evapotranspiration of cotton, i.e., the actual evaporation and transpiration fluxes in cotton production systems, is affected by root zone soil water dynamics. The quantitative evaluation of root zone soil water dynamics requires simultaneous knowledge of water flow in and through the root zone, root growth, and root water uptake (RWU). Despite its importance for managing efficient use of irrigation water, there remains a paucity of quantitative information on spatial and temporal RWU rate distributions in semiarid cotton fields, especially under deficit subsurface drip irrigation in West Texas. Because of the complexity of water flow, root growth, and RWU processes, as well as the difficulties associated with field measurements of RWU rate distributions, numerical simulations of coupled water flow and heat transport that account for root growth and RWU, using multidimensional vadose zone flow and transport models such as HYDRUS (2D/3D), could provide an effective tool for managing efficient water use in cotton while enhancing cotton yield. To our knowledge, very few studies have evaluated the applicability of multidimensional vadose zone models such as HYDRUS (2D/3D) in predicting compensated and uncompensated RWU patterns of cotton under deficit subsurface drip irrigation. Therefore, the overall objective of this study was to evaluate spatial and temporal RWU rate distributions in cotton grown under a deficit subsurface drip irrigation system using the HYDRUS (2D/3D) model. Specific objectives were to (i) calibrate and validate the HYDRUS (2D/3D) model using field experimental data to predict RWU rate distributions under selected levels of deficit subsurface drip irrigation, (ii) quantify spatial and temporal RWU distributions in the cotton root zone under selected levels of deficit subsurface drip irrigation, (iii) evaluate actual transpiration, evapotranspiration, and drainage fluxes under selected levels of deficit subsurface drip irrigation, and (iv) evaluate spatial and temporal compensatory RWU rate distributions under selected levels of deficit subsurface drip irrigation.
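
The compensated and uncompensated RWU formulations referred to above correspond to the macroscopic sink-term approach used in HYDRUS, in which a potential uptake distribution is reduced by a water stress response function and, when a critical water stress index is specified, partially redistributed toward less stressed parts of the root zone. The Python sketch below illustrates the idea for a simplified one-dimensional root zone; the Feddes thresholds, root distribution, and critical stress index are hypothetical placeholders, not the cotton parameters or values used in this study.

import numpy as np

def feddes_alpha(h, h1=-10.0, h2=-25.0, h3=-600.0, h4=-15000.0):
    """Feddes water stress response function alpha(h), 0 <= alpha <= 1.
    h is the soil water pressure head (cm); h1..h4 are hypothetical
    placeholder thresholds, not the cotton parameters of this study."""
    h = np.asarray(h, dtype=float)
    alpha = np.zeros_like(h)
    seg = (h < h1) & (h >= h2)              # linear rise below the anaerobiosis point
    alpha[seg] = (h1 - h[seg]) / (h1 - h2)
    alpha[(h < h2) & (h >= h3)] = 1.0       # optimal uptake range
    seg = (h < h3) & (h >= h4)              # linear decline toward wilting point
    alpha[seg] = (h[seg] - h4) / (h3 - h4)
    return alpha

def sink_term(h, b, Tp, dz, omega_c=1.0):
    """Actual RWU sink S(z) (d^-1) for potential transpiration Tp (cm/d),
    normalized root distribution b(z) (cm^-1), and layer thickness dz (cm).
    omega_c = 1 gives uncompensated uptake; omega_c < 1 allows compensation."""
    alpha = feddes_alpha(h)
    omega = np.sum(alpha * b * dz)            # water stress index (0..1)
    S = alpha * b * Tp / max(omega, omega_c)  # omega_c = 1 reduces to alpha*b*Tp
    Ta = np.sum(S * dz)                       # actual transpiration (cm/d)
    return S, Ta

# toy example: moist subsoil compensating for a dry surface layer
z = np.arange(5.0, 100.0, 10.0)               # layer midpoints (cm)
b = np.exp(-0.03 * z); b /= np.sum(b) * 10.0  # normalized root distribution
h = np.where(z < 30.0, -8000.0, -300.0)       # dry top, moist subsoil
for wc in (1.0, 0.5):                         # uncompensated vs compensated
    _, Ta = sink_term(h, b, Tp=0.6, dz=10.0, omega_c=wc)
    print(f"omega_c={wc}: Ta = {Ta:.3f} cm/d")

In this toy case the compensated run recovers a larger actual transpiration than the uncompensated run because uptake is shifted to the wetter, less densely rooted subsoil, which is the mechanism behind the enhanced RWU rates reported below.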
A field experiment was carried out at Texas Tech University’s New Deal Research Farm in a subsurface drip irrigated cotton field during the 2017 and 2018 growing seasons to observe soil physical, hydrological, and thermal properties within the 0 to 100 cm soil profile; soil water and soil temperature dynamics in the root zone down to the 100 cm depth; meteorological variables; cotton physiological and root growth parameters; and cotton lint yield. The experiment was conducted using a randomized complete block design with four deficit irrigation treatments and four blocks. Four levels of deficit subsurface drip irrigation were applied, with seasonal amounts of (i) 50 mm (I1 treatment), (ii) 130 mm (I2 treatment), (iii) 200 mm (I3 treatment), and (iv) 280 mm (I4 treatment, used as the control). The HYDRUS (2D/3D) model was calibrated for the control treatment (i.e., the I4 deficit irrigation treatment) for a 157-day period from DOY (day of the year) 146 to DOY 303 (May 26 to October 30, 2017) using volumetric water contents and soil temperatures measured at soil depths of 10, 20, 30, 50, and 80 cm, as well as soil water potentials measured at 10, 20, 40, and 60 cm depths in the two-layered 100-cm soil profile [i.e., a sandy clay loam layer (0-20 cm) and a clay loam layer (20-100 cm)]. Measured soil water retention, hydraulic, and heat transport parameters for the two-layered 100-cm soil profile domain were optimized using the HYDRUS inverse optimization algorithm, i.e., by minimizing the residuals between measured and simulated volumetric water content, soil temperature, and soil water potential data. With the optimized parameters, the HYDRUS (2D/3D) model was validated using experimental data from all treatments (I1, I2, I3, and I4) for a 157-day period from DOY 146 to DOY 303 (May 26 to October 30, 2017) and for a 154-day period from DOY 165 to DOY 319 (June 14 to November 15, 2018) during the two consecutive growing seasons. During both growing seasons, the HYDRUS (2D/3D) simulations agreed with the measured volumetric water content, soil temperature, and soil water potential values and their temporal variations at different soil depths for all treatments with reasonable accuracy, as indicated by the statistical indices root mean square error (RMSE), mean error (ME), index of agreement (d), and coefficient of determination (R2); a sketch of these indices follows this paragraph. Simulated actual transpiration (i.e., RWU) and evapotranspiration fluxes and their cumulative values were in the order I1 < I2 < I3 < I4, which was attributed to enhanced root growth in the I4 treatment, as indicated by its higher root length density in comparison with the other deficit irrigation treatments (I1, I2, and I3). During a growing season, the maximum RWU rate was observed for the I4 treatment (0.007 cm3 cm-3 d-1), while the I1 treatment produced the minimum RWU rate (0.003 cm3 cm-3 d-1). The highest root length density was observed in all treatments within the 20-40 cm depths. Accordingly, the depth distribution of RWU rates was governed not only by the root zone soil water status but also by the most densely rooted soil zone, i.e., 20-50 cm, suggesting a primary uptake layer where soil water becomes critical for the most efficient RWU by cotton. Accounting for RWU compensation resulted in enhanced RWU rates for all treatments.
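
As a point of reference for the goodness-of-fit statistics cited above, the sketch below shows how RMSE, ME, index of agreement (d), and R2 are typically computed from paired measured and simulated series (e.g., volumetric water content at one depth); R2 is taken here as the squared Pearson correlation. The array values are placeholders, not data from this study.

import numpy as np

def fit_statistics(measured, simulated):
    """Standard goodness-of-fit indices for simulated vs. measured series."""
    o = np.asarray(measured, dtype=float)   # observations
    p = np.asarray(simulated, dtype=float)  # model predictions
    rmse = np.sqrt(np.mean((p - o) ** 2))   # root mean square error
    me = np.mean(p - o)                     # mean error (bias)
    # Willmott's index of agreement, d (0..1, 1 = perfect agreement)
    d = 1.0 - np.sum((p - o) ** 2) / np.sum(
        (np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)
    r2 = np.corrcoef(o, p)[0, 1] ** 2       # squared Pearson correlation
    return {"RMSE": rmse, "ME": me, "d": d, "R2": r2}

# placeholder example: daily volumetric water content (cm3 cm-3) at one depth
theta_obs = np.array([0.21, 0.24, 0.27, 0.25, 0.22, 0.20, 0.19])
theta_sim = np.array([0.22, 0.25, 0.26, 0.24, 0.23, 0.21, 0.18])
print(fit_statistics(theta_obs, theta_sim))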
During a growing season, compared with uncompensated RWU, the compensated RWU flux increased the actual transpiration rate by 6% for the I4 treatment, while the corresponding increase for the I1 treatment was lower, at 2%. HYDRUS (2D/3D) simulations suggest that the compensatory RWU rate distributions, which were independent of the plant stress status indicated by stem or leaf water potential values, should be interpreted as a response to non-uniform soil water and root distributions. Relative evapotranspiration (the ratio of actual to potential evapotranspiration) correlated well with the measured stem and leaf water potentials, further validating the use of HYDRUS (2D/3D) as a tool for evaluating cotton water stress. Overall, results of the multidimensional RWU simulations using experimental data from a semiarid cotton field under deficit subsurface drip irrigation suggest that HYDRUS (2D/3D) could be used as an effective tool for managing efficient water use in drip-irrigated cotton under water-limited conditions.
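
The relative evapotranspiration used above as a stress indicator is simply the ratio of cumulative actual to cumulative potential evapotranspiration over a given period; values near 1 indicate little water stress and lower values indicate increasing stress. A minimal sketch with placeholder daily fluxes (not simulated output from this study):

import numpy as np

def relative_et(et_actual, et_potential):
    """Relative evapotranspiration: cumulative ETa / cumulative ETp (0..1)."""
    return np.sum(et_actual) / np.sum(et_potential)

# placeholder daily fluxes (cm/d) for a well-watered and a deficit treatment
etp = np.array([0.65, 0.70, 0.72, 0.68, 0.66])      # potential ET
eta_I4 = np.array([0.62, 0.66, 0.69, 0.64, 0.63])   # control (I4)
eta_I1 = np.array([0.40, 0.42, 0.45, 0.41, 0.39])   # most severe deficit (I1)
print("I4:", round(relative_et(eta_I4, etp), 2))    # near 1: low stress
print("I1:", round(relative_et(eta_I1, etp), 2))    # lower: greater stress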

Availability

Unrestricted.

Keywords

Root water uptake (RWU), Days after planting (DAP)
