Abstract
Regulated deficit irrigation (RDI) and crop-load adjustment are regarded as important viticultural practices for premium-quality wine production, although little is known about their interactive effects. Crop loads were altered on field-grown, own-rooted Cabernet Sauvignon grapevines exposed to RDI varying in severity and timing in the arid Columbia Valley (Washington) from 1999 to 2003. Following a dry-down period through fruit set to stop shoot growth, vines were irrigated at 60 to 70% of full-vine evapotranspiration until harvest. Other vines either received the same amount of water up to veraison, after which the irrigation rate was cut in half, or had their irrigation halved before veraison but not thereafter. Clusters were thinned within irrigation treatments during the lag phase of berry growth to achieve a target yield of 6.7 t/ha, compared with an unthinned control. The severity and timing of RDI had only minor effects on vegetative growth, yield formation, fruit composition (soluble solids, titratable acidity, pH, K+, color), and cold hardiness. The more severe water-deficit treatments slowed berry growth while the treatments were being imposed, but final berry weights were similar in three of five years. Cluster thinning reduced yields by 35% and crop loads by 32%, but crop load had little or no influence on vegetative growth and cluster yield components and advanced fruit maturity by no more than three to four days. Very few interactive effects of RDI and crop load were observed, indicating that the crop load did not influence the response of vines to RDI.
- regulated deficit irrigation
- growth
- fruit set
- yield components
- grape berry
- fruit composition
- cold hardiness
- Vitis vinifera
The Columbia Valley in Washington may be classified as arid temperate steppe (Fischer and Turner 1978) and is characterized by warm, very dry, and short summers and cold winters. Maximum summer temperatures may exceed 40°C, while winter temperatures occasionally drop below −20°C. The frost-free period lasts less than 160 days, and grape berries usually ripen in warm days and cool nights (daily range ~18°C). Extremely low annual precipitation (~200 mm) prohibits grapegrowing without irrigation but lends itself to deficit-irrigation techniques that can be finely tuned to suit particular cultivars and wine styles. Shoot growth is extremely responsive to water stress, so that canopy development is easily manipulated by deficit irrigation. Moreover, because the sensitivity of grape berries to water deficit progressively decreases during development, final berry size may be effectively controlled by preveraison water deficit (Matthews and Anderson 1988, McCarthy 1997). Regulated deficit irrigation (RDI) aims to exploit this principle by applying a short episode of water deficit as soon as possible after fruit set. Earlier deficit is generally avoided due to the risk of poor fruit set (Hardie and Considine 1976). During the RDI period, irrigation water is withheld and the soil allowed to dry down until control of shoot growth has been achieved (Kriedemann and Goodwin 2003). Once shoot growth stops and especially after veraison, water application is controlled to restrain new shoot growth, limit berry size, and favor fruit ripening. Finally, the root zone is rewatered to field capacity at the end of the growing season to prevent root damage brought on by insufficient soil moisture during the cold dormant season.
Some water deficit is generally regarded as beneficial for wine quality, in particular red wine (Matthews et al. 1990). Berry size has long been thought to be a key factor contributing to (red) wine quality, because smaller berries have a larger surface to volume ratio. As anthocyanin pigments, tannins, and other quality-relevant components are extracted from the skin during fermentation, small berries are generally seen as desirable, since they are perceived as “more concentrated.” However, the existence of a simple cause-effect relationship between berry size and fruit composition, and hence wine quality, has recently been questioned. Water deficit also appears to (beneficially) influence fruit composition in ways that are, at least in part, independent of berry size (Roby et al. 2004). Indeed, it was recently found that water deficit can enhance accumulation of anthocyanins by stimulating the expression of genes encoding their biosynthesis (Castellarin et al. 2007).
Winemakers producing premium and ultrapremium wines generally aim to maximize wine quality (however subjectively this may be defined), whereas growers tend to maximize the crop within the constraints of winery demands (especially in the New World) and legal regulations (especially in the Old World). Although these two goals are not necessarily mutually exclusive, they clash frequently enough to warrant scientific attention. In Germany, temperature and rainfall were found to be the main factors driving fluctuations in grape quality between seasons, but a clear yield-quality relationship could not be identified (Hofäcker et al. 1976). A similar conclusion was recently reached in an investigation of the response to cluster thinning of three deficit-irrigated winegrape cultivars in eastern Washington (Keller et al. 2005). No one disputes that excessive crop loads delay ripening and may reduce fruit and wine quality (see review, Jackson and Lombard 1993), but the boundary between adequate and excessive is not obvious. There is considerable evidence suggesting that fruit composition and wine quality of Cabernet Sauvignon may be relatively insensitive to variations in yield (Ough and Nagaoka 1984, Bravdo et al. 1985, Keller et al. 1998, 2005, Keller and Hrazdina 1998), and that the response may depend on how and when the yield variation is established (Chapman et al. 2004). Nevertheless, it is not clear if this (or any other) cultivar can support and ripen the same amount of fruit under varying degrees of water deficit. Because many studies investigating the effects of RDI have compared irrigated with nonirrigated vines, there is little quantitative information available to determine optimum levels of water deficit at various times during the growing season. Although fruit soluble solids and other quality-relevant attributes are often higher in grapes from somewhat water-stressed vines, excessive stress clearly has a negative influence on fruit composition (Hardie and Considine 1976, Esteban et al. 1999). In addition, growers in the inland Northwest are concerned that water stress might impact winter survival and long-term vine productivity. Therefore, quantitative information could be used to fine-tune RDI strategies in order to produce fruit to winery specifications (e.g., for blending options) while maintaining vine capacity and cold hardiness.
The present study was conducted to test the interaction of the extent and timing of relatively severe water deficit (thereby conserving water resources) and cluster thinning and their combined effects on canopy development, vine capacity, yield formation, fruit composition, and cold hardiness. This paper reports results from a field trial conducted with own-rooted Cabernet Sauvignon grapevines in an arid climate over a five-year period. Gas exchange and other physiological measurements conducted in a companion study in the same vineyard will be reported separately.
Materials and Methods
Vineyard site and experimental design.
The experiment was conducted from 1999 to 2003 in the Canoe Ridge vineyard of Ste. Michelle Wine Estates, west of Paterson, Columbia Valley, Washington (45.88°N; 119.75°W; 125 m asl). The vineyard lies in the Horse Heaven Hills American Viticultural Area and is a source of ultrapremium fruit. Climatic data for the area are shown in Table 1⇓ and Figure 1⇓. Own-rooted Vitis vinifera L. cv. Cabernet Sauvignon had been planted in 1992 with a vine by row spacing of 1.83 m by 2.74 m on a uniformly deep (>1 m) loamy fine sand (Burbank sandy-skeletal, mixed, mesic Xeric Torriorthents). The soil’s field capacity is 14.6% (v/v), the permanent wilting point is 7.1% (v/v), and plant-available water was estimated at 50% of the total soil water. The vineyard site has a 14% south-facing slope and rows oriented north-south. Vines were trained to two trunks and a bilateral cordon at 106 cm; shoots were not positioned, but a leeward wind wire at 126 cm prevented the canopy from rolling over. Vines were spur-pruned to 36 to 42 nodes (20 to 23 per m of cordon) and, in accordance with standard industry practice, were thinned to ~20 shoots/m at the beginning of bloom in 1999 and ~25 shoots/m in 2000 by removing noncount shoots. Because the winery target yield of 13.4 t/ha for this vineyard could not be achieved, no shoot thinning was conducted after 2000 in order to maximize potential yields and reduce labor costs. All fertilizer applications (via fertigation through the drip system in identical total seasonal amounts for all treatments) and pest and disease management practices were applied commercially and as uniformly as possible across the vineyard (also see Schreiner et al. 2007).
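As a rough check of these soil figures (not part of the original study), the plant-available water in the managed root zone can be approximated from the difference between field capacity and the permanent wilting point; the 90-cm depth below is taken from the irrigation-scheduling description, and the calculation is only illustrative.

```python
# Back-of-the-envelope check of plant-available water at this site, using the
# soil figures quoted above; the 90-cm root-zone depth is the scheduling depth
# mentioned in the irrigation description, used here only for illustration.
field_capacity = 0.146   # volumetric water content (v/v) at field capacity
wilting_point  = 0.071   # volumetric water content (v/v) at permanent wilting point
root_zone_m    = 0.90    # managed root-zone depth (m)

awc = field_capacity - wilting_point        # ~0.075 v/v, i.e., ~50% of total water at field capacity
awc_mm = awc * root_zone_m * 1000           # ~68 mm of plant-available water over 90 cm

print(f"Available water capacity: {awc:.3f} v/v, ~{awc_mm:.0f} mm over {root_zone_m*100:.0f} cm")
```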
Meteorological data from the Washington State University Public Agricultural Weather System (WSU-PAWS) weather station south of Alderdale, WA (10 km west of vineyard site) and total amount of irrigation water applied by treatment.
Growing degree day accumulation (base 10°C) in Alderdale, WA, from 1 Apr to 31 Oct 1999 to 2003; long-term mean derived from the Paterson, WA, weather station and adjusted for Alderdale by polynomial regression.
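For readers unfamiliar with the degree-day bookkeeping used here and elsewhere in the paper (e.g., the 650 and 1000 GDD benchmarks), the sketch below shows one common way to accumulate GDD with a 10°C base from daily maximum and minimum temperatures; the simple averaging method and the example temperatures are assumptions, since the weather-station algorithm is not specified.

```python
# Illustrative growing-degree-day accumulation (base 10 C) from daily Tmax/Tmin.
# The averaging method is assumed; the temperature pairs are arbitrary examples.

BASE_C = 10.0

def daily_gdd(t_max, t_min, base=BASE_C):
    """Averaging method: ((Tmax + Tmin) / 2) - base, floored at zero."""
    return max(((t_max + t_min) / 2.0) - base, 0.0)

season = [(31.4, 16.0), (29.3, 14.6), (23.1, 8.6), (19.9, 7.6)]   # (Tmax, Tmin) per day
cumulative = sum(daily_gdd(tmax, tmin) for tmax, tmin in season)
print(f"Accumulated GDD over {len(season)} days: {cumulative:.1f}")
```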
The vineyard was drip-irrigated using three pressure-compensated emitters (flow rate 1.8 L/h) for every two vines (1.2 m between emitters). Precipitation during winter was usually insufficient to fill the soil profile; thus the root zone was irrigated to field capacity after bud-break. Irrigation was interrupted before bloom, and the soil was allowed to dry down to control shoot growth (Figure 2⇓). As soon as shoot growth had ceased (~15 days after fruit set or a few days after the pea-size stage), three irrigation treatments were imposed. The current industry standard for regulated deficit irrigation (RDIS) was used as a control to replenish 70% (1999 and 2000) or 60% (2001 to 2003) of full-vine evapotranspiration (ETFV) and maintain soil moisture in the top 1 m at 10% (v/v) through harvest. This standard was derived from reference crop (grass) evapotranspiration (ET0) and a variable (from ~0.3 at the start of treatments to ~0.8 in early August to ~0.4 by harvest) crop coefficient developed for fully irrigated Cabernet Sauvignon in eastern Washington (Evans et al. 1993), assuming that the smaller canopy of deficit-irrigated vines would transpire only about 60 to 70% of ETFV. Two more severe water-deficit treatments (RDIE and RDIL) were designed to replenish 50% (1999) or 30 to 35% (2000 to 2003) of ETFV and maintain soil moisture at 8.3% (v/v) while the treatments were in place. The early-deficit (or preveraison) treatment (RDIE) was imposed until veraison, whereas the late-deficit (or postveraison) treatment (RDIL) was imposed from veraison through harvest (Table 2⇓). RDIL was treated as RDIS before veraison, and RDIE was treated as RDIS after veraison. Irrigation scheduling and amount were based on neutron-probe measurements (see below) conducted every Monday. The required irrigation water was then applied to each treatment in sets of ≥16 hr duration over the following one to five days (depending on the total amount to be applied). Initially, a 90-cm-deep root zone over the entire vineyard area was used to calculate the required amount of irrigation water. That was changed in 2001 to a management area covering a band of 37% of the total soil surface area, because vine roots in this soil were heavily concentrated under the drip lines (Schreiner et al. 2007); this decreased fluctuations in soil moisture and enabled closer adherence to the moisture targets (Figure 2⇓). Soil moisture in the root zone was replenished after harvest to minimize freeze-induced root injury during winter.
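The weekly scheduling arithmetic described above can be summarized in a short sketch: estimated full-vine evapotranspiration is the product of reference ET and the crop coefficient, a treatment-specific fraction of that amount is replenished, and the resulting depth is converted to emitter run time. All numeric inputs (ET0, Kc, and the use of the wetted band for converting depth to volume) are illustrative assumptions, not values taken from the study.

```python
# Minimal sketch of the weekly RDI scheduling arithmetic described above.
# ET0, Kc, and the replenishment fractions are placeholders; the wetted-band
# conversion from depth to volume is an assumption for illustration only.

et0_week_mm = 7.0 * 7          # reference crop ET, ~7 mm/day over one week
kc = 0.8                       # crop coefficient (~0.3 at treatment start to ~0.8 in early August)
et_fv_mm = kc * et0_week_mm    # estimated full-vine ET for the week (mm)

replenish_fraction = {"RDI_S": 0.60, "RDI_E": 0.35, "RDI_L": 0.35}

vine_area_m2 = 1.83 * 2.74     # vine x row spacing (m2 per vine)
wetted_band = 0.37             # managed band of the soil surface (used from 2001)
emitters_per_vine = 1.5        # three 1.8 L/h emitters per two vines
emitter_lph = 1.8

for trt, frac in replenish_fraction.items():
    depth_mm = frac * et_fv_mm                      # irrigation depth to apply (mm)
    litres = depth_mm * vine_area_m2 * wetted_band  # 1 mm over 1 m2 = 1 L
    hours = litres / (emitters_per_vine * emitter_lph)
    print(f"{trt}: {depth_mm:.1f} mm -> {litres:.0f} L/vine -> {hours:.1f} h of drip")
```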
Key phenological stages (50% level), initiation and switching of water-deficit treatments, and time of cluster thinning of own-rooted Cabernet Sauvignon grapevines in the Canoe Ridge vineyard, Columbia Valley, WA (DOY: day of year).
Influence of growing season and irrigation regime on the volumetric soil moisture content in the top 90 cm (means ± se) of a Cabernet Sauvignon vineyard in the Columbia Valley, WA, 1999–2003 (B: bloom, S: fruit set, V: veraison, H: harvest).
Two crop-load treatments were imposed within each irrigation treatment: the low crop load (CLL) attempted to achieve a target yield of 6.7 t/ha, and the high crop load (CLH) was an unthinned control. Clusters were thinned in the CLL treatment at the beginning of veraison. The amount of fruit to be removed was based on counting and weighing clusters during the lag phase of berry growth and comparing these numbers with the previous ≥3-yr means of lag-phase and harvest data for this site (whereby each season’s new figures were added to the database). Because the objective of thinning was a constant target yield in CLL, this approach did not necessarily result in the same number of clusters being removed in each of the RDI treatments (RDIE tended to have slightly less fruit removal to compensate for the smaller berry size). Thinning was conducted on the west side of the canopy by preferentially removing green clusters (that lagged behind in development) to emulate commercial practices. After 1999, it was decided that thinning would only be carried out if the lag-phase yield estimate predicted a crop >5 kg/vine (~10 t/ha). As a result, clusters were not thinned in 2001 and 2003.
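A hedged sketch of the lag-phase thinning decision follows; the cluster counts, lag-phase cluster weights, and the lag-to-harvest scaling factor are invented placeholders rather than the vineyard's historical records, and serve only to illustrate the logic of thinning toward the 6.7 t/ha target.

```python
# Illustrative lag-phase thinning decision: predict harvest yield from cluster
# counts and lag-phase cluster weights, then remove enough clusters to approach
# the 6.7 t/ha target. All numbers below are assumed for illustration.

clusters_per_vine = 60
lag_cluster_wt_kg = 0.055            # mean cluster weight at lag phase (kg)
lag_to_harvest_ratio = 2.0           # assumed historical harvest/lag cluster-weight ratio
vine_area_m2 = 1.83 * 2.74           # ~1996 vines/ha

predicted_kg_per_vine = clusters_per_vine * lag_cluster_wt_kg * lag_to_harvest_ratio
predicted_t_ha = predicted_kg_per_vine / vine_area_m2 * 10   # kg/m2 -> t/ha

target_t_ha = 6.7
if predicted_kg_per_vine > 5.0:      # thinning rule used after 1999 (>5 kg/vine, ~10 t/ha)
    keep_fraction = target_t_ha / predicted_t_ha
    clusters_to_remove = round(clusters_per_vine * (1 - keep_fraction))
else:
    clusters_to_remove = 0

print(f"Predicted {predicted_t_ha:.1f} t/ha; remove ~{clusters_to_remove} clusters/vine")
```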
The experiment was designed as a split-plot with four replicated blocks, each comprising 30 rows of 56 to 70 vines. The irrigation treatments were randomly applied as main plots (10 rows each) within each block, and the crop-load treatments were applied as subplots within irrigation treatments. The CLH treatment was applied to three rows and the CLL treatment to seven rows per replicate in order to provide similar amounts of fruit (1.0 t/replicate) for separate winemaking (data not presented here). Measurements were taken on four “data” vines in each of three adjacent rows per treatment replicate, or 48 vines per treatment combination.
Data collection and analysis.
Meteorological conditions were monitored throughout the experimental period using raw data derived from the Washington State University Public Agricultural Weather System (WSU-PAWS) Alderdale weather station, which is located at an elevation of 224 m, 10 km west of the experimental vineyard. Soil moisture was monitored weekly using neutron probes (503 DR Hydroprobe; CPN International, Concord, CA). Two PVC probe access tubes per treatment replicate were installed to a maximum soil depth of 1.5 m, located within rows and equidistant between drip emitters. In addition, stem or xylem water potential (Ψx) was monitored using a pressure chamber (PMS Instrument, Albany, OR) on at least four fully expanded leaves per treatment replicate from fruit set through harvest on the day before irrigation. Sun-exposed leaves were collected after they had been enclosed in aluminum-coated plastic bags for 30 min between 11:00 and 15:00. Preliminary measurements showed that Ψx remained nearly constant throughout this 4-hr period. Vine nutrient status was assessed in 2001 by collecting petiole samples within each treatment replicate (100 combined petioles per replicate) six times between fruit set and harvest. Petioles were analyzed by Cascade Analytical (Wenatchee, WA) for N, NO3-, P, K, Ca, Mg, B, Zn, Fe, Cu, and Mn. In addition, nutrient elements were analyzed in leaf blades sampled at veraison and harvest of 2003 in a companion study (Schreiner et al. 2007).
Pruning weights were recorded during winter pruning, and the number of shoots per vine was counted in the week before bloom. Shoots were separated into count shoots, originating from nodes retained at pruning, and noncount shoots, originating from basal buds and latent buds. Beginning in 2000, shoot length, number of leaves per shoot, leaf size, and leaf area per shoot and per vine were determined at bloom, at 650 growing degree days (GDD > 10°C) and about one week before harvest. All measurements were done on two shoots per data vine, and leaf area was estimated as described elsewhere (Keller et al. 2005). Shoot maturation was estimated by counting the number of internodes that had formed brown periderm at veraison. Bud and cane (phloem and xylem) cold hardiness of cane pieces consisting of the five basal buds was determined biweekly during the 2000/01 and 2003/04 winters by differential thermal analysis, using a published protocol (Wample et al. 1990, Wample and Bary 1992). Yield components were determined as follows: yield per vine was recorded at harvest; number of clusters per vine was counted just before bloom and at harvest; number of berries per cluster and mean berry weight were recorded at harvest. In addition, potential carry-over effects from previous seasons on bud fruitfulness (clusters per shoot), cluster differentiation (flowers per inflorescence), and fruit set were estimated by collecting inflorescences before bloom from vines close to the data vines. From each treatment replicate we collected 12 clusters each from lower and upper positions on count shoots and 12 clusters from noncount shoots and froze them immediately at −20°C. The number of flowers per cluster was estimated by regression between flower weight and flower number on 25% of the samples (r2 = 0.96, p < 0.001). Clusters were dipped in liquid nitrogen and then shaken in a Petri dish to dislodge the flowers from the rachis. Flowers were then immediately weighed to avoid thawing and water loss before counting. To calculate percentage fruit set, clusters were collected in the same manner in late July. From each treatment replicate we collected four clusters each from lower and upper positions on count shoots and four clusters from noncount shoots. Berries were counted on all clusters.
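The flower-count calibration described above amounts to a simple linear regression of counted flower number on total flower weight, fitted on a subsample and then applied to the remaining (weighed but uncounted) inflorescences; the sketch below uses fabricated numbers purely for illustration.

```python
import numpy as np

# Sketch of the flower-number calibration: count flowers on a subsample of
# frozen inflorescences, regress flower number on total flower weight, then
# estimate flower number from weight alone for the rest. Data are fabricated.

counted_weight_g = np.array([1.8, 2.5, 3.4, 4.1, 5.2, 6.0])   # subsample: total flower weight
counted_number   = np.array([150, 210, 290, 350, 440, 510])   # subsample: flowers counted

slope, intercept = np.polyfit(counted_weight_g, counted_number, 1)
r = np.corrcoef(counted_weight_g, counted_number)[0, 1]

new_weights_g = np.array([2.9, 4.8])                           # uncounted inflorescences
estimated_flowers = slope * new_weights_g + intercept

print(f"r^2 = {r**2:.2f}; estimated flowers: {np.round(estimated_flowers).astype(int)}")
```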
The time of harvest was based on a 24 Brix threshold, which was determined by weekly measurements of fruit composition that began after all clusters had passed veraison (~16 to 18 Brix). Twenty clusters were collected per treatment replicate, and from each of these five berries were plucked alternately from the top, middle, and bottom portion at random. A 100-berry subsample was weighed, crushed, and processed the same day. At harvest, 100 berries were collected and frozen at −18°C. The frozen samples were heated to 60°C for 20 min in a circulating water bath, shaken to resuspend the solids, allowed to cool to room temperature, and mixed. Juice samples were analyzed for soluble solids, titratable acidity (TA), pH, and total color as described elsewhere (Spayd et al. 2002). Individual pH values were converted to [H+], and the reported means were recalculated from means of [H+]. Juice potassium concentration ([K+]) was measured using a Jenway (Essex, England) flame photometer following dilution (1:25) with distilled H2O. The vegetative and yield component data were collected on a per vine basis, whereas the fruit-set and fruit composition data were collected as composite samples on a per replicate basis.
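The pH-averaging convention mentioned above (converting individual pH values to [H+], averaging, and converting back) can be illustrated as follows; the readings are arbitrary examples and show why the recalculated mean differs slightly from a simple arithmetic mean of pH values.

```python
import numpy as np

# Illustration of averaging pH via hydrogen-ion concentration. Values are arbitrary.

ph_readings = np.array([3.55, 3.70, 3.62, 3.80])

h_conc = 10.0 ** (-ph_readings)          # [H+] in mol/L
mean_ph_from_h = -np.log10(h_conc.mean())  # mean pH recalculated from the mean [H+]

print(f"Arithmetic mean of pH values: {ph_readings.mean():.3f}")
print(f"pH of the mean [H+]:          {mean_ph_from_h:.3f}")
```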
Statistica software (version 7.1; StatSoft, Tulsa, OK) was used for data analysis. Results were subjected to three-way (irrigation × crop load × season) analysis of variance (ANOVA) and F-test. The effects of season and season × irrigation interactions were almost always highly significant (p < 0.001) and failed Levene’s test because of differences in variance among seasons. Therefore, data were also analyzed as two-way (irrigation × crop load) ANOVA for each season, using the general linear model procedure for a split-plot design with irrigation as the main plot and crop load as the subplot. The consecutive berry weight and fruit composition data were analyzed with a repeated-measures design. Duncan’s new multiple range test was used for post-hoc comparison of significant irrigation treatment means. Irrigation and crop-load treatment means are presented separately for each season. Selected variables were subjected to correlation analysis following appropriate transformations where necessary. Curves were fitted using the negative exponential-weighted least-squares method.
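For readers who wish to reproduce the split-plot logic in other software, the following sketch (in Python rather than Statistica, with simulated data) illustrates the key point: the main-plot factor (irrigation) must be tested against the block × irrigation error term, whereas the subplot factor (crop load) and the interaction are tested against the residual.

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm
from scipy import stats

# Split-plot ANOVA sketch with simulated placeholder data (not the study's data):
# 4 blocks, irrigation as main plot, crop load as subplot.
rng = np.random.default_rng(1)
rows = [
    {"block": f"B{b}", "irrigation": irr, "cropload": cl,
     "brix": 24 + 0.3 * (cl == "CL_L") + rng.normal(0, 0.4)}
    for b in range(1, 5)
    for irr in ("RDI_S", "RDI_E", "RDI_L")
    for cl in ("CL_H", "CL_L")
]
df = pd.DataFrame(rows)

model = ols("brix ~ C(block) + C(irrigation) + C(block):C(irrigation)"
            " + C(cropload) + C(irrigation):C(cropload)", data=df).fit()
aov = anova_lm(model)

# Main-plot F-test: irrigation mean square over the block x irrigation mean square
ms = aov["sum_sq"] / aov["df"]
f_irr = ms["C(irrigation)"] / ms["C(block):C(irrigation)"]
p_irr = stats.f.sf(f_irr, aov.loc["C(irrigation)", "df"],
                   aov.loc["C(block):C(irrigation)", "df"])

print(aov)  # crop load and the interaction are tested against the residual
print(f"Main-plot test for irrigation: F = {f_irr:.2f}, p = {p_irr:.3f}")
```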
Results
Weather, soil moisture, and vine water status.
Three of the five growing seasons experienced more or less average temperatures, but the trial period also covered an exceptionally warm season (2003) and an unusually cool season (1999) (Table 1⇑, Figure 1⇑). Mean maximum/minimum temperatures (all standard errors <0.9°C) during the period from budbreak to the beginning of bloom ranged from 19.9/7.6°C (2003) to 23.1/8.6°C (2001). The range during bloom–fruit set was 23.2/11.0°C (2000) to 27.6/13.8°C (2003). The variation in temperature among seasons was comparatively low between fruit set and veraison: 29.3/14.6°C (1999) to 31.4/16.0°C (2003). Differences were again significant (p < 0.001) from veraison to harvest: 28.0/13.4°C (2000) to 30.7/14.6°C (2001). Seasonal differences were even more pronounced between harvest and leaf fall (i.e., first frost), with mean temperatures varying from 16.9/5.8°C (2000) to 22.3/9.3°C (2003). Moreover, compared with the other seasons, the mean minimum temperatures in 2003 were consistently (p < 0.001) between 1.0 and 1.8°C higher in each period from bloom through harvest, and 2.5°C higher between harvest and leaf fall. Each summer had about 12 to 15 days with maximum temperatures >35°C (usually before veraison), but only three days over the five years reached temperatures slightly higher than 40°C. Winter temperatures only rarely declined below −5°C, but a series of unseasonably cold nights occurred from 31 Oct to 3 Nov 2002 (minimum −9.1°C).
The soil always dried down during spring and early summer (Figure 2⇑). By bloom, the soil moisture averaged over the top 90 cm generally reached ~9 to 11% (v/v), although there was some variation among seasons. Mean soil moisture during the bloom–fruit set period was significantly (p < 0.001) higher in 2000 (9.4%) and, especially, 2003 (10.3%) than in the other three years (8.4 to 8.9%). Indeed, because of the relatively abundant spring rainfall in 2003, no irrigation water was applied until after fruit set, which was reflected in the low amounts of total irrigation during that season (Table 1⇑). Because these seasonal values include the prebloom and postharvest water supply, treatment differences may not appear to be very large. However, during the actual treatment period the RDIE vines on average received 60% of the water applied in the other treatments, while the RDIL vines received only 45% of the water applied in the other treatments (a breakdown of irrigation water by period for the last three years of this study is provided in Schreiner et al. 2007). Therefore, once the irrigation treatments were imposed, differences in soil moisture due to treatment were far greater than those due to season, indicating that the RDI strategy worked very well on this site (Figure 2⇑). Although soil moisture fluctuated more in the top 30 cm than farther down the soil profile, differences due to irrigation treatments were similar down to 90 cm, which, no doubt, was a reflection of the sandy nature of the soil at this site. Consequently, RDIE resulted in the driest soil between fruit set and veraison, and RDIL maintained the driest soil from veraison through harvest. The more severe water-deficit treatments often maintained the soil moisture just slightly above the permanent wilting point (in the top 30 cm even below that), while abundant irrigation at the end of October refilled the soil profile to field capacity. The start time and, particularly, the duration of each deficit period varied considerably among seasons, depending on temperature effects on vine phenology (Table 2⇑).
Measurements of midday Ψx for the most part reflected the soil moisture data. In particular, RDIE consistently led to lower Ψx during the preveraison period than did the less severe deficit treatments, whereas the differences during the postveraison period were significant only in two out of the three years Ψx was monitored (Table 3⇓). It is possible that these findings reflected our sampling strategy: Ψx was measured on the day before irrigation water was applied. By that time the soil may often have dried down sufficiently to impose a similar plant water deficit in all treatments, even though treatments would have differed markedly earlier on in the dry-down cycle (Figure 2⇑). The crop-load treatment never influenced soil moisture or Ψx during any part of the season (data not shown).
Effect of deficit-irrigation timing and growing season on midday stem or xylem water potential (Ψx) the day before irrigation of field-grown, own-rooted Cabernet Sauvignon grapevines in the Columbia Valley, WA. No measurements taken in 1999 and 2002.
Growth and yield formation.
Although there was some seasonal variation, irrigation and crop load had only minor effects on vigor (shoot elongation per day), shoot length, canopy density (shoots per meter of canopy), leaf area, and vine size (pruning weight) of field-grown Cabernet Sauvignon (Table 4⇓). The increase in shoot number (means ± standard errors; shoot numbers were unaffected by any treatment) from 30 ± 0.2 shoots/m in 2001 to 32 ± 0.3 in 2002 and 35 ± 0.3 in 2003 (data not shown) was associated with an overall decrease in shoot vigor (p < 0.001). Therefore, both pre- and postbloom shoot growth decreased (p < 0.001) over the last three years of this study (when no shoot thinning was carried out), but did not differ among treatments (data not shown). By bloom the shoots reached 76 ± 0.9 cm (2001), 67 ± 0.9 cm (2002), and 62 ± 0.9 cm (2003). Shoot length at harvest, which on average occurred 105 d after bloom (Table 2⇑), was 92 ± 1.8 cm (2001), 76 ± 1.4 cm (2002), and 64 ± 1.0 cm (2003). Only in 2002 did RDIE result in lower postbloom vigor (i.e., shoot elongation rate between bloom and harvest) than RDIL and RDIS (Table 4⇓). The postbloom vigor of unthinned vines and their crop load (i.e., amount of fruit per unit leaf area) were inversely correlated (r = −0.37, p < 0.001, n = 416); this relationship was not influenced by irrigation treatments (Figure 3⇓). Consequently, as the crop load decreased below 1.0 kg/m2, vigor tended to increase exponentially, even though the vast majority of the shoots grew little (<3 mm/d) after bloom as intended by the RDI strategy.
Effect of deficit-irrigation timing, crop-load adjustment, and growing season on vegetative growth of field-grown, own-rooted Cabernet Sauvignon grapevines in the Columbia Valley, WA.
Relationship between amount of fruit per unit leaf area and postbloom shoot growth rate of own-rooted Cabernet Sauvignon grapevines. Data were pooled across three seasons and are for vines without crop adjustment only (all p < 0.001, n ≥ 144). (Negative vigor values reflect that many shoot tips had died by harvest.)
The mean vine leaf area prior to harvest was highest (p < 0.001) in 2000 and lowest in 2003, but the trend toward smaller leaf area with RDIE was significant in only two of the four years it was estimated (Table 4⇑). In the last three years of the trial there also was a trend toward slightly lower pruning weights of RDIE vines, which was significant in 2002 and 2003 (Table 4⇑). Shoot maturation (periderm formation) varied somewhat more between seasons (from 10 brown internodes per shoot by veraison in 2000 to six in 2003), but none of the treatments ever influenced shoot maturation (data not shown). Nevertheless, the canes tended to be lighter in the RDIE treatment (Table 4⇑). The decline in cane weight over time was associated with the simultaneous increase in shoot number (r = −0.52, p < 0.001); cane weight and shoot number also were negatively correlated within seasons (−0.45 < r < −0.36, p < 0.001, n ≥ 278). Despite the high and variable shoot density (17 to 51 shoots/m), however, there was no correlation with shoot length, periderm formation, or pruning weight. Shoot density did not influence the number of clusters per shoot in the following season (before crop adjustment) or the number of flowers per inflorescence, which implies that canopy density did not limit bud fruitfulness and cluster differentiation. Although the fruit:pruning-weight ratio varied from 1.1 to 11.0, this indicator of crop load and vine balance did not correlate with any of the above variables except cane weight (−0.48 < r < −0.34, p < 0.001). The RDI treatments failed to alter vine nutrient status as indicated by repeated petiole analysis in 2001, with one minor exception: RDIE vines had slightly but significantly (p < 0.001) lower Mg concentrations (data not shown). Similarly, leaf blade nutrients were unaffected by treatments in 2003 (Schreiner et al. 2007).
The number of clusters per shoot prior to thinning showed only minor variation from year to year (from 1.3 ± 0.02 in 2001 to 1.5 ± 0.03 in 2002 and 2003) and was unaffected by the experimental treatments (data not shown). By implication, then, neither irrigation nor crop adjustment impacted cluster initiation. Each year the noncount shoots consistently had the smallest (p < 0.001) flower clusters (180 ± 5 flowers per inflorescence, with 97% having <300 flowers), followed by inflorescences at the upper (306 ± 7 flowers) and finally lower positions (451 ± 7 flowers, with 90% having >300 flowers) on count shoots. The size of inflorescences decreased (p < 0.001) over the five years of this study. On average, the flower clusters were largest in 1999 (393 ± 16 flowers), followed by 2000 (319 ± 18 flowers), and then the remaining seasons (284 ± 13 flowers), among which there was no difference in mean inflorescence size. Furthermore, neither the RDI strategy nor the crop-load adjustment ever impacted flower numbers (data not shown), implying that neither treatment had a pronounced immediate (same season) or carry-over (following season) effect on cluster differentiation and, consequently, inflorescence size.
Fruit set decreased with increasing number of flowers per inflorescence on both count shoots (r = −0.64, p < 0.001, n = 80) and noncount shoots (r = −0.51, p < 0.001, n = 40). However, while the relationship between inflorescence size and fruit set was similar for the two clusters on count shoots (10% set at 700 flowers vs. 43% set at 150 flowers), clusters on noncount shoots generally set less fruit than those on count shoots (p < 0.001) when calculated over the same range of flower numbers (e.g., 14% vs. 22% set at 300 flowers and 27% vs. 43% set at 150 flowers). Concomitant with the decrease in inflorescence size, the proportion of flowers that set fruit increased over the course of the study. The relatively large inflorescences in combination with the cool prebloom period (Figure 1⇑) led to only 18 ± 0.6% fruit set in 1999. Average fruit set was 23 ± 1.0% in 2000, 24 ± 1.2% in 2001, 26 ± 1.4% in 2002, and 27 ± 1.6% in 2003. However, while all four of these seasons differed from 1999 (p < 0.001), the only pair among them that differed significantly from each other was 2000 and 2003 (p < 0.05), despite considerable variation in temperature and soil moisture during the bloom–set period. Considering that treatments were not imposed until after fruit set, it is not surprising that they failed to affect the proportion of flowers that set fruit (data not shown). Similarly, the treatments generally did not influence the number of berries per cluster at harvest (Table 5⇓). The seasonal mean berry number varied from 70 ± 2 (2002) to 50 ± 1 (2003).
Effect of deficit-irrigation timing, crop-load adjustment, and growing season on berry growth and cluster yield components of field-grown, own-rooted Cabernet Sauvignon grapevines in the Columbia Valley, WA.
Berry growth rates before veraison were highest in 2000 and 2001 (Table 5⇑). These were also the two seasons with the lowest temperatures (mean <18°C, compared with >20°C for the other three seasons), and hence the least GDD accumulation, during the bloom–fruit-set period. This was coupled in 2000, but not 2001, with relatively high soil moisture during the same period (Figure 2⇑). The standard RDI treatment usually led to the fastest rates of berry growth both before and after veraison, whereas RDIE slowed berry growth mostly before veraison and RDIL after veraison (Table 5⇑). However, there was a tendency in the last three years of this study for RDIE berries to expand somewhat more rapidly during ripening. Therefore, the more severe water deficit decreased final berry weight only in the first two seasons, when the RDIS berries were relatively heavy (Table 5⇑). The berries generally ceased expanding about three to four weeks after veraison (about two weeks before harvest) and maintained constant weight thereafter. Except in 2000, we could not find any evidence for compensatory berry growth after crop adjustment. Nevertheless, berry weight was negatively correlated with the number of berries per vine, especially in the CLH treatment (Figure 4⇓). Therefore, 2002, the year with the highest number of berries per vine, was also the year with the smallest berries, indicating that berry growth compensated for low berry number only when that number was established at fruit set, while compensatory growth no longer occurred when the berry number was reduced at veraison. However, the RDIE treatment deviated somewhat from the overall trend of increasing berry weight with decreasing berry number: berries remained smaller than in the other irrigation treatments when there were few berries per vine (Figure 4⇓).
Relationship between berry number per vine and harvest berry weight of own-rooted Cabernet Sauvignon grapevines. Data were pooled across five seasons and are for vines without crop adjustment only (all p < 0.001, n = 20).
Under current management practices at this site, yields fluctuated very little from year to year, even though on a single-vine basis the crop levels usually varied from 2 to 9 kg/vine (equivalent to a yield of 4 to 18 t/ha) within the same season. The average yield of the CLH vines over the five seasons was 10.01 t/ha with a standard error of only 0.10 t/ha. The lightest crop in CLH was harvested in 1999 (9.42 t/ha) and the heaviest in 2002 (11.05 t/ha). Cluster thinning consistently decreased cluster numbers and crop levels (Table 6⇓); averaged over the three years it was carried out, the reduction in yield was 35% (crop load −32%). Therefore, the CLL vines on average yielded 6.67 ± 0.12 t/ha, which was very close to the targeted 6.7 t/ha. Conversely, the influence on yield of timing and extent of deficit irrigation was inconsistent. Although RDIL on average led to lower yields (p < 0.001) compared with RDIE and RDIS, the differences were usually small, and the latter two treatments did not differ (Table 6⇓). However, despite the, at best, minor effect of extra water deficit on shoot growth and yield formation, the crop load of the RDIE vines was higher than that of the other two irrigation treatments in three out of five seasons (Table 6⇓). The yield of unthinned vines correlated positively with their pruning weight both across (r = 0.41, p < 0.001, n = 720) and within (0.46 < r < 0.54, p < 0.001, n > 140) seasons, indicating not only that larger vines had the capacity to support a heavier crop but also that the crop did not decrease future vine capacity. However, the effect of vine size on cropping was not due to varying numbers of (count or total) shoots per vine (r < 0.16, ns) but instead was related to the number of clusters per shoot (0.30 < r < 0.52, p < 0.001) and cluster weight (0.47 < r < 0.76, p < 0.001) which, in turn, was mostly determined by the number of berries per cluster.
Effect of deficit-irrigation timing, crop-load adjustment, and growing season on yield and crop load of field-grown, own-rooted Cabernet Sauvignon grapevines in the Columbia Valley, WA. (Yield (t/ha) may be estimated by multiplying the crop level by two.)
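The caption’s rule of thumb follows directly from the planting density given in the Methods (1.83 × 2.74 m, ~1996 vines/ha), assuming crop level is expressed in kg of fruit per vine; a quick illustrative check:

```python
# Check of the "multiply crop level by two" rule, assuming crop level is kg/vine
# and using the vine x row spacing given in the Methods.
vine_area_m2 = 1.83 * 2.74
vines_per_ha = 10_000 / vine_area_m2          # ~1996 vines/ha

crop_level_kg_per_vine = 3.35                 # example value
yield_t_ha = crop_level_kg_per_vine * vines_per_ha / 1000
print(f"{crop_level_kg_per_vine} kg/vine -> {yield_t_ha:.1f} t/ha (~2x the crop level)")
```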
Fruit ripening and composition.
Neither the RDI strategy nor cluster thinning altered the date of veraison (Table 2⇑), which consistently occurred around 1000 GDD. The rate of sugar accumulation during ripening was calculated as the mean daily increase in the concentration of soluble solids between the first postveraison sampling and harvest. Sugar accumulation varied considerably (p < 0.001) among seasons: 0.24 ± 0.003 Brix/d in 1999, 0.21 ± 0.006 Brix/d in 2000, 0.29 ± 0.005 Brix/d in 2001, 0.26 ± 0.006 Brix/d in 2002, and 0.34 ± 0.012 Brix/d in 2003 (data not shown). As the low standard errors suggest, the various treatment combinations did not influence sugar accumulation in any season. Therefore, because the grapes in the different treatments ripened almost synchronously, all treatments were always harvested on the same day within a given year (Table 2⇑). Only in the first (and coolest) year of the study did the grapes not quite reach the target soluble solids concentration of 24 Brix by the time they were harvested (Table 7⇓). In the other four seasons they easily and consistently exceeded that threshold by about 1 Brix. Nevertheless, TA was highest (and pH lowest) in 2000 (p < 0.001), which had the coolest ripening period of all seasons, and TA was lowest (and pH highest) in 2001, the year with the warmest ripening period. Color density, on the other hand, was ~12% lower in 2000 (p < 0.001) than in the subsequent seasons (Table 7⇓). The juice [K+] was higher (p < 0.001) in 2003 (2.9 ± 0.03 g/L) than in the preceding seasons, when it was consistently around 2.4 ± 0.07 g/L (data not shown). Neither irrigation nor crop load ever affected [K+], which, in turn, did not correlate with any of the other measures of fruit quality or with any yield component. The more severe RDI treatments did not result in any gains in terms of fruit composition (at least not among the components measured in this study) over and above the standard irrigation regime (Table 7⇓). On the contrary, there was a trend for RDIE to slightly decrease juice color compared with RDIL and RDIS. By contrast, cluster thinning tended to lead to slightly higher soluble solids and, in one year, lower TA. Again, the ripening periods of the two seasons in which the crop-load effect was significant were considerably cooler than those of the remaining seasons. Nevertheless, in no case did we find a significant (negative) correlation between crop level or crop load and fruit soluble solids or color density. There also were no significant correlations between shoot density (shoots/m) or leaf area per vine and soluble solids or color, indicating that fruit ripening was not limited by canopy density or size (i.e., source area). In addition, although the amount of sugar per berry always increased linearly with berry weight (r > 0.93, p < 0.001, n = 24), berry weight was never correlated with the concentration of soluble solids. In only two seasons (2001 and 2003) was there a negative relationship between berry weight (and by implication berry size; i.e., skin:pulp ratio) and color density (r = −0.43, p < 0.05). Color density did, however, increase with increasing TA in each season (0.43 < r < 0.74, p ≤ 0.05) and also increased with increasing [H+] (r = 0.55, p < 0.001, n = 95 across seasons).
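As a small illustration of the ripening-rate calculation defined above (not the study’s data), the mean daily gain in soluble solids is simply the Brix difference between the first postveraison sampling and harvest divided by the number of days between them:

```python
from datetime import date

# Illustrative ripening-rate calculation; dates and Brix values are invented.
first_sampling = (date(2001, 8, 20), 17.5)   # (date, Brix) at first postveraison sampling
harvest        = (date(2001, 9, 25), 25.0)   # (date, Brix) at harvest

days = (harvest[0] - first_sampling[0]).days
rate_brix_per_day = (harvest[1] - first_sampling[1]) / days
print(f"Sugar accumulation: {rate_brix_per_day:.2f} Brix/day over {days} days")
```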
Effect of deficit-irrigation timing, crop-load adjustment, and growing season on fruit composition at harvest of field-grown, own-rooted Cabernet Sauvignon grapevines in the Columbia Valley, WA.
Cold hardiness.
Differential thermal analysis of bud and cane samples collected throughout the 2000/01 and 2003/04 dormancy periods showed that the low-temperature exotherms (LTE) representing the lethal temperature for 10% (LTE10), 50% (LTE50), and 90% (LTE90) of the buds were well correlated. The LTE90 for buds was ~2°C lower (r = 0.87, p < 0.001, n = 236), and the LTE50 was ~1°C lower (r = 0.92, p < 0.001) than the corresponding LTE10. The LTEs for buds and those for cane phloem or xylem tissues (LTE10, indicating 10% injury) were less well correlated. As expected, the degree of cold hardiness for all tissues improved from leaf fall through midwinter, after which the vines began to deacclimate and gradually lost cold hardiness (Figure 5⇓). On average, buds and, especially, phloem tissues were considerably hardier during the 2000/01 winter than in 2003/04, and even the xylem was hardier during the deacclimation phase. However, neither severity and timing of deficit irrigation nor cluster thinning had any impact on cold hardiness of buds and cane tissues.
Effects of deficit irrigation on cold hardiness (means ± se) of Cabernet Sauvignon buds (A), cane phloem (B), and cane xylem (C) during the 2000/01 and 2003/04 dormant seasons in the Columbia Valley, WA.
Discussion
This study investigated the potential of relatively severe RDI to control canopy growth (i.e., to reduce vigor), improve fruit composition, and conserve water resources, without impacting vine capacity and balance, yield formation, and cold hardiness of field-grown Cabernet Sauvignon. In addition, crop load was adjusted by cluster thinning in a portion of the vines in an effort to produce high-quality fruit and to test interactive effects with irrigation management. Irrigation treatments that imposed moderate to severe soil and plant water deficit (RDIE and RDIL) decreased vegetative growth only marginally beyond the level achieved with the standard deficit treatment (RDIS). Moreover, there were very few interactions between the irrigation and crop-load treatments, indicating that the higher crop load did not compromise vine performance under the more severe water deficit. This finding seems at first surprising, since fruit growth often occurs at the expense of vegetative growth under dry conditions (Eibach and Alleweldt 1985). However, shoot vigor and final length in this vineyard, managed using RDIS, were already within the range (60 to 90 cm) considered optimal for high-quality winegrape production (Smart et al. 1990). This optimal shoot length was generally reached before the treatments were imposed (during the initial dry-down phase), and growth thereafter was minimal. Growth is extremely sensitive to water stress: for example, growth of Riesling shoots has been reported to cease at midday Ψleaf ≤ −1.2 MPa (Schultz and Matthews 1988). In the present study, average midday Ψx values on the day before the irrigation treatments were initiated were ~−1.4 MPa, which would have put Ψleaf well below the “stop-growth” threshold (Choné et al. 2001, Williams and Araujo 2002). Obviously, once growth ceases, more severe deficit cannot have an additional effect. This was reflected in the relatively stable pruning weights over the five years, which also indicated that, within the range of crop loads achieved at this site, the rather severe water deficit hardly compromised vine capacity. Further evidence for sustained vine capacity is seen in the failure of any of the treatment combinations to compromise cluster initiation (clusters per shoot), cluster differentiation (flowers per cluster), cane maturation (periderm formation), or cold hardiness. Cold hardiness of Cabernet Sauvignon has previously been reported to be insensitive to deficit irrigation (Hamman and Dami 2000). In agreement with earlier research (Eibach and Alleweldt 1985), however, the more severe RDI treatments did reduce root growth in this vineyard, although that may have been partly compensated for by increased “mycorrhization” of the roots (Schreiner et al. 2007). Such compensation could also explain the failure of our irrigation treatments to impact vine nutrient status.
The variation in growth and other aspects of vine performance (see below) due to seasonal effects (i.e., effects of variation in climate, especially temperature) was far greater than that due to treatment effects (i.e., effects of variation in cultural practices, such as irrigation and cluster thinning). For instance, while 2003 had the coolest prebloom period, it quickly became very warm during bloom and remained the warmest season through leaf fall. This temperature effect in addition to the increase in shoot number per vine, rather than differences in (already low) soil moisture, may have been the reason for the early cessation of shoot growth and very low shoot vigor in 2003. Similarly, compared with 2003, the running mean temperature during the hardening phase (October through December) in 2000 was 2.2°C lower, and the mean temperature during the deacclimation phase (January through March) was 1.1°C lower; this difference was reflected in more cold-hardy buds and canes in the winter of 2000/01 than in 2003/04.
Although vine capacity and productivity were maintained throughout the study period, the vines appeared to somewhat self-regulate their yield potential early in the season: the number of clusters per vine generally varied in the opposite direction from the number of flowers per inflorescence between years (r = −0.32, p < 0.05, n = 60). In other words, years with many clusters per vine tended to be associated with smaller inflorescences and vice versa. Excluding 2001 from the data set improved this correlation markedly (r = −0.50, p < 0.001, n = 48); that year had three nights with freezing temperatures immediately before budbreak, and the vines had fewer and smaller inflorescences. Regardless of whether this self-regulation (or compensation) occurred in the previous season (during bud development) or in the current season (during flower development), it was apparently not affected by our treatments. The largest “down-regulation” of the yield potential, however, occurred during the bloom–fruit-set period. The temperature during this period appeared to have little influence on fruit set; mean minimum temperatures (Tmin) varied from 11 to 14°C among years and mean maximum temperatures (Tmax) varied between 23 and 28°C. Even in the last year of the trial, fruit set was still at the low end of what is considered normal. Therefore, it seems likely that the relatively low yields on this site were partly due to poor fruit set, which may have been caused by water deficit during bloom (Hardie and Considine 1976). Indeed, while no Ψx measurements were conducted during this period in the first two seasons, in 2001 the mean midday Ψx (n = 48) declined from −0.5 MPa on 12 June to −1.3 MPa on 14 and 19 June. In 2002 Ψx declined from −1.3 MPa on 10 June to −1.5 MPa on 24 June, and in 2003 Ψx decreased from −1.3 MPa on 9 June to −1.6 MPa on 17 June. Therefore, if higher yields are to be achieved, the current practice of withholding irrigation water prior to bloom to control shoot growth may have to be modified, especially on soils with low water-holding capacity, such as the one in this study.
The different RDI treatments were usually not imposed until about two weeks after fruit set, toward the end of the berry cell-division phase (Harris et al. 1968). By that time the soil had already dried down considerably (<9.5% v/v), midday Ψx typically was between −1.3 and −1.5 MPa, and shoot growth was minimal because of the withholding of water through bloom and fruit set. The lowest soil moisture (<7.5%) between fruit set and the onset of RDI treatments was reached in 2001, the year with the warmest temperatures during the same period, and the only year in which berry growth rates did not differ among treatments. In the other years, berry growth was quite sensitive to the RDI treatments both before and after veraison. The final berry size, notably in the RDIL and RDIS vines, was significantly larger in the first two seasons than in the last three seasons, and this difference was apparent shortly after fruit set. The berry weights in the last three seasons were at the lower end of the range for Cabernet Sauvignon grown in Bordeaux (Jones and Davis 2000), and the more severe water-deficit treatments did not decrease berry weights below those obtained with RDIS. These results suggest that, in the absence of significant seasonal rainfall (and combined with high evaporative demand: mean ET0 ≈ 7 to 9 mm/d from bloom to veraison and 5 to 7 mm/d from veraison to harvest), the RDIS treatment may have been sufficient to minimize berry size from 2001 onward. However, although berry number per vine was an important determinant of berry weight (especially on unthinned vines), RDIE usually resulted in small berries (<1.2 g) even when the berry number was low (<5,000/vine). In other words, RDIE prevented compensatory berry growth when the vines had few berries, whereas berry size was minimized (<1.1 g) irrespective of the irrigation regime when the vines entered the berry development phase with many berries (>5,000). Water deficit early during berry development has often (Matthews and Anderson 1989), although not always (McCarthy 1997), been most effective at minimizing berry size. Our results may provide an explanation for this discrepancy among irrigation studies.
Fruit ripening and composition at harvest were relatively consistent from year to year, although sugar accumulated at slightly slower rates and TA was higher in the first two experimental seasons, both of which experienced cooler ripening periods (Tmin ≈ 12 to 14°C; Tmax ≈ 27 to 28°C) with slightly less solar radiation compared with the last three years (Tmin ≈ 13 to 15°C; Tmax ≈ 29 to 31°C). Despite the larger berry size in the first two seasons and its possible relationship with irrigation amount, berry size and berry growth rate were poor predictors of sugar accumulation (as were crop level and crop load, although there was indeed a weak but significant negative correlation between crop load and sugar accumulation rate in 2000 and 2002). This suggests that temperature differences were the major cause of the variation in sugar accumulation among seasons. The only significant correlation (r = 0.50, p < 0.05, n = 24) between soluble solids and color density was in 2000, when color was relatively low and berries relatively large. While color density was directly related to juice acidity, increasing linearly with increasing TA and [H+], there was no relationship between juice [K+] and [H+], which is consistent with earlier findings (Boulton 1980). In the last three years of this study there was a tendency for TA to be lower in the early postveraison samples of RDIE fruit than in RDIL and RDIS, but the subsequent decrease in acidity was similar in all treatments. This suggests that RDIE led to a reduction in (malic) acid accumulation before veraison (Matthews and Anderson 1988), whereas malic acid degradation was not affected by the relatively severe water deficit (RDIL) imposed during ripening.
It is generally agreed that some water deficit is beneficial for (red) grape and wine quality but, as the present results show, more stress is not necessarily better. On the contrary, RDIE tended to result in lower color compared with the other treatments. It appears that once control of shoot growth and canopy development has been achieved, additional (more severe) water deficit may not result in further improvements of fruit composition, regardless of whether that deficit is imposed before or after veraison. On the other hand, our data also suggest that considerable savings in water resources can be achieved without adverse impacts on grapevine (or at least Cabernet Sauvignon) performance in terms of growth, fruiting, and cold hardiness. The lowest annual water input in this vineyard from both rainfall and irrigation was 308 mm (RDIE treatment in 2003). Nevertheless, we reemphasize that stressing vines too early (i.e., before fruit set) may curtail potential yields, even if more buds were retained at pruning to compensate for low fruit set. The poor correlation between bud number and yield of unthinned vines (r = 0.10, p < 0.01) corroborates this conclusion.
The present results confirm those from another recent study evaluating crop adjustment under similar, although slightly cooler, climatic conditions (Keller et al. 2005). Although cluster thinning reduced yields, there were, at best, marginal gains in terms of fruit composition (even though all treatments were harvested on the same day) and none in terms of periderm formation (i.e., shoot maturation) and cold hardiness. In four out of five years, soluble solids were still increasing at a rate of 0.1 to 0.2 Brix/d during the week leading up to harvest, and this increase was not influenced by the fruit:pruning-weight or leaf-area:fruit-weight ratios (i.e., independent of whether or not the crop had been reduced). In other words, while cluster thinning reduced yield and crop load by roughly one-third, it advanced fruit maturity at most by three to four days. The fact that fruit composition differed at all between treatments might be attributed to the selective removal at veraison of green clusters on the west side of the canopy, where excessive heat can interfere with ripening (Spayd et al. 2002). Moreover, there were almost no interactions between crop-load and irrigation treatments, suggesting that even the vines subject to relatively severe water deficit were able to support and ripen their crop. In fact, using figures for “optimum” crop loads (Bravdo et al. 1985, Smart et al. 1990: 5 to 10 kg fruit/kg pruning weight) (Kliewer and Dokoozlian 2005: 0.8 to 1.2 m2 leaf area/kg fruit), these vines were at the low end of optimal to slightly undercropped. This indicates that the vines were sink-limited, which may at least partly account for the poor response of crop level and fruit composition to more severe water deficit (Poni et al. 1993). The stimulation of shoot vigor on vines with very low crop loads confirms this conclusion. In addition, there was no relationship between shoot density or leaf area and berry color, suggesting that fruit exposure was not limiting fruit quality. Of course, it is possible that quality-relevant compounds other than the ones measured here might have been influenced by crop adjustment or irrigation practice (see Chapman et al. 2004, 2005). Nevertheless, earlier research concluded that excessive irrigation was more detrimental to Cabernet Sauvignon wine quality than was high crop load (Bravdo et al. 1985).
Conclusions
The present data suggest that considerable savings in water resources can be achieved without marked short- and longer-term effects (whether beneficial or detrimental) on deficit-irrigated, own-rooted Cabernet Sauvignon performance in terms of growth, nutrient status, yield formation, fruit composition, and cold hardiness. Irrigation replacing as little as 30% of ETFV, either before or after veraison, did not lead to a further decrease in shoot growth and berry size and had only minor effects on fruit composition compared with the “standard” 60% ETFV replacement. A slight increase in pH and decrease in acidity and color in some years may be the (small) price to pay for imposing relatively severe water deficit before veraison. Withholding water too early (before fruit set), however, may reduce the vines’ yield potential. Similarly, cluster thinning significantly reduced crop loads and yields but had little effect on growth, fruit ripening and composition, and cold hardiness. The minor differences in fruit maturity that were observed could be explained by the selective removal at veraison of green clusters on the west side of the canopy. Moreover, very few interactions were found between the irrigation and crop-load treatments, which implies that cluster thinning did not influence the vines’ response to deficit irrigation. In the sunny climate of southeastern Washington, temperature, rather than soil moisture or crop load, was the main factor accounting for seasonal fluctuations in fruit composition. Hence it is questionable whether the minor advancement of grape maturity justifies the loss of potential yield and the costs involved in crop adjustment of Cabernet Sauvignon, at least where the growing season is long enough to achieve adequate fruit maturity and climatic conditions and soil properties are conducive to using deficit irrigation.
Footnotes
Acknowledgments: This work was partially funded by WSU’s Agricultural Research Center, project WPN00428, and by the Washington Wine Advisory Committee program. Financial and in-kind support was also provided by Ste. Michelle Wine Estates.
The authors thank B. Zimmermann, M. Mireles, C. Longoria, and A. Kawakami for technical assistance.
- Received December 2007.
- Revision received March 2008.
- Copyright © 2008 by the American Society for Enology and Viticulture