Abstract
Cold injury is a key environmental challenge in many grape-producing regions, especially those at high latitudes. Although grapevines acclimate to cold temperatures in fall and deacclimate when warm temperatures return in spring, cold hardiness varies with species, cultivar, phenology, ambient weather, photoperiod, and plant organ, which hampers implementation of effective mitigation practices. Using long-term data sets of lethal temperatures and spring phenology for primary buds of Vitis vinifera and Vitis labruscana, we parameterized and evaluated a discrete-dynamic model that simulates cold hardiness from early fall through budbreak of 23 genotypes. The model uses mean daily temperature as the sole input variable to drive daily changes in hardiness. Genotype-specific parameters, such as initial and maximum hardiness, temperature thresholds, acclimation and deacclimation rates, and chilling and heating requirements, were optimized through an iterative process. The model predicted cold hardiness with 0.89 ≤ r2 ≤ 0.99, depending on genotype. Because it simulates hardiness at budbreak, the model can also be used to predict the time of budbreak. Optimized model parameters revealed a north/inland-south/coastal gradient for genotype origin in terms of initial and maximum cold hardiness, and time of budbreak. Budbreak occurred earlier in hardier genotypes, consistent with more rapid deacclimation of genotypes originating from colder climates, paradoxically making these genotypes more vulnerable to spring frost in warmer environments. The current model of grapevine bud cold hardiness has uses in both climate modeling and risk assessment.
Cold hardiness (Hc), or the ability to tolerate freezing temperatures, is a major concern in many grapegrowing regions of the world where the ambient temperature can drop below freezing. Cold hardiness is a dynamic trait acquired in response to shortening photoperiod and declining temperature in late fall or early winter and varies with species, cultivar, phenology, ambient weather, and the plant organ of interest (Xin and Browse 2000, Gusta and Wisniewski 2013, Pagter and Arora 2013). Green, growing organs and tissues generally lack hardiness and may sustain injury at temperatures only slightly <0°C. Damage can occur even when the ambient air temperature remains >0°C if radiative cooling under clear sky conditions leads to subzero tissue temperatures. Conversely, plant tissues are most cold hardy during midwinter (i.e., the middle of the dormant season) when ambient temperatures are lowest. Depending on the aforementioned variables, grapevines (Vitis spp.) may survive temperatures ranging from approximately −10°C to less than −30°C during the dormant period (Fennell 2004, Mills et al. 2006, Keller 2010, Ferguson et al. 2011).
The transition between the growing and dormant seasons is associated with acclimation and deacclimation processes that alter the level of Hc (Keller 2010, Gusta and Wisniewski 2013, Pagter and Arora 2013). The genetic programs for dormancy and Hc are superimposed, and the transition from paradormancy to endodormancy is a prerequisite for the subsequent acquisition of full Hc (van der Schoot and Rinne 2011). Even in midwinter these processes respond to fluctuations in temperature that lead to short-term changes in Hc. This is especially noticeable when plants begin to deacclimate in response to unseasonal warm spells (Ferguson et al. 2011, Pagter and Arora 2013). Temperature-driven acclimation/deacclimation cycles continue until the changes leading up to budbreak render the deacclimation process irreversible. Furthermore, the dynamic nature of Hc is partially genetically determined; different species and cultivars may acquire varying levels of Hc as well as vary in their responses to changes in temperature.
These aspects of Hc complicate production strategies and lead to spatial and temporal variation in crop production. A specific low temperature that is of no concern at one point during the dormant season could pose a threat at other times or could be problematic for some cultivars being grown at a specific site but not for other cultivars or other sites (Ferguson et al. 2011). This has important practical consequences. In regions where the risk of cold damage is high, good agricultural practices suggest developing vineyards in areas less prone to damaging cold temperature occurrence or cold-air pooling (Widrlechner et al. 2012). Cultivar selection can be used to match appropriate genotypes with varying degrees of Hc to different sites, but this requires detailed knowledge of the dynamic behavior of each cultivar in response to changing meteorological conditions. Understanding this dynamic behavior is also required to make informed decisions pertaining to the implementation of protective measures such as the use of wind machines or heaters. In addition, the choice of pruning practices may be influenced by the timing and severity of a damaging cold event (Keller and Mills 2007, Dami et al. 2012).
The dilemma faced in production viticulture is in knowing precisely at what time and location damaging temperatures may occur. To address this challenge, we developed a robust numerical Hc model for dormant primary buds of three diverse grape genotypes (Ferguson et al. 2011). This dynamic thermal-time model predicts bud hardiness using genotype-specific coefficients (e.g., minimum and maximum hardiness, acclimation and deacclimation rates, and ecodormancy boundary) with daily mean temperature as the single input variable. A key strength of such a model is that, given only temperature measurements, Hc prediction can be extended across entire regions, which is considerably more cost-effective than conducting frequent real-time Hc assessments at multiple locations.
Initial feedback from early adopters of this model showed several limitations, including the few genotypes that were parameterized and relatively poor predictive performance during late winter/early spring, when buds are deacclimating. The latter issue is critical given the extensive variation in local weather conditions, including unexpected frost events, during this period. In some instances, the model tended to predict unrealistic Hc values (−5 to −10°C) at a time when budbreak was observed in the field (Ferguson et al. 2011). The likely explanation for this anomaly is that the model was developed for dormant buds using data generated by differential thermal analysis (DTA). Such data are not readily obtained once buds approach budbreak and their water content rises. Moreover, bud sampling was terminated at varying times in some genotypes and some years to permit timely winter pruning, resulting in a relative paucity of available DTA data close to budbreak. Thus, the model can only extrapolate Hc to this time, a shortcoming that is addressed by using spring phenology data in the new model presented here.
The present study had two main objectives. The first was to develop model variants for a wide range of grape genotypes. These genotypes comprise cultivars of Vitis vinifera L., of diverse Eurasian origin and used mostly for wine production, and Vitis labruscana Bailey, of northeastern North American origin and mostly used for juice production. The second objective was to enhance the performance of the previous model during the period leading up to and during budbreak, a time for which limited DTA data are available. We limited our analysis to primary buds due to their role as the main source of yield potential for the subsequent growing season. All abbreviations and units of measurement used are defined in Supplemental Table 1.
Materials and Methods
Cold hardiness data.
The Hc of endo- and ecodormant primary buds of up to 23 Vitis spp. genotypes has been routinely measured in our laboratory since 1988, using cane samples collected in the vineyards of the Irrigated Agriculture Research and Extension Center (IAREC) in Prosser, WA (lat. 46.3°N; long. 119.7°W; 260–365 m asl), and in the cultivar collection of Ste. Michelle Wine Estates, Paterson, WA (lat. 45.9°N; long. 119.6°W; 195 m asl). Data were collected as described for cultivars of V. vinifera and V. labruscana (Mills et al. 2006) (Table 1), using DTA that measures low-temperature exotherms (LTE) and high-temperature exotherms (HTE). An LTE corresponds to the (lethal) temperature at which supercooled intracellular water freezes in an organ. The Hc is expressed as LT50, which is the lethal temperature for 50% of buds tested. Measurements of Hc typically started in the fall (near the time of harvest) and continued through the dormant period until either pruning limited plant material availability or rapid deacclimation prior to budbreak made the DTA method unreliable. The latter limitation arises because it becomes increasingly difficult to distinguish LTE from HTE in swelling buds, as their water content increases from <50% to >75% and they lose the ability to supercool (Lavee and May 1997, Fuller and Telli 1999, Fennell 2004). An HTE indicates the freezing of extracellular water, which occurs at higher temperatures and is usually not lethal, although it induces cellular dehydration (Xin and Browse 2000, Fennell 2004). Consequently, our Hc data sets start and end at differing times each year. Data for each genotype were collected at ~2-week intervals, but for some genotypes in some years DTA measurements were conducted daily or weekly. The genotypes assessed changed from year to year (Table 1), depending on availability or industry interest. The country of origin for each genotype was taken from the National Grape Registry (ngr.ucdavis.edu).
In addition, the genotypes were assigned to five broad groups according to the environment of their original distribution (Table 1). Because the precise geographic origin of many V. vinifera cultivars is unknown, we applied a simple procedure that resulted in a gradient from mild (group 1) to cold (group 5) winters: south (groups 1 and 2) or north (groups 3 and 4) of the Alps, coastal (groups 1 and 3) or inland (groups 2 and 4). The V. labruscana cultivars were assigned to group 5.
Spring phenology data.
Owing to the above limitations, Hc was approximated during the rapid deacclimation period leading up to and during budbreak, using phenological data obtained since 1988 in the same IAREC vineyards from which DTA data were collected (Table 1). The day of year (DOY) was recorded for woolly bud, budbreak, first leaf, second leaf, and fourth leaf separated from the shoot tip. Published data (Table 2) were used to infer Hc at each DOY for our phenological data. Because no cultivar-specific data are currently available, we used the same Hc across all cultivars within a species for a specific phenological stage. The V. vinifera values were derived from cv. Pinot noir (Gardea 1987) and the V. labruscana values from cv. Concord (Proebsting et al. 1978); these authors used similar protocols of freezing bud samples to a range of predetermined temperatures, then thawing the samples for ≥24 hr and visually evaluating tissue browning as a measure of cold damage. The phenology-derived Hc data were added to the DTA-derived Hc data to extend the data set beyond budbreak.
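As an illustrative sketch, the species-level assignment of Hc to phenological stages can be represented as a simple lookup; only the budbreak and fourth-leaf values quoted in this paper are filled in here (the remaining stages of Table 2, such as woolly bud and first/second leaf, are omitted), and the function name is ours:

```python
# Species-level Hc (°C) assigned to phenological stages; identical values
# are used for all cultivars within a species. Only values quoted in the
# text are included in this sketch.
STAGE_HC = {
    "V. vinifera":   {"budbreak": -2.2, "fourth leaf": -1.2},
    "V. labruscana": {"budbreak": -6.4, "fourth leaf": -2.5},
}

def phenology_to_hc(species, stage):
    """Convert a phenological observation into an inferred Hc data point."""
    return STAGE_HC[species][stage]
```

Each recorded DOY for a stage then contributes one (DOY, Hc) pair that extends the DTA-derived data set beyond budbreak.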
Model parameterization.
The present Hc model was built on the discrete-dynamic model (Ferguson et al. 2011); that report describes and discusses the complete mathematical structure and underlying assumptions of the model presented herein. The model was coded and parameterized in SAS (ver. 9.2; SAS Institute, Cary, NC). The daily mean air temperature (Tmean) was used as the input variable that drives changes in acclimation and deacclimation to predict daily changes in Hc (ΔHc). The ΔHc was added to the previous day’s Hc (Hc,i-1) to give the current day’s Hc (Hc,i). The Tmean was estimated as the average of the minimum (Tmin) and maximum (Tmax) daily temperatures provided by the nearest (<10 km) Washington State University (WSU) AgWeatherNet (weather.wsu.edu) station for each vineyard site: the IAREC on-site station (lat. 46.3°N; long. 119.7°W; 265 m asl) and the Paterson station (lat. 45.9°N; long. 119.5°W; 129 m asl). Although Tmean ignores possible diurnal acclimation/deacclimation cycles due to differences in Tmin and Tmax, it is the unit that corresponds to, and therefore is relevant for, the measured Hc data, which were acquired at most daily. Moreover, photoperiod was not included as a driving variable, because the starting point (DOY 250) for the model was chosen for endodormant buds (see below). Though shorter photoperiods may induce bud dormancy in grapevines, low temperatures are required to acquire full Hc (Schnabel and Wample 1987, Fennell and Hoover 1991; M. Keller and L.J. Mills, authors’ unpublished data, 2013). In addition to the introduction of spring phenology, the differences in Hc among genotypes were captured by genotype-specific parameters as described previously (Ferguson et al. 2011) with the following modifications (abbreviations in Supplemental Table 1).
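To make the daily bookkeeping concrete, the core update Hc,i = Hc,i-1 + ΔHc can be sketched as below. This is a deliberately simplified illustration, not the published model: it uses a single temperature threshold and one acclimation/deacclimation rate pair, and it omits the logistic bounds, the theta exponent, and the endo-/ecodormancy switch. All parameter values in the example are invented.

```python
def simulate_hc(tmean, hc_initial, hc_min, hc_max, t_th, k_a, k_d):
    """Simplified daily cold-hardiness loop driven solely by Tmean.

    Hc is negative; hc_max is the most negative (hardiest) bound and
    hc_min the least negative. Days below t_th drive acclimation
    (Hc becomes more negative); days above t_th drive deacclimation.
    """
    hc = hc_initial
    trace = [hc]
    for t in tmean:
        if t <= t_th:
            d_hc = -k_a * (t_th - t)   # chilling day: gain hardiness
        else:
            d_hc = k_d * (t - t_th)    # heating day: lose hardiness
        hc = max(hc_max, min(hc + d_hc, hc_min))  # keep Hc within bounds
        trace.append(hc)
    return trace
```

For example, a run of 0°C days starting from Hc = −10°C drives the simulated Hc down toward the hc_max bound, while a subsequent warm spell moves it back toward hc_min.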
First, we adapted the estimation of the initial hardiness (Hc,initial): that is, the Hc of endodormant buds in late summer or early fall after the shoots have formed brown periderm (only brown buds were sampled from shoots that were changing from green to brown) but before the subsequent, temperature-driven cold acclimation process in late fall and early winter (Pouget 1963, Schnabel and Wample 1987, Fennell 2004, Keller 2010). This was necessary because early Hc measurements were not routinely collected for many of the genotypes, since the original intent of these measurements was not the development of a model. Therefore, the Hc,initial of V. vinifera cv. Cabernet Sauvignon, the genotype with the largest available data set (Table 1), was used to estimate the Hc,initial of all other genotypes. The Cabernet Sauvignon Hc,initial was computed as the mean Hc (n = 6) of the earliest (typically mid- to late September) yearly data available from the optimization data set (see below). Because Hc of different genotypes was often measured on different dates, Cabernet Sauvignon Hc was interpolated (SAS proc. Expand, step size = 1 day) between consecutive dates for which DTA measurements were available through the winter solstice (21 December, northern hemisphere). This termination date was chosen to include all measurements taken during the fall acclimation period; grapevines typically reach their most hardy condition or maximum Hc (Hc,max) by this date (Schnabel and Wample 1987, Ferguson et al. 2011). This procedure gave an estimated Cabernet Sauvignon Hc for each DOY for which a measured Hc of any other genotype was available. Using regression analysis (SAS proc. Reg), the linear relationship between Hc of Cabernet Sauvignon and that of each of the other genotypes was applied to extrapolate Hc,initial of each genotype relative to the Hc,initial of Cabernet Sauvignon (Table 3). This approach maintained the relative hardiness rankings among genotypes to start the model.
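The regression step of this procedure can be sketched as follows, using made-up Hc series; the interpolation to a daily step (SAS proc. Expand) is replaced here by the assumption that the Cabernet Sauvignon and genotype series are already matched by date:

```python
def extrapolate_hc_initial(cs_hc, genotype_hc, cs_hc_initial):
    """Ordinary least-squares fit of a genotype's Hc against Cabernet
    Sauvignon Hc on matched dates, evaluated at Cabernet Sauvignon's
    Hc,initial to estimate the genotype's Hc,initial."""
    n = len(cs_hc)
    mx = sum(cs_hc) / n
    my = sum(genotype_hc) / n
    sxx = sum((x - mx) ** 2 for x in cs_hc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cs_hc, genotype_hc))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept + slope * cs_hc_initial
```

Because the extrapolation is anchored to the same reference genotype, the relative hardiness rankings among genotypes at the model start are preserved.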
Second, we updated the least cold-hardy condition or minimum hardiness (Hc,min) allowable for the model. Previously we had used Hc,min = −3°C across genotypes (Ferguson et al. 2011). The new Hc,min was taken as the hardiness of green growing tissues (fourth leaf stage): −1.2°C for all V. vinifera cultivars and −2.5°C for V. labruscana cultivars (Table 2). The former value was derived from V. vinifera cv. Pinot noir (Gardea 1987) and the latter from V. labruscana cv. Concord (Proebsting et al. 1978); reliable data for other genotypes are currently unavailable.
Third, we altered the calculation of the asymptotic bounds applied during deacclimation (see eq. 6 in Ferguson et al. 2011) by adding an exponent theta (θ) to the logistic component (clog,d) in Equation 1:
Eq. 1: c_log,d = [(Hc,min − Hc,i-1)/(Hc,min − Hc,max)]^θ (Hc,i-1, Hc for day i-1; Hc,max, maximum Hc; Hc,min, minimum Hc). The theta-logistic equation is frequently used in ecology and was originally introduced for models of population growth in systems with finite resources (Richards 1959, Nelder 1961, Gilpin and Ayala 1973). Our previous model had defaulted to θ = 1 (Ferguson et al. 2011). Our rationale for this change was that allowing θ to vary by genotype would permit the model to better capture the accelerated deacclimation observed just before budbreak. Supplemental Figure 1 demonstrates the effect of different values for θ on simulated Hc as budbreak is approached.
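A minimal sketch of the logistic component with the added exponent, assuming the bounded form implied by the text (the component equals 1 when buds are fully hardy at Hc,max and falls to 0 once Hc reaches Hc,min); all numeric values below are illustrative, not fitted parameters:

```python
def c_log_d(hc_prev, hc_min, hc_max, theta):
    """Theta-logistic bound on daily deacclimation: 1 at full hardiness
    (hc_prev == hc_max), 0 once hc_prev reaches hc_min."""
    return ((hc_min - hc_prev) / (hc_min - hc_max)) ** theta

# With hc_min = -1.2, hc_max = -25.0 and a bud at -5.0 (close to budbreak),
# an exponent below 1 keeps the component, and hence deacclimation, fast:
#   theta = 1.0 -> ~0.16;  theta = 0.3 -> ~0.58
```

This shows how a genotype-specific θ reshapes the late-season deacclimation curve without changing the bounds themselves.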
Fourth, the chilling degree days (DDc) required for dormancy release, captured in the ecodormancy boundary (EDB) that defines the transition of buds from endo- to ecodormancy, was calculated using a threshold temperature common to all genotypes (Tth,c = 10°C). This change from the genotype-specific Tth estimated in Ferguson et al. (2011) makes the fixed Tth,c used for chilling requirements independent of the estimated Tth used for acclimation and deacclimation (Arora et al. 2003). This approach enables our estimated chilling requirements to be compared directly with published reports that commonly use Tth,c = 10°C across grape genotypes (Pouget 1963, Dokoozlian 1999, García de Cortázar-Atauri et al. 2009).
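Chilling accumulation against the common 10°C threshold can be sketched as below; the sign convention (summing degrees below the threshold as positive chilling) and the EDB value used in the test are our assumptions for illustration:

```python
def chilling_degree_days(tmean, t_th_c=10.0):
    """Accumulate chilling degree days (DDc): each day contributes the
    number of degrees by which Tmean falls below Tth,c = 10°C."""
    return sum(t_th_c - t for t in tmean if t < t_th_c)

def is_ecodormant(tmean, edb):
    """Endo- to ecodormancy transition: accumulated chilling has reached
    the ecodormancy boundary (EDB)."""
    return chilling_degree_days(tmean) >= edb
```

Fixing Tth,c at 10°C for all genotypes keeps the chilling bookkeeping independent of the genotype-specific Tth used for acclimation and deacclimation.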
Model optimization and evaluation.
Whenever possible, the complete data set was separated into two categories: one set for model development and parameter optimization and another independent set for model evaluation (Table 1). For model development, we only parameterized genotypes for which ≥3 years of measured Hc data were available. Because the model will be used to predict future Hc, the last three available years of data for each genotype were included in the evaluation set (usually 2009–2010, 2010–2011, and 2011–2012). A damaging freeze occurred during the 2010–2011 dormant season, but since the model is to be used to predict such events, two additional seasons with freeze events (2002–2003 and 2003–2004) were also included in the evaluation set. For genotypes with a limited number of years of available data, these rules were modified to include a subset of the above (Table 1). Both DTA and phenology observations were included, when feasible, in both the optimization and the evaluation data sets. Some genotypes were parameterized and the model fit evaluated internally (i.e., using the optimization data set), but due to limited data availability could not be given the more rigorous external (i.e., using the evaluation data set) evaluation.
The model was optimized and evaluated in SAS. We used stepwise iterative methods as described previously (Ferguson et al. 2011) to select the combination of model parameters that gave the best fit to the measured values in the optimization data set. A total of 1,653,750 parameter combinations, taken from an eight-dimensional hyperspace, were tested by stepwise selection for each genotype (Supplemental Table 2). This method identified the set of parameters that minimized the root mean square error (RMSE) between predicted and observed Hc as suggested by Willmott (1982). The internal validity (Caffarra and Eccel 2010) was tested by Pearson correlation analysis (SAS, proc. Reg) of predicted versus observed Hc and by calculation of the RMSE, using the optimization data set (see Table 1). The optimized parameters were then used to evaluate the performance of the individual genotype model variants externally, using the evaluation data set. Tests for external model evaluation included correlation analysis as well as calculation of the RMSE. Model accuracy was tested by calculating the mean error, or bias (B), of predicted versus observed Hc. Associations between pairs of optimized model parameters were explored using correlation analysis. In addition, effects of genotype origin on model parameters were tested using one-way ANOVA and correlation analysis.
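The two evaluation statistics can be written out explicitly; the sign convention for bias (predicted minus observed, so a positive B indicates predictions that are warmer, i.e., less hardy, than observations) is our assumption:

```python
import math

def rmse(predicted, observed):
    """Root mean square error between predicted and observed Hc (°C)."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

def bias(predicted, observed):
    """Mean error B = mean(predicted - observed)."""
    return sum(p - o for p, o in zip(predicted, observed)) / len(observed)
```

Minimizing RMSE over the whole optimization data set, rather than at any single date, is what selects the winning parameter combination for each genotype.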
Model use to predict budbreak.
As described above, Hc was inferred from observed spring phenology to extend the temporal scope of our data set beyond budbreak. This novel approach allowed us to also explore the Hc model for its potential use as a budbreak model. For this purpose budbreak was defined as the stage at which 50% of the bud population in the observation vineyards showed green tips. When Hc approaches temperatures only slightly below 0°C, deacclimation becomes irreversible and budbreak is imminent (Fennell 2004, Kalberer et al. 2006). To test the present model’s ability to predict budbreak, we assumed that the predicted budbreak date corresponded to the DOY for which the model first predicted Hc ≥ −2.2°C for V. vinifera cultivars and Hc ≥ −6.4°C for V. labruscana cultivars (Table 2). The RMSE and B were calculated, and correlation analysis (SAS, proc. Reg) was conducted to compare predicted versus observed DOYs of budbreak, using the phenology data in Table 1.
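The budbreak rule can be sketched as follows; the doy_start default and the year-wrap handling are our assumptions, while the thresholds (−2.2°C for V. vinifera, −6.4°C for V. labruscana) come from Table 2:

```python
def predict_budbreak_doy(hc_series, threshold, doy_start=250):
    """Return the first DOY on which predicted Hc reaches the species
    budbreak threshold, wrapping past 31 December into the next year;
    None if the threshold is never reached. hc_series[0] corresponds
    to doy_start."""
    for i, hc in enumerate(hc_series):
        if hc >= threshold:
            return (doy_start + i - 1) % 365 + 1
    return None
```

Because the simulated Hc rises monotonically through the final deacclimation phase, the first threshold crossing is a natural proxy for the 50% green-tip date.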
Results
The optimized model parameters revealed considerable phenotypic variation among the 23 Vitis spp. genotypes evaluated in this study. The genotypes differed in their Hc,initial in early fall and Hc,max in midwinter, Tth during eco- and endodormancy, acclimation and deacclimation rates (ka and kd), and EDB (Table 4), as well as in spring phenology (Table 5). For example, while their Hc,initial, ka, and kd varied, the two V. labruscana cultivars had significantly lower θ and better Hc,max (p < 0.001) than the V. vinifera cultivars (Table 4). However, the greater midwinter hardiness of V. labruscana was coupled with an earlier budbreak than in most V. vinifera cultivars (Table 5). These data also confirmed the reputation of Riesling as one of the hardiest V. vinifera cultivars, whereas Mourvèdre was the least hardy genotype in our study. Across all genotypes, the estimated Hc,initial varied from −9.5°C in Mourvèdre to −13.0°C in Lemberger, whereas Hc,max varied from −21.9°C in Sangiovese to −29.5°C in Concord (Table 4). One-way ANOVA showed a significant effect of geographic origin on Hc,initial (p < 0.001), Hc,max (p < 0.001), θ (p = 0.038), and budbreak DOY (p = 0.005). Correlation analysis confirmed these results; origin group number was negatively correlated with Hc,initial, Hc,max, and budbreak DOY (Figure 1), indicating a north/inland-south/coastal gradient for genotype origin of decreasing hardiness and later budbreak.
Across the 23 genotypes Hc,initial was positively correlated with Hc,max (Figure 2A). The Tth,endo correlated positively with kd,endo (r = 0.56, p = 0.006) and negatively with ka,endo (r = −0.90, p < 0.001), while Tth,eco correlated positively with kd,eco (r = 0.81, p < 0.001). Budbreak DOY was positively correlated with Hc,initial (Figure 2B) and Hc,max (Figure 2C), which indicates that hardier genotypes tend to begin spring growth earlier than less hardy genotypes when grown in the same environment. Omitting the two V. labruscana cultivars from the correlation analysis did not change the nature or significance of these associations. Multiple regression analysis showed that the genotype-specific Hc model parameters (Table 4) accounted for 87% (p = 0.002) of the variation in budbreak DOY (Table 5) among genotypes; Hc,initial, θ, EDB, Tth,eco, and kd,eco together accounted for 81% (p < 0.001) of this variation.
The model reliably predicted the typical course of cold acclimation of primary buds of different grapevine cultivars in fall, their midwinter hardiness, and the overall deacclimation pattern in spring (Figure 3). Integrating spring phenology to extend the Hc data set markedly improved the model fit during the irreversible deacclimation phase leading up to budbreak (Figure 4) compared with the earlier model (Ferguson et al. 2011). Across the 23 genotypes, the optimized model parameters predicted the measured LT50 values within the optimization data set with an overall r2 = 0.97 and RMSE = 1.5°C. This internal validity test showed that r2 ≥ 0.91 for all genotypes, while RMSE varied from 0.8°C for Dolcetto to 1.9°C for Malbec (Table 4). The external model evaluation, using the evaluation data set, showed that the error was somewhat higher than that found for the optimization data set, both for the overall RMSE (2.0°C; Supplemental Table 3) and for the individual genotypes (Figure 5). In this analysis, the lowest RMSE was found for Mourvèdre (1.2°C) and the highest for Concord (2.6°C). Correlation analysis demonstrated that the variation in predicted Hc accounted for 89% (Syrah) to 99% (Cabernet franc) of the variation in observed Hc in the independent evaluation data set (Figure 5). Overall model accuracy was high (B = 0.2°C); among the 16 genotypes for which at least one year of data were available to conduct an external evaluation, B varied from −0.5°C (Syrah and Sunbelt) to 1.0°C (Cabernet Sauvignon) (Supplemental Table 3). Despite this bias, the model accurately predicted Cabernet Sauvignon buds to be sufficiently acclimated to withstand the unseasonable freeze event (−17.3°C) that occurred at this location in late November 2010 (Figure 6). 
Although sampling of Cabernet Sauvignon buds for DTA ceased in early February 2011 because these vines were pruned, Figure 6 also illustrates the model’s ability to simulate the differences in spring deacclimation between a genotype with early budbreak (Chardonnay) and one with late budbreak (Cabernet Sauvignon; Table 5).
The November 2010 cold event resulted in varying degrees of bud injury in vineyards across Washington’s south-central region. Variation in Hc among genotypes (Table 4) was clearly an influencing factor, as some V. vinifera cultivars sustained severe damage, while others escaped with minimal or no damage in the same vineyard location. The model effectively captured these differences, predicting damage for some but not other cultivars at many locations. Site location also had an influence on the extent of damage within a cultivar; differences in bud injury due to this cold event occurred between vineyards planted to the same cultivar. This disparity could have been caused either by variation in Hc or by variation in Tmin between these locations. Thus, we compared two contrasting V. vinifera cv. Merlot vineyards, one near Alderdale, WA, with almost 100% of the primary buds killed, and another near Paterson, WA, with no reported bud damage. Running the model using temperature data from AgWeatherNet stations (Alderdale: lat. 45.9°N; long. 119.9°W; 187 m asl; Paterson: lat. 45.9°N; long. 119.5°W; 129 m asl) located near each vineyard demonstrated that cold acclimation of Merlot buds was similar at the two locations, leading to a predicted Hc of −19.9°C in Alderdale and −19.8°C in Paterson (Figure 7). The main difference, however, was related to the Tmin experienced during the freeze event: Tmin = −21.1°C in Alderdale, whereas Tmin = −14.0°C in Paterson, where free air drainage prevented cold air pooling. Absolute damage levels were not quantified in this study, but were estimated from grower reports of the percentage budbreak in spring and of the extent of retraining and replanting in the subsequent 2011 growing season.
The model not only identified a loss in Hc due to unseasonably warm temperatures, but also was sensitive enough to simulate slightly different Hc values for relatively minor differences in average temperatures over longer periods (Figure 7). For instance, the sudden increase of Tmean up to 11.9°C in Alderdale and 10.6°C in Paterson during four days in mid-January was associated with an almost immediate deacclimation response in both locations. Moreover, the average Tmean was 0.3°C higher in Alderdale than in Paterson between mid-January and early March, whereas Tmean was 0.3°C lower in Alderdale than in Paterson from early March through April. These small differences were sufficient to reverse the relative level of predicted Hc between the two locations over these two periods.
Because the model was designed to predict Hc at budbreak, we tested its ability to be used as a budbreak model. There was significant variation among genotypes in the time of observed and predicted budbreak (Table 5). The average observed difference between the earliest genotypes (Concord and Sunbelt) and the latest genotype (Mourvèdre) was 16 days, and the model predicted this difference at 19 days. Phenology was not assessed for all genotypes in all years; consequently the absolute ranking shown in Table 5 is only approximate. Within the optimization data set, the overall RMSE for the comparison between the observed budbreak DOYs and those predicted by the Hc model was 6.7 days. The RMSE for individual genotypes ranged from 2.0 days (Cabernet franc) to 12.0 days (Sangiovese). Using the values from the evaluation data set, the overall RMSE was 7.3 days, and the RMSE for different genotypes varied from 0 days for Syrah to 13.0 days for Sangiovese. The external test for model accuracy found B = −1.0 day across all genotypes, but B varied from −8.3 days (Concord) to 13.0 days (Sangiovese). Across all genotypes, the model explained 45% of the variation in budbreak DOY, and significant correlations between observed and predicted budbreak DOY were found for about half of all genotypes tested (Table 5).
Discussion
Our long-term (≤24 years) data sets of both seasonal Hc measurements and spring phenology observations on field-grown grapevines enabled us to develop and evaluate discrete-dynamic model variants for primary bud hardiness of 23 genotypes, derived from two Vitis species, for the entire dormant season and extending through to budbreak. The integration of spring phenology and addition of the theta-logistic markedly improved model performance during the increasingly irreversible deacclimation phase leading up to budbreak compared with the published Hc model (Ferguson et al. 2011). Testing the performance of this comprehensive model under the actual situation of an unseasonable freeze event that occurred in November 2010 clearly demonstrated its robustness and sensitivity. Not only was the model able to differentiate genotypes that sustained bud injury from genotypes that were less susceptible, but it also predicted differences in damage levels between vineyard sites that were confirmed by observed differences in subsequent budbreak.
The stepwise optimization procedure for the model parameters, using more than 1.6 million parameter combinations, uncovered considerable differences among genotypes and thus provided unique insights into their phenotypic behavior with respect to Hc during fall acclimation and spring deacclimation, as well as in midwinter. In the context of global climate change, the variation in ka and kd among genotypes is of particular interest. Variation in the propensity to deacclimate under unseasonably warm temperatures and in the ability to reacclimate when temperatures decline again determines bud survival for a given genotype as much as does its Hc,max in midwinter. Differences in Hc dynamics among plant genotypes (both among species and among ecotypes or cultivars) may be explained by the evolution of acclimation and deacclimation responses that ensure survival of plants adapted to a particular geographic region (and hence climate) without limiting their competitiveness (Browse and Xin 2001). Thus, the better Hc,max of the two V. labruscana cultivars compared with V. vinifera could be expected given the northeastern United States origin of the V. labrusca L. ancestors of the former (Keller 2010). Indeed, our results demonstrated that these genotypes fell in line with but extended the Hc gradient found for V. vinifera cultivars, which correlated with their presumed geographic origin.
The optimized model parameters showed that the cultivars originating from north-central Europe tended to start with a lower (more negative) Hc,initial and to achieve better Hc,max, but they also had a tendency toward earlier budbreak than their coastal or southern European counterparts. Riesling, of continental German provenance, was the hardiest V. vinifera cultivar in our study, whereas Mourvèdre (syn. Monastrell), originating from Mediterranean Spain, was the least hardy genotype. This finding agrees well with the idea that many V. vinifera cultivars may have been selected in their local environment before they were vegetatively propagated and that many of these local cultivars are genetically related to one another (Levadoux 1956, Myles et al. 2011).
Although bud temperature is the main determinant of the time of budbreak for a particular genotype (Keller and Tarara 2010), differences in budbreak timing among Vitis genotypes are well known (Kovács et al. 2003, García de Cortázar-Atauri et al. 2009, Nendel 2010). However, it is not immediately intuitive that winter-hardy genotypes should begin to grow before the less hardy genotypes when grown in the same location. One possible explanation is that there may have been little selection pressure for plants adapted to cold winter temperatures to maintain hardiness once spring approaches (Pagter and Arora 2013). Because environments with cold winters also typically have short growing seasons, the ability to begin spring growth rapidly under favorable and low-risk conditions presumably enables such genotypes to maximize seasonal resource acquisition and hence seed maturation. Given that the chilling requirement is often measured as the period of low temperature that is necessary to permit 100% of the buds to break (Cooke et al. 2012), genotypes that require less chilling to release dormancy may deacclimate earlier than, and therefore break bud before, genotypes that require more chilling. This idea is supported by the results from multiple regression analysis in our study, which showed that Hc,initial, EDB, θ, Tth,eco, and kd,eco together explained >80% of the variation in date of budbreak among genotypes. Thus, genotypes originating from regions with cold winters will tend to begin spring growth earlier than genotypes from regions with mild winters when these genotypes are grown in the same environment, paradoxically making the former more vulnerable to spring frost in warmer environments (Kovács et al. 2003). These results suggest that the traits that determine Hc, dormancy, and budbreak timing may be at least partly linked, which has implications for breeding and for crop performance in a changing climate.
The ability of our model to predict budbreak might be expected to be relatively poor, because the model was optimized primarily to simulate Hc: optimization aimed to minimize the error over the entire dormant season, not specifically the error at budbreak, as would be typical for budbreak models. Nevertheless, the overall RMSE (7 days) for observed versus predicted budbreak DOY compares favorably with the range of RMSEs (8 to 21 days) found in a recent comparison of existing budbreak models for 10 V. vinifera cultivars (García de Cortázar-Atauri et al. 2009). Similarly, a budbreak model for two V. vinifera cultivars that used multiyear data from 13 sites across northern Europe found a standard error of 4.5 days (Nendel 2010). The time of budbreak can also be influenced by viticultural practices and site factors (Williams et al. 1985, Friend and Trought 2007), as well as by soil water content (M. Keller, unpublished data, 2013). Cultural practices were consistent across genotypes and years in the present study, but soil water content may have varied owing to low winter precipitation in this region (Davenport et al. 2008).
Our previous model (Ferguson et al. 2011) used a common, genotype-specific Tth for calculating both changes in Hc and chilling requirements to release dormancy. Although this approach may seem reasonable from a biological perspective, it made comparisons with other published chilling requirements problematic, since no other studies have used genotype-specific Tth for chilling calculations. Research on chilling requirements for grapevines has focused on summation at set temperatures common to all genotypes investigated (García de Cortázar-Atauri et al. 2009, Nendel 2010) and has not been complex enough to infer genotype-specific chilling temperature thresholds. Therefore, although in reality Tth,c may be unique for each genotype, the current model uses a fixed Tth,c = 10°C to calculate chilling for all genotypes while retaining the genotype-specific Tth,endo and Tth,eco to calculate changes in Hc. In other words, the model forces all genotypes to accumulate chilling degree days below 10°C while permitting each genotype to have its own temperature threshold that divides acclimation from deacclimation temperatures. The latter, moreover, is different before (Tth,endo) and after (Tth,eco) the EDB, that is, before and after dormancy has been released.
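The dual-threshold logic described above can be sketched as a short daily update. This is a minimal illustration under stated assumptions, not the published model: apart from the fixed 10°C chilling threshold, all rate constants, temperature thresholds, hardiness bounds, and the chilling requirement below are invented placeholder values, and the bounded acclimation/deacclimation terms are a simplified stand-in for the optimized genotype-specific formulation.

```python
# Sketch of a daily cold-hardiness (Hc) update with a fixed chilling
# threshold (10 degC for all genotypes) and genotype-specific thresholds
# that switch at dormancy release (endo- vs ecodormancy).
# All parameter defaults are illustrative placeholders, NOT values from the study.

TTH_C = 10.0  # fixed chilling threshold (degC), common to all genotypes


def simulate_hc(daily_temps, *, hc_initial=-10.0, hc_min=-1.2, hc_max=-25.0,
                tth_endo=13.0, tth_eco=5.0, ka=0.10, kd_endo=0.02, kd_eco=0.10,
                chilling_requirement=150.0):
    """Return a list of daily Hc values (degC; more negative = hardier).

    Endodormancy ends once chilling degree days accumulated below TTH_C
    reach `chilling_requirement`; thereafter the ecodormant threshold and
    the (faster) ecodormant deacclimation rate apply.
    """
    hc = hc_initial
    chill = 0.0
    trace = []
    for t in daily_temps:
        # Chilling accumulates below the fixed 10 degC threshold.
        if t < TTH_C:
            chill += TTH_C - t
        endodormant = chill < chilling_requirement
        tth = tth_endo if endodormant else tth_eco
        kd = kd_endo if endodormant else kd_eco
        if t <= tth:
            # Acclimation: hardiness gain scales with degrees below the
            # threshold and slows as Hc approaches the genotype's maximum.
            hc += ka * (t - tth) * ((hc - hc_max) / (hc_min - hc_max))
        else:
            # Deacclimation: hardiness loss scales with degrees above the
            # threshold and slows as Hc approaches the minimum (budbreak).
            hc += kd * (t - tth) * ((hc_min - hc) / (hc_min - hc_max))
        # Keep Hc between the genotype's bounds (hc_max is the more
        # negative, i.e., hardier, bound).
        hc = min(max(hc, hc_max), hc_min)
        trace.append(hc)
    return trace
```

Driving the function with a cold spell followed by warm days (e.g., `simulate_hc([2.0] * 60 + [20.0] * 40)`) produces the expected trajectory: Hc deepens toward hc_max while temperatures stay below the active threshold, then rises toward hc_min once chilling is satisfied and warm weather arrives, at which point budbreak would be predicted.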
The present model was developed with temperature and LT50 data from only two locations. One might argue that the long-term nature of the optimization data set for some genotypes (up to 19 years) and the inherent variation in temperature patterns among these years should result in optimized parameters that permit application of the Hc model, and of its implications for climate variability, to other, disparate regions. However, because this model was developed for a climate in which chilling requirements for grapevines are generally met, the purely mathematical parameterization may inadvertently overestimate actual chilling requirements. Running the model in a region with considerably warmer winters might lead to the prediction of very slow deacclimation in spring, because the model would incorrectly estimate that the chilling requirement is not met and hence that buds remain endodormant. Such a scenario would overestimate Hc during the period leading up to budbreak in a warm climate or during an unusually warm winter. While this issue could be solved by reparameterizing the model using data from climates with mild winters, Hc data from such regions are not currently available because research in those regions has traditionally focused on chilling requirements (e.g., Dokoozlian 1999) rather than Hc. Nonetheless, our study demonstrates that the introduction of spring phenology data to infer Hc greatly alleviates this shortcoming.
Conclusion
A robust, quantitative model that simulates daily changes in primary bud Hc during endo- and ecodormancy and during budbreak for 23 diverse Vitis genotypes was developed. The model also predicts time of budbreak for these genotypes. A Microsoft Excel version of this model can be accessed through http://wine.wsu.edu/research-extension/weather/cold-hardiness. The only input required to run the model is mean daily temperature, which is easily recorded by affordable weather stations and should make this model easy to use and widely accessible. The model should be useful in climate change modeling to predict cold acclimation and deacclimation responses of different genotypes under variable climate change scenarios. It may also be used as a risk-management tool for site selection in regions with unknown grapegrowing potential and for vineyard management in regions where cold damage is common. We are currently implementing the Hc model on AgWeatherNet (weather.wsu.edu). Using temperature data from each of the network's over 140 automated weather stations, the model automatically provides local, daily simulated Hc values for the grape cultivars reported in this study. Coupled with a weather forecasting service, the model may be used as an early warning system for impending and potentially damaging cold-temperature events. Supplemented with additional, static information on how to respond to cold damage, the model forms the basis of a decision support system for risk assessment and damage mitigation in grapes.
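The early-warning use described above amounts to comparing a forecast minimum temperature against the simulated Hc for a cultivar at a station. The sketch below illustrates that comparison only; the function name and the safety margin are hypothetical, not part of the AgWeatherNet implementation.

```python
def frost_risk_alert(simulated_hc, forecast_tmin, margin=2.0):
    """Flag a potentially damaging cold event.

    simulated_hc:  modeled lethal temperature for the bud (degC, negative)
    forecast_tmin: forecast minimum air temperature (degC)
    margin:        safety buffer (degC) for model and forecast error
                   (the 2.0 default is an arbitrary illustration)
    """
    # Alert when the forecast low approaches or undercuts the lethal temperature.
    return forecast_tmin <= simulated_hc + margin
```

For example, with a simulated Hc of −12°C, a forecast low of −11°C triggers an alert (within the 2°C buffer), whereas a forecast low of −5°C against an Hc of −20°C does not.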
Acknowledgments
This work was supported by the Washington State University Agricultural Research Center, the Washington Wine Industry Foundation, the Chateau Ste. Michelle Distinguished Professorship in Viticulture, the AgWeatherNet Program, and the Washington State University Viticulture and Enology Program. The authors thank Celia Longoria and Alan Kawakami for help with cold hardiness and phenology data collection and Ste. Michelle Wine Estates for providing samples for cold hardiness assessment.
Footnotes
Supplemental data is freely available with the online version of this article.
- Received September 2013.
- Revision received November 2013.
- Accepted November 2013.
- Published online February 2014.
This is an open access article distributed under the CC BY license https://creativecommons.org/licenses/by/4.0/.