With Labor Day weekend nearly upon us, it’s a sure sign that alfalfa’s physiological game plan is about to change in the central United States and points north.

As fall temperatures take hold, dormant alfalfa varieties begin to partition nutrient resources toward functions that enhance winter survival. Rapid and abundant top growth becomes secondary, though still important to maintain a photosynthetic factory. Moving to the top of the priority list is root carbohydrate (starch) and protein storage.

Forage researchers and extension workers have addressed the issue of giving plants adequate time to prepare for winter by recommending an approximately six-week, no-cut period in the fall. This is sometimes referred to as the “critical fall period,” analogous to a no-fly zone for mowers and swathers.

For a good chunk of the dormant alfalfa-growing region, the recommended no-cut period begins in early to mid-September, depending on latitude (and perhaps elevation). It lasts until the point that alfalfa essentially quits growing because of cool temperatures, usually mid- to late October.

It was easy

Thirty years ago, convincing a producer not to cut during the critical fall period was an easy task. Some years, even if they didn’t cut, alfalfa was a no-show the next spring. If they did roll the dice and cut, it was almost a sure bet that stands would take a significant hit.

Over the years, the fall-cutting waters have become murkier. Although cutting during the critical fall period remains a risk, it’s less so than it used to be, and there are probably a couple of reasons for this.

Variety improvements

There was a time when the fall dormancy rating of an alfalfa variety was highly correlated with its winterhardiness. Less dormant varieties often performed at a higher yield level because their regrowth was both earlier and faster, and they grew later into the fall. But it was often these same varieties that, if cut at the wrong time in the fall, suffered devastating winterkill.

Pre-2000 alfalfa growers often had to choose between higher yield potential and a lower risk for winterkill.

In the years since, that situation has changed. The link between fall dormancy and winter survival has been broken. In other words, there are varieties with a fall dormancy rating of 4 or 5 that are just as winterhardy as those with a fall dormancy rating of 2.

Most plant breeders now make their dormant variety selections after purposely cutting possible selections during the traditional critical fall period. This helps to ensure plant survivability under a wide range of fall-cutting regimes. Winter survival ratings are reported in the National Alfalfa & Forage Alliance’s Alfalfa Variety Ratings booklet.

There are two important considerations when the variety is brought into the fall-cutting equation. First, you have to make sure that you are using an improved variety with a high level of winter survival. Not all varieties fall into this category.

The other consideration is simply that Mother Nature can still humble genetics when weather conditions are extreme. Your risk level is much better with an improved variety, but it’s not zero.

Climate change

Call it what you will, but there is a preponderance of evidence to support the notion that growing seasons are, on average, getting longer and average killing frost dates are being pushed back.

These weather-related shifts don’t change the underlying principles of root nutrient status going into winter, but they do effectively push back the critical fall-cutting period. This allows alfalfa plants to prepare for winter even with a later harvest date.

In reality, the critical fall harvest period isn’t about calendar dates; it’s about getting enough growing degree units following the last cutting.
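The growing-degree-unit logic above can be sketched in a few lines of code. This is a minimal illustration, not a decision tool: the 41°F base temperature and the roughly 500-GDU target for rebuilding root reserves after a last cutting are common rules of thumb from extension literature, used here as assumptions.

```python
# Sketch of growing degree unit (GDU) accumulation after a fall cutting.
# Assumptions (hypothetical rules of thumb, not from this article):
#   - base temperature of 41°F for alfalfa growth
#   - ~500 GDU needed after the last cut to rebuild root reserves

BASE_TEMP_F = 41.0       # assumed base temperature for alfalfa GDU
RECOVERY_GDU = 500.0     # assumed GDU target to replenish root reserves


def daily_gdu(t_max_f, t_min_f, base=BASE_TEMP_F):
    """GDU for one day: mean temperature above the base, floored at zero."""
    return max(0.0, (t_max_f + t_min_f) / 2.0 - base)


def gdu_after_cut(daily_temps):
    """Sum GDU over (t_max, t_min) pairs from last cut to killing frost."""
    return sum(daily_gdu(hi, lo) for hi, lo in daily_temps)


# Example: 30 mild fall days at 65/45°F accumulate (55 - 41) * 30 = 420 GDU,
# which falls short of the assumed 500-GDU recovery target.
temps = [(65.0, 45.0)] * 30
total = gdu_after_cut(temps)
print(total, total >= RECOVERY_GDU)  # → 420.0 False
```

The point of the sketch is that the same calendar cutting date can land on either side of the threshold depending on how warm the fall runs, which is why the decision hinges on heat accumulation rather than the date itself.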

What does this mean?

On average, cutting on September 15 in 2018, for example, will allow for more growing degree units before a killing frost than cutting on the same date 30 years ago. Even so, the reality that any year can offer the possibility of an early killing freeze must always be factored into the fall-cutting decision.

Other considerations

Also to be considered in making a fall-cut determination, as has always been the case, are the intensity of the previous cutting schedule, soil fertility status, previous pest stressors, and your need for additional forage.

Stands that have been subjected to intensive cutting regimes, that lack soil fertility, or that have been stressed by diseases and/or insects will be less likely to overcome the rigors of a fall cut. Older stands also are more vulnerable to winter injury from fall cutting than younger stands.

Finally, if the forage isn’t needed, why take the risk?