The Midwest dairy farmer’s lament was to the point: “If I didn’t have to worry about winterkill, I’d definitely grow more alfalfa.”

This farmer, like many others sitting at the meeting table who nodded in agreement, was explaining why his alfalfa acres were on the decline in recent years. “I like to grow and feed alfalfa, I just can’t depend on it,” he added.

Challenge defined.

So, who’s to blame that alfalfa isn’t “dependable”? In some respects, it’s nobody’s fault; in others, it’s everybody’s fault.

Let’s begin with nobody’s fault.

Alfalfa, like most plants, is subject to the whims of Mother Nature. We can’t control long-term ice formation in winter and early spring other than to ensure adequate field drainage. When ice is prevalent, there’s a good chance that alfalfa will suffer in the same way corn suffers from a long-term drought, especially during the pollination period. It’s nobody’s fault, and there’s little more to discuss along these lines other than that stress tolerance will need to remain a mainstay of alfalfa variety development.

Although environmental conditions can sometimes challenge the survivability of alfalfa, our industry does not always put alfalfa in a position to weather those storms and be dependable. All of us are to blame: farmers, agronomists, nutritionists, marketers, and plant breeders. Nobody, myself included, gets a pass on this one.

Striving for high quality

We have pushed hard for high-quality forage over the years. The bar is continually raised. I remember the days when it was a struggle to get farmers to target a 150 relative feed value (RFV). As the years passed and relative forage quality (RFQ) entered the picture, we pushed toward 180 RFQ and even higher.

The need for higher-quality forage was understandable. Overall, more alfalfa was fed then than is fed now. Fiber digestibility was strongly correlated with milk production, as were a nutritionist’s perceived worth and a farmer’s milk check.

To achieve the desired level of forage quality, harvest schedules were often shortened to 25- to 28-day cutting intervals, or even less. Sometimes, an extra cutting was realized. Short-term forage yields were not always impacted dramatically, but something else was . . . the ability of the plant to persist.

We have long known the relationship between cutting intensity and stand longevity. It exists even in the best of weather conditions, but it can be more pronounced when Mother Nature gets a little cranky.

It’s at this point I’m reminded of my visit to the high-elevation, semi-arid region of southern Idaho. Here, nonirrigated alfalfa fields persist for 10 to 18 years. They also are cut only once per year (for high quality) because environmental conditions don’t allow for anything more.

Our current, intensive cutting schedules that result in high-quality forage come with an agronomic cost. Plants are somewhat more prone to winter injury and/or a shortened stand life. Enter more corn silage.

There is some irony to all of this.

As dairy farmers feed more corn silage and less alfalfa, the importance of super high-quality alfalfa declines. The difference between 150 RFQ and 190 RFQ becomes less apparent in the bulk tank, if it’s apparent at all. In fact, the 150 RFQ alfalfa may be the better play both nutritionally and agronomically. So, the reason why many farmers are feeding less alfalfa — a lack of dependability — may actually allow for cutting schedules that should make it more dependable.

We must continue to invest research dollars into alfalfa varieties that truly have improved fiber digestibility. The value may lie not in harvesting even higher quality forage, but in extending plant regrowth periods while still harvesting forage quality acceptable for high milk production. At the same time, alfalfa becomes less vulnerable to adverse environmental conditions.

Many other factors

Let’s not blame alfalfa’s undependability all on the nutritionists and their zeal for rocket-fuel alfalfa. A host of other factors weaken alfalfa’s survivability immune system. These are the sins of omission and include the following:

• Omit the needed amount of lime and fertilizer to optimize alfalfa plant health, productivity, and persistence. Potassium (K2O) is especially important for winter survival. Many farmers have admitted to me, without prompting, that they don’t put enough K2O on their fields.

• Omit scouting fields for insect pests, especially potato leafhoppers. Left untreated or treated too late, potato leafhoppers will significantly weaken alfalfa plants. Research has taught us that this damage extends beyond the year of infestation.

• Omit seeding improved alfalfa varieties that are selected and tested under adverse cutting schedules and environments. Further, there are uniform tests and a rating system for variety winterhardiness and dormancy. Don’t omit glancing at these before ordering your seed.

Finally, and let’s be honest, we batter our alfalfa fields with heavy equipment. I’m not sure anything can be done to eliminate this sorry fact, but there are ways to minimize it through traffic patterns, tire selection, and timing. The crown damage, and sometimes compaction, caused by today’s equipment is visible in the short term and damaging in the long term. It simply can’t be beneficial for plant health and persistence.

And, now, it’s September

As the calendar flips to September today, we enter a period that historically has been considered treacherous for cutting alfalfa.

All alfalfa growers know this story: As temperatures begin to cool and the days shorten, dormant alfalfa varieties shift to a new physiological game plan, partitioning nutrient resources toward the functions that will enhance winter survival.

Rapid and abundant top growth becomes secondary, though it remains important for maintaining a photosynthetic factory. Moving to the top of the priority list is root carbohydrate (starch) and protein storage.

Through the years, many alfalfa growers have largely disregarded what has long been known as the “critical fall period,” a time to stay out of alfalfa fields and allow plants to prepare for winter survival. Perhaps with better plant genetics and longer growing seasons, this “stay out” period is not as important as it once was. Nonetheless, plant physiology still prevails, especially if fields are already on the brink of the survival curve because of the factors previously discussed.

Maybe alfalfa’s “dependability” problem can’t be fixed. But maybe it can. We have had precious little research using modern alfalfa varieties to assess the importance of all the factors discussed here. Hopefully, that will change.

What we do know is that alfalfa offers a lot of benefits in a ruminant’s diet, many of which are not accounted for in traditional ration-building software programs. Alfalfa also has an equal number of agronomic benefits that continuous corn can’t come close to matching.

Alfalfa has been prominent on dairy farms and in commercial hay fields for well over 100 years. However, it has never been cut as intensively or abused as dramatically as it has in recent years. We can make alfalfa more dependable; it’s just going to take a unified effort on the part of all parties involved. At least some of what is happening isn’t “winterkill” but rather “peoplekill.”