With alfalfa, we want it all

By Mike Rankin, Managing Editor

It’s true. When it comes to alfalfa, the holy grail is to achieve high yields, high quality, and a long stand life. The latter is often called persistence.

We’ve talked about these production components and the trade-offs that exist between them for more years than I care to remember, and the conversation predates my earthly existence, even though Medicare eligibility is right around the corner.

The components of the three-legged stool we tout as yield, quality, and persistence are well defined. We know what we have to do to attain any one of the three, but fully maximizing one will come at the expense of at least one of the other two.

I’m reminded of a story I did a couple of years ago about two Idaho alfalfa producers who lacked both seasonal rainfall and growing season length. Their dryland alfalfa got cut once per year and their stands remained productive for well over 10 years (one had just passed its 18th birthday).

Persistence? Yes.

Quality? Usually.

Yield? No chance.

It was sometime in the 1980s that testing forages really came into its own with the introduction of near-infrared spectroscopy (NIR). Although the results were crude by today’s standards, they were compelling enough to convince dairy farmers and their nutritionists that alfalfa needed to be cut earlier and more frequently than had previously been the case if milk production was to be maximized.

More intensive cutting schedules led to higher forage quality, but they also led to shortened stand life and significant winter injury. Disease resistance was good, but not enough.

The 1990s was a decade of discovering that we needed better alfalfa varieties if the crop was going to hold its stature on dairy farms, which were now demanding high quality with just reasonable yields and persistence. Some damage had already been done, as this decade was the beginning of a huge shift of Midwest and Northeast alfalfa acres to corn silage.

What happened next?

By the mid-1990s, alfalfa breeders put a new emphasis on winter survival. If the crop couldn’t withstand a Northern winter, its fate would be short-lived regardless of what other attributes the crop might have. Over time, improved winterhardiness became a reality with many new varieties . . . but there was also a problem.

If yield was to be improved, then breeding efforts would have to break the link between fall dormancy and winter survival. In other words, farmers wanted yield and persistence, and they would manage forage quality with time of cutting. To get yield, faster-recovering, less dormant cultivars were needed that could stand toe-to-toe for winterhardiness and persistence with the traditional Fall Dormancy 2 (FD2) offerings.

Through the years, plant breeding companies have been successful in breaking the fall dormancy-winterhardiness relationship. These days, there are some FD4 and FD5 varieties that possess outstanding winter survival characteristics with the added advantage of greater yield potential compared to their FD2 and FD3 counterparts.

This latter point was recently reinforced by a multi-year University of Minnesota study comparing varieties with various fall dormancy ratings cut at three different cutting intervals.

The researchers reported that, on average, the FD5 cultivar yielded 9 percent more total-season forage dry matter than the FD2 cultivar. What does this mean? Effectively, it shows that plant breeders have been successful in reducing (not eliminating) the trade-off between yield, quality, and persistence.

At the core of reduced-lignin alfalfa is also an attempt to further blur the lines between our alfalfa three-legged stool, especially between yield and quality. The potential to cut later and accumulate yield that has the same quality as earlier cut forage is one more step toward alfalfa’s holy grail.

Can’t have it all

Despite the many advancements that have been made since the 1990s, alfalfa producers still can’t manage alfalfa such that yield, forage quality, and persistence are all maximized. The strategies to fully maximize any one of the three are still different and will likely remain that way.

That said, plant breeders have made good progress in narrowing some of the significant trade-off gaps. But let’s not put it all on the breeders. Alfalfa growers can further narrow these trade-offs with their own harvest-timing decisions, coupled with recommended establishment techniques and soil fertility practices.

If the alfalfa industry is to survive and thrive, continued pursuit of the holy grail will be a big part of the equation.