April 2013 Archives

Near the beginning of the question period for this recent lecture at the University of Minnesota, I suggested that:

1) nobody has done a good comparison of ideotype breeding with breeding for yield, and
2) many plant breeders who use the word "ideotype" ignore tradeoffs.

The main point of Donald's 1968 paper, which coined the term "ideotype," was that there are often tradeoffs between individual-plant competitiveness and the collective performance of plant communities, so we can improve the latter by sacrificing the former. That's a major theme of my book, as well.

But both my numbered points above turn out to be wrong, at least partly.

Yuan et al. (2011) compared ideotype breeding with breeding for yield. I criticized some of their choices for "ideotype traits" in my third lecture at the International Rice Research Institute, but it's still an impressive study.

And, rereading Rasmusson's 1984 paper on ideotype breeding, I find extensive discussion of tradeoffs, though he doesn't explicitly mention the tradeoff between competitiveness and yield potential hypothesized by Donald (1968).

I am correcting these errors in a perspective I'm writing for the journal Evolution.

...or whatever we call over 100 but fewer than 1000 views.

This page has links to an interview Michael Joyce did with me at the end of my week-long visit to the International Rice Research Institute, as well as the five lectures I gave there (plus audience questions and discussion).

Also still available are:
* a 60-second AAAS story on my most-cited paper
* a video of my keynote talk at the Applied Evolution Summit
* a lower-quality video of a talk on Evolutionary Tradeoffs as Agricultural Opportunities
* an audio interview with science writer Carl Zimmer

Or, you can find an updated list of my publications, with links to many of them, here.

"Crops that take up nutrients faster may reduce some wasteful nutrient losses (for example, leaching [see glossary] of nutrients in water percolating down through the soil), but every atom of nitrogen or phosphorus that is sold off-farm in grain, milk, or other farm products still needs to be replaced, for long-term sustainability... " -- Darwinian Agriculture, p. 67

This conservation-of-matter argument was the basis of my argument, in my fifth lecture at the International Rice Research Institute (IRRI), that rice with improved phosphorus uptake might offer mostly short-term benefits. (A slide with the only field data I've seen on this rice mysteriously disappeared, about 14 minutes into the talk.)

But the abiotic-stress group at IRRI called my attention to another approach to "the phosphorus problem" that seems really promising. Actually, there are several phosphorus problems:
* Phosphorus fertilizer is expensive, and it will get more expensive as high-phosphorus ore reserves are depleted.
* Some soils bind phosphorus, limiting its availability to plants.
* Phosphorus, mostly from livestock manure, is a major contributor to water pollution.

In the book, I mention crops with "proteoid" roots or increased symbiosis with mycorrhizal fungi as possible ways to increase crop uptake of less-available forms of phosphorus. But that doesn't solve the conservation-of-matter problem, i.e., the need to replace phosphorus in grain sent to distant cities or feedlots. The eventual depletion of phosphorus reserves seems such a severe (though perhaps distant) problem that I briefly mention the "back to the land" option, to facilitate recycling of phosphorus in our waste.

But what if 1000 kg of grain contained only 1 kg of phosphorus, instead of 4 kg? We could then reduce phosphorus fertilization by 75%, making phosphorus reserves last four times as long. (OK, this isn't a permanent solution, but it could give us many more decades to find a permanent solution.) I didn't consider this option in the book, because I assumed that low-phosphorus grain would be less nutritious.
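Here's a quick back-of-envelope version of that arithmetic, as a Python sketch; the 4 kg and 1 kg figures are the illustrative numbers from the paragraph above, not measurements:

```python
# Back-of-envelope phosphorus mass balance, using the illustrative
# numbers from the text above (not measured values).

p_per_tonne_now = 4.0  # kg P exported per 1000 kg of grain
p_per_tonne_low = 1.0  # kg P exported if most phytate were eliminated

# For long-term sustainability, every kg of P sold off-farm must
# eventually be replaced, so fertilizer need scales with export.
reduction = 1 - p_per_tonne_low / p_per_tonne_now
lifetime_factor = p_per_tonne_now / p_per_tonne_low

print(f"Fertilizer reduction: {reduction:.0%}")          # 75%
print(f"Reserves last {lifetime_factor:.0f}x as long")   # 4x
```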

The abiotic-stress group at IRRI corrected this misconception. It turns out that much of the phosphorus in grain is in the form of phytate, which neither we nor our animals can digest. So it ends up in manure. I'd heard about attempts to reduce phytate levels in seeds as a partial solution to the phosphorus-pollution problem. But low-phytate seeds would also reduce the amount of phosphorus exported from a farm in each ton of grain, which would reduce the need for phosphorus inputs to replace it.

This seems like a win-win solution. Why didn't natural selection think of this? The phytate isn't there for our benefit; it's there to supply the phosphorus needs of the germinating seed and seedling, until it can grow enough roots to get phosphorus from the soil. We might therefore expect low-phosphorus seeds to grow poorly, although this isn't necessarily true for high-phosphorus seeds that have less of their phosphorus as phytate.

Low-phytate seeds with high total phosphorus would be more digestible, increasing the fraction of their phosphorus that ends up in meat or milk rather than manure. So they could reduce pollution. They wouldn't reduce the need for phosphorus inputs, however.
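A hedged sketch of that distinction, with made-up digestibility fractions purely for illustration (I don't have measured values here):

```python
# Sketch: where grain phosphorus ends up when fed to livestock.
# Digestibility fractions below are hypothetical, for illustration only.

p_per_tonne = 4.0  # kg P per 1000 kg grain (same total P in both cases)

digestible_conventional = 0.3  # hypothetical: most P locked up as phytate
digestible_low_phytate = 0.8   # hypothetical: phytate replaced by digestible P

for label, frac in [("conventional", digestible_conventional),
                    ("low-phytate", digestible_low_phytate)]:
    to_animal = p_per_tonne * frac        # ends up in meat, milk, or the animal
    to_manure = p_per_tonne * (1 - frac)  # potential water pollution
    print(f"{label:12s}: {to_animal:.1f} kg P/t to animal, "
          f"{to_manure:.1f} kg P/t to manure")

# Either way, 4.0 kg P per tonne of grain left the farm, so the fertilizer
# needed to replace it is unchanged: less pollution, same depletion.
```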

But what if we could supply the phosphorus needs of growing seedlings externally? If 99% of seeds get eaten, and only 1% get planted, could we give the 1% some extra phosphorus, perhaps as a seed coating? I don't see any fundamental (e.g., conservation-of-matter) reason why this wouldn't work, though it would probably require a clever combination of plant breeding and agronomy.
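For a sense of scale, here's a sketch of that mass balance; the 1% planting fraction comes from the paragraph above, while the phytate-phosphorus content per kg of grain is a placeholder value:

```python
# Sketch: how little P is needed if only planted seeds get extra phosphorus.
# The 1% planting fraction is from the text; the phytate P content is a
# placeholder, not a measured value.

grain_total = 1000.0      # kg of grain produced
planted_fraction = 0.01   # 1% of seeds planted, 99% eaten
phytate_p_per_kg = 0.003  # kg P per kg grain stored as phytate (placeholder)

# P that conventional seeds carry as phytate, farm-wide:
p_in_all_seeds = grain_total * phytate_p_per_kg
# P a coating would need to supply, if only planted seeds need it:
p_for_coating = grain_total * planted_fraction * phytate_p_per_kg

print(f"Phytate P in all grain:   {p_in_all_seeds:.2f} kg")
print(f"P needed as seed coating: {p_for_coating:.2f} kg")  # ~1% as much
```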