"Crops that take up nutrients faster may reduce some wasteful nutrient losses (for example, leaching [see glossary] of nutrients in water percolating down through the soil), but every atom of nitrogen or phosphorus that is sold off-farm in grain, milk, or other farm products still needs to be replaced, for long-term sustainability... " -- Darwinian Agriculture, p. 67
This conservation-of-matter argument was the basis of my argument, in my fifth lecture at the International Rice Research Institute (IRRI), that rice with improved phosphorus uptake might offer mostly short-term benefits. (A slide with the only field data I've seen on this rice mysteriously disappeared, about 14 minutes into the talk.)
But the abiotic-stress group at IRRI called my attention to another approach to "the phosphorus problem" that seems really promising. Actually, there are several phosphorus problems:
* Phosphorus fertilizer is expensive, and it will get more expensive as high-phosphorus ore reserves are depleted.
* Some soils bind phosphorus, limiting its availability to plants.
* Phosphorus, mostly from livestock manure, is a major contributor to water pollution.
In the book, I mention crops with "proteoid" roots or increased symbiosis with mycorrhizal fungi as possible ways to increase crop uptake of less-available forms of phosphorus. But that doesn't solve the conservation-of-matter problem, i.e., the need to replace phosphorus in grain sent to distant cities or feedlots. The eventual depletion of phosphorus reserves seems such a severe (though perhaps distant) problem that I briefly mention the "back to the land" option, to facilitate recycling of phosphorus in our waste.
But what if 1000 kg of grain contained only 1 kg of phosphorus, instead of 4 kg? We could then reduce phosphorus fertilization by 75%, making phosphorus reserves last four times as long. (OK, this isn't a permanent solution, but it could give us many more decades to find a permanent solution.) I didn't consider this option in the book, because I assumed that low-phosphorus grain would be less nutritious.
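The arithmetic above can be checked in a couple of lines (the 1 kg and 4 kg figures are the hypothetical numbers from the paragraph, not measured data):

```python
# Back-of-envelope check of the low-phosphorus-grain scenario:
# if grain exported 1 kg P per 1000 kg instead of 4 kg, how much less
# replacement fertilizer is needed, and how much longer would a fixed
# phosphorus reserve last at the reduced rate of use?

p_per_tonne_current = 4.0      # kg P exported per 1000 kg grain (assumed)
p_per_tonne_low = 1.0          # kg P exported per 1000 kg grain (hypothetical)

fertilizer_reduction = 1 - p_per_tonne_low / p_per_tonne_current
reserve_multiplier = p_per_tonne_current / p_per_tonne_low

print(f"Fertilizer reduction: {fertilizer_reduction:.0%}")   # 75%
print(f"Reserves last {reserve_multiplier:.0f}x as long")    # 4x
```

Of course, the real-world numbers would depend on how much phosphorus removal could actually be cut without hurting seed performance.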
The abiotic-stress group at IRRI corrected my misconception. It turns out that much of the phosphorus in grain is in the form of phytate, which neither we nor our animals can digest. So it ends up in manure. I'd heard about attempts to reduce phytate levels in seeds as a partial solution to the phosphorus-pollution problem. But low-phytate seeds would also reduce the amount of phosphorus exported from a farm in each ton of grain, which would reduce the need for phosphorus inputs to replace it.
This seems like a win-win solution. Why didn't natural selection think of this? The phytate isn't there for our benefit; it's there to supply the phosphorus needs of the germinating seed and seedling, until it can grow enough roots to get phosphorus from the soil. We might therefore expect low-phosphorus seeds to grow poorly, although this isn't necessarily true for high-phosphorus seeds that have less of their phosphorus as phytate.
Low-phytate seeds with high total phosphorus would be more digestible, increasing the fraction of their phosphorus that ends up in meat or milk rather than manure. So they could reduce pollution. They wouldn't reduce the need for phosphorus inputs, however.
But what if we could supply the phosphorus needs of growing seedlings externally? If 99% of seeds get eaten, and only 1% get planted, could we give the 1% some extra phosphorus, perhaps as a seed coating? I don't see any fundamental (e.g., conservation-of-matter) reason why this wouldn't work, though it would probably require a clever combination of plant breeding and agronomy.