
April 27, 2009

Abolish tenure?

Mark Taylor, a professor of religion, has observed (in the New York Times) that

"graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist)…[with] sometimes well over $100,000 in student loans."
This is not really true of the sciences, where the main product is the research that is central to graduate education, research that often leads directly to improvements in healthcare, agriculture, engineering, or environmental quality. Nor do science PhD's usually take on much debt. (If you are applying to grad school in science, and they don't promise you fellowship support or paid teaching opportunities sufficient to meet minimal living expenses, it's either because you are poorly qualified or because the program is poorly funded. Either way, you should reconsider.)

But programs in the sciences do collectively graduate more PhD's than they hire, so a PhD is no guarantee of a faculty position. I have discussed this before.

Taylor's proposed solutions? Several ideas whose effects on the stated problem are hard to predict but probably small (restructuring curriculum, abolishing departments, accepting video games and such as substitutes for traditional written dissertations), one that would make the job shortage worse but might have other benefits (eliminating programs and substituting internet courses), and two that might help new PhD's find jobs (preparing students for nonacademic careers and abolishing tenure). Preparing students for nonacademic careers has been discussed for years, and Taylor doesn't offer any new ideas on how to do it. But what about abolishing tenure?

There are two reasons why universities offer tenure, of which Taylor mentions only academic freedom: ensuring that professors can explore important ideas that may be controversial, without fear of losing their jobs. The second reason is that, by offering lifetime employment security, universities can attract top scientists at lower salaries.

It is not generally true, as Taylor claims, that "once tenure has been granted, there is no leverage to encourage a professor to continue to develop professionally." Salary increases depend on success in research and teaching as well as service to the university and society. There may be a few older professors, however, who would rather goof off than work for a salary increase. I don't know anyone like that, but Taylor's experience in a religion department may be different.

How could we discourage people from "retiring in place", without undermining academic freedom? How about letting professors' salaries decrease gradually if their academic performance decreases? (This assumes performance can be evaluated fairly, but we already assume this in awarding pay increases.) Especially if pensions are tied to peak salary, this would discourage professors from hanging on long past their prime.

A potential salary decrease would be much less of a threat to academic freedom than the prospect of having to give up research altogether. ("If I explore the ecological risks of transgenic crops, even though the university gets millions from biotech companies, I could risk a pay cut, but this is more important than money. Besides, if I make some really important discoveries, they might feel compelled to give me a raise, especially if biotech money runs out. Or I could get a better job offer from another university, or a book contract.")

Remember, however, that society supports graduate programs in the sciences mainly because we need the results from the research done by graduate students (apprentices guided by professors), not to provide jobs for professors or PhD's to fill those jobs. Any proposed changes in tenure should consider the impact on research that benefits society as a whole, not just the impact on individuals. Do we want to force everyone to work on what state legislators and university administrators think is most important, or should we allow professors willing to risk pay cuts (or risk forgoing salary increases, under the current system) to explore high-risk ideas with potentially high returns?

By the way, as an adjunct professor, I do not have tenure.

April 24, 2009

Optimal bet-hedging?

Suppose, in an average year, that weather is best for survival of young seedlings in early June. If a plant could make only one seed, it should make one targeted to germinate at the beginning of June. (Plants have some control over when their seeds will germinate, based on plant-hormone concentrations, seed-coat thickness, etc.)

Given variability among years, however, a plant that produces many seeds may have more descendants if those seeds germinate at various times, rather than risking everything on one date. On the other hand, too much variation in germination time may be as risky as too little. For example, seeds germinating really late may be killed by frost. How does actual variation in germination timing among seeds of individual wild plants compare with the optimum amount of variation? This week's paper is apparently the first to answer this question.

"Fluctuating natural selection accounts for the evolution of diversification bet-hedging" was published in Proceedings of the Royal Society by Andrew Simons, of Carleton University in Canada.

To see how seedling survival varies as a function of germination date and year, he germinated Lobelia seeds in a greenhouse and transplanted them to the field on several dates per year, over five years. (Seedlings were transplanted in 8 cm fiber pots, so they were probably more protected from competition than if they had germinated in the field. Did competition change enough over the season to affect optimal germination times?) Survival data for these seedlings were used to calculate the optimum variation in germination timing, that is, the amount of variation in a plant's seeds that would maximize its total descendants, given the observed variation in field conditions. He then compared that number to previously measured within-maternal-plant variation in seed germination timing. Actual variation was just slightly less than the calculated optimum: 12 vs. 13 (days?). Simons concluded that the fitness benefits of bet-hedging were sufficient to explain the observed variation.
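
To make the idea of an "optimum amount of variation" concrete, here is a minimal simulation sketch of diversification bet-hedging (my own illustration with made-up numbers, not Simons's actual analysis). It assumes a Gaussian seedling-survival curve whose best date shifts randomly among years, and it measures long-term success as the geometric mean of yearly survival, the quantity bet-hedging theory says selection should maximize. An intermediate spread of germination dates does best: too little variation risks disaster in unusual years, too much wastes seeds on consistently bad dates.

```python
# Toy model of diversification bet-hedging in germination timing.
# All parameter values are illustrative assumptions, not data from Simons's paper.
import numpy as np

rng = np.random.default_rng(0)

def yearly_survival(germination_days, peak_day, width=5.0):
    """Seedling survival probability vs. germination date, for a year whose
    best date is peak_day (assumed Gaussian survival curve)."""
    return np.exp(-0.5 * ((germination_days - peak_day) / width) ** 2)

def geometric_mean_fitness(sd_germination, n_years=2000, n_seeds=200, sd_peak=8.0):
    """Long-term (geometric mean) fitness of a plant whose seeds germinate
    around day 0 with standard deviation sd_germination (days)."""
    log_fitness = 0.0
    for _ in range(n_years):
        peak = rng.normal(0.0, sd_peak)                    # this year's best date
        seeds = rng.normal(0.0, sd_germination, n_seeds)   # germination dates
        log_fitness += np.log(yearly_survival(seeds, peak).mean() + 1e-12)
    return np.exp(log_fitness / n_years)

for sd in [0, 2, 4, 6, 8, 10, 13, 16, 20]:
    print(f"germination SD = {sd:2d} days -> long-term fitness = "
          f"{geometric_mean_fitness(sd):.3f}")
```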

One would expect some variation in germination even in a uniform environment where the optimum variation was zero (i.e., all seeds should germinate on the one best day, if possible). Simons argued that "such an invariable environment neither exists nor can be experimentally created." It may be true that there is always some variation, but we could certainly minimize variation by growing plants indoors under artificial lights, with the same yearly pattern of light and temperature every year. I wonder how long it would take for plant populations to evolve significantly lower variation in seed germination, under those conditions.

Also this week:
Is there an adverse effect of sons on maternal longevity?

Snowdrift game dynamics and facultative cheating in yeast


April 17, 2009

Mindless manipulation

This week I'll discuss a recent paper from our lab. But first, here are links to three other papers that look interesting:

Evidence from the domestication of apple for the maintenance of autumn colours by coevolution
Some insect pests avoid trees whose leaves turn red in autumn and do poorly on those trees. But can trees "lie", or is there an unbreakable link between red color and poor quality as a host, perhaps because "aphids grow better on trees that drop their leaves later [because they have enough nitrogen that they can risk losing high-N leaves in frost?], which are known to have fewer autumn colours [because, by the time they lose chlorophyll, UV levels are too low to require the protection provided by red anthocyanins?]"?

Functional morphology of the ankle and the likelihood of climbing in early hominins
Modern chimps use their ankles, when climbing trees, in ways some early hominins (1-4 million years ago) probably couldn't, based on fossils.

Cooperation and virulence of clinical Pseudomonas aeruginosa populations
Patients with pneumonia are sicker when bacterial cells cooperate by producing individually costly virulence factors, but within 9 days bacterial populations evolved "cheaters" that don't make these factors.

---------------------------------------------------------------
In our paper, "Rhizobitoxine producers gain more poly-3-hydroxybutyrate in symbiosis than do competing rhizobia, but reduce plant growth", published online in The ISME Journal, my PhD student Will Ratcliff describes experiments showing how symbiotic nitrogen-fixing bacteria can manipulate their plant hosts.

Legumes use the plant hormone ethylene to control the number of root nodules they make to house these bacteria, known as rhizobia. Some rhizobia make a chemical, rhizobitoxine, that blocks ethylene signaling, resulting in more nodules per plant.

We were not surprised to find that a rhizobitoxine-producing (Rtx+) strain resulted in worse plant growth than an otherwise-similar Rtx- strain. Past natural selection has presumably eliminated plant genes that fail to adjust nodule number appropriately to meet a plant's nitrogen needs without wasting photosynthate. Therefore, a microbe-induced change in the way a plant allocates its resources is unlikely to benefit the plant. (Toby Kiers and I discussed this point recently in Annual Review of Ecology, Evolution, and Systematics.) Similarly, we would expect natural selection to have eliminated the rhizobial Rtx+ gene unless it benefited the rhizobia. (It's risky to assume that a given complex trait will necessarily evolve, just because it would be beneficial, but eliminating a complex but harmful trait is a lot easier. There must be hundreds of simple mutations that would knock out rhizobitoxine production, for example.)

Benefits to Rtx+ rhizobia were harder to detect than we expected, however. Although the Rtx+ strain more than tripled the number of nodules per plant, it then occupied only 42% of those nodules, when competing with an Rtx- strain. So, if anything, rhizobitoxine helped the competing Rtx- strain. When the two strains were inoculated together, nodules containing the Rtx+ strain were similar to Rtx- nodules in rhizobia per nodule, so there was no benefit there, either.

So what was different? In a word, polyhydroxybutyrate (OK, it's a pretty long word, so let's just call it PHB). Rtx+ rhizobia accumulated 47% more of this high-energy lipid than Rtx- rhizobia in other nodules on the same plant. Previously, Will showed that rhizobia can use PHB to power reproduction or to survive starvation, so this is an important benefit, but not one we could have detected just by counting or weighing nodules.

There are at least two possible interpretations of these results:
1) Rhizobitoxine tricked the plant into giving Rtx+ nodules more photosynthate, which they used similarly to Rtx- strains. In other words, they hoarded more PHB, but also fixed more nitrogen. In that case, why did plants infected only with Rtx+ rhizobia grow less?
2) Maybe rhizobitoxine somehow protected Rtx+ from the "host sanctions" that can reduce the reproduction of rhizobia in nodules that fix too little nitrogen. Once protected against sanctions, mutants that divert resources from nitrogen fixation to PHB could spread within populations of Rtx+ rhizobia.

"What natural selection cannot do, is to modify the structure of one species, without giving it any advantage, for the good of another species; and though statements to this effect may be found in works of natural history, I cannot find one case which will bear investigation." -- Darwin


April 8, 2009

Evolution-Proof?

Which animals kill the most humans? Lions and tigers and bears? Oh no, malaria-transmitting mosquitoes! The risks of using insecticides to kill mosquitoes may be outweighed by the benefits, but those benefits only last until mosquito populations evolve resistance. Careful use (insecticide-treated bed-nets, for example, rather than spraying wetlands) can slow the evolution of resistance, but we haven't yet achieved a goal I recently saw on a bumper sticker, namely, to "Stop Evolution Now!"

Can we do better? A paper published today suggests a new approach. "How to make evolution-proof insecticides for malaria control" was written by Andrew Read and colleagues. It's in the open-access journal PLoS Biology, so you can read the whole article for details, but here's my summary:

When insecticides kill mosquitoes quickly, the rare insecticide-resistant mutants make a disproportionate contribution to the next generation, so resistance evolves quickly. But reproduction of the malaria parasite inside mosquitoes is slow enough (and mosquito life-spans are short enough) that most mosquito eggs are laid by mosquitoes that have not yet become infectious. So the authors suggest developing methods that only kill older mosquitoes. This could be a slow-acting chemical insecticide, a slow-killing virus, or something that preferentially kills older mosquitoes. They wrote:

" in principle at least, public health advances can be achieved with minimal selection for resistance by an insecticide that kills after the majority of mosquito reproduction has occurred but before malaria parasites are infectious."

One problem they note is that, although slow-acting insecticides would provide longer-lasting protection from malaria, they wouldn't keep people from getting bitten. People often have a short-term perspective, so they might be reluctant to adopt this approach unless it were combined with other methods. For example, window screens keep people from getting bitten at home, while also imposing selection on the malaria parasite for lower virulence (because people too sick to leave the house don't get bitten and transmit the more virulent variants).

Although most mosquitoes live only a short time, I wonder whether there are places or times where long-lived mosquitoes constitute most of the population. For example, most Monarch butterflies have short life-spans, but those that migrate back to Mexico from the US are much longer-lived. If long-lived mosquitoes lay most of the eggs after a drought, for example, then insecticides that kill (susceptible) older mosquitoes could lead to more rapid evolution than the models in this paper predict. But I don't know enough about mosquitoes to know if that's a real concern.

Why Evolution is True discusses human-biting mosquitoes in the London Underground that may have evolved since it was built. Their sister species (or subspecies) on the surface mostly bites birds.

April 3, 2009

How fast can sexual traits evolve?

Experimental populations of hermaphroditic plants evolved a significant increase in male function in only three generations.

Many plant species are hermaphrodites, with each individual producing both pollen and seeds. Other species have separate sexes, as mammals and birds do, while still others have mixtures of unisexuals and hermaphrodites. Based on the distribution of these traits in the family tree of life, evolutionary transitions among these "lifestyles" appear to have been fairly common. This week's paper shows how hermaphrodites can evolve to be more female or, in this case, more male. "Hermaphroditic Sex Allocation Evolves When Mating Opportunities Change" was just published in Current Biology by Marcel Dorken and John Pannell.

What does it mean to be “more male”? Dorken and Pannell previously showed that hermaphrodites of the plant they study, Mercurialis annua, make more pollen when they are spaced farther apart (i.e., when they need their own pollen because they are unlikely to get it from neighbors). This is an individual response, rather than an evolutionary change in the genetic composition of a population. But can greater maleness also evolve?

When there are lots of male-only plants around, there is plenty of pollen. (Males make at least 10 times as much pollen as hermaphrodites the same size, because pollen grains are smaller than seeds and therefore cheaper.) But when males are scarce, hermaphrodites that make more pollen should sire a larger fraction of the next generation (on themselves or neighboring hermaphrodites), so genes for this greater male function should increase in frequency.

And that is what the authors found. Experimental hermaphrodite-only populations increased the fraction of their resources spent on pollen, rather than seeds, by 10% in only three generations.

Meanwhile, hermaphrodites mixed with about 50% males showed no evolutionary change. I think they may have been expecting these populations to evolve in the opposite direction, with hermaphrodites becoming more female – perhaps eventually all-female, leading to a population with separate male and female plants. But, given the higher cost of seeds, relative to pollen, it may take stronger selection or a longer time to evolve greater seed production, relative to greater pollen production.
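
To see why the direction of selection depends on the presence of males, here is a toy fitness calculation (my own sketch with illustrative numbers, not taken from Dorken and Pannell). A rare hermaphrodite mutant that shifts resources from seeds to pollen gains siring success when hermaphrodites supply all of the pollen, but loses ground when males, each making about ten times a hermaphrodite's pollen, dominate the pollen pool.

```python
# Toy sex-allocation calculation; all numbers are illustrative assumptions.
# Fitness through female function = own seeds; fitness through male function =
# share of the population's pollen pool times the seeds available to be sired.

def fitness(pollen, seeds, pollen_pool, seed_pool):
    """Genetic contribution: own seeds plus seeds sired on other plants."""
    return seeds + (pollen / pollen_pool) * seed_pool

def mutant_advantage(n_herm, n_males, resident_alloc=0.2, mutant_alloc=0.3,
                     male_pollen_factor=10.0):
    """Relative fitness of a rare hermaphrodite mutant that spends a larger
    fraction of its reproductive resources on pollen (linear trade-off assumed)."""
    res_pollen, res_seeds = resident_alloc, 1.0 - resident_alloc
    mut_pollen, mut_seeds = mutant_alloc, 1.0 - mutant_alloc
    male_pollen = male_pollen_factor * res_pollen   # males make ~10x the pollen
    pollen_pool = n_herm * res_pollen + n_males * male_pollen
    seed_pool = n_herm * res_seeds
    return (fitness(mut_pollen, mut_seeds, pollen_pool, seed_pool)
            / fitness(res_pollen, res_seeds, pollen_pool, seed_pool))

print("hermaphrodites only:", round(mutant_advantage(100, 0), 3))    # > 1: favored
print("50% males present:  ", round(mutant_advantage(100, 100), 3))  # < 1: disfavored
```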

Other recent papers that looked interesting:

Accommodating natural and sexual selection in butterfly wing pattern evolution

Communal Nutrition in Ants

Flat lizard female mimics use sexual deception in visual but not chemical signals

Two fungal symbioses collide: endophytic fungi are not welcome in leaf-cutting ant gardens

Phylogenomics Revives Traditional Views on Deep Animal Relationships

Non-breeding season events influence sexual selection in a long-distance migratory bird

No actual conflict over colony inheritance despite high potential conflict in the social wasp Polistes dominulus

Sequencing and Analyses of All Known Human Rhinovirus Genomes Reveal Structure and Evolution

Cooperation and virulence of clinical Pseudomonas aeruginosa populations

New Guinea highland origin of a widespread arthropod supertramp

Flight speeds of swifts (Apus apus): seasonal differences smaller than expected

The cultural and chronological context of early Holocene maize and squash domestication in the Central Balsas River Valley, Mexico

Sexual selection drives weak positive selection in protamine genes and high promoter divergence, enhancing sperm competitiveness

Coevolution of diet and prey-specific venom activity supports the role of selection in snake venom evolution

The evolution of primate visual self-recognition: evidence of absence in lesser apes

Mimicry, colour forms and spectral sensitivity of the bluestriped fangblenny, Plagiotremus rhinorhynchos

Reliabilities of identifying positive selection by the branch-site and the site-prediction methods

Plants with double genomes might have had a better chance to survive the Cretaceous–Tertiary extinction event

Ecomorphological selectivity among marine teleost fishes during the end-Cretaceous extinction

Starch grain and phytolith evidence for early ninth millennium B.P. maize from the Central Balsas River Valley, Mexico

A microraptorine (Dinosauria–Dromaeosauridae) from the Late Cretaceous of North America

Natural variation in a neural globin tunes oxygen sensing in wild Caenorhabditis elegans

It takes two to tango: reproductive skew and social correlates of male mating success in a lek-breeding bird

Trill consistency is an age-related assessment signal in banded wrens

Influence of major histocompatibility complex genotype on mating success in a free-ranging reptile population

Seasonal host dynamics drive the timing of recurrent epidemics in a wildlife population