May 18, 2009

An analogy between the laws of chemistry and the US highway system

From The Origin of Life: A Case Is Made for the Descent of Electrons, which appeared in American Scientist:

Consider the requirements of the U.S. Interstate highway system. The system includes an enormously complex network of roads; major infrastructure devoted to extracting oil from the Earth, refining oil into gasoline and distributing gasoline along the highways; a major industry devoted to producing automobiles; and so on. If we wanted to explain this system in all of its complexity, we would not ask whether cars led to roads or roads led to cars, nor would we suspect that the entire system had been created from scratch as a giant public works project. It would be more productive to consider the state of transport in preindustrial America and ask how the primitive foot trails that must certainly have existed had developed into wagon roads, then paved roads and so on. By following this evolutionary line of argument, we would eventually account for the present system in all its complexity without needing recourse to highly improbable chance events.

In the same way, we argue, the current complexity of life should be understood as the result of a multistep process, beginning with the catalytic chemistry of small molecules acting in simple networks—networks still preserved in the depths of metabolism—elaborating these reaction sequences through processes of simple chemical selection, and only later taking on the aspects of cellularization and organismal individuality that make possible the Darwinian selection that biologists see today. Our task as origin-of-life researchers is to look at the modern highways and see what they reveal about the original foot trails.

It looks like chemists are trying to learn from civil engineers and town planners. I don't believe there is any universal law explaining the growth of networks across different categories (say, engineering networks versus biological networks), since the mechanisms and incentives that drive their growth are quite different. Still, the patterns of different networks, whether in their final form or in their process of incremental change, might exhibit similar features, which is interesting.

May 12, 2009

The next big thing: Wolfram|Alpha

When talking about Stephen Wolfram, people in the field of complex systems naturally think of his well-known book A New Kind of Science and the software Mathematica. Now he and his researchers are about to launch a new kind of search engine: Wolfram|Alpha, the so-called computational knowledge engine. It will be open to the public pretty soon.

Some people asked Stephen whether Wolfram|Alpha was meant to be a Google killer. He said no: "the goal of Alpha is to give everyone access to expert knowledge and the data that a specialist would be able to compute from this information."

So what's special about this search engine? Here is a quote from ReadWriteWeb:

Alpha, which will go live within the next few weeks, is quite different from Google and really doesn't directly compete with it at all. Instead of searching the web for info, Alpha is built around a vast repository of curated data from public and licensed sources. Alpha then organizes and computes this knowledge with the help of sophisticated Natural Language Processing algorithms. Users can ask Alpha any kind of question, which can be constructed just like a Google search (think: "hurricane bob" or "carbon steel strength").

I haven't had the chance to try it yet, but some snapshots can be found here. Also, here is a sneak preview of Wolfram|Alpha. Just looking at the results, I was amazed. By entering a query like "GDP in Europe," you can get many interesting graphs, demographic data, and GDP-related figures that were probably available only to experts in the field, or that would have taken a non-expert a lot of effort to track down.

For researchers, I assume it could be of even greater value, since we all know the pain of searching for relevant and reliable research data.