Will These New Search Tools Make Us Worse Searchers?
Internet search providers are doing some innovative things these days. Recently, two examples surfaced: Wolfram|Alpha, which returns answers rather than sources; and NetBase, which looks at the language surrounding a search term to expand on a topic's context.
Many of these new search technologies promise to analyze the context of data and return specific answers to our questions, as opposed to current search engines, which bring us to sources where we can find the answers ourselves. It's a fine point, but an important one. Current search technologies require us to know of, or at least analyze, the source of an answer. The new tools do more of that legwork for us, sifting through a source to find the data. But what does this mean, besides allowing us to type in "population Rhode Island" and see a number, rather than the Census Bureau web page where that number comes from?
It means that the source is hidden, or at least obscured, which raises a couple of questions. First, commercial vendors are presenting data rather than sources. What guarantees that good information, rather than commercial interest, is their primary motive? Second, such services further remove searchers from the process of search and make us less responsible for checking the sources behind the information we find. We'll potentially rely more on the search tool, and less on our own critical thinking skills.
I'm not saying the Internet should require a higher education to use, but I do think a higher level of skepticism can't hurt. Some aspects of these technologies are easy to love (quicker reference-type answers, a more practical Internet for everyone, automatic relationship-building between topics), but others make me nervous. What do you think?