Dr. Dregs has been uncharacteristically insistent of late that we stop acquiring more junk. Normally, I'm in agreement with that plan, but clearly I must have one of these. Just in case anyone out there is suddenly seized by the urge to buy me a present, now you know what to get. Hey, it's a little pricey, maybe, but aren't I worth it?
Of course, the Sunnydale High Library could never be complete without a Rupert Giles action figure, so I'd need one of those, too. This is obviously a dangerous path to travel ... maybe Doc Dregs is right.
An interesting piece of research featured on the front U of M page today shows that regular viewers of the NBC sitcom Will and Grace tend to be less prejudiced against gay men. That alone wouldn't be such a big deal, if the audience comprised mostly people who were already not prejudiced, but the study also found that attitudes were influenced most among Will and Grace viewers with the least amount of prior direct personal contact with gay men.
Will and Grace is a show I've watched on and off over the years, sometimes finding it hysterically funny, other times finding it a bit too repetitive and reliant on broad stereotypes: fussy, shallow, pop-culture-obsessed gay men and desperate 30-something single women. I haven't watched regularly in a couple of years, lacking both the time and the interest.
But there is something that disturbs me in all of this. It isn't that people underwent what I would consider to be a positive attitude change by watching a show that I would rate as mediocre-plus. It's that the conventional wisdom that we're all too jaded and media-savvy to be influenced by what we see on TV is apparently wrong.
So I guess it's both heartening and slightly upsetting that a mere TV sitcom can make a measurable difference in the world.
In addition to ruffly pink things covered with hearts and flowers, this has got to be right up there at the top of the list. Um, ew. Icky on so many levels. Do these people know what M.I.L.F. stands for? Because if you read the shirt's message aloud and substitute the phrase for the acronym, it seems like any kid wearing it has been signed up for a whole world of Oedipal pain.
This shirt, on the other hand, is cute, funny, and clever enough to overcome its questionable level of taste.
(Oh, and sorry to disappoint some of you -- you know who you are! -- but the child really is purely hypothetical.)
I am as big a Lord of the Rings fan as anyone. Okay, actually, I'm a much bigger fan than most people, having read the thing upwards of 20 times and being extremely fond of the recent film version. But some ... interpretations ... of the story just don't seem like such a good idea. For example, the musical, which has just opened in Toronto. Critics are underwhelmed by the spectacle, calling it, among other complimentary things, "largely incomprehensible."
I'm not sure I ever want to hear the music, which, no matter how good it might actually be, is pretty much my reason for believing that a LOTR musical is just a bad, bad idea to begin with. Here we have a work that teeters on the edge of self-parody and camp under the best of circumstances -- and we're going to turn it into a musical, that most campy of artistic/entertainment genres? This really couldn't have gone well. The New York Times review confirms my worst suspicions: Galadriel "sings of Elvish good will in the style of Celine Dion," and many of the songs "suggest Enya at an ashram." Oy.
I'm all for reinterpreting classics in new ways, "putting old wine into new bottles," as the saying goes. But to stretch that metaphor further than it should be stretched, sometimes the bottle really isn't the right shape or size for the wine it's supposed to hold. Maybe this musical can be tweaked and altered enough to make it an artistic success (and, oh yeah, a runaway hit), but I'm doubtful. I'm reminded of The Simpsons and Oh! Streetcar: The Musical. But that was supposed to be a joke.
Is cursive handwriting going the way of the dinosaurs? According to some educators, teaching students to write in cursive is no longer worth the time it takes. As handwriting continues to lose ground as a method of written communication, the advantages of learning cursive are fewer and fewer.
I have fond memories of learning cursive. I vividly remember my third grade teacher, tiny, white-haired Mrs. Achterberg, teaching us cursive by the old-fashioned but pretty Palmer Method. I was always a good student of penmanship, and even went through a period of obsession with learning calligraphy in late elementary school and junior high. I took pride in my ability to write beautifully (probably because my talents in the visual arts are otherwise close to nonexistent). Although my "regular" handwriting has evolved into a fusion of cursive and printing, to this day, I can still write perfect Palmer Method if I try.
But it's easy for me to concede that cursive probably isn't a necessary skill for students to learn now, especially since it doesn't come easily to most kids -- which makes learning it both frustrating and time-consuming. Considering the ways in which the elementary curriculum has expanded, something has to go, and cursive seems like a reasonable thing to dump. If kids are going to spend a lot of time practicing something that requires fine motor skills, I'd prefer to see them take up a musical instrument, or learn proper techniques for drawing, painting, or sculpting. And for the purposes of communication, typing by touch is clearly a much more essential thing to learn.
The only problem is that most people who can't write cursive also can't read cursive. That means all sorts of original documents from the last couple of centuries will be illegible to a large number of people. Reading cursive will probably someday be a skill possessed only by historians of the 19th and 20th centuries. But as the article's North by Northwest example demonstrates, this is also part of the slow erosion of cultural continuity, another lost bit of common understanding that helps us relate to the past. Just think of all the classic films in which a note or a letter written in cursive is displayed on the screen. These will become indecipherable, just like the countless idioms, expressions, and bits of slang that have fallen out of usage over the decades. The end of cursive as something universally known by those with a basic education means it will be just a little bit harder to understand the great cultural products of the past. It's a normal process, but it's a little bit sad to see it happening.
The enduring popularity of John Gray eludes me completely. Is this really a newsworthy item? I'm bowled over by the deep insight it must require to say in hindsight that you never thought the Aniston-Pitt merger would last, because, you know, those high-profile celebrity marriages are usually so stable. Another shocking prediction from the agile brain of "Dr." Gray: Brad Pitt and Angelina Jolie won't stay together, either. Now there's a controversial contention. And I love Gray's answer to the question of whether his "methods" for relationship success work for him: something along the lines of "I got mine, so good luck to you." Nice.
Sometimes our culture is just really discouraging.
There's already been loads of commentary on the article (note: link is to a copy of article that does not require registration) in last week's NYTimes about female students at Ivy League schools who expect never to draw on their fancy educations in the service of career, but who rather plan to stay home with their potential children. Apparently, there are serious questions about whether the author's sensationalistic statistics are methodologically valid (sounds to me like they pretty much aren't), but even aside from that, I think all of the feminist handwringing over this "trend" is pretty silly. After all, how many of us are doing now what we thought we'd be doing at 19 or 20?
When I was 19, I planned to be an orchestral clarinetist. I expected that I would probably pick up a quick master's degree after finishing college, then start auditioning, presumably living hand-to-mouth supporting myself through a variety of odd and temporary jobs while I waited to win an audition. I expected neither to marry nor have children, assuming that the clarinet would be an all-consuming endeavor. I did have a backup plan of sorts, knowing the long odds I'd be facing in auditions. That plan was vague, though -- some sort of graduate or professional school to prepare me for another career in academia or law. I thought I'd figure it out as I went along.
Obviously, that's not how things played out. My plan had changed considerably by the time I graduated from college at the age of 23, and it changed again a few years after that. Now, I have my fair share of ambition, but not so very much focus, which probably separates me from many if not most Ivy Leaguers. Still, the basic point remains: is it really a good use of precious time to fret over what a few 19-year-olds say they plan to do with their lives? What are the chances, really, that in ten or fifteen years any of them will be doing what they think they'll be doing?
So here's the question part of this post: what was your plan at 19? Did you stick to it? In what ways have things not gone as you expected back then? Maybe there is reason to wonder if the gains of feminists over the past few decades were all for naught, and to worrywart over the prospect of privileged Ivy Leaguers not living up to the potential bestowed upon them by their educations, but I suspect it's all just an overreaction. Not that there aren't some sticky issues here, especially for feminists (like myself) who believe that everyone, male and female, should be able to make the choices they want, rather than the choices society expects. But come on, these are freshpersons in college. How much do they really know about their futures?
In the midst of all the dire and horrifying news of the week, it's reassuring to know that the Concerned Women for America can still get their undies in a bunch over something trivial. The Seattle Times reports that the CWA's latest crusade is against Starbucks, which has the effrontery to feature a quotation from gay author Armistead Maupin on coffee cups. Maupin's quotation is one of many featured on the cups, which Starbucks intends to serve as conversation-starters among coffeeshop patrons (or some such hogwash).
The CWA is kindly asking Starbucks to stop being liberal, already, so as not to alienate conservative caffeine addicts everywhere. As for Starbucks, they claim not to be of any particular political persuasion, even though Buyblue.org gives them a 100% "dark blue" rating for their executives' political contributions.
Even if it weren't for that, Starbucks's spokesperson gives them away in the article, saying, "Embracing diversity and treating people with dignity is one of the guiding principles of our corporation." Aha! The D-word! A sure sign of liberal sympathies.
Well, if the CWA wants to encourage conservatives to avoid a tasty, incredibly healthy beverage like coffee, I guess that's their business.
Most of us are surely familiar with the experience of "mall glaze," the soporific feeling that comes over you after too much time spent in the climate-controlled, white-noise wash of a shopping mall. The Mall of America produces its own super-potent brand of mall glaze, which, at least for me, seems to come on more quickly and intensely than at other shopping centers. Well, help is on the way: an outfit called PowerNap Sleep Centers is about to open a store at MoA called MinneNAPolis, where, for a mere 70 cents a minute, exhausted shoppers can get some rejuvenating rest.
Leaving aside for the moment the question of whether or not this is a worthwhile venture, I have to express both amusement and bemusement at the linked article's lengthy treatment of the supposed social prohibition against sleeping in public. Clearly the writer doesn't spend much time on college campuses, where people of all ages and descriptions frequently nap sprawled across benches, tables, floors, and (when the weather's halfway decent) lawns. Education seems to be a deeply tiring endeavor, whether you're student, faculty, or staff.
As to why people don't catch a little shuteye on mall benches, isn't the more obvious explanation that it's simply far too noisy to sleep, unless you're one of those lucky types who can sleep anywhere, under any circumstances? I can't even imagine trying to nap in the midst of the chatter of shoppers, the roar of the air conditioning, the piped-in music, and the screams from Camp Snoopy. I have seen people napping in the Mall, though: just check out any of the lounge areas in the women's rooms of the department stores.
So will tired shoppers be willing to pony up $42 for an hour's rest? I'm guessing not, but I suppose it depends on how desperate you are for a little peace and quiet, something that's always in short supply at the Mall of America.
Got a couple of essays brewing, but my priorities have been with outdoor activities lately, so you'll all just have to wait a little while longer to receive my trenchant observations and pearls of wisdom.
In the meantime, consider this: a guy who's on a mission to visit every Starbucks in the world. Now, I like Starbucks probably a little more than the next person: I usually enjoy their coffee. I don't find it too strong, bitter, or overcaffeinated, as many do. And I don't think they're evil, as large corporations go. But this strikes me as a little bit crazy.
At least the guy seems to have some perspective. I couldn't put it better than he does: "Every time I reach a Starbucks I feel like I've accomplished something when actually I have accomplished nothing."
Then there's the guy making a movie about Starbucks-visiting guy...
Yeah. Anyway, back to your regularly scheduled surfing.
Although it makes me want to chuck the TV out the bedroom window, I inevitably wind up watching a few minutes of one of the network morning shows each day. You know -- Good Morning America, or Today, or whatever the heck CBS calls their show. Usually, spending a little time with one of these shows makes me feel angry, depressed, and insulted, but I watch anyway. I attribute this to my weather obsession: I will sit through ten minutes of inane chitchat and hyped-up human interest garbage to hear the daily weather forecast.
Virtually everything about these shows is annoying (if not actually nauseating), from the lame semi-scripted banter between the hosts, to the syrupy, drippy piano music that underscores each day's tragic story of a courageous survivor with a surprise happy ending. But the thing about these shows that I really don't get at all is the omnipresent wall of screaming people waving handmade signs standing outside each show's studio. It's usually the weather guy who has to make conversation with these people, letting them yell into the microphone for a few seconds about their Aunt Beulah's 85th birthday, or their cousin Joe's college graduation.
Why do people do this? What's the appeal? Just imagine yourself on vacation in New York (unsurprisingly, these people seem to always be tourists, never locals). You have a week or two to see one of the greatest cities in the world, and you decide that what you really want to do is to get up before the crack of dawn one morning so you can stand outside a TV studio (possibly in inclement weather) waving a paper sign, hoping the weather guy will talk to you? I just don't get it. Is there something going on here that I don't know about? Are these people actually extras earning a paycheck? Your explanations are welcome.
Wow, a whole article about how annoying personalized cell phone ringtones can be in the office. I don't disagree that these can drive you up the wall, but is this seriously newsworthy? I love the whole "news you can use" spin on this, though -- as if this is advice that people need in order to realize that they're behaving like insensitive clods. Who are these people who think it's appropriate to have their cell phones ring during work meetings? And is this really the deciding factor in whether you make a good or a bad impression on your colleagues and managers? Seems to me that someone clueless enough to have their cellphone playing the hit song du jour at full volume during work isn't too likely to be bowling co-workers over with their savvy and personal charisma, anyway.
I think inappropriate ringtones are among the least of our worries. How about the person in the next cubicle who leaves their cell phone at their desk while they go to lunch, and it rings repeatedly, even though they never pick up? Or people who for some reason find it necessary to talk louder on their cell phone than they do on their land line? Or the cluster of cell phone users near the window (where they can get reception), all chattering away with a finger jammed in their non-phone ear? Hmm...is this a bigger deal than I thought? Maybe this should be the first article in a series on cell phone abuse in the workplace, since this is clearly one of the burning issues of our time.
Apparently, e-mail makes us dumber. A study done in the UK shows that those who are constantly distracted by e-mail, phone calls, or text messaging lose more IQ points than regular pot smokers. The "constant shifting of concentration makes the brain more tired and less focused," and therefore slightly impaired. It seems the IQ loss is temporary, though the article provides no details on how long it may take to recover.
I like to think of myself as someone who deals effectively, more or less, with distractions at work, but these results don't really surprise me. The constant intrusion of e-mail reduces my efficiency, since I always have to retrace at least a few mental steps when I return to whatever I was working on before the e-mail arrived. Most of us probably think we are accomplished and efficient multitaskers, but maybe we would be better served by concentrating on one thing at a time. I could try it, but first I have to read my new messages...
The Library of Congress has announced the 2004 additions to the National Recording Registry. The Registry, created by the National Recording Preservation Act of 2000, is tasked with preserving the recorded cultural heritage of the United States. It identifies the best existing versions of recordings on the registry and works to preserve and provide access to them. To qualify, recordings must be culturally, historically, or aesthetically significant, and at least 10 years old.
This year's additions to the Registry run the gamut from jazz and blues, to gospel, to pop, classical, and hip-hop. The Registry also includes historically significant recordings of speeches and broadcasts. Some of my favorites from this year's group: Ma Rainey's "See See Rider Blues," Fats Waller's "Ain't Misbehavin'" performed by the composer, and Tom Lehrer's "Songs by Tom Lehrer." The registry's chronological range increases by almost ten years due to the inclusion of Public Enemy's "Fear of a Black Planet" and Nirvana's "Nevermind." As usual, all of the choices are absolutely unimpeachable: the musical selections are generally regarded as both historical and artistic milestones, and the speech and broadcast selections document historically important events and/or personalities. Check out the full Registry and see for yourself.
Or maybe you'd rather nominate a recording you think has special artistic and historic value. Public nominations for 2005 additions are open until July 15, and there's even a convenient online form.
I eat more than my share of fast food, but I never eat Wendy's chili. Apparently, this has been a good choice on my part: a Wendy's customer in San Jose, California found a human finger in his cup of chili yesterday.
I tend not to be too grossed out at the occasional reports of insects and random animal parts turning up in food. Yes, it's disgusting when this happens, but I figure I consume various unsanitary and queasy-making things all the time without knowing it. And usually, when these things are publicized, the presence of an offensive bit of detritus can be explained by negligence in processing, or something. What gets me about this particular story is that I can't come up with a logical explanation for how a human finger could have ended up in that batch of chili -- and apparently, neither can Wendy's or the local authorities. Ewww.
Dr. Dregs finally got to enjoy one of his Christmas gifts last night -- concert tickets to a show featuring Guy Clark, Joe Ely, John Hiatt, and Lyle Lovett at the State Theater. We weren't sure what to expect, having bought the tickets because Dr. Dregs is a huge Lyle Lovett fan (both of us are only moderately interested in the other three). It turned out to be well worth the money: the show was mellow and funny -- just four guys with their acoustic guitars, playing together because they thought it would be fun.
The stage was set very simply: four chairs in a row, with small tables holding water bottles behind the chairs. All four musicians came onstage together and took their seats. They took turns performing (in "alphabetical order," they claimed, with Clark first and Lovett last), moving through six sets with a song by each musician, with one or two of the others occasionally dropping in with a vocal harmony or a guitar solo. At the end of the show, all four did a couple of Woody Guthrie tunes together: "Ain't Gonna Be Treated This Way," and "This Land is Your Land."
The simplicity of the presentation -- just the performers and their guitars -- put the focus strongly on the quality of the songs and performances. All four showed how accomplished they are as solo performers, with Lovett and especially Hiatt turning in particularly fabulous interpretations of their songs. But the pure emphasis on the songs really turned the evening into a celebration of great (and often underappreciated) songwriting. Lovett mentioned early in the show that he considered Clark, Ely, and Hiatt to be his "songwriting heroes," and there was a special complementarity among all of the songs that enriched the whole performance even further.
In my experience, this kind of thing happens all too rarely in concerts by big name stars. While the performances may be dazzling in both production and execution, such concerts are often infused with a sense of fatigue for the material -- if not for performing itself. There was none of that last night; Clark, Ely, Hiatt, and Lovett all seemed genuinely glad to be performing, and the crowd knew it. The Strib reviewer called the show a love-in between performers and audience, which strikes me as just about right. It's proof that audiences can be trusted to appreciate substance over flash.
Dr. Dregs and I ask ourselves this all the time, usually when we're consumed with a sudden, inexplicable, overwhelming urge to buy, watch, or listen to something (which has often been heavily advertised).
Because as thirtysomething professionals with disposable income (not that we have too terribly much of that -- after all, we work for the University), we are apparently one of the prime demographic groups that retailers/advertisers/broadcasters, etc. want to attract. Sometimes what's going on is way too obviously a series of cynical marketing ploys to encourage us to spend more money on crap we don't need, to take part in some nebulous "lifestyle" that winds up having about as much actual relevance, resonance, and meaning as the dust bunnies under my desk.
So although it sounded like a promising concept, I was very suspicious of MPR's plans for 89.3, the former WCAL, which they purchased last November from St. Olaf College under some protest from WCAL's fans. MPR announced not long after the sale that they planned to turn 89.3 into an eclectic pop mix of independent music and classics. They were doing this, they said, in order to attract a younger group of listeners to public radio, throwing in the rather patronizing idea that 89.3 listeners would eventually "mature" and start listening to 91.1 (news) and 99.5 (classical).
That rubbed me the wrong way, in part because of the implication that 89.3's listeners needed to wait for their tastes to evolve, and in part because I already listen to a lot of MPR (both 91.1 and 99.5). But 89.3 sounded promising enough (and close enough to my eclectic tastes) that I had to give it a chance. The new 89.3 ("The Current") began broadcasting at 9:00 this morning.
Well, it's ten hours into day one, and I have to admit that this is some fabulous radio. This is the radio station I've been waiting for my entire adult life. Listening to it actually makes me happy. Lots of indie rock, some local music, with the occasional genre-crossing classic thrown in (Ray Charles, Billie Holiday, Frank Sinatra, Louis Prima...you get the idea). Lots of stuff I've never heard by bands I know a little, and lots of stuff I don't know at all, but which excites and interests me more than anything I've heard on commercial radio in ages. All of this, and practically ad-free -- the typical MPR "sponsorship" messages are models of unobtrusiveness, compared to the usual run of ads on commercial radio. See for yourself: here someone has helpfully compiled 89.3's playlist from 9-5 today. Great stuff.
I used to dutifully pay up my annual membership in MPR. I let it lapse a few years ago, as the price crept up, and it seemed less and less like MPR really needed my feeble contributions. But if my money will help keep 89.3 on the air, I'll sign up again. If they keep up what they've done today, it will be worth every penny. Are we being targeted? Who cares, when it sounds this good?
Idly browsing the headlines at CNN.com, this got my attention:
New Findings Change Thinking on Human Sacrifices
Disappointingly, this isn't about cutting-edge research showing that human sacrifice is a wholesome pastime for the whole family. Apparently, archaeologists have gathered much evidence to support the claims of the Spanish conquerors regarding the extreme brutality of human sacrifice as practiced by the Aztecs and Mayans. Interesting stuff, on which I can't really comment intelligently. But I'm grateful to CNN for coming up with such an effective headline. Because of it, I've learned something.
For reasons I don't quite understand (and am a little frightened to probe), I have long been both fascinated and amused by etiquette, its practice -- and especially the lack thereof. I have passed many a happy hour browsing the stories of astonishingly bad behavior and taste over at Etiquette Hell. I also occasionally check out the etiquette column "Ask Elise" at Indiebride, a site I discovered shortly after my own wedding which offers a welcome departure from the elaborate sticky-sweet and slightly creepy advice on other wedding websites.
Dropped by "Ask Elise" today for the first time in a couple of months, and along with Indiebride's typical episodes of overeducated brides thinking much too hard about a minor point of etiquette was this letter, from a woman who obviously comes from a very different world than any I know well.
The woman's complaint (she signs herself "Money Trouble") begins with what appear to be standard-issue complaints about her future in-laws: they give chintzy gifts, they don't do things like her parents do, and so on. But the letter rapidly turns into something else, in which the writer castigates her in-laws-to-be for not giving presents in kind: a gift of tickets to Hawaii (from Money Trouble and her fiance) is met with a cheap and tacky mug. You get the idea. Things really get rolling in the third paragraph, where MT reveals that in return for paying for the wedding, her family expects to receive lavish gifts from the groom's parents -- as well as a significant chunk of change (MT estimates around $30K) so the newlyweds can buy a home. This despite the fact that MT's future in-laws are obviously unlikely to spend anywhere near that amount of money, even though MT believes they could afford it (MT's fiance, incidentally, thinks the most they could expect from his parents is about $5,000. Obviously, that will never do). MT doesn't quite ask the advice columnist outright how to convince her fiance's parents to cough up the big bucks, but she comes pretty close.
Elise's response is a model of thoughtfulness and tact. She charitably ascribes MT's (to me) extravagant expectations to cultural differences. Still, she can't help but lecture a little bit: "There is simply no way to politely tell your future in-laws that they must make a thirty thousand dollar investment in your home, or even a five thousand dollar one. At bottom, you would do well to relax your expectations."
I'll say. If anyone had given us a wedding gift worth more than a couple hundred bucks, we would have been flabbergasted by their generosity. Our families are not wealthy, but I just can't quite get my head around the expectation that you'll be receiving $30,000+ wedding gifts. I don't doubt that many wealthy people do give as much, particularly to their children. But to expect it! There's one headspace I've never visited.
This portrait didn't hold much interest when its subject was known only as a random dead white 18th-century guy. But things have changed. The Berlin art gallery that has owned the painting since 1934 has just announced that experts have identified the subject as Mozart.
It gets even better, though. Not only is this a portrait of Mozart, it's very likely the last portrait of Mozart to be painted before his death. The painting supposedly dates from Mozart's last visit to Munich in October 1790. And as anyone who recalls the English version of Falco's cheesy 1986 megahit should know, Mozart died in December 1791 at the tender age of 35.
Am I imagining things, or does he look slightly unbalanced in the portrait? He also has a haggard look around the eyes, which makes him appear much older than 34. Mozart's financial situation ranged from unstable to dire during the last few years of his life. Perhaps that was taking a toll -- along with too many other indulgences?
I've been meaning to post something about Christmas music, and now that ASCAP has posted their 2004 list of the top 25 most-performed holiday songs, I have the perfect opportunity. Naturally, this list does not include traditional carols, or any songs that are in the public domain, so it's not exactly what it claims to be.
Still, it is a good rundown of songs written in the last 60 or so years that have become holiday classics (for better or for worse). One of my favorites, "Have Yourself a Merry Little Christmas," finds itself right near the top of the list, at number 2. The best thing about the song is the way the lyrics manage to be moving without saccharine sentiment but with a slight edge of sadness. Also a pleasant surprise are the song's aspirations, which are refreshingly small-scale for a Christmas song. These qualities are emphasized in James Taylor's lovely rendition, which is so sincere and tentatively hopeful, it'll break your heart. And of course, the melody is unimpeachably lovely. This is a song well deserving of its classic status -- though perhaps not so deserving of being "interpreted" by the likes of Barry Manilow, Christina Aguilera, and Kenny Loggins. Oh, well.
My least favorite songs on the list are "Jingle Bell Rock" (no. 10) and "Rockin' Around the Christmas Tree" (no. 16), both of which have always struck me as inane and completely devoid of either lyrical or musical merit. I can almost tolerate Brenda Lee's classic 1958 version of "Rockin' Around the Christmas Tree," but I despise "Jingle Bell Rock," and won't listen to any version of it if I have a choice. Almost as bad is Sir Paul's insipid "Wonderful Christmastime" (no. 22), the enduring popularity of which I will never understand.
The list is full of songs that either inspire violent or ill feelings in me, or that have been ruined for me by bad covers and/or unfortunate commercial uses. Topping this list is "Feliz Navidad," (no. 13) which has been wrecked for me by years of Taco John's ads (maddeningly, they still use it). Whenever I hear "Silver Bells" (no. 11), its plodding rhythms and moribund tempo make me think it's never going to end, and I've heard far, far too many disastrous covers of "Santa Baby" (no. 24) in which grown women feel the need to make themselves sound like petulant 10-year-olds. Actually, the whole song has kind of a skeevy feel to it: a woman, pretending to be a little girl, implies that she'll trade sexual favors for expensive gifts with "Santa," who is some weird combination of father figure and lust object. Ick.
But there are a few other songs on the list I like a lot: "White Christmas," (no. 5) "Little Drummer Boy," (no. 9) "A Holly Jolly Christmas," (no. 18) and "Sleigh Ride." (no. 12) Dr. Dregs, who can't stand either of the latter two, claims that I only like "A Holly Jolly Christmas" because of childhood brainwashing resulting from repeated exposure to Burl Ives's snowman character in the classic stop-action animated Rudolph the Red-Nosed Reindeer (which is celebrating its 40th anniversary this year, by the way). As to "Sleigh Ride," I can't really defend it, except to say that Leroy Anderson had a mastery of over-the-top cheesiness that has rarely been matched by any composer.
A couple of my favorites are missing from the list: "Do You Hear What I Hear?" and "We Need a Little Christmas." "We Need a Little Christmas" has even more melancholy edge than "Have Yourself a Merry Little Christmas": For I've grown a little leaner, grown a little older / Grown a little sadder, grown a little colder / And I need a little angel sitting on my shoulder / I need a little Christmas now. Like "Have Yourself a Merry Little Christmas," this pretty perfectly captures the mixture of despair and hope that characterizes the modern holiday season. As for "Do You Hear What I Hear?" (which has also had more than its share of truly appalling covers), I only really like it in the classic Bing Crosby version, complete with bombastic orchestral and choral flourishes -- they just suit the song's beat-you-over-the-head-with-hope grandiosity. (However, I wholeheartedly recommend against the version featuring Rosie O'Donnell and Sesame Street's Elmo. Yikes.)
Doc Dregs can speak for himself, but I'm pretty sure that he just hates most of these songs, probably due to years of Christmas-season retail incarceration. But I like many of these songs despite the many ways in which they've been desecrated, recycled, and tiredly repeated over the years. That's one sign of a true classic: you can still stand it even after it's been abused and transformed nearly beyond recognition.
With me stressed out and working all the time, and Dr. Dregs coming up on the end of the semester, he and I have needed a little quiet relaxation. So we've spent the past two evenings relaxing at home watching movies. Two nights, two movies, and boy howdy, they couldn't be more different.
Last night we watched the eagerly anticipated The Librarian: Quest for the Spear. We were not disappointed. Or to put it more exactly, we were not disappointed since the movie was, in fact, just as laughably bad as it promised to be. A few highlights:
Anyway, some seriously bad stuff going on there. We were moderately amused (when we weren't trying to control our gag reflexes), but I can't help but see this as an opportunity lost. If this movie had half of the humor and wit of, say, a second-tier episode of Buffy, it might have approached campy greatness. Larry tells me there's talk of a sequel, so maybe the writers and cast will have another chance to reach that exalted place.
For something completely different, tonight we watched Errol Morris's The Fog of War, his stunning film about Robert McNamara. Whatever McNamara's naivete and mistakes during Vietnam, it's hard not to sympathize with the man in the documentary, who comes across as principled, tortured by his conscience, and who clearly grasps the enormity of the tragedy caused by many of his decisions. Factual inaccuracies aside, McNamara's account is fascinating, moving, and really brings home how easily even those with both intelligence and good intentions can fail. And then I fired up my web browser to this piece of news about Iraq.
"Those who do not learn from history are doomed to repeat it..."
So, surely you've all heard about the 10-year-old grilled cheese sandwich supposedly bearing the image of the Virgin Mary that so recently sold on eBay (for 28 grand, no less). The question naturally arises, if the sandwich is really 10 years old, why isn't it moldy? Divine intervention? As it turns out, there is a scientific explanation: mold doesn't like the trans fats in margarine or the calcium in cheese, so a grilled cheese sandwich made according to traditional methods stands a better chance than most other bread-based food items of passing through the years mold-free.
But wait, there's more: apparently, Hello Kitty has also chosen to make a miraculous (if burnt) cheese sandwich appearance. Of course, the nonbeliever in me can't help but dismiss this sandwich as the product of a Hello Kitty toaster -- but I guess my lack of faith is just something I'll have to work on.
My real concern here is that I eat grilled cheese sandwiches often -- as frequently as once a week. Has a divine or supernatural being been trying to communicate with me via toasted bread, and I just haven't been paying attention? Clearly I need to start carefully examining my grilled cheese sandwiches before I eat them. Remember Richard Dreyfuss and the mashed potatoes in Close Encounters of the Third Kind? It's so obvious to me now: food can be a vehicle for all kinds of unexpected, life-changing messages. "Watch what you eat" is starting to take on a whole new meaning!
Nice story in today's NY Times about the rise, fall, and tentative resurrection of First Avenue. My favorite thing: the way the article depicts Mayor R.T. Rybak's primary claim to fame as having stage-dived at First Avenue.
Rybak also gets the best quote in the article. He compares First Ave. to the Guthrie and the Minneapolis Institute of Art, saying "Minneapolis is at its best when it doesn't try to imitate anybody else." This is true of most cities, I think, which are almost always better off when they simply embrace their character and native institutions, warts and all, and develop an identity based on that. Rybak is right: things like First Ave., the Guthrie, and MIA are why we're not just "a cold Omaha."
As I wrote last week, I began a news and media boycott after the election. It was necessary so I could maintain my sanity and not succumb to an overwhelming sense of fear and despair. Although the immediate pain and worry have mostly passed along with the violent spasms of postmortem commentary (self-flagellating or self-satisfied, depending on which side the pundit is on), I find that I'm still reluctant to re-engage with news, television, and the blogosphere.
The result is that I'm feeling calmer and more focused than I have in a while. This is not to say that I'm focused on the right things: I've been spending a lot of time reading (What I Loved, linked over there on the left, which is turning out to be a beautifully written and mind-expanding, though melancholy novel), and playing Hordes of the Underdark -- pure escapism, on both fronts.
But losing myself in fantasy and staying out of touch with the events of the moment is therapeutic, especially after the stress and excitement of election season. It's good to take a little break from the burden of keeping myself informed. Although it's constantly under siege by my sense of futility, the civic ideal of the well-informed citizen is too deeply ingrained for me to give up on it entirely. I'll come back to politics, following the news, and thinking about the issues eventually. But for now, I'm going to continue my little news blackout.
The great state of Texas has forced textbook publishers Holt and McGraw-Hill to change language in health textbooks for high school students regarding marriage. Prior to the changes, the textbooks used gender-neutral phrases like "individuals who marry" and "married partners." The Texas Board of Education is insisting that the textbooks strictly define marriage as occurring between a man and a woman, and the parties involved as husband and wife.
Although this story is interesting (if disheartening), it's merely a minor example of an ongoing outrage. Texas, as the second-largest purchaser of K-12 textbooks in the nation, has a huge influence on the content of those textbooks. A little over a year ago, I read the excellent The Language Police: How Pressure Groups Restrict What Students Learn by Diane Ravitch, an education historian at NYU. In it, Ravitch exposes how interest groups from the left and the right and state boards of education have forced textbook and standardized test publishers either to adopt the blandest possible language in many cases, or specifically to include language that interest groups see as supporting their agendas.
I don't think enough people are angry about this, so I strongly recommend The Language Police to anyone interested in how children are educated. Not convinced? Visit the link above, and read the excerpt there from the book's first chapter, which gives some bizarre, almost unbelievable examples of reading passages that were removed from a standardized test because of bias or lack of sensitivity. You'll want to read the whole book to learn how pervasive these practices are.
Most of my pop culture indulgences are not on television, but the WB's Gilmore Girls is an exception. Just in case you've somehow managed not to see the show or hear about it over the last four years, it's about two ridiculously attractive and intelligent women -- a young single mother and her teenage daughter -- who live in a picture-perfect Disneyesque Connecticut town. There, they tangle wittily and sometimes melodramatically with each other, quirky townspeople, and wealthy but cranky family members (who nevertheless are kind enough to give them loads of money when circumstances warrant).
The show is not, to use a phrase of the moment, reality-based. It has been lauded by critics for its lightning-fast, erudite, and clever dialogue. The scripts are peppered with enough pop culture and middlebrow references to induce a seizure, but it works, because the lines trip beautifully off the tongues of the perfectly cast actors. The show is also populated by characters who, even though they aren't always original, provide enough twists on their archetypes to keep the show consistently entertaining.
Those are all perfectly good reasons to like the show, but Dana Stevens, writing in Slate, picks up on something else that explains the show's appeal: it's outrageously bookish. The characters, especially Rory (the daughter), read voraciously -- and the classics, no less. But even more unusual is the way the characters really think about what they read, and weave it into their understanding of the world. Books have a centrality in Rory's life that reminds me of my own experience throughout high school and college (though I was certainly nowhere near as well-read as she is at 19). The characters filter their life experiences through the literature they have read in a way that only people who are really obsessed with books do -- and that speaks to me in a way that most other TV shows and movies never can.
Although I didn't love the novel enough to re-read it, there are a few passages from Jonathan Franzen's The Corrections (winner of the 2001 National Book Award) that have stuck with me. One of my favorites follows:
It had started as a family joke: Dad always orders the mixed grill in restaurants, Dad only wants to go to restaurants with mixed grill on the menu. To Gary there was indeed something endlessly delicious, something irresistibly luxurious, about a bit of lamb, a bit of pork, a bit of veal, and a lean and tender modern-style sausage or two -- a classic mixed grill, in short. It was such a treat that he began to do his own mixed grills at home. Along with pizza and Chinese takeout and one-pot pasta meals, mixed grill became a family staple. ... before long, Gary was doing mixed grill two or even three times a week, braving all but the foulest weather on the deck, and loving it. ... He loved it and loved it and loved it and then all at once he didn't. ...
On the deck, in the radiant heat, as he blackened the prawns and seared the swordfish, a weariness overtook him. The aspects of his life not related to grilling now seemed like mere blips of extraneity between the poundingly recurrent moments when he ignited the mesquite and paced the deck, avoiding smoke. Shutting his eyes, he saw twisted boogers of browning meats on a grille of chrome and hellish coals. The eternal broiling, broiling of the damned. The parching torments of compulsive repetition. On the inner walls of the grill a deep-pile carpet of phenolic black greases had accumulated. The ground behind the garage where he dumped the ashes resembled a moonscape or the yard of a cement plant. He was very, very, very sick of mixed grill.
In the book, this passage is a not especially subtle symbol for Gary's marriage and family life, which is falling apart. But what I've always loved about the passage (aside from the hilarious phrase "broiling of the damned") is how effectively it captures the counterintuitive truth that excessive indulgence in something almost always leads to loathing of that thing.
I've experienced this in my own life with food, music, books, movies, and games. It happens so mysteriously: one day, I love a certain song or piece of music more than anything else. I want to hear it constantly. I listen to it over and over again. Then, suddenly, without warning, its appeal is gone. I might listen to it again after some time has passed. I might even appreciate it. But I will never again experience that intense, almost obsessive, craving to hear it.
What causes us to reach that saturation point? I haven't noticed any pattern to how long it takes the object of my devotion -- whether it be edible, musical, or otherwise -- to lose its magic. But it always happens eventually, and without warning. Maybe it's some sort of instinctive regulation, an enforced moderation, for the purpose of self-preservation. I can't explain it. But I do know I have to mourn a little each time this happens to me.
Do watching the news and surfing the Internet sometimes make you feel like you're trapped in a planet-sized echo chamber? Well, it does me. Media trips, a site devoted to creative remixes of cultural and media "content," might make you feel even more that way, but at least you'll be entertained.
A couple of times in the last few weeks, I've almost blogged about this survey, which stresses the importance of writing skills in the professional workplace, and the lack of those skills among many employees. I haven't written about it up to now because neither conclusion is particularly surprising.
But this silly little column, which offers some tips for improving job-related writing, compels me to say something. Recommendations include relying on lists, keeping paragraphs to a maximum of two lines, and leaving out background information. The columnist does say that her tips are most relevant for e-mail and presentation writing. Still, taken as a whole, her suggestions comprise a near-perfect method for reducing complex thinking about complex issues to easily digestible, PowerPoint-friendly soundbites. This isn't always a bad thing, but taken as a way to improve writing overall, it depresses me. Obviously, work-related writing is not the appropriate venue to express your own character and style. But eliminating the writer's voice completely ends up making this stuff even more unbearable to read or sit through.
Some of her advice is actually good, such as the recommendation that any "finished" piece of writing should have its length reduced by 10% before it's ready for prime time -- there's a pointer I should take to heart and exercise more often. But why reinvent the wheel? There's nothing worthwhile here that Strunk and White didn't already say more eloquently. I think I'll stick with them.
Slate today has a good analysis of Starbucks's decision to raise prices on its coffee drinks. The writer concludes that it's a savvy move: even though Starbucks coffee is already pricier than almost anyone else's, it's also higher in caffeine -- so you get more bang for your coffee buck.
Unlike many, I don't think Starbucks is completely evil. I freely admit that I often get coffee at Starbucks, for two reasons: the quality of the coffee is consistent, and Starbucks is supremely convenient in terms of both locations and hours.
Starbucks is never the best coffee in the world, but it's extremely predictable. Either they have some fabulously consistent method of training baristas, or else their system is such that it's impossible to completely screw it up. It doesn't matter whether you're in an airport, a grocery store, a mall, or on a city street corner: your Starbucks latte will always taste the same. The brewed coffee displays the same consistency: a cup of Starbucks almost never surprises me in terms of strength, freshness, bitterness, or acidity. The only other coffee chain that I have found to be as consistent is Dunn Bros., which I adore, and would frequent more if their locations were more convenient for me.
Convenience is a big coffee issue for me. In our neighborhood, we have several cute little coffeehouses that make decent coffee, and which have atmospheres far funkier, more welcoming, and friendlier than Starbucks (or any other chain). But they are never open in the evenings, which is usually when I'm looking for a place to relax with a book and an espresso (is "relax with an espresso" an oxymoron? Hmm). Starbucks outlets tend to be open late. It depends on the neighborhood, of course: Uptown doesn't lack for late-night coffee. But my sleepy South Minneapolis neighborhood, inhabited largely by retirees and couples with small children, turns in early for the most part. Coffee after dinnertime is not a popular option.
Whenever a better choice is available, I'll take it. But Starbucks is everywhere, always open, and can be trusted. Doesn't make me thrilled to pay more for their mediocre beverages, but when that's the only choice...wait, didn't I say I needed to stop drinking coffee?
This is the best idea I've seen in PC design since the original iMac. I'm especially fond of the panda. Leave it to the Japanese...
Well, Montreal was lovely: the weather was mostly gorgeous, the conference was worthwhile, and the city itself is a great place to spend a couple of days -- especially if you want to practice your French! Since I spent most of my time in meetings, I didn't have a chance to see many of the sights, but we did have enough free time to do a few things:
This was my first trip to Montreal, but I hope it won't be my last. I have to take Dr. Dregs there, at the very least, so he can experience the CineRobotheque for himself.
My department reorganized in July. Since then, most of us have been figuring out workflows, negotiating relationships with new supervisors and teammates, and learning how our jobs have and haven't changed. We've also been under a directive from our department head to make a concentrated effort to clean up and empty out our workspaces, since a plan is in the works to reconfigure our little warren of cubicles to reflect our new organizational structure. The idea is that everyone should expect to have to move to a new location within the next six months or so.
The master plan for reconfiguring our space was finally unveiled today. Some of the changes are fairly radical, so it's a bit of a challenge to envision what it will be like once the upheaval is finished. My co-workers are greeting the news with either cautious optimism or trepidation, depending on how much they like their current location and what they think of their proposed relocation. I am cautiously optimistic: if the plan is executed as advertised, I will finally have adequate space for the materials I work with. I am also one of those lucky enough to be moving to a space where I will have fewer distractions to cope with.
But others are not so happy, worried that they'll lose necessary storage space, a quiet enough location to work without constant distractions, or simple creature comforts they've had to fight for in their current locations. It seems like it shouldn't be a big deal, but it is: most of us are deskbound upwards of 80% of the time. For many of my co-workers, that proportion approaches 95%. So it's critical to our productivity and mental well-being that we're as comfortable as possible in our workspaces (which, of course, have always been far from ideal). It's always been an uphill battle: few of us feel like we have enough space, we constantly fight filth (our floors are cleaned on a roughly biennial schedule), and it has often been hard to come by small things like a filing cabinet or an extra bookshelf. In other words, resources are scarce, so people tend to guard what they have with some vehemence. We also tend to be suspicious of anything that might upset our hard-won control over our workspaces, so a big move like this is bound to be especially traumatic, even for those of us who are satisfied with what is proposed.
Of course, one of the overarching purposes of such a major move is to uproot people from their zones of comfort (both physically and mentally), thereby forcing them to rethink how the organization does things (and, with any luck, encourage innovation). Another huge benefit will be the cleaning and reorganization of all of the stuff that surrounds us in our current drab (I would even say depressing) environment. The "shiny and new" character of the reconfigured workspace is pretty exciting, and is hopefully something that everyone can be enthusiastic about. It all makes for a complicated dynamic, and points up how risky and difficult change can be.
Here's the first review I've run across for The Sims 2, which sounds like an improvement over the original game in just about every dimension. I more or less lost several months of my life a couple of years ago to the management of my engrossing and often bizarre virtual families. I knew I had to quit when I started dreaming about my Sims. So I'm thinking it's just as well that a Mac version of The Sims 2 is "planned" but no date has yet been set for its release.
For me, this amounts to a nice side benefit to sticking with Macs: greatly reduced gaming distractions. Most of the very best games make it to the Mac eventually, which works out just fine for me. I enjoy games, and have even been known to play the occasional game obsessively for a period of time. But that's because I'm weak. Owning a Mac means I don't have to exercise too much personal willpower to resist whatever the cool game of the moment is. My choice of platform makes the temptation moot, and instead I can do something constructive with the time I would have spent gaming. See, Macs really do make you more productive!
A new study presents the (not-so) shocking conclusion that depending on the context, using the f-word at work can be a good thing. Researchers found that use of the expletive built solidarity within a work team in a New Zealand factory.
Not that this can be taken as license to let your filthy mouth run wild at work. The abstract and the article itself argue (unsurprisingly) that how people interpret any given expletive depends greatly on norms in a workplace community. In other words, just because this works in a New Zealand soap factory doesn't mean you should try it at your next meeting. But the larger point is thought-provoking: behaviors that are widely regarded as impolite or inappropriate may be exactly the opposite in the right situation.
Oh, and if you do decide to swear a blue streak at your next meeting, let me know how that goes.
I wasn't going to say anything about 9/11 until I saw that Selling Sno Cones at the Beach felt the same way I did: what is there to add to what has already been said? But I did have kind of an odd experience yesterday, so I'll share it.
I was at the 9/11 tribute at Lake Harriet last night, not because of some overwhelming sense of patriotism, but because John was playing in the orchestra. I missed the beginning of the concert since I was out on my bike having a lovely ride around Lake Harriet, Lake Calhoun, and Lake of the Isles, so by the time I sat down to listen, I couldn't get anywhere near the bandshell. So I chose a spot at the edge of the lawn where I could still hear and settled in, alternately listening and reading a book until the light went completely.
About an hour into the concert, I was approached by a guy toting a camera and a bunch of other equipment. He said he was from Channel 5, and would I be willing to say a few words on camera about my thoughts and feelings on the day? I turned him away, since I didn't think I really had any thoughts -- profound or mundane -- on the day (besides, I looked terrible, having just been on the bike for over an hour). So TV guy moved on to the people sitting next to me, an attractive family with two young children who were happy to speak their minds on camera.
I thought about it, though, after the TV guy left. The weirdest thing about it was how normal everything was. It was a concert in the park on a beautiful night, just like any other summer Saturday at Lake Harriet. As usual, passing bikers, skaters, and dog walkers would pause to listen for a little while before going on. Kids ran and played in whatever bits of space they could find around the outskirts of the crowd, occasionally shushed by their parents when the music got quiet. Strangest of all, jets passed overhead on their landing approaches every couple of minutes, and nobody took the slightest notice. Suddenly I became hyper-aware of those jets -- what if one suddenly crashed into the bandshell? I knew that one wouldn't, but I kept imagining it over and over, trying to force myself to comprehend in any real way what it must have been like three years ago for those people in New York and Washington.
I remember being terrified three years ago, even way out here in the middle of the country. I remember agreeing with the conventional wisdom that nothing could ever be the same, because it seemed so obvious at the time. But here we are, the horrors of 2001 having receded (at least for those of us who only saw them on television), and things are the same. Our daily lives haven't substantially changed because of what happened. Sure, it takes a little longer to get on an airplane, but that's minor. We have to go on as we always have, partly because we haven't been given a reasonable alternative, and partly because to do anything else is almost as unthinkable as another attack the magnitude of 2001's.
I had these thoughts, and felt guilty and unpatriotic for having them. But then after the concert, John and I grabbed some food at Famous Dave's in Linden Hills. The server noticed John's t-shirt, emblazoned with the 9/11 tribute logo. She asked him what that was all about, and then said, "You know, it's so weird how far away that feels. Everything seems so normal now, you can't even remember what it was like that day." And I felt better, because I realized that I wasn't the only one experiencing that sense of disconnection from how I felt that day and during the weeks afterward. Is it just the passing of time that causes these rifts, or is there something else going on here?
Exciting news for those of us who have fond memories of the old Infocom text-based games (not to mention anyone who thinks Douglas Adams died far too young): the original Hitchhiker's Guide to the Galaxy text-based game is being revived by BBC Radio 4 in time for their new series based on the book. The game will be available on Radio 4's website in late September. But it gets even better: the game will be enhanced with new illustrations by Rod Lord, who did the graphics for the original Hitchhiker's Guide TV series.
Hitchhiker's Guide to the Galaxy is one of only a few of my childhood favorites that still holds up for me. I only vaguely remember the text-based game (a friend of a friend had it), but I'm looking forward to it, especially since it was written by Adams himself. I also fondly recall Adams's late-90s game Starship Titanic, which was a lot of fun despite its bugginess on the Mac. Good stuff.
An intriguing urban living arrangement is described in this Strib article. With undeveloped urban space so scarce, many developers are building apartments and condos atop superstores like Best Buy or Pottery Barn. Even some supermarkets have condos rising several stories above them. Of course, this raises all sorts of red flags regarding the suburbanization of the city -- but aside from that, I simply wonder what it must be like to live in one of these places.
I confess, I've always been inexplicably attracted to those condo developments with a row of cute little shops and restaurants tucked in at street level. I imagine with pleasure the prospect of having a sandwich shop and a coffee bar (even if it's Starbucks) right there, mine to use basically as an extension of my home. But living above a mega-store doesn't have the same appeal. Although the basic concept is the same, the condo-supra-mega-store idea lacks the faux-charm and convenience that the small-store-plus-condo developments have in my mind. And especially with a Best Buy, a supermarket, or a nightclub below, it seems like noise would be a problem.
There is one big-box store that I have to admit I might enjoy living above: a bookstore such as Borders or Barnes and Noble. Media and coffee at my fingertips -- what more could I possibly want?
As long as we're on the topic of food, I have to say something about this Strib piece on why so many newly-introduced foods and beverages fail. Apparently, anywhere from 50 to 90 percent of new food products don't manage to attract enough market share, and wind up on the ash heap of supermarket history, as it were.
Two things made me want to comment on this. First, Dr. Dregs and I have a running joke about how any food product he likes ends up discontinued. I have always maintained that it's because he has an affinity for bizarre (if not downright disgusting) foods and beverages, but it seems that I should instead blame the great mass of consumers who are so habit-bound that they never try new products. Apparently, Dr. Dregs is to be commended for his culinary adventurousness. Or at least the processed-food manufacturers should hold him up as a model consumer!
Second, the Strib article specifically mentions that the new reduced-carb colas introduced by Coke and Pepsi (C2 and Pepsi Edge) are not selling as well as anticipated. This sets me off on something that has been a pet peeve of mine since these sodas first appeared in stores: they cost too much!
Allow me to explain (if you can stand it). I've tried C2, and like it. When I drink soda, it's almost always diet, to spare the calories and sugar. But on those rare occasions that I mix soda with alcohol, I splurge and drink the regular stuff. Captain Morgan and Diet Coke just doesn't work for me. So when C2 became available, it seemed like the perfect compromise as a mixer: tastes pretty much like regular Coke, but with fewer calories and much less sugar. Perfect! Or so I thought until I shopped for it: C2's price for an 8-pack of cans was the same as the 12-pack price for any other soda -- a 50 percent premium per can. Infuriating! And Pepsi, of course, prices Pepsi Edge at exactly the same premium over "regular" Pepsi and Diet Pepsi!
A small difference, you say? Nothing to get too excited about? Maybe you're right, but it's the principle of the thing. Coke launches this product, and wants us to buy it. They need to sell a lot of it right off the bat, and build a little brand loyalty. So what do they do? Price the soda attractively, or at least competitively with their other products? No, they figure they can gouge desperate low-carb dieters, who, they must assume, are so frantic for a taste of sugary cola that they'll pay extra for C2. Well, according to this article, the soda companies were wrong! Wrong! I feel so vindicated.
I don't have much to add to this excellent rumination on the versions of national anthems being used during the Olympics, all arrangements by Canadian composer Peter Breiner. For those of you who haven't been watching the games obsessively, Breiner's arrangement of "The Star-Spangled Banner" is a marked departure from traditional arrangements of the anthem: the tempo and the dotted rhythms are laid back, violins and other strings play a very large role (particularly in the second strain, "and the rockets' red glare..."), and there's quite a bit more chromatic interest in the bass line than is typical.
I have been complaining since the first medal ceremony I saw that included the U.S. anthem that it sounds lazy, worn out even -- definitely much too laid back, almost dirge-like. I do like the chromatic bass, but everything else about the arrangement drives me crazy. It seems to go on forever, lacking any sense of forward motion and climax. As I've written before, I don't think "The Star-Spangled Banner" is either a great piece of music or a great national anthem, but for an arrangement of it to completely avoid bombast seems intentionally perverse. "The Star-Spangled Banner" doesn't have much going for it to begin with; absent its usual brass and cymbal crashes, its weakness is even more obvious.
This is not to say that I think there's only one right "sound" for the anthem. I'm all for interesting new arrangements and interpretations of the anthem, if only because mediocre tunes can often transcend their pedestrian natures through innovative arranging. But Breiner's misses the mark for me. It relies too much on being the opposite of the conventional understanding of "The Star-Spangled Banner." A new interpretation of an overly familiar tune should cause little revelatory thrills. This just makes me want to take a nap.
I'm not sure whether this is a good thing or a bad thing. The University of Central Florida, in exchange for 10 grand, a few student internship opportunities, and a couple of lectures by production personnel, is letting the WB's new dating reality show, Big Man On Campus, film there. The show will star UCF students: one male (the "Big Man" of the show's title) and twelve female, who will compete for the heart of the Big Man. Standard-issue dating-show twaddle (the creator is also the creator of ABC's big hit "The Bachelor"), but there's something a little oogy about having it officially sanctioned by an institution of higher education.
UCF, chosen (well, duh) because of the attractiveness of its student body, is clearly receiving some benefit here, but the $10,000 to be applied to "academic programs" isn't even enough to pay an adjunct professor for a year, so I'm not sure how much those funds will raise the quality level of education at UCF. Personally, I think they should have held out for enough cash to hire an academic celebrity to serve as visiting professor of women's studies, who would spend the year teaching and lecturing on the ways that these crappy, sexist dating shows objectify, stereotype, and demean both men and women (not to mention the culture). But if I had to guess, UCF cynically accepted a token amount because of the publicity they'll get. I'm sure they think this will be an excellent recruiting tool, and the sad thing is, they're probably right.
But if I were a UCF administrator, I'd be made very uneasy (if not flat-out nauseated) by the WB's blurb describing the show:
On a beautiful college campus, in a beautiful setting, with lots of beautiful people, things will get very interesting. First, the school's most eligible women are going to pick the BMOC - the cutest, hottest, sweetest guy on campus. But then the tables will turn as he chooses his Campus Queen - from the very same co-eds who picked him. These girls will do anything to make the grade.
This AP article published in today's Star Tribune really leaves a nasty taste in my mouth. Unsurprisingly, many of the individuals sued (rightly or wrongly) by the recording companies for downloading music from the Internet are forced to settle in order to preserve their financial well-being, even when they might have a chance of winning in court. Settling usually means damages of only a few thousand dollars, rather than the astronomical expense likely to be incurred by anyone crazy enough to try taking on this particular Goliath at trial. Between court costs and attorney's fees, even a winning lawsuit would usually run the accused more money than simply settling.
I know there have been isolated cases in the past where a large corporation sought damages from an individual, but what's upsetting about this situation is the sheer number of people the recording companies are suing. A Boston district court judge is quoted in the article, saying, "I've never had a situation like this before, where there are powerful plaintiffs and powerful lawyers on one side and then a whole slew of ordinary folks on the other side."
It all makes my paranoid tendencies run a little wild. Is this a brave new world in which corporations have more legal standing than individuals? Even if that's not the intent of our laws, the move in that direction indicated by these lawsuits seems clear. I have little sympathy for those who download, copy, and distribute copyrighted materials without remorse. But is it really necessary to wreak financial hardship (if not ruin) on every person of whom the record companies have even the tiniest suspicion that they illegally downloaded a song or two? It's a horrifying prospect.
Here's a little something for those of you who can't stand to read another word about the Olympics. The Modern Language Association has a nifty feature on their website: the MLA Language Map, which uses 2000 U.S. Census data to show the number and concentration of speakers of thirty (yes, really thirty) different languages across the U.S. You can see the whole country, or zoom in on a state, county, or zip code. The Language Map also lets you make comparisons between different regions. I spent way too much time checking out which languages are spoken in my zip code, and was amazed to learn that at least one person in my zip code reported speaking each language, with only three exceptions: Hungarian, Russian, and Navajo. Which doesn't seem all that weird (after all, this is Minnesot-ah), until you consider that that means that someone in my neighborhood speaks Armenian, someone speaks Thai, someone speaks Greek ... you get the idea. Pretty cool.
Apologies in advance for the Olympics obsession. For those of you who couldn't care less, I'm sorry to say that there will probably be a lot of that the next couple of weeks. To echo my brother-in-law Rob's take on the political conventions: after all, the Olympics only happen once every four years! Okay, so that's not strictly true, now that the Winter Olympics alternate with the summer games every two years. Still, it's not a frequent event -- and the winter games (cool as they are) just don't offer the same kind of variety.
Anyway, the topic of today's obsession was raised by King Kaufman in Salon. Kaufman writes the by-now familiar diatribe against women's gymnastics and its dogged adherence to a "little girls in pretty boxes" standard for both aesthetics and athletics. Though the argument is not new, I strongly identified with Kaufman's take on it: the older you get, the less exciting and interesting women's gymnastics becomes. As I was watching the team finals last night, I wasn't really enjoying myself. I was anxious for the American team, but basically bored by the proceedings. Watching 85-pound girls hurl themselves through the air just wasn't as much fun as it used to be (even though I still marvel at how anyone can make their bodies do these things). Men's gymnastics, on the other hand, has been completely enthralling (and because of lower expectations for the U.S. men's team, much less nausea-inducing).
It isn't just the typical age and size of women gymnasts that makes their competition less involving than the men's -- it's also the unappetizingly leering vignettes and skeevy commentary that make the whole thing distasteful. For example, a slow-motion, soft-focus mini-portrait of Svetlana Khorkina (who, at least, is well over 18) that doesn't come right out and say anything useful about the key to her, um, appeal, but implies all sorts of yicky things. Or the mention of one member of Romania's 2000 gold-medal team who, after Olympic victory, chose to pose in less-than-completely respectable magazines. And how about the approving tone taken by commentators and reporters in discussing how the Romanian gymnasts (all 6 of them 18 or under) think of each other as family -- because they are forced to live together while training, not seeing their own families for up to a year at a time? Ick.
Contrast this with the male gymnasts, who are portrayed and discussed as athletes, even when a touching story is involved (like Blaine Wilson and his wife losing a child). No one plays up their childishness or vulnerability, and no one backhandedly sexualizes them. It isn't quite enough to make me abandon women's gymnastics, especially at the Olympics. But this year for the first time, I'm bothered enough to consider it. You know something is wrong when, as a gymnast begins her routine, commentators rush to point out that she's 17, even though she looks like a nine-year-old. It's time for some changes.
I'm not usually much of a sports fan, but I do love the Olympics. I love having the chance to watch so many sports that are rarely if ever broadcast, as well as seeing the sheer diversity of sporting disciplines to which humans dedicate themselves. Since coverage started in the U.S. last Friday, I've been spending far too much time with the television.
The most obvious choices for Olympics coverage are NBC's primetime broadcasts. They are a definite improvement over the 2002 Winter Olympics, with fewer sappy athlete profiles and a little more emphasis on the events themselves. Through the magic of TiVo, we can skip commercials, insipid interviews, and those remaining touching athlete portraits that NBC has decided it can't do without. I've also caught a bit of badminton, table tennis, field hockey, and volleyball here and there on CNBC and Bravo. The coverage of second-tier events NBC's doing on these channels has been refreshing, and mercifully free of the schmaltzy filler that I fear will always be a feature of the primetime broadcasts.
Of course, for better or for worse, the Olympics wouldn't be the Olympics without lots of inane (if not outright dopey) commentary, most notably provided so far by Al Trautwig, Tim Daggett, and Elfi Schlegel for gymnastics. The most pleasant surprise in this vein has been NBC's specially prepared high-definition Olympics broadcast, which runs 24/7 on NBC-HD. It sounds like an Olympics fan's dream -- until you realize that the HD broadcast is delayed an additional 12-24 hours after NBC standard-def primetime coverage.
Still, the HD coverage of marquee events like gymnastics and swimming is much more extensive, looks astoundingly good, is blissfully free of both ads and annoying filler vignettes, and miraculously, so far features much better commentary than what's in the standard broadcasts. In gymnastics, the difference is particularly dramatic, with former U.S. gymnast Shannon Miller providing intelligent and helpful commentary on the HD broadcasts. Miller is so much better than her counterparts on the standard broadcasts that I can only hope NBC will promote her up to the first string for the 2008 Olympics -- which with any luck, will be broadcast entirely in glorious high definition.
Here's a nice summary article from the Palm Beach Post about the trend toward "poli-tainment," under which general heading the author places everything from the Vote for Change tour to The Daily Show to The West Wing to Rush Limbaugh. The article makes the argument that the blending of entertainment with politics -- sometimes a little light on facts -- was pioneered by conservatives, with liberals only now really beginning to produce a lot of their own poli-tainment.
There's a lot of commentary and opinion about this issue right now, stemming from events like the recent announcement of the Vote for Change tour and lineup of artists. People on both sides of the political fence have argued that poli-tainment is a bad trend, either because celebrities outside politics should not use their position to push their political views, or because audiences for poli-tainment don't get the whole story. As I've previously written, I think the first argument is silly: celebrities ought to be able to address whatever issues they choose, so long as they're prepared for the potential consequences. As for the second point contra poli-tainment, I'm pessimistic (or realistic) enough to believe that some of those becoming "informed" through shows like The Daily Show, The West Wing, or even Rush Limbaugh are highly unlikely to seek out news and information through other means -- and some information is better than no information.
Also -- and maybe this is just postmodern Gen-X cynicism talking -- I'm not totally clear on the line between some of the things the article describes as poli-tainment and other things that purport to be respectable sources of news and opinion. Much of the content on Fox News comes immediately to mind, of course, but to be fair I would also include shows like MSNBC's Countdown with Keith Olbermann. Depending on your views and the stories of the day, these may be fun to watch, but they're the junk food of news: you'll probably learn enough to get by, but you're not doing yourself any favors by restricting your news diet to such sources.
I will, however, add the forthcoming book by former CNN editor David Mindich to my reading list. Tuned Out: Why Americans Under 40 Don't Follow the News will potentially provide some insight into whether the trend toward poli-tainment is actually something to worry about.
Here's a way of life that never would have occurred to me: home managing. For a fraction of market rent, attractive people with trendy furniture can live in fabulously expensive homes while their owners are waiting for a buyer to come along. The downsides are that you have to be prepared to move frequently, and you have to keep the house in perfect condition at all times, since a realtor might be dropping by with a potential buyer with only 15 minutes' notice.
I can see the attraction of living this way. A lot of houses have intriguing exteriors, or are in the perfect neighborhood -- or both. Such houses frequently make me wonder what it would be like to live in them, even though I know that I won't ever be able to afford such a lavish dwelling. Home managers get to live in these houses without being independently wealthy.
But, oh, those downsides. No pets. A requirement to be obsessively clean. Having to move three or four times a year. And I assume, no ability to customize or personalize your living space. No way to really put down roots in a place. It might have worked for me ten years ago (if I had had the financial wherewithal to acquire some good furniture), but now I shudder at the thought.
There are those who prefer to live more or less as nomads (hello, Lee and Faith!). This kind of thing is probably especially appealing to them. The catch is that in this version of the nomadic lifestyle, you lose out on one of its primary benefits: simplification, living only with necessities and maybe a few luxuries -- in other words, owning less stuff. Presumably that's not really sustainable when you have to keep enough furniture to make a 5,000 square foot house look good.
Rolling Stone documents yet another injustice perpetrated on gays and lesbians by discriminatory marriage laws. Apparently, U.S. copyright law does not allow artists, writers, composers, etc., to leave the rights to their creations to the person of their choice. As the article puts it, "No matter what an artist's intention, spouses, children and grandchildren, in that order, are the first in line to recapture the copyrights, followed by next of kin, executors and administrators." There is essentially no way for copyright owners to reliably pass along those rights with the remainder of their estates. The situation is such that some couples are resorting to adopting one another to ensure that copyrights end up with the person the artist intends.
This angers me from the perspective of basic rights and freedoms, but also because it's yet another demonstration of how wretched, contorted, and generally screwed up U.S. copyright law is. I despair of seeing it fixed in my lifetime, for all of the usual reasons: powerful interests have a lot at stake in keeping things the way they are, and Congress can't be bothered to grapple with the complexities of copyright in the digital environment.
As if trying to change copyright law weren't a daunting enough task, a law professor quoted in the article believes that fixing this would also require amending the Defense of Marriage Act. I can just see Congresscritters lining up to avoid that one! What a mess -- time for a few more "activist judges," I guess.
Sounds cool, no? Or at least worth checking into? Well, pPod is actually a guide to London's public toilets, using music, text, and spoken word to provide directions, reviews, and trivia -- not to mention musical accompaniment (the example given is, natch, Handel's Water Music).
Consider the endless possibilities: Debussy's La Mer? Hammerstein and Kern's Ol' Man River? TLC's Waterfalls? How about Frankie Goes to Hollywood's Relax? Okay, maybe not.
As basically silly as pPod is, it is a pretty neat demonstration of the uses to which an iPod can be put. I'd love to have downloadable, interactive, iPod-based guides to any number of cities, museums, and attractions. Who knows? Maybe the sewers of New York will be next!
Why does any marriage make men healthier, but only happy marriages make women healthier? This article from WebMD examines the question (the study and the article were published nearly a year ago -- I'm a little bit behind).
It makes intuitive sense that a happy marriage (or close partnership of any kind) would benefit the health of both parties. But men's health improves even in unhappy marriages, while such marriages have a negative impact on women's health. A Boston University psychologist speculates that marriage's effect on men and women differs because a) men are less sensitive to trouble in a marriage and b) women are generally more supportive partners, at least in part because women learn from birth that one of their major roles is to be supportive of friends, children, and spouse.
Is it just me, or does that not paint a very attractive portrait of the average man? Anecdotally, I have no sense of how true those suppositions might be. I don't see either a lack of sensitivity or supportiveness in any of my male friends. On the other hand, I've certainly heard a lot of second- and third-hand accounts from friends that tell the same old story of male ill behavior. I wonder if anyone has studied perceived levels of sensitivity/supportiveness relative to the health of married people.
I also wonder what the health impact is in marriages/partnerships between men. My sense (I'm totally making this up, so beware) is that one partner or the other is likely to assume what would normally be thought of as the "wife" role. Does that mean that the health of the "wife" partner in a gay marriage suffers if the marriage is unhappy? Or do the observed benefits to men happen regardless of how happy the relationship is, just as in heterosexual marriages?
Finally, I'm curious as to what exactly it takes for a single person, male or female, to receive the same benefits to health as the happily married get. The article suggests that single women with a strong support network and a lot of close friends might benefit in the same ways as happily married women do. Corollary to that is the explanation for why single men are more unhealthy than single women: because they are less likely to have formed extensive support and friendship networks. Lots of food for thought here.
This little tidbit from Reuters by way of Yahoo describes a new invention called "Flower Speaker Amplifiers." This little gadget is hidden in a potted plant and broadcasts sound at a frequency that causes the plant's stems and leaves to function as amplifiers. In other words, this thing can make your houseplants talk and sing to you.
I can't quite decide if this is cool or just too bizarre to contemplate. Just think of the possibilities. If you had a room full of plants, you could tune them all to different radio stations and drive unsuspecting guests batty. (Scott and Danielle, I'm envisioning this for your entryway!) You could line a hallway or outdoor walkway with talking plants that all calmly whisper threats of mayhem. You could torture your pets by making your plants speak with your voice. The possibilities are endless, if a little bit creepy.
Oh, and did I mention the invention is Japanese? I bet you're not surprised.
In just under the wire for Monday...ran across this item from Reuters via CNN.com. Apparently, Linda Ronstadt (singer who had a lot of big hits in the 70s: think "Blue Bayou" and "You're No Good") dedicated a performance of the song "Desperado" to Michael Moore during a performance at the Aladdin casino in Las Vegas. She was greeted with a near-riot by some of the audience members, a quarter of whom left the concert, apparently demanding their money back. Following the show, she was escorted from the premises with the message that she would "not be welcomed back."
This reminds me of last year's Dixie Chicks incident on a smaller scale. I can see why the Aladdin wouldn't want their performers alienating paying customers with political screeds, but a song dedication hardly qualifies as deeply offensive. Really, why shouldn't musicians (actors, writers, whatever) use their bully pulpits to slip in a political message now and then? If music is essentially a means of self-expression, why is anyone surprised and/or offended when musicians tell us what they really think? I say, get over it. I don't see how Linda Ronstadt's politics have anything to do with her singing. I mean, she isn't the Indigo Girls. It's not like she decided to spend the whole show giving political speeches instead of singing -- she just dedicated one song to a controversial figure. The extreme reaction of the audience says something pretty appalling about the level of maturity and sophistication in our political discourse.
What is a sport and what isn't? Jordan Ellenberg, a math professor at Princeton, tries to answer this question in Slate. Unsurprisingly, he spends a lot of time discussing competitive math, but he hits several other interesting points as well.
This resonates with me partly because of two things. First, we recently saw Dodgeball (which, by the way, if you haven't seen it, is much funnier than you'd expect), with its spot-on sendup of the coverage of niche sports ("ESPN8: The Ocho!"). Second, while flipping channels late at night sometime last week, John and I caught ESPN2's (the true "Ocho") rebroadcast of Nathan's Famous Hot Dog Eating Contest, in which Takeru Kobayashi, a tiny Japanese guy, somehow managed to eat 54 hot dogs in 12 minutes to set a world record and win the contest. The broadcast came complete with all of the lame commentary you'd expect from any sporting event, and the actual contest was followed by interviews with the "athletes" who participated. There was lots of talk about the trials and tribulations of the competitive eating circuit, the rigors of training for competitive eating, and which contestants were the holders of records for various foods. All in all, it was the kind of spectacle you can't believe you're watching, yet you can't tear yourself away.
So maybe that's part of how we define what is a sport and what isn't: whether or not there's a spectacle involved, and how much entertainment value there is in watching the activity. Watching people solve math problems is pretty boring, no matter how impressive the mental feat -- you can't see the thought processes, so there's no overt drama. Chess and bridge have a little more going for them in this regard, but still, a viewer has to have a pretty sophisticated understanding of the game in order to become really absorbed by the proceedings.
Getting back to the article that put me on this train of thought, I do have to take issue with one of Ellenberg's throwaway lines: "It is a fact that basketball is a sport, and it is a fact that sauteing zucchini isn't." I don't know about that. Hasn't he ever seen Iron Chef? Here are people with astonishing physical and mental skills engaged in a physically draining, timed competition that is numerically scored. Admittedly, the scoring is subjective, but how is that any different from figure skating or gymnastics? (On a side note, at least chess and bridge have objectively quantifiable results!)
I don't really have a larger point here. It's just that it's pretty fascinating what kinds of things humans will build elaborate competition rules around. And how a simple word like "sport" that everyone is assumed to understand is actually really, really hard to define with any precision.
Thanks to Ethan Bunke for pointing out today's Barbara Ehrenreich column in the New York Times: an examination of groupthink, its dangers, and its increasing prevalence in American society and politics, which also manages to throw in a few nice anti-war sentiments on the side.
On the one hand, I think Ehrenreich is right to be alarmed. On the other hand, my sketchy knowledge of history tells me that deviation and speaking out have always been unpopular choices (often resulting in punishment) when a nation perceives itself as under attack. I'm thinking especially of the McCarthy era, during which even suggesting that the USSR and/or communism might have a teeny tiny bit of merit could get a person blacklisted. And am I just too young to remember a time when politicians weren't regularly accused by their opponents of being out of step with the attitudes and opinions of their constituents? I'd like to think that no matter how splintered and niche-heavy the views of the electorate might be now, there never was a time when everyone really agreed on all of the issues of the day.
So maybe we just need to wait for the pendulum to swing back the other way. Still, though -- don't we like to think of ourselves as having "grown up" enough as a society not to punish dissenters purely because they dissent? Maybe the deeper point lurking in Ehrenreich's piece is that contemporary groupthink is even more insidious because the punishments meted out to its deviants are met with such weak protest. We say nothing not so much because we're afraid of retaliation, but because we're too apathetic, too anesthetized by the details of our daily lives, and too convinced of our inability to change things to object. It's a pretty chilling thought.
Big blogging day today, I guess. But I just have to comment on this, an article delineating Fox's plan for a 24/7 reality show cable channel. Not that this is surprising, coming from Fox -- what's more upsetting is the bit of information tucked in at the end of the article that there's another all-reality all-the-time cable channel in the works: Reality Central.
What bothers me is not so much the prospect of perpetual opportunities to watch and re-watch shows that make me want to remove my eyeballs with a dull butter knife (like Big Brother and The Mole), but the fact that a significant portion of these networks' programming will be devoted to -- get this -- analysis and commentary. Analysis and commentary?!? Do they mean serious analysis and commentary? Because I think my head will explode if I'm exposed to people who are actually paid good money to provide their deep insights into the latest American Idol rejection.
Not that there isn't a place for snarky, irreverent commentary on reality shows. And here it is: Television Without Pity. I say, give their writers a cable channel and let them go for it. Now that would be must-see-TV!
Here's an entertaining diversion from Slate, a quiz that purports to determine your personal level of "red" or "blue"-ness. We're not really talking politics here, but culture.
It's an amusing little quiz. I came out in the middle, which I attribute to an awareness of the geography of the Upper Midwest (questions on Door County, the UP, and the Quad Cities -- incidentally, all of these are "blue state" places, at least according to the results of the 2000 "election"), as well as just knowing stuff. I mean, are there really people out there who can't identify Lee Greenwood, Jed Bartlet, Jon Stewart, and Laura Schlessinger? People living in America who have been vaguely sentient during the last decade? I refuse to believe it.
Anyway, I suppose it is Midwestern-ness that puts me in the middle (literally as well as figuratively. Ha!) The quiz does have another regional bias, though: I can identify several New York/East Coast questions, but I don't see any California/West Coast questions. Since knowing the answers to the New York questions places one toward the blue side of the spectrum, are Californians and Pacific Northwesterners (definitely culturally "blue" places by the quiz's methodology, I'd guess) at a disadvantage? A burning question if ever there was one.
The Strib reports that Target's corporate headquarters are moving away from their business-casual dress code to one that will basically require suits for both men and women. Employees are predictably upset, foreseeing major wardrobe investments in their immediate futures. The timing couldn't be worse, since Target just sold Marshall Field's, and employees will soon lose their discount at the department store.
Is this a sign of a trend toward more formal workplaces? The Strib article thinks not, citing General Mills and American Express as two large local employers that are sticking with a business casual dress code. Best Buy, however, has always had a fairly formal and restrictive dress code, both for employees in the retail stores and at headquarters.
The article points out that very specific dress codes can help protect a company against lawsuits. The whole thing reminds me a little bit of the trend toward uniforms in schools. Both seem to say that people (no matter their ages) can't be trusted to dress themselves appropriately without strict guidance. I'd like to think better of rational adults, but casual observation forces me to conclude that corporations and schools are right about this. It's the classic scenario where, given the freedom to choose, people insist on choosing the worst possible options, and so must have their choice taken away. Overall, it's a pretty sad commentary on society.
A couple of disclaimers/warnings: 1) I'm treading on Dr. Dregs's territory here, so I hope he can forgive me, and 2) I'm not really a Trekkie -- just a casual fan, and therefore not really qualified to comment on anything Trek-related. But I can't resist. Slashdot led me to this little nugget from TrekToday. Apparently, Enterprise won't be dealing with the Romulan Wars, since Rick Berman is plotting the next Star Trek movie around them -- and he says it won't intersect at all with Enterprise.
I have mixed feelings about this. The last couple of Trek movies have been pretty depressing experiences overall, despite the fact that I more or less enjoyed them and will certainly watch them again (and yes, I can already feel the many pathetically to-be-wasted hours of my life ticking away in that sentence). So I've been half hoping that Paramount et al. would be ready to throw in the towel on Trek films, since the last couple of installments have been critical and box office disappointments. On the other hand, hope springs eternal -- maybe, just maybe, freed from the constraints of pre-established characters, time periods, technology, etc., the Trek Powers That Be can come up with something really interesting and original -- even (dare I suggest it?) good.
I'm just trying not to remember that I hoped for the same thing before Enterprise began. What a disappointment. I didn't even make it all the way through the first season (Dr. Dregs, however, remains rather sweetly if hopelessly loyal to it, so I do catch a few minutes of it here and there). Maybe Berman and company can pull this off, but I'm not optimistic. We'll have to see.
(Self-reflection kicks in) But ... uh, wait, does it say something about me that I get burned over and over again but still go back to Star Trek? (End self-reflection) Don't say it -- I told you, I'm not a Trekkie!
As if we didn't already know this, a study by the National Endowment for the Arts shows that reading among Americans is in serious decline. But it's worse than you think. According to the study, 89.9 million adults did not read a book in 2002. 89.9 million people! did NOT read! even ONE book! Astounding.
Of course, both the NEA and the book industry think this is a national crisis, a tragedy. I tend to agree, but the AP story doesn't really say why the decline of reading is such a problem. Here's why: the amount of reading you do is closely tied to how well you communicate, especially in writing. And reading helps you learn critical thinking, which helps you function in the world. Those are just a couple of the reasons that instantly occur to me.
But ... here's my question. Are people reading good writing, but just not in books? Are people satisfying their hunger for the written word on the Internet? Sure, the Internet is full of really awful writing. But there's also plenty of mediocre-to-great stuff out there, especially with the explosion of blogging. I'm curious as to whether the study addresses this at all.
Guess I should track it down and read it.
Caught about half of the 10 am hour of MPR's Midmorning. Today's interviewees are a couple of rock critics flogging a book to which they contributed essays: Kill Your Idols: A New Generation of Rock Critics Reconsider the Classics. The "classics" of the title include the Beatles, the Beach Boys, Fleetwood Mac, Pink Floyd -- and even some more recent stuff like Radiohead and Wilco. I'm a sucker because, yes, I'm going to run right out and buy this book.
I'm skeptical, though, I have to say. Interested, but skeptical. Is this just more of Gen-X/Gen-Y taking aim at the cultural touchstones of the Baby Boomers? (Not that the idols of the Boomers don't really, really need to be taken down a notch or two!) Even so, I suspect it will be a good read. As John pointed out, Roger Ebert's I Hated, Hated, Hated This Movie is so amusing because bad reviews are a lot more fun to read than good reviews.
I'm doubtful, however, that this book (along with pretty much all criticism) has any real or lasting cultural value except as a document for social historians. Taste is subjective, and I'm convinced there's no meaningful way to answer the question of what's empirically good versus what's empirically bad when it comes to art (or anything purporting to be art). Still, although my pop music tastes are no doubt hopelessly pedestrian by the standards of rock critics (I don't have nearly enough of the obscure), I can't help but feel sometimes like Rob Gordon in High Fidelity: you are what you like.
This guy wants to change the key of The Star-Spangled Banner from Bb to G. The major advantage is that the highest phrases of the melody would fall into a reasonable, singable range for most people (personally, I like Bb -- but then, my singing voice is a high soprano, so I have no trouble reaching the highest notes).
It's not a terrible idea, but I wonder if Mr. Siegel (who says he is not a trained musician) has considered the impact this would have on high school and community bands across the country. These groups love Bb. They'd play everything in Bb if they could. Okay, maybe that's a slight exaggeration, but unless things have changed a lot since I was a beginning clarinetist, kids learning band instruments learn the Bb major scale before any others. For some, it's probably the only scale they ever learn. Many casual players of band instruments are puzzled if not stymied by other keys (particularly sharp keys). As things stand, almost any band in America can learn to play The Star-Spangled Banner passably well -- because it's in Bb.
Of course, the key change wouldn't cause any problems for pro or near-pro groups like military bands. And G is certainly an easier key for a full orchestra. But there's something quintessentially American about an amateur wind band's rendition of the national anthem. I don't think we should make it any harder for them.
A couple of other observations: perfect-pitch boy John no doubt has a more informed opinion about this, but I think that The Star-Spangled Banner melody in Bb offers a necessary brightness or brilliance that G lacks. And as long as we're talking about changing the national anthem, how about adopting America the Beautiful instead? A majestic, attractive, singable melody coupled with a lovely, optimistic, stirring poetic text -- this is our ideal national anthem.
John and I rode the Hiawatha Line again today, and we had a very agreeable journey down to Fort Snelling. We walked to the 38th street station this time, which as it turns out is a pleasant 15 minutes away on foot. Granted, the walk will not be so pleasant in January (except for those of you who may enjoy the loss of sensation in various extremities, possibly resulting in the loss of actual extremities). But for three out of four seasons, the trek will be tolerable to lovely.
As we waited on the platform, we noticed some of the art at the station. Most clever are the miniature houses suspended from the station roof. According to this pdf (which includes pictures of each station), "the roof reflects nearby neighborhoods where bungalow homes are predominant. Sears catalog bungalows are cast in bronze and suspended from the roof to create a sense of discovery." That whole "sense of discovery" thing is a turn of phrase I would most certainly have found mockworthy before today, but damned if we didn't experience a sense of discovery when we noticed the little bungalows over our heads. This is some neat stuff.
So it was serendipity that I happened upon an article in today's Strib outlining some of the issues that have arisen with many of the public art installations commissioned for the Hiawatha Line. I'm sure these issues arise with almost any public art installation, but the variety of circumstances causing delay provides a neat summary of both the complexity and the ambitions of the whole Hiawatha Line project. We're trying to do something really cool here, people!
Still not convinced? Just wait until Janet Zweig's "Small Kindnesses, Weather Permitting" is finally installed: at 11 of the 12 stations, passenger-activated LCD screens "will feature audio and video shorts by Minnesota filmmakers, singers and storytellers." Rock on! This kind of thing reminds me why I'm still proud to live in Minnesota.
Andrew Leonard writes in Salon.com exactly what I've argued since I first warily clicked through to the iTunes Music Store a year ago April: that reasonably priced, legal access to downloadable music makes me a better music consumer -- and that the megacorporations controlling the content should be happy about that. Instead, they keep looking for ways to screw up the best thing that's happened to them in a good long while.
As I posted to the MGROE list a few months back: "I don't know exactly how much money I've spent on iTunes over the past year. Maybe $50 or $60, if that much. But I can tell you this: that's $50 or $60 of my money that Big RecordCo would never have seen otherwise. And I can think of at least half a dozen albums that I've bought (or will probably buy) that I never would have considered without being able to 'sample' by buying a 99 cent iTunes song. That's a few more bucks that otherwise wouldn't have gone the recording industry's way. I'm sure I'm not alone..."