I think Rosalind Gill makes a good point in her article that "feminism" has generally focused on the sexualization and objectification of women's bodies. I can see and understand her reasons for this: shows such as What Not To Wear really do scrutinize and bully "ugly" girls or "distasteful" women, and [fashion] magazines such as Cosmopolitan or In Shape do not help either. In general, Gill is right when she says that today's "empowered" woman is in fact an excuse for objectifying women.
My understanding, at least, is that Gill is saying the media encourages women to be "beautiful" or "sexy" for themselves, not for men. Again, I completely understand where she is coming from: magazines and TV/news shows certainly do nothing to disprove this argument. But even though I see why she would think this "independence" or "being pretty for yourself" message is just an excuse for encouraging the objectification of women, I do not think it is necessarily a bad thing.
Encouraging women to do things for themselves instead of for others is a good thing (and a step toward "equality"); it is just that the form it currently takes is not very good. I do not think there is anything wrong with a woman feeling good about herself or feeling pretty. If something makes her feel pretty or good, then by all means, I think it is a wonderful thing...however, the media has twisted this into something I find quite nasty: encouraging extreme makeovers or surgeries in order to feel good. This form of "feeling good about yourself," I think, is horrible and is exactly what Gill describes as an excuse for objectifying women.
Perhaps the reason this is so horrible is that, as Gill mentions, women are the only ones being objectified or singled out. Men are not really encouraged to care about their appearance as much as women are. Also, men are generally not the targets of attacks when it comes to beauty flaws or mishaps. However, this seems to be largely a "Western" (or even an American) issue...in many Asian countries, such as South Korea and Thailand, men care just as much about their appearance as women do. There are makeup products for men and fashion critiques of men, just as there are for women. So, if Western cultures--specifically American--were to adopt such a "principle" or media portrayal, do you think this view of "objectifying women" would change? Would people still think women are being objectified, or would they simply find different evidence that women are still being objectified? (NOT that I disagree that women are being objectified.)