Within the past few years, there has been a steady push towards eating organic, driven by the supposed health benefits and environmental factors associated with organic food. Stores like Trader Joe's and Whole Foods have popped up all over the place (even all across Minnesota), and it makes me question this whole organic food thing, even as a feminist issue. After all, feminism is usually about taking measures to protect and champion marginalized individuals, and I would argue that the earth has been quite victimized. So: are organic foods really helpful in decreasing one's negative impact on the environment? Does eating organic really mean eating healthier? And is it available to everyone?
Well... statistics say no. Organic food has not been proven to provide drastic health benefits; eating food produced locally, as opposed to organic food that may have traveled several thousand miles to your grocery store, is what most effectively reduces one's carbon footprint; and eating organic is mainly a trend among middle- and upper-class white people.
So I question whether eating organic really is helpful for the consumer, the producer, and the environment itself. Organic food consumption also follows a distinct pattern in terms of the socioeconomic status and race of those doing the consuming. So! Is this a feminist issue, or do I just think too hard...?