Is Hollywood Racist?
"In recent years, with A list actors such as Denzel Washington and Halle Berry achieving widespread acclaim and success at the Oscars, there has been a sense that Hollywood is embracing black artists more than ever before."
-Anna Park, a regional attorney with the Equal Employment Opportunity Commission
Really? She could only think of two? Out of hundreds of actors? She presents this as a good thing, but I think otherwise.
I recently came across this article: an assistant director was fired and believes the studio let him go because he was black. The assistant stated, "I have not done another movie and I was threatened that I was going to be blackballed by going forward and pressing these charges. Universal has sought out to selectively destroy my career." In response, Universal put out a statement saying, "There is absolutely no basis to these allegations. Universal is committed to equal employment opportunity in all aspects of its business, and we are confident many witnesses will testify that Mr. Davis' firing had nothing to do with his race but was solely due to his poor performance as a first assistant director."
I think it is not only the film industry, but all American industries, that discriminate. But it may be the film industry that has the largest impact on American society. Not all of us are exposed to what goes on in other settings, but in film, we are able to look directly at it. We may not see the directors or crew, but we do see the characters. Who plays the lead? Who is the bad character? Who is the single mom? Who just graduated from college? Hollywood sends us messages, often untrue ones based on stereotypes. When looking at these films through a feminist perspective, one can easily see this, but I believe it may be harmful to some viewers. If a "type" of person is always the bad guy, viewers may come to believe that this is "real" and develop a fear, and that fear may carry over into real life.