Can Hollywood and Society Get Over It?


Last night (Sunday), I watched “12 Years a Slave,” and I couldn’t stand another ten minutes of it. It’s based on Solomon Northup’s memoir recounting his experience as a free man from the North who was kidnapped, tortured, abused, and sold into slavery in the South. But this movie version disgusts me. Not only is the hatred toward Black people apparent, but the Black actors in this movie should have said, “No, I can’t play this role!” Instead, they just want to be noticed by portraying themselves abused, naked, and raped, all for the sake of an Oscar. I can’t bear how far Black actors have come over the last decade only to still be handed stereotyped roles. I don’t know, but as a woman of color I don’t think this is the only history Black people have.

Our ancestors were brought to this land against their will, “reconditioned” with whips and chains, stripped of their original names for common ones, and subjected to heartless abuse no white person could endure; they lost their way, their purpose, their heritage, and their culture to become feeble-minded servants. Black people get it, and it’s repetitive at times, the same story line over and over; it’s tiring. Black people want to move on, want to prosper, want to be left alone. This movie, and perhaps other movies and miniseries based on what Black ancestors went through, is a punch in the gut and an insult. I know Hollywood and society are racist against us. They want Blacks back as servants, so why this fetish and fascination? Is it because some people think Black people are more feeble-minded today than decades before? Don’t be such jackasses!

Why doesn’t Hollywood start making positive movies about the successes of Black people and what they have achieved over the years, instead of depressing movies and TV shows glamorizing their pain, suffering, and humiliation? At least Tyler Perry got it right making positive films, but then he just backed off… a lot.
