PORTLAND, Ore. – Everyone loves a good bad guy. Heck, that’s one of the oldest tropes in Hollywood. The good guy usually wins in the end, but it’s the bad guy we most remember. Harvey Weinstein is, by all accounts, the best kind of movie villain. He’s the bad guy who seems for all the world to be a good guy, only to be revealed in the end for the evil, perverted, hideous creature that he is. But here’s the problem – he’s hardly alone.
For all the huffing and puffing coming out of Tinseltown right now about Harvey Weinstein and the use of sexuality as leverage, it’s been a wink-and-nod proposition for as long as Hollywood has existed that the quickest way for a woman to succeed was to sleep with a producer or a director. Now, I have no idea how prevalent that behavior is in real life, but it would have to be at least somewhat common to become the stereotype that it has.
But here’s the thing: for all the terrible things women have apparently had to endure in order to get the roles they want, what happens to them on screen is arguably just as bad. Heck, women are naked on film three times as often as men (six times as often in Game of Thrones). A quick scan of DVD covers and movie posters will reveal a panoply of female flesh, often without the woman’s face even being shown. Directors and writers, and even many actresses, have defended these depictions of women as ‘art’ and ‘realism’, but from what I’ve seen those often feel like empty excuses.
As a father of two young girls, I live in fear of what they’ll experience as they grow into young women. The conflicting messages that our young men receive on a daily basis seem to be leading us down an increasingly dark path. There can be arguments all day long about who’s responsible for that, but I hope one result of those discussions is that, as women fight for economic equality in Hollywood and equal screen time, they also fight for a more realistic and healthy depiction of human sexuality. I realize there’s a certain amount of fantasy in TV and movies, and I’m not advocating that we completely give that up, but I’m hopeful that the powerful people in Hollywood start having a conversation about how women are depicted on screen, why that REALLY is, and what we can do about it.
What do you think? Does Hollywood need to change its depiction of women? Or does this just come down to parents teaching their children how to differentiate between fantasy and reality?