I have to admit, I was pretty excited when Sex and the City first aired on TBS. It was smart, funny, engaging, and honest, and it finally voiced women's sexual and dating concerns in a realistic way. At the time I was also a budding feminist, and I felt that somehow, there was something about the show that made life a little bit better for women.
After I grew up and got my Women's Studies degree, I realized that Sex and the City isn't exactly a beacon of hope for the feminist movement. It plays upon viewers' lingering racism and consumerism, and it's hardly radical. However, I'm not one to complain about something just because it hasn't yet achieved perfection, and in general, I think that Sex and the City is really great for women.
One thing that always got me about Sex and the City was the surprising vehemence with which my male friends denounced it. Not only do men not like the show, they "hate" it. They think it's "stupid," "meaningless," worthy of a condescending chuckle and an eye roll. This reaction became even more pronounced as the Sex and the City movie approached its May 2008 release date. I was surprised to find most of the reviews downplaying its significance and casting it aside as a piece of theatrical filth.
After reading the early reviews of the movie, I entered the movie theater with trepidation on opening night, expecting to see the hollow shell of the show I once loved. However, I was pleasantly surprised to discover that the movie was actually pretty good. It was just as relevant, engaging, and hilarious as the television show - even more so, in my opinion.
Which brings me to my point. I usually hate most movies, especially ones targeted at women, because they are so infantile and insulting. The Sex and the City movie is no great work, but it isn't just a shallow, vapid concoction either. In my opinion, society's rejection of Sex and the City is a rejection of womanhood itself. When I'm arguing with someone about whether the show is good or not, I'm not really talking about the show, and neither are they. I'm talking about women's right to gather with their friends, respond emotionally to breakups, articulate their thoughts by writing, and have those things depicted on a television show without it being brushed aside as petty and meaningless.
Sure, Sex and the City can be annoying and counter-productive, but so can most male-gendered programs. No one complains about blatant consumerism when it's celebrated in the latest Bond film, but Heaven forbid that Sex and the City feature one scene where Carrie discusses buying shoes. You don't have to like Sex and the City, but why would you hate it? Hating something implies that you not only dislike it, but that you feel strongly enough about it to take a public stand against it. Why do people always seem to hate Sex and the City, but merely dislike anything gendered male?