It kind of annoys me that most of the time a movie is hailed as "feminist," it tends to have the female characters (or at least the protagonist) going around beating people up and killing them, and I really don't get how that's particularly pro-HUMAN, let alone pro-feminist.
I mean, you have things like Kill Bill, Mad Max: Fury Road, Resident Evil, Ultraviolet, I Spit on Your Grave (though those guys deserved for SOMETHING terrible to befall them!) and such, where the protagonist is a pissed-off woman who behaves like a male action star, and then they get called "strong female characters" and such. How? There's rarely any character development or much backstory to let us know if they really ARE strong female characters or if they just like killing people! You don't see people praising Rambo for teaching little boys how to be strong, for goodness sake.
I don't get it... there are plenty of good examples out there of women who use their brains to make things BETTER for people. Take Clueless, for example: yeah, it was a comedy, but it showed the main character evolving into someone less concerned with her image and more concerned with the happiness of others. Or Legally Blonde, where the main character overcomes stereotypes and succeeds by being smart, being a decent person, and working her ass off. But those generally aren't the ones that get praised.
Basically, this is a "what the hell, people? Is this what you want girls to grow up aiming for? That strength = asskicking, instead of overcoming obstacles and ourselves to make something better?"
This is not a debate on feminism or men's rights or what have you... I really don't want to hear about divorce, custody, or pay differences here, because that's not what I'm asking about. I'm simply curious as to what the fuck is up with movie people.