I'm all for gender equality, but I am damn sick and tired
of Hollywood's boring and condescending take on how we have to
model reality in movies to encourage our little girls.
It is like in the '80s, how TV started moving from being adult-centered
and following (too strictly) the rules of propriety, the TV censor code
and all that, to catering to children because that was the most
profitable demographic to sell to.
So now men have to take the dummy role and let women be right all
the time. Why does Hollywood, or most people for that matter, always
have to respond to a social crisis by swinging the pendulum farther
the other way just to get people aroused and arguing? What Hollywood
has done is create a bunch of people in the "flyover states" who feel
that Hollywood is out of control and is ruining their society's values.
But even that cannot be debated reasonably and objectively; it has to
be all-out culture war. The fake news is on all sides, and it probably
does not all come from Russia or China. Its purpose is to manipulate
Americans and keep us from realizing the power we have to make this
country better.
It is not better to show women fighting huge men and knocking them
out or killing them. Though that might happen 0.0000000001% of the
time, and I would not rule it out, women are different from men in a
way we have not really even begun to touch in any kind of cultural
debate or discussion.