Do women still feel they need special treatment?
I'm not sure we can use Hollywood to gauge how successful women actually are in America.
Reese Witherspoon is by all accounts a very successful actor. She has been in many hits and has made a lot of money over her career.
But in the real world I don't see any kind of system holding women back. As a matter of fact, the corporate world I work in seems to lean toward favoring women.
Looking at my own world: my sister-in-law is a psychologist, my girlfriend is a therapist, and I have cousins with careers ranging from owning a chiropractic practice to practicing law, plus a couple who work for the government.
They all went to college and applied for jobs, and that is about it. Actually, in my family the women have been more successful overall than the men.
At my workplace, my manager is a woman, and so are the assistant VP and the VP. In my department there is only one guy in the management chain all the way up to the CEO.
Is there really a system in place that is making women work harder to be successful just because they are women?