Are Americans really that horrified about sex?
I'm from the UK and this movie confused me - is it an American thing to be so uptight about sex, or just the people in this movie?! I know it's a very Christian country, but surely that doesn't govern the morals of everyone in the country? I always thought the saying went 'No sex please, we're British', but Brits seem far more free-thinking about sex than Americans. I have been to the US, but only Vegas, where sex sells, so I don't think that counts lol - so I guess I'm going off the perception portrayed in movies.
I just thought they made such a huge deal about Olive having sex, and I don't understand why - especially since she didn't. It's only sex - the most natural thing in the world, and how we all got to be here. Two of my favourite quotes are -
Bill Hicks - "When did sex become bad - did I miss a meeting?!" and
John Lennon - “We Live In A World Where We Have To Hide To Make Love, While Violence Is Practiced In Broad Daylight.”
So is this attitude the general consensus in the US, or is the movie heavily exaggerating?