To go in a *slightly* different direction: do we really care about the Oscars anymore? I imagine they were started to honor the people who made the films, and also to let the masses know which films were good or respectable. That was before the Internet. Who cares what a bunch of prissy judges decide?
I'm not trying to take away from the CEREMONY, but I wonder if the Oscars are all that INFLUENTIAL. A big, fun event, sure, but influential? It feels like nowadays, when a movie's "good," the buzz arrives on its own, and the Oscars are more of an afterthought. But I'm no Oscar historian, so if anyone here is, I'd love to hear your take.