What happened to the Oscars?
Like, an Oscar-nominated film used to be a huge deal & the ceremony itself used to be an even bigger deal. So what actually "killed" it? I mean in terms of people actually caring about the Oscars
I'm not saying "no one watches the Oscars anymore". Clearly millions still do. But it's NOWHERE NEAR the level it used to be, and it's not as "highly regarded" by the general public either
So what actually happened? Is it because of celebrities going political? The "go woke, go broke" stuff? Or do people just not care about celebrities giving each other awards anymore?
Like, has the glamour or appeal of it all just worn off? Genuinely asking...