Can You Talk About White Privilege Without Showing It?
This isn't another alt-right/racist rant about how 'white roles are being taken by Black people'.
On the contrary, this is a genuinely progressive OP, albeit one that has *actually* been made with some *thought* and *intelligence* (which, alas, are not traits shared by all 'woke' individuals).
But these days I'm seeing a lot of fiction featuring Black people in positions of power and privilege (e.g. Black Panther, Bridgerton, The Little Mermaid, the upcoming Great Expectations adaptation in which spoiled rich girl Estella is played by a WOC). Now, I 100% understand why many Black people want to see themselves in such roles, rather than as victims of social oppression, but isn't there a danger that any discussion of CRT and white privilege will get lost if popular culture is full of images of powerful, privileged, wealthy and successful Black people (without the subversive context of a novel/TV adaptation like Noughts and Crosses, which imagines an alternative universe in which Black people oppress whites, whilst still serving as a commentary on the racial dynamic that actually exists in the real world)?

If we either fail to depict the dynamic as it is, or frame Black people's circumstances as something to be ashamed of rather than acknowledging that they're the result of historical and systemic oppression (and that, whilst it's important to end that oppression, Black people shouldn't beat themselves up over how things have been up until now), there's a danger that white conservatives will turn around and say "White privilege? What white privilege?"
Surely culture has a duty to depict the reality if it wants us to effect change. Pretending Black people have the same systemic power as whites strikes me as the type of strategy a white supremacist society would use to keep POC in their place (an opiate of the masses, so to speak).