How about a show concerning who sold the slaves?
These films and shows always (conveniently) stop short of showing who captured these slaves and put them up for sale in the first place.
I know it's 'fashionable' to falsely assume that the 'Evil White Man' went to darkest Africa with butterfly nets, kidnapping all and sundry... but it never happened that way.
Yet most (if not all) of these 'stories' conveniently start in the 'Deep South', but never explain how the slaves got there in the first place or who sold them into slavery.
Blacks enslaved, bought and sold their own long before the 'white man' came onto the scene. In fact, only around 6% of the roughly 12.5 million Africans taken in the Atlantic slave trade were sent to (what would become) America.
America also had its fair share of black slave owners (replete with their own plantations), but once again, Hollywood never wants to talk about these people or events.
I wonder why?