White woman with black children.
It is VERY important for Hollywood to keep this theme in their movies, apparently. And to take it a step further, all the white men are bad guys. The leftists in Hollywood are using their movies as political propaganda.