Ironic how they ignored the most likely end of the world - robots
Super-ironic that in a movie where all of the robots are already superior to humans in nearly every way, and mostly evil, there was no mention of Kurzweil's Singularity as something to worry about in the future. It's coming, guys: 2040...
https://en.wikipedia.org/wiki/Technological_singularity
And if you can't be bothered reading about it, the gist of the problem is this: computing power doubles every 18 months, so it will equal that of a single human around 2023 (at which point machines will be able to design and build themselves, and we enter a runaway phase of ever-increasing intelligence and ability) and equal that of ALL humans by 2040. Once such a machine realizes the world would be generally better off without us (or at least better off if we were "managed"), it will do whatever is necessary to eliminate us, and we won't be able to out-think it to stop it.

The reason this problem is impossible to "program" away from the beginning is that it is just too difficult to adequately assign boundaries to an intelligent robot that has been given even a seemingly benign task (see the paperclip maximizer problem below). And there is no way intelligent robot development can be universally banned, thanks to the irritatingly childish perceived need for national leaders to continually prepare for war (try telling North Korea "don't research and develop super robot soldiers, because everyone else has agreed to stop").
https://en.wikipedia.org/wiki/Instrumental_convergence#Paperclip_maximizer
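For anyone who wants to check the doubling arithmetic behind those dates, here's a quick sketch. The 18-month doubling period and the 2023/2040 dates are the figures quoted above; everything else here is just compound-growth math, not a prediction of mine.

```python
# Sketch of the exponential-growth claim: computing power doubling
# every 18 months. The doubling period and dates come from the text;
# this just computes how much capacity multiplies between them.

DOUBLING_PERIOD_YEARS = 1.5  # "doubles every 18 months"

def growth_factor(years: float) -> float:
    """Multiplicative growth in computing power over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# From 2023 (claimed single-human level) to 2040 (claimed all-humans level):
years = 2040 - 2023
print(f"Doublings in {years} years: {years / DOUBLING_PERIOD_YEARS:.1f}")
print(f"Growth factor: {growth_factor(years):,.0f}x")
```

Worth noting: a strict 18-month doubling gives only a few-thousand-fold increase over those 17 years, far short of multiplying one human's capacity by the world's population, so the "all humans by 2040" figure presumably leans on Kurzweil's argument that the doubling rate itself accelerates.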