Hollywood's view of corporations
Would kill to keep profits.
While I enjoyed this movie, it made me all too aware of Hollywood's bias against big business.
I can only think of one movie that puts business in a positive light, and that was made in the 1930s: Lloyd's of London.
Can you think of any others?