What does "American" mean?
Sorry for what might seem like a clickbait title, but I'm genuinely curious about this and couldn't think of a better way to phrase it.
A lot of films and songs have "American" in the title, and often the only connection is that the film happens to be set in America. Just as often, the story could be set anywhere for all it matters.
A film like "American Beauty", for example, could just as easily have been set in France.
So my question is: is using the word "American" just a way to attract attention or reach a bigger market, or does it actually mean something that I, as a non-American, am missing?