Anyone else here fascinated by the Old American West?
There have been a zillion westerns, both film and TV. Not many western-themed movies these days, and I can't remember the last TV show (Westworld doesn't count). Maybe Deadwood?
Lately I've been on a kick of watching documentaries about the Wild West. It holds some kind of fascination for me, and I'm not exactly sure why. I suppose it's because it was such a unique time and place in history.
Considering the impact it made, it really didn't last very long either.
Does it interest you? If so, why?