To me, America generally feels like a country completely engulfed by some sort of weird sickness.
But, fuck, what do you expect?
If white people were meant to be in that country, we would have been there NATURALLY, and we would have the skin pigmentation to deal with that environment, as the Native Americans do.
As it is, the sun must be doing some really fucking strange things to America's collective head.
Or I could of course be totally wrong, but it makes enough sense to me.