I noticed that most people here aren't from the U.S., and I was wondering what y'all think about the place. Does the US government seem like a power-hungry corporate predator trying to bring about the new world order, or the great defender of peace that our media and politicians proclaim us to be? Do we seem like warmongers to you? Do the American people, as a whole, seem incredibly stupid? I'm just curious how it looks from an outside perspective.