I've always hated the term 'black'. I know that even 'black' people use the term to describe themselves, but I think that's only because it's become so worn in over the years. It would make more sense to me if they referred to themselves as 'African Americans'. That way, they carry some identity (through history), and it distinguishes them from other 'black' people whose ancestors were not captured as slaves and brought to America (e.g., Aborigines, South Indians, etc.). Then again, I know that most black people these days have no affiliation with the continent of Africa whatsoever, so why should they even claim to be African-American anymore?