I'm an American from Texas. I have friends from the U.K., Germany, Ireland, Spain, and Italy, and they all seem to have a much broader knowledge of Africa than Americans do. Is this because there isn't as much of a stigma around black/white relations over there, and because it's taught from elementary school through high school? Only two of my European friends have gone to college. And no, none of them were born on U.S. bases in Europe. I would ask my friends, but I want to know now; the question just popped into my head, and it's almost 3 a.m., so I can't call them right now. Please don't make any racist remarks.