Africa is the continent least understood by the American public. This ignorance stems largely from racism and the legacy of slavery, but it is also a product of inattention by the media and by US foreign policy, both of which have tended to be dismissive and sensationalizing in equal measure. Things are changing, however: there is a new interest in Africa among the US public today and a new focus in government policy.