As you all know, the continent of Africa came under the complete control of Europeans at the end of the 19th century. Africa was the last major part of the world to be brought under Western hegemony; and even though colonialism on our continent lasted a relatively short while, it completely...