America the empire

While the British Empire looms permanently in the background of any account of Western imperialism, the United States has never admitted to such a legacy. This is not to say it has nothing…