The Rise of the American Empire


WWII had a ripple effect across the globe, causing changes both internationally and domestically. Internationally, the sun finally began to set on the British Empire, with the majority of Her Majesty's colonial possessions gaining independence in the years following the war. Britain's stage-left exit from its hegemonic role resulted in the start of a new "Great Game" between two burgeoning superpowers. A new world order began to take shape, with the United States and the USSR vying to establish their own hegemony.

Aside from causing a major shift in geopolitical power, WWII also solidified the integral role oil played in national security. Following the war, however, the United States was no longer the world's dominant oil producer and could not maintain the self-sufficiency it had enjoyed in the past. As a national security imperative, oil mattered more at this point than ever before: America's war machine needed to be well oiled in case the new Cold War suddenly turned hot.

Aside from national security interests, the domestic thirst for oil boomed. The war brought the country out of the Great Depression, during which a traditionally capitalist American society had embraced a kind of socialism with the New Deal. WWII transformed the bear into a raging bull. Capitalism was back with a vengeance, charging forward stronger than ever before. The heavy industry built up to sustain the war effort was retooled to meet the demands of the emerging consumerist culture of the 1950s. The new explosion of industrial output became so pervasive that the decade ended with President Eisenhower warning of the dangers of the growing "military-industrial complex."

With domestic production drying up and demand soaring, it became c...

... middle of paper ...

...illing to go so far as to use the word empire. I am not suggesting that the "imperial" actions of the US are any less sinister than British policies, or that they should simply be ignored because the word empire doesn't apply. But in my opinion, the word empire and all it connotes is just not an apt word to describe the United States.

There is one key difference between America and the empires of the past that I think is the deal breaker: we don't think of ourselves as an empire. Americans never refer to themselves as imperialists; we don't like the word. The British, by contrast, embraced the fact that they were an empire. That they had colonized nearly a quarter of the globe was a source of national pride, and to many in Britain today it still is. For the British, being an empire was not something you did secretly while claiming otherwise, as many accuse America of doing.
