Effect of World War I on The United States


World War I began in 1914 as a result of several factors, including entangling alliances, rising nationalism, and imperialist nations competing for global power. The United States tried to follow its foreign policy of isolationism and stay out of the war entirely. However, Germany’s aggressive actions forced America to enter the war in 1917 on the side of the Allies, which greatly helped the Allies achieve victory. The fighting ended in 1918, and the peace settlement of 1919 still left bitter feelings between the different countries. Following World War I, the United States’ government became more involved in citizens’ lives, conflict rose over American political influence in other nations, the economy took a turn for the worse, and social conditions grew worse for many citizens, although American women began to gain ground.

As a result of World War I, the government became more involved, while conflict arose over the United States’ influence in foreign nations. When the United States considered joining the League of Nations, many citizens were unhappy about it. As displayed in a political cartoon of Uncle Sam with his hands tied by foreign nations, citizens believed that America would have very little control (Doc. B). Although President Wilson was fixed on his Fourteen Points, they were largely rejected, leaving the League of Nations as his only remaining option; even there he had little control, as other nations took charge and pushed their own ideas. The Espionage Act also showed how the United States’ government became more involved. It prohibited any interference with the war effort by American citizens and resulted in several Americans being arrested (Doc. C & Doc. G). Anything that was determined a thr...

... middle of paper ...

...to African Americans, German Americans also went through harsh times during and after the war. They faced discrimination because of their ties to Germany, the nation America was fighting against. Some suspected them of being spies or of remaining loyal to their homeland, and their traditions were looked down upon as silly and ridiculous, which became another excuse to discriminate against them. Women, however, began to play a larger part in society. They were finally allowed to enter the workforce and were valued for keeping the economy going while the men were off fighting the war. They were able to hold these positions while the war lasted, but not many were able to stay in the workforce once the war ended. Overall, most minorities either remained where they were or faced worsened conditions as a result of World War I.
