WWI Continued in America After the War Ended

After the United States' involvement in WWI, America emerged stronger economically and diplomatically, but its home front was drastically unsettled. Disputes over race, national origin, labor, and women's rights arose throughout the country. The war may have ended abroad, but at its heart America was at war with itself socially, politically, and economically, affecting women, immigrants, African Americans, and American men. WWI affected African Americans socially through the Great Migration northward; women politically, as they fought for rights and the ability to work; immigrants, who were ostracized depending on their backgrounds; and men economically, as they were drafted and their previous jobs became vacant. These issues could no longer be overlooked; they stuck out like a sore thumb, hurting America's reputation.

While men were being drafted for war, women picked up the slack by working in factories to support production for the war effort. On June 30, 1918, Seattle journalist Jean Godden patriotically reported that women had gained the right to work in various shops at the navy yard, depicting women eager to participate in producing goods for their troops (Doc. 5). Women jumped at the chance to earn money while their "breadwinner" was fighting in the war, because it let them taste what it meant to be considered working citizens. They would not let go of this big step toward equal rights with men. Women pushed further, and on May 19th, 1919, Congress passed the Joint Resolution extending the right of suffrage to women, making their dream of full American citizenship a reality (Doc. 6). They… …ued because of what they had done. American men were called to war after America broke its isolationism, losing their previous jobs in order to exhibit their patriotism.
Even though WWI was fought overseas, there were necessary battles to be fought at home for many Americans. African Americans fought to work in the North for improved lives, women fought for suffrage and contributed by working for the war effort, German-Americans and other immigrants were suppressed so that no radical uprisings would occur, and men fought for their country. Overall, America experienced political, economic, and social change, but it also showed its patriotism and its ability to cope with being involved in foreign affairs, which later led it to reject the League of Nations because of the pain and change internationalism had caused.
