The Change in the Role of Women in America After World War One


Before World War One, a woman's place was in the home. Her job was to clean and look after the house, take care of the children, and have a meal prepared for her husband when he came home from work. Women were not considered able to work outside the home, held a lower status than men in society, and could not even vote.

During the First World War, women had to take over many of the men's jobs, as the able men had gone over to Europe to fight. This was a chance for the women of America to prove that they could do the jobs normally associated only with men, and that they could do them just as well as their male counterparts.

By the 1920s, commonly held values about women's roles were being questioned, mainly by the younger generation. Although it was not thought of this way at the time, the First World War changed the face of society. Women had gained more freedom and were starting to question some of the laws that had been ...

