The Role of Women in American Society

From barefoot and pregnant to CEO! Could this be true? What are the roles of women in America?

From the time she was a little girl, certain roles were instilled in her. She always wore pretty little colored bows in her hair; they matched the pretty little dress her mommy put her in each day. She was told to play with dolls, cross her legs, and bring her father water when he asked. She was not supposed to play in the mud or play football or basketball with her brothers. This was because she was born a girl. She would soon grow into a teenager, develop breasts, and begin scouting for the ideal guy she would someday call her husband and expect to take care of her.

Historically, women have been expected to become homemakers. The duties of the American woman are to cook, clean, and raise the children. Does this sound familiar? These roles are idealized by women on TV Land like June Cleaver. Women are nurturers by nature, which may explain why they are expected to tend to the children. Women are supposed to wear pretty dresses and never leave the house without their makeup on and their hair neatly done. A woman is supposed to make her husband happy and treat him like the king he is. These are similar to the ideas women are taught as little girls. However, while these ideas might have described the roles of women in the 1970s, they do not hold up today.

For centuries, women have been oppressed and discriminated against. Laws have since been created to end these practices. Now, little girls can put down the Barbie and play with Hot Wheels or make mud pies. They play basketball and dream of being in the WNBA. When they grow up and go to college, they can study dentistry, engineering, business, or whatever they want.

Today, the role of the American woman varies by the woman. Some American women are single mothers and have to be both mommy and daddy; some women choose not to have children at all. Some work outside the home in fields like mechanics, construction, and other manual labor trades that are typically thought of as masculine. Others work as doctors, lawyers, and police officers, and some even run for president. In the near future, society may abandon the idea of male versus female roles altogether.

So what exactly is the role of the woman in American society? The short answer: whatever she wants it to be!