First of all, what are gender roles? The answer can vary depending on how you view gender. According to plannedparenthood.org, "Gender roles in society means how we're expected to act, speak, dress, groom, and conduct ourselves based upon our assigned sex. For example, girls and women are generally expected to dress in typically feminine ways and be polite, accommodating, and nurturing." That being said, as we enter a new era, society is taking a different approach to gender roles, and here is why.
Hearing these things as a woman in the workplace is very common, and it leads to the devaluing of women's ideas. When people ask to speak to a manager, they tend to look for a man and disregard what women have to say. Similarly, when a woman tries to present an idea at work, a man may step in and take over, leaving her feeling less important.
Women in the workplace are also judged on their appearance: if a woman does not dress for the occasion, she often will not be taken seriously. Women deal with this on a day-to-day basis.