Xandercosm
Smash Lord
I've looked and can't find a thread about this specific topic. I find it very important, so I decided to make my own...
Now, I'm not a female (as you can probably tell), but women's rights and equality are extremely important to me. I find it disgusting that women are paid less than men for doing the same jobs just as proficiently, I believe that women should be as independent as they choose, and I think that men are obviously not the only people who should make decisions and have a political presence. I'm also really disgusted by how women are represented in the media: as sex objects, existing for the sole purpose of attracting male consumers.
I honestly think it's immature, and it isn't right. Do we really want little girls growing up thinking that they need breast implants? Is that the type of environment we want in our society? Also, men who fall into the trap of that type of appeal in media don't understand and/or care about the actual reason those things are attractive. We, like every other animal, evolved to want and need sex; otherwise our species wouldn't be able to survive. It's one of the most basic biological needs. But, unfortunately, companies in the business of creating entertainment for men have figured out how to take advantage of that need.
It's sad because I doubt this phenomenon will ever fade away. I think one of the ways to help mitigate it, though, is to educate people more about sex and its role in evolution. As a male myself, I can definitely say that once you come to understand its actual purpose, you can learn to stop yourself from constantly taking the bait in the media.
If we could all make an effort to work toward this goal, I think the world would be a better place for everyone. It's just one of many goals we need to work toward, but it's important nonetheless.
I hope everyone will voice their opinion on this matter. But if you're just here to say "Who cares! All women are good for is their looks!" then I don't think this is the thread for you.