Feminism
Myth: Feminists hate men.
FALSE! The simple truth about feminism is that women--all women, including Black and Brown women, trans women, and gay women--want equal rights.
In the 1970s in the USA there were over 14,000 laws specifically regulating the rights of women.
There are more now, likely thousands more.
We would like our bodies studied the way men's bodies have been, so that when we have a heart attack or stroke it can be identified (these conditions present differently in women's bodies).
We would like equal treatment in ERs. Women, and Black women in particular, are undertreated in medical emergencies.
We would like **zero** legislation regarding our bodies and what procedures and treatments we have access to.
We want equal rights in marriage.
We want equal pay for equal work. Again, this applies to our Black, Brown, trans, and lesbian sisters.
We want you, men, to understand that it is not a compliment when you grab our asses or breasts without permission.
We want you, men, to understand that when you see us and all you see is our outside? That you have missed the entire fucking point of admiring women.
We want equal respect, dignity, care, and love.
And we want to be able to be firm in our boundaries, aggressive in business, and wildly erotic without being called names that mean 'mother dog' or 'vulva' or that insinuate we had sex with our mothers.
None of this means we hate men. It means we are tired of being treated as if we, the birthers of every single human being you have ever known, are second class citizens.
We love men, and it is time men loved us back appropriately.
Feminism is all of those things and more.