The word is history, not her-story. There are two genders, and women don't have it as bad as they think they do. I'm an 18-year-old woman, and people usually assume automatically that I am a huge feminist and believe that women deserve more rights than men. I'm a traditional girl; I don't think men are being condescending when they want to hold doors open or cover the check. I think it's gentlemanly and a dating habit that needs to become more popular again.
Modern feminists slam girls for having flat stomachs, saying that they aren't real women or that they photoshop their bodies, yet they claim to be body positive. They think that men should breastfeed (something that isn't really possible, so give that one up, ladies). I saw a photo on Twitter that stated, "Cats are independent and feminine, which is why you must never trust a man who doesn't like them," because allergies and personal preferences are obviously not a thing.
I'm usually not one to bash someone's beliefs, but modern feminazism isn't fair to anyone. Its followers think that one gender is superior to the other, which is exactly what feminism isn't. Feminism is wanting equal rights, something that everyone wants. It's not calling men supremacists because they believe that men are naturally more muscular and stronger (which, news flash, they are), and it's not believing that fathers have no right to want something for their children.
I truly believe that women need to support women; we go through a lot more shit than men. But that doesn't mean that men are not essential to this society and to the world in general. And when women are the ones slamming other women, they don't get to call themselves body positive or claim that they love all women.