It seems people have always been trying to make decisions for women. A woman can never simply choose for herself whether or not she wants to do something. Even the ordinary parts of being a woman can drive us crazy when we are met with "That's unhealthy" or "Why would you do that?"
My question to everyone who has ever asked a woman one of those questions is, "Did you ever put yourself in her shoes?"
Being a woman in our society is far from easy: there are expectations around how you're supposed to look, how you should always present yourself, and how desirable you should make yourself without seeming too promiscuous.
Of course, there are some things for which we, as women, will need help, education, and guidance when it comes to our bodies. A woman's body is forever changing, no matter her age. As we get older, our bodies adjust along with our reproductive and metabolic systems, and we have to make long-term decisions about them.
But deciding whether or not to get on some form of birth control is a choice each woman has to make on her own because, at the end of the day, it is her body that bears the consequences.
So why exactly does a man, or anyone else, think he has the right to tell a woman to keep a child she isn't ready for, or not to get on birth control? My answer is that society has, in some way, accepted it. Somewhere down the line (actually, not too long ago) a man sat down and decided he was entitled to an opinion on what a woman should do with her body.
What a woman does with her body is ultimately her choice, and whether she lets other people's opinions factor in is her choice too.
For too long, society has made women feel like their bodies are in the hands of others, when in reality they are not.