"Smile a bit." "Give me a little smile." "It doesn't hurt to smile ma." These are only a few phrases that I have heard so many times in my life. I have been told to smile so many times that I have lost count. Now I'm not talking about being told to smile for pictures or by friends, I'm talking about random strangers I run into in the middle of the street who tell me to smile. Why do people have the need to tell me to smile? I don't want to smile and I don't have to conform to societal ideals that say that I must always smile.
I like my bitch face. I like my serious face. I like not smiling. I like being myself. Being told to smile does nothing but anger me. I will never understand people's need to see others smile, especially women. It's as if others feel they have the right to dictate how you should look, as if they know what is best for you.
Stop telling me to smile. Stop telling me that I look "prettier" when I smile. Your input is simply not necessary. Let my bitch face be, because I will do everything I can to enhance it just to bug the people who ask me to smile.