As a disclaimer, I have no hard feelings towards teachers. Most of my family members are educators in one way or another, and they've instilled in me an appreciation for how important education is. Teachers are valuable members of our society and we should treat them accordingly.
However, that doesn't mean I enjoy having the same conversation with every person who asks what my major is. Because the answer is no. I don't intend to use my (eventual) English degree to go into teaching.
I think part of the reason "What are you going to do, teach?" bothers me so much is that A) it doesn't line up with what I actually want to do, and B) it's frustrating to hear people make the same (incorrect) assumption over and over again.
And I mean, c'mon, there are plenty of other careers an English major can flourish in, and plenty of people who have done exactly that. An English degree is not synonymous with a career in education!
With a degree in English, a person can go on to become a lawyer, an editor, or even an actor. Plenty of English majors work in marketing and public relations, too. And beyond all that, you don't even need to go into a field related to English.
The healthcare field isn't closed off to English majors, either. You don't need to major in Biology to become a doctor or a dentist. By completing the prerequisites in college or applying to a post-baccalaureate program centered on health care, English majors can get into medical school and other graduate programs and prepare for a health-related career.
As an English major, you've developed strong communication skills that will be invaluable in whatever career path you end up on. So, to the chatty woman I sat next to on the plane, to the well-meaning relatives curious about my collegiate career, to all my dumbfounded classmates in the Sciences: no, I do not intend to teach.