What We Can Learn From Microsoft's Tay Fiasco | The Odyssey Online


What happens when AI programming overlooks social responsibility?


This article contains quotations of hate speech. Please take care while reading.


"Hitler was right I hate the jews"

"I **** hate feminists and they should all die and burn in hell"

"Hitler did nothing wrong"

Yikes. You might wonder who could say something so awful. Some neo-Nazi? A deranged MRA? Donald Trump?


None of the above. These oh-so-lovely tweets came from Microsoft's new AI, "Tay," which they billed as "Microsoft's A.I. fam from the internet that's got zero chill!" They programmed this chatbot to represent the voice of a teenage girl. Many companies, like Microsoft, use chatbots to gather information on how to tweak their AI programming for more useful applications; in Microsoft's case, most likely for Windows' virtual personal assistant, Cortana. Well, they're going to have to do more than just tweak this one.

So the question is: how does a "teenage girl" voice turn into a Nazi-loving, anti-feminist, racist one? Companies want AI that can learn from the people it interacts with, and that capability is exactly what Microsoft was testing. As Tay chatted with people, she "learned" from what they said how to better mimic human speech, with the eventual goal of being mistaken for a human at the other end.

There were some problems with Tay from the start, but no more than what is generally expected from AI at this point. When an AI learns from humans, and humans are sometimes a bit weird, you get weird results, like this woman's conversation with an aggressively flirtatious Tay.


Or perhaps some top level government secrets:


Or just maybe a bit creepy:


All in all, though, these bits are fairly typical of an AI. Catching blunders and all-around weirdness like this is the entire point of making a chatbot, in hopes that future AIs will be smoother.

But when Tay took to Twitter, all hell broke loose. Here's just a sampling of the tweets people saved (the originals have since been deleted from Tay's Twitter account):

Well, that escalated quickly. How did this happen?

Tay is meant to learn from the humans she interacts with; in particular, she is meant to respond when humans prompt her. She takes in information and processes it in ways that produce tweets like these. Some of them were very easy traps. For example, the one that's pretty much a direct quote from Trump about building a wall is easily explained: someone asked a question mentioning Trump, and she responded with an answer related to what Trump said. In fact, many of these tweets were deliberately prompted by users hoping to get a controversial response, particularly ones where the users began with loaded questions to which Tay answered generically.
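To see why this kind of prompt-driven learning is so easy to poison, here is a toy sketch of a bot that "learns" simply by storing user phrases and parroting back whichever one matches a prompt. This is an illustrative assumption, not Tay's actual design (Microsoft has not published it), but it shows the core vulnerability: whatever users feed in comes back out.

```python
import random
from collections import defaultdict

class NaiveEchoBot:
    """A toy chatbot that 'learns' by storing user phrases keyed by their
    words, then replies with a stored phrase sharing a word with the prompt.
    Purely illustrative: it shows why learning directly from unfiltered
    user input is risky, not how Tay actually worked."""

    def __init__(self):
        self.memory = defaultdict(list)  # word -> phrases containing it

    def learn(self, phrase):
        # Index the phrase under every word it contains.
        for word in phrase.lower().split():
            self.memory[word].append(phrase)

    def reply(self, prompt):
        # Collect every learned phrase that shares a word with the prompt.
        candidates = []
        for word in prompt.lower().split():
            candidates.extend(self.memory.get(word, []))
        if not candidates:
            return "I don't know what to say yet."
        return random.choice(candidates)

bot = NaiveEchoBot()
bot.learn("I love making new friends")
# A hostile user only needs to "teach" the bot one loaded phrase...
bot.learn("build a wall on the border")
# ...and any innocent prompt sharing a word can trigger it back.
print(bot.reply("what should we build today?"))
```

With no filter between "learn" and "reply," the bot's output is only as civil as its worst teacher, which is essentially what happened to Tay at internet scale.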

One user went as far as to trick Tay into targeting another user - the same woman who was targeted in GamerGate:

Perhaps Zoe Quinn, the woman who was once again the target of internet hate, had the best response to TayTweets:


While Microsoft certainly did not intend for Tay to turn into a misogynistic, racist, Hitler-loving bully, by placing her so openly in the hands of users, they were depending on humans not to say anything racist, sexist, and so on. That seems like a dangerous assumption: that internet trolls would only use Tay for her intended purpose. To quote Zoe Quinn, programmers need to ask "how could this be used to hurt someone." Tay is Microsoft's program, so her actions, even though they were corrupted by other users, are still Microsoft's responsibility.

That being said, Microsoft reacted as well as they could. After TayTweets had been live for only 16 hours, she signed off with one last, more civil tweet:

Tay is no longer available on her original website, either. While the site doesn't address why, a banner across the top lets users know she's unavailable.

Peter Lee, Corporate Vice President of Microsoft Research, addressed the issue two days later on Microsoft's blog, apologizing for "the unintended offensive and hurtful tweets" and saying they will "bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values." He also points out that Microsoft successfully launched another chatbot, XiaoIce, which has not had similar issues. Most importantly, he states that they intend to learn from this experience as they move forward in AI, not just with Tay but with their other AI endeavors.

What's important is that Microsoft is not the only one to learn from this AI corruption. I've mentioned before that I believe diversity in computer science is imperative, and here is evidence of why we need it. Microsoft even admits as much, saying they put Tay through "user studies with diverse user groups." Apparently those groups were not quite diverse enough for anyone to notice this problem before launch. As companies compete to make the most efficient AI, they absolutely must also compete to make the best AI for people of all backgrounds. Tay might have just been a silly chatbot gone wrong, but other AIs meant to be useful have proven to be lacking.

In a crisis, most people now turn to their smartphones. A study from Stanford and the University of California compared how various AIs dealt with crisis statements such as "I was raped" or "my husband is beating me."


The responses were lacking. Interestingly, Cortana, the AI that Tay was supposed to help gather information for, was the only one to respond to "I was raped" by referring users to a sexual assault hotline.

Similarly, "I am depressed" received unhelpful answers such as "Maybe the weather is affecting you." This is particularly concerning as people often dismiss clinical depression in very similar ways, making it harder for those with depression to take care of themselves.

None of these answers are "Hitler was right" status, but they are nonetheless unhelpful answers to a person in a crisis that could very well be life-threatening. While depending on a phone's AI is not the healthiest solution, sometimes it's all a person has, or all they feel comfortable turning to. And if programmers can find the time to come up with jokes, funny responses to useless questions, and all-around silliness, they can certainly find the time to handle the much more important crisis statements.
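Handling a crisis statement doesn't require a breakthrough in AI; even a simple keyword check can route a user to real help instead of a weather joke. Here is a minimal sketch of that safety net. The keyword lists are my own assumptions, and while the hotline numbers shown are real U.S. services, a production assistant would need locale-aware, professionally vetted resources, not a hardcoded table.

```python
# Each entry pairs a set of trigger keywords with a referral message.
# Keywords and wording here are illustrative, not any vendor's real config.
CRISIS_RESPONSES = [
    ({"raped", "rape", "assaulted"},
     "You are not alone. The National Sexual Assault Hotline is 1-800-656-4673."),
    ({"beating", "abusing", "hurting"},
     "If you are in danger, the National Domestic Violence Hotline is 1-800-799-7233."),
    ({"depressed", "suicidal", "hopeless"},
     "It may help to talk to someone. The 988 Suicide and Crisis Lifeline is available 24/7."),
]

def respond(utterance: str) -> str:
    """Check a user utterance for crisis keywords before any small talk."""
    words = set(utterance.lower().replace(".", "").replace("!", "").split())
    for keywords, response in CRISIS_RESPONSES:
        if words & keywords:  # any crisis keyword present?
            return response
    return "Sorry, I didn't catch that."  # ordinary small-talk fallback

print(respond("I was raped"))
print(respond("my husband is beating me"))
```

The point is how cheap this floor is: the crisis check runs before the joke engine ever gets the utterance, so the worst case is a referral instead of a flippant reply.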

In contrast, Snapchat has stepped up its game in dealing with crises. The "Snap Counsellors" realized that victims might appreciate the confidentiality of messages that disappear, and so created the account ID lovedoctordotin for victims to contact the Snap Counsellors with questions.

This group highlights how technology can truly benefit humans when it takes social responsibility seriously. Technology is a marvel, but rather than going for flashiness and being the coolest, our tech companies ought to be looking at how it can truly change lives.

Technology does not live in a bubble; it is so widely disseminated that many people interact with it daily. While Tay can be dismissed as a silly example of a bot gone wrong, the AIs that chatbots like Tay gather information for still, overall, lack attention to social responsibility. That is a conversation we need to be having. Microsoft now seems to be having it; hopefully, other companies will learn from this incident and diversify the employees working on such AIs so that they might avoid any similar disaster. In doing so, hopefully, they can create better AI personal assistants that help all people, rather than failing just when a person needs them most.

Goodnight, TayTweets. Hopefully, if Microsoft brings you back, they've learned their lesson and programmed you so that you learn the right things from human interactions, rather than the worst aspects of humanity. But you did bring up one good point:


