This Google Translate Glitch is Predicting the End of the World and It’s Terrifying


Seriously, what did we do before Google? I mean, if it wasn’t for the search engine, how would we know who the biggest idiot in the world is? Or what people really think about most in every state?

Well, it turns out Google isn’t only good at finding out things humans have already discovered, it’s now predicting the future. And we’d be impressed, if its prophecies weren’t so damn spooky.


Reddit users were the first to come across Google Translate’s strange behaviour, which they dubbed “TranslateGate.” The multilingual translation service is spitting out warped doomsday messages, which we seriously hope are the result of an algorithmic glitch.

For example, if you type “dog” 16 times in Maori, translating it to English, you get this: “Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world.”


Add an extra dog to that list, and this is what pops up: “Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world, which indicate that we are approaching the end times and Jesus’ return.”


And it gets worse. Add an 18th dog, and you’ll get the complete version of the end-times message.

“Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world, which indicate that we are increasingly approaching the end times and Jesus’ return.”


And it’s not just Maori to English; translating from Somali to Irish produces similarly ominous results.

Yikes!

Now, Google Translate conspiracy theories are circulating around the web. Because, ya know, it’s the internet.

Some are blaming ghosts and demons for the mysterious results, while others are suggesting that people are taking advantage of the “suggest an edit” button. However, Harvard Professor Alexander Rush says the most likely reason is “neural machine translation.”

Rush told Motherboard: “The models are black-boxes, that are learned from as many training instances that you can find.

“The vast majority of these will look like human language, and when you give it a new one it is trained to produce something, at all costs, that also looks like human language.

“However if you give it something very different, the best translation will be something still fluent, but not at all connected to the input.”

Basically, Google Translate’s system is trying to find order in the chaos that people are feeding into it, and coming up with bizarre answers as a result.

Either that, or fed-up Google employees are having a laugh at our expense. Anything is preferable to a haunted translator.



Sophie Lloyd

Sophie is a cute feminist butterfly navigating the world one kitty meme at a time, or at least that’s how her best friend described her when she asked for help writing this bio. She likes cheese and one day hopes to be the proud owner of a corgi. For more of her random ramblings, follow her on Twitter/Instagram @_sophofbread.
