Seriously, what did we do before Google? I mean, if it wasn’t for the search engine, how would we know who the biggest idiot in the world is? Or what people really think about most in every state?
Well, it turns out Google isn’t only good at finding out things humans have already discovered, it’s now predicting the future. And we’d be impressed if its prophecies weren’t so damn spooky.
Reddit users were the first to come across the phenomenon, which they dubbed “TranslateGate.” Google’s multilingual translation service has been spitting out warped doomsday messages, which we seriously hope are the result of an algorithm glitch.
For example, if you type “dog” 16 times in Maori, translating it to English, you get this: “Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world.”
Add an extra dog to that list, and this is what pops up: “Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world, which indicate that we are approaching the end times and Jesus’ return.”
And it gets worse. Add an 18th dog, and you’ll get the complete version of the end-of-times message.
“Doomsday Clock is three minutes at twelve We are experiencing characters and a dramatic developments in the world, which indicate that we are increasingly approaching the end times and Jesus’ return.”
And it’s not just Maori to English. This is what happens when you try Somali to Irish.
Yikes!
Now, Google Translate conspiracy theories are swirling around the web. Because, ya know, it’s the internet.
i just watched a conspiracy theory about the somali google translate thing and now im fucking terrified why am i doing this to myself
— cess ♡ (@bIoomyari) July 21, 2018
Google Translate conspiracy theory is real ?? pic.twitter.com/Lf6VgzyOF3
— Mariah (@mariahhh______) July 16, 2018
Some are blaming ghosts and demons for the mysterious results, while others are suggesting that people are taking advantage of the “suggest an edit” button. However, Harvard Professor Alexander Rush says the most likely reason is “neural machine translation.”
Rush told Motherboard: “The models are black-boxes, that are learned from as many training instances that you can find.
“The vast majority of these will look like human language, and when you give it a new one it is trained to produce something, at all costs, that also looks like human language.
“However if you give it something very different, the best translation will be something still fluent, but not at all connected to the input.”
Basically, Google Translate’s system is trying to find order in the chaos people are feeding into it, and it’s coming up with bizarre answers as a result.
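If you want to see the gist of that explanation without invoking any demons, here’s a minimal toy sketch in Python. To be clear, this is emphatically not Google’s actual system; the tiny corpus, the bigram table, and the translate function are all made up for illustration. The point it demonstrates is Rush’s: a decoder that has only ever seen fluent text will produce fluent text no matter what nonsense you feed it, because the input carries no real signal to condition on.

```python
import random

# The only "fluent" sentences this toy decoder has ever seen.
# (Invented corpus, loosely echoing the TranslateGate output.)
corpus = [
    "we are approaching the end times",
    "the doomsday clock is three minutes to twelve",
    "we are experiencing dramatic developments in the world",
]

# Bigram table: each word maps to the words that may follow it.
bigrams = {}
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        bigrams.setdefault(a, []).append(b)

def translate(source: str, length: int = 8) -> str:
    """Hypothetical 'translator': ignores the meaning of the input
    entirely and just generates fluent-looking target text."""
    # A degenerate input ("dog dog dog...") gives the model nothing
    # to condition on; here it merely seeds the random choices,
    # so adding one more "dog" changes which fluent string you get.
    random.seed(source)
    word = random.choice(list(bigrams))
    out = [word]
    for _ in range(length):
        # Follow the bigram table; at a dead end, jump to a random
        # word, keeping the output flowing "at all costs."
        word = random.choice(bigrams.get(word, list(bigrams)))
        out.append(word)
    return " ".join(out)

print(translate("dog " * 18))  # fluent-ish, and unrelated to dogs
```

Run it with 17 dogs instead of 18 and the seed changes, so a different (but still fluent-looking) “prophecy” comes out. That, very roughly, is why each extra dog changed the message the Reddit sleuths got back.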
Either that, or fed-up Google employees are having a laugh at our expense. Anything is preferable to a haunted translator.