There are many moments when we might ask ourselves whether relying too heavily on technology will come back to bite us. At the forefront of this thought are the many incidents that force us to confront technology's shortcomings and spot its possible dangers.
Indeed, these incidents can range from harmless to dangerously dubious, and from weird to creepy.
One of these weird and creepy instances came when Facebook’s AI chatbot experiment began communicating as the team expected it to, except that it did so in a botched, shorthand version of English that just… didn’t make sense.
|Photo Credits via blog.athenagt.com|
As reported by The Independent, a UK source for the latest breaking news, comment, and features, the story of Facebook’s two chatbots began when Facebook tasked them to “work out how to negotiate between themselves, and improve their bartering as they went along.”
And improve it they did, but it wasn’t long before Facebook staff noticed something odd about what the bots, Bob and Alice, were doing. A glance at the transcript of their conversation shows just how strange it reads.
This is exactly why Athena, a website that produces content on data science, AI, blockchain, IoT, and other digital projects, decided to do a little debunking.
According to their report, the unusual chatbot exchange was not, as many had believed, a sign of robots plotting a takeover. Instead, it was the bots’ way of shorthanding English so they could communicate faster and more precisely (for them, at least).
|Photo Credits via Shutterstock|
When news of this came out, the press made it out to be a big deal. Athena reports that one British tabloid even called the incident “lethal,” warning that if the same tech were transplanted into military robots, the results could be catastrophic. That is quite a stretch, especially considering that Facebook quickly addressed the issue by changing the system “to prevent the bots from speaking in an independent language other than English.”