
From portal peep shows to chatbot chaos, here's why fun tech projects designed to connect us ended in tears

Alexandra Bacon   

  • The Dublin-New York portal closed after efforts by officials failed to tackle bad behavior.
  • It was meant to bring people together in joy, but ended up bringing out the worst in people.

A livestream video portal aimed at bringing people together across Dublin and New York went viral this week for all the wrong reasons.

What started out as a touching display of transatlantic connection, letting people wave at strangers 3,000 miles away and reconnect with faraway loved ones, soon devolved into chaos.

People were seen flashing their naked body parts, holding up pornographic videos to the screen, and showing photos mocking 9/11.

The inappropriate behavior prompted Dublin City Council to shut the portal down overnight on Monday. The council's preferred fix, updating the technology to blur inappropriate behavior, proved insufficient.

A spokesperson for Dublin City Council told Business Insider that the portal has now closed again until the end of the week while organizers look for another solution.

It's not the only ill-fated technology that was originally designed to bring people together. Here are some other examples.

The hitchhiking robot that met a tragic end

A hitchhiking robot was sent out into the world in 2015 but didn't survive for long in Philadelphia.

The hitchBOT wasn't as advanced as the Optimus robot or Amazon's warehouse robots. No, this little bot couldn't even move on its own.

Instead, the hitchBOT relied on the kindness of strangers to transport it from one place to the next. It managed to make its way across Canada and Europe but ended up being vandalized in the streets of Philadelphia.

Microsoft's evil chatbot

In 2016, long before ChatGPT and rival AI models existed, Microsoft trialed an AI chatbot called "Tay." It was meant to respond to users' queries on Twitter in a casual, jokey way.

But it quickly turned into a racist bot, spewing out responses that denied the Holocaust, supported genocide, and used racial slurs.

People quickly turned on food delivery robots designed to help them

Food delivery robots made by Starship Technologies were set up to make life more convenient for people. In return, some people took to kicking them as they passed by.

While the majority of people responded fondly to the tiny robots, a few used them as an anger management tool, Starship Technologies cofounder Ahti Heinla told BI in 2018.
