There has never been a better time to quit Facebook. I quit, and you should too.
- Facebook refuses to police information on its own vast network, despite being the largest media company in the world.
- Recently, CEO Mark Zuckerberg defended Facebook's decision to allow a Trump post that Twitter hid behind a disclaimer for "glorifying violence." Employees have protested in unprecedented numbers, and some have quit.
- If you care about quelling the spread of misinformation, especially amid the protests rocking America and the world right now, you'll quit Facebook and never look back.
- This is an opinion column. The thoughts expressed are those of the author.
You can do better than Facebook.
You can get your social media fix from other platforms. You can send texts or create group chats in a ridiculous number of services that let you connect with friends and family.
There's Twitter for keeping up with news and other people. There's TikTok for videos and entertainment. There's Reddit if you want to get into discussions. There are plenty of other websites where you can create user profiles and connect with other users.
Facebook is the world's largest social network. But the company's recent actions, or rather inactions, are the most compelling reason to leave. (And that's saying something, considering Facebook's sordid history — here's a refresher course if you need one.)
On Friday, Donald Trump admonished protesters on Facebook and Twitter, saying "when the looting starts, the shooting starts." Twitter flagged his message as incendiary and "glorifying violence," hiding the post behind a disclaimer. Facebook refused to do the same, defending its decision publicly and internally. As a result, several employees have staged walkouts, and some have quit outright.
Zuckerberg refuses to rock the boat
Facebook's stock has been a Wall Street darling for years, as the company continues finding ways to make money off its billions of users.
And Facebook's unfettered growth may be one reason why CEO Mark Zuckerberg reportedly quashed any ideas to fix the divisive nature of its service: He doesn't want to alienate any potential users. It's like how Michael Jordan famously said, "Republicans buy sneakers, too," when refusing to endorse a candidate in a US Senate race between Harvey Gantt, the first Black mayor of Charlotte, and Jesse Helms, who was a notorious segregationist.
Facebook has also repeatedly refused to fact-check political ads, even though false political ads may leave its network vulnerable to another mass-manipulation effort like the one that happened in 2016.
Refusing to police the content on your platform is inexcusable, especially when certain figures on that site have thousands or even millions of followers. Not policing the truth is deadly: Just look at the anti-vaxxer movement, or any of the discourse around coronavirus and using certain drugs as treatment from earlier this year. Better yet, Google search "Facebook" and "Myanmar genocide."
Policymakers may be able to change Facebook, but after the Cambridge Analytica scandal, Congress proved ineffective at asking the right questions, ultimately showing it didn't know how to protect Facebook's billions of users.
Creating laws around social networks on the internet is going to take time. But I'd argue that policing information for accuracy and context is a moral issue, and Facebook is failing by not even trying.
Facebook is a media company, and it should start acting like one
Given the unique state of our planet right now, and the ways in which 2020 has forced humans to examine the way we treat one another, I cannot in good conscience recommend Facebook to anybody. On the contrary, I believe it is a toxic service that is brilliantly designed to be addictive; misinformation aside, plenty of studies have shown that using Facebook makes people unhappy.
Even if Facebook can't police all the content on its service, the company has an obligation to examine the most important messages: the content within advertisements, and the content being shared by its most prominent users.
It's the responsibility of media companies to ensure the information on their sites is accurate, particularly when it comes from an account with a following the size of Trump's, or that of just about any celebrity or elected official. Facebook's inability or unwillingness to moderate those accounts is dangerous.
Facebook isn't the only social network at fault; many others could also do better. Twitter should fix how trending topics are surfaced, and clamp down on bots. Reddit could do way better with handling misinformation and moderating its own out-of-control subreddits. But Facebook is the biggest offender because it is the biggest company.
Since Facebook refuses to hold itself accountable, you can hold it accountable instead. The most powerful way to send a message to Facebook's leadership is to cut off its supply of your data by quitting the platform. Until Facebook holds itself accountable as a media company, it should not be trusted with your time or your information, since it can't be bothered to police either.
So, one small way to make a difference in 2020: Leave Facebook, and don't look back.