TikTok removed 400,000 videos in the 2nd half of 2020 to combat election and COVID-19 misinformation

TikTok has had to confront the spread of misinformation and disinformation, especially relating to the 2020 US election and the coronavirus pandemic. Photo Illustration by Andrea Ronchini/NurPhoto via Getty Images
  • TikTok announced Wednesday it removed nearly 90 million videos globally in the second half of 2020.
  • Of those videos, more than 11 million were removed in the United States.
  • Almost 350,000 videos were removed for misinformation about the US election, while more than 50,000 were removed for spreading COVID-19 misinformation.

TikTok removed approximately 400,000 videos for misinformation related to the US election and COVID-19 in the second half of 2020, the company announced Wednesday.

In total, from July 1 to December 31 last year, the company said it removed 89,132,938 videos globally, with 11,775,777 of those being removed in the United States. TikTok said these videos were removed for violations of its community guidelines and its terms of service.

The stats were published Wednesday in a press release authored by Michael Beckerman, TikTok's vice president and head of US public policy, and Eric Han, the company's head of safety in the US.


About 92% of the removed videos were deleted before any user reported them, the company said. Approximately 83% were removed before they received any views, and about 93% were taken down within 24 hours of being posted.

Of the nearly 12 million videos removed in the US, the company said 347,225 were taken down for misinformation, disinformation, or "manipulated media" related to the 2020 election. TikTok said it deleted 51,505 videos for misinformation about the COVID-19 pandemic.


Like all major social media platforms, TikTok has had to confront the spread of misinformation and disinformation on its app, especially around last year's US election and the coronavirus pandemic. As Insider previously reported, TikTok has become a key tool for discussing politics among politicians, candidates, and TikTok users.

Scientists and doctors have also used TikTok to debunk falsehoods and conspiracies and educate users about COVID-19 and vaccines amid misinformation on the platform. The company last year began labeling videos about COVID-19 with a link to a pandemic information hub. The press release on Wednesday said the label was applied to more than 3 million videos in the second half of 2020.

TikTok, which is owned by the Chinese company ByteDance, updated its rules on disinformation and misinformation in August, creating new guidelines to prohibit synthetic or manipulated content and clarifying its existing policies on "coordinated inauthentic behavior" relating to the election. The company also said at the time that it had expanded its relationships with fact-checking partners and launched a partnership with the US Department of Homeland Security.

The company on Wednesday said it was working to further bolster its effort to combat misinformation and disinformation on the platform to "better identify altered versions of known disinformation." TikTok also said it was working to develop tools to prevent "repeat offenders" from evading or otherwise circumventing its moderation decisions.
