TikTok is preparing for the upcoming U.S. midterm elections in November. As part of that effort, the company is updating its political advertising policy to prevent creators from posting paid political messages on the short-form video app.
Critics and lawmakers accuse TikTok and rival social media companies including Meta Platforms, which owns Facebook and Instagram, and Twitter of doing too little to stop political misinformation and divisive content from spreading on their apps.
Social media apps have come under fire for allowing users to spread disinformation and conspiracy theories at scale, as well as hate speech and calls for violence.
Ahead of the midterm elections, researchers warn that TikTok is shaping up to be a primary incubator of baseless and misleading information, in many ways as problematic as Facebook and Twitter. This week, researchers who track online falsehoods warned that TikTok could become a significant vector for election misinformation.
The platform’s enormous reach, combined with the short length of its videos, allows content to spread quickly across its massive user base. Those qualities, alongside a poorly understood recommendation algorithm, can also make inaccurate claims difficult to contain.
Baseless conspiracy theories about voter fraud in November are widely viewed on TikTok, which globally has more than a billion active users each month. Users cannot search the #StopTheSteal hashtag, but #StopTheSteallll had accumulated nearly a million views before TikTok disabled it after being contacted by The New York Times.
Some videos urged viewers to vote in November while pushing debunked rumors that were raised during the congressional hearings into the Jan. 6, 2021, attack on the Capitol. TikTok posts have garnered thousands of views by claiming, without evidence, that predictions of a surge in Covid infections this fall are an attempt to discourage in-person voting.
TikTok has banned paid political ads since 2019. But campaign strategists have skirted the ban by paying influencers to promote political issues.
The policy update is an effort by TikTok to close that loophole. The company will host briefings with creators and talent agencies to remind them that posting paid political content violates TikTok’s policies.
TikTok’s head of U.S. safety, Eric Han, said that internal teams, including those that work on trust and safety, will monitor for signs that creators are being paid to post political content. The company will also rely on media reports and outside partners to find violating posts.
“We saw this as an issue in 2020,” Han said about the last elections. “Once we find out about it … we will remove it from our platform.”
TikTok announced its plan following similar updates from Meta and Twitter.
Meta said this week that it will restrict political advertisers from running new ads in the week before the election, an action it also took in 2020.
Last week, Twitter announced that it will revive previous strategies for the midterm election, including placing labels in front of some misleading tweets and inserting reliable information into timelines to debunk false claims before they spread further online. Civil and voting rights experts said the plan was not adequate to prepare for the election.