TikTok has pledged to clamp down on false or misleading content on Election Day. The operator of the wildly popular video-clip app announced in a company blog post this week that it is taking a series of measures to limit the distribution of such content, including outright blocking of offending material.

A large part of this effort involves handling potentially premature declarations of victory in any of the numerous races for public office. "Out of an abundance of caution, if claims can't be verified or fact-checking is inconclusive, we'll limit distribution of the content," wrote Eric Han, author of the post and head of safety for TikTok U.S.

Earlier in October, TikTok launched its Elections Safety Center to explain its numerous approaches to transparency and to monitoring vote-related content. It has also posted a voter guide to the elections within its app. A banner directing users to that guide will be placed on content making claims that cannot be independently verified by trusted third parties such as the National Association of State Election Directors.

Lastly, TikTok said it is providing an in-app list of hotline phone numbers, available in several languages, for users who experience any difficulty with, or challenge to, their attempt to vote.

TikTok is currently owned by ByteDance, a company based in China. The app has attracted considerable controversy due to concerns over user data security and ByteDance's relationship with the Chinese state.

Under U.S. government pressure to divest TikTok, ByteDance agreed to the formation of a new, U.S.-headquartered company, TikTok Global, with considerable investment from Oracle (ORCL 2.02%) and Walmart (WMT -0.08%). That deal is pending.

Neither Oracle nor Walmart has made any public comment about TikTok's efforts on election transparency and content monitoring.