U.S. Midterms Bring Few Changes From Social Media Companies

Social media companies are sharing their plans for safeguarding the U.S. midterm elections, although they have offered scant details.

SAN FRANCISCO — Social media companies are offering few specifics as they share their plans for safeguarding the U.S. midterm elections.

Platforms like Facebook and Twitter are generally staying the course from the 2020 voting season, which was marred by conspiracy theories and culminated in the Jan. 6 insurrection at the U.S. Capitol.

Video app TikTok, which has soared in popularity since the last election cycle while also cementing its place as a problem spot for misinformation, announced Wednesday it is launching an election center that will help people find voting locations and candidate information.

The center will show up in the feeds of users who search election-related hashtags. TikTok is also partnering with voting advocacy groups to provide specialized voting information for college students, people who are deaf, military members living overseas and those with past criminal convictions.

TikTok, like other platforms, would not provide details on the number of full-time employees or how much money it is dedicating to U.S. midterm efforts, which aim to push accurate voting information and counter misinformation.

The company said it is working with over a dozen fact-checking organizations, including U.S.-based PolitiFact and Lead Stories, on debunking misinformation. TikTok declined to say how many videos have been fact-checked on its site. The company will use a combination of humans and artificial intelligence to detect and remove threats against election workers as well as voting misinformation.

TikTok is also watching for influencers who break its rules by accepting money off-platform to promote political issues or candidates, a problem that came to light during the 2020 election, said Eric Han, the company’s head of safety. The company is trying to educate creators and agencies about its rules, which include bans on political advertising.

“With the work we do, there is no finish line,” Han said.

Meta Platforms Inc., which owns Facebook, Instagram and WhatsApp, announced Tuesday that its approach to this election cycle is “largely consistent with the policies and safeguards” from 2020.

“As we did in 2020, we have a dedicated team in place to combat election and voter interference while also helping people get reliable information about when and how to vote,” Nick Clegg, Meta’s president of global affairs, wrote in a blog post Tuesday.

Meta declined to say how many people it has dedicated to the election team responsible for monitoring the midterms, saying only that it has “hundreds of people across more than 40 teams.”

As in 2020, Clegg wrote, the company will remove misinformation about election dates, voting locations, voter registration and election outcomes. For the first time, Meta said it will also show U.S. election-related notifications in languages other than English.

Meta also said it will reduce how often it uses labels on election-related posts directing people toward reliable information. The company said its users found the labels overused, and some critics said they were often too generic and repetitive.

Compared with previous years, though, Meta’s public communication about its response to election misinformation has gone decidedly quiet, The Associated Press reported earlier this month.

Between 2018 and 2020, the company released more than 30 statements that laid out specifics about how it would stifle U.S. election misinformation, prevent foreign adversaries from running ads or posts around the vote and subdue divisive hate speech. Until Tuesday’s blog post, Meta had only released a one-page document outlining plans for the fall elections, even as potential threats to the vote persist.

Twitter, meanwhile, is sticking with its own misinformation labels, though it has redesigned them since 2020 based in part on user feedback. The company activated its “civic integrity policy” last week, which means tweets containing harmful misinformation about the election are labeled with links to credible information. The tweets themselves won’t be promoted or amplified by the platform.

The company, which like TikTok does not allow political advertisements, is focusing on putting verified, reliable information before its users. That can include links to state-specific hubs for local election information as well as nonpartisan public service announcements for voters.
