TikTok's Misinformation Prompt Won't Be Enough for Users, Expert Says

TikTok's fight with false information will continue

By Josh Hawkins
Published February 8, 2021, 11:30 a.m. EST
Fact checked by Richard Scherr

Key Takeaways

- TikTok will roll out a new prompt when users try to share videos flagged for containing misinformation.
- Videos with unverified information will carry new banner labels.
- Experts say the new feature won't be enough to slow the spread of misleading information.

Chesnot / Getty Images

Users tired of seeing videos with misleading information won't find the help they were looking for in TikTok's latest feature, experts say.

TikTok recently revealed a new feature that notifies users when a video they try to share has been flagged as containing misleading information. Users viewing flagged videos also receive a message encouraging them to seek out credible sources. This added level of scrutiny is one of the biggest steps TikTok has taken to slow the spread of misinformation, though experts warn it might not be enough.
"We have heard a lot about fake news over the past five years, but we are entering a period where we have a world of alternative facts where people are only learning the part of the story that supports their political partisanship," Andrew Selepak, a social media professor at the University of Florida, told Lifewire via email. For Your Eyes Only With the old system, videos marked as containing unverified content could be ineligible to appear in the For You page—TikTok’s unending video feed that users can scroll through to find new content. Now, TikTok also will include a banner on the videos, as well as a warning whenever users try to share them. "We love that our community’s creativity encourages people to share TikTok videos with others who might enjoy them," Gina Hernandez, product manager for trust and safety at TikTok, wrote in the announcement. "We’ve designed this feature to help our users be mindful about what they share." "In a world of alternative facts, who decides what is credible...?" With TikTok estimated to have almost 700 million monthly active users, though, just how effective could this feature be? Hernandez revealed in the original announcement that testing of the feature had seen a 24% decrease in the rate at which videos were shared with the warning in place, while videos containing the banner label about unverified information saw a 7% decrease in likes. No information was given on the length of the testing phase, or how many participants were included. Twitter introduced a similar feature in October 2020, forcing users to add their own commentary to any tweets they tried to share to their followers. This system was reverted in December 2020, however, with Twitter citing a 20% decrease in sharing through both retweets and quote tweets. Stuck In a Loop The reason warning labels and messages to look for credible sources won’t be enough, Selepak warned, is because different people often find credibility in sources they already know and trust. 
TikTok might label a video as misleading or unverified, but for some viewers, the person who created it is someone they regularly get information from, making them more likely to share the video without looking into it any further.

"In a world of alternative facts, who decides what is credible when users are inclined to only believe what they want to believe and follow accounts and users who are in line with their beliefs?" Selepak asked.

TikTok

This essentially creates a loop, or echo chamber: content is seen by users who believe it, who then share it with others, and the problem continues to grow instead of shrinking. Sure, some users will see the warning and decide not to share the video, but those who trust the person posting it are most likely going to share it anyway.

Though TikTok has partnered with fact-checkers at PolitiFact, Lead Stories, and SciVerify, the audience on the app is massive, and relying on warnings to keep people from sharing misleading information just isn't enough. That's especially true when those labels and warnings could hurt the one thing TikTok needs to survive: an active user base.

"If users start to feel they are being pushed toward sources and content that present material opposing their views, they are less likely to spend as much time on the app," Selepak said. "And as we have seen from social media for a few years now, the platforms don't really care what you look at while scrolling, as long as you keep scrolling."