Instagram’s New Safety Prompts Can’t Succeed on Their Own, Experts Say

Necessary, but not an end-all solution

By Joshua Hawkins, Freelance Technology Reporter
Published March 18, 2021. Updated March 18, 2021, 11:53 a.m. EDT
Fact checked by Rich Scherr

Key Takeaways

Instagram’s new safety features will limit whom adult users can message.
Younger users will be able to see whether an adult has been sending message requests to multiple teens.
Ultimately, Instagram’s new features need additional help if parents want to truly protect their teens.

(Image: Robert Alexander / Getty Images)

Instagram says its new safety prompts will make it harder for predators to contact younger users, but there are still too many loopholes for the prompts to be effective on their own.

Instagram recently introduced a suite of changes to direct messaging designed to protect the app’s younger audience. One of the biggest additions is a restriction on direct messages (DMs): adult users are now blocked from messaging users under the age of 18 who don’t follow them. While the feature looks like a good move on paper, experts say it doesn’t offer nearly enough protection to make a difference without outside help.

"Disallowing unsolicited messages from adults to children could cut down on scams, phishing, and predatory behavior targeting minors," Paul Bischoff, a privacy advocate with Comparitech, told Lifewire in an email. "However, it’s easy for Instagram users to lie about their age, and difficult for Instagram to verify a user’s age."

The Age Problem

While experts like Bischoff are happy to see Instagram working on new ways to protect users, there are still too many ways for predatory users to get around the new features. One of the defining aspects of the new safety prompts is the user’s age, and age has long been a point of contention online. Behind a screen, the anonymity of the web lets users create a profile of whoever they want to be, and many lie about their age to gain access to apps and features they wouldn’t otherwise be allowed to use.

"Instagram’s feature to not let adults message users under 18 only works if those users are being honest about their age," Annie Ray, a social media expert at Buildingstars Operations, told us via email. Ray says many younger people on the internet get used to lying about their age to access adult websites, and Instagram is no exception to the rule.

The age problem isn’t a new issue, though, and Instagram isn’t blind to it. "We want to do more to stop this from happening," Instagram writes on its website, "but verifying people's age online is complex and something many in our industry are grappling with. To address this challenge, we’re developing new artificial intelligence and machine learning technology to help us keep teens safer and apply new age-appropriate features…"

Working in Tandem

Machine learning, while effective, will still take time to perfect, and even when applied correctly, users may find ways around it if they really want to. Because of this, some experts say Instagram’s safety prompts need parents’ involvement to be effective.

"There are no foolproof solutions that will guarantee a safe, online experience for your child," Monica Eaton-Cardone, co-founder and chief operating officer of Chargebacks911, said over email. "Is it a good thing for Instagram to try to restrict adults from pestering kids? Of course. Is it anywhere close to being sufficient to stop predators completely? Of course not."

Eaton-Cardone says parents shouldn’t rely on these new safety features to keep their children safe, because there is no substitute for an involved parent. Instead, she recommends parents use the features to complement their own check-ins and questions. "Ask them if they've been getting any weird messages from strangers. Ask them if their friends are having negative experiences online," she said.

"In previous generations, parents worried about predators targeting their kids when they left the house," Eaton-Cardone explained. "Children were taught not to talk to strangers and to be wary of suspicious-looking people on the streets—but the assumption was they were safe at home. Today, because of the Internet and flaws in cybersecurity, our homes can be even more dangerous than the outside world."