Why Social Media Can't Always Shelter Us From Offensive Content

Who decides what’s offensive?

  • Instagram just introduced a Sensitive Content Control feature to let users decide what they prefer to see on the platform. 
  • Social media platforms all have some content controls and policies to limit content they deem harmful and offensive. 
  • Experts say every user has a different tolerance for what’s considered offensive, and that the simplest way to control your content is to be deliberate about what you engage with, since the algorithm recommends more of it.

Instagram introduced a Sensitive Content Control feature earlier this week, but content control policies like this one tend to fall short across social media networks. 

The platform’s new feature lets you choose to allow sensitive content, limit it, or tighten controls even further so you see even less “harmful or sensitive” content in your feed. All social media sites have some content policy in place, but experts say these policies ultimately won’t, and shouldn’t, shelter everyone from everything. 

“As far as the social media sites themselves, how 'well' they police that fringe content boils down to their own business goals and what metrics they see—in other words, who makes up the bulk of their community members,” Mary Brown, the marketing and social media director at Merchant Maverick, told Lifewire in an email. 

Defining Harmful Content 

Harmful content controls are nothing new to social media—almost every platform has a policy to limit certain types of sensitive or harmful content. Twitter's policy automatically removes tweets containing abusive content meant to harass or intimidate someone. The platform also updated its rules against hateful content in 2019 to include any tweets that dehumanized people based on religion. 

Facebook also has content moderation practices in place. For example, the social network does not allow self-harm images or content that glorifies eating disorders. The platform also has cracked down on allowing sensational health claims into people’s feeds, such as exaggerated or misleading health claims about vaccines. 


But experts say these policies leave users with more questions than answers, since each platform defines “harmful content” differently.  

“Who determines what is offensive? Will users have to select from a list of topics they find offensive? Will Facebook and Instagram decide what is offensive? How will offensive even be defined?” Andrew Selepak, a social media professor at the University of Florida, said to Lifewire in an email. 

Instagram defines sensitive content as "posts that don’t necessarily break our rules, but could potentially be upsetting to some people—such as posts that may be sexually suggestive or violent.” 

Brown added that platforms couldn't possibly shelter everyone from this type of content, since tolerance and preferences vary from person to person. 

“Every single person has a different tolerance level, different attitudes, different tastes,” she said. “Every individual that downloads or uses a social media site has inherently accepted that he or she might stumble upon content that is on the fringes of that app's acceptable content guidelines or community standards.”

Many social media users have also criticized Instagram’s new feature, saying it could prevent content from activists and artists, such as posts about controversial topics or art containing nudity, from reaching an audience.

Controlling Content

Brown notes that it’s a missed opportunity that Instagram’s new feature is hard to find within the app, making it that much more difficult for people to control the content they’re comfortable with—whether they want to see more or less “sensitive” content. 

“If it were a feature Instagram wanted to better highlight, the option could be built into the same interface on posts or reels where you can click ‘Report.’ That would be a more effective way of introducing this particular sensitivity control to people who are likely already using that function,” she said. 

"Extra features are great, but in the end, the algorithm is looking at what we're engaging with to determine what to recommend next."

Instagram's feature theoretically puts you in control of what you see rather than implementing a blanket policy on content like many other platforms. But ultimately, social media users can decide what they want to see on their feeds even without these platform-made policies. 

"Extra features are great, but in the end, the algorithm is looking at what we're engaging with to determine what to recommend next," wrote Eric Chow, chief consultant at Mashman Ventures, to Lifewire in an email. 

Chow added that something as simple as telling the platform you don’t want to see a type of content (a feature many platforms offer) is the most effective way for users to take control.

“Users need to take responsibility and be aware of how they engage with their content—the more we like, comment, share, and save content on a particular subject matter, the more we'll be presented with it,” he said. 
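The principle Chow describes—the more you engage with a topic, the more of it you're shown—can be sketched as a toy model. This is not Instagram's actual recommendation system (which is proprietary and far more complex); it is only an illustrative example in which past engagements are tallied per topic and candidate posts are ranked by those tallies. All names and data here are hypothetical.

```python
from collections import Counter

def rank_posts(engagements, candidate_posts):
    """Rank candidate posts by how often the user has engaged
    (liked, commented, shared, saved) with each topic.
    A toy sketch of engagement-driven recommendation, not any
    platform's real algorithm."""
    topic_weights = Counter(engagements)  # e.g. {"fitness": 3, "art": 1}
    # Topics the user never engaged with score 0 (Counter's default),
    # so those posts sink to the bottom of the feed.
    return sorted(candidate_posts,
                  key=lambda post: topic_weights[post["topic"]],
                  reverse=True)

# Hypothetical history: three fitness engagements, one art engagement.
engagements = ["fitness", "fitness", "fitness", "art"]
posts = [{"id": 1, "topic": "art"},
         {"id": 2, "topic": "fitness"},
         {"id": 3, "topic": "news"}]

ranked = rank_posts(engagements, posts)
# The fitness post ranks first; the never-engaged "news" post ranks last.
```

In this simplified model, the only lever the user has is their own engagement history—which is exactly Chow's point about taking responsibility for what you like, comment on, share, and save.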
