Instagram won’t recommend content that vaguely violates its Community Guidelines

Instagram says, “We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines.” That means if a post is sexually suggestive, but doesn’t depict a sex act or nudity, it could still get demoted. Similarly, if a meme doesn’t constitute hate speech or harassment, but is considered in bad taste, lewd, violent or hurtful, it could get fewer views.

Specifically, Instagram says, “this type of content may not appear for the broader community in Explore or hashtag pages,” which could severely hurt the ability of creators to gain new followers. The news came amidst a flood of “Integrity” announcements from Facebook to safeguard its family of apps revealed today at a press event at the company’s Menlo Park headquarters.

“We’ve started to use machine learning to determine if the actual media posted is eligible to be recommended to our community,” Instagram’s product lead for Discovery, Will Ruben, said. Instagram is now training its content moderators to label borderline content when they’re hunting down policy violations, and Instagram then uses those labels to train an algorithm to identify such content automatically.
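For readers curious what that labeling-then-training loop looks like in practice, here is a minimal, hypothetical sketch in Python. It is not Instagram’s actual system: the company reportedly classifies the posted media itself, while this toy version uses caption text, scikit-learn, and an invented `eligible_for_explore` helper with an assumed threshold purely to illustrate the idea of gating recommendation surfaces on a moderator-trained classifier.

```python
# Hypothetical illustration of the reported workflow: human moderators label
# examples as "borderline" or not, a classifier is trained on those labels,
# and posts scoring above a threshold are excluded from recommendation
# surfaces (Explore, hashtag pages) while staying visible to followers.
# All names, features, data, and the threshold below are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-in for moderator-labeled training data: 1 = borderline, 0 = fine.
captions = [
    "check out this graphic fight video",
    "sunset over the mountains tonight",
    "click here to win free followers now",
    "my dog learning a new trick",
]
labels = [1, 0, 1, 0]

# A simple text classifier standing in for the real media/image model.
borderline_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
borderline_model.fit(captions, labels)

def eligible_for_explore(caption: str, threshold: float = 0.5) -> bool:
    """Return False if the post scores as likely borderline.

    Borderline posts are not removed; they simply are not surfaced
    on recommendation surfaces like Explore or hashtag pages.
    """
    prob_borderline = borderline_model.predict_proba([caption])[0][1]
    return prob_borderline < threshold

print(eligible_for_explore("free followers, click now"))  # likely False
print(eligible_for_explore("beach day with friends"))     # likely True
```

With so little training data the scores hover near the threshold, but the structure mirrors the description above: labels from human review feed a model, and the model’s score decides recommendation eligibility rather than removal.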

These posts won’t be fully removed from the feed, and Instagram tells me that for now the new policy won’t affect Instagram’s feed or Stories bar. But Facebook CEO Mark Zuckerberg’s November manifesto described the need to broadly reduce the reach of this “borderline content,” which on Facebook would mean being shown lower in News Feed. That policy could easily be expanded to Instagram in the future. That would likely reduce creators’ ability to reach their existing fans, which can hurt their ability to monetize through sponsored posts or to direct traffic to other revenue channels like Patreon.

Instagram has updated its Community Guidelines to reflect the changes, saying it will limit the exposure of posts it considers inappropriate by not recommending them in the Explore or hashtag pages. 

Unfortunately, Instagram isn’t clearly defining what it deems ‘inappropriate’. According to TechCrunch, “violent, graphic/shocking, sexually suggestive, misinformation and spam content can be deemed ‘non-recommendable’.”

So, if a post is sexually suggestive, even if it doesn’t depict nudity or a sexual act, it could be demoted on the Explore page and in hashtag searches. Instagram does clarify that such posts will remain visible to an account’s followers, just not to the general public.

The news has been met with mixed reactions from content creators, many of whom depend on the Explore page and hashtag searches, both areas where platform recommendations are key, to find new followers. Some creators are understandably concerned that the changes will diminish the reach of their posts in these areas and thus affect their ability to earn revenue from sponsored posts.
