Instagram said Tuesday that it will automatically hide negative comments in posts, one of several new steps the Facebook-owned social media platform is taking to reduce bullying and harassment.
Instagram has been testing the feature in recent days and said it will target comments similar to ones that users have reported as inappropriate in the past. Users must now click the “View Hidden Comments” button to reveal a comment that has been hidden.
Instagram has also tweaked its comment warning feature. After a user writes a potentially offensive comment, but before the comment is posted, a pop-up message will now appear that reads: “This may go against our guidelines.” The pop-up message notifies users that if they post a negative comment, it will likely be hidden and that Instagram may investigate whether to delete the user’s account.
“These new warnings let people take a moment to step back and reflect on their words and lay out the potential consequences should they proceed,” Instagram said in its announcement. “Since launching comment warning, we saw that reminding people of the consequences of bullying on Instagram and providing real-time feedback as they are writing the comment is the most effective way to shift behavior.”
Instagram didn’t offer specifics on what kind of language or comments would be hidden. The platform already automatically deletes posts and comments that contain pornography, praise organized crime, or threaten physical harm.
Twitter’s policy is to remove posts that “wish or hope for death, serious bodily harm or fatal disease against anyone.”
As Instagram hopes to minimize inappropriate comments, Facebook has