Instagram will hide comments that may be considered offensive


The company said the comments it hides will be similar to ones users have reported in the past. Instagram said it uses its existing artificial intelligence systems to identify bullying or harassing language in comments.

Instagram announced on Tuesday that it would test the feature. The day also marked the app's tenth birthday.

Users will still be able to tap “View Hidden Comments” to see those comments.

Adam Mosseri, who took over the helm of Instagram two years ago, has vowed to fight bullying on the platform. Last year, the Facebook-owned company rolled out a tool called "Restrict," which lets you restrict another user so that that person's comments on your posts are visible only to them, and not to others. Instagram has also previously added a feature that warns people when a comment they are about to post may be considered offensive, the idea being to give them a chance to pause and reflect.

Instagram said that since it began showing comment warnings, it has seen a "significant improvement" in how often people edit or remove their comments, although it did not elaborate further.

On Tuesday, Instagram also said it was adding an extra warning for people who repeatedly post potentially offensive comments. The notification will ask them to go back and reconsider the comment, warning that they otherwise risk consequences such as the comment being hidden or their account being deleted.

Twitter has conducted similar tests. Earlier this year, it began prompting users to reconsider a reply to a tweet before publishing it if it contained potentially harmful language.
