Facebook-owned Instagram is adding tools and programs aimed at reducing abusive comments on the app.
One of its new programs is geared toward filtering out bullying comments.
“They’re using artificial intelligence,” said ABC News’ technology and consumer correspondent Becky Worley. “These are algorithms trying to guess if a comment is abusive.”
Instagram also has keyword filters that block out specific words associated with abuse and a new group-blocking feature that allows users to block entire groups of people, not just individuals.
“Kindness has been in the DNA of Instagram from our earliest days, and as our community grows — now to 800 million — we are working to make Instagram the kindest and most inclusive online community,” Instagram COO Marne Levine said in a statement.
“We want Instagram to be a welcoming platform for people of all backgrounds to express themselves. One way we’re doing that is by providing tools that enable people to control their own comment space.”
Worley said the app is clearly making the changes for reasons beyond improving the safety of its users.
“They say this is about kindness,” Worley said. “If they want to differentiate themselves, they have got to keep the flame wars to a minimum.”
How to enable comment controls
To set up the new tools, Worley said, you must have an updated version of the Instagram app. Navigate to Settings and go to Comments. Turn on Enable Keyword Filters to add words that you’d like to restrict.
If your account is public, you can choose who can comment on your posts, from every user to certain groups of people, such as people you follow or your followers. Whether your account is public or private, you'll be able to block other accounts from commenting on your posts altogether.