Instagram will ask users to think twice before publishing content that could be considered offensive. The social network is rolling out a new feature that uses artificial intelligence to warn users about potentially offensive comments.
So, for example, if someone writes "you are ugly and stupid," the system can respond with a warning that asks "Are you sure you want to publish this?" along with a link to a message in which Instagram explains: "We are asking people to reconsider comments that resemble others that have been reported."
"This action gives people the opportunity to reflect and undo your comments and prevent the recipient from receiving notification of harmful comments ", explains Adam Mosseri, Instagram CEO, in statements collected in New Scientist.
Mosseri acknowledges that bullying is "a complex problem" and says Instagram has used artificial intelligence for years to detect this and other types of harmful content.
The network stepped up its efforts against bullying after cases such as the death of British teenager Molly Russell. After her suicide in November 2017, her father, Ian Russell, said he believed Instagram was partly responsible for her death.
At a conference, Ian Russell said: "It is important to recognize that technology companies do some things well, but unfortunately their platforms are being used by people to do harm, and they have not done enough to prevent it. Unless something changes, their platforms will become toxic," he warned.
Another tool Instagram may launch, Restrict, is designed to help filter abusive comments without blocking other users. With this tool, restricted people will not be able to see when a user is active on Instagram or whether they have read their direct messages.
"We have heard from young people in our community who are reluctant to block, stop following or denounce their aggressor because it could aggravate the situation, especially if they interact with their aggressor in real life," says Mosseri, so Restrict could be of great help.