Instagram has decided to remove all photographs and videos that explicitly depict self-harm from the platform. The decision comes after the father of a 14-year-old girl who committed suicide in 2017 publicly claimed that Instagram "helped kill" his daughter.
"Instagram will remove all explicit images of self-harm but leave up those that try to shed light on this taboo subject from a positive point of view."
The girl, Molly Russell, had reportedly been viewing graphic images of self-harm on the social network before her death. Adam Mosseri, the head of Instagram, responded to the accusations by stating that the social network was "not where it was needed in relation to self-harm and suicide."
In an interview with the BBC, Mosseri said that Instagram will remove all images depicting this kind of harmful behavior as soon as possible. For now, the platform has to rely on users to report such content, but it aims to build an automatic filter that detects photos and videos of self-harm so they can be removed quickly.
Until now, this type of content was allowed on Instagram on the grounds that many people publish it to tell their own story. From now on, explicit images will be prohibited, with one exception: those that approach the subject from a positive point of view will still be allowed.
Even so, images related to less explicit self-harming behavior (those used to tell a personal story, for example) will be made harder to find through hashtags and regular search, in order to limit the exposure of more vulnerable users.