Facebook-owned Instagram pledges to ban images, posts related to self-harm and suicide

The Facebook-owned app announced that it will not allow users to post images such as memes, drawings and posters that depict the themes of self-harm and suicide

Instagram announced that it will not allow users to post images such as memes, drawings and posters that depict the themes of self-harm and suicide. Pixabay

Popular photo-sharing application Instagram said that it is prohibiting images and posts related to self-harm and suicide, on the lines of Facebook, its mainstream social media platform. The Facebook-owned app announced that it will not allow users to post images such as memes, drawings and posters that depict the themes of self-harm and suicide.

In a blog post on October 27, Instagram head Adam Mosseri said that accounts sharing harmful content will also not be recommended in the photo-sharing app's discovery surfaces, such as Explore. Mosseri also noted that Instagram users sharing personal stories about suicide can raise awareness among people.

"Preventing people from sharing this type of content could not only stigmatize these types of mental health issues, but might hinder loved ones from identifying and responding to a cry for help," Mosseri wrote in the blog post.

A picture shows the Facebook logo on a beach chair at the Facebook office in Berlin, Germany, 29 August 2016 (Stefanie Loos/File Photo/Reuters)

Instagram had banned all graphic images depicting self-harm in February. It had then announced that it would prevent non-graphic content, such as images of scars, from appearing in the Search and Explore tabs and in hashtags. The app said that it will also soon remove images and posts that include 'methods and materials' related to suicide and self-harm.

The development comes after the widespread public outcry over the death of British teenager Molly Russell. The 14-year-old committed suicide after viewing harmful content on Instagram. After Molly died, her father Ian Russell found large amounts of graphic material about self-harm and suicide on her Instagram and Pinterest accounts.

Mr. Russell publicly criticized Instagram for being partly responsible for his daughter's death, following which the UK government and various organizations called on the app to make necessary changes. Recently, the app removed its function allowing users to add cosmetic surgery filters to their photos, citing mental health concerns. These filters reportedly imitate the effects of procedures like facelifts and Botox.

Instagram's decision to remove the filters came after several researchers pointed out the harmful effects of viewing such unrealistic online content. Last month, like its parent company Facebook, Instagram was also alleged to have been used in the controversial Russian meddling in the 2016 US presidential election.

In August, Instagram found itself embroiled in a major controversy after reports emerged that the app had allowed a San Francisco-based marketing company to collect information from millions of users. Even though tech giants boast of providing technology without compromising users' privacy, they often find themselves mired in dubious and questionable situations. Be it Facebook, TikTok, WhatsApp or Instagram -- none can escape this harsh reality.
