Instagram apologizes over banned suicide posts seen by teen who killed herself

Teenager Molly Russell, 14, committed suicide in November 2017. An analysis of her Instagram profile revealed that, of the 16,300 posts she had saved in the six months before her death, about 2,100 were linked to depression and suicide.

On Monday (27), Elizabeth Lagone, head of health and wellbeing at Meta (the company that owns Instagram), apologized because the teenager had seen content that violated the company’s policies. “We’re sorry about that,” Lagone told the North London coroner’s court, which is investigating the circumstances of Russell’s death.

Despite the apology, Lagone did not condemn all of the depression and suicide content on Instagram. The executive said this type of publication is “generally acceptable,” as many people share their experiences with mental health problems and may seek help as a result.

According to Lagone, the company consulted several experts who said it shouldn’t remove all content related to depression and self-harm “because of the stigma and shame it can cause to people who are struggling.”

“Why the hell are you doing this? . . . You have created a platform that allows people to post potentially harmful content on it [and] you are inviting children onto the platform. You don’t know where the balance of risk lies,” said Oliver Sanders, the Russell family’s lawyer.

Ian Russell, Molly’s father, said he believes Instagram’s algorithms pushed his daughter toward explicit and disturbing posts, which ultimately contributed to her suicide.

Policy change on harmful posts

It is worth mentioning that in 2017, when the young woman committed suicide, Instagram allowed graphic posts referring to suicide and self-harm, on the grounds that they created a space for users to seek support. In 2019, however, the social network updated its policy and banned graphic images of this type.


At the time, the platform stated that “collectively, it was recommended [by mental health experts] that graphic images of self-harm, even when someone is admitting their struggles, have the potential to unintentionally promote self-harm.”

“We have never allowed content that promotes or glorifies suicide and self-harm. Since 2019, we have updated our policies, implemented new technologies to remove more of this content, shown more specialized resources when someone searches for or publishes content related to suicide or self-harm, and introduced controls designed to limit the types of content teens see,” the Meta executive said.

Teenage mental health has become a major concern for Meta, the company that owns Facebook, Instagram and WhatsApp, following complaints from former employee Frances Haugen, who pointed out that the company was aware of how toxic its social networks could be to young people and adolescents.

Additionally, Haugen’s allegations, published in The Wall Street Journal in September of last year, indicate that young people may be steered toward content that encourages self-harm.


Source: Olhar Digital
