
Instagram Head Admits Platform Has a Problem Policing Self-Harm Posts


The head of Instagram has admitted the company has much more to do to tackle self-harm and suicide content. Adam Mosseri said Instagram is running a full review of its policies on how it handles such content, which often appears in users’ feeds.

The company will also add “sensitivity screens” that warn users about what they are about to see and ask them to confirm they want to view such photos – at the moment, people can simply stumble across them in their Instagram feed. That’s part of a broader plan to make these posts harder to find. Writing in the Daily Telegraph, Mosseri said the recent case of 14-year-old Molly Russell, whose father said she took her own life after viewing self-harm posts, had left him “deeply moved.”

“We need to do everything we can to keep the most vulnerable people who use our platform safe. To be very clear, we do not allow posts that promote or encourage suicide or self-harm,” he said.

“We rely heavily on our community to report this content and remove it as soon as it’s found. The bottom line is we do not yet find enough of these images before other people see them.”

His comments come as social media and technology firms face growing scrutiny over their practices.

Health Secretary Matt Hancock said last week that legislation might be needed to police harmful content on social media, and separate reports by the House of Commons Science and Technology Committee and the Children’s Commissioner for England called on social media companies to take more responsibility for the content on their platforms.

What is Instagram going to do about it?

Adam Mosseri said Instagram was investing in technology to recognize sensitive images better and would also begin applying sensitivity screens which hide pictures from view until people actively choose to look at them.

“Starting this week we will be applying sensitivity screens to all content we review that contains cutting, as we still allow people to share that they are struggling even though that content no longer shows up in search, hashtags or account recommendations. These images will not be immediately visible, which will make it more difficult for people to see them,” he said.

“We want to better support people who post images indicating they might be struggling with self-harm or suicide. We already offer resources to people who search for hashtags, but we are working on more ways to help, such as connecting them with organizations we work with like Papyrus and Samaritans. We have worked with external experts for years to develop and refine our policies. One important piece of advice is that creating safe spaces for young people to talk about their mental health online is essential. Young people have also told us that this is important and that when the space is safe, the therapeutic benefits are positive.”

He said the site didn’t want to “stigmatize mental health” by deleting pictures that reflect the problems people are struggling with, but it would stop recommending them in searches, via hashtags, or on the Explore tab.

“Suicide and self-harm are deeply complex and challenging issues that raise difficult questions for experts, governments, and platforms like ours,” Adam Mosseri wrote.

“How do we balance supporting people seeking help and protecting the wider community? Do we allow people to post content they say helps them, or remove it in case others find it? This week we are meeting experts and academics, including Samaritans, Papyrus, and Save.org, to talk through how we answer these questions. We are committed to publicly sharing what we learn. We deeply want to get this right, and we will do everything we can to make that happen.”


That’s all, guys.

What do you think about this new policy? Tell us in the comments below.

