
Facebook’s Instagram ‘failed self-harm responsibilities’


Children’s charity the NSPCC has said a drop in Facebook’s removal of harmful content was a “significant failure in corporate responsibility”.

Facebook’s own records show its Instagram app removed almost 80% fewer graphic posts about suicide and self-harm between April and June this year than in the previous quarter.

Covid restrictions meant most of its content moderators were sent home.

Facebook said it prioritised the removal of the most harmful content.

Figures published on Thursday showed that as restrictions were lifted and moderators began returning to work, the number of removals climbed back to pre-Covid levels.

‘Not surprised’

After the death of the teenager Molly Russell, Facebook committed itself to taking down more graphic posts, pictures and even cartoons about self-harm and suicide.

But the NSPCC said the reduction in takedowns had “exposed young users to even greater risk of avoidable harm during the pandemic”.

The social network responded, saying: “Despite this decrease we prioritised and took action on the most harmful content within this category.”

Chris Gray is an ex-Facebook moderator who is now involved in a legal dispute with the company.

“I’m not surprised at all,” he told the BBC.

“You take everybody out of the office and send them home, well who’s going to do the work?”

That leaves the automatic systems in charge.

But they still miss posts, in some cases even when the creators themselves have added trigger warnings flagging that the images featured contain blood, scars and other forms of self-harm.
