Facebook’s Instagram ‘failed self-harm responsibilities’

Children’s charity the NSPCC has said a drop in Facebook’s removal of harmful content was a “significant failure in corporate responsibility”.

Facebook’s own records show its Instagram app removed almost 80% less graphic content about suicide and self-harm between April and June this year than in the previous quarter.

Covid restrictions meant most of its content moderators were sent home.

Facebook said it prioritised the removal of the most harmful content.

Figures published on Thursday showed that as restrictions were lifted and moderators started to go back to work, the number of removals went back up to pre-Covid levels.

‘Not surprised’

After the death of the teenager Molly Russell, Facebook committed itself to taking down more graphic posts, pictures and even cartoons about self-harm and suicide.

But the NSPCC said the reduction in takedowns had “exposed young users to even greater risk of avoidable harm during the pandemic”.

The social network has responded by saying “despite this decrease we prioritised and took action on the most harmful content within this category”.

Chris Gray is an ex-Facebook moderator who is now involved in a legal dispute with the company.

“I’m not surprised at all,” he told the BBC.

“You take everybody out of the office and send them home, well who’s going to do the work?”

That leaves the automatic systems in charge.

But they still miss posts, in some cases even when the creators themselves have added trigger warnings flagging that the images featured contain blood, scars and other forms of self-harm.

Source: BBC
