
Facebook blames coronavirus for hampering efforts to remove suicide posts from its platforms

The social network also said it took action on fewer pieces of child nudity and sexual exploitation content on Instagram

FACEBOOK HAS BLAMED the coronavirus for hampering efforts to remove suicide and self-injury posts from its platforms.

The social network revealed it took action on significantly less of this material between April and June because fewer reviewers were working as the pandemic struck.

Facebook sent its moderators home in March to prevent the spread of the virus, but boss Mark Zuckerberg warned that enforcement requiring human intervention could be hit.

The firm says it has since brought “many reviewers back online from home” and, where it is safe, a “smaller number into the office”.

Facebook’s latest community standards report shows that it took action on 911,000 pieces of content related to suicide and self-injury within the three-month period, down from 1.7 million pieces in the previous quarter.

Meanwhile on Instagram, steps were taken against 275,000 posts compared with 1.3 million before.

Action on media featuring child nudity and sexual exploitation also fell on Instagram, from one million posts to 479,400.

Facebook estimates that less than 0.05% of views were of content that violated its standards against suicide and self-injury.

‘The most harmful content’

“Today’s report shows the impact of Covid-19 on our content moderation and demonstrates that, while our technology for identifying and removing violating content is improving, there will continue to be areas where we rely on people to both review content and train our technology,” the company said.

“With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram.

“Despite these decreases, we prioritised and took action on the most harmful content within these categories.

“Our focus remains on finding and removing this content while increasing reviewer capacity as quickly and as safely as possible.”

The tech giant’s sixth report does suggest the automated technology is working to remove other violating posts, such as hate speech, where action rose from 9.6 million pieces on Facebook in the previous quarter to 22.5 million now.

Much of that material, 94.5%, was detected by artificial intelligence before a user had a chance to report it.

Proactive detection for hate speech on Instagram increased from 45% to 84%.

The data also suggests improvements on terrorism content, with action against 8.7 million pieces on Facebook this time compared with 6.3 million before – only 0.4% of this was reported by a user, while the vast bulk was picked up and removed automatically by the firm’s detection systems.

Hate speech

“We’ve made progress in combating hate on our apps, but we know we have more to do to ensure everyone feels comfortable using our services,” Facebook said.

“That’s why we’ve established new inclusive teams and task forces, including the Instagram Equity Team and the Facebook Inclusive Product Council, to help us build products that are deliberately fair and inclusive, and why we are launching a Diversity Advisory Council that will provide input based on lived experience on a variety of topics and issues.

“We’re also updating our policies to more specifically account for certain kinds of implicit hate speech, such as content depicting blackface, or stereotypes about Jewish people controlling the world.”

Children’s charity the NSPCC said Facebook’s “inability to act against harmful content on their platforms is inexcusable”.

“The crisis has exposed how tech firms are unwilling to prioritise the safety of children and instead respond to harm after it’s happened rather than design basic safety features into their sites to prevent it in the first place,” Dr Martha Kirby, child safety online policy manager at the NSPCC said.

“This is exactly why Government needs to urgently publish an Online Harms Bill that holds Silicon Valley directors criminally and financially accountable to UK law if they continue to put children at risk.”

Facebook has also revealed it removed more than seven million pieces of harmful coronavirus misinformation from Facebook and Instagram over the same period.

Need help? Support is available:

  • Aware – 1800 80 48 48 (depression, anxiety)
  • Samaritans – 116 123 or email jo@samaritans.ie
  • Pieta House – 1800 247 247 or email mary@pieta.ie (suicide, self-harm)
  • Teen-Line Ireland – 1800 833 634 (for ages 13 to 18)
  • Childline – 1800 66 66 66 (for under 18s)
12 Comments
    OConnelj (Aug 11th 2020, 6:43 PM)

    Can their moderators not work from home?

    PeterC (Aug 11th 2020, 6:51 PM)

    @OConnelj: I’d imagine a job like that wouldn’t be something you could do from home. Moderators most likely need ongoing professional support; the toll on a person’s mental health in a job like that must be significant.

    Da_Dell (Aug 11th 2020, 7:14 PM)

    @PeterC: yes, but that’s if you believe the moderation and/or moderators are all above board, so to speak. Maybe look up what they have done with the French Government, or look up how Google algorithms are ‘dealing’ with the likes of fake news; there’s some very interesting whistle-blowing going on out there.

    Tony Garcia (Aug 11th 2020, 7:04 PM)

    Does your hate speech code include footage of BLM/Antifa “peaceful protests” which the media would rather not show?

    Ann Neylan (Aug 11th 2020, 7:04 PM)

    My beautiful niece volunteers for the crisis line. Awfully disheartened due to Covid-19.

    Joe_X (Aug 11th 2020, 8:22 PM)

    @Ann Neylan: tell her thanks and to keep up the good work.

    Patricia O'Reilly (Aug 11th 2020, 6:57 PM)

    In their defence, that must be like looking for a needle in a haystack, watching all that stuff... millions of posts 24 hours a day.

    Gavin Lynam (Aug 11th 2020, 7:16 PM)

    @Patricia O’Reilly: there are ways to filter those posts out; Facebook have a history of not giving a toss, and there are plenty of scams on Facebook too.

    Michael Hersman (Aug 11th 2020, 7:51 PM)

    Strange that they can’t deal with content such as child abuse and suicide, but they found the time to block all pages relating to Cheech and Chong.

    Joan Featherstone (Aug 11th 2020, 9:14 PM)

    Dumped FB about two weeks ago, was sick of the utter rubbish on it, don’t miss it at all, screen time down 16%, obviously was spending far too much time looking at BS…Journal might be next, with their ‘this may be perceived as toxic’, I mean ????

    Shawn O'Ceallaghan (Aug 11th 2020, 7:59 PM)

    Why is it their mods’ job to do this?
