Wednesday, 16 May 2018

Facebook’s first transparency report shows majority of offending content removed before being reported

Facebook has released its first-ever transparency report, outlining the amount of content it has identified as breaking Community Standard rules between October 2017 and March 2018. According to the data, Facebook took action against a majority of the offensive content before it was reported by users.

The report is part of Facebook’s Community Standards initiative, first announced in April. Violations have been separated into six categories: graphic violence; nudity and sexual activity; terrorist propaganda from ISIS, al-Qaeda and affiliates; hate speech; spam; and fake accounts.

Facebook says it uses a combination of machine learning automation and employees to identify content that violates its Community Standards guidelines. The company has said repeatedly that it plans to have at least 10,000 safety and security professionals on staff by the end of 2018 to work on this initiative.

Facebook’s transparency report breaks down, for each category, the number of content violations it took action against in Q4 2017 and Q1 2018 and how much of that content was identified before users reported it. It also lists the prevalence of violating content within the graphic violence and the nudity and sexual activity categories, as well as the prevalence of fake accounts. Here’s an overview of the data:

How many pieces of content or accounts did Facebook take action against?

  • Graphic violence — Q4 2017: 1.2 million | Q1 2018: 3.4 million
  • Nudity and sexual activity — Q4 2017: 21 million | Q1 2018: 21 million
  • Terrorist propaganda from ISIS, al-Qaeda and affiliates — Q4 2017: 1.1 million | Q1 2018: 1.9 million
  • Hate speech — Q4 2017: 1.6 million | Q1 2018: 2.5 million
  • Spam — Q4 2017: 727 million | Q1 2018: 837 million
  • Fake accounts — Q4 2017: 694 million | Q1 2018: 583 million

Amount identified before users reported content or accounts

  • Graphic violence — Q4 2017: 72% | Q1 2018: 86%
  • Nudity and sexual activity — Q4 2017: 94% | Q1 2018: 96%
  • Terrorist propaganda from ISIS, al-Qaeda and affiliates — Q4 2017: 97% | Q1 2018: 99.5%
  • Hate speech — Q4 2017: 24% | Q1 2018: 38%
  • Spam — Q4 2017: 100% | Q1 2018: 100%
  • Fake accounts — Q4 2017: 98.5% | Q1 2018: 99.1%

Prevalence of content in violation of Facebook’s Community Standards

  • Graphic violence — Q4 2017: 0.16% to 0.19% | Q1 2018: 0.22% to 0.27%
  • Nudity and sexual activity — Q4 2017: 0.06% to 0.08% | Q1 2018: 0.07% to 0.09%
  • Terrorist propaganda from ISIS, al-Qaeda and affiliates — Data unavailable
  • Hate speech — Data unavailable
  • Spam — Data unavailable
  • Fake accounts — Facebook estimates that fake accounts represented approximately 3 to 4 percent of monthly active users (MAU) on Facebook during Q1 2018 and Q4 2017.

In every category except one — hate speech — Facebook took action against most of the offending content before users reported it, and in most categories more than 90 percent was removed proactively. Hate speech was the clear outlier: in both quarters, less than 40 percent of the content identified as hate speech (38 percent in Q1 2018 and 24 percent in Q4 2017) was actioned before it was reported. That means well over half of the hate speech violations identified on the platform were surfaced by user reports rather than by Facebook’s own systems.
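To see what those percentages mean in absolute terms, the totals and proactive-detection rates above can be combined with a bit of arithmetic. The sketch below is purely illustrative — the figures are taken from the report, but the category labels and the split into proactive versus user-reported counts are an inferred calculation, not numbers Facebook published directly:

```python
# Illustrative arithmetic: estimate how many actioned items in each
# category were flagged proactively vs. surfaced by user reports,
# using Facebook's Q1 2018 figures from the transparency report.
actioned_q1 = {            # items actioned in Q1 2018 (from the report)
    "graphic violence": 3_400_000,
    "hate speech": 2_500_000,
    "terrorist propaganda": 1_900_000,
}
proactive_rate_q1 = {      # share found before a user report, Q1 2018
    "graphic violence": 0.86,
    "hate speech": 0.38,
    "terrorist propaganda": 0.995,
}

for category, total in actioned_q1.items():
    proactive = total * proactive_rate_q1[category]
    user_reported = total - proactive
    print(f"{category}: ~{proactive:,.0f} proactive, "
          f"~{user_reported:,.0f} user-reported")
```

On these figures, roughly 1.55 million of the 2.5 million hate-speech actions in Q1 2018 would trace back to user reports — an order of magnitude more than in any other non-spam category.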

Facebook notes early in the report that it is still refining the internal methodologies it uses to measure these efforts and expects the numbers to become more precise over time.


About The Author

Amy Gesenhues is Third Door Media’s General Assignment Reporter, covering the latest news and updates for Marketing Land and Search Engine Land. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs.com, SoftwareCEO.com, and Sales and Marketing Management Magazine. Read more of Amy’s articles.

This marketing news is not the copyright of Scott.Services – please click here to see the original source of this article. Author: Amy Gesenhues





source https://news.scott.services/facebooks-first-transparency-report-shows-majority-of-offending-content-removed-before-being-reported/
