

This week, Facebook have released their community standards report for Q1 2021. The report outlines how much inappropriate and offensive content they removed from their site in the first quarter of the year, and it is split into the types of content that are regularly removed:

  • Adult nudity & sexual activity
  • Bullying & harassment
  • Child nudity & sexual exploitation of children
  • Terrorism & organised hate
  • Fake accounts
  • Hate speech
  • Drugs & firearms
  • Spam
  • Suicide & self injury
  • Violent & graphic content

Adult Nudity & Sexual Activity

In the first quarter of 2021, there were 31.8 million pieces of inappropriate content reported and removed, a jump of 3.7 million from the previous quarter. That big jump could be attributed to older content being found and removed. Of that 31.8 million, Facebook claim to have found 98.6% of it themselves before it was flagged by other users.

Bullying & Harassment

8.8 million pieces of content deemed to be of a harassing or bullying nature were removed in Q1, up from 6.3 million the previous quarter. This is a combination of the content being more prevalent and Facebook's detection technology evolving.

Child Nudity & Sexual Exploitation of Children

5 million pieces of this content were flagged and removed by Facebook in Q1. Facebook admit they may have missed a small number of inappropriate posts due to technical issues, but are working on resolving them.

Terrorism & Organised Hate

9 million pieces of content that incited hate or proclaimed a violent mission were removed from Facebook in Q1. A staggering 99.6% of it was found by Facebook themselves before it was flagged by other users.

Fake Accounts

Facebook try to remove as many fake accounts as possible. They deem a fake account to be one created with malicious intent to violate their policies, or a personal profile created to represent a business, organisation or non-human entity, such as a pet. In the first quarter alone, Facebook removed 1.3 billion fake accounts.

Hate Speech

Hate speech is a tricky one to judge. Facebook rightly don't allow hate speech, but will sometimes allow it to be shared to raise awareness of the issue. In the first quarter, hate speech made up around 0.05% to 0.06% of Facebook's content. This means that for every 10,000 pieces of content, between 5 and 6 would be hate speech. As Facebook's stance on sharing hate speech is a little ambiguous, they have restored 408,600 pieces of content that had been removed.

Drugs & Firearms

3.2 million pieces of content related to drugs and firearms were removed by Facebook, a decrease from 4.3 million the previous quarter.

Spam

Spam is a very broad term used to describe deceptive and annoying content. Because the term is so broad, the amount of spam removed is huge: 905 million pieces of content, a similar number to last quarter.

Suicide & Self Injury

5 million pieces of content containing suicide or self-injury elements were removed in Q1. Again, Facebook are slightly ambiguous with this one, as they sometimes allow self-harm content to be shared to raise awareness of the issue.

Violent & Graphic Content

This is deemed to be content that glorifies violence or celebrates the suffering or humiliation of others on Facebook. 34.3 million pieces of content were actioned in Q1, which works out at roughly 4 in every 1,000 posts.