  • Updated: April 04, 2021

Pornhub Releases First Transparency Report, Details On Illegal Content


Pornhub has published its first-ever transparency report that sheds light on its moderation practices and on the reports it received from January to December 2020.

Pornhub removed a large amount of content and made major changes last December after the New York Times reported that its lax policy enforcement allowed it to monetize rape and child exploitation videos.

Following the report, along with bans from payment processors and the mass deletion of most of its content, Pornhub last month announced details of its improved trust and safety policies, including biometric technology to verify users who upload videos.

Pornhub removed 653,465 pieces of content that violated its guidelines. Those include videos depicting a minor and non-consensual content, such as revenge pornography and doxing attempts. It also removed videos containing animal harm, violence, and prohibited bodily fluids.


The website explained that it deals with Child Sexual Abuse Material (CSAM) through its own moderation efforts and through reports submitted by the National Center for Missing and Exploited Children.

The center submitted over 13,000 potential CSAM reports last year; 4,171 of them were unique, and the rest were duplicates.

Pornhub said it uses several detection technologies to moderate content before publishing. In 2020, it scanned all previously uploaded videos against YouTube’s CSAI Match, the video platform’s proprietary technology for identifying child abuse imagery.

It also scanned all previously submitted photos against Microsoft’s PhotoDNA, which was designed for the same purpose.

Pornhub says it will continue using both technologies to scan all videos submitted to its platform. In addition, the website uses Google’s Content Safety API, MediaWise cyber fingerprinting software, and Safeguard, its own image recognition technology meant to combat both CSAM and non-consensual videos.
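The tools named above all work on the same basic principle: compute a fingerprint of each upload and compare it against a database of fingerprints of known abusive material. The sketch below illustrates that lookup flow only; the blocklist and function names are hypothetical, and it uses SHA-256 as a stand-in, whereas real systems like PhotoDNA and CSAI Match use proprietary perceptual hashes that also catch re-encoded or slightly altered copies.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known-abusive content.
# The digest below is the well-known SHA-256 of empty input, used
# here purely as a placeholder entry.
KNOWN_ABUSE_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the content's fingerprint.

    SHA-256 only matches exact byte-for-byte copies; production
    systems use perceptual hashing to survive re-encoding.
    """
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Reject an upload whose fingerprint is on the blocklist."""
    return fingerprint(upload) in KNOWN_ABUSE_HASHES

print(should_block(b""))         # True (matches the placeholder entry)
print(should_block(b"cat clip")) # False
```

The key design point is that the platform never needs to store the offending material itself, only its fingerprints, which is why hash sets from clearinghouses like NCMEC can be shared across platforms.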
