TikTok deleted 49 million ‘rule-breaking’ videos

Image copyright: EPA

TikTok says it deleted more than 49 million videos that broke its rules between July and December 2019.

About a quarter of those videos were deleted for containing adult nudity or sexual activity, the company said in its most recent transparency report.

The video-sharing app also revealed it had received about 500 requests for data from governments and police, and had complied with about 480 of them.

The US has said it is "looking at" whether to ban the Chinese-owned app.

On Monday, US Secretary of State Mike Pompeo suggested that downloading TikTok would put citizens' "private information in the hands of the Chinese Communist Party".

He added that the US government was considering whether to ban Chinese-owned apps: "We're taking this very seriously. We're certainly looking at it," he said in a Fox News interview.

The government in India has already banned the app, citing cyber-security concerns.

TikTok is owned by the Chinese firm ByteDance. The app is not available in China, but ByteDance operates a similar app there, called Douyin.

TikTok said it had not received any government or police data requests from China, or any requests from the Chinese authorities to delete content.

On Thursday, the Wall Street Journal published a report suggesting the company was considering establishing a new headquarters outside China.

TikTok told the BBC in a statement: "As we consider the best path forward, ByteDance is evaluating changes to the corporate structure of its TikTok business. We remain fully committed to protecting our users' privacy and security as we build a platform that inspires creativity and brings joy for hundreds of millions of people around the world."

Privacy features

US authorities are examining whether TikTok complied with a 2019 settlement aimed at protecting the privacy of under-13s.

The app says it offers a limited app experience, with additional security and privacy features, for under-13s.

According to TikTok's transparency report:

  • 25.5% of the deleted videos contained adult nudity or sexual acts
  • 24.8% broke its child-safety policies, such as depicting a minor in criminal activity or containing dangerous imitative behaviour
  • 21.5% showed illegal activities or "regulated goods"
  • 3% were removed for harassment or bullying
  • Less than 1% were removed for hate speech or "inauthentic behaviour"

TikTok's transparency report also revealed:

  • The 49 million deleted videos represented less than 1% of videos uploaded between July and December 2019
  • 98.2% of the deleted videos were caught by machine learning or moderators before being reported by users

TikTok was only released in 2017, and because it is so new, we know much less about the platform than we do about Facebook, for example.

This report gives at least a little detail about the kind of content it takes down.

There has been a lot of focus recently on hate and extremism on platforms such as TikTok, but fewer column inches about sexual content or the safety of minors.

Yet around half of the videos taken down fell into those two categories.

What we don't know, of course, is how much harmful content has been missed by its moderators and machines.
