TikTok Algorithm Steers Violent Videos to More Minority Than White Users: Lawsuit




TikTok faces a claim that its algorithm steers violent videos to minority users at higher rates than to White users, in a lawsuit blaming the platform for the death of a 14-year-old African-American girl.

The complaint, which also names Meta Platforms Inc., Snap Inc., and TikTok parent company ByteDance Ltd. as defendants, is among a stream of lawsuits that attempt to hold social media companies accountable for teens getting addicted to their platforms.

Parents of Englyn Roberts, who died in September 2020 about two weeks after she tried to take her own life, allege that TikTok is aware of biases in its algorithm relating to race and socio-economic status. Roberts would not have seen, and become addicted to, the harmful content that contributed to her death if not for TikTok’s programming, according to the complaint filed Wednesday in San Francisco federal court.

“TikTok’s social media product did direct and promote harmful and violent content in greater numbers to Englyn Roberts than what they promoted and amplified to other, Caucasian users of similar age, gender, and state of residence,” the parents alleged.

The complaint was filed by Social Media Victims Law Center, a Seattle-based advocacy group.

Representatives of TikTok, Meta and Snap didn’t immediately respond to requests for comment.

The case is Roberts v. Meta Platforms, Inc., 22-cv-04210, US District Court, Northern District of California.

Photograph: The logo for ByteDance Ltd.’s TikTok app is arranged for a photograph on a smartphone in Sydney, New South Wales, Australia, on Monday, Sept. 14, 2020. Photo credit: Brent Lewin/Bloomberg.

Copyright 2022 Bloomberg.
