Lawsuit Claims that Instagram, Snapchat, TikTok Cause Mental Health Problems in Teens

Three new cases were filed against Meta, TikTok, and Snap, alleging that the companies encouraged underage users to develop mental illnesses. The suits join a growing number filed by parents and their children against social networking corporations, claiming that the businesses are not just hooking users but are also aware of the harm they are causing.

The lawsuits, the latest in a string of cases linking social media to mental health difficulties among kids, seek to sidestep Section 230 of the Communications Decency Act, a federal law that shields technology corporations from liability arising from third-party content. They advance the theory that social media platforms like Facebook are essentially flawed products that cause harm, including eating disorders, anxiety, and suicide. At least 20 such cases have been filed around the nation using the Facebook Papers as evidence; the Facebook Papers are a large collection of confidential company records leaked by whistleblower Frances Haugen last year.

One of the complaints, filed Thursday in Los Angeles Superior Court, says, “This is the business model used by all defendants — engagement and growth over user safety — as evidenced by the inherently dangerous design and operation of their social media products.” It continues, “At any point, any of these defendants could have come forward and shared this information with the public, but they knew it would have given their competitors an advantage and/or it would have meant massive changes to their products and trajectory. The defendants chose to continue inflicting harm and instead concealed the truth.”

The lawsuits focus on the platforms' product features. They contend that the companies' algorithms promote risky content, prioritizing engagement over user safety.

By avoiding allegations tied to the particular content the platforms host, the suits steer clear of any potential immunity under Section 230. Under that law, technology corporations have long enjoyed strong legal protection from liability for content published by third parties. In a significant ruling on the law, which may be the subject of reform, a federal appeals court held last year that Snap could not rely on Section 230 to shield itself from a lawsuit alleging that the company's design of a speedometer feature encouraged speeding and contributed to a fatal crash.

“Plaintiff’s allegations arise not from third party content, but rather from Defendants’ product features and designs, including but not limited to algorithms and other product features that addict underage users, enhance and promote harmful social comparison, [and] selectively select and promote harmful content to vulnerable users based on their individualized demographics and social media activity,” the complaint said.

According to the claims, the platforms' absence of parental controls is a feature rather than a defect. TikTok requires users to be at least 13 years old to sign up. However, the lawsuit says the company reported in 2020 that almost a third of its 49 million daily users were 14 or younger. The claimants assert that the platforms purposefully disregard, or fail to validate, the legitimacy of email accounts.

Another claim is that the platforms allow minors to open multiple accounts even though they know doing so violates their terms of service. According to the lawsuit, Snapchat's refusal to enforce its single-account restriction has resulted in bullying among users.

“Each of Defendant’s products are designed in a manner intended to prevent parents from exercising their right to protect and monitor the health and well-being of their child,” the complaint said. “Defendants’ products are intended to enable children to evade parental controls.”

The complaints allege strict liability, negligence, unjust enrichment, and invasion of privacy. One of the plaintiffs, the mother of a child who died by suicide, is also suing TikTok for intentional infliction of emotional distress.

The lawsuits were filed by the Social Media Victims Law Center, which represents numerous other plaintiffs in identical lawsuits across the country.

Meta, TikTok, and Snap did not immediately respond to requests for comment.
