Carolina mom suing social media giant Meta

Updated: May 9, 2023 at 6:30 PM EDT

MONROE, N.C. (FOX Carolina) - A mom in the Carolinas is taking on one of the most powerful social media companies on the planet.

The life Rosemarie Calvoni knew with her husband and three children was shattered – and she says it’s all because of social media. She’s now on a mission to make changes.

She had no idea what could happen when she handed her daughter her first phone five years ago. The now-16-year-old used to love spending time with her friends, her twin sister and her brother.

“She was happy,” Calvoni said. “She was social outside. She was always a good kid.”

They appear to be a picture-perfect family of five, but Calvoni says the last few years have been more like hell.

When she was just 13, Calvoni’s daughter was diagnosed with anorexia, depression and anxiety. She has been in and out of the hospital, with several stays in treatment facilities. Calvoni’s husband has taken early retirement to watch their daughter 24/7.

“I’ve heard people refer to an eating disorder as a demon, because it really is. It takes a different form and changes the whole child,” she said. “If you don’t get the treatment they need, they will die. Very quickly.”

Calvoni was unsure how her fun-loving daughter had changed so quickly until she opened her daughter’s Instagram on the phone she had given her.

“It was really for safety reasons more than anything else,” she said. “Little did I know what an unsafe environment I put her in.”

She found pictures and videos on how to lose weight quickly, how to exercise secretly, and how to be skinny. Images and pressures Calvoni never imagined her young daughter would be exposed to on Instagram.

The mom of three felt lost and confused until October 2021, when she heard a whistleblower testify before Congress.

“I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” said Frances Haugen.

Haugen, a former employee of Facebook, now Meta, told lawmakers that Facebook and Instagram are designed to exploit negative emotions to keep people on the platforms.

“They take things that they’ve heard, for example, like, can you be led by the algorithms to anorexia content, and they have literally recreated that experiment themselves and confirmed, yes, this happens to people,” Haugen said.

Calvoni watched the entire testimony, and so did attorney Jessica Carroll, a mother of two with the Charleston-based law firm Motley Rice. The same firm that took on Big Tobacco is now taking on Meta.

“It was a lightbulb moment for a lot of people,” Carroll said. “Just as Rosemarie said, we had no idea the effect of social media on our young people.”

Their lawsuit says Instagram uses complex algorithms and behavioral cues to addict preteen and teenage children and maximize their use of the product. The goal is to drive further engagement, meaning more views of advertisements and more revenue.

“And the more likes they get, the more their stocks go up,” Calvoni said.

The lawsuit also says Meta designed Instagram with product features that prey upon children’s desire for validation and need for social comparison. Court documents say Facebook’s first president, Sean Parker, was once quoted as saying, “God only knows what it’s doing to our children’s brains.”

“They’ve harmed a generation of children that are going to deal with the effects of this for the rest of their lives,” Carroll said.

Calvoni is now pleading with parents, warning that Meta is targeting their children just as it targeted her daughter.

“She was a guinea pig in a new world,” Calvoni said. “I gave her the phone for one reason and had no idea that someone else was targeting her for another, because that’s what she really was: a target.”

She hopes this lawsuit forces Meta to change its algorithm and add more security features for users. She knows taking away the phone from kids is easier said than done.

“It is like a drug,” she said. “It truly is. It’s a true addiction.”

Carroll says aspects of the case have a lot of crossover with opioid litigation she has worked on previously. The law firm is not looking to take away social media, but to hold those behind the platform accountable.

“We didn’t have the awareness that we do today,” Carroll said. “These design features were concealed. We were told by these companies that they had safety measures in place and they lied. We didn’t know that it could be this harmful.”

“If it can help the next generation, then I’ve done my job as a mother,” Calvoni said.

In response to the lawsuit, Meta sent us this statement:

Meta also provided additional details on safety steps the company has taken since October 2020, including implementing age verification technology, sensitive content controls and banning content that promotes suicide, self-harm or eating disorders. Those details are provided below:

  • We use age verification technology to help teens have experiences that are appropriate for their age, including limiting the types of content they see and who can see and interact with them.
  • We automatically set teens’ accounts (under 16) to private when they join Instagram. We also don’t allow people who teens don’t follow to tag or mention them, or to include their content in Reels Remixes or Guides. These are some of the best ways to help keep young people from hearing from adults they don’t know, or that they don’t want to hear from.
  • We’ve developed technology to help prevent suspicious adults from engaging with teens. We work to avoid showing young people’s accounts in Explore, Reels or Accounts Suggested For You to these adults. If they find young people’s accounts by searching for their usernames, they won’t see an option to follow them. They also won’t be able to see comments from young people on other people’s posts, nor will they be able to leave comments on young people’s posts.
  • We limit the types of content teens can see in Explore, Search and Reels with our Sensitive Content Control. The control has only two options for teens: Standard and Less. New teens on Instagram who are under 16 years old are automatically placed into the Less state. For teens who are already on Instagram, we send prompts encouraging them to select the Less experience.
  • We don’t allow content that promotes suicide, self-harm or eating disorders. Of the content we take action on, we identify over 99% before it is reported to us.
  • We show expert-backed, in-app resources when someone searches for, or posts, content related to suicide, self-harm, eating disorders or body image issues. They see a pop-up with tips and an easy way to connect to organizations like NEDA in the US. We also have a dedicated reporting option for eating disorder content.

Meta said it has built tools to foster a positive experience for teens, like notifications about taking regular breaks from Instagram or prompts to turn on “Quiet Mode.” If a teen has been scrolling on the same topic for a while, Meta says they’ll be notified that it might be time to look at something different.

The company has also developed supervision tools and resources for parents. It said it consults regularly with youth and safety advisors on policies and app development, and works with the National Center for Missing and Exploited Children to address content that could exploit children.

Meta also provided a diagram outlining the safety steps it has taken for Instagram.