More of Meta’s missteps released
• The Meta lawsuit reveals internal communications that seem to show Meta was lax about underage users because it was good for business.
• Meta has launched its own campaign urging lawmakers to regulate app store owners like Apple and Google.
• The Meta lawsuit is being brought by 33 attorneys general simultaneously.
Meta in the hotseat… again
In the latest release from the Meta lawsuit, we learned that the company purposefully engineered its platforms to addict children. It also knowingly allowed underage users to hold accounts, according to a newly unsealed legal complaint.
Read those sentences again and replace “Meta” with “Big Tobacco” or “Corporate Cocaine” and see how comfortable you feel.
If you’re of a certain generation, you’ll remember schoolmates proudly uploading party pics to a Facebook profile with a fake birthday – backdated to meet the site’s age requirements. Because that’s how hard it was to get around due diligence on that platform. As Facebook lost its sheen and Instagram became the social media platform du jour, the same trick enabled very young people to post pictures to the site.
Unsurprisingly, this get-around didn’t go unnoticed by Meta, the company behind the social media site. But why didn’t it close underage accounts?
The lawsuit was filed by the attorneys general of 33 states in late October. It alleges that the company knew – but never disclosed – that it had received millions of complaints about underage users on Instagram, yet disabled only a fraction of the offending accounts.
33 attorneys general – it’s possible they’re all wrong simultaneously, but the odds aren’t good if they’re lining up to sue you.
The volume of underage users was an “open secret” at the company, the suit alleges, citing internal company documents. One example cites an internal email thread in which employees discuss why a 12-year-old girl’s accounts weren’t deleted after her mother complained and requested they be taken down due to her child’s age.
Employees concluded “the accounts were ignored” partly because representatives from Meta “couldn’t tell for sure the user was underage.”
In 2021, Meta received over 402,000 reports of under-13 users on Instagram, but fewer than half of the reported accounts (164,000) were disabled that year. At times, the lawsuit’s complaint notes, Meta had a backlog of up to 2.5m accounts of younger children awaiting action.
The lawsuit alleges that this and other incidents violate the Children’s Online Privacy Protection Act, which requires companies to provide notice and obtain parental consent before collecting children’s data.
It also focuses on longstanding assertions that Meta knowingly created products that were addictive and harmful to children. This was brought into the spotlight by Frances Haugen, who revealed internal studies showed platforms like Instagram led children to anorexia-related content.
According to company documents that are cited by the complaint, several Meta officials acknowledged that the company designed its products to exploit youthful psychology, including a May 2020 internal presentation called Teen Fundamentals, highlighting certain vulnerabilities of the young brain that could be exploited by product development.
It discussed teen brains’ relative immaturity and their tendency to be driven by “emotion, the intrigue of novelty and reward,” and asked how these characteristics could “manifest… in product usage.”
According to a statement from Meta, the complaint misrepresents its work over the past decade to make the online experience safe for teens, pointing to “over 30 tools to support them and their parents.”
That doesn’t really make up for ignoring complaints from parents that their younger children are using the app despite being underage. Meta argues age verification is a “complex industry challenge,” though it doesn’t seem – based on internal correspondence – to be one Meta is interested in tackling.
Instead, Meta says it favors shifting the burden of policing underage usage to parents and to app stores like Google and Apple, by supporting federal legislation that would require app stores to obtain parental approval whenever under-16s download apps.
Meta lawsuit: company tries to shift the blame
On the same day that the Senate began investigating Meta’s failure to shield children using its platforms, the company began calling on US lawmakers to regulate Google and Apple’s app stores to better protect children.
A blogpost titled “Parenting in a Digital World is Hard. Congress Can Make It Easier,” written by Antigone Davis, Meta’s global head of safety, called for federal legislation mandating that app stores notify parents when a child between 13 and 16 downloads an app and solicit the parents’ approval.
Although it doesn’t directly name Apple or Google, Meta’s blogpost is unlikely to be directed anywhere other than at the two biggest smartphone app stores in the world.
Davis’s call was published the same day that the Senate judiciary committee sent a letter to Mark Zuckerberg, Meta’s CEO, requesting that he “provide documents related to senior executives’ knowledge of the mental and physical health harms associated with its platforms, including Facebook and Instagram.” The letter asks for the documents by 30 November.
In a 2019 email, one Facebook safety executive alluded to the possibility that cracking down on younger users would hurt the company’s business. A year later, the same executive expressed frustration that Facebook showed far more enthusiasm for studying underage users for business purposes than for finding ways to identify them and remove them from its platforms.
The outcome of the Meta lawsuit remains to be seen, but the added scrutiny on social media companies is bound to have a knock-on effect – though what that effect is depends largely on Meta’s willingness to be governed by data rules, and/or to spend large sums of money on fixing any systems found to be overly addictive.