Meta Platforms’ immunity claim in social media addiction cases hits its first state supreme court this week, as Massachusetts justices consider whether the social media industry’s bedrock defense applies to Instagram’s design.
In oral argument Friday, the Massachusetts Supreme Judicial Court is set to sort out competing views on the reach of a litigation shield Congress created for internet companies in Section 230 of the Communications Decency Act, providing a crucial appellate review of the law’s application to social media addiction claims that threaten tens of billions of dollars in damages to Meta alone.
A ruling shrinking the reach of Section 230—often referred to as the 26 words that created the internet—could ripple through the social media industry, raising concerns about liability and fundamentally changing the internet as we know it, said Eric Goldman, who co-directs Santa Clara University’s High Tech Law Institute.
“If the plaintiffs make any progress, they’re going to structurally change the internet and almost certainly shrink it,” said Goldman, who backs a broad interpretation of Section 230 and calls the cottage industry of social media addiction litigation one of “several existential threats to the internet.”
The case, brought by Massachusetts Attorney General Andrea Campbell (D), says Meta purposefully designed Instagram to addict young users and deceived the public about the dangers posed to young people by overusing the product. Addictive features targeted in the complaint include incessant notifications and alerts, infinite scroll and autoplay elements, ephemeral posts, and intermittent variable rewards.
The oral argument comes weeks before the US Court of Appeals for the Ninth Circuit is scheduled to hear from Meta and others in their challenge to an Oakland federal judge’s ruling advancing portions of a massive multidistrict youth addiction case.
Judges so far haven’t treated Section 230 as the case-killer Meta says it is in the youth addiction cases, and many appeals courts have refused Meta’s requests to immediately appeal those rulings.
But while some courts have found the law to bar portions of cases, experts say the decisions are all over the map.
“Every case is a bit of a test case, and different courts and different circuits are deciding things differently,” said Elettra Bietti of Northeastern University School of Law, who signed onto a brief supporting a limited view of Section 230.
Representatives for Meta and Campbell didn’t respond to requests for comment.
Moderators’ Dilemma
Section 230 effectively immunizes website operators from liability for third-party content posted on their websites and has been understood to extend to claims related to “traditional editorial functions.”
Meta sees the definition as encompassing nearly all the platform’s activity, telling the Massachusetts justices in a brief that all Instagram content is from users and that without it, “no one would use it, nor could possibly claim that they were addicted to the service.” The Facebook parent also says the claims against its design of Instagram are barred by the First Amendment.
The trial court concluded the heart of the case had nothing to do with Meta editing, removing, or monitoring content (typical actions protected under Section 230), and that the company instead is being targeted for its own conduct. A Nevada Supreme Court ruling last month in a case involving ByteDance’s TikTok held the company’s platform safety statements and its feature design could be challenged.
A California federal court overseeing thousands of social media addiction cases in multidistrict litigation adopted a “nuanced” approach that concluded certain allegations and features qualified for immunity, like publishing geolocation information for minors, while others, like offering appearance-altering filters, didn’t.
“It should be straightforward but it’s not,” said the Electronic Frontier Foundation’s Sophia Cope, who argues Section 230 should cover all design features analogous to actions a newspaper publisher would take.
Bietti and other groups backing Campbell’s position urged the court in an amicus brief to find Section 230 applies when liability would force a company into a “moderator’s dilemma”: policing all of its content, policing none of it, or ceasing to host third-party content altogether.
Immunity Denied
Massachusetts’ high court considered a similar Section 230 case in 2021 as car-sharing company Turo invoked the provision to attempt to defeat an injunction blocking it from offering users the ability to pick up and drop off cars at Boston Logan International Airport, operated by the Massachusetts Port Authority.
The justices held Section 230 was unavailable to Turo because the injunction was based in part on the company’s own advertising of the airport as a desirable location. They also found Turo was a party to the rental transactions, not merely a disinterested host of its users’ content.
Anderson Kreiger partner David Mackey, who argued the Turo case for MassPort, said it’ll be interesting to see whether Meta’s creation and management of Instagram gets similar treatment.
Goldman said he hopes the justices are able to fix problems in the lower court’s ruling, like its brief rationale greenlighting the state’s public nuisance claim against Meta. The idea of an online public nuisance is a “super interesting theoretical question” that has been discussed for decades, he said, but the court approved the state’s theory without a thorough engagement with that deeper discussion.
“That one line changed how I teach internet law,” he said.
The case is Commonwealth v. Meta Platforms, Inc., Mass., SJC-13747.