Senate makes last stand for child safety
US politicians hammered social media CEOs about supporting their bills. But what is actually in them?
Last week’s public flagellation of social media CEOs in the US Senate provided plenty of red-meat soundbites for the political factions present, though little in the way of useful debate on how to make social platforms actually safer for kids and teens. This is a shame, if not unexpected.
Hearing the senators bemoan the lack of progress in passing the 7(!) bills their committee has voted on in recent years certainly felt a bit rich, given the extent to which the members and their colleagues have been responsible for the obstructionist tactics that prevent laws from being passed.
The tone was set right from the start by Judiciary Committee chair Sen. Dick Durbin (D-IL), who introduced the panelists by pointing out that only Mark Zuckerberg of Meta and Shou Chew of TikTok appeared voluntarily, the others appearing pursuant to subpoenas, and Discord CEO Jason Citron only “after US Marshals were sent to Discord’s headquarters at taxpayers’ expense.” Ouch.
What followed was a relentless effort by each Senator to shame the CEOs into apologising to the victims of social media and to demand unconditional support for their favourite legislation. Predictable but depressing sidetracks included the Republicans’ McCarthy-style attempts to get TikTok’s Singaporean CEO to admit he answers to the Chinese Communist Party. My favourite soundbite was Linda Yaccarino’s repeated reference to X as a “14 month old company”, which she says has “reprioritized child protection and safety measures.”
The most interesting, actually useful, suggestion to emerge amid the shouting was Zuck’s idea (included in Meta’s legislative proposal) of implementing parental consent flows for under-16s at the device or app store level:
I don't think that parents should have to upload an ID or prove that they're the parent of a child in every single app that their children use.1 I think the right place to do this […] is within the app stores themselves. My understanding is Apple […] already requires parental consent when a child does a payment within an app. So it should be pretty trivial to pass a law that requires them to make it so that parents have control.
For a run-down of the (many) cringe-inducing moments of the hearing, I recommend Silicon Republic’s breakdown and TechPolicy’s analysis. A full transcript can be found here.
But given how repetitive the Senators were in promoting their own bills and insinuating that the only thing stopping them is the tech companies’ expensive lobbying against them, I was left with the question:
What’s in these bills anyway?
And could they come to a floor vote now that the Judiciary Committee has whipped up some public (supposedly bipartisan) emotion behind them?
The primary lever in these legislative efforts is to increase the platforms’ exposure to liability for harms kids and teens experience. It’s why the Senators filled the room with families of victims (suicides, drug overdoses) holding photos of loved ones whose loss they blame on social media. This approach represents, in many ways, a capitulation of the legislative branch: it says—we can’t come up with or pass regulation, so the next best thing is to allow families to sue the platforms and hope they will come up with solutions under pressure from the courts.
Let’s take a closer look at these bills, 5 of which have been passed out of the committee with a unanimous vote (which the Senators kept reminding us is astonishing in its own right2):
KOSA—the Kids Online Safety Act—has been kicking around for 3 years. It would require operators to apply a ‘duty of care’ to prevent harms coming to kids and teens, both in terms of content and conduct. It requires the most privacy-protective settings by default, more parental oversight, research into new approaches to age verification3, and would allow young people to opt out of design features that are blamed for social media addiction, such as algorithmic feeds, notifications, and autoplay. KOSA has faced vigorous opposition from civil rights and free speech groups4, who fear that the vague definition of ‘harms’ could allow state attorneys general (who would be charged with enforcing it) to cherry-pick content they deem harmful, leading to excessive and unevenly applied censorship5.
That said, last week Snap broke ranks with its own trade group NetChoice by backing the bill; CEO Evan Spiegel doubled down in the hearing by saying that Snap “have already implemented many of its provisions.” X’s Linda Yaccarino also declared support, and Microsoft (which was not at the hearing) has since decided to back it.6
The bill has 47 co-sponsors in the Senate from both parties, so it is thought to be the child online safety bill with the best chance of passing—expect another push in the coming weeks.
The Stop CSAM Act would make it easier for victims to request removal of child sexual exploitation imagery and allow them to sue platforms that knowingly distribute it. It effectively creates a civil remedy against any online platform that facilitates the exchange of CSAM.
It is backed by X, as of last week. TikTok’s CEO said he could back it if “some questions re implementation” can be resolved.
The bill had been stalled for months because industry opposed provisions that would have weakened the legal shield platforms have when user messages are encrypted end-to-end7. Last-minute changes now put this bill in a good position to get another hearing.
The SHIELD Act takes a privacy angle. It establishes federal criminal liability for people who distribute others’ private or explicit images online without consent.
The bill is supported by X. It has been sent to the Senate and is awaiting a hearing. Govtrack gives it a 34% chance of being enacted.
The EARN IT Act tries to tackle one of Republicans’ favourite bugbears—Section 230 of the Communications Decency Act—which some would like to abolish in its entirety8. This bill doesn’t go that far, but it would remove platforms’ immunity from civil and criminal liability in relation to CSAM, so they could be sued for hosting such content. It also establishes a National Commission on Online Child Sexual Exploitation Prevention.
This bill has been sent to the Senate and is awaiting a hearing. Govtrack gives it a 37% chance of being enacted.
The Project Safe Childhood Act is not actually new, but would provide continuation funding of $62m/year through 2028 for the Project Safe Childhood program administered by the DOJ. The program coordinates child sexual exploitation investigations and prosecutions across federal, state, and local law enforcement; provides training to law enforcement on best practices; and supports public education programs.
This one passed in the Senate in October and is now awaiting a reading in the House. Govtrack gives it a 56% chance of being enacted.
The REPORT Act tightens the requirements for providers to submit reports to the National Center for Missing and Exploited Children (NCMEC) when they become aware of online sexual exploitation of children. It also makes it easier for victims to file reports directly with NCMEC, and imposes a 1-year data retention period on platforms to facilitate investigations.
The bill is supported by X. It passed in the Senate but is awaiting advancement in the House. Govtrack gives it a 46% chance of being enacted.
Finally, the Cooper Davis Act (which passed the committee by a vote of 16-5) would require social media and communication service providers to report to the DEA when they become aware of the sale or distribution of illicit drugs, including fentanyl and methamphetamines.
Fully supported only by Snap so far. Govtrack gives it a 34% chance of being enacted.
The timing and tenor of this hearing felt a lot like a crashing together of two political objectives, which probably can’t be reconciled:
to create momentum for some flawed-but-better-than-nothing legislation that is urgently needed in the hope that it will get to the floor for a vote; and
for individual senators to telegraph their outrage-fuelled stance against Big Tech and with the victims in order to score maximum points with their political bases in an election year.
Time will tell which objective wins out.
1. Luckily, parents already don’t need to do this more than once in experiences that are powered by Epic Games’ KWS, which is built on a shared, privacy-protective database of parent verifications that are reusable across experiences (aka the parent graph).
2. Lindsey Graham (R-SC): “we passed five bills unanimously […] and look at who did this? Graham, Blumenthal, Durbin, Hawley, Klobuchar, Cornyn, Blackburn, and Ossoff. I mean we've found common ground here that just is astonishing…”.
Dick Durbin (D-IL): “Unanimous. Take a look at the opposition and membership of the Senate Judiciary Committee and imagine if you will [if] there's anything we could agree on unanimously.”
3. The most recent draft of KOSA softened requirements that had led to concerns about user privacy. Specifically, age-based restrictions under the bill would now be based on platforms’ actual knowledge and do not require additional age verification methods on top of what COPPA already mandates. For more detailed analysis see TechPolicy’s Overcoming Fear and Frustration with the Kids Online Safety Act.
4. A helpful rebuttal of these arguments comes from Fairplay: Our legal analysis of the Kids Online Safety Act.
5. Specifically, the concern among liberals is that Republicans are only supporting the bill because it would allow their state attorneys general to use it to suppress LGBTQ content.
6. It’s easy to see why these 3 companies have found a cheap win in backing KOSA. Snap is mainly a messaging platform, so it is not much affected by KOSA’s potential impairment of content-boosting algorithms; X has few kids (and not many teens) in its audience; and Microsoft is mostly concerned with business software (Minecraft and Activision notwithstanding).
7. The encryption debate came up several times during the hearing. Discord’s CEO said they made a choice not to encrypt user communications: “We don't believe we can fulfil our safety obligations if the text messages of teens are fully encrypted because encryption would block our ability to investigate a serious situation and when appropriate report to law enforcement.” Discord is the odd one out in this case, as most platforms are moving toward more, not less, encryption. And legislative efforts—such as via the Online Safety Act in the UK—to combine privacy with access in the case of crimes have mostly come to naught.
8. The debate over Section 230 has become so divisive and politicised—and is rife with misunderstanding and misinformation—that any sensible, nuanced reform proposal tends to get drowned out. Section 230 shields platforms from liability for most content posted by users. It was enacted at the dawn of the internet to support innovation and growth of fledgling digital services. More important, Section 230 is what allows those platforms to moderate content without fear of being sued for getting it wrong. Both political parties have a gripe: Republicans because they think platforms moderate too much content; Democrats because they feel it allows platforms to do too little to fight misinformation. If Section 230 were repealed, as Sen. Lindsey Graham would have it, the platforms would have to either drastically censor any and all controversial content that could lead to lawsuits, or stop hosting user-generated content altogether. These choices would disproportionately impact smaller services, like local news outlets, that host user comments, likely impoverishing the diversity of spaces on the internet. This doesn’t mean that Section 230 can’t be reformed—it can and should be, and the EARN IT Act is potentially a good example of how: by focusing on narrow types of content that we can all agree are wholly unacceptable.