There’s more to the FTC’s proposed COPPA changes than meets the eye
& how the amended privacy rules will impact kids' digital services
Just before the holidays the FTC dropped its long-awaited proposed revisions to the Children’s Online Privacy Protection Act, following a record-setting review period of over four years (!). This update to COPPA (the first since the FTC expanded the definition of personal information in 2013) feels a bit anticlimactic, given the length of the review and the extent of the debate in workshops and comment threads.1 As it is, the proposal is likely to please few – not the advocates of child protection, nor the kids’ content providers, social media or game platforms.
But look under the hood, and there are some rather useful changes and clarifications that – while not reducing the overall regulatory burden – will at least eliminate some grey areas and pave the way for safer spaces for kids.
Let’s unpack what the FTC is recommending:
The Good -
Require separate parental consent for sharing kids’ personal information (PI) with third parties (eg, advertisers), and prohibit conditioning access on such consent. This effectively unbundles the consents for collection and sharing, enabling parents to decline any PI sharing while still allowing their child to access the service.
There are fewer and fewer reasons to ever share kids’ PI with third parties, so I don’t expect this to have a severe impact on operators. Advertising can be contextual, analytics can be effective without PI, and personalisation tools can be run in-house.
Require operators using the internal operations exemption2 to notify users which persistent identifiers (IP addresses, cookie IDs) they are collecting and why, and how they will ensure those identifiers can’t be used to target the user.
Sensible, and it increases transparency, but it will challenge operators to explain technical uses of such identifiers in language parents can understand.
Add mobile number to the definition of Online Contact Information, so that parents can be asked for verifiable parental consent (VPC) via text message. The FTC also reiterates that consent can be obtained by other methods, such as a pop-up asking the child to hand the device to their parent.
This change is long overdue. It opens the door to innovations in VPC flows that streamline the experience for parents.
Add biometrics to the definition of Personal Information. Specifically: “biometric identifiers that can be used for the automated or semi-automated recognition of an individual.”
This was expected and also makes sense. What is truly encouraging is the FTC’s careful qualification to ensure only data that can be used for recognition of an individual is captured.3 This leaves the door open for biometric-based approaches that cannot be used for individual identification – such as age estimation – to improve child safety through privacy-protective age verification.4
Eliminate the monetary transaction requirement when using a payment card for VPC (so long as the payment service is still able to provide discrete notification of each transaction).
In simple terms, this means that payment card VPC no longer requires a small charge to be made, and brings the US in line with the EU and other jurisdictions. A very positive change that will reduce friction in VPC flows. An open question is whether the FTC is OK with services defaulting to no notification unless the parent specifically configures it.
The Controversial -
Limit operators’ ability to nudge kids to stay online by prohibiting the use of persistent identifiers to optimise user attention or maximise engagement, including by sending push notifications.
This is part of the FTC’s continuing effort to stretch its mandate into legislating age-appropriate design via privacy regs. Unless the wording is significantly refined, this will give product, marketing and legal teams a lot to argue about. The line between personalisation, improving game play, maximising user engagement and ‘nudging’ is very jagged indeed.
Add new examples to the multi-factor test for determining whether a service is child-directed, including (a) marketing materials, (b) third-party reviews, and (c) the age of users on similar sites.
The FTC has used these factors effectively in recent cases anyway, so this is mostly clarifying. That said, the idea of comparing a site to similar services is likely to be contentious. Is YouTube like Kidoodle? Is TikTok like Zigazoo? Is Fortnite like Roblox?
Confirm that ad attribution falls within the ‘support for internal operations’ exemption (so long as no profiling takes place).
A useful clarification that will nonetheless upset many child protection advocates who don’t believe ad attribution is necessary in the kids’ market, or that any adtech platform can be trusted with persistent identifiers relating to kids. They have a point, which should remind operators to do more due diligence on their adtech suppliers.
The Missed Opportunities -
No expanded definition of website or service. The FTC considered other approaches to determining whether a site is child-directed, including setting minimum % thresholds of child users (via survey or other means), but ultimately decided against any change.
It seems the FTC did not consider the most impactful change it could have made: amending the definition of Site to include third-party served advertising. This would have made advertisers and adtech intermediaries equally responsible (with the publisher) for preventing PI collection (and mass-scale leakage of PI into the ad ecosystem) from kids if the ad is child-directed, especially on general audience websites.5
No shift in the actual knowledge standard to constructive knowledge, ie, sites that ‘should’ know they have young users would be required to apply COPPA’s protections.
The FTC basically says it was hamstrung by previous congressional rulings that rejected such an expansion of the knowledge standard. That said, that decision is 13 years old, and the world has changed since then (including how congressmen feel about kids on the internet), so it may have been worth a try? The status quo ensures that general audience services used by lots of kids will continue to wilfully avoid knowledge of their audience, to escape having to implement COPPA protections.
No attempt to truly reduce the incidence of leakage of kids’ PI via the internal operations exemption.
The FTC’s reasoning is brief and incomplete. Because of the internal operations exemption, operators broadcast kids’ IP addresses billions of times a day through ad requests to adtech intermediaries, most of whom do not end up bidding on the inventory.5 By banning full IP addresses or persistent cookie IDs within the exemption, the FTC could have catalysed innovation in analytics, personalisation, contextual ad delivery and ad measurement – many of which can be provided using other methods.
No new tools to support platform-level age verification, consent management or federated parent verification.
Although the FTC helpfully reiterates its support for ‘common consent mechanisms’ and suggests that platforms (OS, devices, consoles) could play a centralising role, it does not (perhaps it cannot under its powers?) provide meaningful tools to help this along. For age verification, for example, platforms (e.g. consoles or mobiles) could really benefit from legal clarity that they can provide a user’s age to an app or a service without liability, and the app or service in turn would benefit from knowing it can rely on such a provided age to meet its COPPA obligations without having to re-verify the user’s age.
Undoubtedly the 60-day comment period now underway will be another heated one, and many commenters will decry the fact that the FTC didn’t propose to ban targeted advertising to kids altogether (which would have brought it in line with the EU), nor raise the age for digital consent beyond 13, nor include ‘inferred data’ (ie, data about a child) in the definition of PI. The fact is that the FTC’s rule-making authority only extends to improving the implementation of the existing law. Anything beyond that – including many of the changes commenters want – remains the purview of legislators. In that light, the FTC’s proposal represents the art of the possible, and reminds Congress of its own duty to act if it wants more substantive changes.
1. The FTC received more than 175,000 comments when it first announced its intention to modify the rule.
2. Under the exemption, parental consent is not required if persistent identifiers are solely used to support internal operations, including site maintenance and analytics, authenticating users, personalisation, contextual ads, and frequency capping.
3. Presumably the FTC has been drawing lessons from state-level biometrics laws like BIPA, where arguments about “biometric identifiers” vs “biometric information” have tied legal teams everywhere into knots and led to countless vexatious lawsuits. The FTC understands the different uses of biometrics (ranging from very private to very invasive) and hopefully legislators will soon show a better grasp too.
4. Note that this is wholly unrelated to the use of privacy-preserving facial age estimation in adults, ie, for providing verifiable parental consent, as proposed by my former team at SuperAwesome along with the ESRB and Yoti.
5. To better understand the problem of data leakage in the ad ecosystem, I can recommend Brian O’Kelley’s Data is fallout, not oil. To see why this is so pernicious for kids because of the internal operations exemption, take a look at the recommendation I submitted (on behalf of SuperAwesome) to the FTC in 2019, page 5.