September 23: Student & Child Privacy News
Recapping last week's biggest student and child privacy updates.
Last week was a whirlwind in the world of student and child privacy! In case you missed it, we’ve rounded up the highlights for you. As always, please let us know if we missed anything or if you have additional news to share!
Best,
The PIPC Team
TLDR:
COPPA 2.0 and KOSA both passed out of the House Commerce Committee.
New FTC report on social media data practices.
Instagram introduces teen accounts.
COPPA 2.0 and KOSA Pass Out of House Commerce Committee
Last Wednesday, the House Committee on Energy and Commerce passed revised versions of the Children and Teens' Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA). These new versions differ significantly from the Kids Online Safety and Privacy Act (KOSPA) that the Senate passed in July. (See PIPC’s full comparison of the Senate’s KOSPA with the new House versions of COPPA 2.0 and KOSA.)
Major differences in the House bills include:
Adding two exceptions that allow schools to continue to provide their students with privacy-protective, technology-enhanced educational experiences, such as through adaptive tests or personalized learning:
Covered platforms must provide the ability to “control personalized recommendation systems” to the educational agency or institution, rather than to the user, “when the online platform is acting on behalf of an educational agency or institution” and is subject to a written contract that complies with the requirements of COPPA and FERPA.
Title II (Filter Bubble Transparency) adds a similar exception to the notice and opt-out requirements: online platforms “shall provide the notice and opt-out” to the educational agency or institution, rather than to the user, “when the online platform is acting on behalf of an educational agency or institution” and is subject to a written contract that complies with the requirements of COPPA and FERPA.
Narrowing the duty of care provision by limiting the types of harms that platforms must prevent and mitigate.
Notably, the House removed mentions of mental health harms such as anxiety and depression, substituting harms stemming from the “promotion of inherently dangerous acts” that are likely to cause serious bodily harm, death, or “serious emotional disturbance.”
Creating a tiered knowledge standard for whether a covered platform knows an individual is a child or minor:
For high-impact online companies (those primarily used to access or share user-generated content that either (1) generate $1,000,000,000+ in annual revenue, including revenue generated by any affiliate, or (2) have 100,000,000+ global monthly active users for at least 3 of the past 12 months), the standard is whether the platform “knew or should have known” that an individual was a child or minor.
For mid-sized platforms (those with annual gross revenue of $200,000,000+ that collect the personal information of 200,000+ individuals and are not high-impact online companies), the standard is whether the platform “knew or acted in willful disregard” that a user was a child or minor.
For all other platforms, the standard is “actual knowledge.”
Eliminating the requirement for the FTC to conduct research on the risk of harms to minors from using social media and other online platforms.
Creating the same tiered knowledge standard as in KOSA for when high-impact social media companies, mid-sized operators, and all other operators know an individual is a child or teen.
Clarifying that when a shared device has both child and adult profiles (such as in a Hulu account), targeted advertisements can be shown in the adult’s profile but not in the minor’s profile.
Allowing both parents and teens to access, correct, and delete teens’ information (previously, only teens had these rights). However, the House bill still allows teens to consent to the collection, use, or disclosure of their personal information from their use of online services (instead of requiring parental consent).
Defining the term “educational agency or institution” to clarify that public, charter, and private schools can consent to technology uses in school when a sufficient written agreement is in place.
What’s next?
Since the House is on recess from this week through the general election, COPPA 2.0 and KOSA will not move again until the post-election “lame duck” session. Based on comments by Commerce Committee members during last week’s markup, there will likely be many additional changes to the House versions of COPPA 2.0 and KOSA prior to passage.
Should the bills pass the House, Congress would need to form a conference committee to reconcile the differences between the Senate’s KOSPA and the House’s COPPA 2.0/KOSA. If the bills make it out of conference committee, we anticipate President Biden would sign them into law, given his statement following the Senate’s passage of KOSPA (which incorporated both COPPA 2.0 and KOSA): “I encourage the House to send this bill to my desk for signature without delay.”
FTC Releases Report on the Data Practices of Social Media Companies
Last Thursday, the Federal Trade Commission (FTC) released a new staff report, A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services. The report shows how social media platforms “engaged in vast surveillance of consumers in order to monetize their personal information while failing to adequately protect users online, especially children and teens.”
The report includes an in-depth analysis of how social media companies collect and analyze vast amounts of personal data about users, both on and off their platforms, to facilitate targeted advertising. The FTC details how consumers typically do not understand these data practices, which pose significant risks to user privacy:
“Many users may not know that the information they input directly (e.g., birthdate, home address) may be used for advertising, and the extent to which their every click, search, and action is being collected, logged, saved, and, in some cases, shared, including their activity off of the [social media and video streaming services]. The data collected using the tracking technologies described below can be invisible to users. Consumers are frequently unaware of the potential downstream uses—including the sale to third parties of location data that may be used to identify consumers and their visits to sensitive locations, such as houses of worship and doctors’ offices—of the immense amounts of data collected about them.” (page 40).
To protect consumers from such data practices, the FTC recommends that Congress enact “comprehensive federal privacy legislation that limits surveillance and grants consumers data rights” (page 80) and “federal privacy legislation that protects Teen users online” (page 84). As highlighted in Amelia’s past testimony, this is the ideal solution because, “in addition to raising privacy protections for all consumers, a comprehensive consumer privacy bill can carve out additional protections for children that fill the gaps in existing legislation.” Unfortunately, the momentum for comprehensive federal consumer privacy legislation that we witnessed earlier this year has since stalled, making it highly unlikely that Congress will pass a comprehensive bill this term.
However, there is still a chance that Congress will act on one of the report’s recommendations by the end of 2024: enacting child-specific privacy protections, such as those in COPPA 2.0 and KOSA. In its report, the FTC made several recommendations for companies offering social media and video streaming services to better protect children and teens online:
View the Children’s Online Privacy Protection Act (COPPA) as the floor, not the ceiling, for privacy protections;
Do not ignore the reality that there are children using their platforms; and
Do more to protect teens using their platforms (pages 83-84).
We were particularly interested in the FTC’s focus on protecting teens online, furthering a trend we have seen in its recent enforcement actions. The report noted that teens were often permitted “to create accounts without any restrictions” (see graphic), experiencing social media platforms the same way that adults do.
Source: FTC Report, Page 73
Instagram Announces New Safeguards for Teen Accounts
Last Tuesday, the day after the House Commerce Committee announced its markup of KOSA and COPPA 2.0, Instagram announced that it will launch “teen accounts” to increase safeguards for teens on its platform. According to the press release, teen accounts will have the following safeguards enabled by default:
“Private accounts: With default private accounts, teens need to accept new followers and people who don’t follow them can’t see their content or interact with them. This applies to all teens under 16 (including those already on Instagram and those signing up) and teens under 18 when they sign up for the app.
Messaging restrictions: Teens will be placed in the strictest messaging settings, so they can only be messaged by people they follow or are already connected to.
Sensitive content restrictions: Teens will automatically be placed into the most restrictive setting of our sensitive content control, which limits the type of sensitive content (such as content that shows people fighting or promotes cosmetic procedures) teens see in places like Explore and Reels.
Limited interactions: Teens can only be tagged or mentioned by people they follow. We’ll also automatically turn on the most restrictive version of our anti-bullying feature, Hidden Words, so that offensive words and phrases will be filtered out of teens’ comments and DM requests.
Time limit reminders: Teens will get notifications telling them to leave the app after 60 minutes each day.
Sleep mode enabled: Sleep mode will be turned on between 10 PM and 7 AM, which will mute notifications overnight and send auto-replies to DMs.”
Parents will be able to change these default settings for teens under 16. Additionally, the press release says that parents will be able to:
“Get insights into who their teens are chatting with: While parents can’t read their teen’s messages, now they will be able to see who their teen has messaged in the past seven days.
Set total daily time limits for teens’ Instagram usage: Parents can decide how much time their teen can spend on Instagram each day. Once a teen hits that limit, they’ll no longer be able to access the app.
Block teens from using Instagram for specific time periods: Parents can choose to block their teens from using Instagram at night, or specific time periods, with one easy button.
See topics your teen is looking at: Parents can view the age-appropriate topics their teen has chosen to see content from, based on their interests.”
As a whole, these changes mirror many of the proposed requirements in child privacy bills such as KOSA and COPPA 2.0, and they appear to be a meaningful step toward better protecting teens on social media while preserving their privacy and autonomy. We anticipate that additional social media companies will soon follow suit, proactively increasing protections for children and teens on their platforms ahead of potential legislation that would force them to adopt such changes later on.