At a Glance:

  • On January 31, the Senate Judiciary Committee held a bipartisan hearing featuring the CEOs of some of the world’s largest social media companies, where committee members accused the companies of facilitating online child sexual abuse.
  • The hearing, which featured Meta CEO Mark Zuckerberg, X CEO Linda Yaccarino, TikTok CEO Shou Zi Chew, Snap CEO Evan Spiegel, and Discord CEO Jason Citron, is the latest in a series of hearings aimed at reining in Big Tech companies after failed attempts at legislation.
  • With multiple bipartisan bills meant to protect minors online advancing out of the Committee, the Kids Online Safety Act (KOSA) is the likely first pick for Senate Majority Leader Chuck Schumer to put up for a vote. 

Here are five facts to know about the US Senate Hearing on “Big Tech and the Online Child Sexual Exploitation Crisis”:

 

1. A BIPARTISAN COMMITTEE WANTS SOCIAL MEDIA COMPANIES TO ADDRESS CHILD SAFETY 

In an era when bipartisanship is rare on Capitol Hill, this hearing was noteworthy. Congress is pushing for more aggressive content moderation and regulation on social media platforms in response to constituents’ concerns about online child sexual exploitation, youth suicides, and research showing that social media can be harmful to youth mental health. The hearing, which at times grew tense, featured the CEOs of each major social media company and the families of those harmed by social media. A few notable issues Congress wants to address include:

  • What should be done to make the internet, and social media in particular, safer for kids and other vulnerable users?
  • Who is responsible for harmful content shared on social media platforms?
  • What is the relationship between social media use and mental health, and what is its impact? 

 

"You Have Blood on your hands"

— Senate Judiciary Committee Ranking Member Lindsey Graham (R-SC) 

 

2. SOCIAL MEDIA COMPANIES ARE STILL LOOKING TO CONGRESS FOR POLICY SOLUTIONS 

Despite some critical questioning, the hearing failed to yield any direct solutions, functioning instead mostly as a forum for expressing anger and frustration and for responding to public pressure on Congress to act on what consumers feel is a risky and spotty regulatory environment.

Social media companies have asked Congress for an opportunity to help shape regulatory guidance that would move the needle on this issue, but they would like Congress to provide a working framework. As of now, there is no consensus on a solution.

Congress is currently debating legislation on how to regulate AI, a tool that many companies are increasingly using to help moderate their platforms. New regulations on how Generative AI is used will impact content moderation, curation, and news feeds where users see and consume most of their content. This means the space where brands and corporations post their content will likely be evolving again over the next year. 

 

3. CONGRESS HAS FIVE BILLS UNDER SERIOUS CONSIDERATION TO ADDRESS THIS ISSUE 

As of last year, the Senate Judiciary Committee had advanced five bills to address the safety of minors on social media, including the STOP CSAM Act of 2023. The bill would effectively amend Section 230 of the Communications Decency Act, allowing victims of child sexual abuse material to sue tech companies for content posted by third parties.

Currently, Section 230 provides limited federal immunity to providers and users of interactive computer services and is the reason many social media companies cannot be held liable for material posted on their platforms.

There is a lack of clarity about how effective any of this legislation will be in practice. Platform representatives consistently remind lawmakers that social media platforms are vast spaces where harmful content sometimes slips through, even with the help of content moderators (both human and automated). Currently, the Kids Online Safety Act (KOSA) has support from Microsoft and X (in its current form) and seems to be on track to reach the Senate floor. 

Legislation likely to hit the floor for a vote includes:

  • The Kids Online Safety Act (KOSA) would create liability, or a “duty of care,” for social media platforms that recommend content to minors that can negatively affect their mental health.

Other key child safety legislation currently under consideration on the Hill: 

  • The STOP CSAM Act would allow victims to sue online platforms, strengthen CyberTipline reporting requirements, and make it easier to request takedowns of child sex abuse material (CSAM).
  • The EARN IT Act removes the tech industry’s blanket immunity from civil and criminal liability under child sexual abuse material laws and establishes a National Commission on Online Child Sexual Exploitation Prevention.
  • The SHIELD Act ensures that federal prosecutors have appropriate and effective tools to address the nonconsensual distribution of sexual imagery.
  • The Project Safe Childhood Act modernizes the investigation and prosecution of online child exploitation crimes.
  • The REPORT Act combats the rise in online child sexual exploitation by establishing new measures targeting electronic service providers to help strengthen reporting of those crimes to the CyberTipline. 

 

4. LAWMAKERS NEVER MISS A CHANCE TO BE TOUGH ON CHINA’S TIKTOK 

Despite the high-level rapprochement between the US and China following Biden and Xi’s meeting last November in San Francisco, 2024 is an election year, and China will continue to be a key issue on the campaign trail for both parties. During the hearing, members of Congress took the opportunity to grill TikTok CEO Shou Zi Chew on issues beyond the stated scope of the hearing, including data and national security concerns around privately owned TikTok’s alleged association with the Chinese Communist Party. Chew, in his second appearance before Congress, reiterated that the company has never shared data with the Chinese government and has never been asked to.

  • There is no political downside to being tough on China—another area of rare bipartisan consensus—particularly in an election season. Increased scrutiny and tough rhetoric from Congress and the candidates running for president on China-related issues should be expected; Chew and TikTok are easy targets on this issue.
  • Congress is considering the RESTRICT Act, legislation that would give the Department of Commerce more power to effectively ban apps that pose a risk to national security. RESTRICT, targeted specifically at TikTok, has enjoyed bipartisan support but has not progressed in the Senate. A number of other proposals are also under consideration as lawmakers explore other tools to protect US data. 

 

5. META’S ZUCKERBERG MADE AN APOLOGY AND MORE COMMITMENTS 

 

“I’m sorry for everything you’ve all gone through, it’s terrible. No one should have to go through the things that your families have suffered, and this is why we invested so much and are going to continue doing industry-leading efforts to make sure that no one has to go through the types of things that your families had to suffer.” 

— Meta Founder and CEO, Mark Zuckerberg 

Sen. Josh Hawley’s (R-MO) questioning led Mark Zuckerberg to turn toward parent advocates and express his regret for the pain they had suffered through Meta’s platforms. He paired the apology with a commitment that social media companies will work to ensure no one else goes through similar experiences. The apology’s reception was mixed: one parent called it “half-hearted” and said social media companies have a lot of work to do to regain the trust of parents and consumers.