Kids aren’t safe on social media, lawmakers say. Tech CEOs are back in DC to pledge that they’ll handle it

01.02.2024


This week, Congress will once again question the CEOs of major tech companies, including Meta’s Mark Zuckerberg, about the risks their products pose to teenagers. Despite previous assurances that social platforms would help teens and families make informed decisions, online safety advocates have criticized the industry’s response, pointing to concerns that social media can harm young users and potentially contribute to depression or even suicide.

With a presidential election looming and state lawmakers increasingly taking center stage on the issue, Congress is expected to press tech companies to go beyond the tools they have already introduced.

The Senate Judiciary Committee hearing on Wednesday will feature testimony from the chief executives of TikTok, Snap, Discord, and X, alongside Mark Zuckerberg. For some, like X CEO Linda Yaccarino, Snap CEO Evan Spiegel, and Discord CEO Jason Citron, this hearing marks their first appearance before Congress.

Many tech CEOs are likely to use the hearing to highlight tools and policies aimed at protecting children and granting parents greater control over their kids’ online experiences. Companies such as Snap and Discord have indicated their intention to distance themselves from Meta by emphasizing that they do not prioritize algorithmically recommended content that could be addictive or harmful.

Despite these efforts, parents and online safety advocates argue that the tools introduced by social media platforms fall short, placing the responsibility of teen protection largely on parents and, at times, young users themselves. There is a growing consensus that tech platforms should no longer be entrusted with self-regulation.

Jeff Chester, the executive director of the Center for Digital Democracy, a nonprofit focused on online consumer protection, urged the committee to push executives to commit to substantial changes. This includes disconnecting advertising and marketing systems from services known to attract and target youth.

The proliferation of generative artificial intelligence tools, which can make it easier to create and spread malicious content on social media, underscores the importance of building safety features directly into tech platforms.
