Blumenthal’s kids online safety bill gains more support in Congress

Parent and student advocates hold signs of loved ones and messages to pass The Kids Online Safety Act. (WEBEX / CT MIRROR)

A tech bill proposed by Sen. Richard Blumenthal, D-Conn., and one of his Republican colleagues got a new boost of momentum this week as Congress looks more seriously at a number of measures to protect children online.

Blumenthal reintroduced The Kids Online Safety Act on a call Tuesday alongside Sen. Marsha Blackburn, R-Tenn., and advocates concerned about social media's harmful effects on minors and Big Tech's prioritization of profits. College students recounted how addictive these apps were when they were younger, leading them to set their own time limits and distance themselves from social media.

One parent, Tracy Kemp, spoke about how her 14-year-old son was bullied online for being Black and was unaware of it because he was not on social media himself. She said her son has since recovered from the cyberbullying but noted that not all families are as lucky.

Another parent, Joann Bogard, said she checked her son Nathan’s phone regularly and enabled safety features. But he died in 2019 after trying the “choking challenge,” which had gone viral again on social media platforms. Bogard said she has spent years searching for and reporting these videos, but they remain viewable.

To address the issues of safety and accountability, the bill aims to put in place stricter settings by allowing children and parents to disable addictive features, enable privacy settings and opt out of algorithmic recommendations. It requires tech companies to undergo an annual independent audit that analyzes the risks to minors and whether the companies are working to reduce them.

It also establishes a “duty of care” for sites that are likely used by young individuals “to act in the best interests of a minor” in matters related to certain mental health disorders, physical violence, online bullying and sexual exploitation.

Since the bill was first introduced in early 2022, Blumenthal and Blackburn have made several changes to try to address critics’ ongoing concerns about censorship and the authority of state attorneys general. The Kids Online Safety Act now specifies which mental health issues fall under the duty of care, rather than leaving them to broad interpretation: suicidal behaviors, anxiety, depression, eating disorders and substance use disorders.

It also identifies protections for services like the National Suicide Hotline, substance abuse organizations and LGBTQ+ youth centers. The clarification likely seeks to assuage concerns that the bill would prevent vulnerable children from reaching vital services. Blumenthal also noted that schools and educational software are excluded from the measure.

“Today really is a major milestone in the effort to protect kids online. It is a bill introduction but even more importantly a call to action, because we’re enlisting not just members of Congress but people who have firsthand, real-life experience with the harms done by social media,” Blumenthal said. “There is no denying that we are in the midst of a mental health crisis in America. It’s a teen mental health crisis.”

There has been limited action at the federal level over the past few decades on young users’ experiences online. The Children’s Online Privacy Protection Act of 1998 was one of the last major pieces of legislation protecting children under age 13.

Young users like Emma Lembke and Zamaan Qureshi, who co-chair Design It For Us, argue they should also be in control of their online experience rather than giving the authority solely to parents or guardians. Qureshi said Generation Z should have a “seat at the table.”

Lembke, a college student who testified about this issue before Congress in February, said she has been using social media since age 12. She said watching hours of content on Instagram each day and seeing unrealistic body standards led to disordered eating. She also runs the LOG OFF Movement, a group that encourages minors to spend less time online.

“As part of the generation who will benefit from the legislation like what we’re discussing today, it is so important to myself and my peers that our perspective is a part of the process every step of the way,” Lembke said on Tuesday’s call.

Social media “can enhance our collective good,” Lembke added. “But like I said, no other generation has grown up in the same environment that is also one of deep, extensive and unchecked harm.”

Online protections and privacy appear to be a rare issue of bipartisan agreement in Congress, especially in a divided government where there will likely be little compromise. But like many efforts on Capitol Hill, it can take a while to navigate the legislative process.

The legislation was left out of a year-end bill to fund the federal government, but Blumenthal believes there is enough support to vote on the bill during this session of Congress. He said Senate Majority Leader Chuck Schumer, D-N.Y., is “100% behind this bill,” but the timing is at the discretion of Democratic leadership.

The Kids Online Safety Act has gained a groundswell of support since it was first introduced with 12 co-sponsors. That number has more than doubled to 30, with enough Republican support to pass the Senate if all or most Democrats back the bill. But it will still need backing in the GOP-led House in order to reach President Joe Biden, who has repeatedly indicated his support for online protections for children.

The more than two dozen Senate co-sponsors include Sen. Chris Murphy, D-Conn., who separately introduced his own legislation last week also aiming to protect children and teenagers using social media.

Sen. Chris Murphy, D-Conn., backs legislation barring children under 13 from using social media. (Lisa Hagen / CT MIRROR)

The Protecting Kids on Social Media Act would bar anyone under 13 from using social media and require those between 13 and 17 to get consent from a parent or legal guardian to sign up. The bill would also block companies from recommending content through algorithms to minors under 18. Enforcement would similarly be left to the Federal Trade Commission and state attorneys general.

“This is one of the most apolitical conversations I have in Connecticut. The opportunity here is to really frame this inside this building the same way that it exists out there in the public,” Murphy said last week. “It actually unites people of differing political views around one very simple premise — that parents should have better tools to protect their kids.”

Blumenthal has raised concerns about age requirements, arguing that they are difficult to enforce and that the approach “puts the burden” on parents rather than on major tech companies. His bill only requires parental consent for children under age 13.

But the senators involved in these efforts have largely found common ground on online protections. Three of the four main sponsors of the social media bill also signed onto Blumenthal’s measure: Murphy, Sen. Katie Britt, R-Ala., and Sen. Brian Schatz, D-Hawaii.

Even before the reintroduction of The Kids Online Safety Act, civil liberties and LGBTQ+ rights groups had consistently raised concerns about potential censorship of young users, debate over what content is deemed “appropriate” and the discretion that would be given to state attorneys general.

Groups like the American Civil Liberties Union and the Gay, Lesbian and Straight Education Network commended the overall goals but warned about “significant parental surveillance” of vulnerable kids and teens. They believe it could have “unintended consequences” when it comes to content filtering and limited access to information for vulnerable children in abusive situations and LGBTQ youth.

Jason Kelley, associate director of digital strategy at the Electronic Frontier Foundation, took issue with the duty of care provision because it “ties platform liability to content recommendations,” and because it is difficult to moderate such content when there is little agreement on what is appropriate for minors. He also argued that the bill gives too much authority to attorneys general, who could pressure tech companies to “over-moderate.”

“It’s understandable that there is this push to limit extreme content that young people see,” Kelley said in an interview earlier this year. But “is there agreement on what type of content actually causes these harms? I don’t think that there is.”

“The bill would just create massive censorship across the board for everyone,” he added. “The biggest targets are by any attorney general.”

It is unclear how these groups feel about the latest version of the legislation, but the Electronic Frontier Foundation had previously said no amount of revisions would change its opinion of the bill.

Blumenthal said lawmakers have been in touch with concerned groups like the ACLU and remain open to additional suggestions as the legislative process begins.

“I don’t think there’s a First Amendment right to drive eating disorder content or bullying or suicide to kids, but we have an open door policy,” Blumenthal said Tuesday. “We’re going to try to meet the ongoing concerns of any group or individuals who have legitimate suggestions and criticisms.”

Lisa Hagen is CT Public and CT Mirror’s shared Federal Policy Reporter. Based in Washington, D.C., she focuses on the impact of federal policy in Connecticut and covers the state’s congressional delegation. Lisa previously covered national politics and campaigns for U.S. News & World Report, The Hill and National Journal’s Hotline.
