Capitalizing on Momentum: What’s Next for Technology Reform?
“It takes two to tango.” That phrase rings truer than ever in Washington’s political environment, which, though divided, still requires working across the aisle to get any meaningful legislation that strengthens our democracy across the finish line.
This concept is at the core of Issue One’s mission. Whether inside or outside of Congress, we work in a crosspartisan manner to fix our broken political system and build an inclusive democracy that works for everyone.
That’s why we recently teamed up with the Democracy Journal to write seven essays for their fall issue, “Bipartisanship Reinvigorated.” The essays highlighted reforms that would strengthen our democracy and that also have support across the political spectrum. One area the essays identified as an opportunity for reform with backing from both Democrats and Republicans is technology reform. Whether it’s ensuring the privacy of Americans’ data, protecting kids’ safety online, or reforming Section 230 of the 1996 Communications Decency Act, there are opportunities to move the needle on these issues even in such a polarized political environment.
Where Can We Go Next?
So what comes next for tech reform? To dig into this question, Issue One hosted a virtual event with the Democracy Journal about building bridges across divides to safeguard and strengthen democracy, with a particular focus on the landscape for technology reform legislation. During the discussion, Issue One Vice President of Technology Reform Alix Fraser, former Congressman and Council for Responsible Social Media (CRSM) Co-chair Dick Gephardt, and Issue One Campaigns Manager for Technology Reform Liana Keesing spoke about realistic technology reform policies that could help our democracy remain robust in the 21st century.
They highlighted how the online information environment has shaped the trajectory of our democracy in recent years. Rep. Gephardt explained that “Democracy or self-government is really a communication, it’s a discussion, it’s a conversation between all the people, and there has to be at the base, shared reality, shared facts.”
Over the last decade, addictive and toxic algorithms have been drawing people into information silos with different sets of facts. While the issues the country faces today are similar to the ones we have faced historically, Americans are less willing to come together to solve them because social media has trained us to see the people who disagree with us as enemies.
Despite this challenge, the speakers agreed that it’s possible to make technology and democracy compatible. They emphasized the need to promote kids’ online safety, protect users’ data privacy, and hold Big Tech companies to higher product liability standards through new legislation and Section 230 reform.
Right now, kids’ online safety is the issue in the tech reform space that has gained the most traction. As our colleague Alix Fraser highlighted during the event, “Kids’ safety online has 90% support among Americans today, and when you look at these issues, you see so many unique allies coming together.” Most Americans either know a young person who has personally experienced the harms of social media or have experienced those harms themselves. When people can put faces and names to the issue, they can see what’s at stake in their lives and communities.
This emotional investment in protecting children online has led to a coalition of organizations – including Issue One – advocating for the passage of the Kids Online Safety Act (KOSA), a bill that passed the Senate 91-3 earlier this year and that tech lobbyists are fighting tooth and nail to keep from advancing further in Congress. The bill would establish responsible safeguards that hold Big Tech companies to product liability standards similar to those other industries, such as children’s toy makers, already face. In essence, if KOSA were passed and its safeguards put in place, a company whose algorithm seriously harmed or killed a child would be held accountable, giving platforms a stronger incentive to make their algorithms safer and healthier.
The challenge going forward will be to fire up the public around other issues in the tech reform space in the same way as kids’ online safety. As Issue One’s Liana Keesing noted during the event, the reality is that several other tech issues are just as important as kids’ online safety: “With something like kids’ safety, the harms are so clear and so visible, there are thousands of stories. The trick now is carrying that same momentum to issues like data privacy, like transparency, like [platform] research.”
Reforming Section 230 – a liability provision in the 1996 Communications Decency Act that governs internet service providers (ISPs) – would be a genuine step toward fixing some of these other underlying issues. The current version of the law absolves social media companies of consequences for the harms they inflict. Rep. Gephardt, who voted to pass Section 230 in the ’90s, shared insight into how the current-day interpretation of Section 230 is out of alignment with what the law was intended to do. He explained that at the time, the internet and ISPs were far less advanced, and their ability to monitor and manage content was limited, making broad immunity a practical and reasonable protection. Section 230 allowed ISPs to host user-provided content without being held liable for it, while also empowering them to make good-faith efforts to take down harmful content.
Social media companies are no longer passive hosts. With extremely powerful technology at their fingertips, they are fully aware of what’s on their platforms, and they understand exactly how it harms users and amplifies negative downstream impacts across the digital world. “What we need to do is come up with a sensible, bipartisan amendment to Section 230 that would say that, among other things, the platforms can be held liable for harm caused by their affirmative boosting, amplification actions, that is to them ‘speaking,’” said Rep. Gephardt. As long as these companies know they can hide behind Section 230, using it as both a sword and a shield, they will continue to act recklessly.
During the virtual event, Fraser provided an example of this recklessness by shining a light on the tragic death of Nylah Anderson, demonstrating how online content can have devastating real-world consequences. Anderson died in 2021 after attempting a dangerous choking challenge that was amplified to her on TikTok’s “For You” page. Fraser asked, “Shouldn’t someone be held accountable civilly or criminally for these actions? TikTok killed her [Nylah Anderson].” Keesing noted that a systemic imbalance allows these companies to evade liability: “This really has to do with their money and power.”
While Section 230 has allowed these companies to evade responsibility for harmful actions — and harmful inaction — their era of unchecked influence must end. Reform would impose legal and financial stakes, leaving them with no choice but to comply with standards that foster a democracy-enhancing information environment.
This is the biggest challenge facing tech reform policy today. Until Big Tech companies can truly be held accountable, democracy advocates must continue to pursue policies that put the American people ahead of these companies’ profits.