Don’t leave developers behind in the Section 230 debate


Last week, the US Supreme Court reviewed Section 230 of the Communications Decency Act of 1996 for the first time. During oral arguments in Gonzalez v. Google, the justices raised important questions about platform liability and the risks of viral content.

As the Court weighs these questions, we have an opportunity to reflect on why 230 was created in the first place, how it fosters innovation, and what we all stand to lose if the protections it contains are narrowed.

Nicknamed the “26 words that created the Internet” by Jeff Kosseff, Section 230 establishes a liability shield for platforms that host third-party content. In the early days of the Internet, 230 created favorable legal conditions for the growth of startups and entrepreneurs, allowing the United States to become the world leader in software.

While today’s technology landscape is vastly different from the nascent Internet of the ’90s, the reasoning behind Section 230 still holds true today. Legal architecture can create the conditions for innovation, or it can stifle them.

What seems to have been lost in the debate over its impact on large social media platforms is an appreciation of how Section 230 supports the wider online ecosystem, particularly software developers. Developers are at the heart of our online world and at the forefront of creating solutions to global challenges, working to make our digital infrastructure more secure, reliable and safe.

Policymakers should recognize the critical role of developers and work to support, not stifle, innovation.

Developers rely on 230 to collaborate on platforms like GitHub and to build and operate new platforms that reimagine social media. Narrowing 230’s protections could have far-reaching implications, introducing legal uncertainty into the valuable work of software developers, startups, and the platforms that provide the tools to realize their visions. As policymakers consider how to address new frontiers of intermediary liability, it’s important to center developers in the decisions that shape the future of the Internet.

Software developers contribute significantly to United States economic competitiveness and innovation, and they are important stakeholders in platform policy. GitHub counts 17 million American developers on our platform, more than in any other country. Their open source activity alone contributes more than $100 billion to the US economy each year.

These developers maintain the invisible but essential software infrastructure that supports our daily lives. Almost all software – 97% – contains open source components, often developed and maintained on GitHub.

As the Chief Legal Officer at GitHub, a global community of over 100 million software developers collaborating on code, I know firsthand the importance of keeping 230 intact. While GitHub is far from a general-purpose social media platform, GitHub relies on 230’s safeguards to host third-party content and engage in good-faith content moderation.

That’s especially important for a platform with over 330 million software repositories. With these liability protections, GitHub has been able to grow while maintaining the health of the platform. Our robust, developer-first approach to content moderation aims to keep the platform safe, healthy, and inclusive, and it is tailored to the unique environment of code collaboration, where taking down a single project can have significant downstream effects for thousands or more software projects.

Turning to the specifics of Gonzalez v. Google, which asks the Court to examine whether Section 230’s liability protections cover algorithmically recommended third-party content, a ruling in favor of the petitioners could have unintended consequences for developers. Recommendation algorithms are used in software development in myriad ways that differ from general-purpose social media platforms.

GitHub’s contribution to Microsoft’s amicus brief in the case outlines our concerns: algorithmic recommendations on GitHub are used to connect users with similar interests, surface related software projects, and recommend ways to improve code and fix software vulnerabilities. One such example is GitHub’s CodeQL, a semantic code analysis engine that allows developers to find vulnerabilities and bugs in open source code.
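To give a sense of what that kind of analysis involves, here is a toy sketch in Python, not CodeQL itself (CodeQL uses its own QL query language and builds a semantic model of the code): a short script that walks a repository and flags calls to eval, a common injection risk.

```python
import ast
import pathlib

def find_eval_calls(repo_root: str) -> list[str]:
    """Toy scanner: walk a repository and flag calls to eval()."""
    findings = []
    for path in pathlib.Path(repo_root).rglob("*.py"):
        try:
            tree = ast.parse(path.read_text(encoding="utf-8"))
        except (SyntaxError, UnicodeDecodeError):
            continue  # skip files that do not parse cleanly
        for node in ast.walk(tree):
            # Match direct calls to the built-in eval, a common injection risk
            if (isinstance(node, ast.Call)
                    and isinstance(node.func, ast.Name)
                    and node.func.id == "eval"):
                findings.append(f"{path}:{node.lineno}: call to eval()")
    return findings

if __name__ == "__main__":
    for finding in find_eval_calls("."):
        print(finding)
```

The sketch only conveys the pattern-finding idea; a real engine reasons about data flow across a whole program rather than matching individual syntax nodes.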

Developers are using GitHub to maintain open source projects that employ algorithmic recommendations to block hate speech and remove malicious code. A Court decision narrowing 230 to exclude protection for recommendation algorithms could quickly ensnare a variety of socially valuable services, including tools that maintain the quality and safety of software supply chains.

A ruling in Gonzalez v. Google that rolls back protections aimed at social media platforms could reverberate across the wider online community. As the Court heard the case, several amicus briefs emphasized its broad implications: from nonprofits (Wikimedia Foundation) to community content moderators (Reddit and Reddit moderators) to small businesses and startups (Engine).

While calls to narrow 230 have focused primarily on reining in Big Tech, doing so may inadvertently stifle competition and innovation, creating additional barriers for up-and-coming developers and emerging platforms.

These concerns are not hypothetical: in “How Law Made Silicon Valley,” Anupam Chander examines how the US legal system, in contrast to those of Europe and Asia, created favorable conditions for Internet entrepreneurship. Asian web enterprises, he notes, faced not only copyright and privacy restrictions but also strict intermediary liability laws.

Narrowing 230 would not only harm the United States’ global competitiveness; it would hinder the development of technology within the US. While GitHub has come a long way since our early days, we are committed to leveling the playing field so that anyone, anywhere can become a developer.

While we await the Court’s decision in Gonzalez v. Google, it’s worth noting that regardless of the outcome of the case, there will be further efforts to narrow 230, whether they take on algorithmic recommendations, AI, or other innovations. While these new technologies raise important questions about the future of intermediary liability, policymakers should strive to steer toward a legal environment that supports developers, startups, small businesses, and the nonprofit Internet sector.

Policymakers concerned with reducing harmful content can look to how developers are managing content moderation. Responding to policymakers’ calls for algorithmic transparency, such as those in the Algorithmic Accountability Act of 2022 and the Algorithmic Justice and Online Platform Transparency Act, developers are using GitHub to build valuable software projects, including open source content moderation algorithms.

Platforms including Twitter, Bumble, and Wikimedia have used GitHub to develop algorithms that flag misinformation, filter pornography, and block spam. Open source is driving innovation in content moderation by providing new models for community participation, oversight, and transparency.
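As a loose illustration of what an open, community-maintained moderation component can look like (a toy heuristic, not any of these platforms’ actual systems), consider a few lines of Python that score a post against a shared blocklist:

```python
import re

# Toy, community-editable blocklist. Real systems rely on far richer
# signals (reputation, ML classifiers, report queues) than keyword rules.
SPAM_PATTERNS = [
    r"\bfree money\b",
    r"\bclick here\b",
    r"\bcrypto\s+giveaway\b",
]

def spam_score(text: str) -> int:
    """Count how many blocklist patterns match the post."""
    return sum(
        1 for pattern in SPAM_PATTERNS
        if re.search(pattern, text, re.IGNORECASE)
    )

def should_flag(text: str, threshold: int = 1) -> bool:
    """Queue the post for human review once it crosses the threshold."""
    return spam_score(text) >= threshold

assert should_flag("Click HERE for free money!!!")
assert not should_flag("Here are the meeting notes from Tuesday.")
```

Because the patterns live in a public repository, anyone can audit them, propose changes, or fork them, which is exactly the kind of community participation and transparency that closed moderation pipelines lack.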

As we face new frontiers of intermediary liability, policymakers must recognize the critical role of developers and work to support, not stifle, innovation.


