Open source software has long been celebrated for its collaborative spirit, but the reality has always been more complicated. Most projects are maintained by a small handful of unpaid volunteers, often just one or two people, who keep the infrastructure running for thousands of companies. This fragile model worked when contributing required significant effort and investment. But artificial intelligence is now dismantling that friction, and the consequences for open source are profound.
The rise of slop pull requests
Large language models and coding agents have made it trivially easy to generate code and submit changes to public repositories. Mitchell Hashimoto, a co-founder of HashiCorp and a longtime pillar of the open source community, has publicly considered closing his projects to external pull requests entirely. He is not losing faith in open source, but rather drowning in what he calls "slop PRs" — low-effort, AI-generated submissions that lack the context and care of human contributions.
Flask creator Armin Ronacher has described this phenomenon as "agent psychosis," in which developers become hooked on the dopamine rush of agentic coding and let their AI assistants run wild through repositories. The result is a massive degradation in quality. These pull requests may look plausible — fluent, confident output is precisely what a statistical model is optimized to produce — but they often miss the nuanced trade-offs and historical understanding that a human maintainer brings to the table.
The economics of review asymmetry
At the heart of the problem is a brutal asymmetry in effort and reward. It takes a developer roughly 60 seconds to prompt an AI agent to fix typos or optimize loops across multiple files. But for a maintainer, reviewing those same changes can take an hour of careful scrutiny — verifying edge cases, ensuring alignment with the project's long-term vision, and communicating with the contributor. When a hundred contributors each use their personal LLM assistants, the maintainer becomes overwhelmed and eventually burns out.
The OCaml community experienced this vividly when maintainers rejected an AI-generated pull request containing more than 13,000 lines of code. The reasons cited included copyright concerns, lack of review resources, and the long-term maintenance burden. One maintainer warned that such low-effort submissions risk bringing the entire pull request system to a halt.
Even GitHub, the world's largest code forge, is feeling the pressure. As reported by industry observers, GitHub is exploring tighter pull request controls and even UI-level deletion options because maintainers are inundated with AI-generated submissions. When the platform itself considers implementing a kill switch for pull requests, it's clear that we are witnessing a structural shift in how open source software is built.
Small libraries face extinction
Small, utility-focused open source libraries are among the hardest hit. Nolan Lawson, author of blob-util — a JavaScript library downloaded millions of times — recently argued that the era of the small utility library is over. In the age of Claude and GPT-5, developers no longer need to install a dependency to get a simple function. They can simply ask an AI to generate the code on the fly, producing a perfectly serviceable snippet in seconds. The incentive to maintain a dedicated library vanishes when the same output can be obtained instantly from a model.
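To make the point concrete, here is the kind of one-off helper such libraries bundled — an array-chunking function, chosen purely as an illustration (it is not from blob-util) — that once justified installing a standalone npm package and that an AI assistant can now regenerate on demand:

```typescript
// Illustrative sketch: the sort of small, self-contained utility that
// used to live in a dedicated npm package, now trivially regenerated
// inline by a coding assistant.
function chunk<T>(items: T[], size: number): T[][] {
  if (size <= 0) {
    throw new RangeError("size must be a positive integer");
  }
  const result: T[][] = [];
  // Walk the array in strides of `size`, slicing one chunk per step.
  for (let i = 0; i < items.length; i += size) {
    result.push(items.slice(i, i + size));
  }
  return result;
}

// chunk([1, 2, 3, 4, 5], 2) yields [[1, 2], [3, 4], [5]]
```

A dozen lines like these carry no API surface worth versioning, which is exactly why the economic case for publishing and maintaining them as a package has collapsed.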
Lawson laments that something deeper is being lost: these libraries were educational tools. Developers learned how to solve problems by reading and understanding the work of others. By replacing those libraries with ephemeral AI-generated snippets, we trade understanding for instant answers. The teaching mentality that lies at the heart of open source begins to erode.
From bazaar to curated garden
Armin Ronacher has provocatively suggested that the logical response to this dynamic is retreat. He advocates for building more code yourself, reducing dependencies, and using AI as a personal assistant rather than a source of external contributions. The irony is stark: AI increases the volume of low-quality contributions while simultaneously reducing the demand for small, reusable libraries. Developers are encouraged to keep their code inside their own walls, relying less on the open source commons.
This bifurcation is reshaping the entire ecosystem. On one side, massive, enterprise-backed projects like Linux or Kubernetes have the resources to build their own AI-filtering tools and ignore the noise. They are the cathedrals, guarded by sophisticated gates. On the other side are the smaller projects run by individuals or tiny core teams, who simply stop accepting outside contributions altogether. The rank and file of open source may find themselves locked out of the collaborative process they helped build.
Artificial intelligence was supposed to democratize software development, making it accessible to more people than ever before. And in some ways it has. But by lowering the barrier to contribution, it has simultaneously lowered the value of each contribution. When everyone can generate code, no single contribution is special. When code becomes a commodity produced by machines, the only scarce resource left is the human judgment required to say no.
The future: radical curation
Open source is not dying, but the meaning of "open" is being redefined. The era of radical transparency and "anyone can contribute" is giving way to an era of radical curation. The future of open source may belong to the few, not the many. Yes, the communal model was always somewhat romanticized, but AI has now made even that imperfect ideal unsustainable. We are returning to a world where the only people who truly matter are those who actually write the code — not those who prompt a machine to do it for them. The drive-by contributor is being replaced by the verified human.
In this new landscape, the most successful open source projects will be those that are hardest to contribute to. They will demand a high level of human effort, context, and relationship. They will reject slop PRs and agent psychosis in favor of slow, deliberate, and deeply personal development. The bazaar was a fine idea while it lasted, but it could not survive the arrival of the robots. The future of open source is smaller, quieter, and much more exclusive. That might be the only way it survives.
We do not need more code. We need more care — care for the humans who shepherd communities and create code that will endure beyond a simple prompt. As the tools change, the essential value of open source remains: the human judgment, the trust, and the willingness to maintain something for the long haul. Without that, even the most advanced AI cannot save open source from itself.
Source: InfoWorld News