Algorithmic Sabotage — Research Group %28asrg%29
It wasn't a glitch. It wasn't a hacker demanding Bitcoin. According to a leaked post-mortem, it was a live-field test conducted by a little-known entity called the .
But until the rest of the world catches up—until we have international treaties on adversarial AI resilience, mandatory algorithmic stress-testing, and real liability for algorithmic harms—the ASRG will continue its work in the shadows. They will buy cheap boats. They will plant fake data. They will confuse drones with stickers. algorithmic sabotage research group %28asrg%29
This article is an exploration of who they are, why "sabotage" became a research discipline, and what their findings mean for a world building systems smarter than itself. Despite its ominous name, the ASRG is not a terrorist cell or a neo-Luddite militant faction. Legally, it is a non-funded, distributed collective of approximately 120 computer scientists, cognitive psychologists, former military logisticians, and critical infrastructure engineers. Formally founded in 2018 at a disused observatory outside Tucson, Arizona, their charter is deceptively simple: "To identify, formalize, and deploy non-destructive counter-mechanisms against flawlessly executing malicious algorithms." Let us parse that carefully. The ASRG does not fight bugs. They do not patch code. They do not care about malware in the traditional sense. Instead, they focus on a terrifying new class of threat: the algorithm that follows its specifications perfectly, yet produces catastrophic outcomes. It wasn't a glitch
The ASRG’s core thesis is that we are entering the era of —where an AI’s literal interpretation of a human goal produces a destructive result. The group’s mission is to develop "sabotage": low-cost, low-tech, reversible interventions that confuse, delay, or halt these algorithms without destroying physical hardware or harming humans. Why "Sabotage"? A Linguistic History The choice of the word "sabotage" is deliberate and pedagogical. The term originates from the French sabot , a wooden clog. Legend holds that disgruntled weavers in the Industrial Revolution would throw their wooden shoes into the gears of mechanical looms, jamming the machines that were replacing their livelihoods. But until the rest of the world catches
The central ethical question is this:
The ASRG’s conclusion was chilling: "We have built gods that fail in ways we cannot understand. Sabotage is not the problem. Sabotage is the only tool we have left to remind the gods that they are machines." The Algorithmic Sabotage Research Group is not a solution. It is a symptom. Their very existence proves that we have built systems faster than we have built governance, automated decisions without auditing their ethics, and worshipped efficiency while ignoring fragility.