Responsibility Fog
How accountability dissolves when decisions are distributed across algorithms, committees, and platforms.
When a hiring algorithm rejects a candidate, who is responsible? The developer? The deploying organisation? The vendor? The regulator who approved the framework? Responsibility Fog describes the systematic diffusion of accountability in AI-mediated decision-making: not a bug, but an emergent property of how institutions adopt algorithmic systems. The concept traces back to my 2015 BA thesis on telecommunications data retention, where I first identified the unclear distribution of responsibility in surveillance systems.
First introduced: Beyond Fragmentation (MA thesis, 2026, grade: 9.5/10)
Forthcoming: The Irreducible Human (Brill Publishers, with Dr. Kristian Guttesen)