Steve Krug's 2000 mantra 'Don't Make Me Think' built a generation of designers who eliminated friction so effectively that they engineered the conditions for mass behavioral manipulation. The usability movement succeeded. Interfaces became seamless. Users became products. Attention became a commodity traded against behavioral prediction models precise enough, the author argues, to move elections and radicalize teenagers. This is not a metaphor. It is an architectural description of what human-centered design optimized for when it asked only one question: does this work for the user right now?
The EU has spent a decade building the regulatory answer to that failure. The Digital Services Act now classifies dark patterns as violations, not clever UX. The AI Act prohibits subliminal manipulation, deceptive systems, and the exploitation of vulnerabilities tied to age, disability, or economic status. The European Commission has already moved against TikTok and Meta under the DSA's transparency obligations. The author maps five compounding regulatory dimensions that now govern every digital product in Europe: data, decisions, resilience, scale and power, and accountability. Obligations stack; they do not replace each other. Compliance cannot be retrofitted.
What makes this worth reading in full is not the regulatory inventory. It is the generational argument underneath it. The author draws a direct line from Don Norman's 1988 'The Design of Everyday Things' through Krug, through the attention economy, to the current compliance architecture, and makes the case that the second generation of human-centered design must ask a structurally different question: does this work for the human being, the citizen, and the society they live in? That reframe has consequences for every product decision made before legal review ever sees it.
[READ ORIGINAL →]