A colleague recently showed me a synthetic persona they had built for early-stage product discovery. From a technology perspective, it was impressive. From a product development perspective, I was highly alarmed. The profile had been constructed from imagination, not from contact with real people. My colleague was convinced it was sufficient.
This is what I keep seeing.
The level that matters
When I moved into a leadership role, I removed my own editing rights in Figma. It was a clear statement – for myself as much as for others – that I was in a new role and would not design. For delegation in my team, I use the Five Levels of Delegation by Michael Hyatt.[1]
1. Do exactly this – complete the task as specified
2. Research and report – bring findings, I decide
3. Research, recommend, and wait – propose a direction and wait for approval
4. Decide and inform me – make the call, then let me know
5. Act independently – decide and act without reporting back
The same framework applies to AI. At levels 1–3, where I define the task, review the output, and make the final call, AI is a powerful accelerant. I use it to think through problems, draft structures, and stress-test reasoning. I make the decisions and own the outcomes. The level you set is not just a workflow choice – it is a decision about where accountability lives.
The synthetic persona problem operates at level 4 or level 5 – a consequential judgment handed over to a system that has no access to the actual user. Discovery work requires direct contact with people – to find what we could not have anticipated.
We cannot imagine the full extent of a real person’s response – the specific detail, the vivid explanation, the contradiction between what someone says and what they actually do. Nielsen Norman Group tested this directly: when asked about an online learning product, synthetic users reported completing all the courses. Real participants had abandoned them.[2] When imagined input shapes early discovery, it biases every decision downstream. The team proceeds with confidence. The information is noise.
The same is true for design work itself. AI agents now produce visually convincing interfaces. To non-designers, the output can look phenomenal. For simpler problems, it may be sufficient. For demanding, high-stakes products, the gap shows in the nuance – details that experienced designers recognize but rarely document, and that determine whether a design actually works for real people in real conditions.
What the films already showed us
In Her, Samantha begins as a level 4 assistant – recommending actions on Theodore Twombly’s behalf. By the end, she is running his life without consent. HAL 9000 in 2001: A Space Odyssey decides autonomously to protect its own continuity and kills the crew. The pattern is consistent: authority transferred beyond what can be held accountable. The failure is never in the technology. It is in the unchecked delegation.
Annie Duke writes that the quality of a decision cannot be read from its outcome.[3] Synthetic personas are exactly this kind of input – plausible on the surface, wrong in the substance. A confident decision made on bad information can still produce a good result, and you will not know the difference until much later, when the connection to the original mistake is invisible.
Where to draw the line
The practical move is to decide the level before you open the prompt. What are you asking for, and what will you do with the output? At level 1, you verify and execute. At level 3, you evaluate a recommendation and decide. When the level is named in advance, the decision stays with you. When it is not named, authority transfers by default – and you may not notice until the options have already narrowed.
This applies at any stage of a career. Students entering the market have a real advantage – they have played with and pushed AI tools further than most practitioners. The gap is not technical fluency. It is process knowledge and direct contact with people. AI accelerates parts of the work. The designer keeps the direction.
The designer is always the final responsible person. AI will not care. It will not face consequences.
[1] Michael Hyatt, “The Five Levels of Delegation,” Full Focus (blog), accessed May 2026.
[2] Maria Rosala and Kate Moran, “Synthetic Users in UX Research,” Nielsen Norman Group, June 21, 2024.
[3] Annie Duke, How to Decide (New York: Portfolio, 2020).