Desire Isn't a Survey: Building Genuine Buy-In for AI-Powered Workflows
ADKAR’s Desire phase is about wanting to support and participate in the change. Not begrudgingly. Not because the org said so. Genuinely wanting to be part of what’s coming.
For most change initiatives, we rely on a familiar toolkit: change champions, incentives, communication campaigns, surveys to “measure buy-in.” For AI adoption, that toolkit often fails. People check the box, use the tool when someone’s watching, and revert to old habits when no one is. Desire remains shallow.
Why the Usual Playbook Falls Short
Change champions work when the change is visible and social. “Sarah has been using the new system and loves it—ask her!” That breaks down when the change is individual and subtle. AI adoption happens at the keyboard. There’s no obvious “Sarah using agents” moment to point to. The work is distributed, often private.
Incentives (bonuses, recognition, gamification) can backfire. They signal that the org doesn’t trust people to want the change. They also incentivize the wrong thing: usage metrics instead of outcomes. People learn to game the system—run the agent, ignore the output, claim the badge.
Surveys measure stated preference, not behavior. “How likely are you to use AI tools?” often reflects social desirability bias rather than intent. What we need is observable adoption: people reaching for the tool when it helps, not when they’re reminded.
What Actually Builds Desire
1. Early wins that matter to the individual.
Desire grows when people see themselves succeeding with the new way. A developer who ships a feature in a fraction of the usual time. A PM who generates and refines a spec in an afternoon instead of a week. The win has to be personal and tangible—not a team metric, but “I did this, and it felt better.”
2. Reduced friction, not added process.
If adopting AI means more meetings, more forms, more approvals, desire tanks. If it means less boilerplate, fewer repetitive tasks, more time for interesting work, desire grows. Design the adoption path to remove friction, not add it.
3. Psychological safety to experiment.
People need permission to fail. To try the agent, get a weird output, and say “that didn’t work” without being judged. Desire emerges from experimentation, and experimentation requires safety. Leaders who model “I tried it, it was messy, here’s what I learned” create that safety.
4. Narrative alignment.
Desire connects to identity. “I am someone who uses the best tools to do better work” is an identity people can lean into. “I am someone who complies with the AI mandate” is not. Tie adoption to the craft narrative—this is how we do better work, not how we satisfy a requirement.
Measuring Desire Right
Instead of surveys, watch behavior. Are people using agents when no one’s tracking? Are they sharing tips and discoveries? Are they advocating for more capability? That’s desire. It’s harder to measure, but it’s what matters.
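One way to make "watch behavior" concrete is to separate self-initiated usage from nudged usage in whatever event logs you already have. The sketch below is a minimal illustration, assuming a hypothetical log where each tool invocation records whether it followed a reminder or campaign nudge; all names (`UsageEvent`, `prompted`, `organic_adoption_rate`) are illustrative, not a real API.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical log entry for one tool invocation. The `prompted` flag marks
# usage that followed a nudge (reminder email, manager ask, demo session).
@dataclass
class UsageEvent:
    user: str
    timestamp: datetime
    prompted: bool

def organic_adoption_rate(events: list[UsageEvent]) -> float:
    """Share of active users whose usage is mostly self-initiated.

    A user counts as an organic adopter when more than half of their
    invocations were NOT preceded by a nudge -- a rough behavioral proxy
    for desire, as opposed to survey answers.
    """
    by_user: dict[str, tuple[int, int]] = {}
    for e in events:
        total, organic = by_user.get(e.user, (0, 0))
        by_user[e.user] = (total + 1, organic + (0 if e.prompted else 1))
    if not by_user:
        return 0.0
    adopters = sum(1 for total, organic in by_user.values() if organic / total > 0.5)
    return adopters / len(by_user)
```

The threshold (a simple majority of unprompted sessions) is arbitrary; the point is that the signal comes from logged behavior over time, not from asking people how likely they are to use the tool.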