UX Collective: Agentic AI, design systems & Figma: a practical guide
Christine Vallaure, UI designer and founder of the moonlearning.io learning platform, opens with a concrete scene: a live demo featuring Brad Frost and Dominic Nguyen discussing agentic design systems. An AI agent was given a single instruction — “Add a customer reviews component” — with no existing component in the file, no Figma frame, no design specification. The agent found Star, Typography, and Avatar components, understood their properties and states, assembled a new component, and wrote the code and tests. The whole process took less time than writing a Jira ticket.
Vallaure uses this moment not to argue that AI replaces designers, but to ask a sharper question: what changes in practice when machines can read and act on design systems directly?
Her answer is that the focus shifts from complete-page mockups to building blocks. In an agentic workflow, the artifacts that matter most are components, tokens, and states — not the polished mockup that gets reviewed once in a handoff meeting. Where designers previously communicated intent through presentation files, they now need to communicate through structure: properly named components, accurate tokens, documented states.
The article draws a distinction between human-readable and machine-readable design. A loosely organized Figma file might survive a human handoff because developers can infer intent from context. An AI agent reads exactly what is there — no inference, no charitable interpretation. Vallaure describes this as an accountability shift. Design system basics in Figma are no longer just documentation for developers; they are now instructions for machines, and their precision matters accordingly.
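To make the human-readable vs. machine-readable distinction concrete, here is a minimal TypeScript sketch of what "structure an agent can read" might look like. All names below (the token set, the `ReviewCard` shape, its states) are hypothetical illustrations, not structures from the article or from Figma's actual data model:

```typescript
// Hypothetical sketch: machine-readable design-system structure.
// Token names, component names, and states are invented for illustration.

// Design tokens: explicit name → value pairs an agent can resolve,
// instead of raw hex values scattered through layers.
const tokens = {
  "color.text.primary": "#1a1a1a",
  "color.surface.raised": "#ffffff",
  "space.sm": "8px",
  "radius.card": "12px",
} as const;

type TokenName = keyof typeof tokens;

// Documented states: an agent reads exactly what is listed here —
// an undocumented hover or error state simply does not exist for it.
type ReviewCardState = "default" | "hover" | "loading" | "error";

// Component properties, typed and named — the machine-readable
// counterpart of a well-structured Figma component.
interface ReviewCardProps {
  rating: 1 | 2 | 3 | 4 | 5; // would map to a Star component
  authorName: string;        // would map to Typography
  avatarUrl?: string;        // would map to Avatar; optionality is explicit
  state: ReviewCardState;
}

// A lookup an agent or build step could call instead of inferring values.
function resolveToken(name: TokenName): string {
  return tokens[name];
}
```

The point of the sketch is the contrast Vallaure draws: a human developer could guess that `#1a1a1a` means "primary text," but an agent only knows what the structure states outright. The quality of its output is bounded by the precision of these definitions.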
She also identifies a subtler risk. The pull toward assembly — generating interfaces quickly from existing components — could gradually displace the conditions that make assembly meaningful. What the Storybook demo showed was not AI doing design: it was AI using what designers had already built. The quality of what the agent produced depended entirely on the quality of the design system it drew from. This puts more responsibility, not less, on the foundational decisions designers make: which components to build, which tokens to apply, and which states to document.
The piece ends with a practical reorientation: less time on full-page mockups, more time on building blocks. Designers who treat their design systems as machine-readable instructions are better positioned for agentic workflows than those who treat them as reference material for developers.
The article is aimed at designers working with Figma who want to understand what changes — concretely and day-to-day — as AI agents become part of the design pipeline. Vallaure brings a practitioner’s eye to a topic that often stays abstract: she names specific Figma structures (components, tokens, states) and explains why each one matters when the reader is a machine rather than a colleague.