Fast Principles
Your next app will build itself to collaborate with YOU
Looking around the corner at the future of UI design for AI agent-powered collaborative productivity tools
In last week's newsletter, we explored the concept of the "magical deliverable" - those AI-generated artifacts that would typically take hours or days to create manually, but can be produced in minutes through effective AI collaboration. While the promise is compelling, the reality of endless prompt refinement cycles has revealed a crucial missing piece: intuitive, personalized interfaces for AI interaction.
The current design challenge
Most AI interactions today rely on chat interfaces and prompt engineering. While powerful, this approach often leads to lengthy refinement cycles or complete do-overs. The core issue isn't the AI's capabilities, but rather how we interact with these systems. Text-based interactions, while flexible, lack the immediate feedback and direct control we've come to expect in professional tools.
Design language systems: the human foundation
The solution begins with human-designed guardrails. AI Design Language Systems (AI DLS) are emerging as the crucial bridge between AI capabilities and user needs. Unlike traditional design systems, AI-ready DLS serve dual purposes:
They provide consistent components and patterns, created by human designers, for AI agents to follow when composing custom interfaces for users
They serve as a library of elements AI agents can draw from when building productivity applications tuned to specific users
They go beyond today's DLS work by adding details about personalization for a specific user
The key difference is in governance. Human design teams establish core UI components, interaction patterns, and usage guidelines. AI agents then draw from this foundation to create custom interfaces, ensuring consistency while enabling personalization. This human-led approach prevents the "hallucination" of impossible interfaces while maintaining brand and usability standards.
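To make the governance idea concrete, here is a minimal sketch of what an AI-ready DLS could look like in code. Everything here is hypothetical (the component names, the `ComponentSpec` shape, the `validate` function): human designers author the registry and its composition rules, and the AI agent's proposed layouts are checked against it, which is the guardrail that blocks "hallucinated" interfaces.

```typescript
// Hypothetical sketch of an AI-ready design language system (DLS).
// Human designers author the registry; AI agents may only compose
// interfaces from registered components, per the usage rules.
type ComponentSpec = {
  name: string;
  allowedChildren: string[]; // composition rules set by the design team
  requiredProps: string[];   // usage guidelines the agent must satisfy
};

type UINode = {
  component: string;
  props: Record<string, unknown>;
  children: UINode[];
};

const registry: Record<string, ComponentSpec> = {
  Card:    { name: "Card",    allowedChildren: ["Chart", "Toolbar"], requiredProps: ["title"] },
  Chart:   { name: "Chart",   allowedChildren: [],                   requiredProps: ["data"] },
  Toolbar: { name: "Toolbar", allowedChildren: [],                   requiredProps: ["actions"] },
};

// Reject any AI-proposed layout that uses unregistered components,
// breaks composition rules, or omits required props.
function validate(node: UINode, reg: Record<string, ComponentSpec>): boolean {
  const spec = reg[node.component];
  if (!spec) return false; // unregistered component: a "hallucinated" element
  if (!spec.requiredProps.every((p) => p in node.props)) return false;
  return node.children.every(
    (child) => spec.allowedChildren.includes(child.component) && validate(child, reg)
  );
}
```

In this sketch, an agent proposing a `Card` containing a `Chart` passes validation, while a `Card` missing its `title` prop, or any component not in the registry, is rejected before it ever reaches the user.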
Personalized, contextual interfaces
Imagine requesting a data visualization. Instead of launching a standard charting tool, the AI reconfigures the interface based on your specific needs. Common tools are positioned where you expect them, while specialized controls appear based on your request. The interface adapts to your working style, remembering preferences without becoming cluttered.
This isn't just about convenience. These personalized interfaces serve as visual constraints, clearly communicating what the AI can and cannot do. When you can see the available tools, you're less likely to request impossible features, reducing friction in the AI collaboration process.
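One way this adaptive behavior could work, sketched below with entirely hypothetical names (`UserProfile`, `composeToolbar`, the tool IDs): common tools are ordered by the user's past usage so they land where expected, and specialized controls are appended only when the current request calls for them.

```typescript
// Illustrative sketch: adapt a toolbar to a user's habits and current intent.
type UserProfile = { toolUsage: Record<string, number> }; // past click counts

// Specialized controls surfaced only for matching requests,
// so the interface stays uncluttered by default.
const specializedTools: Record<string, string[]> = {
  visualization: ["axis-scale", "color-palette"],
  writing: ["tone", "reading-level"],
};

function composeToolbar(
  profile: UserProfile,
  intent: string,
  commonTools: string[]
): string[] {
  // Common tools go where the user expects them: ordered by past usage.
  const ranked = [...commonTools].sort(
    (a, b) => (profile.toolUsage[b] ?? 0) - (profile.toolUsage[a] ?? 0)
  );
  // Visible tools double as constraints: what you can see is what the agent can do.
  return [...ranked, ...(specializedTools[intent] ?? [])];
}
```

For a user who filters data constantly, a visualization request would surface `filter` first, then the chart-specific controls; a request outside the known intents falls back to the ranked common tools alone.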
What is the role of designers in this world of dynamic, personalized software products? Setting the DLS governance, usage guidelines, and common patterns for AI agents to follow will become a key responsibility. This will 100x the influence of human designers across a wide breadth of software solutions, each then refined through the relationship between the human user and their AI agent.
Real-world implementation
A few early examples of this approach are already emerging:
Galileo UI demonstrates how AI can generate complete interfaces, then gives users tools to change the style or open the result directly in Figma for direct design control.

Users can apply global style changes and then open in Figma for direct editing of the screens
Figma AI lets users prompt an agent to start a task. When complete, the result is built natively in Figma, allowing human users to pick up the work and refine it to completion.

The AI agent draws an initial pass directly in Figma allowing direct user editing when complete
ChatGPT Canvas gives users a new tool to refine and directly edit LLM-generated content. Users can prompt off a selection to change a specific sentence or paragraph, or adjust parameters like the reading level of the piece.

Direct edit or prompt within the output of the LLM
While these tools don't yet create interfaces that guide a conversation or a specific deliverable, they highlight a future where software capable of generating bespoke interfaces on the fly is built for each user.
The path forward as a 1,000x creative
In this brave new, highly personalized digital realm that AI agents will help us create, what is the role of designers and developers?
Today, we've all heard the hyperbole of the 100x developer, but are we societally ready to take it to the next level? These personalized UI systems will amplify human creators' influence by 1,000x, transforming them from digital craftsmen into architects of the online world. Rather than coding individual features, they'll design entire systems and set the rules that shape how millions of personalized applications are built, dramatically expanding their impact on users' digital experiences.
Key principles for preparing custom apps for an AI DLS-enabled future:
Start with a strong design system as a foundation with clear components and usage details through governance documentation.
Focus on building user feedback mechanisms into key feature interactions to continually improve the personalization to the user.
Make AI agent constraints visible and understandable. Don't just bury them in instruction documentation; highlight these edges within the interface.
Allow direct content editing as a form of feedback. Basic, I know, but strangely missing from most modern LLM interfaces.
Have AI agents “remember” the personalized user applications to guide future collaborative work sessions.
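The feedback and memory principles above can be sketched together. This is a hedged illustration, not a prescribed implementation, and every name in it (`Feedback`, `SessionMemory`, `applyFeedback`) is hypothetical: a direct content edit is treated as a preference signal and persisted, so the agent "remembers" it in future collaborative sessions.

```typescript
// Hypothetical sketch: treat a user's direct edit as a feedback signal
// and persist it so future sessions start from the learned preference.
type Feedback = {
  target: string; // which aspect was edited, e.g. "reading-level"
  before: string; // what the agent originally produced
  after: string;  // what the user changed it to
};

type SessionMemory = { preferences: Record<string, string> };

// Returns updated memory without mutating the original, so prior
// session state can still be inspected or rolled back.
function applyFeedback(memory: SessionMemory, fb: Feedback): SessionMemory {
  return {
    preferences: { ...memory.preferences, [fb.target]: fb.after },
  };
}
```

When the user rewrites a passage down to a middle-school reading level, the agent records `reading-level: "middle school"` and can default to it next session, instead of restarting the eternal first meeting.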

Pyramid diagram: base-of-the-pyramid constraints up to technical AI agent elements
What's holding us back?
The gap between vision and reality in AI-powered interfaces comes down to three main challenges:
Technical limitations: Current LLMs don't truly understand UI components or interaction patterns. They can generate code but lack a deep understanding of interface design principles.
Missing standards: We lack common protocols for AI interpretation of design systems, making each implementation a custom solution.
Performance constraints: Real-time interface generation and modification requires significant optimization to feel natural.
Compute usage will also increase with this complex visual layer. Any serious deployment should weigh that resource cost against the user benefit.
Looking ahead
The next frontier in AI isn't just about smarter algorithms—it's about crafting deeply personalized digital experiences through thoughtfully designed agent interfaces. Today's AI assistants often feel like talking to a new person each time, stuck in an eternal first meeting.
But imagine agents that adapt their UI to your preferences: adjusting their communication style, anticipating your needs, and presenting information in ways that resonate with your learning style. While future advances will bring richer personality development and persistent memory, our current focus is on the immediate design challenge: How do we create interfaces that make AI interactions feel personal and consistent, even within single sessions? From customizable chat layouts to an adaptive suite of applications, these UI innovations lay the groundwork for truly personalized AI companions.
The goal isn't to replace existing interfaces but to enhance them with intelligence and adaptability. As these systems evolve, the "magical deliverable" becomes less about the end product and more about the seamless process of creation.
Stay up-to-date with AI
The Rundown is the most trusted AI newsletter in the world, with 1,000,000+ readers and exclusive interviews with AI leaders like Mark Zuckerberg, Demis Hassabis, Mustafa Suleyman, and more.
Their expert research team spends all day learning what’s new in AI and talking with industry experts, then distills the most important developments into one free email every morning.
Plus, complete the quiz after signing up and they’ll recommend the best AI tools, guides, and courses – tailored to your needs.
Forward it to a friend and have them sign up here.
Until next time, keep innovating and stay curious!