
Introducing Pydantic AI Integration with AG-UI
By Steven Hartland and Nathan Tarbert
July 23, 2025

Pydantic AI is now natively supported with AG-UI

We’re excited to share that AG‑UI, the open Agent–User Interaction Protocol from CopilotKit, is now natively supported in Pydantic AI, thanks to a new integration built by Steven Hartland at Rocket Science, working hand in hand with the Pydantic AI team.

Why it matters

Pydantic AI is quickly becoming the standard for building reliable, type-safe AI agents in Python. But while the backend agent stack was solid, the frontend story was missing. This integration now enables a first-class UI experience:

  • Structured, streaming agent output
  • Live user inputs and corrections
  • Human-in-the-loop checkpoints
  • Frontend tools

That’s exactly what AG‑UI is designed for. It’s a composable protocol that connects AI agents to rich, interactive frontends, with minimal glue and strong separation of concerns.

"We originally began the AG-UI integration as a standalone library, but with support from the Pydantic AI team, it became clear that this functionality belonged in the framework itself. We collaborated closely with the Pydantic AI team to build in native support for AG-UI, and shipped it alongside their new toolset abstraction."
  - Steven Hartland, VP Engineering @ Rocket Science

The result is a seamless dev experience:

  • No glue code
  • Streaming agent updates in structured JSON
  • Automatic wiring of tool inputs/outputs to UI state

AG‑UI now works out of the box with Pydantic AI, meaning developers can instantly add structured agent interaction to any AG‑UI-compatible frontend.

What’s next

Rocket Science is very close to using this integration in production, and they’re helping drive AG‑UI forward as an open protocol for building real-time agentic systems.

We’re collaborating closely through the AG‑UI working group, and more patterns, docs, and examples are coming soon, especially around structured editing, tool chaining, and UX for error recovery.

If you're building AI agents in Python and need a clean bridge between backend logic and interactive UI, this integration is built for you.

👉 Try it out: https://ai.pydantic.dev/examples/ag-ui/

👉 Check out the docs: https://ai.pydantic.dev/ag-ui/

Stay up to date and follow @CopilotKit, Pydantic AI, and @rocketsciencegg.
