
CopilotKit v1.0 Launch 🚀 GenUI, GraphQL protocol, React SDK hooks

By Atai Barkai
July 8, 2024

Today, tech giants are investing heavily in embedded AI copilots: intelligent virtual assistants built into their products. Developers who want to integrate similar assistants have had to either use inflexible ‘low-code’ solutions or build everything from scratch.

That’s why we built CopilotKit: for any developer to quickly integrate custom AI copilot experiences into products, with the full power of the latest AI tooling behind it.

CopilotKit v1.0 brings significant improvements in performance and in both the user and developer experience, plus the latest on the CopilotKit Cloud beta.

Let’s dive into what’s included in the v1.0 release, or watch the video walkthrough here.

A platform rebuilt for top performance and dev experience with GraphQL

CopilotKit 1.0 delivers a major new experience built on a refined protocol and a clean abstraction layer between the application and the LLM engine: the Copilot Runtime. Utilizing GraphQL, this new structure is more robust, extensible, and ready to support innovation in copilot technologies.

In its first version, built for our own internal tools, the Copilot backend acted as a simple proxy to an LLM REST API. Today, the runtime uses a dedicated GraphQL API that accepts typed, copilot-specific input fields and returns typed, copilot-specific output fields - satisfying the diverse needs of a modern copilot system, including the new features below.


Using GraphQL's @stream directive, each output field can stream data independently and in parallel - crucial for real-time, user-facing LLM applications. This well-typed input and output structure, with support for sophisticated streaming, also simplifies contributions to our open-source framework.
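To make the streaming model concrete, here is a minimal, hypothetical sketch of how a client might merge GraphQL @stream incremental payloads into a single response object. The field names and payload shape are illustrative, not CopilotKit's actual wire format:

```typescript
// Hypothetical sketch of GraphQL @stream incremental delivery: the server
// sends an initial payload, then follow-up payloads that each append newly
// streamed items to a list at a given path within the response.
type IncrementalPayload = {
  path: (string | number)[]; // where in the response the items belong
  items: unknown[];          // newly streamed list items
};

function applyIncremental(
  response: Record<string, unknown>,
  payload: IncrementalPayload
): void {
  // Walk the path down to the target list, then append the new items.
  let node: any = response;
  for (const key of payload.path) {
    node = node[key];
  }
  node.push(...payload.items);
}

// Example: messages stream in one at a time while the UI stays responsive.
const response = { generateCopilotResponse: { messages: [] as unknown[] } };
applyIncremental(response, {
  path: ["generateCopilotResponse", "messages"],
  items: [{ role: "assistant", content: "Hello" }],
});
```

The key point is that each streamed field arrives as its own payload, so the UI can render messages as they come rather than waiting for the full response.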

Interested in contributing? Talk to us on the contribution channel in our Discord!

Copilot Cloud now plug-and-play hosted (beta)

Copilot Cloud now extends the open-source CopilotKit with features for scale and enterprise requirements. The new managed Copilot Runtime enables one-click deployment, even on private clouds, and offers enterprise-ready functionalities requiring a stateful cloud environment.

Beyond deployment, you can now leverage Copilot Guardrails, a mission-critical tool that keeps your copilot focused on the task at hand and maintains a controlled experience for your users. See how allowlists and denylists work here:

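Conceptually, topic guardrails reduce to a simple allow/deny decision per message. This hypothetical sketch illustrates the idea only - the real Copilot Guardrails run in Copilot Cloud and classify topics with an LLM rather than by keyword matching:

```typescript
// Hypothetical illustration of allowlist/denylist guardrails: a message is
// permitted only if it matches an allowed topic and no denied topic.
// (Simple keyword matching stands in for LLM-based topic classification.)
function isAllowed(
  message: string,
  allowlist: string[],
  denylist: string[]
): boolean {
  const text = message.toLowerCase();
  const mentions = (topic: string) => text.includes(topic.toLowerCase());
  if (denylist.some(mentions)) return false; // deny always wins
  // An empty allowlist means "any topic not explicitly denied".
  return allowlist.length === 0 || allowlist.some(mentions);
}

// A travel copilot that must never discuss pricing internals:
isAllowed("Book me a flight to Paris", ["flight", "hotel"], ["pricing"]);   // → true
isAllowed("Explain your internal pricing model", ["flight"], ["pricing"]);  // → false
```

Having the denylist take precedence over the allowlist is what keeps the copilot from being steered off-task even by otherwise on-topic requests.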

Copilot Cloud is in beta and accessible here. To get early access to upcoming beta features - like realtime RAG, chat histories and knowledge bases, and PII filtering - get in touch here.


Generative UI for custom components

The end-user experience of an AI copilot demands a level of visual engagement, making generative UI critical for developers. You can now easily generate visual components that update in real time, from thumbnail sharing formats to dynamic messages triggered by user actions.

With render, developers can return a client-side React component - perfect for creating dynamic content that responds to user input and actions.

Take a look at the sample code, which shows “processing…” while the action is executing and “done” when complete:

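As a rough sketch of the idea, the render callback receives the action's execution status and returns different content for each phase. The status names below follow the useCopilotAction docs; treat the details as illustrative rather than exact:

```typescript
// Illustrative sketch of render-callback logic: while an action's handler
// is running, show "Processing…"; once it resolves, show "Done" (or the
// handler's result). In CopilotKit this logic would live inside the
// `render` option of useCopilotAction.
type ActionStatus = "inProgress" | "executing" | "complete";

function renderStatus(status: ActionStatus, result?: string): string {
  if (status === "complete") {
    return result ?? "Done";
  }
  return "Processing…";
}
```

In a real app the callback would typically return a React component instead of a string, so the in-progress and completed states can be fully styled UI.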

Read the full docs here:  https://docs.copilotkit.ai/reference/hooks/useCopilotAction#generative-ui

Better developer experiences in the SDK with React Hooks

These latest CopilotKit hooks provide clean abstractions for customizing your copilot’s behavior in powerful ways to meet real-time customer needs. Introducing useCopilotAction, useCopilotReadable, and useCopilotChatSuggestions:

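The three hooks cover the three directions of copilot integration: useCopilotAction lets the copilot act in your app, useCopilotReadable exposes app state to it, and useCopilotChatSuggestions proposes next steps to the user. As a rough, hypothetical model of what “readable” state means, here is how registered state might be flattened into context for the LLM - the real serialization is internal to CopilotKit:

```typescript
// Hypothetical model of useCopilotReadable: each call registers a described
// piece of app state; before each LLM call, the runtime flattens the
// registered entries into context text the model can read.
type ReadableEntry = { description: string; value: unknown };

function buildContext(entries: ReadableEntry[]): string {
  return entries
    .map((e) => `${e.description}: ${JSON.stringify(e.value)}`)
    .join("\n");
}

const context = buildContext([
  { description: "Current user", value: { name: "Ada" } },
  { description: "Open todos", value: ["ship v1.0"] },
]);
// context now contains both entries, one per line
```

The practical upshot: instead of hand-building prompts, you describe each piece of state once where it lives in your React tree, and the copilot always sees the current values.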

What’s next on the CopilotKit roadmap

We can’t wait to hear what you think about the latest release! The upcoming CopilotKit roadmap is full of exciting new features, and we can’t help but give you a sneak preview. We highly value your input and want to hear from you in our weekly office hours on Discord (Thursdays at 10am PST).

Up next for CopilotKit users:

  • Co-agents: LangChain and LangGraph integration that lets end-users easily steer agents, bringing the power of human-in-the-loop to the powerful world of agents. For background, see the blog post we wrote previously: LangChain Collaboration Blog Post
  • Hosted and language-agnostic backend extensions, including a new Python SDK for better interfacing with your current environment

What’s in flight for your AI copilot? Get started today with v1.0 on our GitHub. Interested in being part of the CopilotKit Cloud beta? Sign up here.
