By Atai Barkai
September 10, 2024

Coagent: LangGraph and CopilotKit streaming an AI agent state

We are quickly learning (and re-learning) that agents perform best when they work alongside people. It's far easier to get an agent to perform 70%, even 80% or 90%, of a given task than to have it complete that task fully autonomously (see: self-driving cars).

However, to facilitate seamless human-AI collaboration, we must connect AI agents to the real-time application context: agents should be able to see what the end-user sees and do what the end-user can do inside the application.

End-users should be able to monitor agents' executions - and bring them back on track when they go off the rails.

This is where CoAgents come into play: agents that expose their operations through custom UI that end-users can understand, and that end-users can steer back on track should the agents go off the rails.

Human-in-the-loop is essential for creating effective and valuable AI agents. By integrating CoAgents with LangGraph, we elevate in-app human-in-the-loop agents to a new level. This is the next focus for CopilotKit: simplifying the development process for anyone looking to harness this technology. We're already building the infrastructure for CoAgents today and want to share our process and learnings as we go. Want early access to this technology? Sign up here.

A framework for building human-in-the-loop agents

CopilotKit is the simplest way to integrate production-ready copilots into any product, with both open-source tooling and an enterprise-ready cloud platform. When we look at CoAgents, a few critical use cases crop up most frequently:

  • Streaming intermediate agent state - provides insight into the agent's work in progress, rendering a custom component for the LangGraph node currently being processed.
  • Shared state between the agent, the user, and the app - lets an agent and an end-user collaborate on the same data via bi-directional syncing between app state and agent state.
  • Agent-led Q&A - lets an agent intentionally ask the user a question, in conversation and app context, with support for responses and follow-ups via the UI. For example: “Your planned trip seems particularly expensive this week, would you like to reschedule for next week?”
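To make the first pattern concrete, here is a minimal, framework-free Python sketch of an agent loop that yields a state snapshot after every node, which is what lets a frontend render work-in-progress rather than only the final result. All names here are illustrative assumptions, not CopilotKit's or LangGraph's actual API:

```python
from typing import Callable, Iterator

# A node transforms the shared state dict and returns the updated state.
Node = Callable[[dict], dict]

def stream_agent(state: dict, nodes: list[tuple[str, Node]]) -> Iterator[tuple[str, dict]]:
    """Run each node in order, yielding (node_name, state_snapshot) after
    every step so a UI can render the agent's work in progress."""
    for name, node in nodes:
        state = node(state)
        yield name, dict(state)  # shallow snapshot for the frontend

def draft_outline(state: dict) -> dict:
    state["outline"] = f"Outline for: {state['topic']}"
    return state

def generate_characters(state: dict) -> dict:
    state["characters"] = ["a curious fox", "a patient owl"]
    return state

snapshots = list(stream_agent(
    {"topic": "a forest adventure"},
    [("draft_outline", draft_outline), ("generate_characters", generate_characters)],
))
# Each snapshot tells the UI which node just ran and the full state so far.
```

With CoAgents, the frontend consumes exactly this kind of per-node stream and maps each node name to a custom component.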

At CopilotKit, we are dedicated to creating a platform that simplifies the development of copilots by providing developers with the most powerful tools at their disposal.

Our next step in this journey is to enable this through CoAgents, beginning with support for streaming intermediate agent state.

Ready to walk through some of this in action?

Build an AI CoAgent children's storybook with LangGraph and CopilotKit

Let's see what streaming agent state looks like in our storybook app. We've built an AI agent that helps you write a children's storybook: the agent chats with the user to develop a story outline, generate characters, outline chapters, and write image descriptions, which we then turn into illustrations using DALL-E 3.
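As a rough sketch of those stages (hypothetical names and placeholder content; the real app drives each step through a LangGraph graph and a chat loop with the user), the agent's shared state grows node by node:

```python
def story_pipeline(topic: str) -> dict:
    """Toy version of the storybook agent's stages: each step extends one
    shared state dict, which a CoAgent would stream to the UI as it runs."""
    state = {"topic": topic}
    # 1. Develop a story outline (in the real app, in conversation with the user).
    state["outline"] = f"A children's story about {topic}."
    # 2. Generate characters.
    state["characters"] = ["a brave hedgehog", "a wise heron"]
    # 3. Outline chapters.
    state["chapters"] = [f"Chapter {i}: {topic}, part {i}" for i in (1, 2, 3)]
    # 4. Write image descriptions, which would then be sent to DALL-E 3.
    state["images"] = [f"Watercolor illustration for {ch}" for ch in state["chapters"]]
    return state

book = story_pipeline("a lighthouse keeper's cat")
```

Because every stage writes to the same state dict, the UI can render each intermediate field (outline, characters, chapters, images) as soon as the corresponding node finishes.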

This first version of CoAgents, with support for LangGraph, will be released fully open source - sign up for first access here.
