This repo demonstrates Human-in-the-Loop (HITL) with LlamaIndex and CopilotKit. It includes a Next.js app connected to a LlamaIndex agent that drafts an essay via a tool (write_essay), renders the draft in the chat for review, and waits for you to either accept or ignore the draft. When accepted, the essay is saved to shared state and displayed in the center panel.
- Ask the assistant: “Write an essay about …”.
- The agent calls the `write_essay` tool, which uses `renderAndWaitForResponse` to show the draft.
- You choose: Accept Draft (sends `SEND`) or Ignore Draft (sends `CANCEL`).
- On accept, the UI saves the draft to shared state and renders it on the page.
- The action uses `followUp: false` to prevent loops after approval.
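In CopilotKit, this pattern is typically wired up with `useCopilotAction`. The sketch below is illustrative rather than copied from this repo: the action name and response strings match the description above, but the parameter shape and button markup are assumptions.

```tsx
// Sketch: a renderAndWaitForResponse action with followUp disabled.
// The parameter shape and markup are assumptions, not this repo's code.
useCopilotAction({
  name: "write_essay",
  parameters: [{ name: "essay", type: "string", required: true }],
  followUp: false, // prevent the agent from looping after approval
  renderAndWaitForResponse: ({ args, respond, status }) => (
    <div>
      <pre>{args.essay}</pre>
      {status !== "complete" && (
        <>
          <button onClick={() => respond?.("SEND")}>Accept Draft</button>
          <button onClick={() => respond?.("CANCEL")}>Ignore Draft</button>
        </>
      )}
    </div>
  ),
});
```

Until `respond` is called, the agent run stays suspended — that is the "wait" in Human-in-the-Loop.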
- Node.js 18+
- Python 3.9+
- OpenAI API Key (for the LlamaIndex agent)
- uv
- Any of the following package managers:
- pnpm (recommended)
- npm
- yarn
- bun
Note: This repository ignores lock files (package-lock.json, yarn.lock, pnpm-lock.yaml, bun.lockb) to avoid conflicts between different package managers. Each developer should generate their own lock file with their preferred package manager, then remove that lock file's entry from .gitignore so it can be committed.
- Install dependencies using your preferred package manager:

```bash
# Using pnpm (recommended)
pnpm install

# Using npm
npm install

# Using yarn
yarn install

# Using bun
bun install
```

- Install Python dependencies for the LlamaIndex agent:
```bash
# Using pnpm
pnpm install:agent

# Using npm
npm run install:agent

# Using yarn
yarn install:agent

# Using bun
bun run install:agent
```

- Set up your OpenAI API key:
```bash
export OPENAI_API_KEY="your-openai-api-key-here"
```

- Start the development server:
```bash
# Using pnpm
pnpm dev

# Using npm
npm run dev

# Using yarn
yarn dev

# Using bun
bun run dev
```

This will start both the UI and agent servers concurrently.
The following scripts can also be run using your preferred package manager:
- `dev` - Starts both UI and agent servers in development mode
- `dev:debug` - Starts the development servers with debug logging enabled
- `dev:ui` - Starts only the Next.js UI server
- `dev:agent` - Starts only the LlamaIndex agent server
- `install:agent` - Installs Python dependencies for the agent
- `build` - Builds the Next.js application for production
- `start` - Starts the production server
- `lint` - Runs ESLint for code linting
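In `package.json`, these scripts typically look something like the following. The script bodies here are a hypothetical sketch (the real commands in this repo may differ, e.g. in how the Python server module is invoked):

```json
{
  "scripts": {
    "dev": "concurrently \"npm run dev:ui\" \"npm run dev:agent\"",
    "dev:ui": "next dev",
    "dev:agent": "cd agent && uv run python agent/server.py",
    "install:agent": "cd agent && uv sync",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  }
}
```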
The main UI component is in `src/app/page.tsx`. You can:
- Modify the theme colors and styling
- Customize the CopilotKit sidebar appearance
- Inspect the HITL flow: the `write_essay` action uses `renderAndWaitForResponse` and saves the accepted draft to shared state (displayed on the page).
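Logically, the state update behind that flow is small: commit the draft only when the reviewer responds with `SEND`. A minimal model, with hypothetical names (`EssayState`, `applyResponse` are illustrative, not from this repo):

```typescript
// Hypothetical model of the shared-state update performed on review.
interface EssayState {
  essay: string | null;
}

function applyResponse(
  state: EssayState,
  draft: string,
  response: "SEND" | "CANCEL",
): EssayState {
  // Only an accepted draft ("SEND") is committed; "CANCEL" leaves state alone.
  return response === "SEND" ? { essay: draft } : state;
}

console.log(applyResponse({ essay: null }, "Draft text", "SEND").essay);
console.log(applyResponse({ essay: null }, "Draft text", "CANCEL").essay);
```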
Agent pieces:
- `agent/agent/agent.py` defines the agent and exposes the frontend tool.
- `agent/agent/server.py` runs the FastAPI server.
- `src/app/api/copilotkit/route.ts` connects the UI to the agent via AG-UI.
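The route handler usually follows CopilotKit's Next.js App Router pattern. This is a rough sketch only, not this repo's actual `route.ts` — the agent name and URL path are placeholders, and the helper names assume CopilotKit's AG-UI integration:

```tsx
// Sketch of an AG-UI-connected CopilotKit route (names are assumptions).
import {
  CopilotRuntime,
  ExperimentalEmptyAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import { HttpAgent } from "@ag-ui/client";

const runtime = new CopilotRuntime({
  agents: {
    // Port 9000 matches the dev setup described in this README.
    my_agent: new HttpAgent({ url: "http://localhost:9000" }),
  },
});

export const POST = async (req: Request) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime,
    serviceAdapter: new ExperimentalEmptyAdapter(),
    endpoint: "/api/copilotkit",
  });
  return handleRequest(req);
};
```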
- LlamaIndex Documentation - Learn more about LlamaIndex and its features
- CopilotKit Documentation - Explore CopilotKit's capabilities
- Next.js Documentation - Learn about Next.js features and API
Feel free to submit issues and enhancement requests! This starter is designed to be easily extensible.
This project is licensed under the MIT License - see the LICENSE file for details.
If you see "I'm having trouble connecting to my tools", make sure:
- The LlamaIndex agent is running on port 9000 (the default when using `pnpm dev` / `npm run dev`). If you run `server.py` directly, it uses port 8000; update `src/app/api/copilotkit/route.ts` accordingly.
- Your OpenAI API key is set correctly
- Both servers started successfully
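To check which side is down, you can look for listeners on the expected ports (9000 for the agent as noted above; 3000 is assumed here as the usual Next.js dev port):

```sh
# Is anything listening on the agent and UI ports?
lsof -i :9000
lsof -i :3000
```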
If you encounter Python import errors:

```bash
cd agent
uv sync
```