
Build streaming chat UI with Vercel AI SDK and MCP support

pattern

Building chat interfaces that connect to MCP servers requires significant boilerplate for streaming, tool calls, and UI

nextjs · mcp · streaming · vercel-ai-sdk · chat

Problem

Building a chat interface that streams LLM responses, handles tool calls, and renders rich UI requires wiring together server-side streaming, client-side state management, message history, and MCP server integration. Without a framework, developers write hundreds of lines of boilerplate for the streaming transport (SSE or WebSockets), partial-response rendering, and tool call lifecycle management.

Solution

Use the Vercel AI SDK with Next.js for streaming chat with built-in MCP support.

Server-side route with MCP tool execution:

// app/api/chat/route.ts
import { streamText, experimental_createMCPClient } from "ai";
import { anthropic } from "@ai-sdk/anthropic";

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Connect to the MCP server over SSE and expose its tools to the model.
  const mcpClient = await experimental_createMCPClient({
    transport: { type: "sse", url: "https://your-mcp-server.example.com/sse" },
  });

  const tools = await mcpClient.tools();

  const result = streamText({
    model: anthropic("claude-sonnet-4-5-20250929"),
    messages,
    tools,
    maxSteps: 5, // let the model call tools and keep reasoning
    onFinish: async () => {
      await mcpClient.close(); // release the MCP connection when the stream ends
    },
  });

  return result.toDataStreamResponse();
}
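Under the hood, `toDataStreamResponse()` emits the SDK's data stream protocol: newline-delimited frames where a type code precedes a JSON payload (for example `0:` for text deltas and `d:` for the finish message). A minimal parser sketch to illustrate the framing — the names `parseFrame` and `textFromStream` are hypothetical, not SDK exports:

```typescript
// Sketch of the AI SDK data stream framing: each line is `<code>:<json>`.
// These helpers are illustrative only; the SDK parses frames for you.
type Frame = { code: string; payload: unknown };

function parseFrame(line: string): Frame {
  const sep = line.indexOf(":");
  if (sep === -1) throw new Error(`malformed frame: ${line}`);
  return {
    code: line.slice(0, sep), // "0" = text delta, "d" = finish, etc.
    payload: JSON.parse(line.slice(sep + 1)),
  };
}

function textFromStream(body: string): string {
  // Concatenate only the text-delta ("0") frames into the visible reply.
  return body
    .split("\n")
    .filter((l) => l.length > 0)
    .map(parseFrame)
    .filter((f) => f.code === "0")
    .map((f) => f.payload as string)
    .join("");
}
```

This is why `useChat` can render partial responses as they arrive: each `0:` frame is appended to the last assistant message as soon as it is parsed.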

Client-side chat hook:

// components/chat.tsx
"use client";
import { useChat } from "@ai-sdk/react";

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat({ api: "/api/chat" });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
          {m.toolInvocations?.map((tool) =>
            // Only render results for completed tool calls;
            // in-flight invocations have no `result` yet.
            tool.state === "result" ? (
              <pre key={tool.toolCallId}>
                {JSON.stringify(tool.result, null, 2)}
              </pre>
            ) : null,
          )}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit" disabled={isLoading}>Send</button>
      </form>
    </div>
  );
}
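The `toolInvocations` array on each assistant message carries the tool call lifecycle, and a `result` is only present once a call has completed. A hedged sketch of that shape — the `ChatMessage` interface below approximates the SDK's v4 types, and `completedToolResults` is a hypothetical helper:

```typescript
// Approximation of the useChat message shape (v4); not the SDK's own types.
interface ToolInvocation {
  toolCallId: string;
  toolName: string;
  state: "partial-call" | "call" | "result";
  result?: unknown;
}

interface ChatMessage {
  id: string;
  role: "user" | "assistant";
  content: string;
  toolInvocations?: ToolInvocation[];
}

// Hypothetical helper: collect results from completed tool calls only,
// so the UI never renders an undefined result mid-call.
function completedToolResults(messages: ChatMessage[]): unknown[] {
  return messages.flatMap((m) =>
    (m.toolInvocations ?? [])
      .filter((t) => t.state === "result")
      .map((t) => t.result),
  );
}
```

The same state check is what gates the `<pre>` block in the component above.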

Why It Works

The Vercel AI SDK abstracts the streaming protocol behind a single useChat hook. On the server, streamText handles SSE streaming, tool call execution, and multi-step reasoning. The MCP client connects to any MCP-compliant server and converts its tools into the SDK's tool-calling format. The maxSteps parameter lets the model call tools and continue reasoning without additional client logic. Streaming, tool calls, and MCP integration come with minimal code.
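The multi-step behavior that maxSteps enables can be sketched as a plain loop: the model either emits text (done) or requests a tool, whose result is fed back in for the next step. All names below are illustrative, not SDK internals, and the real implementation streams each step:

```typescript
// Minimal sketch of the maxSteps agent loop (mock model and tool table).
type ModelStep =
  | { type: "text"; text: string }
  | { type: "tool-call"; name: string; args: unknown };

type Model = (history: unknown[]) => ModelStep;
type Tools = Record<string, (args: unknown) => unknown>;

function runSteps(model: Model, tools: Tools, maxSteps: number): string {
  const history: unknown[] = [];
  for (let step = 0; step < maxSteps; step++) {
    const out = model(history);
    if (out.type === "text") return out.text; // model finished reasoning
    // Execute the requested tool and append its result for the next step.
    history.push({ toolResult: tools[out.name](out.args) });
  }
  return ""; // step budget exhausted without a final answer
}
```

The step cap matters: without it, a model that keeps requesting tools could loop indefinitely, which is why streamText defaults to a single step unless you raise maxSteps.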

Context

  • The Vercel AI SDK Chat template was announced in April 2025 with auth, persistence, and artifact rendering
  • MCP support is experimental but functional for SSE-based MCP servers
  • The SDK supports multiple providers (Anthropic, OpenAI, Google) with a unified interface
  • The useChat hook can be dropped into any React app without the full template
  • The same pattern works for building v0/Lovable-style code generation UIs with canvas views
About this share
Contributor: mblode · Repository: mblode/shares · Created: Feb 10, 2026