Transport
The useChat transport system provides fine-grained control over how messages are sent to your API endpoints and how responses are processed. This is particularly useful for alternative communication protocols like WebSockets, custom authentication patterns, or specialized backend integrations.
Default Transport
By default, useChat uses HTTP POST requests to send messages to /api/chat:
import { useChat } from '@ai-sdk/react';
// Uses default HTTP transport
const { messages, sendMessage } = useChat();
This is equivalent to:
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';
const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
  }),
});
Custom Transport Configuration
Configure the default transport with custom options:
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';
const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/custom-chat',
    headers: {
      Authorization: 'Bearer your-token',
      'X-API-Version': '2024-01',
    },
    credentials: 'include',
  }),
});
Dynamic Configuration
You can also provide functions that return configuration values. This is useful for authentication tokens that need to be refreshed, or for configuration that depends on runtime conditions:
const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
    headers: () => ({
      Authorization: `Bearer ${getAuthToken()}`,
      'X-User-ID': getCurrentUserId(),
    }),
    body: () => ({
      sessionId: getCurrentSessionId(),
      preferences: getUserPreferences(),
    }),
    credentials: () => 'include',
  }),
});
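For example, the getAuthToken function in the snippet above could be backed by a small cache so the token is only refreshed when it is about to expire. The sketch below is hypothetical (the helper names and the expiry window are assumptions, not part of the SDK); it is written as async, so if your headers function must stay synchronous, resolve the token ahead of time instead.

```typescript
// Hypothetical token helper; names and the 5-second refresh window are
// assumptions for this sketch, not part of the SDK.
type CachedToken = { value: string; expiresAt: number };

let cachedToken: CachedToken | null = null;

// Stand-in for a real call to your auth endpoint.
async function fetchNewToken(): Promise<CachedToken> {
  return { value: `token-${Date.now()}`, expiresAt: Date.now() + 60_000 };
}

// Returns a cached token, refreshing it shortly before expiry.
async function getAuthToken(): Promise<string> {
  if (!cachedToken || cachedToken.expiresAt - Date.now() < 5_000) {
    cachedToken = await fetchNewToken();
  }
  return cachedToken.value;
}
```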
You can also transform requests before they are sent to your API using the prepareSendMessagesRequest option:
const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
    prepareSendMessagesRequest: ({ id, messages, trigger, messageId }) => {
      return {
        headers: {
          'X-Session-ID': id,
        },
        body: {
          messages: messages.slice(-10), // Only send last 10 messages
          trigger,
          messageId,
        },
      };
    },
  }),
});
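Note that a plain slice like the one above can drop a leading system message once the history grows. A small helper can preserve it; this is a hypothetical sketch with a simplified message shape (the SDK's actual message type is richer):

```typescript
// Simplified message shape for this sketch (the SDK's message type is richer).
type SimpleMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Keep at most `max` messages, but never drop a leading system message.
function trimHistory(messages: SimpleMessage[], max: number): SimpleMessage[] {
  const [first, ...rest] = messages;
  if (first?.role === 'system') {
    return [first, ...rest.slice(-(max - 1))];
  }
  return messages.slice(-max);
}
```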
Direct Agent Transport
For scenarios where you want to communicate directly with an Agent without going through HTTP, you can use DirectChatTransport. This transport invokes the agent’s stream() method directly in-process.
This is useful for:
- Server-side rendering: Run the agent on the server without an API endpoint
- Testing: Test chat functionality without network requests
- Single-process applications: Desktop or CLI apps where client and agent run together
import { useChat } from '@ai-sdk/react';
import { DirectChatTransport, ToolLoopAgent } from 'ai';
import { openai } from '@ai-sdk/openai';
const agent = new ToolLoopAgent({
  model: openai('gpt-4-turbo'),
  instructions: 'You are a helpful assistant.',
  tools: {
    weather: weatherTool,
  },
});

const { messages, sendMessage } = useChat({
  transport: new DirectChatTransport({ agent }),
});
How It Works
Unlike DefaultChatTransport, which sends HTTP requests, DirectChatTransport:
- Validates incoming UI messages
- Converts them to model messages using convertToModelMessages
- Calls the agent’s stream() method directly
- Returns the result as a UI message stream via toUIMessageStream()
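The steps above can be sketched as a minimal in-process pipeline. All types and names below are simplified stand-ins for illustration, not the library's actual internals:

```typescript
// Simplified stand-ins for the SDK's message and agent types (assumptions).
type UIMsg = { role: 'user' | 'assistant'; text: string };
type ModelMsg = { role: string; content: string };
type StreamChunk = { type: 'text-delta'; delta: string };

interface MiniAgent {
  stream(messages: ModelMsg[]): AsyncIterable<string>;
}

// Step 2: convert UI messages to model messages.
function toModelMsgs(ui: UIMsg[]): ModelMsg[] {
  return ui.map((m) => ({ role: m.role, content: m.text }));
}

// Steps 1-4: validate, convert, call the agent, re-expose as a chunk stream.
async function* sendMessagesDirect(
  agent: MiniAgent,
  ui: UIMsg[],
): AsyncGenerator<StreamChunk> {
  if (ui.length === 0) throw new Error('no messages to send'); // validate
  const modelMessages = toModelMsgs(ui); // convert
  for await (const text of agent.stream(modelMessages)) { // call agent directly
    yield { type: 'text-delta', delta: text }; // UI message stream chunk
  }
}
```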
Configuration Options
You can pass additional options to customize the stream output:
const transport = new DirectChatTransport({
  agent,
  // Pass options to the agent
  options: { customOption: 'value' },
  // Configure what's sent to the client
  sendReasoning: true,
  sendSources: true,
});
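Conceptually, flags like sendReasoning and sendSources gate which kinds of stream parts reach the client. A rough illustration of that idea, not the library's actual implementation:

```typescript
// Simplified stream-part shape, assumed for this illustration only.
type StreamPart = { type: 'text-delta' | 'reasoning' | 'source'; value: string };

// Text always passes through; reasoning and source parts only pass when enabled.
function filterParts(
  parts: StreamPart[],
  opts: { sendReasoning: boolean; sendSources: boolean },
): StreamPart[] {
  return parts.filter(
    (p) =>
      p.type === 'text-delta' ||
      (p.type === 'reasoning' && opts.sendReasoning) ||
      (p.type === 'source' && opts.sendSources),
  );
}
```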
DirectChatTransport does not support stream reconnection since there is no
persistent server-side stream. The reconnectToStream() method always returns
null.
For complete API details, see the DirectChatTransport reference.
Building Custom Transports
To understand how to build your own transport, refer to the source code of the default implementation, DefaultChatTransport. It shows you exactly how to:
- Handle the sendMessages method
- Process UI message streams
- Transform requests and responses
- Handle errors and connection management
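As a starting point, a chat transport exposes two methods: sendMessages, which returns a stream of UI message chunks, and reconnectToStream. The skeleton below uses simplified local types rather than the SDK's actual ChatTransport interface, which has richer option and chunk types:

```typescript
// Simplified stand-ins for the SDK's transport types (assumptions).
type Chunk = { type: 'text-delta'; delta: string };
type SendOptions = { chatId: string; messages: unknown[] };

interface MiniChatTransport {
  sendMessages(options: SendOptions): Promise<ReadableStream<Chunk>>;
  reconnectToStream(options: { chatId: string }): Promise<ReadableStream<Chunk> | null>;
}

// Example: a transport that answers locally instead of calling a server.
class EchoTransport implements MiniChatTransport {
  async sendMessages(options: SendOptions): Promise<ReadableStream<Chunk>> {
    return new ReadableStream<Chunk>({
      start(controller) {
        controller.enqueue({ type: 'text-delta', delta: `chat ${options.chatId}: ok` });
        controller.close();
      },
    });
  }

  // No persistent server-side stream to resume in this sketch.
  async reconnectToStream(_options: { chatId: string }): Promise<ReadableStream<Chunk> | null> {
    return null;
  }
}
```

Swapping the body of sendMessages for a WebSocket or other protocol is where a real custom transport would differ.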
The transport system gives you complete control over how your chat application communicates, enabling integration with any backend protocol or service.