Agent Chat - Integration Guide
Embedding Chat in Applications, Services, and Workflows
Overview
Agent Chat provides multiple integration patterns for embedding conversational AI into your applications. This guide covers REST API integration, WebSocket streaming, GraphQL queries, MCP protocol usage, and LibreChat migration.
Integration Methods
1. REST API Integration
Best For: Simple request/response patterns, server-side integrations, batch operations
Basic Chat Integration
import axios from 'axios';
const AGENT_CHAT_URL = 'http://localhost:3080';
const API_KEY = process.env.AGENT_CHAT_API_KEY;
async function sendChatMessage(message: string, model: string = 'claude-3-5-sonnet-20241022') {
const response = await axios.post(
`${AGENT_CHAT_URL}/api/chat/completions`,
{
model,
messages: [
{ role: 'user', content: message }
],
temperature: 0.7,
max_tokens: 1000
},
{
headers: {
'Content-Type': 'application/json',
'X-API-Key': API_KEY
}
}
);
return response.data.choices[0].message.content;
}
// Usage
const answer = await sendChatMessage('What is the capital of France?');
console.log(answer); // "The capital of France is Paris."
Multi-Turn Conversation
interface Message {
role: 'system' | 'user' | 'assistant';
content: string;
}
class ConversationClient {
private messages: Message[] = [];
private conversationId?: string;
constructor(
private baseUrl: string,
private apiKey: string,
private model: string = 'gpt-4-turbo'
) {}
async sendMessage(content: string): Promise<string> {
this.messages.push({ role: 'user', content });
const response = await axios.post(
`${this.baseUrl}/api/chat/completions`,
{
model: this.model,
messages: this.messages,
conversation_id: this.conversationId
},
{
headers: { 'X-API-Key': this.apiKey }
}
);
const assistantMessage = response.data.choices[0].message.content;
this.messages.push({ role: 'assistant', content: assistantMessage });
this.conversationId = response.data.conversation_id;
return assistantMessage;
}
setSystemPrompt(prompt: string) {
this.messages.unshift({ role: 'system', content: prompt });
}
}
// Usage
const chat = new ConversationClient(AGENT_CHAT_URL, API_KEY);
chat.setSystemPrompt('You are a helpful Python programming assistant.');
const answer1 = await chat.sendMessage('How do I read a CSV file?');
const answer2 = await chat.sendMessage('Can you show me an example?');
2. WebSocket Streaming Integration
Best For: Real-time UIs, streaming responses, live collaboration
React Chat Component
import { useEffect, useRef, useState } from 'react';
import io, { Socket } from 'socket.io-client';
interface ChatMessage {
role: 'user' | 'assistant';
content: string;
timestamp: Date;
}
export function ChatWidget() {
const [socket, setSocket] = useState<Socket | null>(null);
const [messages, setMessages] = useState<ChatMessage[]>([]);
const [input, setInput] = useState('');
const [streaming, setStreaming] = useState(false);
const [currentResponse, setCurrentResponse] = useState('');
// Accumulate streamed tokens in a ref so the socket handler registered in the
// effect below always reads the latest text instead of a stale closure value.
const responseRef = useRef('');
useEffect(() => {
const newSocket = io('http://localhost:3080', {
auth: { token: localStorage.getItem('jwt_token') }
});
newSocket.on('connect', () => {
console.log('Connected to Agent Chat');
});
newSocket.on('chat_response', (data) => {
if (data.finished) {
setMessages(prev => [...prev, {
role: 'assistant',
content: responseRef.current,
timestamp: new Date()
}]);
responseRef.current = '';
setCurrentResponse('');
setStreaming(false);
} else {
responseRef.current += data.token;
setCurrentResponse(responseRef.current);
}
});
setSocket(newSocket);
return () => {
newSocket.close();
};
}, []);
const sendMessage = () => {
if (!socket || !input.trim()) return;
const userMessage: ChatMessage = {
role: 'user',
content: input,
timestamp: new Date()
};
setMessages(prev => [...prev, userMessage]);
setStreaming(true);
socket.emit('chat_message', {
message: input,
model: 'claude-3-5-sonnet-20241022',
stream: true
});
setInput('');
};
return (
<div className="chat-widget">
<div className="messages">
{messages.map((msg, idx) => (
<div key={idx} className={`message ${msg.role}`}>
{msg.content}
</div>
))}
{streaming && (
<div className="message assistant streaming">
{currentResponse}
<span className="cursor">▊</span>
</div>
)}
</div>
<div className="input-area">
<input
value={input}
onChange={(e) => setInput(e.target.value)}
onKeyDown={(e) => e.key === 'Enter' && sendMessage()}
placeholder="Type a message..."
disabled={streaming}
/>
<button onClick={sendMessage} disabled={streaming}>
Send
</button>
</div>
</div>
);
}
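Node.js Streaming Client
The same WebSocket contract can be consumed outside the browser, for example from a CLI tool or backend worker. The following is a minimal sketch that reuses the chat_message / chat_response events and payload fields from the component above; the AGENT_CHAT_JWT environment variable is an assumed placeholder for whatever token your auth flow issues.
import { io } from 'socket.io-client';
const socket = io('http://localhost:3080', {
  auth: { token: process.env.AGENT_CHAT_JWT } // assumed env var; supply a token from your auth flow
});
socket.on('connect', () => {
  socket.emit('chat_message', {
    message: 'Summarize the latest release notes',
    model: 'claude-3-5-sonnet-20241022',
    stream: true
  });
});
socket.on('chat_response', (data: { token?: string; finished: boolean }) => {
  if (data.finished) {
    process.stdout.write('\n');
    socket.disconnect(); // streaming complete
  } else {
    process.stdout.write(data.token ?? '');
  }
});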
3. GraphQL Integration
Best For: Complex queries, real-time subscriptions, type-safe APIs
Apollo Client Setup
import { ApolloClient, InMemoryCache, HttpLink, split } from '@apollo/client';
import { GraphQLWsLink } from '@apollo/client/link/subscriptions';
import { getMainDefinition } from '@apollo/client/utilities';
import { createClient } from 'graphql-ws';
const httpLink = new HttpLink({
uri: 'http://localhost:3080/graphql',
headers: {
authorization: `Bearer ${localStorage.getItem('jwt_token')}`
}
});
const wsLink = new GraphQLWsLink(
createClient({
url: 'ws://localhost:3080/graphql',
connectionParams: {
authorization: `Bearer ${localStorage.getItem('jwt_token')}`
}
})
);
const splitLink = split(
({ query }) => {
const definition = getMainDefinition(query);
return (
definition.kind === 'OperationDefinition' &&
definition.operation === 'subscription'
);
},
wsLink,
httpLink
);
export const apolloClient = new ApolloClient({
link: splitLink,
cache: new InMemoryCache()
});
Query Conversations
import { gql, useQuery } from '@apollo/client';
const GET_CONVERSATIONS = gql`
query GetConversations($limit: Int, $offset: Int) {
conversations(limit: $limit, offset: $offset) {
id
title
createdAt
messageCount
}
}
`;
export function ConversationList() {
const { loading, error, data } = useQuery(GET_CONVERSATIONS, {
variables: { limit: 20, offset: 0 }
});
if (loading) return <p>Loading...</p>;
if (error) return <p>Error: {error.message}</p>;
return (
<ul>
{data.conversations.map((conv: any) => (
<li key={conv.id}>
<h3>{conv.title}</h3>
<span>{conv.messageCount} messages</span>
</li>
))}
</ul>
);
}
Subscription to Message Stream
import { useState } from 'react';
import { gql, useSubscription } from '@apollo/client';
const MESSAGE_STREAM = gql`
subscription MessageStream($conversationId: ID!) {
messageStream(conversationId: $conversationId) {
id
token
finished
model
}
}
`;
export function StreamingMessage({ conversationId }: { conversationId: string }) {
const [content, setContent] = useState('');
const { data } = useSubscription(MESSAGE_STREAM, {
variables: { conversationId },
onData: ({ data }) => {
if (data.data.messageStream.finished) {
// Message complete
setContent('');
} else {
setContent(prev => prev + data.data.messageStream.token);
}
}
});
return <div className="streaming-message">{content}</div>;
}
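Triggering a Stream
The subscription above only delivers tokens; something still has to start the streamed reply. A sketch of the client side is shown below, assuming a hypothetical sendMessage mutation; substitute whatever field your deployed GraphQL schema actually exposes.
import { gql, useMutation } from '@apollo/client';
// Hypothetical mutation for illustration; replace with your schema's real field
const SEND_MESSAGE = gql`
  mutation SendMessage($conversationId: ID!, $content: String!) {
    sendMessage(conversationId: $conversationId, content: $content) {
      id
    }
  }
`;
export function useSendMessage(conversationId: string) {
  const [sendMessage, { loading }] = useMutation(SEND_MESSAGE);
  const send = (content: string) => sendMessage({ variables: { conversationId, content } });
  return { send, loading };
}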
4. MCP Integration (Claude Desktop)
Best For: Claude Desktop tool integration, AI-native workflows
MCP Configuration
Add the following to Claude Desktop's configuration file, located at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS:
{
"mcpServers": {
"agent-chat": {
"command": "node",
"args": [
"/path/to/agent-chat/dist/mcp-server.js"
],
"env": {
"AGENT_CHAT_URL": "http://localhost:3080",
"AGENT_CHAT_API_KEY": "your-api-key"
}
}
}
}
Available MCP Tools
create_chat_session
{
"name": "create_chat_session",
"arguments": {
"title": "Python Help",
"model": "gpt-4-turbo",
"system_message": "You are a Python expert"
}
}
send_message
{
"name": "send_message",
"arguments": {
"session_id": "conv_123",
"message": "How do I parse JSON in Python?",
"stream": true
}
}
get_chat_history
{
"name": "get_chat_history",
"arguments": {
"session_id": "conv_123",
"limit": 50
}
}
search_conversations
{
"name": "search_conversations",
"arguments": {
"query": "python json parsing",
"limit": 5
}
}
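Tool Call Wire Format
For reference, an MCP client such as Claude Desktop wraps each of these tool invocations in a JSON-RPC 2.0 tools/call request sent over the server's stdio transport. The sketch below only illustrates the envelope; in practice the MCP SDK builds and frames this message for you.
// Illustrative only: the JSON-RPC 2.0 envelope behind a send_message tool call
const toolCallRequest = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'send_message',
    arguments: {
      session_id: 'conv_123',
      message: 'How do I parse JSON in Python?',
      stream: true
    }
  }
};
// The stdio transport frames messages as newline-delimited JSON
process.stdout.write(JSON.stringify(toolCallRequest) + '\n');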
5. LibreChat Migration
Best For: Existing LibreChat deployments
API Compatibility
Agent Chat is API-compatible with LibreChat, so existing integration code works without modification:
// Existing LibreChat code works without modification
const response = await fetch('http://localhost:3080/api/chat/completions', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${token}`
},
body: JSON.stringify({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Hello' }]
})
});
Environment Variables
Replace LibreChat env vars with Agent Chat equivalents:
# LibreChat
LIBRECHAT_URL=http://localhost:3080
LIBRECHAT_API_KEY=your-key
# Agent Chat equivalents
AGENT_CHAT_PORT=3080
AGENT_CHAT_API_KEY=your-key
# Additional features
LLM_GATEWAY_URL=http://localhost:4000
VECTOR_HUB_URL=http://localhost:6333
AGENT_MESH_URL=http://localhost:3005
Enhanced Features
Enable additional capabilities beyond LibreChat:
# config/agent-chat.yaml
librechat:
  enhanced: true
  kagent: true        # Knowledge graph agents
  ossa: true          # OSSA 1.0 compliance
  agentStudio: true   # Multi-platform agents

vectorSearch:
  enabled: true
  topK: 5
  minScore: 0.7

agentOS:
  memory:
    layers: 5             # Enterprise memory
    vortexEnabled: true   # Token optimization
    learningEnabled: true # Continuous learning
Integration Patterns
Pattern 1: Chatbot Widget
Embed chat in any web application:
<!-- index.html -->
<div id="agent-chat-widget"></div>
<script src="https://cdn.example.com/agent-chat-widget.js"></script>
<script>
AgentChat.init({
container: '#agent-chat-widget',
apiUrl: 'http://localhost:3080',
apiKey: 'your-api-key',
model: 'claude-3-5-sonnet-20241022',
theme: 'dark',
position: 'bottom-right'
});
</script>
Pattern 2: Slack Integration
Route Slack messages to Agent Chat:
import { App } from '@slack/bolt';
import axios from 'axios';
const app = new App({
token: process.env.SLACK_BOT_TOKEN,
signingSecret: process.env.SLACK_SIGNING_SECRET
});
app.message(async ({ message, say }) => {
// Ignore message subtypes (edits, joins, bot posts) that carry no plain user text
if (message.subtype !== undefined || !message.text) return;
const response = await axios.post(
'http://localhost:3080/api/chat/completions',
{
model: 'gpt-4-turbo',
messages: [{ role: 'user', content: message.text }]
},
{
headers: { 'X-API-Key': process.env.AGENT_CHAT_API_KEY }
}
);
await say(response.data.choices[0].message.content);
});
app.start(3000);
Pattern 3: Drupal Integration
SSO and content-aware chat:
<?php
// Drupal module: agent_chat
use Drupal\Core\Routing\RouteMatchInterface;
/**
* Implements hook_page_attachments().
*/
function agent_chat_page_attachments(array &$attachments) {
$user = \Drupal::currentUser();
$attachments['#attached']['library'][] = 'agent_chat/widget';
$attachments['#attached']['drupalSettings']['agentChat'] = [
'apiUrl' => \Drupal::config('agent_chat.settings')->get('api_url'),
'apiKey' => \Drupal::config('agent_chat.settings')->get('api_key'),
'userId' => $user->id(),
'userName' => $user->getDisplayName(),
'ssoToken' => _agent_chat_generate_sso_token($user),
];
}
function _agent_chat_generate_sso_token($user) {
$jwt = \Firebase\JWT\JWT::encode(
[
'sub' => $user->id(),
'email' => $user->getEmail(),
'exp' => time() + 3600
],
\Drupal::config('agent_chat.settings')->get('jwt_secret'),
'HS256'
);
return $jwt;
}
Pattern 4: CI/CD Integration
Use Agent Chat in GitLab CI pipelines:
# .gitlab-ci.yml
code_review:
  stage: review
  script:
    - |
      # Build the JSON body with jq so source files containing quotes or newlines stay valid JSON
      PAYLOAD=$(jq -n --arg code "$(cat src/main.py)" '{
        model: "claude-3-5-sonnet-20241022",
        messages: [{role: "user", content: ("Review this code:\n" + $code)}]
      }')
      RESPONSE=$(curl -s -X POST http://localhost:3080/api/chat/completions \
        -H "X-API-Key: $AGENT_CHAT_API_KEY" \
        -H "Content-Type: application/json" \
        -d "$PAYLOAD")
      echo "$RESPONSE" | jq -r '.choices[0].message.content'
Pattern 5: Agent Swarm Orchestration
Coordinate multiple agents:
import axios from 'axios';
import { io } from 'socket.io-client';
async function orchestrateDataAnalysis(dataFile: string) {
const orchestration = await axios.post(
'http://localhost:3080/api/agents/orchestrate',
{
task: 'Analyze sales data and generate insights',
agents: [
'data-cleaner',
'statistical-analyzer',
'insight-generator',
'report-writer'
],
context: {
data_source: dataFile,
output_format: 'markdown'
},
options: {
parallel: false, // Sequential execution
timeout: 600000 // 10 minutes
}
},
{
headers: { 'X-API-Key': process.env.AGENT_CHAT_API_KEY }
}
);
// Monitor progress via WebSocket
const socket = io('http://localhost:3080');
socket.emit('subscribe', {
type: 'agent_progress',
orchestration_id: orchestration.data.orchestration_id
});
socket.on('agent_progress', (progress) => {
console.log(`${progress.agent_name}: ${progress.status} (${progress.progress * 100}%)`);
});
}
SDK Libraries
TypeScript/JavaScript SDK
import { AgentChatClient } from '@bluefly/agent-chat-sdk';
const client = new AgentChatClient({
baseUrl: 'http://localhost:3080',
apiKey: process.env.AGENT_CHAT_API_KEY
});
// Simple chat
const response = await client.chat.send('What is 2+2?');
// Streaming
await client.chat.stream('Explain quantum computing', {
onToken: (token) => process.stdout.write(token),
onComplete: (message) => console.log('\nDone!')
});
// Vector search
const results = await client.search('python csv parsing');
// Agent orchestration
const orchestration = await client.agents.orchestrate({
task: 'Generate sales report',
agents: ['analyst', 'writer']
});
Python SDK
import os
from agent_chat import AgentChatClient
client = AgentChatClient(
base_url='http://localhost:3080',
api_key=os.getenv('AGENT_CHAT_API_KEY')
)
# Simple chat
response = client.chat.send('What is the capital of France?')
print(response.content)
# Streaming
for token in client.chat.stream('Explain machine learning'):
    print(token, end='', flush=True)
# Vector search
results = client.search('python tutorials', limit=5)
for result in results:
    print(f"{result.score:.2f}: {result.content}")
Authentication Strategies
JWT Token
import jwt from 'jsonwebtoken';
const token = jwt.sign(
{
sub: 'user_123',
email: 'user@example.com',
roles: ['user']
},
process.env.JWT_SECRET,
{ expiresIn: '1h' }
);
// Use in requests
axios.defaults.headers.common['Authorization'] = `Bearer ${token}`;
API Key
# Generate API key via CLI
agent-chat users create-api-key --user user@example.com
# Use in requests
curl -H "X-API-Key: ak_abc123..." http://localhost:3080/api/chat/completions
Drupal SSO
// Exchange Drupal session for Agent Chat token
const response = await axios.post(
'http://localhost:3080/api/auth/drupal-sso',
{
drupal_session: drupalSessionCookie
}
);
const agentChatToken = response.data.token;
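The returned token is then used as a standard bearer token on subsequent requests, for example with a preconfigured axios instance (sketch):
// Reuse the SSO-issued token for later Agent Chat calls
const api = axios.create({
  baseURL: 'http://localhost:3080',
  headers: { Authorization: `Bearer ${agentChatToken}` }
});
const reply = await api.post('/api/chat/completions', {
  model: 'gpt-4-turbo',
  messages: [{ role: 'user', content: 'Hello from Drupal SSO' }]
});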
Error Handling
Retry Logic
import axios, { AxiosError } from 'axios';
async function sendMessageWithRetry(
message: string,
maxRetries: number = 3
): Promise<string> {
for (let attempt = 1; attempt <= maxRetries; attempt++) {
try {
const response = await axios.post(
'http://localhost:3080/api/chat/completions',
{
model: 'gpt-4-turbo',
messages: [{ role: 'user', content: message }]
},
{
headers: { 'X-API-Key': process.env.AGENT_CHAT_API_KEY },
timeout: 30000
}
);
return response.data.choices[0].message.content;
} catch (error) {
const axiosError = error as AxiosError;
if (axiosError.response?.status === 429) {
// Rate limit - wait and retry
const retryAfter = parseInt(axiosError.response.headers['retry-after'] || '5');
await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
continue;
}
if (axiosError.response?.status === 503 && attempt < maxRetries) {
// Service unavailable - exponential backoff
await new Promise(resolve => setTimeout(resolve, Math.pow(2, attempt) * 1000));
continue;
}
throw error;
}
}
throw new Error('Max retries exceeded');
}
Performance Best Practices
- Use Streaming: For long responses, use WebSocket streaming instead of REST
- Cache Responses: Implement client-side caching for repeated queries (see the sketch after this list)
- Batch Requests: Use GraphQL batching for multiple operations
- Connection Pooling: Reuse HTTP connections and WebSocket instances
- Rate Limiting: Implement client-side rate limiting to avoid 429 errors
- Compression: Enable gzip compression for large payloads
- Pagination: Use pagination for conversation lists and search results
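As a concrete illustration of the caching point above, a minimal in-memory cache around the sendChatMessage helper from the REST section might look like the following sketch; tune the TTL and cache key to your workload.
// Simple client-side cache for repeated identical prompts (sketch)
const responseCache = new Map<string, { content: string; expires: number }>();
const CACHE_TTL_MS = 5 * 60 * 1000; // 5 minutes; adjust as needed
async function cachedChat(message: string): Promise<string> {
  const hit = responseCache.get(message);
  if (hit && hit.expires > Date.now()) return hit.content;
  const content = await sendChatMessage(message); // helper defined in the REST section
  responseCache.set(message, { content, expires: Date.now() + CACHE_TTL_MS });
  return content;
}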
Related Pages:
- API Reference - Complete endpoint documentation
- Architecture - System design and components
- Development - Local development setup

Last Updated: 2025-11-02