Full-fledged Agent CSR (Customer Service Representative) example using @hazeljs packages: Agent, AI, RAG, Memory, Queue, WebSocket.
- AI Agent - Stateful CSR agent with tools (order lookup, inventory, refunds, tickets, knowledge search)
- RAG - Retrieval-augmented generation for FAQ and documentation
- Memory - Conversation memory with BufferMemory (dev) / HybridMemory (prod)
- Approval Workflow - Human-in-the-loop for refunds and address updates
- REST API - POST /api/csr/chat, /api/csr/chat/stream, /api/csr/ingest, /api/csr/approve
- WebSocket - Real-time chat at ws://localhost:3001/csr
- Queue - Optional async ticket creation (Redis/BullMQ)
- Production - Rate limiting, circuit breaker, retry, health checks
```bash
# Install dependencies
npm install

# Set OpenAI API key (required)
export OPENAI_API_KEY=your-key

# Run
npm run dev
```

This example is designed to be read alongside the docs guide and the code in `src/csr/`.
Important: the runnable CSR server in `src/csr/` now executes chat through an HCEL-first path (`ai.hazel ... .agent('csr-agent')`) and keeps `AgentRuntime` as a compatibility fallback. This keeps tool execution and approval workflows stable while simplifying orchestration.
At a high level, HCEL lets you compose “support pipelines” like:
```ts
const result = await ai.hazel
  .context({ sessionId: 'customer-1', userId: 'u-1' })
  .prompt('Customer request: My package arrived damaged, what do I do?')
  .agent('csr-agent')
  .observe((e) => console.log('[HCEL]', e.type))
  .execute();
```

Docs guide: see `hazeljs-landing/src/content/docs/guides/support-agent.mdx`.
| Method | Path | Description |
|---|---|---|
| POST | /api/csr/chat | Send message to agent (sync) |
| POST | /api/csr/chat/stream | SSE stream response |
| POST | /api/csr/ingest | Ingest document into knowledge base |
| POST | /api/csr/approve | Approve/reject tool execution |
| GET | /api/csr/health | Agent health check |
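The sync chat endpoint can also be called from TypeScript with the built-in `fetch` (Node 18+). A minimal client sketch, assuming the dev server from `npm run dev` is listening on port 3000 and using the same request fields as the curl example further down (the `CSR_BASE_URL` guard is just a convention for this sketch, so it only hits the network when a live server URL is provided):

```typescript
// Request body matching the /api/csr/chat curl example.
const payload = {
  message: 'What is the status of order ORD-12345?',
  sessionId: 'customer-1',
};

async function sendChat(baseUrl: string): Promise<unknown> {
  const res = await fetch(`${baseUrl}/api/csr/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`chat request failed: ${res.status}`);
  // Assumes the server replies with a JSON body.
  return res.json();
}

// Only hit the network when a live server URL is provided.
const baseUrl = process.env.CSR_BASE_URL;
if (baseUrl) {
  sendChat(baseUrl).then((reply) => console.log(reply));
} else {
  console.log(JSON.stringify(payload));
}
```

The same shape works for `/api/csr/ingest` and `/api/csr/approve`; only the path and body fields change.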
Connect to ws://localhost:3001/csr and send:
```json
{ "event": "message", "data": { "text": "What is my order status for ORD-12345?", "sessionId": "user-123" } }
```

See `.env.example` for the full list. Key variables:

- `OPENAI_API_KEY` - Required for AI
- `REDIS_HOST`, `REDIS_PORT` - Optional, for Queue (async tickets)
- `PINECONE_API_KEY` - Optional, for production RAG (uses Pinecone when set)
- `QDRANT_URL` - Optional, for production RAG (uses Qdrant when set, if no Pinecone)
- `PORT` - HTTP server port (default 3000)
- `WS_PORT` - WebSocket server port (default 3001)
- `NODE_ENV=production` - Enables production features (rate limiting, circuit breaker)
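Put together, a minimal local `.env` might look like the sketch below. All values are placeholders; only `OPENAI_API_KEY` is required, and the optional lines can be dropped entirely:

```shell
# Required
OPENAI_API_KEY=sk-your-key

# Optional: async ticket queue
REDIS_HOST=localhost
REDIS_PORT=6379

# Optional: production RAG backends (Pinecone takes precedence if both are set)
PINECONE_API_KEY=your-pinecone-key
QDRANT_URL=http://localhost:6333

# Server ports
PORT=3000
WS_PORT=3001

# Enables rate limiting and circuit breaker
NODE_ENV=production
```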
For production, consider:
- Redis for agent state: add the `redis` package and use `RedisStateManager` with `AgentRuntime`
- Vector DB for memory: use `HybridMemory` with Pinecone/Qdrant instead of `BufferMemory`
- Redis for Queue: set `REDIS_HOST` for async ticket processing
```bash
curl -X POST http://localhost:3000/api/csr/ingest \
  -H "Content-Type: application/json" \
  -d '{"title": "Refund Policy", "content": "Full refunds within 30 days..."}'
```

```bash
curl -X POST http://localhost:3000/api/csr/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What is the status of order ORD-12345?", "sessionId": "customer-1"}'
```

Import the collection and environment for easy testing:
- Import → Upload `hazeljs-csr-agent.postman_collection.json`
- Import → Upload `hazeljs-csr-agent.postman_environment.json`
- Select the "HazelJS CSR - Local" environment
- Run requests (Health Check first, then Chat, Ingest, etc.)
Recommended flow: Health Check → Ingest documents → Chat (order/inventory) → Chat (RAG query)
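The recommended flow above can be scripted with the same curl calls. A sketch, assuming the dev server on `localhost:3000`; it only sends requests when `CSR_BASE_URL` is set (a convention for this script, not a variable the server reads) and otherwise just prints the plan:

```shell
#!/usr/bin/env sh
# Smoke-test flow: Health Check -> Ingest -> Chat.
BASE_URL="${CSR_BASE_URL:-}"

post() {
  # Print the request; only execute it when a live server URL was provided.
  echo "POST $1"
  if [ -n "$BASE_URL" ]; then
    curl -s -X POST "$BASE_URL$1" -H "Content-Type: application/json" -d "$2"
    echo
  fi
}

echo "GET /api/csr/health"
if [ -n "$BASE_URL" ]; then
  curl -s "$BASE_URL/api/csr/health"
  echo
fi

post "/api/csr/ingest" '{"title": "Refund Policy", "content": "Full refunds within 30 days..."}'
post "/api/csr/chat"   '{"message": "What is the status of order ORD-12345?", "sessionId": "customer-1"}'
```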
