Chat Interface
Build streaming chat interfaces with the useN4iChat hook. Includes message history, streaming support, and smart message rendering.
Live Demo
Try the chat interface below. Type a message to generate UI.
Try asking: "Create a pricing card with 3 tiers"
useN4iChat Hook
The recommended hook for building chat interfaces with streaming support.
import { useN4iChat, N4iMessageRenderer, N4iErrorBoundary } from "n4i-genui/react";
function ChatPage() {
const {
messages, // Array of chat messages
isLoading, // Request in progress
isStreaming, // Currently streaming
error, // Error message if any
sendMessage, // Send message (with optional PDF context)
clearMessages, // Clear chat history
resetChat, // Full reset (abort + clear)
setMessages, // Set messages programmatically
abort, // Abort current request
} = useN4iChat({
apiEndpoint: "/api/chat",
brandSlug: "my-brand", // For multi-tenant setups
n4iMode: true, // Enable rich UI generation
headers: { "X-Custom": "..." },
onMessageSent: (msg) => console.log("Sent:", msg),
onMessageReceived: (msg) => console.log("Received:", msg),
onError: (err) => console.error(err),
onStreamChunk: (chunk) => console.log("Chunk:", chunk),
});
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
const form = e.target as HTMLFormElement;
const input = form.elements.namedItem("message") as HTMLInputElement;
sendMessage(input.value);
input.value = "";
};
return (
<N4iErrorBoundary>
<div className="flex flex-col h-screen">
{/* Messages */}
<div className="flex-1 overflow-auto p-4 space-y-4">
{messages.map((msg, i) => (
<div
key={i}
className={`p-4 rounded-xl ${
msg.role === "user"
? "bg-blue-500/10 ml-12"
: "bg-slate-800 mr-12"
}`}
>
{msg.role === "user" ? (
<p>{msg.content}</p>
) : (
<N4iMessageRenderer
content={msg.content}
primaryColor="#6366f1"
loadingText="Generating UI"
/>
)}
</div>
))}
{isStreaming && (
<div className="p-4 bg-slate-800 mr-12 rounded-xl animate-pulse">
Generating...
</div>
)}
</div>
{/* Input */}
<form onSubmit={handleSubmit} className="p-4 border-t border-white/10">
<div className="flex gap-2">
<input
name="message"
placeholder="Ask something..."
disabled={isLoading}
className="flex-1 px-4 py-3 bg-white/5 border border-white/10 rounded-xl"
/>
<button
type="submit"
disabled={isLoading}
className="px-6 py-3 bg-violet-500 text-white rounded-xl font-medium"
>
{isStreaming ? "Generating..." : "Send"}
</button>
</div>
</form>
</div>
</N4iErrorBoundary>
);
}
N4iMessageRenderer Component
Smart component that auto-detects and renders N4i JSON or plain text. Handles the complexity of parsing and displaying AI responses.
import { N4iMessageRenderer } from "n4i-genui/react";
import ReactMarkdown from "react-markdown";
function MessageDisplay({ content }: { content: string }) {
return (
<N4iMessageRenderer
content={content} // Can be JSON or plain text
primaryColor="#6366f1" // Loading indicator color
loadingText="Generating UI" // Loading text
debug={false} // Enable debug logging
onAction={(actionId, nodeId) => {
console.log("Action:", actionId, nodeId);
}}
fallbackRenderer={(text) => (
// Custom renderer for non-JSON content
<div className="prose prose-invert">
<ReactMarkdown>{text}</ReactMarkdown>
</div>
)}
/>
);
}
// The component automatically:
// 1. Detects if content is valid N4i JSON
// 2. Renders rich UI if JSON is valid
// 3. Falls back to text display otherwise
// 4. Shows loading state while streaming incomplete JSON
Smart Detection
N4iMessageRenderer automatically detects JSON vs plain text and renders accordingly. No need to manually parse or check content format.
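The detection logic can be pictured as follows. This is an illustrative sketch, not the library's actual implementation: content that parses as JSON is treated as a UI spec, content that looks like JSON but fails to parse is treated as an in-flight stream, and everything else falls back to plain text.

```typescript
// Sketch of the kind of detection N4iMessageRenderer performs internally.
// The type and function names here are illustrative, not library exports.

type Detected =
  | { kind: "ui"; spec: unknown }   // complete N4i JSON → render rich UI
  | { kind: "partial" }             // JSON-like but incomplete → show loading
  | { kind: "text"; text: string }; // plain text / markdown → fallback renderer

export function detectContent(content: string): Detected {
  const trimmed = content.trim();
  // Plain prose never starts with a JSON delimiter.
  if (!trimmed.startsWith("{") && !trimmed.startsWith("[")) {
    return { kind: "text", text: content };
  }
  try {
    return { kind: "ui", spec: JSON.parse(trimmed) };
  } catch {
    // Starts like JSON but does not parse yet — probably still streaming.
    return { kind: "partial" };
  }
}
```

The "partial" branch is why the component can show a loading state mid-stream instead of flashing raw, half-finished JSON at the user.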
N4iErrorBoundary
Wrap your chat UI with N4iErrorBoundary to gracefully handle rendering errors.
import { N4iErrorBoundary } from "n4i-genui/react";
function ChatApp() {
return (
<N4iErrorBoundary
fallback={({ error, resetError }) => (
<div className="p-4 bg-red-500/10 rounded-xl text-center">
<p className="text-red-400 mb-2">Something went wrong</p>
<p className="text-sm text-white/60 mb-4">{error?.message}</p>
<button
onClick={resetError}
className="px-4 py-2 bg-red-500 text-white rounded-lg"
>
Try Again
</button>
</div>
)}
>
<ChatPage />
</N4iErrorBoundary>
);
}
// The error boundary also suppresses form action errors
// that can occur with generated form elements
useN4iChat Options
| Option | Type | Default | Description |
|---|---|---|---|
| apiEndpoint | string | "/api/chat" | API endpoint for chat requests |
| initialMessages | ChatMessage[] | [] | Initial messages to populate |
| brandSlug | string | - | Brand identifier for multi-tenant |
| n4iMode | boolean | true | Enable rich UI generation |
| headers | Record<string, string> | {} | Custom request headers |
| onMessageSent | function | - | Called when message sent |
| onMessageReceived | function | - | Called when response received |
| onError | function | - | Error callback |
| onStreamChunk | function | - | Called for each stream chunk |
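The exact ChatMessage type is exported by n4i-genui; the shape assumed below is a minimal sketch consistent with the examples on this page, shown to illustrate seeding a conversation via initialMessages (the greeting text and helper name are hypothetical).

```typescript
// Minimal message shape matching the examples on this page — check the
// library's exported ChatMessage type for the authoritative definition.
type ChatMessage = {
  role: "user" | "assistant";
  // Plain text for user turns; text or N4i JSON for assistant turns.
  content: string;
};

// Hypothetical helper: build an initialMessages array with a greeting,
// e.g. useN4iChat({ initialMessages: makeInitialMessages("Hi!") })
export function makeInitialMessages(greeting: string): ChatMessage[] {
  return [{ role: "assistant", content: greeting }];
}
```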
With PDF/Document Context
Send document context along with messages for RAG-powered responses.
import { useState } from "react";
import { useN4iChat, N4iMessageRenderer } from "n4i-genui/react";
function RAGChat() {
const { messages, sendMessage, isStreaming } = useN4iChat({
apiEndpoint: "/api/chat",
});
const [pdfContent, setPdfContent] = useState<string | null>(null);
const handleFileUpload = async (file: File) => {
// extractTextFromPDF: bring your own PDF text-extraction helper here
const text = await extractTextFromPDF(file);
setPdfContent(text);
};
const handleSubmit = (message: string) => {
// Pass PDF context with the message
sendMessage(message, pdfContent ? {
title: "uploaded-document.pdf",
content: pdfContent,
} : null);
};
return (
<div>
{/* File Upload */}
<input
type="file"
accept=".pdf"
onChange={(e) => {
const file = e.target.files?.[0];
if (file) handleFileUpload(file);
}}
/>
{pdfContent && (
<div className="text-sm text-green-400 mb-4">
✓ Document loaded ({pdfContent.length} chars)
</div>
)}
{/* Messages */}
{messages.map((msg, i) => (
<div key={i}>
{msg.role === "user" ? (
<p>{msg.content}</p>
) : (
<N4iMessageRenderer content={msg.content} />
)}
</div>
))}
{/* Input */}
<form onSubmit={(e) => {
e.preventDefault();
const input = e.currentTarget.elements.namedItem("q") as HTMLInputElement;
handleSubmit(input.value);
input.value = "";
}}>
<input name="q" placeholder="Ask about the document..." />
<button type="submit" disabled={isStreaming}>
{isStreaming ? "..." : "Ask"}
</button>
</form>
</div>
);
}
Complete Chat Example
A full-featured chat interface with all the bells and whistles.
"use client";
import { useRef, useEffect } from "react";
import { useN4iChat, N4iMessageRenderer, N4iErrorBoundary } from "n4i-genui/react";
import { Send, Loader2, Trash2, AlertCircle } from "lucide-react";
export default function FullChatExample() {
const messagesEndRef = useRef<HTMLDivElement>(null);
const {
messages,
isLoading,
isStreaming,
error,
sendMessage,
clearMessages,
abort,
} = useN4iChat({
apiEndpoint: "/api/chat",
n4iMode: true,
onError: (err) => console.error("Chat error:", err),
});
// Auto-scroll to bottom
useEffect(() => {
messagesEndRef.current?.scrollIntoView({ behavior: "smooth" });
}, [messages]);
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
const form = e.target as HTMLFormElement;
const input = form.elements.namedItem("message") as HTMLInputElement;
if (input.value.trim()) {
sendMessage(input.value);
input.value = "";
}
};
return (
<N4iErrorBoundary>
<div className="flex flex-col h-screen bg-slate-950">
{/* Header */}
<header className="flex items-center justify-between px-6 py-4 border-b border-white/10">
<h1 className="text-xl font-semibold">N4i Chat</h1>
<div className="flex gap-2">
{isStreaming && (
<button
onClick={abort}
className="px-3 py-1.5 text-sm bg-red-500/10 text-red-400 rounded-lg"
>
Stop
</button>
)}
<button
onClick={clearMessages}
className="p-2 hover:bg-white/10 rounded-lg"
title="Clear chat"
>
<Trash2 className="w-4 h-4" />
</button>
</div>
</header>
{/* Messages */}
<div className="flex-1 overflow-auto p-6 space-y-6">
{messages.length === 0 && (
<div className="text-center text-white/40 py-20">
<p className="text-lg mb-2">Start a conversation</p>
<p className="text-sm">Ask me to create any UI you can imagine</p>
</div>
)}
{messages.map((msg, i) => (
<div
key={i}
className={`flex ${msg.role === "user" ? "justify-end" : "justify-start"}`}
>
<div
className={`max-w-[80%] rounded-2xl ${
msg.role === "user"
? "bg-violet-500/20 px-4 py-3"
: "bg-white/5 p-1"
}`}
>
{msg.role === "user" ? (
<p className="text-white">{msg.content}</p>
) : (
<N4iMessageRenderer
content={msg.content}
primaryColor="#8b5cf6"
loadingText="Generating..."
/>
)}
</div>
</div>
))}
{/* Streaming indicator */}
{isStreaming && (
<div className="flex gap-2 items-center text-white/40">
<Loader2 className="w-4 h-4 animate-spin" />
<span>Thinking...</span>
</div>
)}
{/* Error */}
{error && (
<div className="flex items-center gap-2 p-4 bg-red-500/10 rounded-xl text-red-400">
<AlertCircle className="w-5 h-5" />
<span>{error}</span>
</div>
)}
<div ref={messagesEndRef} />
</div>
{/* Input */}
<form onSubmit={handleSubmit} className="p-4 border-t border-white/10">
<div className="flex gap-3">
<input
name="message"
placeholder="Describe what you want to create..."
disabled={isLoading}
autoComplete="off"
className="flex-1 px-4 py-3 bg-white/5 border border-white/10 rounded-xl
focus:border-violet-500/50 focus:outline-none transition-colors"
/>
<button
type="submit"
disabled={isLoading}
className="px-5 py-3 bg-violet-500 hover:bg-violet-400 text-white
rounded-xl font-medium transition-colors disabled:opacity-50"
>
{isStreaming ? (
<Loader2 className="w-5 h-5 animate-spin" />
) : (
<Send className="w-5 h-5" />
)}
</button>
</div>
</form>
</div>
</N4iErrorBoundary>
);
}
Next Steps
- → Add RAG Integration for document-aware responses
- → Customize appearance with Theming
- → Set up your API endpoint
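For orientation before you wire up the endpoint page: a chat endpoint ultimately just accepts the conversation and streams text back. The sketch below assumes a Next.js App Router route that receives { messages } and returns a plain text stream — the real request/response contract comes from your n4i-genui server setup, so treat every name here as illustrative.

```typescript
// Illustrative streaming endpoint sketch (Node 18+ web platform globals).
// Assumption: the hook POSTs { messages } to apiEndpoint and consumes the
// response body as a text stream; the actual wire format is defined by
// your n4i-genui server integration, not by this sketch.

type ChatMessage = { role: "user" | "assistant"; content: string };

// Turn an iterable of text chunks into a streaming Response.
export function streamText(chunks: Iterable<string>): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      for (const chunk of chunks) controller.enqueue(encoder.encode(chunk));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}

// Example route handler: echo the last user message back in chunks.
// In a real endpoint you would stream model output here instead.
export async function POST(req: Request): Promise<Response> {
  const { messages } = (await req.json()) as { messages: ChatMessage[] };
  const last = messages.at(-1)?.content ?? "";
  return streamText(["You said: ", last]);
}
```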
