# AI Module

Claude and OpenAI integration with streaming chat, token tracking, and provider switching.

This module is included in the Pro plan only.
## Overview
The AI module gives your SaaS an AI chat interface powered by Anthropic Claude or OpenAI. It includes:
- Streaming chat responses
- Provider switching (Claude or OpenAI) via environment variable
- Token usage tracking per user
- Rate limiting based on subscription plan
- React chat component ready to drop in
All AI logic lives in `src/lib/ai/`, and the chat component lives in `src/components/dashboard/`.
## Configuration

### Environment variables
```bash
AI_PROVIDER="anthropic"
AI_MODEL="claude-sonnet-4-20250514"
ANTHROPIC_API_KEY="sk-ant-your-api-key"
```
Or for OpenAI:
```bash
AI_PROVIDER="openai"
AI_MODEL="gpt-4o"
OPENAI_API_KEY="sk-your-api-key"
```
### Provider setup

The AI client auto-selects the provider based on `AI_PROVIDER`:
```typescript
import Anthropic from "@anthropic-ai/sdk";
import OpenAI from "openai";

export function getAIClient() {
  if (process.env.AI_PROVIDER === "anthropic") {
    return new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
  }
  return new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
}
```
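Because the client silently falls back to OpenAI when `AI_PROVIDER` is unset, a misconfigured key surfaces only at request time. A startup guard can fail faster. The helper below is a sketch, not part of the module; it assumes only the environment variables shown above, and takes `env` as a parameter so it is easy to unit-test.

```typescript
type AIProvider = "anthropic" | "openai";

export interface AIConfig {
  provider: AIProvider;
  model: string;
  apiKey: string;
}

// Validate the AI-related environment variables up front and return a typed
// config. Unset AI_PROVIDER falls back to "openai", mirroring getAIClient().
export function resolveAIConfig(env: Record<string, string | undefined>): AIConfig {
  const provider: AIProvider = env.AI_PROVIDER === "anthropic" ? "anthropic" : "openai";
  const apiKey = provider === "anthropic" ? env.ANTHROPIC_API_KEY : env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error(`Missing API key for provider "${provider}"`);
  }
  if (!env.AI_MODEL) {
    throw new Error("AI_MODEL is not set");
  }
  return { provider, model: env.AI_MODEL, apiKey };
}
```

Calling this once at boot turns a missing key into a clear error instead of a failed chat request later.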
## Usage

### Streaming chat API route
```typescript
import { NextRequest } from "next/server";
import { auth } from "@/lib/auth/server";
import { streamChat } from "@/lib/ai/chat";
import { trackTokenUsage } from "@/lib/ai/tokens";

export async function POST(request: NextRequest) {
  const session = await auth.api.getSession({
    headers: request.headers,
  });

  if (!session) {
    return new Response("Unauthorized", { status: 401 });
  }

  const { messages } = await request.json();

  const stream = await streamChat({
    messages,
    userId: session.user.id,
    onComplete: (usage) => {
      trackTokenUsage(session.user.id, usage);
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```
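Clients read the `text/event-stream` body incrementally and extract the text from each `data:` line. The parser below is a sketch: it assumes each event carries its text delta directly in a single `data:` line and uses `[DONE]` as an end marker, which is a common convention but depends on how `streamChat` actually frames its events.

```typescript
// Extract text payloads from a chunk of an SSE (text/event-stream) body.
// Assumes one `data: <text>` line per event and a `[DONE]` terminator;
// adapt to streamChat's real framing.
export function parseSSEChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
    .filter((data) => data !== "[DONE]");
}
```

A client would accumulate the returned strings into the assistant message as chunks arrive from `response.body.getReader()`.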
### Chat component
"use client";
import { useState } from "react";
import { useChat } from "@/lib/ai/use-chat";
export function Chat() {
const { messages, input, setInput, sendMessage, isLoading } = useChat();
return (
<div className="flex flex-col h-full">
<div className="flex-1 overflow-y-auto space-y-4 p-4">
{messages.map((message) => (
<div key={message.id} className={message.role === "user" ? "text-right" : ""}>
<p>{message.content}</p>
</div>
))}
</div>
<form onSubmit={sendMessage} className="p-4 border-t">
<input
value={input}
onChange={(e) => setInput(e.target.value)}
placeholder="Ask anything..."
disabled={isLoading}
/>
</form>
</div>
);
}
## Token tracking

Usage is stored in the `usage` table and queried per user:
```typescript
import { db } from "@/lib/db";
import { usage } from "@/db/schema/usage";

export async function trackTokenUsage(
  userId: string,
  tokenData: { inputTokens: number; outputTokens: number; model: string }
) {
  await db.insert(usage).values({
    userId,
    tokensUsed: tokenData.inputTokens + tokenData.outputTokens,
    model: tokenData.model,
    createdAt: new Date(),
  });
}

export async function getUserTokenUsage(userId: string, daysBack = 30) {
  // Query total tokens used in the last N days
}
```
## Rate limiting by plan
```typescript
const TOKEN_LIMITS = {
  starter: 50_000, // 50k tokens/month
  pro: 500_000, // 500k tokens/month
} as const;

export async function checkTokenLimit(userId: string, plan: string) {
  const used = await getUserTokenUsage(userId, 30);
  const limit = TOKEN_LIMITS[plan as keyof typeof TOKEN_LIMITS] ?? 0;
  return used < limit;
}
```
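For dashboards it is often useful to show how much of the monthly budget remains, not just a pass/fail check. A small helper over the same `TOKEN_LIMITS` map (duplicated here so the sketch is self-contained):

```typescript
const TOKEN_LIMITS = {
  starter: 50_000,
  pro: 500_000,
} as const;

// Tokens left in the current window. Unknown plans get a zero budget,
// mirroring the `?? 0` fallback in checkTokenLimit.
export function remainingTokens(used: number, plan: string): number {
  const limit = TOKEN_LIMITS[plan as keyof typeof TOKEN_LIMITS] ?? 0;
  return Math.max(0, limit - used);
}
```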
## Customization

### Change the model

Update `AI_MODEL` in `.env.local`. Any model supported by the provider works:

- Anthropic: `claude-sonnet-4-20250514`, `claude-haiku-4-20250414`, `claude-opus-4-20250514`
- OpenAI: `gpt-4o`, `gpt-4o-mini`, `o1-preview`
### Add a system prompt
```typescript
const SYSTEM_PROMPT = `You are a helpful assistant for ${APP_NAME}.
You help users with their questions about the product.
Be concise and friendly.`;

export async function streamChat({ messages, userId, onComplete }) {
  // Prepend system prompt to messages
}
```
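The two providers take the system prompt differently: Anthropic's Messages API has a top-level `system` field, while OpenAI expects it as a message with role `"system"` at the front of the list. A sketch of building the right shape (plain objects only, no SDK calls):

```typescript
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

// Attach a system prompt in the shape each provider expects:
// Anthropic takes a top-level `system` field; OpenAI takes a leading
// message with role "system".
export function withSystemPrompt(
  provider: "anthropic" | "openai",
  systemPrompt: string,
  messages: ChatMessage[]
) {
  if (provider === "anthropic") {
    return { system: systemPrompt, messages };
  }
  return { messages: [{ role: "system", content: systemPrompt }, ...messages] };
}
```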
### Implement RAG (Retrieval-Augmented Generation)
- Store documents in a vector database (e.g., Supabase pgvector)
- Before sending messages to the AI, query relevant documents
- Include retrieved context in the system prompt
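The last step above amounts to splicing the retrieved text into the system prompt. A sketch of that formatting step; the document shape and the `<context>` delimiters are assumptions for illustration, not part of the module:

```typescript
interface RetrievedDoc {
  title: string;
  content: string;
}

// Fold retrieved documents into the system prompt so the model can draw
// on them. The <context> delimiters are just a convention to keep the
// boundary between instructions and retrieved text clear.
export function buildRAGPrompt(basePrompt: string, docs: RetrievedDoc[]): string {
  if (docs.length === 0) return basePrompt;
  const context = docs.map((doc) => `## ${doc.title}\n${doc.content}`).join("\n\n");
  return `${basePrompt}\n\n<context>\n${context}\n</context>\nAnswer using the context above when relevant.`;
}
```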
### Add tool use / function calling
Both Claude and OpenAI support tool use. Define tools in your chat function:
```typescript
export const tools = [
  {
    name: "get_user_subscription",
    description: "Get the current user's subscription status",
    inputSchema: { type: "object", properties: {} },
  },
];
```
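The snippet above is provider-neutral, but each API wants the same information in a different envelope: Anthropic's Messages API takes `input_schema` on the tool itself, while OpenAI wraps the schema in a `function` object with a `parameters` field. A shared definition can be mapped per provider; treat this as a sketch of the two wire formats.

```typescript
interface ToolDef {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
}

// Map a shared tool definition to each provider's wire format.
// Anthropic: { name, description, input_schema }.
// OpenAI: { type: "function", function: { name, description, parameters } }.
export function toProviderTool(provider: "anthropic" | "openai", tool: ToolDef) {
  if (provider === "anthropic") {
    return { name: tool.name, description: tool.description, input_schema: tool.inputSchema };
  }
  return {
    type: "function",
    function: { name: tool.name, description: tool.description, parameters: tool.inputSchema },
  };
}
```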
## Removing this module

- Delete the `src/lib/ai/` directory
- Delete the `src/app/api/ai/` directory
- Remove the chat component from `src/components/dashboard/`
- Drop the `usage` table from your schema
- Remove AI dependencies: `pnpm remove @anthropic-ai/sdk openai`
- Remove AI environment variables from `.env.local`