# @plantagoai/ai
Anthropic Claude and Google Gemini AI wrappers with prompt caching, conversation history, vision support, and per-tenant usage tracking.
**Consumers:** Foundation, MarketHub, Soho
## Installation

```json
"@plantagoai/ai": "file:../../shared/packages/ai"
```
## Peer Dependencies

All peer dependencies are optional; install only what you need:

| Dependency | Required For |
|---|---|
| `@anthropic-ai/sdk` | Claude chat / prompt caching |
| `@google/generative-ai` | Gemini text / vision |
| `firebase-admin` | Usage tracking to Firestore |
## Claude (Anthropic)

### Simple Chat
```ts
import { claudeChat } from "@plantagoai/ai";

const response = await claudeChat("Explain zero-knowledge proofs in 2 sentences", {
  model: "claude-sonnet-4-20250514", // default
  maxTokens: 1024,
  temperature: 0.7,
});

console.log(response.text);
console.log(`Tokens: ${response.inputTokens} in, ${response.outputTokens} out`);
```
### Chat with Prompt Caching

Enables Anthropic's prompt caching: large system prompts are cached across calls, reducing cost and latency.
```ts
import { claudeWithCache } from "@plantagoai/ai";

const response = await claudeWithCache(
  "You are a constitutional reviewer. Here is the full constitution: ...", // cached
  "Review this proposal for constitutional alignment: ...",
  {
    maxTokens: 2048,
    conversationHistory: [
      { role: "user", content: "Previous question..." },
      { role: "assistant", content: "Previous answer..." },
    ],
  }
);

console.log(response.text);
console.log(`Cache read: ${response.cacheReadTokens} tokens`);
console.log(`Cache creation: ${response.cacheCreationTokens} tokens`);
```
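To gauge how well caching is working, you can compute the share of input-side tokens served from cache. The helper below is a sketch, not a package export; it only assumes the token-count fields shown above:

```ts
// Sketch: summarize cache effectiveness from a ClaudeResponse-shaped result.
// cacheHitRatio is the fraction of all input-side tokens served from cache.
interface CacheUsage {
  inputTokens: number;
  cacheReadTokens: number;
  cacheCreationTokens: number;
}

function cacheHitRatio(u: CacheUsage): number {
  const total = u.inputTokens + u.cacheReadTokens + u.cacheCreationTokens;
  return total === 0 ? 0 : u.cacheReadTokens / total;
}

const ratio = cacheHitRatio({ inputTokens: 300, cacheReadTokens: 1200, cacheCreationTokens: 0 });
console.log(`${(ratio * 100).toFixed(0)}% of input served from cache`); // 80%
```

A ratio near 1.0 on repeat calls with the same system prompt indicates the cache is being reused as intended.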
### Create Client Directly

```ts
import { createClaude } from "@plantagoai/ai";

// Singleton: subsequent calls return the same instance
const client = createClaude(process.env.ANTHROPIC_API_KEY);
```
### ClaudeOptions

| Field | Type | Default | Description |
|---|---|---|---|
| `apiKey` | `string` | `ANTHROPIC_API_KEY` env | API key |
| `model` | `string` | `"claude-sonnet-4-20250514"` | Model identifier |
| `maxTokens` | `number` | `1024` | Maximum output tokens |
| `temperature` | `number` | `1.0` | Sampling temperature |
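The defaults in the table likely resolve in option-then-environment order. A sketch of that resolution (assumed behavior, not the package's actual implementation):

```ts
// Sketch: how ClaudeOptions defaults plausibly resolve (assumed behavior).
interface ClaudeOptions {
  apiKey?: string;
  model?: string;
  maxTokens?: number;
  temperature?: number;
}

function resolveClaudeOptions(opts: ClaudeOptions = {}) {
  return {
    apiKey: opts.apiKey ?? process.env.ANTHROPIC_API_KEY ?? "",
    model: opts.model ?? "claude-sonnet-4-20250514",
    maxTokens: opts.maxTokens ?? 1024,
    temperature: opts.temperature ?? 1.0,
  };
}

console.log(resolveClaudeOptions({ maxTokens: 2048 }).model);
```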
### ClaudeResponse

```ts
interface ClaudeResponse {
  text: string;
  inputTokens: number;
  outputTokens: number;
  cacheReadTokens: number;
  cacheCreationTokens: number;
  model: string;
}
```
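These per-request token counts are enough to estimate spend. The sketch below uses placeholder per-million-token rates (not real prices; substitute your provider's current pricing):

```ts
// Sketch: estimate request cost from ClaudeResponse token counts.
// All rates are placeholder assumptions, NOT actual Anthropic pricing.
const RATES = {
  inputPerM: 3.0,       // USD per 1M regular input tokens (assumed)
  outputPerM: 15.0,     // USD per 1M output tokens (assumed)
  cacheReadPerM: 0.3,   // USD per 1M cache-read tokens (assumed)
  cacheWritePerM: 3.75, // USD per 1M cache-creation tokens (assumed)
};

function estimateCostUsd(r: {
  inputTokens: number;
  outputTokens: number;
  cacheReadTokens: number;
  cacheCreationTokens: number;
}): number {
  return (
    (r.inputTokens * RATES.inputPerM +
      r.outputTokens * RATES.outputPerM +
      r.cacheReadTokens * RATES.cacheReadPerM +
      r.cacheCreationTokens * RATES.cacheWritePerM) / 1_000_000
  );
}

const cost = estimateCostUsd({
  inputTokens: 300, outputTokens: 500, cacheReadTokens: 1200, cacheCreationTokens: 0,
});
console.log(cost.toFixed(6)); // 0.008760
```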
## Gemini (Google)

### Simple Chat
```ts
import { geminiChat } from "@plantagoai/ai";

const text = await geminiChat("Describe this herb's medicinal properties", {
  model: "gemini-2.0-flash", // default
  maxTokens: 1024,
});
```
### Vision (Image Analysis)

```ts
import { geminiVision } from "@plantagoai/ai";

// From base64
const fromBase64 = await geminiVision(
  "Identify this plant and describe its properties",
  { base64: imageBase64, mimeType: "image/jpeg" }
);

// From URL
const fromUrl = await geminiVision(
  "What product is shown in this image?",
  { url: "https://example.com/product.jpg" }
);
```
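The base64 form takes a raw base64 string plus a MIME type. A small helper for building that input from a Node.js `Buffer` (the helper is illustrative, not a package export):

```ts
// Sketch: build the { base64, mimeType } image input that geminiVision's
// base64 form expects, starting from a raw Node.js Buffer.
interface ImageInput {
  base64: string;
  mimeType: string;
}

function toImageInput(data: Buffer, mimeType = "image/jpeg"): ImageInput {
  return { base64: data.toString("base64"), mimeType };
}

const input = toImageInput(Buffer.from("fake-image-bytes"), "image/png");
console.log(input.mimeType, input.base64.slice(0, 8));
```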
### Create Model Instance

```ts
import { createGemini } from "@plantagoai/ai";

const model = createGemini({
  apiKey: process.env.GEMINI_API_KEY,
  model: "gemini-2.0-flash",
});
```
### GeminiOptions

| Field | Type | Default | Description |
|---|---|---|---|
| `apiKey` | `string` | `GEMINI_API_KEY` env | API key |
| `model` | `string` | `"gemini-2.0-flash"` | Model identifier |
| `maxTokens` | `number` | `1024` | Maximum output tokens |
| `temperature` | `number` | `1.0` | Sampling temperature |
## Usage Tracking

Log AI usage to Firestore for billing and audit:

```ts
import { trackUsage, getUsageStats } from "@plantagoai/ai";

// Log usage (called automatically by chat functions when firebase-admin is available)
await trackUsage({
  provider: "anthropic",
  model: "claude-sonnet-4-20250514",
  inputTokens: 1500,
  outputTokens: 300,
  cacheReadTokens: 1200,
  userId: "user-123",
  tenantId: "org-1",
  feature: "proposal-review",
});

// Query stats
const stats = await getUsageStats({
  tenantId: "org-1",
  startDate: new Date("2026-04-01"),
  endDate: new Date("2026-04-30"),
});
// {
//   totalInputTokens: 150000,
//   totalOutputTokens: 30000,
//   totalCacheReadTokens: 120000,
//   requestCount: 85,
//   byProvider: {
//     anthropic: { input: 100000, output: 20000, requests: 50 },
//     google: { input: 50000, output: 10000, requests: 35 }
//   }
// }
```
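The stats shape can be reproduced by a plain reduction over usage records. The reducer below is a sketch of that aggregation for illustration only; the real `getUsageStats` queries Firestore:

```ts
// Sketch: aggregate usage records into the stats shape shown above.
// Illustrates the data model; not how the package computes it.
interface UsageLike {
  provider: "anthropic" | "google";
  inputTokens: number;
  outputTokens: number;
  cacheReadTokens?: number;
}

interface ProviderStats { input: number; output: number; requests: number; }

function aggregate(records: UsageLike[]) {
  const byProvider: { [provider: string]: ProviderStats } = {};
  let totalInputTokens = 0, totalOutputTokens = 0, totalCacheReadTokens = 0;
  for (const r of records) {
    totalInputTokens += r.inputTokens;
    totalOutputTokens += r.outputTokens;
    totalCacheReadTokens += r.cacheReadTokens ?? 0;
    let p = byProvider[r.provider];
    if (!p) {
      p = { input: 0, output: 0, requests: 0 };
      byProvider[r.provider] = p;
    }
    p.input += r.inputTokens;
    p.output += r.outputTokens;
    p.requests += 1;
  }
  return { totalInputTokens, totalOutputTokens, totalCacheReadTokens, requestCount: records.length, byProvider };
}

const agg = aggregate([
  { provider: "anthropic", inputTokens: 1500, outputTokens: 300, cacheReadTokens: 1200 },
  { provider: "google", inputTokens: 800, outputTokens: 200 },
]);
console.log(agg.requestCount, agg.totalInputTokens); // 2 2300
```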
### UsageRecord

```ts
interface UsageRecord {
  provider: "anthropic" | "google";
  model: string;
  inputTokens: number;
  outputTokens: number;
  cacheReadTokens?: number;
  cacheCreationTokens?: number;
  userId?: string;
  tenantId?: string;
  feature?: string;
  timestamp: Date;
}
```
## Integration Examples

### Foundation: Proposal Constitutional Review

```ts
import { claudeWithCache } from "@plantagoai/ai";

const constitution = await getConstitution(); // fetch from Firestore
const review = await claudeWithCache(
  `You are a constitutional reviewer. Constitution:\n${constitution}`,
  `Review this proposal:\nTitle: ${proposal.title}\nDescription: ${proposal.description}`
);
```
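Prompt caching only pays off when the cached system prompt stays byte-identical across calls, so it helps to keep per-proposal detail in the (uncached) user message. A small builder sketch (hypothetical helper, not a package export) that enforces that split:

```ts
// Sketch: keep the cached system prompt stable across reviews and put
// per-proposal detail in the user message, so the cache can be reused.
interface Proposal { title: string; description: string; }

function buildReviewPrompts(constitution: string, proposal: Proposal) {
  return {
    system: `You are a constitutional reviewer. Constitution:\n${constitution}`,
    user: `Review this proposal:\nTitle: ${proposal.title}\nDescription: ${proposal.description}`,
  };
}

const prompts = buildReviewPrompts("Article 1 ...", { title: "T", description: "D" });
console.log(prompts.user);
```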
### MarketHub: Product Description Generation

```ts
import { claudeChat, geminiVision } from "@plantagoai/ai";

// Generate description from image
const visual = await geminiVision("Describe this product", { url: imageUrl });
const description = await claudeChat(`Write a product listing for: ${visual}`);
```