Quick Start
Integrate Helicone with Vercel AI SDK by configuring the provider with Helicone’s base URL.

Installation
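Install the AI SDK core plus the provider packages you need (package names current as of AI SDK v3; the Anthropic package is only required if you use Claude):

```bash
npm install ai @ai-sdk/openai @ai-sdk/anthropic
```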
Configuration
Environment Variables
Set up your environment in .env.local:
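A minimal .env.local, assuming OpenAI as the provider. The Helicone key comes from your Helicone dashboard; the variable names here are conventional, not mandated:

```bash
# .env.local
HELICONE_API_KEY=sk-helicone-...
OPENAI_API_KEY=sk-...
```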
Provider Configuration
Configure different providers:

- OpenAI
- Anthropic
- Azure OpenAI
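A sketch of the provider setup as plain option objects you would pass to `createOpenAI(...)` from `@ai-sdk/openai` or `createAnthropic(...)` from `@ai-sdk/anthropic`. The gateway URLs and the `Helicone-Auth` header follow Helicone's documented pattern; verify them against your Helicone dashboard:

```typescript
// In a real app, read this from process.env.HELICONE_API_KEY.
const HELICONE_API_KEY = "sk-helicone-placeholder";

// Shared auth header that routes requests through Helicone's gateway.
const heliconeAuth = { "Helicone-Auth": `Bearer ${HELICONE_API_KEY}` };

// Pass to createOpenAI({...}) from "@ai-sdk/openai".
const openaiOptions = {
  baseURL: "https://oai.helicone.ai/v1", // Helicone's OpenAI gateway
  headers: heliconeAuth,
};

// Pass to createAnthropic({...}) from "@ai-sdk/anthropic".
const anthropicOptions = {
  baseURL: "https://anthropic.helicone.ai/v1", // Helicone's Anthropic gateway
  headers: heliconeAuth,
};

// Azure OpenAI also routes through Helicone, but needs extra headers
// (e.g. your Azure resource endpoint); see Helicone's Azure docs.
```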
Streaming Support
Helicone fully supports streaming with the Vercel AI SDK.

Text Streaming
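Because Helicone sits at the gateway, the streaming code itself does not change. As a runnable sketch of the consumption pattern: `result.textStream` from `streamText(...)` is an async iterable of text chunks; here a mock generator stands in for the model call so no network access is needed:

```typescript
// Mock stand-in for result.textStream from streamText({ model, prompt }).
async function* mockTextStream(): AsyncGenerator<string> {
  for (const chunk of ["Streaming ", "through ", "Helicone"]) yield chunk;
}

// The same for-await loop works on the real textStream.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const chunk of stream) text += chunk;
  return text;
}

collect(mockTextStream()).then((text) => console.log(text)); // logs "Streaming through Helicone"
```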
Object Streaming
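Likewise for `streamObject(...)`: `result.partialObjectStream` yields progressively more complete objects as the model streams. A mock sketch (the `Recipe` shape is ours, not from the SDK):

```typescript
type Recipe = { name?: string; steps?: string[] };

// Mock stand-in for result.partialObjectStream from streamObject(...).
async function* mockPartialObjects(): AsyncGenerator<Recipe> {
  yield { name: "Pancakes" };
  yield { name: "Pancakes", steps: ["mix"] };
  yield { name: "Pancakes", steps: ["mix", "fry"] };
}

// Keep the latest partial; the same loop works on the real stream.
async function lastPartial(stream: AsyncIterable<Recipe>): Promise<Recipe | undefined> {
  let latest: Recipe | undefined;
  for await (const partial of stream) latest = partial;
  return latest;
}
```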
Client-Side Integration
Use Helicone with client-side Vercel AI SDK hooks in app/page.tsx:
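A minimal sketch of a client page using the `useChat` hook (import path `ai/react` for AI SDK v3; newer releases move it to `@ai-sdk/react`). No client changes are needed for Helicone, since the Helicone headers live in the server route the hook posts to:

```tsx
"use client";
import { useChat } from "ai/react";

export default function Chat() {
  // Posts to /api/chat by default; that route's Helicone-configured
  // provider is what gets every message logged.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <div key={m.id}>{m.role}: {m.content}</div>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```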
Advanced Features
Session Tracking
Track multi-turn conversations.

User Tracking
Track requests by user.

Custom Properties
Add metadata to your requests.

Tool Calls
Track tool usage.

Edge Runtime Support
Helicone works seamlessly with the Edge Runtime (for example in app/api/chat/route.ts).
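Sessions, user tracking, and custom properties are all driven by `Helicone-*` request headers, and plain headers work unchanged in Edge routes. A runnable sketch of a helper that assembles them; the header names follow Helicone's documented conventions, but the helper itself is ours:

```typescript
// Hypothetical helper that assembles Helicone tracking headers.
function heliconeHeaders(opts: {
  sessionId?: string;
  sessionName?: string;
  userId?: string;
  properties?: Record<string, string>;
}): Record<string, string> {
  const h: Record<string, string> = {};
  if (opts.sessionId) h["Helicone-Session-Id"] = opts.sessionId;     // groups a conversation
  if (opts.sessionName) h["Helicone-Session-Name"] = opts.sessionName;
  if (opts.userId) h["Helicone-User-Id"] = opts.userId;              // per-user analytics
  for (const [key, value] of Object.entries(opts.properties ?? {})) {
    h[`Helicone-Property-${key}`] = value;                            // custom metadata
  }
  return h;
}

// Pass per request via the AI SDK's `headers` option, e.g.
// streamText({ model, messages, headers: heliconeHeaders({ ... }) })
const headers = heliconeHeaders({
  sessionId: "session-123",
  userId: "user-42",
  properties: { environment: "staging" },
});
console.log(headers["Helicone-Property-environment"]); // "staging"
```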
Troubleshooting
Headers not being sent
Make sure to pass headers in the provider configuration:
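The `Helicone-Auth` header belongs in the provider factory options, so it is sent with every request. A sketch of the shape accepted by `createOpenAI(...)` from `@ai-sdk/openai`:

```typescript
// Correct: headers set once on the provider.
const providerOptions = {
  baseURL: "https://oai.helicone.ai/v1",
  headers: { "Helicone-Auth": "Bearer <HELICONE_API_KEY>" }, // placeholder key
};

// Symptom check: if this header is missing, requests go to OpenAI
// directly and never appear in Helicone.
console.log("Helicone-Auth" in providerOptions.headers); // true
```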
Streaming not working
Ensure you’re calling .toDataStreamResponse() or .toTextStreamResponse() on the streaming result.

Edge runtime errors
Helicone is fully compatible with Edge runtime. If you encounter issues:
- Verify environment variables are set in your hosting platform
- Check that runtime = 'edge' is exported
- Ensure you’re using supported dependencies
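In Next.js, the Edge opt-in is a route-level export:

```typescript
// app/api/chat/route.ts
export const runtime = "edge";
```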
Examples from Source
See real integration examples.

Next Steps
- Sessions: Track multi-turn conversations
- Custom Properties: Add metadata to requests
- User Analytics: Analyze user behavior
- Dashboard: Monitor streaming performance