Community Providers
The AI SDK provides a Language Model Specification. You can write your own provider that adheres to the specification, and it will be compatible with the AI SDK.

What are Community Providers?
Community providers are third-party implementations of the AI SDK provider interface that let you use additional AI services and platforms beyond the official SDK providers. These providers are built and maintained by the community and offer access to a wide variety of models and services.

Available Community Providers
The AI SDK community has created providers for many popular AI platforms and services. Here are some notable examples.

AI Platforms
- Ollama - Run large language models locally
- OpenRouter - Access to multiple AI models through a single API
- Portkey - AI gateway for model routing and fallbacks
- Cloudflare Workers AI - Run AI models on Cloudflare’s edge network
Specialized Providers
- Voyage AI - High-quality embeddings
- Jina AI - Multimodal embeddings and reranking
- Mixedbread - Embeddings optimized for search
- LangDB - Vector database integration
Regional Providers
- Zhipu AI - Chinese language models
- Qwen - Alibaba’s multilingual models
- Minimax - Chinese AI platform
- Sarvam - Indian language models
Development Tools
- Browser AI - Run models in the browser
- Llama.cpp - High-performance local inference
- MCP Sampling - Model Context Protocol support
Using Community Providers
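A typical community provider exposes a factory function that turns model IDs into model instances you can hand to the AI SDK. The sketch below uses stand-in names (`createExampleProvider` and `LanguageModelStub` are illustrative, not real package exports) to show the shape:

```typescript
// Sketch of the common community-provider usage pattern.
// Real providers export a factory like this one; the factory and
// model shapes below are simplified stand-ins.
interface LanguageModelStub {
  readonly provider: string;
  readonly modelId: string;
}

// A provider factory takes configuration and returns a function
// that resolves model IDs to model instances.
function createExampleProvider(options: { baseURL: string }) {
  // A real factory would use options.baseURL to configure its HTTP client.
  return (modelId: string): LanguageModelStub => ({
    provider: 'example',
    modelId,
  });
}

// Usage mirrors the official providers:
const example = createExampleProvider({ baseURL: 'http://localhost:11434' });
const model = example('llama3');
console.log(model.provider, model.modelId);
```

With a real provider, the instance created this way is passed directly to AI SDK functions such as `generateText` or `streamText`.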
Most community providers follow the same usage pattern as the official providers: install the provider package, create a provider instance, and pass one of its models to AI SDK functions such as generateText or streamText.

Creating Your Own Provider

You can create your own custom provider by implementing the Language Model Specification. This allows you to:

- Integrate proprietary AI services
- Add support for new AI platforms
- Create specialized wrappers for existing services
- Implement custom model routing logic
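To make the shape of such an implementation concrete, here is a heavily simplified stand-in for the Language Model Specification. The real interface (published in the `@ai-sdk/provider` package) defines many more fields, including streaming support; `MinimalLanguageModel` and `EchoModel` below are illustrative names only:

```typescript
// Simplified sketch of implementing a language-model interface.
// This stand-in captures only the general shape of the specification.
interface MinimalLanguageModel {
  readonly specificationVersion: string;
  readonly provider: string;
  readonly modelId: string;
  doGenerate(options: { prompt: string }): Promise<{ text: string }>;
}

// A toy model that "generates" by echoing the prompt, standing in
// for a call to a proprietary backend.
class EchoModel implements MinimalLanguageModel {
  readonly specificationVersion = 'v1';
  readonly provider = 'echo';
  constructor(readonly modelId: string) {}

  async doGenerate(options: { prompt: string }): Promise<{ text: string }> {
    // A real provider would call its backend API here.
    return { text: `echo(${this.modelId}): ${options.prompt}` };
  }
}

const model = new EchoModel('echo-1');
model.doGenerate({ prompt: 'hello' }).then((r) => console.log(r.text));
```

A wrapper for an existing service or a custom router would implement the same interface, dispatching to the underlying service inside `doGenerate`.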
Provider Quality and Support
Community providers are maintained by third parties and may vary in:

- Feature completeness - Some providers may not support all AI SDK features
- Update frequency - Maintenance schedules differ between providers
- Documentation quality - Documentation depth varies by provider
- API compatibility - Breaking changes may occur independently of the AI SDK
Before adopting a community provider, we recommend:

- Checking the provider’s documentation and examples
- Reviewing the provider’s GitHub repository for activity
- Testing with your specific use case
- Having fallback options for critical applications
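The last point can be implemented as a simple try-in-order strategy: attempt the primary provider and, on failure, fall back to a secondary one. The `Generate` type and functions below are illustrative stand-ins for AI SDK calls:

```typescript
// Sketch of a fallback strategy across providers: try each provider
// in order and return the first successful result.
type Generate = (prompt: string) => Promise<string>;

async function generateWithFallback(
  prompt: string,
  providers: Generate[],
): Promise<string> {
  let lastError: unknown = undefined;
  for (const generate of providers) {
    try {
      return await generate(prompt);
    } catch (err) {
      lastError = err; // record the failure and try the next provider
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}

// Demo: the primary always fails, the fallback succeeds.
const primary: Generate = async () => {
  throw new Error('primary unavailable');
};
const fallback: Generate = async (prompt) => `fallback: ${prompt}`;

generateWithFallback('hi', [primary, fallback]).then(
  (text) => console.log(text), // prints "fallback: hi"
);
```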
Contributing
If you’ve built a provider that implements the Language Model Specification, you can:

- Publish it to npm
- Add documentation and examples
- Submit a pull request to add it to the community providers list
Do you have a provider that supports the AI SDK and has integration documentation? Please open a pull request to add it to the list.