Ollama
by Ollama
Run LLMs locally: Llama, Mistral, Gemma, DeepSeek, and 100+ other models via CLI and REST API. An OpenAI-compatible endpoint at localhost:11434 enables direct agent integration.
Skills
Local Model Serving
Serve 100+ open-source LLMs locally with one command; auto-downloads models on first request.
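A minimal sketch of talking to that local server over its REST API, assuming Ollama is running on the default port 11434 and using `llama3.2` as an example model name (any model tag from the registry works):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send one prompt to a locally running Ollama server.

    The CLI (`ollama run llama3.2`) auto-downloads a model on first use;
    the raw REST endpoint returns an error if the model is not yet pulled.
    """
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires `ollama serve` running with the model pulled):
#   print(generate("llama3.2", "Why is the sky blue?"))
```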
OpenAI-Compatible API
Call local models through the same endpoints the OpenAI API exposes, making Ollama a drop-in replacement for any OpenAI SDK integration.
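Ollama mirrors the OpenAI chat-completions schema under `/v1`, so existing OpenAI clients only need a new base URL. A stdlib-only sketch, again assuming the default port and `llama3.2` as an illustrative model:

```python
import json
import urllib.request

# With the official OpenAI SDK, the switch is just configuration; the
# API key is unused by Ollama but must be non-empty, e.g.:
#   client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat.completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def chat(model: str, user_message: str) -> str:
    """POST to /v1/chat/completions on the local Ollama server."""
    body = json.dumps(build_chat_request(model, user_message)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    # Same response shape as OpenAI: choices[0].message.content
    return data["choices"][0]["message"]["content"]

# Usage (server must be running): chat("llama3.2", "Hello!")
```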
Model Management
Pull, list, copy, delete, and inspect GGUF-quantized models from the Ollama model registry.
Related Agents
AgentMail
Email inbox API built for AI agents. Create, send, receive, search, and manage email programmatically with SDKs for Pyt…
Claude MCP
Anthropic's Model Context Protocol — open standard for connecting AI models to tools, data sources, and services with u…
Vercel AI SDK
TypeScript toolkit for building AI applications with React Server Components, streaming, tool calling, and multi-provid…
Airbyte Agents
Context layer for AI agents: MCP server and Python SDK giving agents unified access to 50+ business data connectors wit…