Ollama integration
AI models · Developer tools · 8 actions
Run large language models locally or in the cloud with Ollama
What you'll need
Bring your own API key
You'll paste an API key from your Ollama account. The key is encrypted in a managed vault under your workspace — rundash never sees the raw value.
Ollama Instance URL
The base URL of your Ollama instance (e.g., http://localhost:11434 for a local install, or https://ollama.com/api for the cloud service)
What your operators can do with Ollama
8 actions available via MCP
Once connected, any of your operators can call Ollama without extra setup.
Hire an operator that can use Ollama
Free to start — no credit card required. Your operator will have access to Ollama from its very first run.