Note: This service is temporarily disabled.
Ollama provides local language model hosting capabilities, enabling organizations to run open-source language models on their own infrastructure without external API dependencies. This service offers complete data privacy, offline operation, and cost control while supporting a wide range of open-source models including Llama, Mistral, CodeLlama, and specialized variants.
| Property | Value | 
|---|---|
| Service Name | Ollama | 
| Status | Temporarily disabled | 
| Compatible Nodes | Call LLM, HyperFlow LLM PDF transformer | 
Ollama is ideal for:

- Privacy-sensitive deployments where data must remain on your own infrastructure
- Offline operation without external API dependencies
- Controlling inference costs by self-hosting open-source models
Ollama Installation:

Install Ollama on your machine, then pull the models you want to use (for example, `ollama pull llama2` or `ollama pull mistral`). Available models depend on your local Ollama installation. Popular options include:

- `llama2`
- `mistral`
- `codellama`
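Once a model is pulled, Ollama serves a local HTTP API (by default at `http://localhost:11434`). As a minimal sketch of how a client could talk to it, the helpers below build a request body for the `/api/generate` endpoint and reassemble a streamed newline-delimited JSON reply; the model name, prompt, and sample response lines are illustrative, not output from a real server.

```python
import json

# Default local endpoint for Ollama's generate API (assumes a standard install).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = True) -> dict:
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def collect_stream(ndjson_lines) -> str:
    """Concatenate the 'response' tokens from a streamed NDJSON reply.

    Ollama streams one JSON object per line; the final chunk is
    marked with "done": true.
    """
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Illustrative request body and a hypothetical two-chunk streamed reply:
body = build_generate_request("llama2", "Why is the sky blue?")
sample_lines = [
    '{"model": "llama2", "response": "Because of ", "done": false}',
    '{"model": "llama2", "response": "Rayleigh scattering.", "done": true}',
]
answer = collect_stream(sample_lines)
```

In a real deployment you would POST `body` to `OLLAMA_URL` (e.g. with `urllib.request` or `requests`) and feed the response lines into `collect_stream` as they arrive.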