Note: This service is temporarily disabled.

Overview

Ollama provides local language model hosting capabilities, enabling organizations to run open-source language models on their own infrastructure without external API dependencies. This service offers complete data privacy, offline operation, and cost control while supporting a wide range of open-source models including Llama, Mistral, CodeLlama, and specialized variants.

Service Information

Property          Value
----------------  ----------------------------------------
Service Name      Ollama
Status            Temporarily disabled
Compatible Nodes  Call LLM, HyperFlow LLM PDF transformer

When to Use This Service

Ollama is ideal for:

- Complete data privacy: prompts and documents never leave your own infrastructure
- Offline or air-gapped operation with no external API dependencies
- Cost control: no per-request fees from hosted LLM providers
- Running open-source models such as Llama, Mistral, and CodeLlama

Setup Requirements

Ollama Installation:

1. Install Ollama on a host the platform can reach (installers and an install script for Linux/macOS are available from ollama.com).
2. Start the Ollama server; by default it listens on localhost:11434.
3. Pull at least one model (for example, ollama pull llama3) so it is available to the compatible workflow nodes.
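Once the server is running, you can verify the installation programmatically. A minimal sketch, assuming Ollama's default endpoint http://localhost:11434 and its documented GET /api/tags endpoint, which returns the locally installed models:

```python
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON body returned by GET /api/tags."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]


def installed_models() -> list[str]:
    """Query the local Ollama server for its installed models."""
    with request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return parse_model_names(resp.read().decode("utf-8"))


if __name__ == "__main__":
    try:
        print("Installed models:", installed_models())
    except error.URLError:
        print("Ollama is not reachable on localhost:11434 -- is the server running?")
```

An empty model list means the server is up but no model has been pulled yet.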

Available Models

Models depend on your local Ollama installation. Popular options include:

- Llama (e.g. llama3): general-purpose chat and reasoning
- Mistral: fast, general-purpose generation
- CodeLlama: code generation and completion
- Specialized community variants published in the Ollama model library
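Once a model is pulled, nodes such as Call LLM reach Ollama over its local REST API. A minimal sketch of a one-shot completion, assuming the default endpoint http://localhost:11434 and llama3 as an example model name (POST /api/generate with model, prompt, and stream fields is Ollama's documented API):

```python
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def build_generate_payload(model: str, prompt: str, stream: bool = False) -> bytes:
    """Encode a request body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode("utf-8")


def generate(model: str, prompt: str) -> str:
    """Send a non-streaming generation request and return the response text."""
    req = request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    try:
        print(generate("llama3", "Say hello in one word."))
    except error.URLError:
        print("Ollama server not reachable on localhost:11434")
```

With stream set to False, the server returns a single JSON object whose response field holds the full completion; streaming instead yields newline-delimited JSON chunks.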