ollama

Ollama inference engine backend.

Classes

OllamaEngine

OllamaEngine(host: str | None = None, *, timeout: float = 1800.0)

Bases: InferenceEngine

Ollama backend via its native HTTP API.

Source code in src/openjarvis/engine/ollama.py
def __init__(
    self,
    host: str | None = None,
    *,
    timeout: float = 1800.0,
) -> None:
    # Priority: explicit host (from config.toml) > OLLAMA_HOST env var > default
    if host is None:
        env_host = os.environ.get("OLLAMA_HOST")
        host = env_host or self._DEFAULT_HOST
    self._host = host.rstrip("/")
    self._client = httpx.Client(base_url=self._host, timeout=timeout)

Functions