pyseekdb.utils.embedding_functions.OllamaEmbeddingFunction

class pyseekdb.utils.embedding_functions.OllamaEmbeddingFunction(model_name: str = 'nomic-embed-text', api_key_env: str | None = None, api_base: str | None = None, dimensions: int | None = None, **kwargs: Any)[source]

Bases: OpenAIBaseEmbeddingFunction

A convenience embedding function for Ollama embedding models.

This class provides a simplified interface to Ollama embedding models through Ollama's OpenAI-compatible API endpoints for embedding generation.

For more information about Ollama, see https://docs.ollama.com/

Note: Before using a model, you need to pull it locally using ollama pull <model_name>.

Example

pip install pyseekdb openai
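
A minimal usage sketch, assuming the embedding function is invoked by calling the instance on a list of documents (the call signature is not spelled out on this page) and that a local Ollama server is running:

from pyseekdb.utils.embedding_functions import OllamaEmbeddingFunction

# Pull the model first:  ollama pull nomic-embed-text
# By default this talks to a local Ollama server at http://localhost:11434/v1.
ef = OllamaEmbeddingFunction(model_name="nomic-embed-text")

# Assumed call convention: one embedding vector per input document.
embeddings = ef(["Hello, world!", "Ollama embeddings via the OpenAI-compatible API."])
print(len(embeddings), len(embeddings[0]))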

__init__(model_name: str = 'nomic-embed-text', api_key_env: str | None = None, api_base: str | None = None, dimensions: int | None = None, **kwargs: Any)[source]

Initialize OllamaEmbeddingFunction.

Parameters:
  • model_name (str, optional) – Name of the Ollama embedding model. Defaults to “nomic-embed-text”. See Ollama documentation for available models: https://docs.ollama.com/capabilities/embeddings Note: Models must be pulled locally first using ollama pull <model_name>

  • api_key_env (str, optional) – Name of the environment variable containing the Ollama API key. Defaults to “OLLAMA_API_KEY” if not provided. Note: The API key is required but ignored by Ollama. You can set it to “ollama” or any value. If the environment variable is not set, it will default to “ollama”.

  • api_base (str, optional) – Base URL for the Ollama API endpoint. Defaults to “http://localhost:11434/v1” if not provided. For remote Ollama servers, use the appropriate URL.

  • dimensions (int, optional) – The number of dimensions the resulting embeddings should have. Only honored if the model supports configurable output dimensions; check the model documentation for supported values (see the construction sketch after this parameter list).

  • **kwargs – Additional arguments to pass to the OpenAI client. See https://github.com/openai/openai-python for more information.
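
For illustration, a hedged construction sketch against a remote Ollama server; the hostname and the commented-out dimensions value are placeholders, and OLLAMA_API_KEY is only the default environment-variable name described above:

import os
from pyseekdb.utils.embedding_functions import OllamaEmbeddingFunction

# Ollama ignores the key, but the underlying OpenAI client expects one;
# if OLLAMA_API_KEY is unset, the function falls back to "ollama".
os.environ.setdefault("OLLAMA_API_KEY", "ollama")

ef = OllamaEmbeddingFunction(
    model_name="nomic-embed-text",
    api_key_env="OLLAMA_API_KEY",
    api_base="http://ollama.example.internal:11434/v1",  # placeholder remote server URL
    # dimensions=256,  # only if the model supports configurable output dimensions
)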

Methods

__init__([model_name, api_key_env, ...])

Initialize OllamaEmbeddingFunction.

build_from_config(config)

Build an embedding function from a configuration dictionary.

get_config()

Get the configuration dictionary for the OpenAIBaseEmbeddingFunction.

name()

Get the unique name identifier for OllamaEmbeddingFunction.

support_persistence(embedding_function)

Check if the embedding function supports persistence.

Attributes

dimension

Get the dimension of embeddings produced by this function.

get_config() dict[str, Any][source]

Get the configuration dictionary for the OpenAIBaseEmbeddingFunction.

Subclasses should override the name() method to provide the correct name for routing.

Returns:

Dictionary containing configuration needed to restore this embedding function
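
A round-trip sketch, assuming build_from_config (listed in the Methods table above) accepts the dictionary returned by get_config and is exposed at class level:

from pyseekdb.utils.embedding_functions import OllamaEmbeddingFunction

ef = OllamaEmbeddingFunction(model_name="nomic-embed-text")

# Serialize the configuration, e.g. for persisting alongside a collection.
config = ef.get_config()

# Rebuild an equivalent embedding function from the saved configuration.
restored = OllamaEmbeddingFunction.build_from_config(config)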

static name() str[source]

Get the unique name identifier for OllamaEmbeddingFunction.

Returns:

The name identifier for this embedding function type