# Using Ollama in LobeChat
Ollama is a powerful framework for running large language models (LLMs) locally, supporting a wide range of models including Llama 2, Mistral, and more. LobeChat now supports integration with Ollama, which means you can easily use the language models provided by Ollama right inside LobeChat.

This document will guide you through using Ollama in LobeChat:
## Using Ollama on macOS

### Local Installation of Ollama

Download Ollama for macOS, then unzip the archive and install the application.
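If you prefer the command line, installing via Homebrew is a possible alternative. This is a sketch assuming you already have Homebrew set up; note that the `ollama` formula installs the CLI and background server rather than the desktop app:

```bash
# Alternative install via Homebrew (assumes Homebrew is already set up;
# installs the CLI/server, not the macOS desktop app)
brew install ollama

# Run the Ollama server as a background service
brew services start ollama

# Verify the installation
ollama --version
```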
### Configure Ollama for Cross-Origin Access
Because Ollama's default configuration restricts access to local requests only, the `OLLAMA_ORIGINS` environment variable must be set to allow cross-origin access (the listening address and port are controlled separately by `OLLAMA_HOST`). Use `launchctl` to set the environment variable:

```bash
launchctl setenv OLLAMA_ORIGINS "*"
```
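After setting the variable, restart the Ollama application so the change takes effect. To double-check that launchd picked up the value, you can run an optional sanity check (assuming default `launchctl` behavior):

```bash
# Confirm the variable is visible to launchd-managed processes
launchctl getenv OLLAMA_ORIGINS
# Expected output: *
```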