OpenAD Tell Me
The "Tell Me" function lets you ask an LLM how to do certain things with OpenAD.
Info
- Supported LLMs: Ollama
- Ollama requires an 8GB GPU.
Ollama Setup
- Install Ollama onto your platform.
- Download the appropriate models.
- Start the server if not already started.
That's it for local usage. If you want to run Ollama remotely, continue below.
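The local steps above can be sketched as a small shell script. This is a sketch under assumptions: the model name (llama3) is only an example, not necessarily the model OpenAD expects, and the `command -v` guard simply keeps the script harmless on machines where Ollama is not installed.

```shell
# Sketch of the local setup steps; llama3 is an assumed model name.
if command -v ollama >/dev/null 2>&1; then
    ollama pull llama3                      # download the model
    nohup ollama serve >/dev/null 2>&1 &    # start the server (default port 11434)
    msg="ollama server starting on port 11434"
else
    msg="ollama is not installed on this machine"
fi
echo "$msg"
```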
Ollama Remote Setup with SkyPilot
- Check out our configuration file to launch Ollama on SkyPilot: ollama_setup.yaml
- Set up local environment variables
- For Windows
  setx OLLAMA_HOST "<sky-server-ip>:11434"
- For Linux and macOS
  export OLLAMA_HOST=<sky-server-ip>:11434
- To reset to local use
  - For Windows
    setx OLLAMA_HOST "0.0.0.0:11434"
  - For Linux and macOS
    export OLLAMA_HOST=0.0.0.0:11434
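For Linux or macOS, the whole remote round trip can be sketched as below. The `sky launch` command is SkyPilot's CLI for launching a task YAML; it is left as a comment here because it only needs to run once. The address 192.0.2.10 is a documentation-only example, not a real server; substitute the IP that SkyPilot reports for your cluster.

```shell
# Launch the server once with SkyPilot (assumes the sky CLI is installed):
#   sky launch -c ollama ollama_setup.yaml
# Then point the local client at the remote server. 192.0.2.10 is a
# documentation-only example address; use the IP that SkyPilot prints.
export OLLAMA_HOST=192.0.2.10:11434
echo "using remote Ollama at $OLLAMA_HOST"

# Switch back to a locally running server:
export OLLAMA_HOST=0.0.0.0:11434
echo "using local Ollama at $OLLAMA_HOST"
```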
Run Ollama
Note: If prompted for an API key and none was set up, just leave the input empty.