habrok:examples:ollama, last revised 2025/09/04 11:47 by camarocico
====== Ollama on Habrok ======
You can run an LLM on Habrok with Ollama in a Jupyter environment by using the [[https://
===== Setting up the virtual environment =====
To use the app, you first need to set up a Python virtual environment. The version of Ollama installed on Habrok is ''
<code>
module load Python/
python3 -m venv $HOME/
</code>
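The virtual-environment step above can also be scripted from Python itself via the standard-library ''venv'' module, should you prefer that to the shell command. This is only a sketch: the throwaway temporary directory stands in for whatever path under ''$HOME'' you actually choose, and it skips the ''module load'' step, which has no Python equivalent.

```python
import os
import tempfile
import venv

# Programmatic equivalent of `python3 -m venv <target-dir>`, using the
# standard-library venv module. A throwaway temporary directory is used
# here for illustration; on Habrok you would pick a path under $HOME.
env_dir = os.path.join(tempfile.mkdtemp(), "ollama-venv")
venv.EnvBuilder(with_pip=True).create(env_dir)

# The new environment contains its own python executable and an
# activation script (bin/activate on Linux).
print(os.path.exists(os.path.join(env_dir, "bin", "activate")))
```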
Once the virtual environment has been built, you need to install ''
<code>
source $HOME/
pip install --upgrade pip
</code>
Finally, to make sure that the Jupyter Notebook is aware of your virtual environment, run:
<code>
python3 -m ipykernel install --user --name=ollama --display-name="
</code>
===== Choosing a folder for the models =====
Another important choice when running the app is where the Ollama models should be saved; there are two options, with advantages and drawbacks:
  * **Custom directory**:
  * **Temporary directory**:
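Whichever option you pick, the Ollama server reads the model location from the ''OLLAMA_MODELS'' environment variable, so the choice comes down to what that variable points at when the server starts. A minimal sketch of setting it from Python, with a hypothetical directory name standing in for your actual choice:

```python
import os

# OLLAMA_MODELS tells the Ollama server where to store and look up models;
# it must be set before the server process starts. The directory below is a
# hypothetical example; substitute a persistent folder in $HOME, or a
# temporary one if you prefer models to be cleaned up after the session.
models_dir = os.path.expanduser("~/ollama-models")
os.makedirs(models_dir, exist_ok=True)
os.environ["OLLAMA_MODELS"] = models_dir

print(os.environ["OLLAMA_MODELS"])
```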
===== Simple usage example =====
To use Ollama in the Jupyter app, first open a new notebook and choose the **Ollama** Jupyter kernel created when you set up the virtual environment. Here is a small example, which first imports the necessary packages:
<code>
import os
import ollama

from openai import OpenAI
</code>
then downloads a model from Ollama:
<code>
ollama.pull("
</code>
and also lists all currently downloaded models:
<code>
for model in ollama.list().models:
    print(model.model)  # each entry's model name
</code>
+ | |||
It then creates an OpenAI API client:
<code>
client = OpenAI(
    base_url=f"
</code>
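The ''base_url'' points the OpenAI client at Ollama's OpenAI-compatible endpoint, which lives under ''/v1'' on the Ollama server (port 11434 by default). A sketch of how that URL can be assembled; whether the Jupyter app exports the host via ''OLLAMA_HOST'' is an assumption here:

```python
import os

# Ollama serves an OpenAI-compatible API under /v1, and 127.0.0.1:11434 is
# its default address. That the app exports OLLAMA_HOST is an assumption;
# fall back to the default when the variable is unset.
host = os.environ.get("OLLAMA_HOST", "127.0.0.1:11434")
base_url = f"http://{host}/v1"

print(base_url)
```

Note that the OpenAI client also requires an ''api_key'' argument; Ollama ignores its value, so any placeholder string works.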
and interacts with the LLM:
<code>
response = client.chat.completions.create(
    model="
print(response.choices[0].message.content)
</code>
+ | |||
The model can, if desired, be deleted:
<code>
ollama.delete("
</code>
You can find more info on how to use the Ollama Python library on their [[https://