====== Ollama on Hábrók ======

You can run an LLM on Hábrók with Ollama in a Jupyter environment by using the [[https://portal.hb.hpc.rug.nl/pun/sys/dashboard/batch_connect/sys/hb-ollama/session_contexts/new|Ollama (Jupyter)]] Interactive App on the [[https://portal.hb.hpc.rug.nl|Web Portal]].

===== Setting up the virtual environment =====

To use the app, you first need to set up a Python virtual environment. The version of Ollama installed on Hábrók is ''ollama/0.6.0-GCCcore-12.3.0'', so the virtual environment needs a Python version built with the same ''GCCcore-12.3.0'' toolchain, namely ''Python/3.11.3-GCCcore-12.3.0''. Other versions of Python might work as well, but toolchain incompatibilities can sometimes cause issues.

<code shell>