habrok:examples:llms [2025/03/05 12:26] camarocico
habrok:examples:llms [2025/07/28 11:23] (current) bob
  ]
}</code>

==== Running Ollama in a jobscript ====

The following code can be used in a jobscript to run an Ollama model:

<code>
# Load the Ollama module
# GPU node
module load ollama/0.6.0-GCCcore-12.3.0-CUDA-12.1.1
# CPU node
# module load ollama/0.6.0-GCCcore-12.3.0

# Use /scratch for storing models
export OLLAMA_MODELS=/scratch/$USER/ollama/models

# Start the Ollama server in the background, logging all its output to ollama-serve.log
ollama serve >& ollama-serve.log &
# Wait a few seconds to make sure that the server has started
sleep 5

# Run the model
echo "Tell me something about Groningen" | ollama run deepseek-r1:14b

# Kill the server process
pkill -u $USER ollama
</code>
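The snippet above is the body of a jobscript, so it needs a batch-scheduler header around it. A minimal SLURM header is sketched below; the partition name, wall time, and resource values are assumptions, so check the cluster documentation for the actual values on your system:

<code bash>
#!/bin/bash
#SBATCH --job-name=ollama-run
#SBATCH --time=00:30:00      # assumption: adjust the wall time to your workload
#SBATCH --partition=gpu      # assumption: name of the GPU partition; omit on CPU nodes
#SBATCH --gpus-per-node=1
#SBATCH --mem=32G            # assumption: adjust host memory to the model size

# Paste the Ollama snippet from this section here.
</code>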
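The fixed ''sleep 5'' may not be long enough on a busy node, and wastes time when the server comes up faster. A more robust sketch polls the server's HTTP endpoint until it answers; ''wait_for_server'' is a helper name introduced here, and 11434 is Ollama's default port:

<code bash>
# wait_for_server URL MAX_TRIES
# Poll URL once per second until it responds or MAX_TRIES attempts are used up.
# Returns 0 if the server answered, 1 on timeout.
wait_for_server() {
    url=$1
    max=$2
    i=0
    while [ "$i" -lt "$max" ]; do
        if curl -sf "$url" > /dev/null 2>&1; then
            return 0
        fi
        i=$((i + 1))
        sleep 1
    done
    return 1
}

# In the jobscript, replace the fixed "sleep 5" with:
# wait_for_server http://localhost:11434/api/version 30 || exit 1
</code>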