]
}</code>

==== Running Ollama in a jobscript ====

The following code can be used in a jobscript to run an Ollama model:
| + | |||
| + | < | ||
| + | # Load the Ollama module | ||
| + | # GPU node | ||
| + | module load ollama/ | ||
| + | # CPU node | ||
| + | # module load ollama/ | ||
| + | |||
| + | # Use /scratch for storing models | ||
| + | export OLLAMA_MODELS=/ | ||
| + | |||
| + | # Start the Ollama server in the background, log all its output to ollama-serve.log | ||
| + | ollama serve >& ollama-serve.log & | ||
| + | # Wait a few seconds to make sure that the server has started | ||
| + | sleep 5 | ||
| + | |||
| + | # Run the model | ||
| + | echo "Tell me something about Groningen" | ||
| + | |||
| + | # Kill the server process | ||
| + | pkill -u $USER ollama | ||
| + | </ | ||