
Bundle

Name: aisoft
Maintainer: LCC Support
Contact: support@lcc.ncbr.muni.cz

Available versions

Default build

  • ollama-models-lcc:1.0:auto:auto

ollama-models-lcc

Ollama models managed by the LCC group.

Typical usage (CLI):

All three commands must be executed in the same working directory, because they share state stored in the ollama-server.host, ollama-server.log, and ollama-server.pid files.
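A quick way to confirm you are in the right directory is to check for the three state files. A minimal sketch (the helper name is hypothetical; the files themselves are the ones named above, written by ollama-server-start):

```shell
# Hypothetical helper: succeed only if all three state files written by
# ollama-server-start are present in the current working directory.
server_state_present() {
  [ -f ollama-server.host ] && [ -f ollama-server.log ] && [ -f ollama-server.pid ]
}
```

If `server_state_present` fails, `ollama-client` and `ollama-server-stop` will not find the running server.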

1) Start the Ollama server

$ module add ollama-models-lcc
$ ollama-server-start

2) Communicate with the server

$ module add ollama-models-lcc
$ ollama-client --help
$ ollama-client list
$ ollama-client run llama3:8b

3) Terminate the Ollama server

$ module add ollama-models-lcc
$ ollama-server-stop
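The three steps above can be wrapped so the server is always stopped after the client command finishes, even when the command fails. A sketch, assuming the module is already loaded (the wrapper function is hypothetical, not part of the module):

```shell
# Sketch: pair server start/stop around a single client invocation.
# Assumes `module add ollama-models-lcc` has already been done.
run_with_ollama() {
  ollama-server-start || return 1
  "$@"                  # e.g. ollama-client run llama3:8b
  status=$?
  ollama-server-stop    # always stop; preserve the client's exit code
  return $status
}
```

Usage: `run_with_ollama ollama-client run llama3:8b`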

Typical usage (JOB):

Job script: run_ollama

#!/usr/bin/env infinity-env

# activate the module
module add ollama-models-lcc

# start the server
ollama-server-start
sleep 10  # wait until the server is ready

# run the prompt
ollama-client run llama3:8b "Summarize this file: $(cat file.txt)"

# terminate the server
ollama-server-stop
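The fixed `sleep 10` in the script above can be replaced by a short readiness poll. A sketch, assuming `ollama-client list` exits non-zero while the server is still starting (the polling function is hypothetical):

```shell
# Sketch: poll until the server responds, instead of sleeping a fixed time.
# Assumption: `ollama-client list` fails until the server is up.
wait_for_server() {
  tries=0
  while ! ollama-client list >/dev/null 2>&1; do
    tries=$((tries + 1))
    [ "$tries" -ge 30 ] && return 1   # give up after ~30 seconds
    sleep 1
  done
  return 0
}
```

In the job script, `sleep 10` would then become `wait_for_server || exit 1`.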

Submit the job:

$ psubmit default run_ollama ngpus=1 ncpus=24 mem=110gb