# rkllm server

Hosts a simple Flask-based chat interface to an RKLLM model at `localhost:8080`.
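A client can talk to the interface over plain HTTP. The sketch below is a hypothetical example, not taken from the server code: the `/chat` endpoint path and the `{"prompt": ...}` JSON schema are assumptions, so check the actual Flask routes before using it.

```python
# Hypothetical client sketch -- the endpoint path ("/chat") and the JSON
# payload schema ({"prompt": ...}) are assumptions, not read from the server.
import json
import urllib.request

SERVER = "http://localhost:8080"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request carrying the chat prompt as a JSON body."""
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        SERVER + "/chat",  # endpoint name is an assumption
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_request("Hello, what can you do?")
    # Requires the server to be running on localhost:8080.
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))
```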

## Install

On an RK3588 system:

```sh
git clone "<repo>/rkllm_server"
cd rkllm_server
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
deactivate
```

You can now start the server with:

```sh
bash ./start_server.sh "/path/to/model.rkllm"
```

The first time after each boot, the script will ask for a sudo password so it can raise the NPU clock frequency (see [fix_freq_rk3588.sh]).