rkllm server
Hosts a simple Flask-based chat interface to an RKLLM model at localhost:8080.
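Once the server is running, you can talk to it over HTTP. A minimal client sketch follows; the endpoint path (`/rkllm_chat`) and the JSON message schema are assumptions here, so check flask_server.py for the actual route and payload format.

```python
# Minimal client sketch for the chat server.
# ASSUMPTIONS: the endpoint path "/rkllm_chat" and the {"messages": [...]}
# payload shape are guesses -- verify against flask_server.py.
import json
import urllib.request

SERVER = "http://localhost:8080"

def build_request(prompt, endpoint="/rkllm_chat"):
    """Build a POST request carrying a single-turn chat payload (assumed schema)."""
    payload = {"messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        SERVER + endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_request("Hello, what can you do?")
    # Requires the server to be up on localhost:8080.
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))
```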
Install
On an RK3588 (or RK3576) system:
git clone "<repo>/rkllm_server"
cd rkllm_server
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
deactivate
You can now start the server with:
bash ./start_server.sh "/path/to/model.rkllm"
The first time after each boot, it will ask for a sudo password to fix the NPU frequency (see fix_freq_rk3588.sh, or fix_freq_rk3576.sh on RK3576 boards).
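Scripts like this typically pin the NPU's devfreq governor so the clock does not idle down during inference, which is why they need root. A rough sketch of the idea is below; the sysfs path shown is an assumption for RK3588, and the real nodes touched are in fix_freq_rk3588.sh.

```shell
# Sketch of a frequency fix: force the NPU devfreq governor to "performance".
# ASSUMPTION: the devfreq node path below is a guess for RK3588 boards;
# consult fix_freq_rk3588.sh for the nodes the project actually writes.
NPU_DEVFREQ="/sys/class/devfreq/fdab0000.npu"

if [ -d "$NPU_DEVFREQ" ]; then
    # tee is used so the redirect itself runs with root privileges
    echo performance | sudo tee "$NPU_DEVFREQ/governor"
else
    echo "NPU devfreq node not found at $NPU_DEVFREQ (path may differ on your SoC)" >&2
fi
```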