Releases: abetlen/llama-cpp-python
v0.2.77-cu122
feat: adding `rpc_servers` parameter to `Llama` class (#1477)
* passthru `rpc_servers` params (wip)
* enable llama rpc by default
* convert string to byte
* add rpc package
* Revert "enable llama rpc by default" (reverts commit 832c6dd56c979514cec5df224bf2d2014dccd790)
* update readme
* Only set `rpc_servers` when provided
* Add rpc servers to server options

Co-authored-by: Andrei Betlen <[email protected]>
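The changelog above only names the new `rpc_servers` parameter, so as a minimal sketch: the value is assumed to be a comma-separated string of `host:port` endpoints that the backend splits into individual RPC server addresses (the `split_rpc_servers` helper and the endpoint addresses below are illustrative, not part of the library).

```python
# Hypothetical illustration of the `rpc_servers` value format from #1477.
# Assumption: the parameter takes a comma-separated "host:port,host:port"
# string; this helper mimics how such a string would be split.

def split_rpc_servers(rpc_servers: str) -> list[str]:
    # Split on commas, trim whitespace, and drop empty entries.
    return [s.strip() for s in rpc_servers.split(",") if s.strip()]

endpoints = split_rpc_servers("192.168.1.10:50052, 192.168.1.11:50052")
print(endpoints)  # ['192.168.1.10:50052', '192.168.1.11:50052']
```

Per the "Only set rpc_servers when provided" item above, omitting the parameter entirely (rather than passing an empty string) appears to be the intended way to disable RPC.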
v0.2.77-cu121
feat: adding `rpc_servers` parameter to `Llama` class (#1477)
* passthru `rpc_servers` params (wip)
* enable llama rpc by default
* convert string to byte
* add rpc package
* Revert "enable llama rpc by default" (reverts commit 832c6dd56c979514cec5df224bf2d2014dccd790)
* update readme
* Only set `rpc_servers` when provided
* Add rpc servers to server options

Co-authored-by: Andrei Betlen <[email protected]>
v0.2.77
Merge branch 'main' of https://github.com/abetlen/llama-cpp-python in…
v0.2.76-metal
chore: Bump version
v0.2.76-cu124
chore: Bump version
v0.2.76-cu123
chore: Bump version
v0.2.76-cu122
chore: Bump version
v0.2.76-cu121
chore: Bump version
v0.2.76
chore: Bump version
v0.2.75-metal
chore: Bump version