add FP8 support to gguf/llama: #17215

windows-latest-cmake (avx-x64, -DGGML_NATIVE=OFF -DLLAMA_BUILD_SERVER=ON -DGGML_RPC=ON -DGGML_AVX...

succeeded Nov 30, 2024 in 5m 42s
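
A minimal sketch of reproducing this CI configuration locally, assuming a checkout of llama.cpp; only the CMake options visible in the truncated job name above are included, and the remaining -DGGML_AVX... flags are left out because they are cut off in the source:

    # configure with the flags shown in the job name (remaining AVX flags omitted, as they are truncated above)
    cmake -B build -DGGML_NATIVE=OFF -DLLAMA_BUILD_SERVER=ON -DGGML_RPC=ON
    # build in Release mode, matching a typical CI run
    cmake --build build --config Release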