Actions: Adriankhl/llama.cpp

Python check requirements.txt

7 workflow runs

server : fix JSON-Scheme typo (#7975)
Python check requirements.txt #7: Commit 6a2f298 pushed by Adriankhl
Branch master, June 23, 2024 16:24, 3m 26s

un-ignore build-info.cmake and build-info.sh (#7996)
Python check requirements.txt #6: Commit a785474 pushed by Adriankhl
Branch master, June 19, 2024 23:38, 3m 29s

vulkan: select only one device for single gpu with multiple drivers
Python check requirements.txt #5: Commit 4bc3a79 pushed by Adriankhl
Branch fix-multiple-vullkan-driver, June 8, 2024 13:10, 3m 25s

vulkan: select only one device for single gpu with multiple drivers
Python check requirements.txt #4: Commit 243b5ef pushed by Adriankhl
Branch fix-multiple-vullkan-driver, May 29, 2024 02:09, 3m 24s

ggml: run llama_get_device_count before llama_default_buffer_type_off…
Python check requirements.txt #3: Commit bd00902 pushed by Adriankhl
Branch fix_vulkan_device, May 27, 2024 02:27, 3m 26s

vulkan: fix MSVC debug build by adding the _ITERATOR_DEBUG_LEVEL=0 de…
Python check requirements.txt #2: Commit 4ed6a96 pushed by Adriankhl
Branch fix-msvc-vulkan-debug, May 21, 2024 01:41, 3m 32s

vulkan: fix ggml_soft_max_ext parameter
Python check requirements.txt #1: Commit 7742d72 pushed by Adriankhl
Branch khl, May 12, 2024 11:52, 4m 30s