Describe the bug
I changed my models directory to an external drive (/mnt...). It was set to /run/user/1000/doc/b55597ff/models, and since I copied over the old models directory that already contained models, it seemed to run fine. I then tried downloading another model, and it consistently fails with the following error:
open /run/user/1000/doc/b55597ff/models/blobs/sha256-6e9f90f02bb3b39b59e81916e8cfce9deb45aeaeb9a54a5be4414486b907dc1e-partial-0: permission denied
Expected behavior
I should be able to download models normally.
Screenshots
Debugging information
Couldn't find '/home/pop-user/.ollama/id_ed25519'. Generating new private key.
Your new public key is:
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGX2YeZDYJThAXyS+PGai/5hCjgVNynojBXa/EQQh1PH
2025/02/01 19:03:16 routes.go:1187: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11435 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/run/user/1000/doc/b55597ff/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-02-01T19:03:16.037+01:00 level=INFO source=images.go:432 msg="total blobs: 19"
time=2025-02-01T19:03:16.043+01:00 level=INFO source=images.go:439 msg="total unused blobs removed: 0"
[GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
- using env: export GIN_MODE=release
- using code: gin.SetMode(gin.ReleaseMode)
[GIN-debug] POST /api/pull --> github.com/ollama/ollama/server.(*Server).PullHandler-fm (5 handlers)
[GIN-debug] POST /api/generate --> github.com/ollama/ollama/server.(*Server).GenerateHandler-fm (5 handlers)
[GIN-debug] POST /api/chat --> github.com/ollama/ollama/server.(*Server).ChatHandler-fm (5 handlers)
[GIN-debug] POST /api/embed --> github.com/ollama/ollama/server.(*Server).EmbedHandler-fm (5 handlers)
[GIN-debug] POST /api/embeddings --> github.com/ollama/ollama/server.(*Server).EmbeddingsHandler-fm (5 handlers)
[GIN-debug] POST /api/create --> github.com/ollama/ollama/server.(*Server).CreateHandler-fm (5 handlers)
[GIN-debug] POST /api/push --> github.com/ollama/ollama/server.(*Server).PushHandler-fm (5 handlers)
[GIN-debug] POST /api/copy --> github.com/ollama/ollama/server.(*Server).CopyHandler-fm (5 handlers)
[GIN-debug] DELETE /api/delete --> github.com/ollama/ollama/server.(*Server).DeleteHandler-fm (5 handlers)
[GIN-debug] POST /api/show --> github.com/ollama/ollama/server.(*Server).ShowHandler-fm (5 handlers)
[GIN-debug] POST /api/blobs/:digest --> github.com/ollama/ollama/server.(*Server).CreateBlobHandler-fm (5 handlers)
[GIN-debug] HEAD /api/blobs/:digest --> github.com/ollama/ollama/server.(*Server).HeadBlobHandler-fm (5 handlers)
[GIN-debug] GET /api/ps --> github.com/ollama/ollama/server.(*Server).PsHandler-fm (5 handlers)
[GIN-debug] POST /v1/chat/completions --> github.com/ollama/ollama/server.(*Server).ChatHandler-fm (6 handlers)
[GIN-debug] POST /v1/completions --> github.com/ollama/ollama/server.(*Server).GenerateHandler-fm (6 handlers)
[GIN-debug] POST /v1/embeddings --> github.com/ollama/ollama/server.(*Server).EmbedHandler-fm (6 handlers)
[GIN-debug] GET /v1/models --> github.com/ollama/ollama/server.(*Server).ListHandler-fm (6 handlers)
[GIN-debug] GET /v1/models/:model --> github.com/ollama/ollama/server.(*Server).ShowHandler-fm (6 handlers)
[GIN-debug] GET / --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func1 (5 handlers)
[GIN-debug] GET /api/tags --> github.com/ollama/ollama/server.(*Server).ListHandler-fm (5 handlers)
[GIN-debug] GET /api/version --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
[GIN-debug] HEAD / --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func1 (5 handlers)
[GIN-debug] HEAD /api/tags --> github.com/ollama/ollama/server.(*Server).ListHandler-fm (5 handlers)
[GIN-debug] HEAD /api/version --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
time=2025-02-01T19:03:16.047+01:00 level=INFO source=routes.go:1238 msg="Listening on 127.0.0.1:11435 (version 0.5.7)"
time=2025-02-01T19:03:16.047+01:00 level=INFO source=routes.go:1267 msg="Dynamic LLM libraries" runners="[cuda_v11_avx cuda_v12_avx rocm_avx cpu cpu_avx cpu_avx2]"
time=2025-02-01T19:03:16.047+01:00 level=INFO source=gpu.go:226 msg="looking for compatible GPUs"
time=2025-02-01T19:03:16.291+01:00 level=INFO source=types.go:131 msg="inference compute" id=GPU-5d96a2f4-611f-dece-fac6-1f0abfa14589 library=cuda variant=v11 compute=8.6 driver=0.0 name="" total="7.7 GiB" available="7.0 GiB"
[GIN] 2025/02/01 - 19:03:16 | 200 | 9.94787ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2025/02/01 - 19:03:16 | 200 | 29.68735ms | 127.0.0.1 | POST "/api/show"
[GIN] 2025/02/01 - 19:03:16 | 200 | 28.43901ms | 127.0.0.1 | POST "/api/show"
[GIN] 2025/02/01 - 19:03:16 | 200 | 38.886373ms | 127.0.0.1 | POST "/api/show"
[GIN] 2025/02/01 - 19:03:16 | 200 | 39.399519ms | 127.0.0.1 | POST "/api/show"
[GIN] 2025/02/01 - 19:03:47 | 200 | 1.631504234s | 127.0.0.1 | POST "/api/pull"
[GIN] 2025/02/01 - 19:09:56 | 200 | 1.153713002s | 127.0.0.1 | POST "/api/pull"
![Image](https://private-user-images.githubusercontent.com/68170410/408817586-cffb5130-4208-4c67-a4ad-56e73369106d.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkxNjkxMjIsIm5iZiI6MTczOTE2ODgyMiwicGF0aCI6Ii82ODE3MDQxMC80MDg4MTc1ODYtY2ZmYjUxMzAtNDIwOC00YzY3LWE0YWQtNTZlNzMzNjkxMDZkLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTAyMTAlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwMjEwVDA2MjcwMlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTEyMWUyY2M1Zjg5N2YxZjUxN2E1ZmJlMDk5YTBkYjE0Y2E3NmY4ODkyNzVlMjBkMzA0ZjczNWU0MzgzMDNiMmUmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.pmgKsRtTx22sEAwIThk7NfhUgE9web9vF-F9BKOZcyo)
![Image](https://private-user-images.githubusercontent.com/68170410/408817600-18d4c698-2430-4cb3-82c3-10d7a6fb5422.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MzkxNjkxMjIsIm5iZiI6MTczOTE2ODgyMiwicGF0aCI6Ii82ODE3MDQxMC80MDg4MTc2MDAtMThkNGM2OTgtMjQzMC00Y2IzLTgyYzMtMTBkN2E2ZmI1NDIyLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTAyMTAlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwMjEwVDA2MjcwMlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTQxYzIyNTQyZTcyMTgzODQxZjU4ZWFmMGYxZjFkYTBhMzY2MjVkMjAzMDVlNjU5ODUzNDU1YTczOWQ0MTA3NjYmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.625AnU4WmaNIZ6P_F0Ib6IqNhPv5-FgaRWJdpL8n6nI)