[BUG] stream:true error: ValueError: Unsupported vector type <class 'NoneType'> #2149
Labels: bug
Description
I'm trying to stream the answer from the local model to a simple web page.
The following works, but only without stream: true:
async function sendMessage() {
  const userText = chatInput.value.trim();
  if (!userText) return;
  chatInput.value = '';
  chatLog.push({ role: 'user', content: userText });
  renderChat();
}
I get the correct answers from my PDS-Files.
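For completeness, the snippet above omits the actual API call. A minimal non-streaming request might look like this (the endpoint URL, port, and the use_context flag are assumptions based on the PrivateGPT OpenAI-style API; adjust to your setup):

```javascript
// Hypothetical sketch: endpoint URL and port are assumptions, not confirmed
// from the issue. Works with stream: false (a single JSON response).
function buildChatRequest(messages, stream) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // use_context: true asks the server to retrieve from the ingested files
    body: JSON.stringify({ messages, use_context: true, stream }),
  };
}

async function askPrivateGpt(messages) {
  const response = await fetch(
    "http://localhost:8001/v1/chat/completions",
    buildChatRequest(messages, false)
  );
  const data = await response.json();
  // Non-streaming responses carry the full answer in one message
  return data.choices[0].message.content;
}
```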
But when I use stream: true, I get the following error:

03:57:59.483 [ERROR ] uvicorn.error - Exception in ASGI application
Traceback (most recent call last):
File "/home/x/venv/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
raise exc
File "/home/x/venv/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
await self.simple_response(scope, receive, send, request_headers=headers)
File "/home/x/venv/lib/python3.11/site-packages/starlette/middleware/cors.py", line 144, in simple_response
await self.app(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "/home/x/venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "/home/x/venv/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
await self.middleware_stack(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
await route.handle(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/home/x/venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "/home/x/venv/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "/home/x/venv/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
response = await f(request)
^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/fastapi/routing.py", line 214, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2405, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 914, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/private-gpt/private_gpt/server/chat/chat_router.py", line 95, in chat_completion
completion_gen = service.stream_chat(
^^^^^^^^^^^^^^^^^^^^
File "/home/x/private-gpt/private_gpt/server/chat/chat_service.py", line 175, in stream_chat
streaming_response = chat_engine.stream_chat(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 265, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 41, in wrapper
return func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/chat_engine/context.py", line 237, in stream_chat
nodes = self._get_nodes(message)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/chat_engine/context.py", line 133, in _get_nodes
nodes = self._retriever.retrieve(message)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 265, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/base/base_retriever.py", line 245, in retrieve
nodes = self._retrieve(query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 265, in wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 103, in _retrieve
return self._get_nodes_with_embeddings(query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 180, in _get_nodes_with_embeddings
query_result = self._vector_store.query(query, **self._kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/llama_index/vector_stores/qdrant/base.py", line 836, in query
response = self._client.search(
^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/qdrant_client/qdrant_client.py", line 387, in search
return self._client.search(
^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/qdrant_client/local/qdrant_local.py", line 204, in search
return collection.search(
^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/qdrant_client/local/local_collection.py", line 519, in search
name, query_vector = self._resolve_query_vector_name(query_vector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/x/venv/lib/python3.11/site-packages/qdrant_client/local/local_collection.py", line 321, in _resolve_query_vector_name
raise ValueError(f"Unsupported vector type {type(query_vector)}")
ValueError: Unsupported vector type <class 'NoneType'>
I think I'm doing something wrong, but what?
Please note: when I use use_context: false together with stream: true, it works and streams the output perfectly.
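For reference, the streaming variant I'm trying to get working looks roughly like this (the endpoint URL, the OpenAI-style "data:" SSE framing, and the DOM names chatInput, chatLog, and renderChat are assumptions carried over from the snippet above):

```javascript
// Hypothetical sketch: parse one Server-Sent-Events chunk from an
// OpenAI-style streaming response and return the text deltas it contains.
function extractDeltas(chunk) {
  const deltas = [];
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice(6).trim();
    if (payload === "[DONE]") continue; // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) deltas.push(delta);
  }
  return deltas;
}

async function sendMessageStreaming() {
  const userText = chatInput.value.trim();
  if (!userText) return;
  chatInput.value = "";
  chatLog.push({ role: "user", content: userText });

  // Endpoint and flags are assumptions; use_context: true is the case that fails.
  const response = await fetch("http://localhost:8001/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages: chatLog, use_context: true, stream: true }),
  });

  // Append deltas to a growing assistant message as chunks arrive.
  chatLog.push({ role: "assistant", content: "" });
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const delta of extractDeltas(decoder.decode(value, { stream: true }))) {
      chatLog[chatLog.length - 1].content += delta;
    }
    renderChat();
  }
}
```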
Steps to Reproduce
1. Call the function above with stream: true.
Expected Behavior
The answer is streamed to the web page.
Actual Behavior
The ValueError above is raised instead.
Environment
Linux, NVIDIA GTX 1660
Additional Information
No response
Version
latest as of now
Setup Checklist
NVIDIA GPU Setup Checklist
- nvidia-smi works on the host (run nvidia-smi to verify)
- Docker can access the GPU (sudo docker run --rm --gpus all nvidia/cuda:11.0.3-base-ubuntu20.04 nvidia-smi)