I successfully installed pipx and poetry, and installed private-gpt. I also have PostgreSQL 16 up and running with vector-database support (via pgvector).
I have already put Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf inside the models folder.
Unfortunately there is very little documentation on how to configure settings.yaml and settings-local.yaml on Windows.
I get the error below when running the make run command.
Any help diagnosing the error would be much appreciated.
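For reference, the llamacpp section I am trying to get right in settings-local.yaml looks roughly like this (key names copied from the bundled settings.yaml of the version I installed; I am not certain they are identical across versions, so verify against your own file):

```yaml
llm:
  mode: llamacpp
llamacpp:
  # Assumed key name; check the llamacpp section of the bundled settings.yaml.
  # The file is expected under the models/ folder of the directory
  # you run `make run` from.
  llm_hf_model_file: Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf
```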
14:25:49.197 [INFO ] private_gpt.settings.settings_loader - Starting application with profiles=['default']
14:25:55.691 [INFO ] private_gpt.components.llm.llm_component - Initializing the LLM in mode=llamacpp
14:25:55.710 [WARNING ] py.warnings - E:\anaconda\envs\privategpt\Lib\site-packages\pydantic\_internal\_fields.py:132: UserWarning: Field "model_url" in LlamaCPP has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
14:25:55.710 [WARNING ] py.warnings - E:\anaconda\envs\privategpt\Lib\site-packages\pydantic\_internal\_fields.py:132: UserWarning: Field "model_path" in LlamaCPP has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
14:25:55.710 [WARNING ] py.warnings - E:\anaconda\envs\privategpt\Lib\site-packages\pydantic\_internal\_fields.py:132: UserWarning: Field "model_kwargs" in LlamaCPP has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
Traceback (most recent call last):
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 800, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.ui.ui.PrivateGptUi'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 800, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.server.ingest.ingest_service.IngestService'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 800, in get
return self._context[key]
~~~~~~~~~~~~~^^^^^
KeyError: <class 'private_gpt.components.llm.llm_component.LLMComponent'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "E:\gpt\privategpt\private_gpt\__main__.py", line 5, in <module>
from private_gpt.main import app
File "E:\gpt\privategpt\private_gpt\main.py", line 6, in <module>
app = create_app(global_injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\gpt\privategpt\private_gpt\launcher.py", line 66, in create_app
ui = root_injector.get(PrivateGptUi)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 976, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 802, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 813, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1000, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1041, in call_with_injection
dependencies = self.args_to_inject(
^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1089, in args_to_inject
instance: Any = self.get(interface)
^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 976, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 802, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 813, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1000, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1041, in call_with_injection
dependencies = self.args_to_inject(
^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1089, in args_to_inject
instance: Any = self.get(interface)
^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 976, in get
provider_instance = scope_instance.get(interface, binding.provider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 91, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 802, in get
instance = self._get_instance(key, provider, self.injector)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 813, in _get_instance
return provider.get(injector)
^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 264, in get
return injector.create_object(self._cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1000, in create_object
self.call_with_injection(init, self_=instance, kwargs=additional_kwargs)
File "E:\anaconda\envs\privategpt\Lib\site-packages\injector\__init__.py", line 1050, in call_with_injection
return callable(*full_args, **dependencies)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\gpt\privategpt\private_gpt\components\llm\llm_component.py", line 63, in __init__
self.llm = LlamaCPP(
^^^^^^^^^
File "E:\anaconda\envs\privategpt\Lib\site-packages\llama_index\llms\llama_cpp\base.py", line 157, in __init__
raise ValueError(
ValueError: Provided model path does not exist. Please check the path or provide a model_url to download.
make: *** [Makefile:36: run] Error 1
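The ValueError at the bottom is the root cause: the LlamaCPP wrapper was handed a model path that does not exist on disk (the chain of injector KeyErrors above it is just dependency-injection machinery failing while constructing LLMComponent). A quick sanity check, independent of PrivateGPT, can confirm whether the file is where the config expects it. This is only a diagnostic sketch; the models folder and filename below are the ones from the question, so adjust them to match your settings-local.yaml:

```python
from pathlib import Path


def check_model(models_dir: str, filename: str) -> bool:
    """Report whether the .gguf file the config points at is actually on disk."""
    path = Path(models_dir) / filename
    if path.is_file():
        print(f"Found: {path.resolve()}")
        return True
    print(f"Missing: {path.resolve()}")
    # Common Windows pitfalls: running `make run` from a different working
    # directory than the repo root, or the browser having saved the file
    # with an extra hidden extension.
    if Path(models_dir).is_dir():
        for candidate in Path(models_dir).glob("*.gguf*"):
            print(f"  did you mean: {candidate.name}?")
    return False


# Run this from the same directory you invoke `make run` in:
check_model("models", "Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf")
```

If the script prints "Missing", either move the .gguf file to the printed path or point the settings file at its real location.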
andreiramani changed the title from "[QUESTION] private-gpt in Windows 11 with Anaconda" to "[QUESTION] private-gpt in Windows 11 with Anaconda using local .gguf model" on Feb 4, 2025.