Frequently Asked Questions

Where are the default directories for models and settings?

Windows

  • Settings directory: C:\Users\%USERNAME%\AppData\Roaming\nomic.ai
  • Models directory: C:\Users\%USERNAME%\AppData\Local\nomic.ai\GPT4All

Mac

  • Settings directory: /Users/{username}/.config/gpt4all.io
  • Models directory: /Users/{username}/Library/Application Support/nomic.ai/GPT4All

Linux

  • Settings directory: /home/{username}/.config/nomic.ai
  • Models directory: /home/{username}/.local/share/nomic.ai/GPT4All
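
If you want to check these locations programmatically, the small Python sketch below simply mirrors the paths listed above. It assumes a default installation; if you changed the download path in Settings, that setting takes precedence.

```python
import os
import platform

# Default GPT4All directories, as listed in this FAQ.
# These are assumptions for a default install; a custom download
# path set in the application overrides the models directory.
home = os.path.expanduser("~")
system = platform.system()

if system == "Windows":
    settings_dir = os.path.join(os.environ["APPDATA"], "nomic.ai")
    models_dir = os.path.join(os.environ["LOCALAPPDATA"], "nomic.ai", "GPT4All")
elif system == "Darwin":  # macOS
    settings_dir = os.path.join(home, ".config", "gpt4all.io")
    models_dir = os.path.join(home, "Library", "Application Support", "nomic.ai", "GPT4All")
else:  # Linux
    settings_dir = os.path.join(home, ".config", "nomic.ai")
    models_dir = os.path.join(home, ".local", "share", "nomic.ai", "GPT4All")

print("Settings directory:", settings_dir)
print("Models directory:  ", models_dir)
```
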
What are these model settings, and what do they do?
  • Temperature: This controls the randomness of predictions; lower values make the model more deterministic, while higher values increase randomness.

  • Top K: This limits the sampling pool to the most probable tokens. For example, if K=50, only the 50 most likely tokens are considered for the next word prediction.

  • Top P: The model looks at all possible next tokens and keeps the smallest group of tokens whose combined probability is at least this value. For instance, a setting of "1" will include 100% of all probable tokens. If P=0.9, it keeps the fewest tokens whose combined probability is at least 90%. The closer this value is set to 0, the fewer tokens are included in the set the model samples from.

  • Min P: This sets a minimum probability threshold for individual tokens; tokens below the threshold are dropped, and the remaining tokens are renormalized to a combined probability of 100%. A setting of "1" keeps only a single token with a probability of 100%. A much lower setting like P=0.05 keeps only the tokens whose individual probability is greater than 5%. (The sketch after this list shows how these filters combine.)
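
To make these filters concrete, here is a rough Python sketch of how a sampler could apply them to a toy next-token distribution. This is not GPT4All's actual implementation (that lives in the llama.cpp backend); the function name and the example logits are made up for illustration.

```python
import numpy as np

def sample_next_token(logits, temperature=0.7, top_k=40, top_p=0.9, min_p=0.05):
    """Toy illustration of the four settings above; not GPT4All's real code."""
    # Temperature: scale logits before softmax; lower = more deterministic.
    probs = np.exp(logits / temperature)
    probs /= probs.sum()

    order = np.argsort(probs)[::-1]   # most likely tokens first
    probs = probs[order]

    # Top K: keep only the K most probable tokens.
    keep = np.zeros_like(probs, dtype=bool)
    keep[:top_k] = True

    # Top P: keep the smallest prefix whose cumulative probability >= top_p.
    cutoff = np.searchsorted(np.cumsum(probs), top_p) + 1
    keep[cutoff:] = False

    # Min P: drop tokens whose individual probability is below min_p.
    keep &= probs >= min_p

    # Renormalize the surviving tokens to a combined probability of 100%.
    probs = np.where(keep, probs, 0.0)
    probs /= probs.sum()
    return order[np.random.choice(len(probs), p=probs)]

# Example: a made-up 6-token vocabulary with arbitrary logits.
print(sample_next_token(np.array([2.0, 1.5, 0.5, 0.1, -1.0, -2.0])))
```
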

Experience how settings like Temperature, Top K, Top P, and Min P change model behavior in this live example.

It is running slowly; what can I do about that?

Find the right number of GPU layers in the model settings. If you have a small amount of GPU memory, start low and increase the layer count until the model won't load, then go back to the last setting that worked (see the example after the list below). Also make sure the model has GPU support.

  • Vulkan supports f16, Q4_0, and Q4_1 models on the GPU. (Some models won't have any GPU support.)
  • CUDA supports all GGUF formats. (Some models won't have any GPU support.)

Ensure you are using the GPU if you have one: see "Settings > Application : Device" and make sure it is set to use either Vulkan or CUDA.
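
As a rough sketch of the same idea from code, the gpt4all Python bindings expose a device selector and a GPU layer count when loading a model. The model filename below is only an example, and the parameter names (device, ngl, temp, top_k, top_p, min_p) follow the Python bindings at the time of writing and may differ between versions.

```python
from gpt4all import GPT4All

# Example model filename; substitute one you have downloaded.
# "device" picks the backend (e.g. "cpu" or "gpu"), and "ngl" sets how many
# layers to offload to the GPU; lower it if the model fails to load in
# your GPU memory, then keep the last value that worked.
model = GPT4All(
    "Meta-Llama-3-8B-Instruct.Q4_0.gguf",
    device="gpu",
    ngl=32,
)

with model.chat_session():
    reply = model.generate(
        "Why is the sky blue?",
        max_tokens=200,
        temp=0.7,    # temperature
        top_k=40,
        top_p=0.9,
        min_p=0.05,
    )
    print(reply)
```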