Attention Interpolation
ljleb edited this page Jan 28, 2023 · 1 revision
Attention interpolation extends the attention syntax `(...)` in the prompt to shift the attention of the enclosed words over time. The syntax is:

`(prompt:begin,end)`
Here, `begin` is how much to weight the enclosed text at the beginning of the sampling process, and `end` is how much to weight it at the end of the sampling process.

Note that the interpolation always starts weighting with `begin` and ends with `end`, regardless of whether it is located inside prompt editing. I think this makes the prompt easier to maintain when tweaking it.
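The shift over time can be thought of as a linear interpolation between `begin` and `end` across the sampling steps. A minimal Python sketch of the assumed schedule (the function name and exact endpoint handling are illustrative, not taken from the extension's source):

```python
def attention_at(begin: float, end: float, step: int, total_steps: int) -> float:
    """Linearly interpolate the attention weight from `begin` at the
    first step to `end` at the last step (assumed linear schedule)."""
    if total_steps <= 1:
        return end
    t = step / (total_steps - 1)  # 0.0 at step 0, 1.0 at the last step
    return begin + (end - begin) * t

# (word:1.0,2.0) with 20 sampler steps:
attention_at(1.0, 2.0, 0, 20)   # 1.0 at step 0
attention_at(1.0, 2.0, 19, 20)  # 2.0 at step 19
```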
Cheat sheet (assuming 20 sampler steps):
- `a (word:1.0,2.0)`: interpolate between attention of 1.0 and 2.0 over 20 steps. At step 0 the attention is 1.0, and at step 19 it is 2.0
- `a [(word:1.0,2.0)::4]`: same as above, but spans 5 steps instead (steps 0 to 4). Attention is 1.0 at step 0 and 2.0 at step 4
- `a [[(word:1.0,2.0):4]::9]`: same as above, but starts at step 5 and spans 5 steps in total. Attention is 1.0 at step 5 and 2.0 at step 9
- `a [[(word:1.0,2.0):4]::14]`: same as above, but spans 10 steps instead. Attention is 2.0 at step 14 instead of step 9
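When the attention block sits inside prompt editing, the interpolation is confined to the step window in which the enclosed text is active. A hedged sketch that reproduces the cheat-sheet numbers for `a [[(word:1.0,2.0):4]::9]` (the window endpoints are read off the examples above; the function itself is illustrative, not the extension's implementation):

```python
def windowed_attention(begin, end, step, first_step, last_step):
    """Attention weight when interpolation runs only over the
    steps [first_step, last_step] (assumed linear schedule)."""
    if step < first_step or step > last_step:
        return None  # enclosed text is inactive outside the window
    if last_step == first_step:
        return end
    t = (step - first_step) / (last_step - first_step)
    return begin + (end - begin) * t

# [[(word:1.0,2.0):4]::9] — active from step 5 through step 9:
for step in range(5, 10):
    print(step, windowed_attention(1.0, 2.0, step, 5, 9))
# prints 1.0, 1.25, 1.5, 1.75, 2.0 for steps 5..9
```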