### `examples/server/README.md` — 1 addition, 5 deletions
```diff
@@ -424,8 +424,6 @@ node index.js
 
 `frequency_penalty`: Repeat alpha frequency penalty. Default: `0.0`, which is disabled.
 
-`penalty_prompt`: This will replace the `prompt` for the purpose of the penalty evaluation. Can be either `null`, a string or an array of numbers representing tokens. Default: `null`, which is to use the original `prompt`.
-
 `mirostat`: Enable Mirostat sampling, controlling perplexity during text generation. Default: `0`, where `0` is disabled, `1` is Mirostat, and `2` is Mirostat 2.0.
 
 `mirostat_tau`: Set the Mirostat target entropy, parameter tau. Default: `5.0`
```
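As a sketch of how the sampling parameters above are used, here is a hypothetical request body for the server's `/completion` endpoint (the prompt and defaults are taken from the documentation in this diff; sending it to a live server, and its port, are assumptions):

```python
import json

# Hypothetical /completion request body; parameter names and defaults
# follow the sampler documentation above.
payload = {
    "prompt": "Say hello to llama.cpp",
    "frequency_penalty": 0.0,  # 0.0 disables the repeat alpha frequency penalty
    "mirostat": 2,             # 0 = off, 1 = Mirostat, 2 = Mirostat 2.0
    "mirostat_tau": 5.0,       # Mirostat target entropy (tau)
}

# The JSON-encoded body could then be POSTed to e.g. http://localhost:8080/completion
body = json.dumps(payload)
print(body)
```

Note that after this change, `penalty_prompt` is no longer accepted as a field of this request body.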
```diff
@@ -672,7 +670,6 @@ Given a ChatML-formatted json description in `messages`, it returns the predicted completion.
         "stopping_word": ""
     },
     "penalize_nl": true,
-    "penalty_prompt_tokens": [],
     "presence_penalty": 0.0,
     "prompt": "Say hello to llama.cpp",
     "repeat_last_n": 64,
@@ -696,8 +693,7 @@ Given a ChatML-formatted json description in `messages`, it returns the predicted completion.
```