Token length exceed
Hi, for my models I keep getting this error. Please help me fix it:
‘This model’s maximum context length is 2049 tokens, however you requested 4044 tokens (44 in your prompt; 4000 for the completion). Please reduce your prompt; or completion length.’
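The error means the prompt tokens plus the requested completion tokens must fit inside the model's context window. A minimal sketch of the fix, using the numbers from the error message above (the context limit of 2049 and the 44 prompt tokens; in practice the prompt token count would come from a tokenizer such as tiktoken, not shown here):

```python
# Cap the requested completion so prompt + completion fits within
# the model's context window. Values taken from the error message.
MODEL_CONTEXT_LIMIT = 2049   # model's maximum context length
prompt_tokens = 44           # tokens counted in the prompt

# Requesting 4000 completion tokens gives 44 + 4000 = 4044 > 2049,
# which triggers the error. Instead, request at most the remainder.
max_completion_tokens = MODEL_CONTEXT_LIMIT - prompt_tokens
print(max_completion_tokens)  # largest safe value for max_tokens
```

Passing this value (or anything smaller) as the `max_tokens` parameter of the completion request should avoid the error.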
The page I need help with: [log in to see the link]
Viewing 1 replies (of 1 total)
- The topic ‘Token length exceed’ is closed to new replies.