Always get “max context length” error
Hi,
Whenever I input anything (such as “test”) into the chatbot, it reports the following error:
"This model's maximum context length is 4097 tokens, however you requested 4135 tokens (39 in your prompt; 4096 for the completion). Please reduce your prompt; or completion length"
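The arithmetic behind the error can be checked directly; here is a minimal sketch using the constants from the error message (the variable names are mine, not from any plugin or API):

```python
# Constants taken from the error message above.
CONTEXT_LIMIT = 4097   # model's maximum context length
prompt_tokens = 39     # tokens counted in the prompt
max_tokens = 4096      # completion length the plugin requested

# The API counts prompt + completion against one shared window.
requested = prompt_tokens + max_tokens
print(requested)                       # 4135, which exceeds 4097

# A possible fix: cap the completion so the total fits the window.
safe_max_tokens = CONTEXT_LIMIT - prompt_tokens
print(safe_max_tokens)                 # 4058
```

This suggests the plugin is setting its `max_tokens` (completion length) setting to the full 4096-token window, leaving no room for the prompt; lowering that setting should avoid the error.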
Viewing 2 replies - 1 through 2 (of 2 total)
- The topic ‘Always get “max context length” error’ is closed to new replies.