Bot GPT error MAX Tokens
-
Hi, in the ChatGPT section of the plugin, after I did the embedding, the chat responds:
This model's maximum context length is 4097 tokens, however you requested 6047 tokens (4547 in your prompt; 1500 for the completion). Please reduce your prompt; or completion length.
My prompt is only 3 or 4 words (about 10 tokens), so what am I doing wrong?
If I delete an indexed post of 3,100 tokens, the chat responds fine.
Is the problem the length of the post? Should I split it into smaller chunks (see the sketch below)?
How much text, or how many tokens, does Pinecone accept per entry?
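By "split it" I mean something like the following minimal sketch, assuming Python and OpenAI's tiktoken tokenizer; the 500-token chunk size and the post.txt filename are just illustrative values, not anything the plugin prescribes:

```python
import tiktoken

def chunk_text(text: str, max_tokens: int = 500,
               model: str = "text-embedding-ada-002") -> list[str]:
    """Split text into pieces of at most max_tokens tokens each."""
    enc = tiktoken.encoding_for_model(model)  # tokenizer matching the embedding model
    tokens = enc.encode(text)                 # the full post as token ids
    # Slice the token list into fixed-size windows and decode each back to text.
    return [enc.decode(tokens[i:i + max_tokens])
            for i in range(0, len(tokens), max_tokens)]

# Example: a ~3,100-token post would come back as 7 chunks of <= 500 tokens,
# each small enough to embed and retrieve without blowing the context limit.
chunks = chunk_text(open("post.txt").read())
print(len(chunks))
```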
- The topic ‘Bot GPT error MAX Tokens’ is closed to new replies.