• Ely DeRoss

    (@quanticastudio)


    Hi, in the ChatGPT section of the plugin, after I did the embedding, the chat responds:

    This model's maximum context length is 4097 tokens, however you requested 6047 tokens (4547 in your prompt; 1500 for the completion). Please reduce your prompt; or completion length.

    My prompt is only 3 or 4 words (about 10 tokens). What am I doing wrong?

    If I delete a 3,100-token post from the index, the chat responds fine.

    Is the problem the length of the post? Should I split it?

    How much text (or how many tokens) does Pinecone accept?

  • Plugin Author senols

    (@senols)

    Hello @quanticastudio,

    You’ve correctly identified the issue: the length of the post is indeed the problem. Even though your question is only about 10 tokens, the plugin inserts the text of the matching post into the prompt as context, so the ~3,100-token post plus the 1,500 tokens reserved for the completion pushes the request past the model’s 4,097-token limit.

    You have a couple of options to address this:

    You could use gpt-3.5-turbo-16k, which has a larger context size of 16k. This should be sufficient for your needs.

    Alternatively, as you’ve suggested, you could split the post into smaller pieces before embedding it; see the sketch below.
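    For illustration only, here is a minimal sketch (in Python, assuming the tiktoken library) of how a long post could be split into token-bounded chunks before embedding. The function name and the 1,000-token chunk size are placeholders of mine, not anything the plugin itself exposes:

        import tiktoken

        def split_into_chunks(text, max_tokens=1000, model="gpt-3.5-turbo"):
            # Encode the post, slice the token list into fixed-size
            # windows, and decode each window back into a text chunk.
            enc = tiktoken.encoding_for_model(model)
            tokens = enc.encode(text)
            return [enc.decode(tokens[i:i + max_tokens])
                    for i in range(0, len(tokens), max_tokens)]

        # A 3,100-token post becomes four chunks of at most 1,000 tokens
        # each; any one chunk then fits in a 4,097-token context alongside
        # the question and the 1,500 tokens reserved for the completion.

    The exact chunk size is a trade-off: smaller chunks leave more room in the prompt, while larger chunks keep more of the post’s context together.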

    I hope this helps to resolve the issue.

  • The topic ‘Bot GPT error MAX Tokens’ is closed to new replies.