• Resolved freefincal

    (@freefincal)


    Hi, for my models I keep getting this error. Please help me fix it.

    ‘This model’s maximum context length is 2049 tokens, however you requested 4044 tokens (44 in your prompt; 4000 for the completion). Please reduce your prompt; or completion length.’

    The page I need help with: [log in to see the link]
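    The error above means the prompt tokens plus the requested completion tokens (44 + 4000 = 4044) exceed the model's 2049-token context window. A minimal sketch of the fix, assuming a hypothetical helper name and using the numbers from the error message, is to cap the completion budget at whatever the context window has left after the prompt:

    ```python
    # Illustrative sketch only: the request must satisfy
    #   prompt_tokens + completion_tokens <= model context length.
    MODEL_CONTEXT_LIMIT = 2049  # taken from the error message

    def clamp_completion_tokens(prompt_tokens: int, requested_completion: int) -> int:
        """Cap the completion budget so prompt + completion fits the context window.

        Hypothetical helper, not part of the plugin's actual code.
        """
        available = MODEL_CONTEXT_LIMIT - prompt_tokens
        return max(0, min(requested_completion, available))

    # The failing request: 44 prompt tokens + 4000 requested completion = 4044 > 2049.
    print(clamp_completion_tokens(44, 4000))  # → 2005, the largest completion that still fits
    ```

    In practice this means lowering the plugin's maximum-tokens setting (or switching to a model with a larger context window) so the sum never exceeds the limit.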

Viewing 1 replies (of 1 total)
  • Plugin Author senols

    (@senols)

    Hi, please update your plugin to version 1.5.73 and share screenshots of your bot parameters and Context tab with me.

  • The topic ‘Token length exceed’ is closed to new replies.