Knowledge Builder Error > maximum context length is 8192 tokens
This model’s maximum context length is 8192 tokens, however you requested 10037 tokens (10037 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.
I got this error while using the instant indexing feature via Pinecone with the text-embedding small model. Do I need to use a different embedding model, such as the large one? AI Power is set to the GPT-4 Turbo model.
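For what it's worth, this limit is on the input size, not the model tier, so switching embedding models may not help; the usual fix is to split the text into chunks before indexing. A rough sketch (a word-count heuristic stands in for a real tokenizer such as tiktoken; the names and the words-per-token ratio are illustrative assumptions, not the plugin's actual code):

```python
# Rough sketch: split a document into chunks that each stay under the
# embedding model's context limit. A real implementation should count
# tokens with the model's own tokenizer (e.g. tiktoken); plain word
# counts are used here only as a rough proxy.

MAX_TOKENS = 8192        # context limit reported in the error
WORDS_PER_TOKEN = 0.75   # rough heuristic for English text

def chunk_text(text: str, max_tokens: int = MAX_TOKENS) -> list[str]:
    """Split `text` into word-based chunks under an approximate token budget."""
    max_words = int(max_tokens * WORDS_PER_TOKEN)
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]
```

Each chunk can then be embedded and upserted to Pinecone separately instead of sending the whole document in one request.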
Viewing 2 replies - 1 through 2 (of 2 total)