Error generated by “Embeddings”
-
Hi,
In the case of the ChatGPT bot, certain question texts generated the error: “This model’s maximum context length is 4097 tokens. However, you requested 4893 tokens (3393 in the messages, 1500 in the completion). Please reduce the length of the messages or completion.” I assume the errors occurred for texts that were embedded from indexed pages.
This error occurred after I had indexed all articles in order to use “Embeddings”. I also wanted to index all pages, but that didn’t work (probably because of the theme I use – Flatsome).
The error disappeared after I deleted all indexes from the Index Builder.
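The error itself is simple arithmetic: the prompt messages (3393 tokens, inflated by the embedded page content) plus the reserved completion (1500 tokens) exceed the model’s 4097-token context window by 796 tokens. A minimal sketch of how a plugin could avoid this, assuming a rough characters-per-token heuristic and a hypothetical `trim_context` helper (neither is part of the actual plugin):

```python
# Sketch: keep prompt + completion within the model's context window.
# MAX_CONTEXT and COMPLETION come from the error message above;
# approx_tokens/trim_context are illustrative assumptions, not plugin code.
MAX_CONTEXT = 4097   # model's maximum context length
COMPLETION = 1500    # tokens reserved for the completion

def approx_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_context(chunks: list[str], budget: int) -> list[str]:
    # Keep embedded chunks (assumed ordered most-relevant first)
    # only while they fit in the remaining token budget.
    kept, used = [], 0
    for chunk in chunks:
        tokens = approx_tokens(chunk)
        if used + tokens > budget:
            break
        kept.append(chunk)
        used += tokens
    return kept

budget = MAX_CONTEXT - COMPLETION  # 2597 tokens left for the messages
```

With this kind of cap, the 3393-token prompt would have been trimmed to fit the 2597-token budget instead of triggering the API error.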
Viewing 4 replies - 1 through 4 (of 4 total)
- The topic ‘Error generated by “Embeddings”’ is closed to new replies.