• yurann2000 (@yurann2000)

    First of all, I want to thank you for the great plugin! It works excellently, and I’m really satisfied with its functionality.

    However, I have a feature request that would be incredibly helpful for my use case. Currently, when I send long requests to ChatGPT, the response runs into the token limit and gets cut off, so I only receive about 20-30% of the page's content.

    Would it be possible to add a feature that splits the response into multiple parts? That way I could handle longer outputs without losing valuable content; a rough sketch of the idea follows below.
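
    To make the request concrete, here is a minimal sketch of the kind of splitting being asked for. This is not the plugin's actual code: it assumes the official `openai` Python client (version 1.x), a placeholder model name, and a hypothetical character-based chunk size. The page text is split into pieces small enough that each response fits within the model's output limit, and the partial responses are joined afterwards.

    ```python
    # Illustrative sketch only, not the plugin's implementation.
    # Assumes the official `openai` client (>= 1.0) and OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()

    def process_in_parts(page_text: str, chunk_chars: int = 6000) -> str:
        """Split long page text into chunks and request one completion per chunk,
        so no single response has to carry the whole page."""
        chunks = [page_text[i:i + chunk_chars]
                  for i in range(0, len(page_text), chunk_chars)]
        parts = []
        for n, chunk in enumerate(chunks, start=1):
            resp = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=[
                    {"role": "system", "content": "Process the following page excerpt."},
                    {"role": "user", "content": f"Part {n} of {len(chunks)}:\n\n{chunk}"},
                ],
            )
            parts.append(resp.choices[0].message.content)
        # Join the partial responses back into one result.
        return "\n\n".join(parts)
    ```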

    Thank you so much for considering this! It would make a huge difference in ensuring that I can use the plugin to its fullest potential.

  • Plugin Author jamesdlow (@jamesdlow)

    Thank you for the feedback. I'll look into whether this is possible in a future version.
