"failed: TypeError: failed to fetch" with Open AI GPT Scripts #1499
Unanswered · njproductionsus asked this question in Support / Q&A · 1 comment, 1 reply
-
Hi @njproductionsus. Since the upgrade to GPT-4, OpenAI enforces token limits; by default the limit is 200. You can check this thread on the maximum token length in GPT-4: https://community.openai.com/t/maximum-token-length-in-gpt-4/385914/3. Since you mentioned it worked fine with the previous GPT version, the issue does not seem to be coming from Rowy. You can process 10k rows on the free plan. Could you share the logs as well?
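In case it helps, here is a minimal sketch of raising the token limit explicitly on a chat completion call with the official openai Node SDK. This is not the Rowy extension's own code, just a plain SDK call; the prompt text and the max_tokens value of 1024 are placeholders.

```ts
import OpenAI from "openai";

// Assumes OPENAI_API_KEY is set in the environment.
const openai = new OpenAI();

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Summarize this row of data." }],
    // Raise the limit explicitly so long responses are not cut off
    // at a low default.
    max_tokens: 1024,
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```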
-
I have been using the Parallel GPT script with great success (after much refinement).
Yesterday my OpenAI account was upgraded to Tier 4, so instead of only running a batch of 240, I've doubled it to my ideal batch of 480.
Since doing this, it hasn't been running as smoothly as 240 did, and I can't figure out why. I thought it was OpenAI rate limits, but I've checked and that end is fine with how I throttle the batches in groups of 80 per minute.
What I DID notice is that Rowy pops up a red error in the bottom left with the message "failed: TypeError: failed to fetch".
Any idea what would cause this? Is 480 rows triggering too much for Rowy (versus 240, which was fine)?
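For reference, this is roughly the throttling pattern I'm describing, as a minimal TypeScript sketch; processRow here is a hypothetical stand-in for whatever the Parallel GPT script does per row, and the batch size and delay just mirror the numbers above.

```ts
// Hypothetical helper: processRow stands in for whatever the Parallel GPT
// script does for a single row (e.g. an OpenAI call plus a table write).
async function processInThrottledBatches<T>(
  rows: T[],
  processRow: (row: T) => Promise<void>,
  batchSize = 80,    // rows fired off together
  delayMs = 60_000   // one minute between batches
): Promise<void> {
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize);
    // Run the batch in parallel; allSettled keeps one failed fetch from
    // aborting the rest of the batch.
    const results = await Promise.allSettled(batch.map((row) => processRow(row)));
    results
      .filter((r): r is PromiseRejectedResult => r.status === "rejected")
      .forEach((r) => console.error("Row failed:", r.reason));
    if (i + batchSize < rows.length) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```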