2:04 pm
September 15, 2022
Hi,
OpenAI raises this error: Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 6946 tokens. Please reduce the length of the messages.
That's because an OpenAI rewrite is active in "article assignment" with a full article extract from shortened RSS feeds.
There is no option to limit the full article extract.
Do you have any ideas?
Thanks,
Ron
Every OpenAI GPT model has its limits; none of them can rewrite a whole book yet. In your case I would suggest using %post_excerpt% instead. Of course, it's possible to shorten the input article, but how exactly? Do you want GPT to rewrite just half of it, or a fifth? From the beginning? From the end? From the middle?
Please suggest your solution.
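For anyone who wants to pre-trim the article themselves before it is sent to the API, here is a minimal sketch. It assumes the common OpenAI rule of thumb of roughly 4 characters per English token (an exact count would require a tokenizer such as tiktoken); the function name and parameters are illustrative, not part of the plugin.

```python
# Hedged sketch: trim an article so the prompt stays under an approximate
# token budget, leaving room in the 4097-token window for the completion.
# Assumes ~4 characters per token, a rough heuristic for English text.

def truncate_to_token_budget(text: str, max_tokens: int, chars_per_token: int = 4) -> str:
    """Cut `text` from the end so it fits an approximate token budget."""
    max_chars = max_tokens * chars_per_token
    if len(text) <= max_chars:
        return text
    # Cut at the last whitespace before the limit to avoid splitting a word.
    cut = text.rfind(" ", 0, max_chars)
    return text[: cut if cut != -1 else max_chars]

article = "word " * 3000  # ~15000 characters of dummy input
shortened = truncate_to_token_budget(article, max_tokens=3000)
print(len(shortened) <= 3000 * 4)  # True
```

Cutting from the end keeps the lead of the article, which for most RSS items carries the essential content; a smarter variant could cut on sentence boundaries instead of whitespace.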
11:18 am
September 15, 2022