error: This model's maximum context length
You are welcome. I'd also like to add that many users don't quite understand how the maximum-token restriction on the OpenAI GPT models works. The limit applies to the total token count: the number of tokens you request for the generated text plus the actual length of your prompt, including the original source text you pass in.
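To make the arithmetic concrete, here is a minimal sketch of the budget calculation described above. It assumes a hypothetical 4096-token context window and uses the rough rule of thumb of ~4 characters per English token; in practice the exact count comes from the model's own tokenizer (e.g. OpenAI's tiktoken library), so treat the numbers as illustrative only.

```python
CONTEXT_WINDOW = 4096  # assumed model limit for this example

def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per English token."""
    return max(1, len(text) // 4)

def max_completion_tokens(prompt: str, context_window: int = CONTEXT_WINDOW) -> int:
    """Tokens left for the generated text once the prompt is counted
    against the same shared context window."""
    return context_window - estimate_tokens(prompt)

# Example: a long source text eats into the room left for the output.
prompt = "Rewrite the following article: " + "x" * 8000
budget = max_completion_tokens(prompt)
if budget <= 0:
    print("Prompt alone exceeds the context window")
else:
    print(f"You can request at most ~{budget} tokens of generated text")
```

The key point the sketch illustrates: asking for a long completion while also sending a long original text can push the combined total past the limit, which is exactly when the "maximum context length" error appears.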