I broke BrowseGPT?
-
help... "ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4101 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}"
-
dang that's crazy
-
idk what to do
it's now at like 4183 tokens, just from the letter "l"
-
try turn your device off and on again
-
4138 tokens!?!? wth?!?!
-
ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4267 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
-
what tf do i do
-
@Almond
@vgmoose -
@Glacier-Ghoul I just suggested something to try
-
This post is deleted!
-
didn't work
-
ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4349 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
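For anyone hitting this later: the error means the bot's accumulated conversation history plus the new message exceeds the model's 4097-token context window, so every new message just pushes it further over. The usual fix is to trim the oldest messages before each API call. A minimal sketch, assuming the standard chat-completions message format; the 4-characters-per-token estimate is a rough stand-in for a real tokenizer:

```python
def estimate_tokens(text):
    # Very rough heuristic: ~4 characters per token for English text.
    # A real tokenizer (e.g. tiktoken) gives accurate counts.
    return max(1, len(text) // 4)

def trim_messages(messages, max_tokens=4097, reserve=256):
    """Keep the system prompt (if any) plus the newest messages that fit
    within max_tokens - reserve, leaving `reserve` tokens for the reply."""
    budget = max_tokens - reserve
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    used = sum(estimate_tokens(m["content"]) for m in system)
    kept = []
    # Walk newest-to-oldest so the most recent context survives.
    for m in reversed(rest):
        cost = estimate_tokens(m["content"])
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))

# Example: 50 user messages of 400 "l" characters each blow past the limit.
history = [{"role": "system", "content": "You are BrowseGPT."}]
history += [{"role": "user", "content": "l" * 400} for _ in range(50)]
trimmed = trim_messages(history)
print(len(trimmed))  # noticeably fewer than the original 51 messages
```

Since the bot keeps appending to its history, something like this has to run before every request, or the history grows until every call fails.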
-
this is absolutely crazy
-
@Glacier-Ghoul probably a server-side issue then
-
caused by me
-
someone else try talking to @BrowseGPT
-
@Glacier-Ghoul probably not
-
please
-
@Glacier-Ghoul never mind, AI is easy to mess with. For example, my friend once got the Discord AI to say how to make [censored] napalm
-
HOW the FREAK