i broke browsegpt?
-
didn't work
-
ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4349 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
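The error above is the OpenAI API's `context_length_exceeded` response: the accumulated chat history (4349 tokens) no longer fits in the model's 4097-token window. The usual fix on the bot's side is to trim the oldest messages before each request. A minimal sketch of that idea, assuming the bot sends a standard `messages` list of role/content dicts; the token budget, the 4-characters-per-token estimate, and the function names here are illustrative assumptions, not BrowseGPT's actual code:

```python
# Sketch of the fix the error suggests: drop the oldest messages
# until the conversation fits the model's context window.
# MAX_TOKENS and the ~4-chars-per-token heuristic are assumptions.

MAX_TOKENS = 4097
RESERVED_FOR_REPLY = 512  # leave room for the model's answer

def estimate_tokens(message: dict) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(message.get("content", "")) // 4)

def trim_history(messages: list,
                 budget: int = MAX_TOKENS - RESERVED_FOR_REPLY) -> list:
    """Keep the system prompt (if any) plus the most recent
    messages that fit inside the token budget."""
    system = [m for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    used = sum(estimate_tokens(m) for m in system)
    kept = []
    # Walk newest-to-oldest so recent context survives.
    for m in reversed(chat):
        cost = estimate_tokens(m)
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))
```

With trimming like this in place, a long channel conversation degrades gracefully (old messages fall out of context) instead of hard-failing with the error above.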
-
this is absolutely crazy
-
@Glacier-Ghoul probably a server side issue then
-
caused by me
-
someone else try talk to @BrowseGPT
-
@Glacier-Ghoul probably not
-
please
-
@Glacier-Ghoul never mind, AI is easy to mindfuck with. For example, my friend once got the Discord AI to say how to make [censored] napalm
-
HOW the FREAK
-
@Glacier-Ghoul no clue, could be people from Vietnam are just built different
-
vietnam?
-
the friend is from Vietnam
-
ah
-
oh yeah, he also got the AI to tell someone to end themselves (it wasn't a real person, but the AI didn't know that)
-
sorry, I don't manage BrowseGPT, try asking one of the @administrators
-
ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4433 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
-
i'm scawed