i broke browsegpt?
-
@Glacier-Ghoul probably a server side issue then
-
caused by me
-
someone else try talking to @BrowseGPT
-
@Glacier-Ghoul probably not
-
please
-
@Glacier-Ghoul never mind, AI is easy to mindfuck with. an example: my friend once got the discord AI to say how to make [censored] napalm
-
HOW the FREAK
-
@Glacier-Ghoul no clue, could be people from Vietnam are just built different
-
vietnam?
-
the friend is from Vietnam
-
ah
-
oh yea he also got the AI to tell someone to end themselves (wasn't to a real person, but the AI didn't know that)
-
sorry, i don't manage BrowseGPT, try asking one of the @administrators
-
ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4433 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
-
im scawed
-
thank you sm
-
i was scared half to death that i'd broken it
-
for more info: a "token" here is roughly one word (often just a word fragment), and the error says to "reduce the length of the messages", but you can't do that yourself, because BrowseGPT always sends the current conversation as context
this is how it's able to remember topics between messages, but only for 30 min after the last message
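to picture what a bot like this could do to avoid that error, here's a rough sketch of trimming the oldest messages until the conversation fits the limit. the 4097-token limit comes from the error above; the function names, the reply reserve, and the one-token-per-word estimate are all illustrative guesses, not how BrowseGPT actually works (a real bot would use the model's tokenizer)

```python
# Sketch: keep the newest messages that fit under the model's context limit.
# MAX_TOKENS comes from the error message above; REPLY_RESERVE and the
# word-based token estimate are rough, made-up values for illustration.
MAX_TOKENS = 4097
REPLY_RESERVE = 500  # leave room for the model's own reply

def estimate_tokens(text: str) -> int:
    # crude heuristic: one token per whitespace-separated word
    return len(text.split())

def trim_history(messages: list[dict]) -> list[dict]:
    """Drop the oldest messages until the total fits the token budget."""
    budget = MAX_TOKENS - REPLY_RESERVE
    kept = []
    total = 0
    # walk newest-first so the most recent messages survive
    for msg in reversed(messages):
        cost = estimate_tokens(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))

history = [
    {"role": "user", "content": "word " * 4000},   # old, oversized message
    {"role": "user", "content": "recent question"},
]
trimmed = trim_history(history)
```

with this, the oversized old message gets dropped and only the recent one is sent, instead of the request failing with `context_length_exceeded`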