i broke browsegpt?
-
ERROR: {'error': {'message': "This model's maximum context length is 4097 tokens. However, your messages resulted in 4433 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
-
im scawed
-
thank you sm
-
i was scared half to death that i'd broken it
-
for more info, a "token" here is roughly one word. the error says to "reduce the length of the messages", but you can't, because browsegpt always sends the current conversation as context.
that's how it's able to remember topics between messages, but only for 30 min after the last message
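to illustrate what "using the conversation as context" means, here's a rough sketch (not browsegpt's actual code, and `estimate_tokens`/`trim_history` are hypothetical helpers) of how a client could drop the oldest messages to stay under the 4097-token limit, approximating one token as roughly one word as described above:

```python
# Hypothetical sketch: keep the chat history under the model's token limit
# by dropping the oldest messages first. Real clients use a real tokenizer;
# here we approximate one token as one whitespace-separated word.

def estimate_tokens(messages):
    # crude word count as a stand-in for actual tokenization
    return sum(len(m["content"].split()) for m in messages)

def trim_history(messages, max_tokens=4097):
    trimmed = list(messages)
    # drop the oldest message while the estimate is over budget
    while trimmed and estimate_tokens(trimmed) > max_tokens:
        trimmed.pop(0)
    return trimmed

history = [
    {"role": "user", "content": "word " * 3000},
    {"role": "user", "content": "word " * 2000},
]
print(estimate_tokens(history))                # 5000, over the limit
print(estimate_tokens(trim_history(history)))  # 2000 after trimming
```

the error in the log above happened because the accumulated conversation (4433 estimated tokens) exceeded the model's 4097-token budget, so nothing was actually "broken".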
-
i see
-
@Ava-Lee basically it's a fire bomb, and surprisingly it's not a war crime to use against enemy soldiers in times of war