http://blogoxeokthu4ntl7tptzpjfjmlfxdzsff2ueviwj4vjf7z3aadnz2id.onion/opsec/openwebuilocalllms/index.html
In total, our conversation took 1242 tokens, which is less than the default context window size of 2048. With a longer prompt, the context window may fill up, causing the model to forget what was at the beginning of the conversation. If prompt_tokens reaches 2048, we need to increase the window size.
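The forgetting behavior described above can be sketched as a sliding window over message tokens. This is a minimal illustration of the idea, not the exact truncation logic any particular backend uses; the function name `trim_to_window` is hypothetical:

```python
def trim_to_window(message_tokens, num_ctx):
    """Drop the oldest messages until the total fits in the context window.

    message_tokens: token count of each message, oldest first.
    num_ctx: context window size in tokens.
    Returns the surviving token counts (newest messages kept).
    """
    kept = list(message_tokens)
    while kept and sum(kept) > num_ctx:
        kept.pop(0)  # the oldest message falls out of the window first
    return kept

# A 1242-token conversation fits in the default 2048-token window unchanged...
assert sum(trim_to_window([300, 442, 500], 2048)) == 1242
# ...but once the total exceeds num_ctx, the earliest messages are lost.
assert trim_to_window([900, 700, 600], 2048) == [700, 600]
```

In practice the window size is a backend parameter (for example, Ollama exposes it as `num_ctx`), so increasing it trades memory for a longer retained conversation.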