I would also suggest confirming that the sender checks the number of bytes actually written by each send call, rather than assuming that every byte was written successfully.
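Since the question doesn't show the sender's code, here is a minimal Python sketch (the language is an assumption) of what "checking the bytes written" means in practice: `socket.send()` may write fewer bytes than requested, so you loop until the whole buffer has gone out.

```python
import socket

def send_all(sock: socket.socket, data: bytes) -> None:
    """Keep calling send() until every byte has been written.

    send() returns the number of bytes it actually wrote, which can be
    less than len(data); ignoring that return value silently drops the
    tail of the message.
    """
    total = 0
    while total < len(data):
        sent = sock.send(data[total:])
        if sent == 0:
            raise ConnectionError("socket connection broken")
        total += sent
```

(Python's own `socket.sendall()` does essentially this; the loop is spelled out here only to show where the partial-write check happens.)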
I assume your tokenizer expects text (UTF-8?), but socket streams carry raw bytes. It is therefore possible for a multibyte character to be split across two reads, and a small receive buffer makes that more likely.
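To illustrate (a Python sketch, since the question doesn't say which language is in use): if you decode each received chunk independently, a character split across chunks will fail or be corrupted. An incremental decoder buffers the trailing partial byte sequence until the rest arrives.

```python
import codecs

# "é" is two bytes in UTF-8 (0xC3 0xA9). Simulate it arriving split
# across two reads, as can happen with a small receive buffer.
chunks = [b"caf\xc3", b"\xa9"]

decoder = codecs.getincrementaldecoder("utf-8")()
text = ""
for chunk in chunks:
    # The decoder holds the incomplete trailing byte instead of raising,
    # and emits the character once the remaining byte arrives.
    text += decoder.decode(chunk)
text += decoder.decode(b"", final=True)
# text == "café"
```

Calling `bytes.decode("utf-8")` on each chunk separately would instead raise a `UnicodeDecodeError` on the first chunk.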