I am investigating the use of UDS (Unix domain sockets) for logging from an app, with a separate process forwarding the logs to an external server. Overall it seems to work fine, but while testing I discovered that if I send several logs in a for loop, a single read from the socket contains more than one "message".
You can find the code for receiving logs here: https://github.com/MattBlack85/alf (after installing, you can run it with alf /tmp/alf.sock http://127.0.0.1:8080).
You can find a small example that sends logs here: https://gist.github.com/MattBlack85/86d620a306f16416a7f96a1a035984dc
You can find a small webserver for alf to forward the logs to here: https://gist.github.com/MattBlack85/0638ef87eb077eb46879d6c90a30cf7a
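For reference, the sending side is roughly equivalent to this minimal sketch (the actual gist may differ; the socket path and JSON fields here just mirror what shows up in my output below):

```python
import json
import socket
from datetime import datetime

SOCK_PATH = "/tmp/alf.sock"  # same path alf is listening on

# Connect to the Unix domain socket that alf listens on (stream socket).
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.connect(SOCK_PATH)

for i in range(6):
    record = {
        "time": datetime.now().strftime("%Y-%m-%d %H:%M:%S,%f")[:-3],
        "name": "test",
        "levelname": "DEBUG",
        "message": f"test {i}",
        "pathname": "logalf.py",
    }
    # Each sendall() is one "message" from the app's point of view,
    # but nothing forces it to arrive as a separate read on the other side.
    sock.sendall(json.dumps(record, separators=(",", ":")).encode())
    # time.sleep(0.001)  # with a 1 ms pause here the messages arrive one by one

sock.close()
```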
If the for loop has no sleep, the result is something like this:
[2018-12-18 13:12:39,798] [DEBUG] alf.worker - MSG from queue: b'{"time":"2018-12-18 13:12:39,797","name":"test","levelname":"DEBUG","message":"test 0","pathname":"logalf.py"}{"time":"2018-12-18 13:12:39,798","name":"test","levelname":"DEBUG","message":"test 1","pathname":"logalf.py"}{"time":"2018-12-18 13:12:39,798","name":"test","levelname":"DEBUG","message":"test 2","pathname":"logalf.py"}{"time":"2018-12-18 13:12:39,798","name":"test","levelname":"DEBUG","message":"test 3","pathname":"logalf.py"}{"time":"2018-12-18 13:12:39,798","name":"test","levelname":"DEBUG","message":"test 4","pathname":"logalf.py"}{"time":"2018-12-18 13:12:39,798","name":"test","levelname":"DEBUG","message":"test 5","pathname":"logalf.py"}'
However, if I put a small pause of 1 ms between sends, all the messages are received one by one. I tried closing every heavy process on my OS to leave the CPU free, but it made no difference. This is not a big issue, since I can append a terminator when formatting the JSON log, split what is read from the socket on that terminator, and put every item of the resulting list into the queue (see the sketch below), but why am I seeing this at all?
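The workaround I have in mind would look roughly like this on the receiving side (illustrative only; the function and queue names are placeholders, not alf's actual internals):

```python
import json
import socket

TERMINATOR = b"\n"  # appended by the sender after every JSON record


def read_messages(conn: socket.socket, queue) -> None:
    """Read from the socket and split the byte stream back into messages.

    One recv() may contain several JSON records (or a partial one), so we
    buffer and split on the terminator instead of trusting recv() boundaries.
    """
    buffer = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:  # peer closed the connection
            break
        buffer += chunk
        # Everything before the last terminator is a complete message;
        # whatever follows it stays in the buffer for the next recv().
        *complete, buffer = buffer.split(TERMINATOR)
        for raw in complete:
            if raw:
                queue.put(json.loads(raw))
```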