
I wrote an MMO server with one main loop: every iteration, all clients are processed (packets read/sent via non-blocking sockets), and then there is a small 500 ms delay/tick before the next loop.

I want to start working with IOCP. Can this style of server be done with IOCP?

bigdos
    Why would you write a server to loop like that? Why not write the loop to get rid of that delay completely? What is the delay really doing, other than slowing your whole server down? In any case, IOCP notifies your code when I/O operations are complete. If you want delays in the processing, simply control how often you receive and process those notifications. Or have worker threads receiving the notifications in real time and putting them into a queue, and then process the queue in a loop like you want – Remy Lebeau Feb 14 '18 at 01:57
  • I am writing a server emulator for a game whose original server runs like that... because in the game, all actions that require server processing are governed by a game tick value (500 ms). Each action you register within one tick will start to take place at the beginning of the next tick. However, I want to know if I can write an IOCP-based server while still keeping this fundamental quantum of time... Would the only solution be to add a 500 ms timer between client actions? But I think you answered my question – bigdos Feb 14 '18 at 02:12
  • Using any hardcoded delay is wrong design. The point of using IOCP is to get notified exactly when I/O completes, without any delays. – RbMm Feb 14 '18 at 05:52

0 Answers