0

In his answer to the question how-do-i-handle-streaming-messages-with-python-grpc, @Nathaniel provides a solution for handling streaming requests and responses.

But when I try to measure the processing time of every response, the numbers don't look right. For example, I sleep 200 ms in my stream_iter, but tr is sometimes even less than 200. My code:

import time

t0 = time.time()
for rsp in stub.Process(stream_iter()):
    tr = (time.time() - t0) * 1000  # gap since previous response, in ms
    print(tr)
    t0 = time.time()
...

So how should I measure the timing correctly?

1 Answer

0

It's hard to tell what is wrong from the given snippet. I would recommend creating a complete reproduction case and filing it as an issue at https://github.com/grpc/grpc/issues.

Here are some possibilities:

  1. Flow control: the client or server might buffer the messages it receives, so you may see small bursts of responses in certain scenarios.
  2. A bug in the iterator: e.g., you want to sleep in __next__, but accidentally injected the sleep into __iter__.
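To illustrate the second possibility, here is a minimal sketch of a request iterator (the class name and message format are made up for illustration): __iter__ runs only once when the stream starts, so a sleep placed there delays only the first message; per-message pacing has to live in __next__.

```python
import time

class RequestStream:
    """Hypothetical request iterator that paces one message per `delay` seconds."""

    def __init__(self, n, delay=0.2):
        self.n = n
        self.delay = delay
        self.sent = 0

    def __iter__(self):
        # No sleep here: this runs only once, when iteration begins.
        return self

    def __next__(self):
        if self.sent >= self.n:
            raise StopIteration
        time.sleep(self.delay)  # per-message pacing belongs here
        self.sent += 1
        return f"request-{self.sent}"

# Usage: list(RequestStream(3)) takes roughly 3 * 200 ms, not 200 ms total.
```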

Generally, if you scale up the volume and throughput, the noise should average out. Your approach to measuring the latency of each response is fine.
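One way to do that is to collect the per-response gaps and report an aggregate rather than trusting any single measurement. A sketch, using a fake generator as a stand-in for stub.Process(stream_iter()) (the helper names here are made up, and time.perf_counter is used since it is better suited for interval timing than time.time):

```python
import statistics
import time

def iter_with_latency(responses):
    """Yield (response, gap_ms) pairs, timing the gap before each response."""
    t0 = time.perf_counter()
    for rsp in responses:
        t1 = time.perf_counter()
        yield rsp, (t1 - t0) * 1000.0
        t0 = time.perf_counter()

def fake_stream(n, delay):
    # Stand-in for the real response stream; real responses may still
    # arrive in bursts because of flow-control buffering.
    for i in range(n):
        time.sleep(delay)
        yield i

gaps = [gap for _, gap in iter_with_latency(fake_stream(20, 0.01))]
print(statistics.mean(gaps))  # the mean over many responses is more credible
```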

Lidi Zheng
  • 1,801
  • 8
  • 13
  • Yes, I scaled up the throughput and I can see the average time is more credible, but I still don't know why there is noise. As you mentioned, I guess there is buffering or some other uncontrollable factor. – anonymous9527 Jun 30 '21 at 07:55