
I am struggling to run pose models on NVIDIA Triton Inference Server. The models (OpenPose, AlphaPose, HRNet, etc.) load normally, but the post-processing is the problem.

younes
  • Please provide more information about the model as well as the post-processing algorithm that you are working with. Also provide some code and specify where exactly the problem is happening. – Keivan Dec 08 '21 at 10:02

1 Answer


You can refer to the post-processing script in the Triton docs. They give an example for an image classifier in image_client.py:

import numpy as np


def postprocess(results, output_name, batch_size, batching):
    """
    Post-process results to show classifications.
    """

    output_array = results.as_numpy(output_name)
    if len(output_array) != batch_size:
        raise Exception("expected {} results, got {}".format(
            batch_size, len(output_array)))

    # Include special handling for non-batching models
    for result_batch in output_array:
        if not batching:
            result_batch = [result_batch]
        for result in result_batch:
            # String outputs arrive as byte arrays; decode before splitting
            if output_array.dtype.type == np.object_:
                cls = "".join(chr(x) for x in result).split(':')
            else:
                cls = result.split(':')
            print("    {} ({}) = {}".format(cls[0], cls[1], cls[2]))
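For a pose model the output is usually not a label string but a heatmap tensor, so the post-processing has to decode keypoints instead of parsing classification strings. Below is a minimal sketch of such a decoder, assuming an HRNet-style output shaped [batch, num_keypoints, height, width]; the function name and the output name you pass in are placeholders for whatever your model's config.pbtxt declares.

```python
import numpy as np


def postprocess_pose(results, output_name, batch_size):
    """Decode pose heatmaps into per-image (x, y, confidence) keypoints.

    Assumes the Triton InferResult holds a heatmap tensor shaped
    [batch, num_keypoints, height, width], as HRNet-style models produce.
    """
    heatmaps = results.as_numpy(output_name)
    if len(heatmaps) != batch_size:
        raise Exception("expected {} results, got {}".format(
            batch_size, len(heatmaps)))

    all_keypoints = []
    for hm in heatmaps:  # hm: [num_keypoints, height, width]
        keypoints = []
        for joint in hm:
            # Peak of each joint heatmap is the predicted keypoint location
            idx = np.argmax(joint)
            y, x = np.unravel_index(idx, joint.shape)
            keypoints.append((int(x), int(y), float(joint[y, x])))
        all_keypoints.append(keypoints)
    return all_keypoints
```

Note the (x, y) coordinates are in heatmap space; you still need to scale them back to the original image size (heatmaps are typically 1/4 of the network input resolution for HRNet).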
gab