I'm using a Faster R-CNN model to run some object detection. The wrapper I'm using is GluonCV and the code is below:
from gluoncv import model_zoo, data, utils

net = model_zoo.get_model('faster_rcnn_resnet50_v1b_coco', pretrained=True)
im_fname = utils.download('https://github.com/dmlc/web-data/blob/master/' +
                          'gluoncv/detection/biking.jpg?raw=true',
                          path='biking.jpg')
x, orig_img = data.transforms.presets.rcnn.load_test(im_fname)
box_ids, scores, bboxes = net(x)
My question is: is it possible to reduce the size of the arrays returned by net(x), effectively making computation faster?
The issue is that the model produces box_ids, scores and bboxes as arrays with 80000 elements each, of which only the first ~10 are useful; the rest are padding with a score of -1. I later convert these arrays to NumPy using asnumpy(). However, MXNet uses an asynchronous engine, so asnumpy() has to block until the computation finishes before it can copy the data. The computation takes longer (5+ seconds) with 80000 elements, hence I'm trying to reduce the array size (the SSD model outputs roughly 6000 elements and is much faster).
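For context, here's a minimal sketch of the post-processing I do after asnumpy(), with the model outputs simulated as tiny NumPy arrays (the real ones have ~80000 rows; the -1 padding convention mirrors what GluonCV returns, and the small values here are made up for illustration):

```python
import numpy as np

# Simulated detector outputs: unused slots are padded with -1.
# The real arrays from the model have batch dim (1, N, 1) / (1, N, 4).
box_ids = np.array([[14.0], [1.0], [-1.0], [-1.0]])
scores  = np.array([[0.98], [0.87], [-1.0], [-1.0]])
bboxes  = np.array([[10, 20, 110, 220],
                    [30, 40, 130, 240],
                    [0, 0, 0, 0],
                    [0, 0, 0, 0]], dtype=float)

# Keep only rows with a real score; everything after them is padding.
valid = scores[:, 0] > 0
box_ids, scores, bboxes = box_ids[valid], scores[valid], bboxes[valid]

print(len(scores))  # only the real detections remain
```

The filtering itself is cheap; the cost I'm trying to avoid is everything that happens before the arrays reach NumPy.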
If you have other suggestions on how to make .asnumpy() faster, those are welcome too. Basically, one pass over an image takes 5 seconds, which seems unreasonable, so I'm hoping to get it down to ~0.2 s (which seems more appropriate, right?).
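One thing that might help pin down where the 5 seconds go is timing the forward pass and the copy separately. A generic helper (not GluonCV-specific) could look like this; MXNet's mx.nd.waitall() forces the async engine to finish, so wrapping each step shows whether the time is spent in the network itself or in the asnumpy() copy:

```python
import time

def timed(label, fn, *args, **kwargs):
    """Run fn, print how long it took, and return its result."""
    t0 = time.perf_counter()
    out = fn(*args, **kwargs)
    elapsed = time.perf_counter() - t0
    print(f"{label}: {elapsed:.3f}s")
    return out

# Sketch of how I'd use it with the model (assuming net and x from above):
#   import mxnet as mx
#   ids, sc, bb = timed("enqueue forward", net, x)   # returns almost instantly (async)
#   timed("finish graph", mx.nd.waitall)             # actual compute time
#   sc_np = timed("copy", sc.asnumpy)                # time of the copy itself
```

My guess is the "finish graph" step dominates, which would mean shrinking the output arrays only helps if it also shrinks the computation.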
Thanks!