
I am trying to get evaluation loss metrics on detectron2 using COCOEvaluator. However, the evaluation split contains 40k+ images, which causes each evaluation to take around 45 minutes. The dataset was downloaded from the COCO website itself.

[09/07 23:58:44 d2.data.datasets.coco]: Loaded 40504 images in COCO format from annotations/instances_val2014.json
[09/07 23:58:51 d2.evaluation.evaluator]: Start inference on 40504 batches
[09/07 23:58:56 d2.evaluation.evaluator]: Inference done 11/40504. Dataloading: 0.0003 s/iter. Inference: 0.0667 s/iter. Eval: 0.0002 s/iter. Total: 0.0673 s/iter. ETA=0:45:24
...

I used register_coco_instances to register my train and test datasets.

register_coco_instances(name=train_dataset_name, metadata={}, json_file=train_json_annotation_path, image_root=train_images_path)
register_coco_instances(name=test_dataset_name, metadata={}, json_file=test_json_annotation_path, image_root=test_images_path)

Is there any way to evaluate on a subset of the data (e.g. 5k images) instead of the whole 40k+ dataset?


1 Answer


Maybe this can help you. See: https://eidos-ai.medium.com/training-on-detectron2-with-a-validation-set-and-plot-loss-on-it-to-avoid-overfitting-6449418fbf4e

  • Please give links in a comment when you do not know the solution, not in an answer. –  Sep 23 '22 at 02:40