
I want to feed the HOGDescriptor (CPU interface) with a trained SVM. The HOGDescriptor offers a method setSVMDetector(const vector<float>& detector), and I'm asking what has to go into that vector<float>& detector.

I have a trained SVM that is saved to an XML file. I want to use hog.setSVMDetector(const vector<float>& detector) with my custom dataset. How do I use this function with my own data? Any suggestions would be appreciated.

I am using Microsoft Visual Studio to build and run the code.


1 Answer


This detector (a set of coefficients) has to be computed from your trained model (the XML file). The XML file contains all the information about your model/classifier, most importantly the support vectors, and the detector coefficients are computed from those support vectors. If you are using OpenCV's SVM, you can use this code (check the answer) to compute the detector, which you can then pass directly to setSVMDetector to customize your HOG detector.

A few things to note: in that answer the detector (the set of coefficients) is called support_vector, but it is the same thing. Also, use class labels of +1 (positive) and -1 (negative); otherwise you may get incorrect detections.
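For reference, here is a minimal sketch of how the detector vector can be assembled. It assumes the newer OpenCV 3.x cv::ml::SVM API with a linear kernel (for linear kernels OpenCV collapses the support vectors into a single row that equals the primal weight vector); the file name "my_svm.xml" and the HOG window parameters are placeholders for your own setup:

    #include <opencv2/core.hpp>
    #include <opencv2/ml.hpp>
    #include <opencv2/objdetect.hpp>
    #include <vector>
    #include <cstring>

    // Build the detector vector [w | -rho] from a trained linear SVM.
    std::vector<float> getDetector(const cv::Ptr<cv::ml::SVM>& svm)
    {
        // For a linear kernel this is a single row of CV_32F values
        // that already equals the weight vector w.
        cv::Mat sv = svm->getSupportVectors();

        cv::Mat alpha, svidx;
        double rho = svm->getDecisionFunction(0, alpha, svidx);

        std::vector<float> detector(sv.cols + 1);
        std::memcpy(detector.data(), sv.ptr<float>(),
                    sv.cols * sizeof(float));
        detector[sv.cols] = static_cast<float>(-rho);  // bias goes last
        return detector;
    }

    int main()
    {
        // Placeholder path: load your own trained model here.
        cv::Ptr<cv::ml::SVM> svm = cv::ml::SVM::load("my_svm.xml");

        // Window/block/cell sizes must match the ones used for training.
        cv::HOGDescriptor hog(cv::Size(64, 128), cv::Size(16, 16),
                              cv::Size(8, 8), cv::Size(8, 8), 9);
        hog.setSVMDetector(getDetector(svm));

        // hog.detectMultiScale(image, foundLocations) can now be called
        // with your custom detector.
        return 0;
    }

The detector is simply the weight vector of the linear SVM with the negated bias (rho) appended as the last element, which matches the layout setSVMDetector expects.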
