OpenCV 3.4.1: Get the Primal Form of a Custom-Trained Linear SVM for HoG detectMultiScale

Issue

I have trained a Linear SVM in OpenCV 3.4.1. Now I want to use my custom SVM with OpenCV 3’s HoG detectMultiScale function. The old method of setting the HoG detector with the custom SVM primal vector no longer works.

For OpenCV 2, one would get the primal vector from a custom-trained SVM like this:

#include "linearsvm.h"

LinearSVM::LinearSVM() {

    qDebug() << "Creating SVM and loading trained data...";

    load("/home/pi/trainedSVM.xml");

    qDebug() << "Done loading data...";

}

std::vector<float> LinearSVM::getPrimalForm() const
{
  std::vector<float> support_vector;

  int sv_count = get_support_vector_count();

  const CvSVMDecisionFunc* df = getDecisionFunction();

  if ( !df ) {
      return support_vector;
  }

  const double* alphas = df[0].alpha;
  double rho = df[0].rho;
  int var_count = get_var_count();

  support_vector.resize(var_count, 0);

  // Accumulate the weight vector: w_j = sum_r (-alpha_r) * sv_r[j].
  // Note the sign flip on alpha; because of it, rho is appended below
  // without negation.
  for (unsigned int r = 0; r < (unsigned)sv_count; r++)
  {
    float myalpha = (float)alphas[r];
    const float* v = get_support_vector(r);
    for (int j = 0; j < var_count; j++, v++)
    {
      support_vector[j] += (-myalpha) * (*v);
    }
  }

  // Append the bias term, giving the [weights..., bias] layout that
  // HOGDescriptor::setSVMDetector() expects.
  support_vector.push_back((float)rho);

  return support_vector;
}
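
For reference, the snippet above assumes a linearsvm.h along these lines: CvSVM keeps its decision function as a protected member, so a thin subclass is needed to expose it. A minimal sketch (the header is hypothetical; the accessor name is chosen to match the call above):

    // linearsvm.h - hypothetical header assumed by the OpenCV 2 snippet
    #include <opencv2/ml/ml.hpp>
    #include <vector>

    class LinearSVM : public CvSVM {
    public:
        LinearSVM();
        std::vector<float> getPrimalForm() const;

    protected:
        // CvSVM::decision_func is protected, so expose it to getPrimalForm()
        const CvSVMDecisionFunc* getDecisionFunction() const { return decision_func; }
    };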

Once the primal vector was created from the trained SVM data, one would set the HoG detector SVM like this:

    // Primal form of the CvSVM descriptor
    vector<float> primalVector = m_CvSVM.getPrimalForm();

    qDebug() << "Got primal form of detection vector...";
    qDebug() << "Setting SVM detector...";

    // Set the SVM Detector - custom trained HoG Detector
    m_HoG.setSVMDetector(primalVector);

In OpenCV 3.4.1 this no longer works: CvSVM no longer exists, and much of the SVM API has changed.
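
For example, loading a saved model is no longer done in a constructor via load(); in the cv::ml API it looks roughly like this (a sketch, reusing the same path as the OpenCV 2 snippet):

    // Load a trained SVM in OpenCV 3.x
    cv::Ptr<cv::ml::SVM> svm = cv::Algorithm::load<cv::ml::SVM>("/home/pi/trainedSVM.xml");
    // or, equivalently, in 3.4.x:
    // cv::Ptr<cv::ml::SVM> svm = cv::ml::SVM::load("/home/pi/trainedSVM.xml");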

How do I get the primal vector of my custom SVM in OpenCV 3.4.1 that I trained like this:

    // Set up the SVM's parameters
    cv::Ptr<cv::ml::SVM> svm = cv::ml::SVM::create();
    svm->setType(cv::ml::SVM::C_SVC);
    svm->setKernel(cv::ml::SVM::LINEAR);
    svm->setTermCriteria(cv::TermCriteria(cv::TermCriteria::MAX_ITER, 10, 1e-6));

    // Assemble the training data
    cv::Ptr<cv::ml::TrainData> td = cv::ml::TrainData::create(trainingDataMat, cv::ml::ROW_SAMPLE, trainingLabelsMat);

    // Auto-train: cross-validates over a parameter grid instead of a plain svm->train(td)
    qDebug() << "Training dataset...";
    QElapsedTimer trainingTimer;
    trainingTimer.restart();
    svm->trainAuto(td);
    qDebug() << "Done training dataset in: " << (float)trainingTimer.elapsed() / 1000.0f;

Thanks.

Solution

It turns out the answer is in the OpenCV sample train_HOG.cpp on GitHub.

It looks like this (note that for a linear kernel, OpenCV 3 compresses all support vectors into a single vector with a single alpha of 1, which is what the asserts below check):

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>
#include <cstring>   // for memcpy

using namespace cv;
using namespace cv::ml;
using namespace std;

/// Get the SVM detector in HOGDescriptor format: [weights..., bias]
vector<float> getSVMDetector(const Ptr<SVM>& svm)
{
    // get the support vectors
    Mat sv = svm->getSupportVectors();
    const int sv_total = sv.rows;
    // get the decision function
    Mat alpha, svidx;
    double rho = svm->getDecisionFunction( 0, alpha, svidx );

    CV_Assert( alpha.total() == 1 && svidx.total() == 1 && sv_total == 1 );
    CV_Assert( (alpha.type() == CV_64F && alpha.at<double>(0) == 1.) ||
               (alpha.type() == CV_32F && alpha.at<float>(0) == 1.f) );
    CV_Assert( sv.type() == CV_32F );

    // Build the detector: sv.cols weights followed by the bias (-rho)
    vector< float > hog_detector( sv.cols + 1 );
    memcpy( &hog_detector[0], sv.ptr(), sv.cols*sizeof( hog_detector[0] ) );
    hog_detector[sv.cols] = (float)-rho;
    return hog_detector;
}
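
Once you have that vector, wiring it into HOG mirrors the OpenCV 2 flow from the question (a sketch: frame stands in for your input cv::Mat, and winSize must match the window the SVM was trained on):

    // Set the custom detector and run multi-scale detection
    cv::HOGDescriptor hog;
    hog.winSize = cv::Size(64, 128);          // training window size (assumption)
    hog.setSVMDetector(getSVMDetector(svm));

    std::vector<cv::Rect> found;
    hog.detectMultiScale(frame, found);       // frame: your input image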

Answered By – PhilBot

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
