
I found that the result of LDA in OpenCV differs from that of other libraries. For example, the input data was

DATA (13 data samples with 4 dimensions)
 7    26     6    60
 1    29    15    52
11    56     8    20
11    31     8    47
 7    52     6    33
11    55     9    22
 3    71    17     6
 1    31    22    44
 2    54    18    22
21    47     4    26
 1    40    23    34
11    66     9    12
10    68     8    12

LABEL
 0     1     2     0     1     2     0     1     2     0     1     2     0

The OpenCV code is

#include <opencv2/opencv.hpp>
using namespace cv;

Mat data = (Mat_<float>(13, 4) <<
        7, 26, 6, 60,
        1, 29, 15, 52,
        11, 56, 8, 20,
        11, 31, 8, 47,
        7, 52, 6, 33,
        11, 55, 9, 22,
        3, 71, 17, 6,
        1, 31, 22, 44,
        2, 54, 18, 22,
        21, 47, 4, 26,
        1, 40, 23, 34,
        11, 66, 9, 12,
        10, 68, 8, 12);

Mat mean;
reduce(data, mean, 0, CV_REDUCE_AVG);   // column-wise mean (REDUCE_AVG in OpenCV 3+)
mean.convertTo(mean, CV_64F);           // LDA in OpenCV works in double precision

Mat label(data.rows, 1, CV_32SC1);
for (int i = 0; i < label.rows; i++)
    label.at<int>(i) = i % 3;           // labels 0, 1, 2, 0, 1, 2, ...

LDA lda(data, label);
Mat projection = lda.subspaceProject(lda.eigenvectors(), mean, data);
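
For reference, as far as I understand from the OpenCV sources, subspaceProject(W, mean, src) projects the mean-centered samples onto the columns of W. The sketch below should compute the same thing (manualProjection is my own name, not an OpenCV identifier):

Mat centered;
data.convertTo(centered, CV_64F);            // match the CV_64F mean
centered -= repeat(mean, data.rows, 1);      // subtract the mean from every row
Mat manualProjection = centered * lda.eigenvectors();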

The MATLAB code is below (using the Matlab Toolbox for Dimensionality Reduction); ingredients from the built-in hald dataset is the same 13x4 matrix listed above.

cd drtoolbox\techniques\
load hald
label=[0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2, 0]
[projection, trainedlda] = lda(ingredients, label)

The eigenvectors are

OpenCV (lda.eigenvectors())
0.4457    4.0132
0.4880    3.5703
0.5448    3.3466
0.5162    3.5794

Matlab Toolbox for Dimensionality Reduction (trainedlda.M)
0.5613    0.7159
0.6257    0.6203
0.6898    0.5884
0.6635    0.6262

Then the projections of the data are

OpenCV
1.3261    7.1276
0.8892   -4.7569
-1.8092   -6.1947
-0.0720    1.1927
0.0768    3.3105
-0.7200    0.7405
-0.3788   -4.7388
1.5490   -2.8255
-0.3166   -8.8295
-0.8259    9.8953
1.3239   -3.1406
-0.5140    4.2194
-0.5285    4.0001

Matlab Toolbox for Dimensionality Reduction
1.8030    1.3171
1.2128   -0.8311
-2.3390   -1.0790
-0.0686    0.3192
0.1583    0.5392
-0.9479    0.1414
-0.5238   -0.9722
1.9852   -0.4809
-0.4173   -1.6266
-1.1358    1.9009
1.6719   -0.5711
-0.6996    0.7034
-0.6993    0.6397

The eigenvectors and projections differ even though both LDAs were given the same data. I see two possibilities.

  1. One of the libraries is wrong.
  2. I am doing it wrong.

Thank you!

Hyunjun Kim
  • And how about the classes? How one could check the analyses you display? – ttnphns Aug 13 '15 at 15:28
  • What is the code you used? This isn't answerable in its present form. – gung - Reinstate Monica Aug 13 '15 at 15:52
  • How many classes are there? Can you give us the class labels? Are these matrices meant to be weight vectors for classification, or just vectors spanning the discriminative subspace? – A. Donda Aug 13 '15 at 18:28
  • Do I understand correctly that the original data are 4 class centroids in a 13-dimensional space? What about within-class covariances? – A. Donda Aug 13 '15 at 18:30
  • I made the question more clear thanks to your advice. Thank you ttnphns, gung, A. Donda! – Hyunjun Kim Aug 14 '15 at 01:16
  • Hyunjun, thanks, the question is much clearer now. Unfortunately, I don't know the answer. I would have thought that the two algorithms just produce two different eigenvector bases for the same 2-dimensional subspace, but that doesn't seem to be the case... – A. Donda Aug 14 '15 at 13:43

1 Answer


The difference is because the eigenvectors are not normalized. The L2-normalized eigenvectors are

OpenCV
0.44569   0.55196
0.48798   0.49105
0.54478   0.46028
0.51618   0.49230

Matlab Toolbox for Dimensionality Reduction
0.44064   0.55977
0.49120   0.48502
0.54152   0.46008
0.52087   0.48963

They look similar now, although the two libraries still report quite different eigenvalues.
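
A minimal sketch of this L2 normalization in OpenCV, assuming the lda object from the question (the eigenvectors are stored one per column):

Mat ev = lda.eigenvectors().clone();    // CV_64F, one eigenvector per column
for (int c = 0; c < ev.cols; c++) {
    Mat col = ev.col(c);                // header sharing data with ev
    col *= 1.0 / norm(col);             // cv::norm defaults to the L2 norm
}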

While PCA in OpenCV returns normalized eigenvectors, its LDA does not. My next question is: is normalizing the eigenvectors in LDA not necessary?

Hyunjun Kim