
I am using the NaiveBayes classifier in Weka. There's something I have heard, and I'm not sure whether it's true: somebody told me that when I have numeric values in Weka, a higher value gets a higher weight. Is that right?

I mean, if the value of feature1 (which is numeric) is 1 for the first instance and 2 for the second instance, does that mean the second instance must have a higher weight on that feature?

If that's right, what should I do if I want to change it?

For example, I have defined a distance feature. But closer is better, which means a lower value for this feature should carry a higher weight.

How should I implement it in Weka?
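
For concreteness, the kind of transform I have in mind is something like the sketch below, using Weka's MathExpression filter (data.arff and the position of the distance attribute are just placeholders):

    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;
    import weka.filters.Filter;
    import weka.filters.unsupervised.attribute.MathExpression;

    public class InvertDistance {
        public static void main(String[] args) throws Exception {
            // Placeholder dataset; the class is assumed to be the last attribute.
            Instances data = new DataSource("data.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);

            // Map each distance d to 1/(d+1), so smaller distances become
            // larger values. "A" stands for the attribute's current value.
            MathExpression invert = new MathExpression();
            invert.setExpression("1/(A+1)");
            invert.setIgnoreRange("1");      // attribute range to target (here: the 1st)
            invert.setInvertSelection(true); // invert the "ignore" sense, so only that range is modified
            invert.setInputFormat(data);
            Instances transformed = Filter.useFilter(data, invert);
        }
    }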

  • What do you mean by "weight"? It means something specific in terms of classifiers, but it can't be what you mean here. – Sean Owen Mar 07 '14 at 09:22
  • @SeanOwen, when I have categorical values, Weka shows me something like this: cat 1: frequency 20, weight 20; cat 2: frequency 30, weight 30. I think this means the weight of a category is the same as its frequency. Is my understanding right? Now I am wondering how these weights work for numeric features. How does Weka weight the numeric values of one feature? Is there something like what it does with categorical features? Is value 2 of a feature weighted more than value 1? – user1419243 Mar 07 '14 at 09:58

1 Answer


If I understand you correctly, you are asking whether you need to transform your feature value so that the transformed value is proportional to the prediction. That depends on the learning algorithm you select, and it has nothing to do with Weka. For most learning algorithms, such as regression or decision trees, you don't need to apply such an inverse transform. Normally I would just leave the value as-is unless the algorithm requires some other normalization, such as rescaling or recentering to zero mean.
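
If the algorithm you pick does want rescaled inputs, a filter pass before training is enough. A minimal sketch (data.arff and the class position are assumptions on my part):

    import weka.classifiers.bayes.NaiveBayes;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;
    import weka.filters.Filter;
    import weka.filters.unsupervised.attribute.Standardize;

    public class StandardizeThenTrain {
        public static void main(String[] args) throws Exception {
            // Placeholder dataset; the class is assumed to be the last attribute.
            Instances data = new DataSource("data.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);

            // Rescale every numeric attribute to zero mean and unit variance.
            Standardize standardize = new Standardize();
            standardize.setInputFormat(data);
            Instances scaled = Filter.useFilter(data, standardize);

            // NaiveBayes estimates a per-class distribution for each numeric
            // attribute, so a larger raw value is not a larger "weight" anyway.
            NaiveBayes nb = new NaiveBayes();
            nb.buildClassifier(scaled);
        }
    }

Note that for NaiveBayes in particular this step is optional: shifting or rescaling a numeric attribute affects every class's density estimate the same way, so the predicted class posteriors don't change.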
