
So, I have a task to do, but I need advice on how to do it. My data points are: 1, 2, 9, 6, 4, and I need to compute the distance between clusters. I need to use Euclidean distance.

My answer was: {1,1} = 0, {1,2} = 1, {1,9} = 8. Am I doing this correctly or not?

blockByblock

1 Answer


So you have 5 data points, right?

The formula works out like this:

√((1−1)²) = 0
√((1−2)²) = 1
√((1−9)²) = 8

...so yeah, you're right.

(image: Euclidean distance formula)
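For reference, here's a minimal Python sketch of the same calculation over the question's five points. The names (`points`, `euclidean_1d`) are my own, not from the original post:

```python
from itertools import combinations
from math import sqrt

points = [1, 2, 9, 6, 4]

# Euclidean distance in one dimension: sqrt((a - b)^2),
# which reduces to abs(a - b), as noted in the comments below.
def euclidean_1d(a, b):
    return sqrt((a - b) ** 2)

# Print the distance for every pair of points.
for a, b in combinations(points, 2):
    print(f"d({a}, {b}) = {euclidean_1d(a, b)}")
```

Running this confirms d(1, 2) = 1.0 and d(1, 9) = 8.0, matching the hand calculation above.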

  • Thank you. After this point, how do I find which clusters are merged into a new cluster? How can I do that? Do you have any idea? – blockByblock Oct 14 '16 at 18:54
  • @ipo how would k-nearest-neighbor work for clustering? – Has QUIT--Anony-Mousse Oct 14 '16 at 19:01
  • Note that the square root and the square are redundant; they reduce to simply `abs(a-b)` in the one-dimensional case. That's why Euclidean distance doesn't make particular sense on one-dimensional data. – Has QUIT--Anony-Mousse Oct 14 '16 at 19:03
  • There are different ways to do this, for example k-nearest-neighbor or the k-means algorithm. For k-nearest-neighbor, you calculate the Euclidean distance between all your data points and look where groups of k data points "accumulate" (you decide beforehand how big k is). For the k-means algorithm, you choose new points called prototypes and calculate the distance between these prototypes (minimum 2, because 1 makes no sense) and your data points. Whichever algorithm you choose, the result is nearly the same: each data point belongs to the cluster where the distance is lowest (a rough sketch of the k-means idea follows these comments). – idontknowwhoiamgodhelpme Oct 14 '16 at 19:12
  • @Anony-Mousse Yeah, you're right. In the normal case you've got more than 1 dimension for your data points; in that case you will need the square root. For 1 dimension it's senseless. – idontknowwhoiamgodhelpme Oct 14 '16 at 19:13
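To make the k-means idea from the comment above concrete, here's a rough sketch of Lloyd's-style k-means on the question's 1-D points, assuming k = 2. The function name `kmeans_1d` and the parameter choices are my own illustration, not part of the original discussion:

```python
import random

def kmeans_1d(points, k=2, iters=100, seed=0):
    rng = random.Random(seed)
    # Choose k initial prototypes (centroids) from the data points.
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        # (keep the old centroid if its cluster happens to be empty).
        new_centroids = [sum(c) / len(c) if c else centroids[i]
                         for i, c in enumerate(clusters)]
        if new_centroids == centroids:  # converged, no centroid moved
            break
        centroids = new_centroids
    return centroids, clusters

centroids, clusters = kmeans_1d([1, 2, 9, 6, 4], k=2)
print(centroids, clusters)
```

As the comment says, each point ends up in the cluster whose prototype is closest; with these five points the low values and the high values separate into two groups.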