
I am trying to run the Fuzzy C-Means algorithm in Spark using Scala.

I'm following this repository:

https://github.com/acflorea/fuzzyCMeans

Screenshot of the error I am receiving:

img

I built the project with sbt and then ran it in spark-shell.

To resolve this, I have tried the following imports:

import scala.util.Random
import java.util.Random
import util.Random

but I am still getting the same error as in the screenshot above.

Any help in this regard would be appreciated. Thanks!

EDIT:

I have also imported the following:

org.apache.spark.mllib.clustering.KMeans

but I am still facing the same issue. Here is a detailed screenshot:

Screenshot of my work

Kindly assist. Thank you!

PS: I am using Spark 1.6.0 and Scala 2.10.5

2 Answers


The two constants (`RANDOM` and `K_MEANS_PARALLEL`) are defined in the object `org.apache.spark.mllib.clustering.KMeans`.

Be careful: I have not updated the package since Spark 1.6, hence Scala 2.10 :)
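To illustrate why a wildcard import resolves the error, here is a minimal, self-contained sketch. `KMeansLike` is a hypothetical object standing in for the real `KMeans` companion object (the string values shown match what Spark 1.6's `KMeans` source defines); the import mechanics are identical to `import org.apache.spark.mllib.clustering.KMeans._`:

```scala
// Hypothetical object mirroring how the KMeans companion object
// exposes its initialization-mode constants
object KMeansLike {
  val RANDOM = "random"
  val K_MEANS_PARALLEL = "k-means||"
}

// A wildcard import brings both constants into scope unqualified,
// which is what `import org.apache.spark.mllib.clustering.KMeans._` does
import KMeansLike._

println(RANDOM)            // prints "random"
println(K_MEANS_PARALLEL)  // prints "k-means||"
```

Without the wildcard (or an explicit `KMeans.RANDOM` qualification), the bare names `RANDOM` and `K_MEANS_PARALLEL` are simply not in scope, which produces the "not found" error in the screenshot.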

Enjoy!

Adrian Florea
  • In order to import everything from an object in Scala you should use: `import org.apache.spark.mllib.clustering.KMeans._` Enjoy! – Adrian Florea Nov 18 '17 at 19:18

In Scala,

import org.apache.spark.sql.functions.rand

`rand` is a function available in Spark SQL; importing it makes the `rand` function accessible.
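As a sketch of typical usage (assuming a spark-shell session where `df` is an existing DataFrame; the column name `"r"` is just an example):

```scala
import org.apache.spark.sql.functions.rand

// Append a column of uniform random values in [0, 1);
// passing a seed, e.g. rand(42), makes the column reproducible
val withRandom = df.withColumn("r", rand())
```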

  • Welcome to SO! It would be great if you could support your answer with some explanation – in your case, the reason to use `rand` from Spark SQL. – samkart Mar 20 '20 at 07:04