I am running Spark 1.1.0 with PySpark.
When I run this example, taken straight from the MLlib documentation:
from pyspark.mllib.regression import LabeledPoint
from pyspark.mllib.classification import SVMWithSGD
import array
data = [
LabeledPoint(0.0, [0.0]),
LabeledPoint(1.0, [1.0]),
LabeledPoint(1.0, [2.0]),
LabeledPoint(1.0, [3.0])
]
svm = SVMWithSGD.train(sc.parallelize(data))
svm.predict(array([1.0])) > 0
I get the following error:
TypeError Traceback (most recent call last)
<ipython-input-5-68bcda022b28> in <module>()
12
13 svm = SVMWithSGD.train(sc.parallelize(data))
---> 14 svm.predict(array([1.0])) > 0
TypeError: 'module' object is not callable
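For what it's worth, I can reproduce the same TypeError with no Spark involved at all, which makes me suspect the `import array` line rather than MLlib itself (this is just my minimal reproduction, not the documented example):

```python
import array  # binds the name 'array' to the stdlib *module*, not a function

try:
    array([1.0])  # calling the module itself raises TypeError
except TypeError as e:
    print(e)  # 'module' object is not callable
```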
What could be the issue?
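In case it helps narrow things down: the stdlib `array` module exposes its constructor as `array.array`, so the vector itself can be built fine; whether that (or a numpy array, which I assume the docs may have intended) is what `svm.predict` expects is exactly what I'm unsure about:

```python
import array

# stdlib usage: the constructor is array.array, with a typecode ('d' = double)
vec = array.array('d', [1.0])
print(list(vec))  # [1.0]
```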