I'm using CDH 5.4 and I want to use Spark, but I can't enable it. I get this error: "The dependency is not satisfied for editing SPARK (0.9.0-1.cdh4.6.0.p0.98): CDH (lower than 5.0)". Also, is it normal that the latest version of the Spark parcel is 0.9 (http://archive.cloudera.com/spark/parcels/latest/)? Thanks in advance.
0
-
You are using CDH 4.6 parcels with CDH 5.4? How is that supposed to work? – eliasah Jul 10 '15 at 15:00
-
I installed the latest version of Cloudera using Cloudera Manager, which was CDH 5.4 – Cleo Jul 15 '15 at 08:24
-
Because before 5.0 you needed to install Spark using the parcel! In 5.4 you can just install Spark as a service! – eliasah Jul 15 '15 at 08:40
-
Yes, it is already listed in my list of services, but how can I execute a Spark job? Is there a user web interface (like Hue, for example)? – Cleo Jul 15 '15 at 09:01
-
I'm not sure whether Hue supports launching Spark jobs. If you need to know how to launch a Spark job, I suggest that you read the official documentation. Basically, you need to run the spark-submit command to submit a job to a cluster. – eliasah Jul 15 '15 at 09:09
-
When I click on the Spark service, I'm redirected to a URL on port 18088 and I see this message: "No completed applications found! Did you specify the correct logging directory? Please verify your setting of spark.history.fs.logDirectory and whether you have the permissions to access it. It is also possible that your application did not run to completion or did not stop the SparkContext." – Cleo Jul 15 '15 at 12:31
-
I'm sorry, but with the little information you are giving us, your question isn't salvageable. You need to describe clearly what you are doing; we can't read minds. – eliasah Jul 15 '15 at 12:39
-
My question is quite simple: how can I launch a simple example job using Spark in Cloudera (CDH 5.4)? Where do I have to write my code? – Cleo Jul 15 '15 at 12:48
-
And I answered you! If you don't know where to write Scala/Java/Python code, that's another problem! – eliasah Jul 15 '15 at 12:49
-
Weird help! Thanks anyway! – Cleo Jul 15 '15 at 12:55
-
Go to chat: http://chat.stackoverflow.com/rooms/info/83328/apache-spark?tab=general – eliasah Jul 15 '15 at 12:59
2 Answers
2
Spark used to be packaged separately; that is the Spark parcel you are seeing, and also the contents of http://archive.cloudera.com/spark/parcels/latest/.
However, CDH 5.4 comes bundled with Spark, so you do not need to install that parcel. It is a bit confusing that it is still listed.
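Since Spark ships with CDH 5.4, a job is submitted from a shell with spark-submit, as mentioned in the comments. A minimal sketch using the SparkPi example that is bundled with the parcel; the jar path and the `yarn-client` master are assumptions for a default parcel-based install and may differ on your cluster:

```shell
# Submit the bundled SparkPi example to YARN.
# The jar path below is an assumption for a parcel install;
# adjust it to wherever spark-examples.jar lives on your nodes.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-client \
  /opt/cloudera/parcels/CDH/lib/spark/lib/spark-examples.jar 10
```

For your own Scala program, you would build a jar, then point `--class` at your main class and pass your jar instead of the example jar. Once the job runs to completion, it should also show up in the history server on port 18088.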

dpeacock
- 2,697
- 13
- 16
-
How can I run a Scala program in CDH 5.4? I built a jar and put my input file in HDFS, but I haven't found how to run the application yet. Is it done in Hue, like running a MapReduce job, or with spark-submit in a shell? – Cleo Jul 21 '15 at 08:23
-
@Cleo Just click 'Add a Service', scroll down and you will see the Spark service. Then follow the Add Service Wizard. – wangjunwww Nov 10 '16 at 22:51
0
I resolved the problem by editing hue.ini; I added these two lines:
[desktop]
app_blacklist=
The second step is to start the Livy server ;)
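To spell out the configuration in this answer: an empty `app_blacklist` means Hue hides no apps, which makes the Spark notebook app visible; the notebook then talks to Spark through the Livy server. A sketch of the relevant hue.ini sections; the `[spark]` section with its host and port is an assumption based on Livy's default port 8998 and may not match your Hue version:

```ini
[desktop]
# Empty blacklist: no apps are hidden, so the Spark app shows up in Hue.
app_blacklist=

[spark]
# Where the Hue Spark app expects the Livy server (assumed defaults).
livy_server_host=localhost
livy_server_port=8998
```

After editing the file, restart Hue and make sure a Livy server is actually running on the configured host and port before opening the Spark app.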

Cleo
- 11
- 6