I'm trying to use the Spark Java API, but I'm getting a build path error in Eclipse due to Spark:

Archive for required library: 'C:/Users/IBM_ADMIN/Documents/spark-1.6.2-bin-hadoop2.6/lib/spark-assembly-1.6.2-hadoop2.6.0.jar' in project 'oplservice' cannot be read or is not a valid ZIP file

I'm using Eclipse SDK Version: 4.2.2 Build id: M20130204-1200.

I'm trying to use spark-1.6.2-bin-hadoop2.6, which I just downloaded (but I had the same problem with spark-v1.6.0-hadoop2.6).

Here is my build path setup in Eclipse:

I've checked a bunch of other posts about similar issues (e.g. close and reopen the project, run Eclipse -clean, etc.), but none of them has solved the problem. I haven't changed anything about the Spark jar or my Eclipse settings.
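One way to rule Eclipse out entirely is to try opening the archive with plain `java.util.zip.ZipFile`; a truncated or corrupted download fails there too. A minimal sketch (the default path is illustrative, pass your own as the first argument):

```java
import java.io.IOException;
import java.util.zip.ZipFile;

public class JarCheck {
    // Returns true if the file can be opened as a ZIP/JAR archive.
    static boolean isReadableJar(String path) {
        try (ZipFile zip = new ZipFile(path)) {
            return true; // archive opened and central directory parsed
        } catch (IOException e) {
            return false; // missing, truncated, or corrupted file lands here
        }
    }

    public static void main(String[] args) {
        // Adjust to the jar from the error message, or pass it as an argument.
        String path = args.length > 0 ? args[0]
                : "spark-assembly-1.6.2-hadoop2.6.0.jar";
        System.out.println(isReadableJar(path)
                ? "jar is readable"
                : "jar cannot be read or is not a valid ZIP file");
    }
}
```

If this reports the jar as unreadable, the problem is the file itself (a bad download or a broken extraction), not the Eclipse build path.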

This shouldn't be that difficult, should it?

J. Bloom
  • Perhaps check these: http://stackoverflow.com/questions/19526072/archive-for-required-library-could-not-be-read-or-is-not-a-valid-zip-file (See second answer there) and http://stackoverflow.com/questions/8857985/compiler-error-archive-for-required-library-could-not-be-read-spring-tool-su – Bajal Jul 02 '16 at 03:19
  • Using Maven to manage a Spark project should be the easiest approach. [An example](https://github.com/rockiey/explore-spark) – Rockie Yang Jul 02 '16 at 06:44
  • Thank you for your responses. I tried most of the suggestions without success. I have come to conclude that the JAR in question, spark-assembly-1.6.2-hadoop2.6.0.jar, is in fact corrupted. Windows gives an "invalid or corrupt jarfile" error when it tries to open it with the Java Virtual Machine launcher. Now the question is how it became corrupted, since I downloaded it directly from the Apache site and unpacked it with 7-Zip. @Bajal – J. Bloom Jul 03 '16 at 03:27
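As Rockie Yang's comment suggests, letting Maven resolve Spark sidesteps the hand-copied assembly jar entirely, since Maven downloads and verifies the artifacts itself. A minimal sketch of the dependency for `pom.xml` (the `_2.10` artifact is the standard Scala build for Spark 1.6.x):

```xml
<!-- Maven fetches spark-core and its transitive dependencies,
     so no manual jar needs to go on the Eclipse build path -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
```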

1 Answer

Apparently this is a bug in Eclipse Juno. Since I switched to Eclipse Neon, I've had no problem with the Spark jars.

J. Bloom