From the Spark downloads page, if I download the tar file for v2.0.1, I see that it contains a `jars` folder with some jars that I find useful to include in my app.
If I download the tar file for v1.6.2 instead, I don't find that `jars` folder. Is there an alternate package type I should choose on that site? I am currently choosing the default (pre-built for Hadoop 2.6). Alternatively, where can I find those Spark jars - should I get each of them individually from http://spark-packages.org?
Here is an indicative list of the jars I want to use:
- hadoop-common
- spark-core
- spark-csv
- spark-sql
- univocity-parsers
- spark-catalyst
- json4s-core
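For context, the usual way to pick up jars like these is to declare them as build dependencies rather than extract them from the distribution tarball, since most of them are published to Maven Central. Below is a minimal `build.sbt` sketch; the exact version numbers (especially for `spark-csv`) are assumptions to be checked against Maven Central, and `spark-catalyst`, `univocity-parsers`, `hadoop-common`, and `json4s-core` should arrive transitively:

```scala
// build.sbt - minimal sketch; versions are assumptions, verify on Maven Central
scalaVersion := "2.10.6"  // Spark 1.6.x prebuilt binaries target Scala 2.10

libraryDependencies ++= Seq(
  // spark-core pulls in hadoop-common and json4s-core transitively
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  // spark-sql pulls in spark-catalyst transitively
  "org.apache.spark" %% "spark-sql"  % "1.6.2" % "provided",
  // spark-csv (external package for Spark 1.x) pulls in univocity-parsers
  "com.databricks"   %% "spark-csv"  % "1.5.0"
)
```

With `"provided"` scope the Spark jars are available at compile time but excluded from the assembled application jar, since the cluster supplies them at runtime.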