
In Java I am adding the Maven dependency below:

<dependency>
    <groupId>cloudant-labs</groupId>
    <artifactId>spark-cloudant</artifactId>
    <version>2.0.0-s_2.11</version>
</dependency>

but the package is not loading, and pom.xml shows the following error:

Missing artifact cloudant-labs:spark-cloudant:jar:2.0.0-s_2.11

Can anyone help me understand why this is causing an issue?

I am able to add other Maven dependencies, but this one in particular is not working.

Vimal Dhaduk

2 Answers


It's not in the official Maven repository (http://search.maven.org/#search%7Cga%7C1%7Cspark-cloudant).

But when you check https://mvnrepository.com/artifact/cloudant-labs/spark-cloudant/2.0.0-s_2.11 there is a note:

Note: this artifact is located at the Spark Packages repository (https://dl.bintray.com/spark-packages/maven/)

So you will need to add the following to your pom.xml:

<repositories>
    <repository>
      <id>bintray</id>
      <name>bintray.com</name>
      <url>https://dl.bintray.com/spark-packages/maven/</url>
    </repository>
</repositories>

EDIT:

According to https://spark.apache.org/news/new-repository-service.html

Bintray, the original repository service used for https://spark-packages.org/, is in its sunset process, and will no longer be available from May 1st. To consume artifacts from the new repository service, please replace “dl.bintray.com/spark-packages/maven” with “repos.spark-packages.org” in the Maven pom files or sbt build files in your repositories.

So this should work:

<repositories>
    <repository>
      <id>spark-packages</id>
      <name>Spark Packages</name>
      <url>https://repos.spark-packages.org</url>
    </repository>
</repositories>
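
With that repository declared, the dependency from the question should then resolve. A minimal combined pom.xml fragment (a sketch assembled from the snippets above; the `spark-packages` id is just a label of my choosing):

```xml
<!-- Sketch: repository for Spark Packages artifacts plus the dependency from the question -->
<repositories>
    <repository>
      <id>spark-packages</id>
      <name>Spark Packages</name>
      <url>https://repos.spark-packages.org</url>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>cloudant-labs</groupId>
        <artifactId>spark-cloudant</artifactId>
        <version>2.0.0-s_2.11</version>
    </dependency>
</dependencies>
```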
VladoDemcak
  • How to add the package while submitting with Spark? – Vimal Dhaduk Dec 02 '16 at 07:00
  • ./spark-submit --packages cloudant-labs:spark-cloudant:2.0.0-s_2.11 --class spark.cloudant.connecter.cloudantconnecter --master local[*] /opt/demo/sparkScripts/ScoredJob/sparkcloudantconnecter.jar – Vimal Dhaduk Dec 02 '16 at 07:01
  • Why I got `Forbidden!` when trying to access `https://dl.bintray.com/spark-packages/maven/`? – wawawa May 10 '21 at 09:12
  • @Cecilia https://spark.apache.org/news/new-repository-service.html > Bintray, the original repository service used for https://spark-packages.org/, is in its sunset process, and will no longer be available from May 1st. To consume artifacts from the new repository service, please replace “dl.bintray.com/spark-packages/maven” with “repos.spark-packages.org” in the Maven pom files or sbt build files in your repositories. – VladoDemcak May 10 '21 at 10:23
  • Hi @VladoDemcak Thank you for this important info, it makes sense, I've been hitting `dependency not found issue`. The thing is that I'm using AWS Glue to run pyspark script and Glue downloads the dependency automatically, so does that mean it should be AWS side to change the location to the new repository service? Is there anything I can do on my side? Thanks. – wawawa May 10 '21 at 10:55
  • 1
    @Cecilia if you dont have access to change the url, then probably you will need to wait for the fix. – VladoDemcak May 10 '21 at 12:27
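
For the spark-submit question in the comments: `--repositories` is a standard spark-submit option, so the same repository switch can be made on the command line. A sketch, reusing the paths and class name from the comment above (they may differ on your machine):

```shell
# Sketch: point spark-submit at the new Spark Packages repository
./spark-submit \
  --repositories https://repos.spark-packages.org \
  --packages cloudant-labs:spark-cloudant:2.0.0-s_2.11 \
  --class spark.cloudant.connecter.cloudantconnecter \
  --master 'local[*]' \
  /opt/demo/sparkScripts/ScoredJob/sparkcloudantconnecter.jar
```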

Check your Maven repo to verify that the artifact name and version match what you've specified. Most Maven repos give you an example snippet to copy/paste.

For example, Sonatype Nexus is a repo that I use; it lets you search and grab snippets so you never have to worry about typing things wrong.
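
One quick way to check the coordinates is to ask Maven to fetch the artifact directly. A sketch using the standard `dependency:get` goal of the maven-dependency-plugin (the repository URL is the Spark Packages one mentioned in the other answer):

```shell
# Sketch: verify that the coordinates resolve from the expected repository
mvn dependency:get \
  -Dartifact=cloudant-labs:spark-cloudant:2.0.0-s_2.11 \
  -DremoteRepositories=https://repos.spark-packages.org
```

If this command fails, the coordinates or the repository URL are wrong; if it succeeds, the problem is in how the pom is configured.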

RPD
  • See http://stackoverflow.com/questions/6111408/maven2-missing-artifact-but-jars-are-in-place – RPD Dec 01 '16 at 18:48