
I am using a case class in a Scala (2.12.8) Apache Flink (1.9.1) application. When I run the code below, I get the following exception: Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V.

NOTE: I have added a default constructor as suggested in java.lang.NoSuchMethodException for init method in Scala case class, but that does not work in my case.

Here is the complete code:

package com.zignallabs
import org.apache.flink.api.scala._
/**
 * Implements a program that reads from an element list, transforms each element, and prints the output on the TaskManager.
 */

case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1) // No help when adding a default constructor as per https://stackoverflow.com/questions/51129809/java-lang-nosuchmethodexception-for-init-method-in-scala-case-class
}

object WordCount {
  def main(args: Array[String]): Unit = {
    // set up the execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment
    // get input data
    val input = env.fromElements(" one", "two", "three", "four", "five", "end of test")



    // *****   Line 31 throws the exception
    // Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    //  at com.zignallabs.AddCount.<init>(WordCount.scala:7)
    //  at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
    //  at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
    //  at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
    //  at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
    //  at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
    //  at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
    //  at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
    //  at java.lang.Thread.run(Thread.java:748)
    val transform = input.map { w => AddCount(w, 1) }  // <- throws the exception

    // execute and print result
    println(transform)
    transform.print()
    transform.printOnTaskManager(" Word")

    env.execute()
  }
}

The runtime exception is:

    Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at com.zignallabs.AddCount.<init>(WordCount.scala:7)
    at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
    at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
    at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
    at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
    at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
    at java.lang.Thread.run(Thread.java:748)

I am building and running Flink locally, using a local Flink cluster with Flink version 1.9.1.

Here is the build.sbt file:

name := "flink191KafkaScala"

version := "0.1-SNAPSHOT"

organization := "com.zignallabs"

scalaVersion := "2.12.8"

val flinkVersion = "1.9.1"

//javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

val http4sVersion = "0.16.6"

resolvers ++= Seq(
  "Local Ivy" at "file:///"+Path.userHome+"/.ivy2/local",
  "Local Ivy Cache" at "file:///"+Path.userHome+"/.ivy2/cache",
  "Local Maven Repository" at "file:///"+Path.userHome+"/.m2/repository",
  "Artifactory Cache" at "https://zignal.artifactoryonline.com/zignal/zignal-repos"
)

val excludeCommonsLogging = ExclusionRule(organization = "commons-logging")

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",

  "org.apache.flink" %% "flink-clients" % "1.9.1",
  // Upgrade to flink-connector-kafka_2.11
  "org.apache.flink" %% "flink-connector-kafka-0.11" % "1.9.1",
  //"org.scalaj" %% "scalaj-http" % "2.4.2",
  "com.squareup.okhttp3" % "okhttp" % "4.2.2"
)

publishTo := Some("Artifactory Realm" at "https://zignal.artifactoryonline.com/zignal/zignal")

credentials += Credentials("Artifactory Realm", "zignal.artifactoryonline.com", "buildserver", "buildserver")

//mainClass in Compile := Some("com.zignallabs.StoryCounterTopology")
mainClass in Compile := Some("com.zignallabs.WordCount")

scalacOptions ++= Seq(
  "-feature",
  "-unchecked",
  "-deprecation",
  "-language:implicitConversions",
  "-Yresolve-term-conflict:package",
  "-language:postfixOps",
  "-target:jvm-1.8")


lazy val root = project.in(file(".")).configs(IntegrationTest)
  • I copy/pasted your code into IntelliJ, and it runs just fine for me. This suggests that the problem is elsewhere: maybe the build environment or the project settings. – David Anderson Jan 23 '20 at 08:37
  • Could you please paste the dependencies here and verify the version on the cluster? This error is in 99% of cases caused by a mismatch of Scala versions between the machine and Flink, or between the developer's machine and the cluster. – Dominik Wosiński Jan 23 '20 at 09:42
  • David Anderson and Dominik Wosiński, thanks for the reply. I have added the build.sbt file to the original post. What is the best way to clean the environment? – Mohammed-AR Jan 23 '20 at 22:08

2 Answers


If you want default arguments for the constructor of a case class, it's much more idiomatic Scala to define them like this:

case class AddCount ( firstP: String = "default", count: Int = 1)

This is syntactic sugar that, roughly speaking, gives you the following for free:

case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1)

  def this(firstP: String) = this(firstP, 1)

  def this(count: Int) = this("default", count)
}
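A quick sketch (not part of the original answer) of how the defaulted case class is used. Note one difference from the auxiliary-constructor version: because the first positional slot is a String, a lone Int has to be passed as a named argument.

```scala
// Sketch: using the case class with default arguments.
case class AddCount(firstP: String = "default", count: Int = 1)

object AddCountDemo {
  def main(args: Array[String]): Unit = {
    println(AddCount())          // AddCount(default,1) -- both defaults
    println(AddCount("word"))    // AddCount(word,1)    -- count defaulted
    println(AddCount(count = 5)) // AddCount(default,5) -- named argument required
  }
}
```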
– NateH06

I am now able to run this application using Scala 2.12. The issue was in the environment: I needed to ensure that no conflicting binaries were present, especially mixed Scala 2.11 and Scala 2.12 artifacts.
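For anyone hitting the same thing, here is a rough sketch of how to scan for leftover Scala 2.11 jars. The directories below are assumptions; point them at your Flink lib directory and whatever local caches your build uses (`~/.ivy2`, `~/.m2`, ...), and also make sure the Flink distribution you downloaded is the Scala 2.12 build rather than the 2.11 one.

```shell
#!/bin/sh
# Sketch: list any Scala 2.11 jars that could conflict with a 2.12 build.
# A clean Scala 2.12 setup should print nothing here.
scan_for_211_jars() {
  find "$@" -name '*_2.11*.jar' 2>/dev/null
}

# Assumed locations -- adjust for your installation.
scan_for_211_jars "${FLINK_HOME:-/opt/flink}/lib" "$HOME/.ivy2/cache" "$HOME/.m2/repository"
```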