
I have a Spark Streaming app, and I recently reset the build path to clean it up. After re-importing all the jar files, I get a compiler error on the FlatMapFunction below that I've never seen before (screenshot omitted). How is this possible? How can I solve it?

static FlatMapFunction<Tuple2<String, String>, String> sentimentFunc = new FlatMapFunction<Tuple2<String, String>, String>() {
        private static final long serialVersionUID = 1L;

        @Override
        public Iterator<String> call(Tuple2<String, String> x) throws Exception {
            List<String> output = new ArrayList<String>();
            if (x._2 == null) {
                output.add("ERR");
                return output.iterator();
            }
            boolean like = false, sad = false, angry = false, hilarious = false, neutral = false;
            boolean[] sentiments = {like, angry, sad, hilarious, neutral};
            sentiments = checkEmojis(x, sentiments);
            if (checkSentiment(sentiments)) {
                output.add(setSentiment(sentiments));
                return output.iterator();
            }

            sentiments = checkText(x, sentiments);
            output.add(setSentiment(sentiments));
            return output.iterator();
        }
};
sirdan
  • There is an incompatibility between the method signature and the "return" statement in the method. But you haven't shown the latter so... Also please paste code as text, not as an image. – assylias Jun 14 '17 at 14:46
  • ok, I have updated the question. The code was working fine just five minutes ago, so I don't see how an incompatibility is possible – sirdan Jun 14 '17 at 14:48
  • my mistake, now it's the correct one – sirdan Jun 14 '17 at 14:52

1 Answer


In Spark 1.x, the return type of FlatMapFunction.call is Iterable<R>.

In Spark 2.x, the return type of FlatMapFunction.call was changed to Iterator<R>.

It seems that when you reset your build path, you pointed it at Spark 1.x jars instead of 2.x, which invalidates every FlatMapFunction implementation whose call method returns an Iterator.
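The mismatch can be illustrated without Spark on the classpath. The sketch below defines two stand-in interfaces mirroring (as an assumption, not the real `org.apache.spark.api.java.function` classes) the 1.x and 2.x shapes of `FlatMapFunction`; the only difference is the return type of `call`, which is exactly why a 2.x-style `@Override` fails to compile against 1.x jars:

```java
import java.io.Serializable;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class FlatMapSignatureDemo {

    // Stand-in for the Spark 1.x shape: call(...) returns an Iterable<R>.
    interface FlatMapFunctionV1<T, R> extends Serializable {
        Iterable<R> call(T t) throws Exception;
    }

    // Stand-in for the Spark 2.x shape: call(...) returns an Iterator<R>.
    interface FlatMapFunctionV2<T, R> extends Serializable {
        Iterator<R> call(T t) throws Exception;
    }

    public static void main(String[] args) throws Exception {
        List<String> words = Arrays.asList("LIKE", "SAD");

        // 1.x style: return the collection itself (an Iterable).
        FlatMapFunctionV1<List<String>, String> v1 = xs -> xs;

        // 2.x style: return an Iterator over the collection.
        FlatMapFunctionV2<List<String>, String> v2 = xs -> xs.iterator();

        // Swapping the jars swaps the expected return type of call(),
        // so an @Override written for one version no longer matches the other.
        if (!v1.call(words).iterator().next().equals("LIKE")) throw new AssertionError();
        if (!v2.call(words).next().equals("LIKE")) throw new AssertionError();
        System.out.println("both shapes behave the same at runtime");
    }
}
```

So the fix is not to change the method body but to restore the Spark 2.x jars on the build path (or, if you must stay on 1.x, change `call` to return `Iterable<String>` and return `output` directly).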

RealSkeptic
  • I've got this identical issue too: https://stackoverflow.com/questions/44373228/error-with-spark-nosuchmethoderror-scala-predef-conformslscala-predefles Changing the pom isn't enough. How can I solve this? – sirdan Jun 14 '17 at 16:05