I am trying to execute an SQL query through Spark SQL. In Oracle this query works without problems, but in spark.sql I get an error:

(error message attached as a screenshot, not reproduced here)

Spark version: 0.2.1

SQL:

SELECT
  a.*,
  ( a.clc_1 + a.clc_2 + a.clc_3 ) / 3 AS arpu,
  CASE
    WHEN limit_max IS NULL THEN
      CASE
        WHEN break_$ < 1 OR break_$ IS NULL THEN 0.01
        ELSE break_$
      END
    WHEN limit_max < 1 OR limit_max IS NULL THEN 0.01
    ELSE limit_max
  END AS old_limit
FROM def.FEAUTRES_TEST_1 a
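For what it's worth, the nested CASE above reduces to a simple fallback rule: use limit_max, fall back to break_$ when limit_max is NULL, then floor the result at 0.01 whenever it is NULL or below 1. A minimal Python sketch of that intended logic (the helper name old_limit and the parameter break_amount standing in for break_$ are my own, since $ is not a valid Python identifier):

```python
def old_limit(limit_max, break_amount):
    # Fall back from limit_max to break_amount (COALESCE-style),
    # then replace NULL or values below 1 with 0.01.
    value = limit_max if limit_max is not None else break_amount
    return 0.01 if value is None or value < 1 else value

print(old_limit(None, None))   # both NULL
print(old_limit(None, 0.5))    # break below 1
print(old_limit(0.5, 9.0))     # limit below 1
print(old_limit(3.0, None))    # limit usable
```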

Could this be due to a syntax difference? And how can it be fixed?

    And why did you tag this with oracle? I mean if it works in oracle, what can oracle guys do for you here? – gsalem Jul 19 '22 at 11:46
  • The answer seems to depend on your Spark version - see https://stackoverflow.com/questions/25157451/spark-sql-case-when-then?rq=1 – Frank Schmitt Jul 19 '22 at 13:14
  • (sorry for the dupe hammer, I forgot that it was tagged as Oracle which resulted in an insta-close by me. I've removed the Oracle tag, since the question really is not related to Oracle) – Frank Schmitt Jul 19 '22 at 13:16

0 Answers