I am trying to execute a SQL query through Spark SQL. In Oracle this query runs without problems, but in spark.sql I get an error:
Spark version: 0.2.1
SQL:
SELECT
  a.*,
  ( a.clc_1 + a.clc_2 + a.clc_3 ) / 3 AS arpu,
  CASE
    WHEN limit_max IS NULL THEN
      CASE
        WHEN break_$ < 1 OR break_$ IS NULL THEN 0.01
        ELSE break_$
      END
    WHEN limit_max < 1 OR limit_max IS NULL THEN 0.01
    ELSE limit_max
  END AS old_limit
FROM def.FEAUTRES_TEST_1 a
Could this be due to a syntax difference between Oracle SQL and Spark SQL? And how can it be fixed?
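My own guess is that the `$` character in the column name `break_$` is the problem, since Spark SQL is stricter than Oracle about unquoted identifiers. Spark SQL uses backticks for quoted identifiers, so I was considering rewriting the expression like this (the table and column names are from my query above):

```sql
SELECT
  a.*,
  ( a.clc_1 + a.clc_2 + a.clc_3 ) / 3 AS arpu,
  CASE
    WHEN limit_max IS NULL THEN
      CASE
        WHEN `break_$` < 1 OR `break_$` IS NULL THEN 0.01
        ELSE `break_$`
      END
    WHEN limit_max < 1 OR limit_max IS NULL THEN 0.01
    ELSE limit_max
  END AS old_limit
FROM def.FEAUTRES_TEST_1 a
```

Is backtick-quoting the right approach here, or is something else in the nested CASE incompatible with Spark SQL?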