
I am working with Spark SQL in Scala. I have a requirement where I need to divide the counts from two queries.

Query 1 - select count(1) from table_1 where flag = 'Y'

Query 2 - select count(1) from table_2 where flag = 'N'

Now, I need to divide the count from query 1 by the count from query 2.

val divideValue = sqlContext.sql("
SELECT count(*) FROM table_1 where y != 'yes'/SELECT count(*) FROM table_2 where y = 'yes'
")

The above doesn't work. Please suggest the actual query.

philantrovert
Rathish MK

2 Answers


Check this.

Count a unique column (e.g. id) and use a self join; that way we can get both counts and their ratio in a single query:

select count(distinct t1.id) as Y_count,
       count(distinct t2.id) as N_count,
       count(distinct t1.id) / count(distinct t2.id) as divideCount
from   #table t1, #table t2
where  t1.flag = 'Y' and t2.flag = 'N'
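Note that `#table` is SQL Server temp-table syntax and this self join reads a single table, while the question involves two tables. A hedged sketch of the same idea as one Spark SQL statement, assuming the table and column names from the question (`table_1`, `table_2`, `flag`): compute each count in a subquery and cross join the two one-row results.

```sql
-- Each subquery returns exactly one row, so the cross join is cheap.
-- In Spark SQL, `/` performs fractional division, so the ratio is not
-- truncated even though both counts are integers.
SELECT a.y_count,
       b.n_count,
       a.y_count / b.n_count AS divide_count
FROM   (SELECT count(*) AS y_count FROM table_1 WHERE flag = 'Y') a
CROSS JOIN
       (SELECT count(*) AS n_count FROM table_2 WHERE flag = 'N') b
```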
Mr. Bhosale

You can try this:

val count1 = sqlContext.sql("SELECT count(*) FROM table_1 WHERE y != 'yes'")
val count2 = sqlContext.sql("SELECT count(*) FROM table_2 WHERE y = 'yes'")
val value1 = count1.head().getLong(0)
val value2 = count2.head().getLong(0)
// value1 and value2 are Longs; convert one to Double so the division
// is not truncated to a whole number
val finalValue = value1.toDouble / value2
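The same result can be obtained without writing SQL strings at all. A minimal sketch using the DataFrame API, assuming the registered table names and the `flag` column from the original queries (and guarding against a zero denominator, which the SQL-string version does not do):

```scala
// count() triggers a job per table and returns a Long
val yCount = sqlContext.table("table_1").filter("flag = 'Y'").count()
val nCount = sqlContext.table("table_2").filter("flag = 'N'").count()

// toDouble avoids Long truncation; guard against division by zero
val ratio = if (nCount != 0) yCount.toDouble / nCount else Double.NaN
```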
Naman Agarwal