
I am trying to take my input data:

A    B       C
--------------
4    blah    2
2            3
56   foo     3

And add a column to the end based on whether B is empty or not:

A    B       C     D
--------------------
4    blah    2     1
2            3     0
56   foo     3     1

I can do this easily by registering the input dataframe as a temp table, then typing up a SQL query.

But I'd really like to know how to do this with just Scala methods, without having to type out a SQL query within Scala.

I've tried .withColumn, but I can't get that to do what I want.
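The SQL route mentioned above might look something like this (a sketch on my part, assuming a Spark 1.x `SQLContext` named `sqlContext` and an input DataFrame `df`):

```scala
// register the DataFrame so it can be queried with SQL
df.registerTempTable("t")

// CASE WHEN encodes the "B is empty or not" flag as 0/1
val withD = sqlContext.sql(
  "SELECT A, B, C, CASE WHEN B IS NULL OR B = '' THEN 0 ELSE 1 END AS D FROM t")
```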

ZygD
mcmcmc

3 Answers


Try `withColumn` with the function `when`, as follows:

val sqlContext = new SQLContext(sc)
import sqlContext.implicits._ // for `toDF` and the $"col" syntax
import org.apache.spark.sql.functions._ // for `when`

val df = sc.parallelize(Seq((4, "blah", 2), (2, "", 3), (56, "foo", 3), (100, null, 5)))
    .toDF("A", "B", "C")

val newDf = df.withColumn("D", when($"B".isNull or $"B" === "", 0).otherwise(1))

`newDf.show()` displays:

+---+----+---+---+
|  A|   B|  C|  D|
+---+----+---+---+
|  4|blah|  2|  1|
|  2|    |  3|  0|
| 56| foo|  3|  1|
|100|null|  5|  0|
+---+----+---+---+

I added the (100, null, 5) row for testing the isNull case.

I tested this code with Spark 1.6.0, but as noted in the code of `when`, it works on versions 1.4.0 and later.
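For more than two outcomes, `when` calls can be chained before the final `otherwise` (a sketch on my part, not part of the answer above; the `-1` marker for nulls is an arbitrary choice):

```scala
import org.apache.spark.sql.functions._

// distinguish null, empty, and non-empty values of B
val graded = df.withColumn("D",
  when($"B".isNull, -1)     // hypothetical marker for null
    .when($"B" === "", 0)
    .otherwise(1))
```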

emesday
  • This is exactly what I was looking for. I tried a couple different things with `when` and `otherwise` but I guess I was getting the exact format wrong. Slightly off topic, but do you know how Spark handles withColumn? Like, if I'm adding ~20 columns, would it be faster to do 20 .withColumn and keep it a dataframe or to map it to an RDD and just add them all in the map then convert back to a dataframe to save to parquet? – mcmcmc Jan 21 '16 at 16:15
  • Just found [this](http://stackoverflow.com/questions/33826495/spark-scala-2-10-tuple-limit). I think UDFs are what I am looking for. – mcmcmc Jan 21 '16 at 16:26
  • why does this not work with if? `df.withColumn("D", if(df("B") == "") lit(0) else lit(1))` – user 923227 Apr 20 '17 at 23:19
  • @SumitKumarGhosh `df("B")` is a column. the condition `df("B") == ""` should never be true, because a column is not the same kind of object as a string. furthermore, the condition `df("B") == ""` is an all-or-nothing condition. it is not evaluated row-by-row, as i suspect you want. the `when`/`otherwise` syntax does the right thing, by contrast – Alex Grounds May 30 '17 at 16:22
  • Can we do 'in' query while applying when – Kalpesh Aug 27 '18 at 21:02
  • What if I am changing the values of two or more columns? Can we do that in single withColumn() method? – GadaaDhaariGeek Dec 27 '18 at 08:38
  • how to add the column simply if it does not exist? – Manoj Pandey Mar 04 '19 at 06:46
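On the follow-up comments above: an "in" query can be expressed with `isin` inside `when`, several derived columns are added by chaining `withColumn` calls (each returns a new DataFrame), and a column can be added only when missing by checking `df.columns` first. A sketch under those assumptions, reusing the `df` from the answer:

```scala
import org.apache.spark.sql.functions._

// "in" query inside when: 1 if B is one of the listed values, else 0
val withIn = df.withColumn("D", when($"B".isin("blah", "foo"), 1).otherwise(0))

// multiple derived columns: chain withColumn calls
val multi = df
  .withColumn("D", when($"B".isNull or $"B" === "", 0).otherwise(1))
  .withColumn("E", $"C" * 2)

// add the column only if it does not already exist
val safe = if (df.columns.contains("D")) df else df.withColumn("D", lit(0))
```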

My bad, I had missed one part of the question.

The best, cleanest way is to use a UDF. The explanation is in the code comments.

// create some example data as a DataFrame
// note: the third record has an empty string
case class Stuff(a: String, b: Int)
val d = sc.parallelize(Seq(("a", 1), ("b", 2),
    ("", 3), ("d", 4)).map { x => Stuff(x._1, x._2) }).toDF

// now the good stuff.
import org.apache.spark.sql.functions.udf
// UDF that returns 0 if the string is empty, 1 otherwise
val func = udf((s: String) => if (s.isEmpty) 0 else 1)
// create a new DataFrame with an added column named "notempty"
val r = d.select($"a", $"b", func($"a").as("notempty"))

scala> r.show
+---+---+--------+
|  a|  b|notempty|
+---+---+--------+
|  a|  1|       1|
|  b|  2|       1|
|   |  3|       0|
|  d|  4|       1|
+---+---+--------+
Ram Ghadiyaram
Roberto Congiu

How about something like this?

import org.apache.spark.sql.functions.when

val newDF = df.filter($"B" === "").take(1) match {
  case Array() => df  // no empty strings in B, so leave df unchanged
  case _ => df.withColumn("D", when($"B" === "", 0).otherwise(1))
}

Using take(1) to check for any empty value should have a minimal performance hit.

Justin Pihony