I have a dataframe like this:

| id | prodId |    date    | value |
|----|--------|------------|-------|
| 1  |   a    | 2015-01-01 | 100   |
| 2  |   a    | 2015-01-02 | 150   |
| 3  |   a    | 2015-01-03 | 120   |
| 4  |   b    | 2015-01-01 | 100   |

and I would like to group by `prodId` and sum `value` over ranges of dates. In other words, I need to build a table with the following columns:

  • prodId
  • val_1: sum value if date is between date1 and date2
  • val_2: sum value if date is between date2 and date3
  • val_3: same as before etc.

| prodId | val_1 (01-01 to 01-02) | val_2 (01-03 to 01-04) |
|--------|------------------------|------------------------|
|   a    |          250           |          120           |
|   b    |          100           |           0            |

Is there any predefined aggregate function in Spark that allows doing conditional sums? Would you recommend developing an aggregate UDF (if so, any suggestions)? Thanks a lot!


1 Answer

First let's recreate the example dataset:

import org.apache.spark.sql.functions.to_date

val df = sc.parallelize(Seq(
  (1, "a", "2015-01-01", 100), (2, "a", "2015-01-02", 150),
  (3, "a", "2015-01-03", 120), (4, "b", "2015-01-01", 100)
)).toDF("id", "prodId", "date", "value").withColumn("date", to_date($"date"))

val dates = List(("2015-01-01", "2015-01-02"), ("2015-01-03", "2015-01-04"))

All you have to do is something like this:

import org.apache.spark.sql.functions.{when, lit, sum}

val exprs = dates.map{
  case (x, y) => {
    // Create label for a column name
    val alias = s"${x}_${y}".replace("-", "_")
    // Convert strings to dates
    val xd = to_date(lit(x))
    val yd = to_date(lit(y))
    // Generate expression equivalent to
    // SUM(
    //   CASE
    //     WHEN date BETWEEN ... AND ... THEN value
    //     ELSE 0
    //   END
    // ) AS ...
    // for each pair of dates.
    sum(when($"date".between(xd, yd), $"value").otherwise(0)).alias(alias)
  }
}

df.groupBy($"prodId").agg(exprs.head, exprs.tail: _*).show

// +------+---------------------+---------------------+
// |prodId|2015_01_01_2015_01_02|2015_01_03_2015_01_04|
// +------+---------------------+---------------------+
// |     a|                  250|                  120|
// |     b|                  100|                    0|
// +------+---------------------+---------------------+
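If you are on Spark 1.6 or later, a `pivot` can produce the same shape — a sketch, not part of the original answer, assuming the same two fixed ranges; rows that match no range are dropped by the `when` labeling, and prodId/range combinations with no rows come back as `null`, hence the `na.fill(0)`:

```scala
import org.apache.spark.sql.functions.{lit, to_date, when}

// Label each row with the name of the date range it falls into
val labeled = df.withColumn("range",
  when($"date".between(to_date(lit("2015-01-01")), to_date(lit("2015-01-02"))), "val_1")
    .when($"date".between(to_date(lit("2015-01-03")), to_date(lit("2015-01-04"))), "val_2"))

// Pivot on the range label; passing the values explicitly avoids an extra
// pass over the data to discover them
labeled.groupBy($"prodId")
  .pivot("range", Seq("val_1", "val_2"))
  .sum("value")
  .na.fill(0)
  .show
```

The `map`-over-`dates` approach in the answer is more flexible for arbitrary overlapping ranges, while `pivot` only works when each row belongs to at most one bucket.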