
My initial data structure contains self-references, which are not supported by Spark:

initial.toDF
java.lang.UnsupportedOperationException: cannot have circular references in class, but got the circular reference

The initial data structure:

case class FooInitial(bar:String, otherSelf:Option[FooInitial])
val initial = Seq(FooInitial("first", Some(FooInitial("i1", Some(FooInitial("i2", Some(FooInitial("finish", None))))))))

To fix this, a semantically similar and desired representation could be:

case class Inner(value: String)
case class Foo(bar: String, otherSelf: Option[Seq[Inner]])
val first = Foo("first", None)
val intermediate1 = Inner("i1") // was Foo("i1", None)
val intermediate2 = Inner("i2") // was Foo("i2", None)
val finish = Foo("finish", Some(Seq(intermediate1, intermediate2)))
val basic = Seq(first, finish)

basic.foreach(println)
val df = basic.toDF
df.printSchema
df.show
+------+------------+
|   bar|   otherSelf|
+------+------------+
| first|        null|
|finish|[[i1], [i2]]|
+------+------------+

What is a nice functional way to convert from the initial to the other non-self-referencing representation?

Georg Heiler

1 Answer


This recursively dereferences the objects:

import scala.annotation.tailrec
import scala.collection.mutable.ListBuffer

class MyCollector {
  val intermediateElems = new ListBuffer[Foo]

  def addElement(initialElement: FooInitial): MyCollector = {
    // Keep the outer element, then flatten its chain of self-references.
    intermediateElems += Foo(initialElement.bar, None)
    intermediateElems ++= addIntermediateElement(initialElement.otherSelf, ListBuffer.empty[Foo])
    this
  }

  @tailrec private def addIntermediateElement(intermediate: Option[FooInitial], acc: ListBuffer[Foo]): ListBuffer[Foo] =
    intermediate match {
      case None => acc
      case Some(s) =>
        acc += Foo(s.bar + "_inner", None)
        addIntermediateElement(s.otherSelf, acc)
    }
}

initial.foldLeft(new MyCollector)((collector, element) => collector.addElement(element)).intermediateElems.foreach(println)

The result is a List of:

Foo(first,None)
Foo(i1_inner,None)
Foo(i2_inner,None)
Foo(finish_inner,None)

which now works nicely with Spark.

NOTE: this is not exactly 1:1 with what I asked for initially, but it is good enough for me for now.
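A more functional alternative is also possible. The following is a minimal sketch (the helper flatten is my own naming, not part of the collector above); it assumes the same FooInitial and Foo case classes and produces the same flattened list without a mutable collector:

import scala.annotation.tailrec

// Hypothetical helper: tail-recursively unfolds one FooInitial chain
// into a flat List[Foo], using the same "_inner" naming as the collector.
def flatten(initialElement: FooInitial): List[Foo] = {
  @tailrec
  def loop(current: Option[FooInitial], acc: List[Foo]): List[Foo] =
    current match {
      case None    => acc.reverse
      case Some(f) => loop(f.otherSelf, Foo(f.bar + "_inner", None) :: acc)
    }
  Foo(initialElement.bar, None) :: loop(initialElement.otherSelf, Nil)
}

initial.flatMap(flatten).foreach(println)

This should print the same four Foo values, so toDF works on the result in the same way.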

Georg Heiler