
Given Scala's strong type system, I had an ambitious project which I'm now about to abandon, because the effort-to-usefulness ratio seems to be too high.

Basically I have some graph elements (GEs) that correspond to sound processes carried out at a given calculation rate. Graph elements are composed of other graph elements, which form their inputs. Now there are rather arbitrary constraints on the inputs' rates. In the source language (SuperCollider) the rates are checked at runtime, naturally, since it's a dynamically typed language. I wanted to see whether I can enforce the check at compile time.

Some constraints are fairly simple and can be expressed in the form "rate of arg1 must be at least as high as rate of arg2". But others get intricate, e.g.

"if arg0's rate is 'demand', args1's rate must be either 'demand' or 'scalar' or equal to the enclosing GE's rate".

The question is: Should I give up on this? Here is how it looks with runtime check:

sealed trait Rate
case object demand  extends Rate
case object audio   extends Rate
case object control extends Rate
case object scalar  extends Rate

trait GE { def rate: Rate }

// an example GE:
case class Duty(rate: Rate, in0: GE, in1: GE) extends GE {
  // constraint: if in0 runs at demand rate, in1 must run at demand rate,
  // scalar rate, or the rate of the enclosing Duty itself
  def checkRates(): Unit =
    require(in0.rate != demand || in1.rate == demand ||
            in1.rate == scalar || in1.rate == rate)
}
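
For illustration, the check only fires when checkRates() is actually invoked at runtime (Dummy is a made-up GE used only in this snippet):

// a trivial GE that carries nothing but its rate, for demonstration
case class Dummy(rate: Rate) extends GE

Duty(audio, Dummy(audio),  Dummy(control)).checkRates() // fine: in0 is not demand-rate
Duty(audio, Dummy(demand), Dummy(control)).checkRates() // throws IllegalArgumentException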

And, in contrast, how it could look with type parameters for the rates:

sealed trait Rate
trait audio   extends Rate
trait demand  extends Rate
trait control extends Rate
trait scalar  extends Rate

trait GE[R <: Rate]

object Duty {
  trait LowPri {
    implicit def con1[R, T]: RateCons[R, audio  , T] = new ConImpl[R, audio  , T]
    implicit def con2[R, T]: RateCons[R, control, T] = new ConImpl[R, control, T]
    implicit def con3[R, T]: RateCons[R, scalar , T] = new ConImpl[R, scalar , T]

    implicit def con4[R]: RateCons[R, demand , demand] =
      new ConImpl[R, demand, demand]

    implicit def con5[R]: RateCons[R, demand , scalar] =
      new ConImpl[R, demand, scalar]
  }
  object RateCons extends LowPri {
    implicit def con6[R]: RateCons[R, demand, R] = new ConImpl[R, demand, R]
  }
  sealed trait RateCons[R, S, T]
  private class ConImpl[R, S, T] extends RateCons[R, S, T]

  def ar[S <: Rate, T <: Rate](in0: GE[S], in1: GE[T])(
    implicit cons: RateCons[audio, S, T]) = apply[audio, S, T](in0, in1)

  def kr[S <: Rate, T <: Rate](in0: GE[S], in1: GE[T])( 
    implicit cons: RateCons[control, S, T]) = apply[control, S, T](in0, in1)
}
case class Duty[R <: Rate, S <: Rate, T <: Rate](in0: GE[S], in1: GE[T])(
  implicit con: Duty.RateCons[R, S, T]) extends GE[R]

Tests:

def allowed(a: GE[demand], b: GE[audio], c: GE[control], d: GE[scalar]): Unit = {
  Duty.ar(b, c)
  Duty.kr(b, c)
  Duty.ar(b, a)
  Duty.ar(b, d)
  Duty.ar(a, b)
  Duty.kr(a, c)
}

def forbidden(a: GE[demand], b: GE[audio], c: GE[control], d: GE[scalar]): Unit = {
  Duty.kr(a, b)
  Duty.ar(a, c)
}
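
Both forbidden calls are then rejected at compile time with the usual failed implicit search; the message looks roughly like this (exact wording depends on the compiler version):

// Duty.kr(a, b)
// error: could not find implicit value for parameter cons:
//        Duty.RateCons[control,demand,audio]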

A path worth pursuing? Three more things that speak against it, apart from the code bloat:

  • There are probably a couple of dozen GEs which would need custom constraints
  • Composing GEs becomes increasingly difficult: code might need to pass around dozens of type parameters
  • Transformations might become difficult, e.g. imagine a List[GE[_ <: Rate]].map( ??? ). I mean, how would Duty.RateCons translate to TDuty.RateCons (where TDuty is a different GE)? A sketch of the problem follows this list.
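
To make the last point concrete (retrigger is just a made-up name): once the elements sit in a plain list, their concrete rates are hidden behind an existential, and no RateCons evidence can be recovered for them:

def retrigger(ges: List[GE[_ <: Rate]]): List[GE[_ <: Rate]] =
  ges.map { ge =>
    // ge: GE[_ <: Rate] -- its concrete rate is an unknown type here, so
    // the compiler cannot find any Duty.RateCons[audio, ?, ?] for it; a call
    // like Duty.ar(ge, ge) is rejected, and the same would hold for TDuty
    ???
  }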

I have already invested quite a bit of time in this project, which is why I'm reluctant to give up so easily. So... convince me that I'm doing something useful here, or tell me that I should go back to the dynamically checked version.

0__
  • I'd also be interested in some resource that gives general advice on which kind of assertions can practically be statically checked. Something like design patterns. I'd prefer it to be for Scala, but not necessarily. – ziggystar Apr 06 '11 at 12:48
  • @ziggystar I found the following useful: http://michid.wordpress.com/2008/10/29/meta-programming-with-scala-conditional-compilation-and-loop-unrolling/ ; http://apocalisp.wordpress.com/2010/06/13/type-level-programming-in-scala-part-3-boolean/ ; http://jnordenberg.blogspot.com/2009/09/type-lists-and-heterogeneously-typed.html ; Taking this statement from the first link: "This means that things like type sets (which would be a useful construct) are virtually impossible to create in Scala." -- makes me think the answer to my question is: stick with the dynamic approach... – 0__ Apr 06 '11 at 14:14
  • My statement was that sets of arbitrary types are (currently) impossible to create in Scala, but if you have control over the types in the set you can create a type level equality function which can be used to create a set type. – Jesper Nordenberg Apr 07 '11 at 14:11
  • @jesper OK, I will probably need to learn more about type-level programming. For now, I have given up and removed all the rate types again, and suddenly everything went very painless and smooth again, so I kind of feel I did the right thing, although maybe in theory there could be more type safety. – 0__ Apr 07 '11 at 14:33

1 Answer


As mentioned by Jesper Nordenberg, the thing to do is to define a closed set of types and an equality operation over those types. If you do revisit this problem, an example of how you solved it would be welcome, as would an example of the kind of type-level programming the question calls for.

Read more here and here.
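
To give an idea of what such an equality operation could look like for the rate types in the question, here is a minimal sketch (SameRate and mix are names made up for this answer, not taken from the linked posts):

// Evidence that two rate types are identical; the compiler can only
// construct it via refl, i.e. when A and B unify to the same type.
trait SameRate[A <: Rate, B <: Rate]
object SameRate {
  implicit def refl[A <: Rate]: SameRate[A, A] = new SameRate[A, A] {}
}

// compiles for mix(x: GE[audio], y: GE[audio]),
// rejected for mix(x: GE[audio], y: GE[control])
def mix[A <: Rate, B <: Rate](a: GE[A], b: GE[B])(
  implicit ev: SameRate[A, B]): GE[A] = a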

troutwine