Pre-SIP: Custom dependent operation types

Introduction

Scala 3 supports various type-level operations via compiletime.ops.*. The great thing about them is that they compose well (e.g. val i: 1 + 2 * 3 = 7), just as we expect from the equivalent term-level operations. But we have two problems:

  1. What do we do when an operation we want isn’t supported?
  2. What do we do when we have a new custom type that we want to have type-level operations on?

There is a limit to how much we can hard-code into the compiler, so how do we achieve the required flexibility?
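For reference, here is the built-in behaviour we want to generalize (a minimal, self-contained snippet):

import scala.compiletime.ops.int.{+, *}

// The built-in ops compose with the expected precedence and are
// evaluated by the compiler, just like their term-level counterparts:
val i: 1 + 2 * 3 = 7
val j: (1 + 2) * 3 = 9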

Using Implicits

Here is a simple example, just for understanding the concept.

trait MyType:
  type Out <: Int

trait One extends MyType:
  type Out = 1

trait Two extends MyType: 
  type Out = 2

trait Plus[A <: MyType, B <: MyType] extends MyType:
  type Out <: Int

transparent inline given [A <: MyType, B <: MyType]: Plus[A, B] = ${someMacro}

If we don’t have internal support for the literal type addition, then we need to use a macro, but the macro is too complicated: it has to traverse the entire composition to evaluate the type expression and set the proper Out:
summon[One Plus Two Plus One]
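To see why, consider what someMacro has to do for the summon above (an illustration, not an implementation):

// What someMacro must do to resolve summon[One Plus Two Plus One]:
//   One Plus Two Plus One  ==  Plus[Plus[One, Two], One]
//     -> recurse into Plus[One, Two]
//          -> evaluate One (Out = 1) and Two (Out = 2), then compute Out = 3
//     -> evaluate the outer One (Out = 1), then compute Out = 4
// i.e. the macro reimplements a type-expression evaluator that the
// compiler already contains for compiletime.ops.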

All of this traversal and evaluation mechanism already exists within the compiler when it executes the compiletime.ops evaluation. So how can we utilize it to our benefit for our custom types and their operations?

Custom dependent operation types

To solve the problem, we introduce the following API:

package compiletime.ops

import scala.annotation.StaticAnnotation

final class customOp extends StaticAnnotation

erased trait CustomOp[T <: AnyKind, Args <: Tuple]:
  type Out

The compiler’s job is fairly simple. Whenever it evaluates a type T[A1, A2, A3, ...] where T carries a customOp annotation, it searches for the implicit CustomOp[T, (A1, A2, A3, ...)], after evaluating all type arguments of T in the same fashion. So the compiler evaluates the composition of all customOp types (and internally supported types) out of the box, and we only need to define the implicits.

Thus our example will change to the following:

import compiletime.ops.{customOp, CustomOp}
@customOp type One <: Int
given CustomOp[One, EmptyTuple] with
  type Out = 1

@customOp type Two <: Int
given CustomOp[Two, EmptyTuple] with
  type Out = 2

@customOp type Plus[A <: Int, B <: Int] <: Int
transparent inline given [A <: Int, B <: Int]: CustomOp[Plus, (A, B)] = ${simplerMacro}

Another great advantage is that instead of dragging the dependent type around through its Out member, we get the resulting type directly:
val x: One Plus Two Plus One = 4
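Spelling out the resolution of x according to the algorithm described above:

// One Plus Two Plus One  ==  Plus[Plus[One, Two], One]
//   One  -> CustomOp[One, EmptyTuple].Out = 1
//   Two  -> CustomOp[Two, EmptyTuple].Out = 2
//   Plus[One, Two]            -> CustomOp[Plus, (1, 2)].Out = 3
//   Plus[Plus[One, Two], One] -> CustomOp[Plus, (3, 1)].Out = 4
// so the compiler reduces the type of x to 4.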

Would love to get your feedback


Good proposal. shapesafe committer here; I would totally be a beneficiary if this were implemented.

IMHO the two packages follow different paradigms:

  1. Your singleton-ops is a lazy prover: it is entirely possible to define, e.g., type A = 1 + 3 == 5 without proving it. The proving macro is only invoked on demand. This implies that the type (1 + 3) =:= 5 is a falsum (in the context of the Curry-Howard isomorphism; in other words, upon summoning its instance the compiler always reports "implicit not found"), and even (1 + 3) <:< 5 is a falsum. They are equivalent but not identical. (BTW my shapesafe library follows the same paradigm)

  2. Scala 3 compiletime-ops is an eager evaluator: type (1 + 3) automatically becomes an alias of 4 REGARDLESS OF CONTEXT. (A sketch contrasting the two paradigms follows below.)
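A sketch of the contrast (Require is singleton-ops' on-demand prover; treat exact signatures as illustrative):

import singleton.ops.*

// Paradigm 1 (lazy prover): the alias is legal even though the
// proposition is false; the proving macro only fires on demand:
type A = (1 + 3) == 5
// val ev = summon[Require[A]] // would fail to compile here, not at the alias

// Paradigm 2 (eager evaluator): the built-in op normalizes immediately
// (fully qualified to avoid clashing with the singleton-ops +):
type B = scala.compiletime.ops.int.+[1, 3]
val four = summon[B =:= 4]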

What you are asking for (a contextual view of equivalence/univalence) probably won’t fit into the second paradigm. Unfortunately, Scala is not an HoTT implementation: it only admits the Liskov rule of substitution, not a univalent one. So paradigm 2 really comes in handy if, e.g., you want to use a variable of type 1 + 3 directly in a place where type 4 is expected. This is a practical trade-off that many other languages make (in this talk: Gradual tensor typing for practical machine learning in Haskell with Hasktorch - YouTube, paradigm 2 is briefly discussed for Haskell), but not the one we deserve :slight_smile:
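Concretely, with the eager paradigm a value typed by the expression is usable wherever the normalized type is expected:

import scala.compiletime.ops.int.+

// 1 + 3 and 4 are the same type to the compiler,
// so substitution works in both directions:
val a: 1 + 3 = 4
val b: 4 = a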

I wonder if you would like to strike a deal with the dotty team: ask them to propose an augmented implicit-conversion feature that enables support for univalent substitution; then you can rewrite the compiletime-ops library to use the lazy proving paradigm, and the chasm will disappear.

BTW: I know the author of that Haskell library very well; we were both accepted for oral presentations at an FP conference, and this discrepancy will inevitably cause conflicts between us.

BTW2: This is also one of the two obstacles that prevent me from porting shapesafe to dotty; the other is the record type.


I don’t see where the problem exists in Scala 3.

For example,

import compiletime.ops.int.+
class Foo[T <: Int with Singleton](t: T){
  // the cast is needed because the term-level + is not typed as the type-level +
  val x: T + T = (t + t).asInstanceOf[T + T]
}

What Scala 3 now knows about x is that it is <: T + T <: Int, which is indeed more than what the singleton-ops library approach gives, but it does not prevent the compiler from reevaluating the type expression later on, when T is no longer abstract:

val f1 = new Foo(1)
val f2 = new Foo(2)
val x6 : 6 = f1.x + f2.x

With the singleton-ops library this is waaay uglier, and I don’t understand where its advantage is.

Hold on, don’t celebrate too early. Once I make a little change, the facade crumbles :slight_smile:

  import compiletime.ops.int.{+, /}

  class Foo[T <: Int with Singleton](t: T) {
    val x: T + T = (t + t).asInstanceOf[T + T]

    val y: T / T = (t / t).asInstanceOf[T / T]
  }

  val f3 = new Foo(0)
  val xE = f3.y + f3.x

Error:
Division by 0
val xE = f3.y + f3.x

there is no way to delay it or delegate it to a user-defined axiomatic system. It is acceptable in your library, though.

Thanks a lot, your example shows that my old impression of the dotty compiler was wrong. I’ll need to figure out how it works internally (namely, what makes summon[6 =:= 3 + 3] possible). Also:

  • is it possible to skip that type cast?

  • is it smart enough to summon[(T1 + T2) * 2 =:= (T1 * 2 + T2 * 2)] by the law of distribution? That would be some weird conspiracy

I see no reason why this should be acceptable anywhere. You explicitly have a 0/0 value. That’s an invalid algebraic expression.

Something that would be really useful in this case is to have type macros, like:

type MyType[A] = ${ myTypeImpl[A] }
def myTypeImpl[A : Type](using q: Quotes): Type[?] = ???

This way you wouldn’t even need the indirection through givens. And it would be more powerful than match types, since you could do more unrestrained computation in there.
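For contrast, what match types can express today is limited to case-by-case reduction over the scrutinee’s shape (example adapted from the Scala 3 reference docs):

// A match type reduces only when the scrutinee matches a case;
// it cannot run arbitrary (macro-level) computation.
type Elem[X] = X match
  case String      => Char
  case Array[t]    => t
  case Iterable[t] => t

val c: Elem[String] = 'a'   // Elem[String] reduces to Char
val n: Elem[Array[Int]] = 1 // Elem[Array[Int]] reduces to Int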

I guess one challenging thing is when to expand MyType – you’d only want to do that when the type argument becomes concrete.


Yes, in a way. I have an Inlined mechanism in a library that I am thinking of separating into a tiny library by itself. It’s based on the singleton-ops TwoFace concept, but I changed the name and simplified it through opaque types. The example would look like:

class Foo[T <: Int](t: Inlined[T]){
  val x = t + t // automatically inferred as `: Inlined[T + T]`
}

The rest of the example should work as before, since there are implicit conversions Inlined[T] => T and T => Inlined[T], and there is no need for the Singleton upper bound.
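A minimal sketch of what such an opaque Inlined could look like (hypothetical; names and details are illustrative, not the actual library code):

import scala.compiletime.ops.int.+

opaque type Inlined[T <: Int] = T

object Inlined:
  inline def apply[T <: Int & Singleton](t: T): Inlined[T] = t

  extension [T <: Int](lhs: Inlined[T])
    inline def +[R <: Int](rhs: Inlined[R]): Inlined[T + R] =
      // same cast as in the Foo example above: the term-level +
      // does not carry the type-level + in its result type
      ((lhs: Int) + (rhs: Int)).asInstanceOf[Inlined[T + R]]

  // the two conversions mentioned above
  given [T <: Int]: Conversion[Inlined[T], T] = t => t
  given [T <: Int & Singleton]: Conversion[T, Inlined[T]] = t => t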

Currently, no. The same goes for commutativity.
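That is, with abstract operands the compiler performs no algebraic rewriting, so the distributivity summon from above does not compile (a sketch):

import scala.compiletime.ops.int.{+, *}

def distributes[T1 <: Int, T2 <: Int]: Unit =
  // does not compile while T1 and T2 are abstract: the compiler only
  // reduces ops with concrete arguments and knows no distributive law
  // summon[((T1 + T2) * 2) =:= ((T1 * 2) + (T2 * 2))]
  ()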


Assuming that the user customises it such that 0/0 == NaN and 1/0 == Inf:

transparent inline given /[0, 0] = ${someMacroThatGivesNaN}

but the compiler won’t even get that far; it will just throw the error.

But seriously, this is not my main problem. My problem is that, according to the CH isomorphism, a falsum is merely a type without an instance. So why can 1 + 2 =:= 3 be a falsum and yet exist, while 0/0 is also a falsum but cannot exist? Are they truly two different kinds of types that require special handling?

NaN exists for floating-point types, not for integer/fixed types. Even if you perform a term-level integer division you will get (at least) a runtime division-by-zero error. IMO, not a criterion for a good comparison.

Sorry, my example was bad. Can I correct it so it doesn’t touch runtime division?

  import compiletime.ops.int./

  class Foo[T <: Int with Singleton](t: T) {

    type Y = T / T
    type Z = T =:= 1
  }

  val f3 = new Foo(0)

  type EE = f3.Y // error
  type FF = f3.Z // works fine

Same error. I kind of agree with you that this is not the best example to demonstrate the dilemma. But think of this: if Int and / were not built-in types but were defined in your library, how could I tell the compiler to accept some extensions of A / B, but immediately refute certain special cases?

The example is still irrelevant, IMO. No matter when the evaluation takes place, it’s still wrong to do.

If you wish to discuss this further, please open a new thread. Let’s try to stay on topic of the OP.

OK, it may take some time though; I need to understand how dotty does this.

It will be the same as the internal compiletime.ops evaluations, which indeed occur when the argument becomes concrete.
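This is the same deferred reduction we saw with Foo earlier (a minimal sketch):

import scala.compiletime.ops.int.+

// N + 1 stays unreduced while N is abstract; the cast mirrors the Foo example.
def incr[N <: Int & Singleton](n: N): N + 1 = (n + 1).asInstanceOf[N + 1]

// once the argument is concrete, the compiler reduces N + 1 to a literal:
val three: 3 = incr(2)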
