Proposal to remove Existential Types from the language

Hello Scala Community!

I’d like to open the discussion for the pending SIP on dropping existential types from the language. You can read a full writeup on the details here.

Summary:

Existential types that can be expressed using only wildcards (but not forSome ) are still supported, but are treated as refined types.

Please read the link above for full details.

Implications

I’m aware of two issues (both from my time on SBT). Both arise from limitations in type inference, where existentials were used to avoid having to write explicit type annotations, in two scenarios:

Scenario 1: Higher kinded types

SBT relies on forSome for code like the following:

def joinAny[M[_]](inits: Seq[Initialize[M[T]] forSome { type T }]): Initialize[Seq[M[_]]]

where Initialize is effectively an Applicative functor / Reader-style monad. In the past, existentials were the only way to get Scala’s type inference to handle methods in this style.
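For reference, a sketch of the wildcard-only signature one would hope improved inference could accept instead (the same shape comes up again later in this thread; Initialize is stubbed here and the body is a placeholder):

    // Sketch only: the wildcard version of joinAny, with no forSome.
    trait Initialize[A]

    def joinAny[M[_]](inits: Seq[Initialize[M[_]]]): Initialize[Seq[M[_]]] = ???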

Scenario 2: F-Bounded polymorphism/quantification

SBT has the following:
type Key = K forSome { type K <: Scoped.ScopingSetting[K] with Scoped }

This, again, was used to simplify defining these types and ensuring type inference preserved what was needed in the API.
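For illustration, roughly the same constraint can also be carried as an abstract type member rather than a forSome type; this is only a sketch (SomeKey is a made-up name and the stubs stand in for the real sbt types), not what sbt actually does:

    // Stand-ins for the sbt types referenced in the Key alias above.
    trait Scoped
    object Scoped { trait ScopingSetting[S <: ScopingSetting[S]] }

    // Sketch: the F-bounded existential carried as an abstract type member.
    trait SomeKey {
      type K <: Scoped.ScopingSetting[K] with Scoped
      def key: K
    }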

I believe BOTH of these use cases (not explicitly outlined in the proposal) should be resolved by the better type inference generally available in Dotty. (I have not yet confirmed this, but will get back to this thread once I have time to do so.)

Opening this up for general community discussion and insight into any other use cases that would not be covered with wildcards or dependent types.

See also the discussion in https://github.com/lampepfl/dotty/issues/4353

The Java JWT library (https://github.com/jwtk/jjwt) contains the interface io.jsonwebtoken.SigningKeyResolver, which I could not implement in Scala without using existential types. The interface has a method

Key resolveSigningKey(JwsHeader header, Claims claims);

where JwsHeader is defined as

public interface JwsHeader<T extends JwsHeader<T>> extends Header<T> {
  ...
}

The Scala implementation of that method looks as follows:

new SigningKeyResolver {
   override def resolveSigningKey(header: JwsHeader[X] forSome { type X <: JwsHeader[X]}, claims: Claims): Key = ...
   ...
}

Will Dotty still allow implementing such a Java interface in Scala?

Wouldn’t you just use

override def resolveSigningKey[T <: JwsHeader[T]](header: JwsHeader[T], claims: Claims): Key = ...

instead?

The Scala compiler refuses to compile that variant:

Error:(53, 32) method resolveSigningKey overrides nothing.
Note: the super classes of <$anon: io.jsonwebtoken.SigningKeyResolver> contain the following, non final members named resolveSigningKey:
def resolveSigningKey(x$1: io.jsonwebtoken.JwsHeader,x$2: io.jsonwebtoken.Claims): java.security.Key
                  override def resolveSigningKey[X <: JwsHeader[X]](header: JwsHeader[X] /* forSome { type X <: JwsHeader[X] } */, claims: Claims): Key =

I expected that a solution based on a type alias might work, but I didn’t succeed:

type SomeJwsHeader = {
  type T <: JwsHeader[T]
  type Apply = JwsHeader[T]
}

And then:

trait ScalaSigningKeyResolver extends SigningKeyResolver {
  override def resolveSigningKey(header: SomeJwsHeader#Apply, claims: Claims): Key = ???
}

But I get the same error message :frowning:

method resolveSigningKey overrides nothing.
Note: the super classes contain the following, non final members named resolveSigningKey:
def resolveSigningKey(x$1: io.jsonwebtoken.JwsHeader[T] forSome { type T <: io.jsonwebtoken.JwsHeader[T] },x$2: io.jsonwebtoken.Claims): java.security.Key

I’m curious to know if there is a way to represent this JwsHeader[T] forSome { type T <: io.jsonwebtoken.JwsHeader[T] } type without using forSome.

Since the Java interface has type:

Key resolveSigningKey(JwsHeader header, Claims claims);

I think the correct signature for the override is just:

def resolveSigningKey(header: JwsHeader[_], claims: Claims): Key = ???

This works in Dotty but not in Scala 2, and I think Scala 2 is wrong here to force you to write this crazy existential type. For the same reason that, if I declare a class Foo[X <: Int], I can write Foo[_] to talk about any Foo and it’s equivalent to writing Foo[_ <: Int], I should be able to write JwsHeader[_] without specifying a bound for the wildcard.
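To restate the analogy as code (Foo and sizeOf are made-up names; the claim is that both signatures below should mean the same thing):

    // As argued above: for a class with a bounded type parameter, the plain
    // wildcard should pick up the declared bound automatically.
    class Foo[X <: Int]

    def sizeOf(f: Foo[_]): Int = 0            // intended to mean Foo[_ <: Int]
    def sizeOfExplicit(f: Foo[_ <: Int]): Int = 0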


Hey @jsuereth!

I mentioned 2 more that are present in the sbt codebase:

  • Parser[Seq[ScopedKey[T]] forSome { type T }]
  • Initialize[Task[T]] forSome { type T }

Mentioned in Drop `forSome`, restrict existentials to wildcards · Issue #443 · scala/scala-dev · GitHub

I’d love to know how these would be rewritten in Dotty, without losing any expressiveness.

I don’t think they can be rewritten directly. You’d have to introduce intermediate classes and thread them through the whole codebase. E.g.

  class KeyParser[T] {
    val parsed: Seq[ScopedKey[T]]
    ...
  }
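One possible shape of that wrapper, just as an illustrative sketch (ScopedKey is stubbed and countKeys is a made-up use site, not the actual sbt change):

    // Illustrative only: wrap the existentially quantified T in a class, so
    // use sites can take a KeyParser[_] instead of Seq[ScopedKey[T]] forSome { type T }.
    trait ScopedKey[T]                        // stand-in for the sbt type

    final class KeyParser[T](val parsed: Seq[ScopedKey[T]])

    def countKeys(p: KeyParser[_]): Int = p.parsed.size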

These idioms and workarounds demonstrate the additional power of existential types which falls away.

To be clear: Removing existential types is not done in order to save a bit of syntax. The reason for removing them is that they have complicated interactions with almost everything else in the type system, including soundness, and that they overlap sufficiently with dependent types, so the added complications are out of proportion to the added usefulness.


Can type KeyParser[T] = Seq[ScopedKey[T]]; ... KeyParser[_] be used?

No, that would be too easy… i.e., then it would be syntactic sugar, after all.

Hey @djwijnand,

Those are actually the same “class” of issue as Scenario 1, so I just listed one of the examples. They exist for the same reason, and I think they are expressible in Dotty without requiring forSome. The primary reason you see the forSome in that type signature is that you couldn’t write:

def joinAny[M[_]](inits: Seq[Initialize[M[_]]]): Initialize[Seq[M[_]]]

I’m hoping to find time this week to get a working rewrite in Dotty that solves the same use case but without requiring the existential.


I believe that encoding (and the new way of expressing what was supported in the past) should be written down in the proposal. It seems important, at least for the Java compatibility story.

I think it could also be interesting to look at existential types in relation to the mathematical concept of the natural domain of a function, applied to polymorphic/generic methods.
In particular, if one has a lot of duplicated implicit arguments in code, one might be interested in the newly introduced implicit function types aka Context Queries (to extract the repeating argument list into a type alias).
Theoretically, if one has a similar problem with overcomplicated, repetitive generic method definitions, one might extract them into an existential type.
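(As an aside, a small sketch of the Context Queries idea mentioned above, in what became Scala 3’s context function syntax; Config, Logger, Ctx and the step methods are made-up names:)

    // Made-up example: repeated implicit parameters folded into an alias via
    // a context function type, so intermediate methods stay short.
    case class Config(name: String)
    case class Logger(prefix: String)

    type Ctx[A] = (Config, Logger) ?=> A

    def step1: Ctx[Unit] = step2
    def step2: Ctx[Unit] =
      println(s"${summon[Logger].prefix}: ${summon[Config].name}")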
So, for example (written based on the introductory snippet of this topic), if one has code like

    def doSmth0[PKey <: Scoped.ScopingSetting[PKey] with Scoped](key: PKey): Unit = {
      println(s"doSmth0: $key")
      doSmth1(key)
    }

    def doSmth1[PKey <: Scoped.ScopingSetting[PKey] with Scoped](key: PKey): Unit = {
      println(s"doSmth1: $key")
      doSmth2(key)
    }

    def doSmth2[PKey <: Scoped.ScopingSetting[PKey] with Scoped](key: PKey): Unit = {
      println(s"doSmth2: $key")
    }

One can rewrite it as follows (using existentials):

    type Key = K forSome { type K <: Scoped.ScopingSetting[K] with Scoped }

    def doSmth0(key: Key): Unit = {
      println(s"doSmth0: $key")
      doSmth1(key)
    }

    def doSmth1(key: Key): Unit = {
      println(s"doSmth1: $key")
      doSmth2(key)
    }

    def doSmth2[PKey <: Scoped.ScopingSetting[PKey] with Scoped](key: PKey): Unit = {
      println(s"doSmth2: $key")
    }

So here, in theory, one can significantly reduce boilerplate if one only needs to pass a key of type [PKey <: ...] through multiple methods “transparently” and use the key’s fine-grained semantics only at the very last place.

Basically (in theory) the same trick could be used to represent any generic method’s natural domain of definition in the form of an existential type.
So, given a method like

  def doSmth[T1 >: ... <: ... , ... , TN >: ... <: ...](arg1: P1[T1, ... ,TN], argM: PM[T1, ... ,TN]) = ???

one may express its natural domain of definition as

  type DoSmthDomain[T1 >: ... <: ... , ... , TN >: ... <: ...] = (P1[T1, ... ,TN], PM[T1, ... ,TN])
  type DoSmthNaturalDomain = DoSmthDomain[T1, ..., TN] forSome {type T1 >: ... <: ... ; ... ; type TN >: ... <: ...}
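
A concrete (made-up) instance of this pattern may make the shape clearer; zipKeys and its parameter types are purely illustrative:

    // Illustrative only: a two-parameter method and its "natural domain",
    // expressed first as a parameterised alias and then quantified with forSome.
    def zipKeys[A, B](xs: Seq[A], ys: Map[A, B]): Seq[(A, B)] =
      xs.flatMap(a => ys.get(a).map(b => (a, b)))

    type ZipKeysDomain[A, B]  = (Seq[A], Map[A, B])
    type ZipKeysNaturalDomain = ZipKeysDomain[A, B] forSome { type A; type B }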

This application of forSome types seems to me to deserve support in some form in upcoming versions of the language.
Also, in related discussions I saw a fairly old topic arguing that existentials should go away together with “mandatory generic parameters” (which should eventually be fully replaced by inner abstract types, if I understood correctly).
In that case I could imagine DoSmthDomain[_, ... , _] somehow being expressed fully via abstract types (without generic parameters at all); then that wildcard would become available simply as DoSmthDomain, and we would get these existentials “for free” (as I believe was promised in that old topic).

But for now, as far as I know, one can replace a structural holder of abstract types with a parameterised alias, but not vice versa. So it looks like support for such a DoSmthDomain[_, ... , _] could still be useful for now.

Perhaps it could/would need to be limited so that it cannot participate in further type composition or bounds, and is available only as a type for method arguments (just to reduce the size of generic method signatures in some cases). But I believe that even in such a very limited form it could still be quite useful (even knowing that Seq[DoSmthDomain[_, ... , _]] would have to be prohibited, as an example of type composition).

Also, this statement from the documentation:

Existential types largely overlap with path-dependent types, so the gain of having them is relatively minor.

was not fully clear to me, so I tried to explain to myself what that relationship could actually be, by attempting to rewrite a hypothetical forSome use case into a path-dependent one.
In fact I was “successful” as long as I stayed on Scala 2.12, but I ran into a recursive-refinement problem when I switched to Dotty and then needed to rewrite it slightly again. (Remarkably, the Dotty rewrite also looks like valid Scala 2 code, but in fact it is not …)

Here is the code that I tried (full version here, rewrite for Dotty):

    type KeyTyper = { type Key <: Scoped.ScopingSetting[Key] with Scoped }
    def keyTyper[PKey <: Scoped.ScopingSetting[PKey] with Scoped]() = new {type Key = PKey}

    def testDoSmth(): Unit = {
      trait TestKey extends Scoped.ScopingSetting[TestKey] with Scoped
      doSmth0(new TestKey{})
    }

    def doSmth0[PKey <: Scoped.ScopingSetting[PKey] with Scoped](key: PKey): Unit = {
      println(s"doSmth0: $key")
      doSmth(new {type Key = PKey})(key)
    }

    def doSmth(keyType: KeyTyper)(key: keyType.Key): Unit = {
      println(s"doSmth: $key")
      doSmth2(keyType)(key)
    }

    def doSmth2[PKey <: Scoped.ScopingSetting[PKey] with Scoped](keyType: KeyTyper { type Key = PKey })(key: keyType.Key): Unit = {
      println(s"doSmth2: $key")
      doSmth3(key)
    }

    def doSmth3[PKey <: Scoped.ScopingSetting[PKey] with Scoped](key: PKey): Unit = {
      println(s"doSmth3: $key")
      doSmth4(keyTyper[PKey]())(key)
    }

The idea here was close to the one from the previous post: find some way to “extract” the complexity of a generic method signature into a type alias.
A “non-invasive” approach was chosen: instead of replacing/boxing the existential in a wrapper, just supply its type as a separate parameter (which could potentially become an erased parameter).
Here is the problem I observed on the current Dotty master branch (complete example on Scastie, working rewrite for Dotty):

scala>     trait Scoped
     |
     |     object Scoped {
     |       trait ScopingSetting[S <: ScopingSetting[S]]
     |     }
     |
     |     type KeyTyper = { type Key <: Scoped.ScopingSetting[Key] with Scoped }
7 |    type KeyTyper = { type Key <: Scoped.ScopingSetting[Key] with Scoped }
  |                                                        ^^^
  |                             forward reference in refinement is deprecated
// defined trait Scoped
// defined object Scoped
// defined alias type KeyTyper
   = {z1 => Object{Key <: Scoped.ScopingSetting[LazyRef(z1.Key)] & Scoped}}

So, an aside question here: is this the new expected behaviour for refinements? (What was the problem with recursive types in pure refinements?)
(FYI: when I tried to feed the complete snippet into the Dotty REPL built from the master branch, it just hangs forever, in contrast to the Dotty on Scastie, which eventually shows some error.)