Can we make adding a parameter with a default value binary compatible?

Similarly, we should allow passing optional values without Some() wrappers, allowing fields to be later declared as optional without breaking existing code and without annoying syntax overhead. Wdyt?

I dream that the following code works in Scala:

def foo(a: String, b: Option[String], c: Option[Int]): String = ???

foo("bar", "baz", 5)
foo("bar", "baz", Some(5))
foo("bar", Some("baz"), 5)
foo("bar", Some("baz"), Some(5))
foo("bar", None, 5)
foo("bar", "baz", None)
foo("bar", None, None)

There’s always this temptation to make options easier to work with, for example like you said.

And these ideas often work perfectly, as long as the types are known and there’s only one level of option:

def foo(opt: Option[Option[Int]]) = ???

foo(None) // Is this Some(None) or None ?

def bar[T](opt: Option[T]) = opt

def baz[T](x: T) =
  bar[T](x) // implicit wrapping

// my intuition:
bar[Option[Int]](None) // None // not Some(None)

baz[Option[Int]](None) // bar(Some(None)) => Some(None) // not None

But you’ll notice that if you inline baz, you get bar, yet the two calls above are supposed to give different results, so the implicit wrapping breaks simple substitution!

For this reason, I believe we can’t, or at least shouldn’t, add utilities like the one proposed.
(But I was a victim of similar ideas many times, so I understand the appeal!)

3 Likes

I started taking a crack at Scala 2 and 3 compiler plugin implementations of a @Unroll annotation:

The goal being to take something like this

  @unroll.Unroll("n")
  def foo(s: String, n: Int = 1) = println(s * n)

And unroll it into something like this

  def foo(s: String, n: Int = 1) = println(s * n)
  def foo(s: String) = foo(s, n = 1)
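
To make the binary-compatibility goal concrete, here is a hypothetical continuation of that example (the extra parameter prefix and these exact signatures are my own illustration, not output from the plugin): if a second default parameter is added later, unrolling should keep emitting the older signatures as forwarders, so previously compiled callers of foo(String, Int) and foo(String) still link.

  // new version of foo gains another default parameter
  def foo(s: String, n: Int = 1, prefix: String = "") = println(prefix + s * n)
  // forwarders preserving the two older signatures
  def foo(s: String, n: Int) = foo(s, n, prefix = "")
  def foo(s: String) = foo(s, n = 1, prefix = "")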

I have some basic tests passing in Scala 2, but I’m stuck on the following crash in Scala 3:

[error] value foo$default$3 is not a member of unroll.Unrolled - did you mean (Unrolled.this : unroll.Unrolled).foo$default$3?
[error] value foo$default$4 is not a member of unroll.Unrolled - did you mean (Unrolled.this : unroll.Unrolled).foo$default$4?
[error] value foo$default$4 is not a member of unroll.Unrolled - did you mean (Unrolled.this : unroll.Unrolled).foo$default$4?

Anyone here have Scala 3 compiler expertise to advise what I may be doing wrong? The synthetic method I want to generate should be pretty straightforward, but I’m having trouble figuring out exactly what the Scala 3 compiler expects of me here. The repository has a readme with the relevant commands to run if you want to try it out yourself: GitHub - lihaoyi/unroll

1 Like

This was resolved on Discord, but for those curious: names in the Scala 3 compiler are not just interned strings but a tree structure, so default-argument names have a specific constructor, or else they won’t be found as a member.
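
For illustration, a minimal sketch of that distinction against the dotty internals (my own example, assuming dotty.tools.dotc is on the classpath, not code from the plugin):

import dotty.tools.dotc.core.Names.{TermName, termName}
import dotty.tools.dotc.core.NameKinds.DefaultGetterName

// The semantic name for the default getter of the third parameter of `foo`:
// structurally a DefaultGetterName, rendered as something like "foo$default$3".
val semantic: TermName = DefaultGetterName(termName("foo"), 2)

// A plain simple name that merely renders the same string; member lookup in the
// Scala 3 compiler will not find the default getter under this name.
val simple: TermName = termName("foo$default$3")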

Exactly. The concept is called semantic names. It prevents accidental name collisions and misinterpretations.

1 Like

To follow up here, I have an implementation of @Unroll (using @odersky’s suggested naming) working and passing tests in all of Scala 2.12, Scala 2.13, and Scala 3:

As far as I can tell, everything works:

  1. Unrolled methods are binary compatible when adding parameters with defaults, whether in classes, objects, or traits.

  2. Unrolled class primary constructors and secondary constructors are binary compatible when adding default parameters.

  3. Unrolled curried methods work (for now we only allow unrolling the first parameter list, but unrolling other lists should be easy to add).

  4. Unrolled case classes are perfectly binary compatible when adding default parameters in Scala 3, and are (surprisingly) almost perfectly binary compatible in Scala 2 as well! (See the sketch after this list.)

    1. new, apply and copy are unrolled.
    2. Even in Scala 2, pattern matching goes through a different encoding that bypasses unapply and is binary compatible.
    3. Even MiMa is happy! (It “helpfully” ignores the signature change of unapply due to erasure, since you are free to call the method with the same signature, even though once you try to get the value out of the Option you’ll get a ClassCastException.)
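
As a rough hand-written sketch of what point 4 corresponds to (the case class Foo and the extra members are my own illustration, not the plugin’s actual output; the plugin injects the equivalent forwarders after Typer, and handles copy in the same way):

// new version of the case class, where the default parameter y was added later
case class Foo(x: Int, y: Int = 0) {
  // old-signature constructor, so `new Foo(x)` call sites compiled against the
  // one-parameter version still link
  def this(x: Int) = this(x, 0)
}

object Foo {
  // old-signature apply forwarder, kept alongside the synthetic apply(x, y)
  def apply(x: Int): Foo = Foo(x, 0)
}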

It’s tested via MiMa, via Java reflection, and via manual classpath mangling: taking a downstream codebase compiled against an older version of an upstream codebase and running it against a newer version of that upstream codebase with additional parameters with defaults.
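
For the classpath-mangling part, the idea is roughly the following (file names, paths, and the main class below are hypothetical placeholders, not the repo’s actual test harness):

import scala.sys.process.*

// Compile a downstream source against the old upstream jar, then run it with the
// new upstream jar (which gained extra default parameters) on the classpath; any
// missing forwarder surfaces as a NoSuchMethodError or AbstractMethodError.
@main def binaryCompatSmokeTest(): Unit =
  "scalac -classpath upstream-v1.jar -d downstream-out Downstream.scala".!
  "java -cp upstream-v2.jar:downstream-out:scala-library.jar downstream.Main".!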

There are some limitations as expected: around overloading, around multiple parameter lists, etc. But these should not be a problem for 99% of use cases.

It’s not a lot of code - ~150 LOC for Scala 3 and ~200 LOC for Scala 2. If anyone has some time, I’d love to get some help reviewing the two implementations:

I haven’t tried it on Scala.js and Scala-Native yet, but given that the compiler plugins cut in just after Typer in all versions of Scala, I would expect @Unroll to provide binary compatibility there as well. There are probably some edge cases in method signatures I haven’t accounted for in the implementation so far, but it should be easy to fill those out as necessary.

It is a compiler plugin, which means once published it should be easy to begin using in projects even without upstreaming it into the core Scala distributions. But it’s such a big improvement to the developer experience of anyone maintaining and evolving libraries over time that I’d argue we probably should upstream it anyway.
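
For reference, enabling it would presumably look like any other compiler plugin in sbt; the coordinates below are placeholders rather than published artifacts:

// build.sbt sketch with hypothetical organization, artifact names, and version;
// compiler plugins are typically cross-published against the full Scala version
addCompilerPlugin(("com.example" % "unroll-plugin" % "<version>").cross(CrossVersion.full))
libraryDependencies += "com.example" %% "unroll-annotation" % "<version>"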

8 Likes

While it’s good that this can be implemented in a few lines in a compiler plugin, I think it is also a great case study for Scala 3 annotation macros.

@unroll isn’t changing the developer experience or type-checking, so it feels like the kind of thing that has been considered in-scope for Scala 3 annotation macros.

Secondly, this illustrates a real case that there is demand for. Keeping compiler plugins in sync is labor we could avoid and spend on other projects for our community.

1 Like

Note for Scala 3: you can run the phase before TASTy pickling and still not leak the API through separate compilation by using the Invisible flag on the generated methods.
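
For example, something along these lines against the dotty internals (a hedged sketch, not the plugin’s code):

import dotty.tools.dotc.core.Contexts.Context
import dotty.tools.dotc.core.Flags
import dotty.tools.dotc.core.Symbols.Symbol

// Mark a generated forwarder Invisible: it still reaches the backend and the
// emitted bytecode, but is not part of the API that other compilation units see.
def hideFromApi(forwarder: Symbol)(using Context): Unit =
  forwarder.setFlag(Flags.Invisible)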

@lihaoyi does unroll generate fromProduct? From my brief look it doesn’t seem like it does.

Here are more details about fromProduct:

@sideeffffect as of now, no it doesn’t. But it should be easy to add using the approach you suggested. I don’t see any reason why it wouldn’t work.
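
For context, a hand-written sketch of what an unrolled fromProduct could look like for a hypothetical case class Foo(x: Int, y: Int = 0), filling in the default when handed a Product produced by an older, shorter version of the class (my illustration of the suggestion, not what the plugin emits today):

case class Foo(x: Int, y: Int = 0)

object FooFromProduct {
  // Mirror-style fromProduct that tolerates products with fewer fields
  def fromProduct(p: Product): Foo = {
    val x = p.productElement(0).asInstanceOf[Int]
    val y = if (p.productArity >= 2) p.productElement(1).asInstanceOf[Int] else 0
    Foo(x, y)
  }
}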

One more update here: I have a POC using the @Unroll annotation in two of my projects - uPickle and MainArgs - that suffered greatly from binary-compat boilerplate. You can see the net diff below:

Generally everything works. MiMa passes on these two repos with all the manually-written forwarders removed. That’s hundreds of lines of boilerplate removed, plus the flexibility to easily extend the various public APIs with additional default parameters without needing to worry about bincompat breakage or bincompat stub boilerplate.

I’m hitting two issues in Scala-3.3.x/Scala.js and Scala-3.3.x/Scala-Native respectively:

@sjrd @WojciechMazur could I trouble you to take a look at the errors above, along with the Scala 3 plugin transform (unroll/unroll/plugin/src-3/UnrollPhaseScala3.scala at main · com-lihaoyi/unroll · GitHub), and see if anything seems obvious? On my side I will try to minimize the failures, but if someone familiar with Scala.js/Scala-Native has some insights, that would save us some time. The fact that these crashers only happen on their respective non-JVM platforms indicates that Scala.js/Scala-Native must be relying on some invariants that the Scala-JVM backend does not. And the issue does not appear in Scala 2.x JS/Native.

4 Likes

The implementation and analysis are complete enough that I have opened a SIP for further discussion.

5 Likes