Can we make adding a parameter with a default value binary compatible?

This seems reasonable, though because methods with overloaded names aren’t quite first-class with respect to other language features, you couldn’t use an unhidden with-defaults approach at all if you already have, say,

def makePerson(first: String, last: String) = ???
def makePerson(entry: DbEntry, allowPartial: Boolean = false) = ???

Starting from that, you can’t use the mechanism at all. You could, though, if you allowed explicit unrolling of defaults (but it wouldn’t work for more than one default argument):

@unrolledDefault
def makePerson(entry: DbEntry, allowPartial: Boolean = false) = ???

// becomes
def makePerson(entry: DbEntry, allowPartial: Boolean) = ???
def makePerson(entry: DbEntry) = makePerson(entry, false)
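
To spell out why it breaks down with more than one default (my reading of the limitation): explicit forwarders can only cover prefixes of the parameter list, so callers that used named arguments to pick out a later default lose that ability. With a hypothetical second default parameter validate:

@unrolledDefault
def makePerson(entry: DbEntry, allowPartial: Boolean = false, validate: Boolean = true) = ???

// becomes
def makePerson(entry: DbEntry, allowPartial: Boolean, validate: Boolean) = ???
def makePerson(entry: DbEntry, allowPartial: Boolean) = makePerson(entry, allowPartial, true)
def makePerson(entry: DbEntry) = makePerson(entry, false, true)

// but this call, which used to rely on allowPartial's default, no longer compiles:
makePerson(entry, validate = false)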

But that can be done manually in the cases where it’s needed, trading off some source compatibility for binary compatibility.

It also won’t interact well, even as is, in the hopefully rare cases where an opaque-type overload is already present - if passport below ever gained a default value, the synthetic two-parameter forwarder would clash with the existing two-parameter overload:

opaque type Passport = String
def makePerson(first: String, last: String, passport: Passport) = ???
def makePerson(first: String, last: String) = ???

But this is a weird enough edge case that I don’t think it should derail the idea.

Yeah there definitely will be some limitations around overloading. What this proposal does is provide synthetic overloads for backwards compatibility, and if there are existing overloads then there’s always the possibility of a clash.

The proposal as written has this synthetic-forwarder-generation logic as opt-in via an annotation @telescopingDefaults, so “can’t use it on overloads” is a possible answer. We already lose a bunch of language features when overloads are present - e.g. result type inference, defining default values for each overload, etc. - so I feel like this can be an acceptable limitation that fits reasonably well into the other kinds of edge cases Scala already has.
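
For example, defining default values on more than one overload is already rejected today; a minimal illustration:

def f(x: Int = 1) = x
def f(x: String = "a") = x
// error: multiple overloaded alternatives of f define default arguments
// (exact wording varies by compiler version)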

Also, for many classes of overloads, replacing the overloaded methods with Magnet pattern implicit conversions is another possible workaround. That’s what I do throughout the com.lihaoyi ecosystem and it works well enough.
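
For reference, a minimal sketch of that workaround, reusing DbEntry and Person from the earlier example (the PersonSource magnet and its conversions are made up for illustration):

import scala.language.implicitConversions

// one "magnet" type standing in for what used to be distinct overloads
sealed trait PersonSource
object PersonSource:
  case class Names(first: String, last: String) extends PersonSource
  case class Entry(entry: DbEntry) extends PersonSource
  implicit def fromNames(names: (String, String)): PersonSource = Names(names._1, names._2)
  implicit def fromEntry(entry: DbEntry): PersonSource = Entry(entry)

// a single real method; adding new conversions later doesn't change its signature
def makePerson(source: PersonSource): Person = ???

makePerson(("Jane", "Doe")) // via fromNames
makePerson(dbEntry)         // via fromEntry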

I wonder if this approach could break existing code in subtle ways.

Currently we can assume that two instances of a case class are equal if and only if all of their fields are equal.

With the proposed solution, this reasoning would no longer be sound, because there may be additional fields we are not aware of.
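
A small sketch of the hazard, assuming the Person case class from the earlier examples later gains a hypothetical middle field via the scheme:

// v1 published:  case class Person(first: String, last: String)
// v2 adds:       middle: Option[String] = None

val a = Person("Jane", "Doe")                     // middle is None
val b = Person("Jane", "Doe", middle = Some("Q")) // set by newer code

// Code compiled against v1 sees identical values for every field it knows
// about, yet a == b is false: equality now depends on a field it cannot see.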

The scheme looks quite reasonable to me at first glance. Definitely worth following up, maybe leading to a pre-SIP? I like unrolledDefaults as a name for the annotation.

3 Likes

There should be some examples with multiple parameter lists - for instance, one where the default refers to a parameter from an earlier list. It seems like it should work out.
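
Something like this, perhaps, where the default in the second list refers to name from the first (greet is a made-up example):

@unrolledDefault
def greet(name: String)(greeting: String = s"Hello, $name") = ???

// could unroll to
def greet(name: String)(greeting: String) = ???
def greet(name: String) = greet(name)(s"Hello, $name")

The forwarder can still evaluate the default because name is in scope, so it does seem to work out.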

For the case classes, you also need to override fromProduct, right? It could look something like this:

object Person:
  @synthetic
  def fromProduct(p: Product): Person = p.productArity match
    case 2 =>
      Person(
        p.productElement(0).asInstanceOf[String],
        p.productElement(1).asInstanceOf[String],
      )
    case 3 =>
      Person(
        p.productElement(0).asInstanceOf[String],
        p.productElement(1).asInstanceOf[String],
        p.productElement(2).asInstanceOf[String],
      )
    case 4 =>
      Person(
        p.productElement(0).asInstanceOf[String],
        p.productElement(1).asInstanceOf[String],
        p.productElement(2).asInstanceOf[String],
        p.productElement(3).asInstanceOf[Option[String]],
      )

Shout out to @armanbilge, who discovered this.

I don’t think so. AFAIK, fromProduct is only used by Mirrors (typeclass derivation). So, if you want to support typeclass derivation you have to implement a custom fromProduct, otherwise you don’t need it.

So, if you want to support typeclass derivation you have to implement a custom fromProduct, otherwise you don’t need it.

Are you saying that by not implementing a custom fromProduct, you’re essentially prohibiting typeclass derivation? Shouldn’t then the recommended practice be to implement it? What do you know about what the users of your case class want to use it for? The chance that they will want working derivation is high.

2 Likes

Yes. I was still seeing things along the lines of the linked Pre-SIP, which explicitly ignored the derivation use-case because it was impossible to implement correctly with that approach.

But I agree that if there is a solution that works with the approach based on parameters with default values (as described in your post), that’s good to have!

@lihaoyi do you plan to move this idea forward yourself? Otherwise, if it is not too urgent, it seems like this could be a nice and self-contained subject for a student project next semester at EPFL (September-February). What do you think?

3 Likes

I don’t have immediate plans to move this forward. Feel free to commandeer the idea and turn it into a real project!

1 Like

Would it be possible to make this behavior the default?

That would be the best programmer experience, wouldn’t it? Not having to worry about binary compatibility anymore :relaxed:

3 Likes

Just wanted to bump this again, with another concrete use case I encountered:

My last update to the com.lihaoyi::mainargs library involves adding a new default parameter to a bunch of user-facing methods. As a result, a ton of method signatures needed to be duplicated and “manually telescoped” or “manually unrolled” to maintain binary compatibility.

Some of these signatures were already duplicated twice for compatibility concerns in the past, and now are duplicated three times.

While extremely tedious, this duplication is unavoidable: it is impossible for a library to simultaneously (a) make use of Scala language features, like default argument values, (b) provide a smooth user experience free from NoSuchMethodErrors and the like, and (c) avoid this duplication. That puts library authors between a rock and a hard place, having to give up one of the three:

  • Some libraries give up (a), limiting themselves to a subset of Scala that doesn’t use default arguments and forcing additional builder-pattern boilerplate on all their users
  • Some libraries give up (b), expecting that users will sometimes hit JVM LinkageErrors and be forced to recompile their unchanged source code against newer library versions
  • Some libraries give up (c), and fill their implementation with boilerplate telescoping methods.

For Mainargs I’ve chosen to give up (c), and decided to live with the boilerplate in exchange for providing an optimal user experience. But these telescoping/unrolled binary compatibility shims are extremely mechanical, and it should be straightforward to automate their generation via a compiler plugin or annotation macro.
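
For concreteness, each shim looks roughly like this (the parse signature is illustrative, not mainargs’ actual API):

// current signature, with the newly added parameter and its default
def parse(args: Seq[String],
          allowPositional: Boolean = false,
          allowRepeats: Boolean = false): Either[String, Map[String, String]] = ???

// hand-written shim preserving the previous binary signature
def parse(args: Seq[String], allowPositional: Boolean): Either[String, Map[String, String]] =
  parse(args, allowPositional, allowRepeats = false)

// and the one before that, for a method that has been extended twice
def parse(args: Seq[String]): Either[String, Map[String, String]] =
  parse(args, allowPositional = false, allowRepeats = false)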

I don’t have any concrete implementation to show yet, just wanted to keep the conversation going as I encounter these cases in the wild

10 Likes

Scala stewards, please take note of this :pray:
This language change is one of the few that would improve Scala users’ lives the most, especially library authors’.
The effort that needs to be put in to ensure binary compatibility is painstaking. This would help a great deal.
/cc @Kordyjan

3 Likes

data-class lets you put an annotation (@since) on the first “new” parameter, which reduces the number of synthetic methods if the initial version already uses default arguments.
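
If I read the data-class README right, that looks roughly like this (the version string and fields are illustrative):

import dataclass._

@data class Person(
  first: String,
  last: String,
  @since("0.2") // compatibility stubs are only generated at this boundary
  middle: Option[String] = None
)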

There are probably some tricky aspects in here. If we have

trait T {
  @telescopingDefaults def f(x: Int, y: Int = 1) = 0
  @synthetic def f(x: Int) = f(x, 1)
}

We want t.f(42) to compile to the non-synthetic overload. We also want to hide the synthetic one in IDEs and Scaladoc. But the method should probably still be there, for example for the Mixin phase to generate forwarders. But it all looks doable to me.

I tried a few examples around overriding and couldn’t find issues; it seems the scheme would work well. Existing subclasses would override the new synthetic method; newly compiled subclasses would not be source-compatible, so they would have to be rewritten to override the new signature (and would get an overriding synthetic method of their own).

I wonder if this transformation could be done completely at the bytecode level, e.g. via ASM, rather than via a compiler plugin. That would allow us to share the implementation between Scala 2 and Scala 3.

After all, generating bincompat forwarders seems purely a JVM-level concern, and the only thing Scala-related is knowing how to call Scala’s default-argument-value methods inside the forwarders. Apart from that, the Scala compiler should not need to know about these forwarders at all, and vice versa.
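
That Scala-specific bit is small: default argument values are compiled to synthetic zero-argument methods with a well-known naming scheme, so a bytecode-level tool only needs to call those. Conceptually:

// source
def f(x: Int, y: Int = 1) = x + y

// the compiler emits a getter for each default, named <method>$default$<position>
def f$default$2: Int = 1

// so a generated forwarder amounts to
def f(x: Int): Int = f(x, f$default$2)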

1 Like

It’s also a Scala.js IR and Native IR concern. So you’d have to do the work 3 times.

1 Like

That’s true. I guess it might save effort doing it in the compiler then, though it would still need to be done twice, for Scala 2 and Scala 3.

1 Like

Similarly, we should allow passing optional values without Some() wrappers, allowing fields to be later declared as optional without breaking existing code and without annoying syntax overhead. Wdyt?

I dream that the following code works in Scala:

def foo(a: String, b: Option[String], c: Option[Int]): String = ???

foo("bar", "baz", 5)
foo("bar", "baz", Some(5))
foo("bar", Some("baz"), 5)
foo("bar", Some("baz"), Some(5))
foo("bar", None, 5)
foo("bar", "baz", None)
foo("bar", None, None)

There’s always this temptation to make Options easier to work with, for example like you said.

And these often work perfectly, as long as the types are known and there’s only one level of Option:

def foo(opt: Option[Option[Int]]) = ???

foo(None) // Is this Some(None) or None?

def bar[T](opt: Option[T]) = opt

def baz[T](x: T) =
  bar[T](x) // with the proposed feature, x: T is implicitly wrapped to Some(x): Option[T]

// my intuition:
bar[Option[Int]](None) // None // not Some(None)

baz[Option[Int]](None) // bar(Some(None)) => Some(None) // not None

But you’ll notice that if you inline baz, you get bar, and the two give different results!

For this reason, I believe we can’t, or at least shouldn’t, add utilities like the one proposed.
(But I’ve fallen victim to similar ideas many times, so I understand the appeal!)

3 Likes