This post is inspired by recent work we’ve been doing to enforce binary compatibility in the com-lihaoyi ecosystem, as well as the earlier discussion about adding some alternative to case classes that is binary compatible.

I’m personally not a fan of that approach. Binary compatibility is an important concern, but it’s an implementation limitation, not a semantic/language concern. Furthermore, that solution doesn’t apply to `def`s, which suffer from the same problem. This post proposes an alternative.
## Why not `withFoo`?

Going all-in on the `.withFoo` approach is very Java-esque. Basically, the entire reason I want to write Scala is that I can write syntax that says exactly what I mean:
```scala
case class Person(first: String, last: String, country: String)

Person(first = "Haoyi", last = "Li", country = "Singapore")
```
Rather than the Java-style syntax that’s full of boilerplate and patterns to work around language weaknesses:
```scala
Person()
  .withFirst("Haoyi")
  .withLast("Li")
  .withCountry("Singapore")
```
In other languages like Python or SQL, adding a field (or column) with a default value is largely a backwards-compatible operation. Can we do the same for Scala?
## Principles

- Given that binary compatibility is an implementation concern - it simply doesn’t exist in a program compiled all at once from source - we should not change the Scala language or type system to accommodate it. Some kind of annotation would be ideal, given @odersky’s stated principle that annotations are for things that do not affect typechecking.
- We shouldn’t have to contort our Scala source code for binary compatibility concerns. That rules out the `.withFoo` automation in the earlier proposal, and also rules out the very tedious manual workarounds we perform today. I want to be able to write

  ```scala
  case class Person(first: String, last: String)
  ```

  and later evolve it to

  ```scala
  case class Person(
    first: String,
    last: String,
    country: String = "unknown"
  )
  ```

  or

  ```scala
  case class Person(
    first: String,
    last: String,
    country: String = "unknown",
    number: Option[String] = None
  )
  ```

  without breakage. After all, it’s (almost) source compatible, such a change would be backwards compatible in other languages, and I would like Scala to be up to that standard.
- The same solution should apply to both `case class`es and plain `def`s. Both currently allow parameters, allow parameters with defaults, and cause bincompat breakage when a new parameter with a default is added. To a developer these concepts are the same - something that takes arguments, possibly with defaults - and binary compatibility should be managed the same way for both.
- We only need to handle the case where someone adds a parameter, with a default value, to the right side of a parameter list. This is the case that is already (almost) source compatible, and is backwards compatible in other languages like Python or SQL. We don’t need to handle more complex cases like changing parameter types or re-ordering parameters, which are universally backwards-incompatible across the programming landscape.
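This “add a defaulted field” evolution is already fine at the source level in plain Scala today; it is only previously-compiled bytecode that breaks. A minimal sketch (all names illustrative):

```scala
// Callsites written against a two-parameter Person keep compiling after a
// defaulted parameter is appended; only old compiled bytecode breaks.
case class Person(first: String, last: String, country: String = "unknown")

val before = Person("Haoyi", "Li")              // old-style callsite, still compiles
val after  = Person("Haoyi", "Li", "Singapore") // new callsite using the new field
```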
## Proposal Sketch

We use a `@telescopingDefaults` annotation (the name is arbitrary) to automate the generation of “telescoping” methods and constructors.
### Defs

To begin with, let’s consider a simpler scenario: `def`s that we want to evolve with additional parameters, e.g. starting from:
```scala
def makePerson(first: String, last: String) = ???
```
to

```scala
@telescopingDefaults
def makePerson(
    first: String,
    last: String,
    country: String = "unknown") = ???
```

and then to

```scala
@telescopingDefaults
def makePerson(
    first: String,
    last: String,
    country: String = "unknown",
    number: Option[String] = None) = ???
```
The `@telescopingDefaults` annotation would generate the following forwarders:

```scala
def makePerson(
    first: String,
    last: String,
    country: String = "unknown",
    number: Option[String] = None) = ???

// Each forwarder fills in the defaults of the parameters it lacks,
// so it delegates to the full signature rather than recursing:
@synthetic
def makePerson(first: String, last: String, country: String) =
  makePerson(first, last, country, None)

@synthetic
def makePerson(first: String, last: String) =
  makePerson(first, last, "unknown", None)
```
Thus, any bytecode that was compiled against earlier versions of `def makePerson` with fewer parameters can continue to call those earlier signatures unchanged.
These definitions can be synthetic and hidden from the Scala compiler:
- Downstream code being compiled against the latest version of `def makePerson` always has the most recent signature available to compile and generate bytecode against.
- But downstream bytecode compiled earlier, against older versions of `makePerson` with fewer parameters, can continue to call the `@synthetic` forwarders, which send the method call to the right place.
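Written out by hand, the expansion can be checked with ordinary Scala today. A sketch (`PersonApi`, the string body, and the omission of the hypothetical `@synthetic` marker are all illustrative):

```scala
object PersonApi {
  // The current, full signature:
  def makePerson(
      first: String,
      last: String,
      country: String = "unknown",
      number: Option[String] = None): String =
    s"$first $last, $country, ${number.getOrElse("no number")}"

  // Forwarders preserving the historical signatures; each one supplies
  // the default values of the parameters it lacks (would be @synthetic):
  def makePerson(first: String, last: String, country: String): String =
    makePerson(first, last, country, None)

  def makePerson(first: String, last: String): String =
    makePerson(first, last, "unknown", None)
}
```

Calls at every historical arity resolve to their own overload and end up in the same body, which is the behaviour old bytecode relies on.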
### Case Classes

Case classes can be handled similarly, given an annotated `case class`:
```scala
@telescopingDefaults
case class Person(
  first: String,
  last: String,
  country: String = "unknown",
  number: Option[String] = None
)
```
We could generate the following additional code:
```scala
class Person(
    val first: String,
    val last: String,
    val country: String,
    val number: Option[String]) {

  // Telescoping constructors, each filling in the missing defaults:
  @synthetic
  def this(first: String, last: String, country: String) =
    this(first, last, country, None)

  @synthetic
  def this(first: String, last: String) =
    this(first, last, "unknown", None)

  def copy(
      first: String = this.first,
      last: String = this.last,
      country: String = this.country,
      number: Option[String] = this.number) =
    new Person(first, last, country, number)

  @synthetic
  def copy(first: String, last: String, country: String) =
    new Person(first, last, country, None)

  @synthetic
  def copy(first: String, last: String) =
    new Person(first, last, "unknown", None)
}

object Person {
  def apply(
      first: String,
      last: String,
      country: String = "unknown",
      number: Option[String] = None) =
    new Person(first, last, country, number)

  @synthetic
  def apply(first: String, last: String, country: String) =
    new Person(first, last, country, None)

  @synthetic
  def apply(first: String, last: String) =
    new Person(first, last, "unknown", None)

  def unapply(p: Person): Person = p
}
```
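The constructor side of this expansion can be checked by writing it out by hand in ordinary Scala (the class name is illustrative, and the hypothetical `@synthetic` marker is omitted):

```scala
class PersonExpanded(
    val first: String,
    val last: String,
    val country: String,
    val number: Option[String]) {
  // Telescoping auxiliary constructors preserving the old arities:
  def this(first: String, last: String, country: String) =
    this(first, last, country, None)
  def this(first: String, last: String) =
    this(first, last, "unknown", None)
}

// Old-arity constructor calls, as previously-compiled bytecode would make them:
val p2 = new PersonExpanded("Haoyi", "Li")
val p3 = new PersonExpanded("Haoyi", "Li", "Singapore")
```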
Unlike `def`s, which can only be called, there are three cases to consider for `case class`es:
- `apply`/`new`: these work similarly to `def`s above. As new parameters with defaults are added, the old signatures are kept working via forwarders, so bytecode compiled against the old signatures can continue to work.
- `copy`: this is similar to `apply`/`new` above, except that `copy` doesn’t care about the default values specified for the parameters: every parameter `foo` defaults to `this.foo`. However, we can still use the declared default values as an indicator of when we need to start caring about backwards compatibility: e.g. here we generate synthetic `copy` overloads only down to two parameters, since we do not need to provide binary compatibility with earlier versions of `Person`.
- `unapply`: I think as of Scala 3 this works right out of the box. `unapply` no longer returns `Option[TupleN[...]]` as it did in Scala 2; it just returns `Person`, with pattern matching relying on the `_1`, `_2`, etc. fields. Thus a `p match { case Person(first, last) => ??? }` callsite compiled against `case class Person(first: String, last: String)` should continue to work even when `Person` has evolved into `case class Person(first: String, last: String, country: String = "unknown", number: Option[String] = None)`.
  - Notably, while binary compatible, this will not be source compatible. That is unavoidable unless we are willing to (a) loosen the semantics of pattern matching to allow matching against prefixes of the full argument list, or (b) go all-in on pattern matching with named fields.
  - Regardless, binary compatibility is arguably much more important than source compatibility, as @sjrd argues in Designing libraries for source and binary compatibility.
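The Scala 3 name-based extraction mechanism the `unapply` point relies on can be seen directly: `unapply` hands back the `Person` itself, and patterns read its product fields. A small sketch (class name illustrative; this demonstrates the mechanism, not the binary-compatibility claim itself):

```scala
case class Person(
    first: String,
    last: String,
    country: String = "unknown",
    number: Option[String] = None)

val p = Person("Haoyi", "Li")

// In Scala 3, unapply on a case class returns the value itself, not an Option:
val extracted: Person = Person.unapply(p)

// Pattern matching then reads the _1, _2, ... product fields:
val (f, l) = p match {
  case Person(first, last, _, _) => (first, last)
}
```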
What do people think? Are there any obvious blockers I’m missing? I haven’t actually implemented this yet, but I’m wondering whether the fundamental idea is sound.

The implementation of generating forwarding proxies seems relatively straightforward, and if it can save me from constantly jumping through hoops - manually writing forwarders, or avoiding `case class`es altogether to preserve binary compatibility - it would definitely be worth investing in the automation.