Discussion for SIP-51: drop 2.13 library forwards binary compatibility

Hi All

There hasn’t been a forum thread about SIP-51 (Drop Forwards Binary Compatibility of the Scala 2.13 Standard Library) so far, so here it is.

I’m not going to repeat what’s in the SIP but I can share a couple of thoughts about implications.

First of all, the proposed behavior can be tested out in current sbt using the following settings:

// Update scala-library on the dependencyClasspath according to the dependency tree,
// instead of keeping it pinned to the scalaVersion.
scalaModuleInfo := scalaModuleInfo.value.map(_.withOverrideScalaVersion(false)),

// When invoking the scala compiler, prevent zinc from setting `-bootclasspath scala-library.jar`
// (with the scala-library of scalaVersion).
// Instead, include the scala-library from the dependencyClasspath in the ordinary `-classpath`.
classpathOptions := classpathOptions.value.withAutoBoot(false).withFilterLibrary(false),

// Running an application through sbt does not fork, it uses a custom class loader within the
// JVM of sbt. By default, this class loader uses the scala-library according to scalaVersion.
// https://www.scala-sbt.org/1.x/docs/In-Process-Classloaders.html
classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat,

sbt will then update the standard library according to the dependency tree. The Scala compiler version remains unchanged and is defined by scalaVersion. I believe this is how the change should be implemented: we want to leave users in control of the Scala compiler version.

As shown in the SIP, there is a way that users can experience a failure after we implement this change. When a new version of a library L starts using new scala-library API and someone updates to that version of L while remaining on an old sbt, the scala-library on the classpath stays pinned to the user’s scalaVersion and may be missing the new API. This can result in compile-time and run-time errors.
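The run-time side of this failure can be simulated: bytecode compiled against a newer scala-library may reference a method that the older scala-library on the classpath does not have, and the lookup then fails. A minimal sketch using reflection (the method name is hypothetical, chosen only to trigger the failure mode):

```scala
object MissingApiDemo {
  // Simulate code compiled against a newer scala-library referencing a
  // method that the older scala-library on the classpath lacks: reflective
  // lookup fails with NoSuchMethodException (a direct call would fail with
  // NoSuchMethodError).
  def lookup(methodName: String): String =
    try {
      classOf[scala.collection.immutable.List[_]].getMethod(methodName)
      "found"
    } catch {
      case _: NoSuchMethodException => "missing: " + methodName
    }

  def main(args: Array[String]): Unit =
    println(lookup("methodAddedInNewerLibrary")) // hypothetical new API
}
```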

I think it’s not possible to check for this and show an early warning or error to the user.

  • We can encourage library authors to write in their release notes that users need to upgrade sbt
  • There could be a cooldown period between the sbt release that implements the change and the first Scala 2.13 release which actually takes advantage of it (i.e., breaks forwards binary compatibility).

Thanks for the summary and pointers to how to test this already! @lrytz

I’m very positive towards dropping forwards compatibility to allow for implementing performance optimizations etc.

It would be good to hear more voices from the community on this. (Previously the Reply button on your post was hidden due to a config issue, but now everybody should see it.)

Given the size of scala-library, if I recall correctly, class loader layering adds a significant performance boost in various scenarios, basically caching JIT across subprojects and command invocations.

scalaLibraryVersion task?

I wonder if it’s possible to create a task like scalaLibraryVersion, which would pick up the version found in the update graph, so users can interactively find out the version discrepancy between the requested scalaVersion (the version of Scalac used?) and the scala-library.jar.
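A sketch of what such a task could look like in build.sbt, reading the resolved scala-library version from the update report (the key name scalaLibraryVersion is an assumption, not an existing sbt key):

```scala
// Hypothetical task: report the scala-library version actually resolved
// on the compile classpath, which may differ from scalaVersion.
val scalaLibraryVersion = taskKey[Option[String]]("resolved scala-library version")

scalaLibraryVersion := {
  update.value
    .configuration(ConfigRef("compile"))  // compile configuration report
    .toVector
    .flatMap(_.modules)                   // all resolved modules
    .map(_.module)
    .collectFirst {
      case m if m.organization == "org.scala-lang" && m.name == "scala-library" =>
        m.revision
    }
}
```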

Layered classloader

If that’s possible, could we keep classLoaderLayeringStrategy such that the scala-library classloader would then be implemented based on scalaLibraryVersion?

scala-library takes a few seconds to JIT, so repeating that over and over adds significant overhead to compilation in various scenarios like testQuick. testQuick would basically pause for 2~3s trying to JIT scala-library and the testing libraries if you flatten the classloader. Turbo mode (which layers the test libraries too) would start running tests with no pause.

Taking a stroll back to sbt 0.13

As a historical note, sbt 0.13.0 actually used to work as SIP-51 proposes, treating scala-library as a normal library:

  • sbt no longer overrides the Scala version in dependencies. This allows independent configurations to depend on different Scala versions and treats Scala dependencies other than scala-library as normal dependencies. However, it can result in resolved versions other than scalaVersion for those other Scala libraries.

This created confusion among users, as seen in In SBT 0.13, does scalaVersion still control the version of scala used for compile, run and test? (Mar, 2014) and automatically provide dependencyOverrides for all scala-lang modules (Nov, 2015), and became the impetus for adding the Eviction Warning feature, which we’ve since replaced with Eviction Error because there were too many false positives for other libraries. On #2286 I sort of tried to question both sides, to which Paulp wrote:

There are three jars of interest: library, reflect, compiler. Anyone who mixes these jars across versions is doing it wrong. That you can dream up multipronged situations like a bug fixed in the library while a new bug was introduced into the compiler and there is no workaround in either case is like planning around asteroid strikes. There is no reason to present this as yet another matter for individual configuration. Let the hypothetical people who need to override the behavior do so by performing some ridiculous hack of the sort I’m always having to do.

All this predated Miles adding actual Scala version overriding to support Typelevel Scala in sbt/sbt#2634, which landed as sbt 0.13.12 in July 2016.

Taking on the tiger again

This is not to discourage SIP-51. But we now have the hindsight of what confusion may ensue again, not to mention the potential disruption to build tools, including Bazel.

To keep the alignment of the compiling Scala version and scala-library, would it make sense, for example, to fail the build instead when we detect that scalaVersion is not equal to scalaLibraryVersion? This way, the user would be aware at the point of bumping up the library dependency.
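One possible shape for such a check, sketched as an sbt task (the names and wiring are assumptions, not an existing sbt API):

```scala
// Hypothetical check: fail when the resolved scala-library differs from scalaVersion.
val checkScalaLibraryVersion = taskKey[Unit]("fail on scala-library / scalaVersion mismatch")

checkScalaLibraryVersion := {
  val resolved = update.value
    .configuration(ConfigRef("compile"))
    .toVector
    .flatMap(_.modules)
    .map(_.module)
    .collectFirst {
      case m if m.organization == "org.scala-lang" && m.name == "scala-library" =>
        m.revision
    }
  resolved.filter(_ != scalaVersion.value).foreach { v =>
    sys.error(s"scala-library $v on the classpath does not match scalaVersion " +
      s"${scalaVersion.value}; please bump scalaVersion accordingly")
  }
}
```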


Would this only display the version, or could users set it like dependencyOverrides? Either way, dependencyOverrides should work for scala-library.

I was looking into this yesterday, here’s my WIP: Use scala library from classpath on 2.13 by lrytz · Pull Request #7416 · sbt/sbt · GitHub

I tested it locally and it seems to work

object A extends App { println(scala.util.Properties.versionString) }

// sbt:sbtproj> run
// [info] running A
// version 2.13.6

Does the change look right to you?

About running Scala compiler 2.13.n with a library 2.13.(n+x) on the compile-time classpath:

  • I observed that current sbt updates the Scala (2.13) library according to the dependency graph by default when using Scala 3. So we already have experience with a Scala compiler that runs against varying versions of the library.
  • We’re not changing the Scala library on the compiler’s runtime classpath. Mixing up different versions there would break (the compiler is built with the optimizer, it has code inlined from the standard library). Maybe that’s what paulp was referring to.
  • We can probably set up some tests that run older compilers on newer libraries.
  • The change doesn’t affect projects with a dependency on scala-compiler.jar. They need to be fully cross-versioned as that jar is not binary compatible. (see my post below)

There will probably be some confusion caused by the change. But we’re also removing a special case and aligning the way scala-library is handled with everything else. We’re not changing it for the sake of consistency, but for unfreezing the standard library. I think it’s the right tradeoff.

What do you have in mind here? The work needed to make sure all build tools support the change?


After more testing and thinking, I found some issues related to the fact that scala-reflect and scala-compiler are built using -opt-inline-from:scala/** (inlining from scala-library).

  • The console task runs the REPL of scalaVersion. On the runtime classpath, the scala-library version needs to match the REPL (code was inlined into the REPL). On the REPL’s compile-time classpath we might want the updated scala-library for consistency with the compile task. But this is broken: using new definitions in the updated library works when compiling the REPL line, but then fails to run.

  • Using scala-reflect (libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value). If another dependency forces an update of scala-library, the bytecode in scala-reflect may crash at runtime (NoSuchMethodError).

  • The same issue would show up in projects that use scala-compiler in some way (and probably also scalap), for example when using Scala as a JSR 223 scripting engine.

This needs to be addressed. Maybe your idea to fail the build could come into play here. If the classpath contains scala-reflect or scala-compiler and the version doesn’t match scala-library, the user needs to adjust the scalaVersion.

For the REPL, we could emit a warning and use a scalaInstance based on the updated scala-library version.

I personally think failing the build and forcing people to bump their scala point version is fine.

Arguably, forcing an explicit build file change is even better than the alternative of automatically bumping the point version for them, because in that case the scalaVersion specified in the build file doesn’t match up with the scala-library version actually used.

From Mill’s perspective, we can probably implement whatever is decided without any issue

I think the change looks good, but I haven’t tried using it myself yet.

I don’t think we should do that, it would mean that the dependencies enforce a minimal Scala compiler version. There can be good reasons a project needs to keep using an older Scala compiler, for example a regression in the compiler or a missing dependency for the new version (compiler plugin).

Scala 3 introduced LTS releases to address similar concerns.

For projects that have a dependency on scala-reflect or scala-compiler I don’t see another option though.

For future Scala 2.13 releases we can consider disabling inlining between the library/reflect/compiler jars. IIRC, last time I checked, a compiler built with the optimizer was measurably faster (running on HotSpot). But I don’t think I ever measured how much of that speedup is thanks to inlining from the library into the compiler.

Sync library / reflect / compiler versions

As discussed above, due to inlining we should make sure to keep scala-library, scala-reflect and scala-compiler versions in sync (for projects that have a dependency on scala-reflect or scala-compiler).

This problem is in fact not new, as pointed out by @dwijnand. For example, Akka modules need to be kept at the same minor version.

Unfortunately, there is no general solution for this in sbt right now:

  • Coursier has a “SameVersion” rule that achieves the desired result

    $> cs resolve org.scala-lang:scala-library:2.13.12 org.scala-lang:scala-reflect:2.13.10 --rules 'SameVersion(org.scala-lang:scala-library,org.scala-lang:scala-reflect,org.scala-lang:scala-compiler)'
  • Maven BOM (bill of materials) seems to be somewhat related but not exactly what we need
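Until such a rule exists in sbt proper, one workaround sketch is to pin the sibling artifacts with dependencyOverrides. Note this keeps reflect/compiler at scalaVersion rather than syncing all three to the newest resolved version, so it is only a partial measure:

```scala
// Workaround sketch: keep scala-reflect and scala-compiler aligned with scalaVersion.
dependencyOverrides ++= Seq(
  "org.scala-lang" % "scala-reflect"  % scalaVersion.value,
  "org.scala-lang" % "scala-compiler" % scalaVersion.value
)
```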


New observation: unfreezing the standard library will cause Scala 2 macros to break. A macro compiled with Scala 2.13.new cannot be used with a Scala 2.13.old compiler.

In detail: when running the compiler there are two unrelated classpaths involved

  • the JVM classpath / runtime classpath, used by the JVM to execute the compiler
  • the compilation classpath, used by the compiler to look up symbols referenced by the code being compiled
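To make the distinction concrete, here is a hand-run compiler invocation sketch (illustrative only; all jar names are placeholders): the JVM classpath carries the jars the compiler itself runs on, while -classpath is what the code being compiled links against.

```shell
# JVM (runtime) classpath: what the compiler itself runs on.
# -classpath: what the code being compiled links against.
java -cp scala-compiler.jar:scala-reflect.jar:scala-library-2.13.old.jar \
  scala.tools.nsc.Main \
  -classpath scala-library-2.13.new.jar:macro-lib.jar \
  Test.scala
```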

Macros are looked up by the compiler in the compilation classpath and then dynamically loaded and executed. A macro compiled with 2.13.new can invoke a new method in scala-library that doesn’t exist in 2.13.old, leading to a NoSuchMethodException.

As an example I’m using ExecutionContext.opportunistic which was added in 2.13.4. The method is not public, but we can use a structural type to access it as explained in the scaladocs. With SIP-51 such an addition could be made public.

// compile using 2.13.4 or newer

import scala.reflect.macros.blackbox.Context
import language.experimental.macros

object A {
  def foo(x: Int): Int = macro impl

  def impl(c: Context)(x: c.Expr[Int]): c.Expr[Int] = {
    import c.universe._

    val ec = (scala.concurrent.ExecutionContext: {def opportunistic: scala.concurrent.ExecutionContextExecutor}).opportunistic

    c.Expr(q"2 + $x")
  }
}
// compile using 2.13.1

object Test {
  def main(args: Array[String]): Unit = println(A.foo(40))
}

This leads to

[error] /Users/luc/code/tmp/sbtproj2/b/src/main/scala/Test.scala:3:18: exception during macro expansion:
[error] java.lang.NoSuchMethodException: scala.concurrent.ExecutionContext$.opportunistic()

Ways forward

  • The “family lockstep upgrade” requirement is due to inlining. We could stop inlining between library / reflect / compiler in a future 2.13 release, then it would be fine if sbt upgrades scala-library to a newer version without changing scala-reflect.

  • The macro issue is unrelated to inlining. I think there is no way out: if a macro was compiled with 2.13.n it can only run on compilers 2.13.n or newer.

  • The problem of running a REPL from sbt (console task) mentioned earlier is similar to the situation with macros. The REPL’s runtime classpath is according to scalaVersion, but the compile time classpath may contain a newer scala-library. A line of code can reference a new method that doesn’t exist in the old library. The code successfully compiles and is then loaded reflectively to run on the REPL’s runtime classpath, causing a NoSuchMethodException.

Given these difficulties, I’m starting to come around to @lihaoyi’s proposal to fail the build and require users to upgrade the scalaVersion according to their dependencies. (I wonder how it would handle nightly Scala builds.)


I finally submitted a PR to sbt to implement the necessary changes.

On Scala 2

  • Users are required to update their scalaVersion if some dependency pulls in a newer scala-library
  • All artifacts of a Scala release (library, reflect, compiler, …) are kept at the same version using the new csrSameVersions setting

On Scala 3

  • The scala-library on the runtime classpath while executing the compiler is updated to the version present on the dependency classpath

I think these are also the changes that need to be implemented in other build tools.