Discussion for SIP-51: drop 2.13 library forwards binary compatibility

Hi All

There hasn’t been a forum thread about SIP-51 (Drop Forwards Binary Compatibility of the Scala 2.13 Standard Library) so far, so here it is.

I’m not going to repeat what’s in the SIP but I can share a couple of thoughts about implications.

First of all, the proposed behavior can be tested out in current sbt using the following settings:

// Update scala-library on the dependencyClasspath according to the dependency tree,
// instead of keeping it pinned to the scalaVersion.
scalaModuleInfo := scalaModuleInfo.value.map(_.withOverrideScalaVersion(false)),

// When invoking the scala compiler, prevent zinc from setting `-bootclasspath scala-library.jar`
// (with the scala-library of scalaVersion).
// Instead, include the scala-library from the dependencyClasspath in the ordinary `-classpath`.
classpathOptions := classpathOptions.value.withAutoBoot(false).withFilterLibrary(false),

// Running an application through sbt does not fork, it uses a custom class loader within the
// JVM of sbt. By default, this class loader uses the scala-library according to scalaVersion.
// https://www.scala-sbt.org/1.x/docs/In-Process-Classloaders.html
classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat,

sbt will then update the standard library according to the dependency tree. The Scala compiler version remains unchanged and is defined by scalaVersion. I believe this is how the change should be implemented: we want to leave users in control of the Scala compiler version.

As shown in the SIP, there is a way users can experience a failure after we implement this change. When a new version of a library L starts using new scala-library API and someone updates to that version of L while remaining on an old sbt, the scala-library on the classpath stays pinned to the user’s scalaVersion and may be missing the new API. This can result in compile-time and run-time errors.

I think it’s not possible to check for this and show an early warning or error to the user.

  • We can encourage library authors to write in their release notes that users need to upgrade sbt
  • There could be a cooldown period between the sbt release that implements the change and the first Scala 2.13 release which actually takes advantage of it (i.e., breaks forwards binary compatibility).
9 Likes

Thanks for the summary and pointers to how to test this already! @lrytz

I’m very positive towards dropping forward compatibility to allow for performance optimizations etc.

It would be good to hear more voices from the community on this. (Previously the Reply button on your post was hidden due to a config issue, but now everybody should see it…)

Given the size of scala-library, class loader layering adds a significant performance boost in various scenarios if I recall correctly, basically by caching JIT output across subprojects and command invocations.

scalaLibraryVersion task?

I wonder if it’s possible to create a task like scalaLibraryVersion, which would pick up the version found in the update graph, so users can interactively find out the discrepancy between the requested scalaVersion (the version of scalac used?) and the resolved scala-library.jar.
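A rough sketch of what such a task could look like in build.sbt, assuming the name scalaLibraryVersion and reading the resolved version from the update report (illustrative only, not an existing sbt key):

val scalaLibraryVersion = taskKey[String]("scala-library version resolved on the dependency classpath")

scalaLibraryVersion := {
  // Look up the resolved org.scala-lang:scala-library revision in the update report,
  // falling back to scalaVersion if it is not found.
  update.value.allModules
    .find(m => m.organization == "org.scala-lang" && m.name == "scala-library")
    .map(_.revision)
    .getOrElse(scalaVersion.value)
}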

Layered classloader

If that’s possible, could we keep classLoaderLayeringStrategy layered, such that the scala-library classloader would be built based on scalaLibraryVersion?

scala-library takes a few seconds to JIT, so repeating that over and over adds significant overhead to compilation in various scenarios like testQuick. testQuick would basically pause for 2~3s trying to JIT scala-library and the testing libraries if you flatten the classloader. Turbo mode (which layers test libraries too) would start running tests with no pause.

Taking a stroll back to sbt 0.13

As a historical note, sbt 0.13.0 actually used to work the way SIP-51 proposes and treated scala-library as a normal library:

  • sbt no longer overrides the Scala version in dependencies. This allows independent configurations to depend on different Scala versions and treats Scala dependencies other than scala-library as normal dependencies. However, it can result in resolved versions other than scalaVersion for those other Scala libraries.

This created confusion among users, as seen in “In SBT 0.13, does scalaVersion still control the version of scala used for compile, run and test?” (Mar 2014) and “automatically provide dependencyOverrides for all scala-lang modules” (Nov 2015), and became the impetus for adding the Eviction Warning feature, which we’ve now replaced by Eviction Error because there were too many false positives for other libraries. On #2286 I sort of tried to question both sides, to which paulp wrote:

There are three jars of interest: library, reflect, compiler. Anyone who mixes these jars across versions is doing it wrong. That you can dream up multipronged situations like a bug fixed in the library while a new bug was introduced into the compiler and there is no workaround in either case is like planning around asteroid strikes. There is no reason to present this as yet another matter for individual configuration. Let the hypothetical people who need to override the behavior do so by performing some ridiculous hack of the sort I’m always having to do.

All this predated Miles adding actual Scala version overriding to support Typelevel Scala in sbt/sbt#2634, which landed as sbt 0.13.12 in July 2016.

Taking on the tiger again

This is not to discourage SIP-51. But we now have the benefit of hindsight about what confusion may ensue again, not to mention the potential disruption to build tools, including Bazel.

To keep the compiling Scala version and scala-library aligned, would it make sense, for example, to fail the build when we detect that scalaVersion is not equal to scalaLibraryVersion? This way, the user would become aware at the point of bumping the library dependency.
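A hedged sketch of such a check, building on the hypothetical scalaLibraryVersion task above (the key name and error wording are made up for illustration):

val checkScalaLibraryVersion = taskKey[Unit]("fail if the resolved scala-library diverges from scalaVersion")

checkScalaLibraryVersion := {
  val resolved = scalaLibraryVersion.value
  val declared = scalaVersion.value
  // Fail loudly at the point where a dependency bump pulls in a different scala-library.
  if (resolved != declared)
    sys.error(s"scala-library $resolved was pulled in by a dependency; bump scalaVersion (currently $declared) to match.")
}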

2 Likes

Would this only display the version, or could users set it like dependencyOverrides? Either way, dependencyOverrides should work for scala-library.

I was looking into this yesterday, here’s my WIP: Use scala library from classpath on 2.13 by lrytz · Pull Request #7416 · sbt/sbt · GitHub

I tested it locally and it seems to work:

object A extends App { println(scala.util.Properties.versionString) }

// sbt:sbtproj> run
// [info] running A
// version 2.13.6

Does the change look right to you?

About running Scala compiler 2.13.n with a library 2.13.(n+x) on the compile-time classpath:

  • I observed that current sbt updates the Scala (2.13) library according to the dependency graph by default when using Scala 3. So we already have experience with a Scala compiler that runs against varying versions of the library.
  • We’re not changing the Scala library on the compiler’s runtime classpath. Mixing up different versions there would break (the compiler is built with the optimizer, it has code inlined from the standard library). Maybe that’s what paulp was referring to.
  • We can probably set up some tests that run older compilers on new libraries.
  • The change doesn’t affect projects with a dependency on scala-compiler.jar. They need to be fully cross-versioned as that jar is not binary compatible. (see my post below)

There will probably be some confusion caused by the change. But we’re also removing a special case and aligning the way scala-library is handled with everything else. We’re not changing it for the sake of consistency, but to unfreeze the standard library. I think it’s the right tradeoff.

What do you have in mind here? The work needed to make sure all build tools support the change?

1 Like

After more testing and thinking, I found some issues related to the fact that scala-reflect and scala-compiler are built using -opt-inline-from:scala/** (inlining from scala-library).

  • The console task runs the REPL of the scalaVersion. On the runtime classpath, the scala-library version needs to match the REPL (code was inlined into the REPL). On the REPL’s compile-time classpath we might want the updated scala-library for consistency with the compile task. But this is broken: using new definitions from the updated library only works when compiling the REPL line, and then fails at run time.

  • Using scala-reflect (libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value): if another dependency forces an update of scala-library, the bytecode in scala-reflect may crash at runtime (NoSuchMethodError).

  • The same issue would show up in projects that use scala-compiler in some way (and probably also scalap), for example when using Scala as a JSR 223 scripting engine.

This needs to be addressed. Maybe your idea to fail the build could come into play here: if the classpath contains scala-reflect or scala-compiler and its version doesn’t match scala-library, the user needs to adjust the scalaVersion.

For the REPL, we could emit a warning and use a scalaInstance based on the updated scala-library version.

I personally think failing the build and forcing people to bump their scala point version is fine.

Arguably, forcing an explicit build file change is even better than the alternative of automatically bumping the point version for them, because in that case the scalaVersion specified in the build file doesn’t match the scala library version actually used.

From Mill’s perspective, we can probably implement whatever is decided without any issue

I think the change looks good, but I haven’t tried using it myself yet.

I don’t think we should do that: it would mean that dependencies enforce a minimum Scala compiler version. There can be good reasons a project needs to keep using an older Scala compiler, for example a regression in the compiler or a missing dependency for the new version (a compiler plugin).

Scala 3 introduced LTS releases to address similar concerns.

For projects that have a dependency on scala-reflect or scala-compiler I don’t see another option though.

For future Scala 2.13 releases we can consider disabling inlining between the library/reflect/compiler jars. IIRC, last time I checked, a compiler built with the optimizer was measurably faster (running on HotSpot). But I don’t think I ever measured how much of that speedup is thanks to inlining from the library into the compiler.

Sync library / reflect / compiler versions

As discussed above, due to inlining we should make sure to keep scala-library, scala-reflect and scala-compiler versions in sync (for projects that have a dependency on scala-reflect or scala-compiler).

This problem is in fact not new, as pointed out by @dwijnand. For example, in Akka, modules need to be kept at the same minor version.

Unfortunately there is no general solution for this in sbt right now (a per-build stopgap is sketched after the list):

  • Coursier has a “SameVersion” rule that achieves the desired result

    $> cs resolve org.scala-lang:scala-library:2.13.12 org.scala-lang:scala-reflect:2.13.10 --rules 'SameVersion(org.scala-lang:scala-library,org.scala-lang:scala-reflect,org.scala-lang:scala-compiler)'
    org.scala-lang:scala-library:2.13.12:default
    org.scala-lang:scala-reflect:2.13.12:default
    
  • Maven BOM (bill of materials) seems to be somewhat related but not exactly what we need
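As a per-build stopgap (not a general resolver rule), the plain dependencyOverrides mentioned earlier can pin the three Scala artifacts to scalaVersion; this gives up the automatic upgrade of scala-library, so scalaVersion has to be bumped manually when a dependency needs a newer library:

// build.sbt sketch: keep library, reflect and compiler in lockstep with scalaVersion.
dependencyOverrides ++= Seq(
  "org.scala-lang" % "scala-library"  % scalaVersion.value,
  "org.scala-lang" % "scala-reflect"  % scalaVersion.value,
  "org.scala-lang" % "scala-compiler" % scalaVersion.value
)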

Macros

New observation: unfreezing the standard library will cause Scala 2 macros to break. A macro compiled with Scala 2.13.new cannot be used with a Scala 2.13.old compiler.

In detail: when running the compiler there are two unrelated classpaths involved:

  • the JVM classpath / runtime classpath, used by the JVM to execute the compiler
  • the compilation classpath, used by the compiler to look up symbols referenced by the code being compiled

Macros are looked up by the compiler in the compilation classpath and then dynamically loaded and executed. A macro compiled with 2.13.new can invoke a new method in scala-library that doesn’t exist in 2.13.old, leading to a NoSuchMethodException.

As an example I’m using ExecutionContext.opportunistic which was added in 2.13.4. The method is not public, but we can use a structural type to access it as explained in the scaladocs. With SIP-51 such an addition could be made public.

// compile using 2.13.4 or newer

import scala.reflect.macros.blackbox.Context
import language.experimental.macros

object A {
  def foo(x: Int): Int = macro impl

  def impl(c: Context)(x: c.Expr[Int]): c.Expr[Int] = {
    import c.universe._

    val ec = (scala.concurrent.ExecutionContext: {def opportunistic: scala.concurrent.ExecutionContextExecutor}).opportunistic
    println(ec)

    c.Expr(q"2 + $x")
  }
}
// compile using 2.13.1

object Test {
  def main(args: Array[String]): Unit = println(A.foo(40))
}

This leads to

[error] /Users/luc/code/tmp/sbtproj2/b/src/main/scala/Test.scala:3:18: exception during macro expansion:
[error] java.lang.NoSuchMethodException: scala.concurrent.ExecutionContext$.opportunistic()

Ways forward

  • The “family lockstep upgrade” requirement is due to inlining. We could stop inlining between library / reflect / compiler in a future 2.13 release; then it would be fine if sbt upgrades scala-library to a newer version without changing scala-reflect.

  • The macro issue is unrelated to inlining. I think there is no way out: if a macro was compiled with 2.13.n, it can only run on compilers 2.13.n or newer.

  • The problem of running a REPL from sbt (console task) mentioned earlier is similar to the situation with macros. The REPL’s runtime classpath is according to scalaVersion, but the compile time classpath may contain a newer scala-library. A line of code can reference a new method that doesn’t exist in the old library. The code successfully compiles and is then loaded reflectively to run on the REPL’s runtime classpath, causing a NoSuchMethodException.

Given these difficulties, I’m starting to come around to @lihaoyi’s proposal to fail the build and require users to upgrade the scalaVersion according to their dependencies. (I wonder how it would handle nightly Scala builds.)

7 Likes

I finally submitted a PR to sbt to implement the necessary changes.

On Scala 2

  • Users are required to update their scalaVersion if some dependency pulls in a newer scala-library
  • All artifacts of a Scala release (library, reflect, compiler, …) are kept at the same version using the new csrSameVersions setting

On Scala 3

  • The scala-library on the runtime classpath while executing the compiler is updated to the version present on the dependency classpath

I think these are also the changes that need to be implemented in other build tools.

4 Likes

TL;DR:

This is a bad idea. Instead, bump the version of the Scala library to 2.14, but keep the 2.13 binary compatibility platform.


I think this is a bad idea. I’m mainly discussing the impact for the Scala 2.13 world. For Scala 3.x the story might be different.

In the Scala 2.13 world, some users are forced to stay there and tend to have larger, more legacy-heavy applications. Sometimes with strange requirements and dependencies that are unmaintained upstream. When critical issues need to be fixed, dependencies are sometimes vendored in. Tooling is not always on the latest versions, either.

So what are the issues?

  • Breaking the binary compatibility scheme within the smallest version increment the Scala library has.

Which is the minor version for Scala 2.13. Especially since users were promised that releases within the smallest version increment (2.13.x) would be backward and forward binary compatible.

  • Forcing a recent Scala library version at compile-time will block any compiler plugin not yet released for that Scala library.

But using an older Scala library at compile time means that many updates can’t be made, resulting in unfixed bugs and security issues.

  • Targeting an older Scala library version (for compatibility) means we also need to use that older version of the Scala compiler.

  • A missing mindset and missing best practices for differentiating between the compile-time and runtime versions of the Scala library.

Currently, Scala 2.13 libraries try to use the latest library version. That is the best practice and was perfectly ok due to the forward compatibility promise. It means these libraries were also compiled against that version. With the new scheme, downstream projects will be forced to use at least that new version for compilation.

In source-based projects it’s rather simple: just make sure your code still compiles with all supported versions. In our binary world, it’s a bit more complicated. But it essentially comes down to these two rules:

  • compile to the lowest version you want to support
  • run with the most recent version you can

The dependency distribution system we use, the Maven repository and its POM format, already supports this. All build tools support this. But currently almost no library project in the Scala ecosystem makes use of that distinction.
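A minimal sbt sketch of those two rules from a library author’s perspective, as they would play out once the compile-time and runtime versions can differ (version numbers are illustrative):

// Library build: compile against the lowest 2.13.x you want to support.
ThisBuild / scalaVersion := "2.13.8"

// A downstream application then runs with the most recent library it can get,
// e.g. by letting a dependency pull in a newer scala-library or by stating it explicitly:
// libraryDependencies += "org.scala-lang" % "scala-library" % "2.13.15"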


As a consequence, we shouldn’t break the 2.13 release line and its binary compatibility story. It’s that simple.

And here is what my suggested compromise looks like:

  • Release the first non-forward compatible Scala library as 2.14 (instead of breaking forward compatibility in Scala 2.13.17)
  • Keep backward compatibility to 2.13
  • Keep the _2.13 artifact suffix and the 2.13 binary platform.
  • Communicate the change in 2.14 and add guidance for library authors on which Scala library version they should compile against. Teach them the difference between compile time and runtime.

With that scheme, upgrading to the latest Scala 2.14 version at runtime should always be possible due to its backward compatibility with Scala 2.13. But libraries can be built against 2.13 to avoid downstream binary breakage.

If you think bumping to Scala 2.14 is too much effort and requires tooling changes, that may be right. But I’m convinced it is still much easier to manage than the issues which will bubble up when we break forward compatibility within the 2.13.x line at such a late point.

2 Likes

This change is certainly not as simple as adding a new feature and leaving everyone else in peace.

The main impact is that upgrading a dependency may force projects to bump their Scala 2.13 version. This could cause a project to get stuck if it depends on a fully cross-versioned library that is not available in the newer version (basically compiler plugins, maybe also some macros, but ordinary macros that don’t use internal compiler API are binary compatible).

For Scala 3 users (at least on sbt), nothing changes. The scala-library.jar was always resolved according to the dependency graph.

The difficulties and tradeoffs are documented and were discussed multiple times in the committee; in the end the SIP was approved.

Note that the change in sbt shipped in 1.10.0, which came out in May, and we haven’t seen any issues so far. But I get that slow-moving / legacy projects are not on the latest sbt, and those are the users more likely to be affected.

Your suggestion to move to Scala 2.14 is not an option in my opinion. It would look strange and need a lot of explanation, and it would force us to maintain one more Scala version besides 2.12, 2.13, 3 LTS and 3 next. The changes and additional maintenance required in the tooling landscape would also be a lot bigger.

Our recommendation for library authors won’t change: we encourage updating to the latest 2.13 version. The 2.13 compiler is evolving slowly (bugfixes, JVM compat, some usability / linting, Scala 3 migration helpers) and there’s no reason any project should remain on an older version. If project upgrades get stuck due to a missing compiler plugin, the community and the compiler maintainers will learn about it and we’ll find a solution.

4 Likes

A somewhat radical way to avoid issues here would be to actually start guaranteeing backwards compatibility for the internal compiler API of Scala 2.13, so that full cross-versioning is no longer needed; it’s a constraint that might make sense at this point in the lifecycle of Scala 2.

5 Likes

Actually, there was a user on Discord recently who had some small issues with this and asked for help:

The user was using SBT 1.9.9 and Scala 2.13.13. They tried to update to SBT 1.10.5 and Scala 2.13.14 and got the following error:

[error] (documents / scalaInstance) expected `documents/scalaVersion` to be "2.13.15" or later,
[error] but found "2.13.14"; upgrade scalaVersion to fix the build.

As expected, they followed the instructions and updated to Scala 2.13.15.
However, due to a bug in 2.13.15, their code stopped compiling.

Fortunately, in this case, the issue was just a fatal warning, but this is quite annoying: If some version of the compiler has a bug that does not warrant a quick next release (e.g. only affects very few cases), and all libraries start updating the Scala version, you effectively can’t update anything until the bug is fixed :confused:

I wonder if it would be possible to configure some override in SBT like “I want to use Scala 2.13.14, but with the 2.13.15 stdlib”, as some temporary workaround.

the issue was just a fatal warning … you effectively can’t update

This is a red herring. The whole point of -Wconf is to manage warnings.
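For instance, a build can keep fatal warnings while demoting the one offending warning via -Wconf (a sketch; the msg filter below is a placeholder for whatever the compiler actually reports):

// build.sbt: warnings stay fatal overall, but the new warning introduced by the
// compiler upgrade is demoted back to a plain, non-fatal warning.
scalacOptions ++= Seq(
  "-Werror",
  "-Wconf:msg=placeholder for the reported message:w" // hypothetical filter; use the real category or message
)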

Lints, by definition, are not specified. They are engineered to maximize convenience rather than inconvenience, but it is always necessary to assess warnings when upgrading.

Language changes, including binary compatibility of the library, fall under much stricter oversight.