Question about changes in the Scalac release cycle

Hello guys,

As some of you know, the Java community is planning some serious changes in the release cadence of OpenJDK (these are the minutes of the last JSR meeting). The most important part of the document is the following:

OpenJDK to move faster - they would like to move to a 6 month release cadence with releases in March and September with the next release of Java SE in early 2018.

This release cycle has now been approved.

I wonder whether the Scala team at Lightbend is planning to make any changes to the Scala release cadence any time soon. Also, does this news change the JDK that Scala 2.13 will target? From the linked document above, it looks like JDK 10 will be out by early 2018, coinciding with the Scala 2.13 release.

/cc @SethTisue @lrytz @adriaanm @retronym


Java keeps backward binary compatibility across major releases. Scala doesn’t. I most certainly hope that Scala doesn’t get a new major version every 6 months!

1 Like

2.13 will run on Java 8, so it cannot depend on Java 9 or 10 features (JVM or JDK library). See here for an overview of 2.13:

We’re working on various aspects of JDK 9 support for 2.12 and 2.13, see

As SĂ©bastien says, given that Scala major releases are not binary compatible, a cadence of 6 months would be too demanding for our community.

I think my question has been misinterpreted :slight_smile: . I’m not proposing we follow suit — I agree that 6 months would be unreasonable. What I wondered is whether these changes in the JDK release cycle would trigger either a standardization of how often we cut major and minor releases, like the Java community now has, or a change in the targeted JDK for Scala 2.13 (which Lukas has already answered; thanks!).

I think this is a good opportunity to be more concrete about the Scala release cycle. My impression is that it varies drastically from version to version (e.g. the Scala 2.12 cycle was really long, while Scala 2.13’s is expected to be short), and I believe it would be beneficial if we could instead say “Scala major versions are released every 12 to 15 months”.

1 Like

This has been the case so far, but nothing says that Scala could not have new feature releases regularly without breaking BC, right? You could still have major releases every X years that break BC if needed.

Sure, but the Scala core team would have to drop its policy of guaranteeing forward binary compatibility across minor releases for that to be useful. Otherwise you can’t even add a method to the library in minor releases, so you’re still waiting for the BC-breaking release for new features to land.

A bit off-topic, but is forward binary compatibility really all that useful? As you pointed out, it’s the reason we need major releases for basically any tiny change to the library. Of course, major releases don’t have to be binary incompatible, but everyone assumes so, including the sbt cross-compilation ecosystem. So small, incremental improvements can’t be shipped without a disruptive Scala major-release upgrade.
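
For reference, this forward/backward distinction is exactly what MiMa (the sbt-mima-plugin used in the scala/scala build) can check. A minimal build.sbt sketch, with made-up module coordinates, might look like:

```scala
// build.sbt sketch; the module coordinates below are hypothetical.
// Backward BC: code compiled against 1.0.0 still links against 1.1.0.
// Forward BC: code compiled against 1.1.0 still links against 1.0.0,
// which is the guarantee that rules out even adding a new public method.
mimaPreviousArtifacts := Set("com.example" %% "my-lib" % "1.0.0")
mimaCheckDirection   := "both" // "backward" (the default), "forward", or "both"
```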

IMO forward bin compat is completely useless.

Except, in fact, for the benefit of Scala.js users. Currently, when a new minor version of Scala is released, say 2.12.4, all we have to do in Scala.js is check out an earlier version (say v0.6.20), run ++2.12.4, and then compile and publish the scalajs-compiler_2.12.4.jar for that version. Users of Scala.js can immediately benefit from 2.12.4, while still using the scalajs-library_2.12.jar that was built from the sources of the 2.12.3 std lib. This works because we have a guarantee that the 2.12.4 std lib is forward binary compatible with 2.12.3. Without that guarantee, users could get linking errors with this scheme; they would have to wait for a new version of Scala.js whose scalajs-library.jar has been built from the sources of the 2.12.4 std lib.

I would say that’s not for the benefit of Scala.js users, but for its maintainers. The users couldn’t care less how you produced scala.js 2.12.4 :slight_smile: What about significant fixes in the standard library, the ones that can be made forward and backward compatible? Wouldn’t Scala.js users expect that fix if they’re on 2.12.4?

Nah, it is for the benefit of the users. For us, we don’t really care. At worst we’re a bit more pressured into releasing a new version of Scala.js after each new minor version of Scala.

But the ones who are really annoyed are the users: when a new minor version of Scala is released, they still can’t use it right away. They have to wait for us to get around to releasing an entire new version of Scala.js, which typically takes weeks, instead of waiting just a couple of hours (sometimes tens of minutes) for me to republish scalajs-compiler.jar.

An argument in favor of forward bincompat is that it simplifies the life of sbt plugin maintainers and users. If Scala doesn’t promise forward bincompat from now on, then plugin maintainers will be forced to compile with the minor Scala version sbt is compiled with. If a plugin is compiled with Scala 2.12.3 and sbt is still stuck with 2.12.1, sbt could fail to load the plugin if the plugin uses a new method that was added in scala-library 2.12.3.

This is not an isolated example; it would also disrupt the ecosystem of compiler plugins and macro libraries. Most compiler plugins and macro libraries are cross-versioned not on the full Scala version but on the partial (binary) one. If a user adds a compiler plugin that was compiled with 2.12.3 to a project that still uses 2.12.0, compilation could also fail.

There are solutions to these problems. The first one that comes to mind is to have our tools force consistent use of Scala versions. For example, sbt would force modules with sbtPlugin := true to compile with a Scala version lower than or equal to the Scala version that the plugin’s target sbt uses. To make a plugin compile with the newest Scala version, sbt plugin maintainers would need to update their sbt.version in project/. A similar solution could be applied to the builds of compiler plugins and macro libraries: if they don’t cross-compile on the full Scala version, sbt would fail.
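
A rough sketch of what such a check could look like in sbt’s own DSL; checkPluginScalaVersion and sbtTargetScalaVersion are invented names for illustration, not real sbt keys:

```scala
// build.sbt sketch (hypothetical key names; only sbtPlugin and
// scalaVersion are real sbt settings).
val sbtTargetScalaVersion = "2.12.1" // assumed: Scala version of the sbt being targeted

val checkPluginScalaVersion = taskKey[Unit](
  "Fail if this sbt plugin is built with a newer Scala than its target sbt uses")

checkPluginScalaVersion := {
  def patch(v: String) = v.split('.').last.toInt
  if (sbtPlugin.value && patch(scalaVersion.value) > patch(sbtTargetScalaVersion))
    sys.error(
      s"Plugin built with Scala ${scalaVersion.value}, " +
      s"but the targeted sbt runs on $sbtTargetScalaVersion")
}
```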

This is not a complete solution, though:

  • Developers stuck on outdated tools will not get these checks.
  • Developers need to become familiar with the new semantics and know how to act on the errors.

But, despite these problems, I think the Scala community would be much better off without forward bincompat because:

  • Both scala-library and compiler maintainers would be able to ship things much faster. For example, @oscar’s addition of substitute methods to <:< and =:= was merged into 2.13.x instead of 2.12.3 because it would have broken forward binary compatibility.
  • New features and improvements to the compiler that have to wait because they require changes to scala-reflect would be possible.
  • API additions to the scala-library would not need to wait until the next major version.
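
To make the first point concrete, here is a sketch of code compiled against the 2.13 <:< API, where the substitute methods eventually landed; under forward binary compatibility the same addition could never have shipped in a 2.12 minor release:

```scala
// Compiles against scala-library 2.13, where <:< gained substituteCo
// (note: substituteCo requires a covariant type constructor F[+_]).
def widen[F[+_], A, B](fa: F[A])(implicit ev: A <:< B): F[B] =
  ev.substituteCo(fa)

// The emitted bytecode calls <:<.substituteCo directly. Linked against an
// older scala-library that lacks the method, it would fail at runtime with
// java.lang.NoSuchMethodError, which is exactly what forward BC rules out.
```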

All these together make major versions lighter in features (easing migration). IMO, the most important observation is that it would allow the Scala team to release major versions much less often.

In my book, the cost of forward bincompat is too high. For the moment, I believe it’s better to remove it.

1 Like

Good point. Although one could argue that, at the end of the day, you should compile and test your sbt plugin against the Scala version that will actually end up being used. Otherwise, say you are relying on a bug fix in the std lib: your plugin might still be broken with the older Scala version. So I would agree with you that we should make sure sbt plugins are compiled against the Scala version matching the sbt version they target.

The point is valid if you are using new compiler features and/or relying on new compiler bug fixes.

Macros will be fine even if we drop forward BC, because their POM metadata will say that they need the newer version of scala-library/scala-reflect, which will therefore be put on the dependencyClasspath by ivy resolution. And since scala-library and scala-reflect guarantee backward BC, that’s OK.
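
A sketch of the resolution scenario being described, with hypothetical coordinates for the macro library:

```scala
// build.sbt sketch (com.example %% my-macros is a made-up artifact).
scalaVersion := "2.12.1"
// my-macros was published from a project built with 2.12.3, so its POM
// declares a dependency on scala-library 2.12.3.
libraryDependencies += "com.example" %% "my-macros" % "1.0.0"
// Resolution now sees scala-library 2.12.1 (from scalaVersion) and 2.12.3
// (from my-macros’ POM); the older one is evicted, so 2.12.3 ends up on
// dependencyClasspath, and backward BC makes that safe.
```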

For compiler plugins, using the partial cross-version has always been broken, since scala-compiler.jar promises neither forward nor backward binary compatibility (nor source compatibility). This is a very real thing: the Scala.js compiler plugin jumps through all sorts of hoops to keep compiling against all versions of scala-compiler.jar from a single set of source files, including some very nasty workarounds. So compiler plugin authors who use a partial cross-version are already wrong and exposed to random breakages right now. This must not be done!

1 Like

Right, ivy resolution will force the latest Scala version unless there’s an explicit override. Good point!

I am not familiar with the Scala.js release process, so why do you need to tie a new release of scalajs-library to a new release of the scala.js compiler plugin? Why doesn’t it work the same way as for the scalajs-compiler.jar?

It doesn’t, but it should. An excerpt:

Also, with scalaVersion set to “2.12.0” and a dependency to that version of monocle-core I expected scala-library 2.12.1 to be used at compile-time and runtime, given that’s what monocle depends on.

It doesn’t work the same because scalajs-library.jar is cross-compiled on the binary version, whereas the compiler is cross-compiled on the full version. scalajs-library_2.12-0.6.20.jar is the standard library for the 2.12 ecosystem, version 0.6.20, and it is based on the Scala std lib sources of 2.12.3 because that was the most recent version as of the release of Scala.js 0.6.20. scalajs-compiler_2.12.3-0.6.20.jar is the compiler plugin for 2.12.3 specifically, version 0.6.20.
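
In sbt terms, the two schemes correspond to the two cross-versioning modes, as each module’s settings would declare them:

```scala
// Sketch: how each Scala.js module would be cross-versioned in build.sbt.

// scalajs-library: one artifact per *binary* Scala version.
crossVersion := CrossVersion.binary // -> scalajs-library_2.12-0.6.20.jar

// scalajs-compiler: one artifact per *full* Scala version, which is what
// makes it republishable after each minor Scala release.
crossVersion := CrossVersion.full   // -> scalajs-compiler_2.12.3-0.6.20.jar
```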

This means that when 2.12.4 comes along, we just go back to the tag v0.6.20, and can publish scalajs-compiler_2.12.4-0.6.20.jar after the fact. This works because it is a new artifact on Maven. We cannot do the same for scalajs-library_2.12-0.6.20.jar, because that artifact already exists, and we cannot change it.

So then the natural question is: why don’t we also cross-version scalajs-library on the full version? Well then, we have an even bigger problem. If a library A was compiled with scalaVersion := "2.12.3", it will depend on scalajs-library_2.12.3. And then if another project B which depends on A uses scalaVersion := "2.12.4", it will depend on scalajs-library_2.12.4. Now you end up with two versions of scalajs-library on the classpath, instead of one being evicted by ivy resolution. And that’s really bad.

If we want ivy to evict the older scalajs-library, we need to use the Scala version number as the artifact version. But then where do we put the Scala.js version number? If we combine the two into one version string, say 0.6.20-2.12.4, that’s no good either: plain version ordering would let 0.6.21-2.12.3 evict it, even though it ships an older std lib. If we switch the order of the two numbers, we have the reverse problem, obviously.

This double-version problem means that one Scala.js version can only ever be built for one version of the Scala std lib. And hence once 0.6.20 has been released based on 2.12.3, the only way you can possibly get the 2.12.4 std lib is to wait for Scala.js 0.6.21.

1 Like

I would like to note for the record that we already see the problems of lacking forward binary compatibility whenever bugs or other changes inadvertently break something (i.e. a release isn’t actually as compatible as it’s supposed to be, at either the source or binary level).

Because of this (and it has happened in a number of releases throughout 2.10, 2.11, and 2.12), you can end up with situations where libraries are stuck on older point releases. If some libraries are stuck on older point releases while other libraries require new features enabled by the lack of forward bincompat, you can end up with an ecosystem that you cannot resolve at all.

Of course, if you ever get something working you can just stick with it, but that makes it harder to benefit from bug fixes, and may impede moving complex projects between major versions (since the many child libraries never manage to get in sync to all use the same minor version).

For projects of modest size, I wouldn’t expect this to be a showstopper. But for huge projects with many dependencies, the lack of forward bincompat may pose an additional hurdle (i.e. scalability is impeded). (Of course, it’s already hard enough to get all dependencies on the same major version, but at least now once they’re all there, things will work.)

I think this already happens with major releases, for instance when a certain library version is no longer cross-published with the latest Scala major version (I have an example in a different thread, Akka being the library in question). The upside would be that minor versions can continue coming at a fast pace and blocker bugs will be fixed much quicker.

Also, forward binary compatibility is not as good as it sounds. For instance, one cannot run the Scala 2.12.3 compiler with a 2.12.2 library, so in your example you’d be stuck in exactly the same place (except you’d be counting on a guarantee that isn’t there). The culprit: colored output was added to the REPL in 2.12.3, which meant adding a new method to the Scala library, adding an exception to MiMa, and publishing 2.12.3 with code that already uses it. Sure, it’s private[scala], but if you must use scala-library 2.12.2 per your thought experiment, you’re in a worse position than if forward binary compatibility weren’t assumed.

Maybe we should note the bigger picture: practically everyone else works exclusively with backwards binary compatibility and it’s just fine. Not saying we shouldn’t innovate, but it’s a data point.

1 Like

Not being able to achieve forward binary compatibility in practice is a reason to abandon it, I agree.

And I also agree this library-incompatibility thing happens all the time with major releases, which is why I’m worried about it also happening for minor releases!

Note that Java’s backwards binary compatibility is incredibly strong–they basically never remove anything, or haven’t until very recently–which doesn’t work as well for us if we want to tidy up things that in retrospect could have been done better. I had always thought of major-release forward bincompat as a consolation for not having eternal backwards bincompat.