Continuing or dropping Scala 2.10 maintenance in the ecosystem?

Scala 2.10 used to have (and still has, to some extent) huge gravitas in the Scala ecosystem. In large part this was due to Spark and sbt supporting only 2.10. Spark has since migrated to 2.11, and sbt 1.0.x is around the corner (well, it’s effectively already here, because the RCs are binary compatible).

Most Scala libraries (as far as I can tell), cross-build against at least 2.10, 2.11, and 2.12. Some even added support for 2.13, which means four major Scala versions.
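For readers less familiar with how this works in practice: cross-building is mostly a one-line declaration in sbt (a minimal sketch; the version numbers are illustrative):

```scala
// build.sbt (sketch): build and publish against three major Scala versions
crossScalaVersions := Seq("2.10.6", "2.11.11", "2.12.3")
scalaVersion := crossScalaVersions.value.head
```

With this in place, prefixing a task with `+` (e.g. `sbt +test`, `sbt +publishSigned`) runs it for every listed version.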

On the one hand, this is a significant improvement over the state of the art before 2.10: it used to be unthinkable to support so many versions simultaneously with so few code changes.

On the other hand, it is also a maintenance burden. Broad cross-compatibility also hinders progress: libraries are unable to exploit newer features from, say, 2.11.x.

Which is why I am asking community maintainers to consider dropping 2.10 support now. It means version switches in build files (e.g. for Scala modules like scala-xml) and compatibility libraries (e.g. macro-compat) can go away. In almost all cases I’ve seen, these were used only to distinguish between 2.10 and 2.11+.
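For illustration, here is the kind of build.sbt switch that commonly exists solely because of 2.10 (dependency versions are illustrative):

```scala
// build.sbt (sketch): a 2.10-vs-2.11+ switch that could disappear
libraryDependencies ++= {
  CrossVersion.partialVersion(scalaVersion.value) match {
    // 2.10: scala-xml still ships with the standard library,
    // but macros need the macro-paradise compiler plugin
    case Some((2, 10)) =>
      Seq(compilerPlugin(
        "org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full))
    // 2.11+: scala-xml became a separate module
    case _ =>
      Seq("org.scala-lang.modules" %% "scala-xml" % "1.0.6")
  }
}
```

Once 2.10 is dropped, the whole `match` collapses into a plain dependency list.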

Additionally, dropping 2.10 support will incentivize people to migrate to sbt 1.0.x. In many cases it’s not that hard to do. (Big thanks to the sbt developers for that.)


Scala 2.10 represents a huge maintenance burden.

The compiler is very fragile, especially when working with macros: it easily crashes with unintelligible error messages that can take hours, or even days, to debug, because at some point you end up editing random source files in the hope that the crash will go away.

And generating ScalaDoc for 2.10, 2.11 and 2.12 is basically mission impossible.

The Scala ecosystem does a poor job of maintaining backwards compatibility. But given the tools we have, it isn’t reasonable to expect libraries to support three or four major Scala versions, with Scala.js and possibly Scala Native in the mix as well.


We also plan on dropping 2.10 in ENSIME when the next hurdle arrives.

However, realistically, the sbt plugin ecosystem needs to be updated before anybody can use sbt 1.0. I give it a year or more. I really hope the Scala Center can find the resources to help, because many sbt plugins are now unmaintained or the required changes are quite large (and the number of people who understand sbt is quite small). We were already asked to upgrade from 0.12, and then again for AutoPlugins, with each transition incurring technical debt that 1.0 no longer supports. For example, the ensime-sbt plugin would take me several full-time days to upgrade (it’s not just the code, it’s also the regression tests), which is time that I simply do not have to donate to the community.


4 posts were split to a new topic: Scala and backwards compatibility

Just to add something: My proposal of course only means that, going forward, we should stop publishing 2.10 artifacts for new releases. For maintenance branches, publishing 2.10 artifacts would still be fine. The reasoning is that, IMHO, people who are stuck on 2.10 for whatever reason are also unlikely to be able to bump any dependency, so they should not be affected by this change in policy.


Less than 10% of cats downloads are for 2.10, but I don’t know how many of them are actual end users vs. libraries that cross-build. 2.10 has caused some headaches in cats in the past, and being able to use scalameta without hacky workarounds would be nice for the project. IMHO, cats 1.0 should still be released with 2.10 support: there isn’t much extra cost at this point, and 1.0 will remain stable for a longer period of time. After the sbt 1.0.0 release and its adoption, we won’t see much incentive to continue supporting 2.10, and we could, again IMHO, passively drop 2.10 support in cats.


I think the biggest user group of Scala 2.10 is users of Spark 1.x.x, which doesn’t fully support Scala 2.11. I couldn’t find any information on whether Spark 1.x.x is still maintained (the last release, 1.6.3, happened on November 7, 2016). Some time ago Spark 1.x.x was the only version supported by Cloudera, but I see that is no longer the case.

I never supported 2.10 for Scanamo, as it didn’t just work, and by the time the library was gaining any traction Spark had 2.11 support, which had been the only reason I was considering it. I thought it pretty unlikely anyone would try to use it in an SBT plugin.

Is it likely that people using Spark 1.x.x would want to use the latest and greatest Scala libraries, e.g. Monix 3.0? (I’m mentioning Monix because @alexandru has already dropped 2.10 support.)

I just had a brief look into the build file for Scanamo – any reason why you’re using macro-compat, or is that some sort of leftover?

I think it’s a mistake to consider dropping support until some time after the release of SBT 1.0. I maintain an SBT plugin, and I’d upgrade after something is released, not while it is an RC.

In general, for major releases, I would suggest adopting a FreeBSD-style support model.


I’m glad that this discussion is on the table. From my side, I would like to drop 2.10 support in Scala.js 1.x. Of course, we can only do that if the maintainers of most Scala libraries also drop 2.10; otherwise they will yell at me because Scala.js makes their cross-building scripts more complicated.


FWIW, the Sonatype download stats for some projects I maintain:

  • Ammonite: 22% 2.10, 44% 2.11, 33% 2.12
  • FastParse-JVM: 29% 2.10, 58% 2.11, 13% 2.12
  • Scalatags-JVM: 33% 2.10, 50% 2.11, 17% 2.12

Those aren’t small numbers of people on 2.10; it seems like I’ll be stuck supporting it for quite some time more.


I have almost finished migrating my major sbt plugins (e.g., sbt-sonatype, sbt-pack, sbt-sql, etc.) so that we can drop Scala 2.10 support. However, migration to sbt 1.0 (Scala 2.12) was hard, because I encountered various incompatibility issues (e.g., sbt plugins lacking Scala 2.12 support). I totally agree with the idea of dropping Scala 2.10, but doing so is already challenging because many libraries still depend on Scala 2.10 or 2.11 code.

As another example of migration pains, Spark seems to be struggling to support Scala 2.12, because since 2.12 the Scala compiler emits Java 8 lambdas (which sometimes produces non-serializable classes).
Spark sends closures to remote machines to perform distributed processing, so if the Scala compiler generates non-serializable lambda classes, it’s almost impossible to ship and run user code remotely the way Spark does.
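To make the constraint concrete, here is a minimal sketch (not Spark’s actual code; the names are mine) of the property a Spark-style system relies on: a closure must survive plain Java serialization before it can be shipped to executors.

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream}

object ClosureShipping {
  // Roughly what Spark must be able to do with a user closure:
  // turn the function value into bytes for the wire.
  def serialize(closure: AnyRef): Array[Byte] = {
    val buffer = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buffer)
    // Throws NotSerializableException if the compiled lambda class
    // (or anything it captures) isn't Serializable.
    out.writeObject(closure)
    out.close()
    buffer.toByteArray
  }

  def main(args: Array[String]): Unit = {
    val n = 3
    val addN = (x: Int) => x + n // a closure capturing n
    val bytes = serialize(addN)
    println(s"serialized ${bytes.length} bytes; addN(4) = ${addN(4)}")
  }
}
```

On 2.11 the anonymous-class encoding made such closures serializable; with the 2.12 lambda encoding the same guarantee has to hold for every capture pattern, which is where Spark ran into trouble.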

We already have Scala 2.10.x, 2.11.x, 2.12.x, Scala.js, Scala Native, etc. I feel I’m spending much of my effort just configuring TravisCI, CircleCI, etc. to enable cross-building for all these versions and target platforms.

I think a current problem is that sbt plugins are maintained by individual contributors, while the knowledge needed to maintain them is mostly concentrated on the sbt developers’ side, which makes migration to the latest version difficult. Cross-building itself is not such a pain if we have a working configuration for the major CI tools. At least Scala 2.11 and 2.12 are almost compatible, except for tricky cases like Spark’s.

If we had a central repository for all major sbt plugins and common Scala libraries, incompatible changes on the Scala language or sbt core side could be detected in a continuous-integration manner while the core Scala tools are being developed. Then we could discuss how to address these incompatibilities on the Scala language and sbt development side, before relying on individual contributors’ efforts.

In a nutshell, I’d like to propose changing the current Scala development process from:

  • (current) Ship it (a new Scala/sbt version) -> let individual contributors test and fix problems
  • (future) Gather all core libraries and plugins somewhere (e.g., the Scala Center?) -> Test incompatible changes for each Scala version -> Fix or provide workarounds -> Ship it

That already exists for JVM Scala: it’s called the Scala community build, and it currently includes 109 projects.

Nothing comparable yet exists for Scala.js or Scala Native.

As for sbt, the Scala community build is also used to test new minor sbt versions before they are released. But it isn’t focused on that purpose, so it gives only limited assurance. (For example, it doesn’t rebuild sbt plugins from source.)


People like myself use Maven for publishing sbt plugins. I’d also add that Scala exists in corporate environments because people have produced things like the Jenkins plugin (some firms have old CI environments; not everyone is on Travis, etc.).

Dropping support for Scala 2.10 is not a good idea, at least not until plugins like the above are freely available on SBT 1.0 and have been released for some time.

Unless the goal of this exercise is to make it harder for some people to use Scala in corporate environments. I have worked in environments where resistance to anything but Java is high, and it takes a lot of work to persuade people of the benefits.

Making what I’d call a rash decision to kill off support before all the tools are ready will just make it harder for people to adopt the language, and harder for those pushing for its adoption.

That’s a remnant that I hadn’t noticed. Thanks for pointing it out.

On ScalaMock, the picture currently looks like this (Central downloads across all ScalaMock artefacts from the last 30 days):

But I echo the thoughts from some people above.
Scala 2.10 still has significance in the Big Data space (Spark 1.x), so in the corporate world it would really hurt if support for that version dropped off, weakening the position of Scala as a whole.

Again, I find it highly unlikely that users of Spark 1.x would want to use the bleeding-edge version of, say, Monix 3.0.


This rule makes the most sense. An outdated Scala keeps the library ecosystem outdated, and the effect compounds through the interplay of interdependent libraries. Conversely, a fresh library ecosystem pulls users toward a relatively fresh Scala, just as libraries exert that pull on one another.