Next Scala 3 LTS version will increase minimum required JDK version

TL;DR

The Scala 3 Next minors in 2025 Q4 and the next LTS will drop JDK 8 support. We are seeking feedback on whether the new minimum JDK should be 11 or 17.

More info is available in the blogpost at Next Scala 3 LTS series will increase minimum required JDK version | The Scala Programming Language

This is the thread to discuss it in more detail.

9 Likes

Going for Java 17 might be too ambitious, but it allows for interop with Java’s record classes and sealed interfaces.

3 Likes

Given we’ll have another LTS in Java 25 by 2025 Q4 (it’s due in September), jumping straight to Java 17 seems reasonable: the meta-policy being to support 3 Java LTS versions.

For folks who are unable to upgrade their Java version, I think it’s reasonable to say they can stick with an old Scala LTS minor version, which will in any case continue to be maintained with point releases as necessary.

15 Likes

Once a new LTS comes out, the current LTS (3.3.x) will be maintained for one more year. Scala development guarantees | The Scala Programming Language

However, when library authors start publishing new releases using the new LTS, people stuck on the old LTS won’t be able to upgrade these dependencies.

2 Likes

How about stating a general policy like “support a JDK for 18 months after it goes out of Premium Support”? That would set expectations for the future, too.

That also puts JDK 11 on the knife’s edge.

Here’s the JDK roadmap: Oracle Java SE Support Roadmap

4 Likes

At this point, I largely agree with Haoyi – my sense is that the Scala community has been a bit over-cautious about this, and I think that’s been a mild net-negative.

(Not least, in that it’s been tacitly encouraging organizations to stick with ridiculously-old JDKs for no good reason other than not being nudged to upgrade them. I realized in my last job that our Scala team were perhaps the only remaining people in a vast enterprise still stuck on JDK 8, simply because we’d never previously set aside the time to upgrade.)

3 Likes

Why not target the latest LTS at the time of release and let people who aren’t upgrading the JDK stay on older Scala versions until they are ready to upgrade?

I think the arguments in this JEP make a lot of sense

https://openjdk.org/jeps/14

Targeting already-old JDKs prioritizes those users who can’t upgrade the JDK but who are capable of upgrading Scala. Is that the situation much of the community is in?

Sticking to old JDKs isn’t entirely harmless; it came up just a few days ago in [JVM] Use StableValue as compilation target for lazy val - #2 by sjrd

8 Likes

How does this affect the Scala tooling ecosystem JDK dependency?

I guess the tooling ecosystem will be able to stop testing JDK 8 and remove any custom support. Not a huge improvement, but one less thing to worry about.

The Long Term Support for e.g. Eclipse Temurin 17 ends in less than 3 years (October 2027); for 21, in less than 5 years (December 2029).
Maybe Java 17 isn’t so far-fetched after all…

Java version history - Wikipedia

I don’t know how much value we would get from a mere 11. With 17 we get sealed, java.lang.Record, etc. This is quite important for interoperability with Java.
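As a rough sketch of what that interop looks like from the Scala side (the Java definitions below are hypothetical, and today the match is an ordinary type test, so the compiler gets no exhaustivity checking from the Java sealed hierarchy):

    // Hypothetical Java definitions, compiled with JDK 17+:
    //   public sealed interface Shape permits Circle, Square {}
    //   public record Circle(double radius) implements Shape {}
    //   public record Square(double side) implements Shape {}

    // Scala 3 sees a record as a plain class whose components are exposed as
    // accessor methods, so we match with type tests and call those accessors.
    def area(s: Shape): Double = s match
      case c: Circle  => math.Pi * c.radius() * c.radius()
      case sq: Square => sq.side() * sq.side()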

Notably, Spark 4 will require JDK 17.

6 Likes

Let’s consider building, testing, publishing, code formatting, linting, and IDE support (jump to definition, code completion, syntax highlighting on partially edited code).

In my opinion, Scala 3 LTS adopting JDK 17 wouldn’t significantly change the ongoing maintenance burden in the short term, since we already have to deal with bug reports from a wide range of JDK versions. Taking sbt 1.x as an example, we publish using JDK 8, but CI uses a mix of JDK 8, 17, and 21. The JDK deprecating methods and features is more of a problem, given that people expect compatibility with the latest JDKs.

  1. As long as the tools are implemented using JDK 8 and tested using JDK 21 LTS (and later 25 LTS), users of the tooling would have maximum flexibility to choose Scala 2.13 or Scala 3.x on JDK 17.
  2. However, if the tooling authors adopt the next Scala 3 LTS, for example if some version of sbt 2.x or Scala Meta picked Scala 3.7 and dropped JDK 8 or 11 support, the users of those tools would naturally be forced to also upgrade to JDK 17, even if they are still using Scala 2.13 or Scala 3.3, unless we come up with some indirection mechanism.
2 Likes

Thanks for raising this; it has been the main issue on my mind. We are looking forward to adopting sbt 2 when it releases, but if it forces us to upgrade to e.g. JDK 17, that does complicate our situation.

Thanks to the -release compiler flag, you can still emit bytecode targeting an older JDK than the one you are running on. But without actually running your tests on that older JDK, you might miss issues like dependencies that dropped JDK 8 support or other subtle changes in the JDK/JVM implementation, since the -release flag cannot check for those.
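For example, a minimal build.sbt sketch of that setup (the version number is illustrative; -release is accepted by both Scala 2.13 and Scala 3):

    // build.sbt: run the compiler on a newer JDK, but restrict both the
    // emitted bytecode and the visible JDK API to Java 11.
    ThisBuild / scalacOptions ++= Seq("-release", "11")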

Projects like Cats Effect are particularly sensitive to these runtime nuances, which is why we have an extensive build matrix testing multiple JDK versions from multiple vendors across several OSes and architectures. No longer being able to run our Scala 2.13 tests on JDK 8 would be concerning.

unless we come up with some indirection mechanism.

One idea I am interested in but have not had a chance to explore yet is using sbt’s javaHome setting to run (forked) tests on an older JDK than is required by the build/tooling. Further reading:
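A minimal, untested sketch of that idea (the JDK path below is just a placeholder for wherever an older JDK is installed):

    // build.sbt: fork the test JVM and point it at an older, locally
    // installed JDK, independent of the JDK running sbt and the compiler.
    Test / fork := true
    Test / javaHome := Some(file("/usr/lib/jvm/java-8-openjdk"))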

1 Like

sbt 1.2.x added cross-JDK forking: a helper command that automatically picks up alternative Java homes, so you can write:

> java++ 8!
> test
2 Likes

I’ve always been in favor of being on the bleeding edge as much as possible, and as someone who doesn’t work on load-bearing projects with massive teams I’m likely not in the requested feedback demographic, but I personally can’t see a great reason to not drop 11 as well.

In my experience, the most painful jump is 8 → 9. Once you’re past that, 11 → 17, 17 → 23, etc. are incredibly easy. Gradle used to be a major roadblock, but nowadays the gap between a JDK version becoming GA and Gradle supporting it is on the order of days; I personally don’t see any reason why software that was able to make the 8 → 9 jump should be stuck on a version older than 21, much less 17.

If there are projects not stuck on 8, but stuck on 11 and unable to upgrade to 17, I haven’t seen any and I don’t know of any. Even old modded Minecraft versions, the JVM software notorious even outside programming circles for JDK version incompatibilities, can be run on Java 17 now; they’re not beholden to 11. Additionally, the blogpost lists several libraries as “requiring Java 11”, but to my knowledge all of them have no issues with 23.

Even if there is some old build out there with some old Gradle plugin that somehow requires modern Scala to support Java 11 but not 17, wouldn’t the 3.3 LTS migration window mitigate this? Is there a reason I’m missing for supporting any old Java version other than 8?

5 Likes

For context, we publish on-premise software for infrastructure, so we’re extremely serious about not forcing users to upgrade dependency versions any faster than their OS lifecycle requires.

We dropped support for Java below 17 eighteen months ago. (PostgreSQL is harder than Java in that regard.)

So for once, I won’t be in the “hold it back” group.
Actually, JDK upgrades are so much easier than Scala syntax upgrades that it’s not even comparable.

And that old piece of software on a server not updated for, checks notes, at least 4 years and running on an EOL JDK? There’s zero chance it will use Scala 3. I would be very astonished, but genuinely eager, to understand a use case where people paid the price of migrating from Scala 2 to Scala 3 but not of moving to at least the latest Java LTS at the time. If that implies paying huge sums to keep an old JDK supported, then it means Scala would be doing that work for free when resources are already so scarce.

So go ahead, please only support Java 17 and up.

In addition, given the speed of change imposed on the Scala language, I don’t see why you would support more than two JDK LTS versions at most. That’s roughly what distros support without premium support.

4 Likes

I am not sure if anyone has brought Kotlin into the discussion, but it still supports Java 8 as the minimum version, and if you specify a higher JDK it starts introducing optimizations specific to that JDK version.

So Kotlin can interop with Java 17 features, such as sealed interfaces and Java records by just specifying the Java version in the build — i.e., it can generate record classes, it can work with Java’s sealed types, and its own sealed types can be generated using specific Java bytecode such that they are recognized as sealed types in Java as well.

Here are some links:

Important to note that Kotlin has the burden of Android compatibility, with Scala having unburdened itself of Android since 2.12.

6 Likes

I actually brought this up in another thread in the same vein, but with a more tangible benefit: [JVM] Use StableValue as compilation target for lazy val - #6 by velvetbaldmime

I see a lot of people arguing for being on the bleeding edge and/or only supporting the latest LTS versions of the JDK. But I don’t really see many arguments about what real advantage that would give, much less how those advantages would weigh against the cost of dropping compatibility with older versions.

First of all, scalac emitting bytecode that is compatible with Java 8 (or 11, or maybe 17 in the future) does not hold you back in any way from being on the bleeding edge yourself, running your applications on Java 23 and using its APIs.

The main potential benefit, as stated a few times, is that scalac could use some new features in the compiled code; however, that could also be done by following Kotlin’s example and enabling/disabling those features via the -release flag. Even so, it remains to be seen whether Scala could indeed leverage those features while maintaining compatibility with the spec, and hopefully binary compatibility, and whether it is even worth it, i.e. whether we actually gain any meaningful benefits in exchange for the cost of supporting them.

1 Like

That’s the best of both worlds and would be amazing.

In principle, sure – but that’s an ongoing maintenance burden, managing potentially significantly-different compile paths depending on the target. Given that the core team is relatively small, that’s not a trivial factor.

4 Likes