Make Scala-platform API independent of Java namespaces


I thought that as long as the implementation is not copied there is no copyright infringement.


Maybe dotty should be targeted as a Java-independent platform? That should be a bit easier.


Oracle’s claim is that the APIs themselves are subject to copyright, and therefore can’t be legally re-implemented. I don’t recall offhand whether that claim has been formally ruled upon yet or not…


It has. Twice. In Google’s favour. But that’s not the end, unfortunately.

The court ruled correctly originally that APIs are not subject to copyright. [The US Court of Appeals for the Federal Circuit] threw that out and ordered the court to have a jury determine the fair use question. The jury found it to be fair use, and even though CAFC had ordered the issue be heard by a jury, it now says ‘meh, we disagree with the jury.’

Techdirt via LWN


It’s a fantastic idea, but I feel that it shouldn’t be considered in isolation. Not only do we have dotty looming, but we’re now two major releases into Java’s module system, with its unavoidable warnings, because SBT uses protobuf (why???) and protobuf doesn’t do modules.

Before we start abstracting our own packages, perhaps we should implement our own module system. It’s getting increasingly obvious that we need one; and we can’t just use Java’s as-is, because it’s less suited to Scala.js and because it might be legally considered an API… (regardless of any prior art in OSGi).

Ideally we’d have something that could reasonably abstract over both Java modules on the JVM and imports/exports under Scala.js, and would be usable in some form via Scala Native. This means we’d also get a solution that’s nicer to work with than Java’s module system; not that that’s especially hard to do! I have some thoughts on this, especially regarding the runtime command-line warts that Java demands, many of which could be handled more elegantly in the module-info files.

Get that far, and it’s relatively simple (because the hard work has been done by this point) to have java.core NOT be open to Scala programs by default. Problem solved!


Kevin Wright makes a fantastic point, although implementing it sounds difficult. Is it worth looking into source-code distribution, as Go does, to allow Scala on the JVM, JS, and Native to all use the same libraries?

Or is it worth finding a partly-compiled intermediate format that could then be compiled to the JVM, JS, or native?

These modules would open the way to entirely new options for sharing libraries, in ways that Java does not support. I think those options are worth discussing if this change is decided upon.


There does not seem to be much documentation online, but that’s exactly the idea of the TASTY format, currently being developed at EPFL in the context of Dotty. Distributing plain source code is kind of brittle and a little “dumb.” The idea of TASTY is to store a fully resolved and typed tree compressed in a binary format. I don’t think there has been particular focus on modules, but at least it solves the distribution + cross-compilation aspects.


TASTY will not help, nor any other “partially compiled” format.


If I understand the linked post correctly, you’re saying that Tasty isn’t sufficient because:

  • Different platforms may require different dependencies
  • Different platforms may require different source-level implementations of the same interface.

I think both of these things can be handled; for inspiration, see the Multi-Release JAR File specification, which solves a similar problem when the “platform” is just different Java versions. We could have multi-platform tasty jars based on the same principle.


There does not seem to be much documentation online, but that’s exactly the idea of the TASTY format

Yes, the documentation needs to be improved; for now, that discussion probably has the most details.


I wasn’t thinking about any sort of “interim” pre-compiled formats here… just the idea that Scala might internalise the concept of modules, make them something truly native to the language, and do so in a way that could abstract over both the Java and JS interpretations of the same concept.

We could then represent java.core as such a module and not make it available by default. Instead, there would be scala.core, etc., which would be implemented in terms of java.core on the JVM, but would be based on JS/Node built-ins on that platform.


In reply to @smarter

We already kind of have this, by using dependencies in SBT with varying numbers of % symbols to allow the most relevant version (e.g. Java or JS) of a library to be used.

I don’t believe it would be too difficult to take this concept (which is already exposed by Scala.js) and enshrine it as a Scala-wide concept across all targeted platforms.
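For reference, this is roughly what the varying-`%` convention looks like in an sbt build today (a sketch; the artifact names and versions are purely illustrative, and `%%%` requires the Scala.js/crossproject sbt plugins):

```scala
// build.sbt (sketch; dependencies shown are illustrative examples)
libraryDependencies ++= Seq(
  // %   — a plain Java artifact, no Scala version in the name
  "com.google.guava" % "guava" % "28.1-jre",
  // %%  — appends the Scala binary version (e.g. _2.13) to the artifact name
  "org.typelevel" %% "cats-core" % "2.0.0",
  // %%% — additionally appends the platform suffix, selecting the JVM,
  //       Scala.js, or Scala Native build of the same library
  "io.circe" %%% "circe-core" % "0.12.3"
)
```

So the platform-selection concept already exists at the build level; the proposal above would pull it into the language/module layer instead.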


That assumes that scala.core itself does not depend on java.core! And at this point you’re back to square one: who is going to put in the effort to reimplement the Scala standard library not to depend on java.* stuff?

Also, you’ll see that some concepts from java.lang._ are deeply ingrained in Scala’s type system and core compiler mechanics. For example, String, or the boxed classes. In Scala.js, these classes are so deeply embedded in the compilation pipeline that they have their own IR concept of “hijacked class”. They’re even a core part of the type system of the Scala.js IR. There’s even more Java than Scala in the IR specification! This coupling was necessary in Scala.js because the existing Scala ecosystem relies on it. There was no other way that wouldn’t have involved rewriting scalac, the Scala stdlib and a few libraries from the ecosystem.

Decoupling all of that is going to be an insane amount of work.


I don’t see a need to decouple the Scala runtime from the Java runtime. I think the larger benefit to the Scala language is to provide implementations of Java packages to Scala programs. The Scala libraries can be platform dependent as long as it is required that each platform support a certain core set of features.


“Decoupling all of that is going to be an insane amount of work.” … yes, it will be

I sincerely hope that nobody believed it would be easy; it’s not. Whichever way you look at this, decoupling Scala from pre-assumed Java concepts was never going to be a trivial task!

I didn’t pitch into this conversation trying to claim that there’s a way to make this simple, because there really really isn’t. I just wanted to move the bar a little by proposing a manner in which it might be made a little bit more structured and a little bit less painful if we should decide that the benefits of this approach would outweigh the costs.


As a clarification, scala-java-time does export to java.time. The source code is in org.threeten.bp; otherwise, running tests is unreliable due to the JVM’s security restrictions on overriding classes in the java namespace.

The packages are renamed at build time. In practice this is transparent to end users: you just reuse the code on the JVM and JS via java.time.
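So, assuming the scala-java-time dependency is on the classpath, cross-platform code just imports java.time as usual. A minimal sketch:

```scala
import java.time.LocalDate

// On the JVM this resolves to the JDK's own java.time; under Scala.js it
// resolves to scala-java-time's org.threeten.bp sources, renamed at build
// time. The calling code is identical on both platforms.
val due = LocalDate.of(2020, 1, 1).plusDays(30)
println(due) // 2020-01-31
```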


Is it possible to standardize hijacked classes in a SIP? They seem like a higher-performance replacement for value classes, and the syntax of hijacked classes is more concise than opaque types + implicit conversions.

Hijacked classes are similar to Haxe’s abstract, which is actually Haxe’s approach to building a library whose API is independent of any specific platform.

Haxe references the underlying value of an abstract using the this keyword, as does Scala.js’s hijacked class.
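For comparison, the existing value-class encoding being discussed looks like this (a minimal sketch; Meters is a made-up example):

```scala
// A value class wraps a primitive, usually without allocating, but it stays
// a distinct compile-time type. Contrast with an IR-level hijacked class,
// whose instances *are* the underlying primitive at runtime.
final class Meters(val value: Double) extends AnyVal {
  def +(that: Meters): Meters = new Meters(value + that.value)
}

val total = new Meters(1.5) + new Meters(2.5)
println(total.value) // 4.0
```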


Despite the huge migration effort, hijacked classes also show a possibility of creating a new core library to replace java.lang for Scala on the JVM.

I’m guessing many contributors would be keen to create a hijacked core library with a new design, considering the Scala community has created Scalaz and many other libraries that redesign fundamental features already implemented in the Scala standard library.

So the whole process can be split into three steps:

  1. Standardize the feature of hijacked classes (or opaque classes).
  2. Create a new hijacked core library for each backend.
  3. Port scala-library to the hijacked core library instead of java.lang.


@yangbo I think you misunderstood what a hijacked class is.

First, since you mention Haxe’s abstract types (docs), they are actually much closer to the Opaque type aliases proposal, combined with some extension methods.
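To make the comparison concrete, here is roughly what the opaque-type-alias + extension-method encoding looks like, using the syntax from that proposal (since adopted in Scala 3; Logarithm is a made-up example):

```scala
object Logarithms:
  // At runtime a Logarithm *is* a Double; outside this object the alias is
  // abstract, so only the operations defined here apply to it.
  opaque type Logarithm = Double

  object Logarithm:
    def apply(d: Double): Logarithm = math.log(d)

  extension (x: Logarithm)
    def toDouble: Double = math.exp(x)
    def *(y: Logarithm): Logarithm = x + y // multiply values by adding logs

import Logarithms.*
val eight = (Logarithm(2.0) * Logarithm(4.0)).toDouble // ~8.0
```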

Scala.js’ hijacked class is an IR-level device with some very weird properties:

  • The instances of a hijacked class are actually primitive values of the appropriate type. For example, in Scala.js, instances of java.lang.Double are primitive JavaScript numbers. There is an actual subtyping relationship in the IR. This is meaningless in Scala/JVM.
  • They are the canonical class-level representative for their underlying primitive values, e.g., x.getClass will return classOf[java.lang.Double] if x is a primitive JavaScript number.
  • Since all instances of a hijacked class are primitives, you can cast them down, which means that this.asInstanceOf[Int] is meaningful in the body of java.lang.Integer (and does not involve calling intValue() as it would on the JVM).

Hijacked classes are necessary to provide interoperability with JavaScript, while keeping (most of the) portability with existing Scala/JVM code that expects things like (x: Any) match { case _: java.lang.Integer => "yes"; case _ => "no" } to answer "yes".
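A minimal sketch of that portability expectation (this runs on the JVM; the point is that Scala.js gives the same answers even though the runtime representation differs):

```scala
def describe(x: Any): String = x match {
  case _: java.lang.Integer => "yes"
  case _                    => "no"
}

// On the JVM, 1: Any is boxed to a java.lang.Integer. On Scala.js it is a
// primitive JavaScript number, yet the hijacked-class machinery makes this
// type test agree with the JVM semantics.
println(describe(1))     // yes
println(describe("one")) // no
```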


Thanks, this was my biggest issue with scala-java-time: it was using the org.threeten.bp package. I didn’t realise there was a technical reason why.