I thought that as long as the implementation is not copied, there is no copyright infringement.
Maybe Dotty should be targeted as a Java-independent platform? That should be a bit easier.
Oracle's claim is that the APIs themselves are copyrighted, and therefore can't be legally re-implemented. I don't recall offhand whether that claim has been formally ruled upon yet or not…
It has. Twice. In Google's favour. But that's not the end, unfortunately.
The court ruled correctly originally that APIs are not subject to copyright. [The US Court of Appeals for the Federal Circuit] threw that out and ordered the court to have a jury determine the fair use question. The jury found it to be fair use, and even though CAFC had ordered the issue be heard by a jury, it now says "meh, we disagree with the jury."
It's a fantastic idea, but I feel that it shouldn't be considered in isolation. Not only do we have Dotty looming, but we're now two major releases into Java's module system, and the unavoidable warnings because SBT uses protobuf (why???) and protobuf doesn't do modules.
Before we start abstracting our own packages, perhaps we should implement our own module system. It's becoming increasingly obvious that we need one; and we can't just use Java's as-is, because it's less suited to Scala.js and because it might be legally considered an API… (regardless of any prior art in OSGi).
Ideally we'd have something that could reasonably abstract over both Java modules on the JVM and imports/exports under Scala.js, and would be usable in some form via Scala Native. This means we'd also get a solution that's nicer to work with than Java's module system; not that that's especially hard to do! I have some thoughts on this, especially regarding the runtime command-line warts that Java demands, many of which could be handled more elegantly in the module-info files.
Get that far, and it's relatively simple (because the hard work has been done by this point) to have java.core NOT be open to Scala programs by default. Problem solved!
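For context, this is the shape of the per-module metadata that any Scala-level abstraction would have to map onto when targeting the JVM; the module and package names below are purely illustrative:

```java
// module-info.java (illustrative names): Java's per-module descriptor.
module com.example.app {
    requires java.sql;              // explicit dependency on another module
    exports com.example.app.api;    // only this package is visible to consumers
    opens com.example.app.internal; // grants runtime reflective access
}
```

Several of Java's runtime command-line flags (`--add-opens`, `--add-exports`) exist precisely to patch these declarations after the fact, which is why folding that information into the module descriptor could be cleaner.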
Kevin Wright makes a fantastic point, although an implementation of that sounds difficult. Is it worth looking into source-code distribution, as Go does, to allow the JVM, JS, and Native targets to all use the same libraries?
Or is it worth finding a partly compiled format that can then be compiled to the JVM, JS, or native code?
These modules would open the way to entirely new options for sharing libraries in ways that Java does not support. I think it is worth discussing these new options if this change is decided upon.
There does not seem to be much documentation online, but that's exactly the idea of the TASTY format, currently being developed at EPFL in the context of Dotty. Distributing plain source code is kind of brittle and a little "dumb." The idea of TASTY is to store a fully resolved and typed tree, compressed in a binary format. I don't think there has been particular focus on modules, but at least it solves the distribution + cross-compilation aspects.
TASTY will not help, nor will any other "partially compiled" format.
If I understand the linked post correctly, you're saying that Tasty isn't sufficient because:
- Different platforms may require different dependencies
- Different platforms may require different source-level implementations of the same interface.
I think both of these things can be handled; for inspiration, see the Multi-Release JAR File specification, which solves a similar problem when the "platform" is just different Java versions. We could have multi-platform TASTY jars based on the same principle.
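For comparison, a Multi-Release JAR selects class files by Java version under `META-INF/versions/`; a multi-platform TASTY jar could hypothetically extend the same layout (the `META-INF/platforms/` entry below is invented and not part of any existing spec):

```
my-library.jar
├── META-INF/MANIFEST.MF          (contains "Multi-Release: true")
├── com/example/Foo.class         default (Java 8) implementation
├── META-INF/versions/9/
│   └── com/example/Foo.class     Java-9-specific override
└── META-INF/platforms/js/        hypothetical platform-specific section
    └── com/example/Foo.tasty     TASTY tree selected when linking for JS
```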
There does not seem to be much documentation online, but that's exactly the idea of the TASTY format
Yes, documentation needs to be improved; for now, the discussion in Consider making a TASTY backend · Issue #21 · twitter/rsc · GitHub probably has the most detail.
I wasn't thinking about any sort of "interim" pre-compiled formats here… just the idea that Scala might internalise the concept of modules, make them something truly native to the language, and do so in a way that could abstract over both the Java and JS interpretations of the same concept.
We could then represent java.core as such a module and not make it available by default. Instead, there would be scala.core, scala.io, etc. which would be implemented in terms of java.core on the JVM, but would instead be based on JS/node built-ins on that platform.
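To make that concrete, here is a minimal sketch of such a platform-split facade; all names (`CoreIO`, `JvmIO`) are invented for illustration:

```scala
// Hypothetical sketch: a scala.io-style facade defined once, with a
// platform-specific implementation supplied per backend.
trait CoreIO {
  def printLine(s: String): Unit
}

// JVM backend: implemented in terms of the Java runtime (the proposed java.core).
object JvmIO extends CoreIO {
  def printLine(s: String): Unit = System.out.println(s)
}

// A Scala.js backend would instead delegate to the host environment,
// e.g. console.log in a browser or process.stdout under Node.js.
```

The module system discussed above would then decide which implementation backs `scala.io` on each platform, instead of every library reaching for `java.*` directly.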
In reply to @smarter
We already kind of have this, by using dependencies in SBT with varying numbers of % symbols to allow the most relevant version (e.g. Java or JS) of a library to be used.
I don't believe it would be too difficult to take this concept, which is already exposed by Scala.js, and enshrine it as a Scala-wide concept across all targeted platforms.
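For reference, this is the existing sbt convention being alluded to; the artifact names are only illustrative:

```scala
// build.sbt sketch: each extra % folds one more axis into the artifact name.
libraryDependencies ++= Seq(
  "com.google.guava" %   "guava"      % "31.1-jre", // plain Java artifact
  "org.typelevel"    %%  "cats-core"  % "2.9.0",    // appends the Scala binary version (e.g. _2.13)
  "org.typelevel"    %%% "cats-core"  % "2.9.0"     // also appends the platform suffix (e.g. _sjs1)
)
```

(Note that `%%%` comes from the Scala.js / sbt-crossproject plugins rather than core sbt, which is part of the point: it is a convention layered on top, not a language-level concept.)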
That assumes that `scala.core` itself does not depend on `java.core`! And at this point you're back to square one: who is going to put in the effort to reimplement the Scala standard library not to depend on `java.*` stuff?
Also, you'll see that some concepts from `java.lang._` are deeply ingrained in Scala's type system and core compiler mechanics. For example, `String`, or the boxed classes. In Scala.js, these classes are so deeply embedded in the compilation pipeline that they have their own IR concept of "hijacked class". They're even a core part of the type system of the Scala.js IR. There's even more Java than Scala in the IR specification! This coupling was necessary in Scala.js because the existing Scala ecosystem relies on it. There was no other way that wouldn't have involved rewriting scalac, the Scala stdlib and a few libraries from the ecosystem.
Decoupling all of that is going to be an insane amount of work.
I don't see a need to decouple the Scala runtime from the Java runtime. I think the larger benefit to the Scala language is to provide implementations of Java packages to Scala programs. The Scala libraries can be platform-dependent as long as each platform is required to support a certain core set of features.
"Decoupling all of that is going to be an insane amount of work." … yes, it will be.
I sincerely hope that nobody believed it would be easy; it's not. Whichever way you care to look at this, decoupling Scala from pre-assumed Java concepts would never be a trivial task!
I didn't pitch into this conversation trying to claim that there's a way to make this simple, because there really, really isn't. I just wanted to move the bar a little by proposing a manner in which it might be made a little bit more structured and a little bit less painful if we should decide that the benefits of this approach would outweigh the costs.
As a clarification, `scala-java-time` does export to `java.time`. The source code is in `org.threeten.bp`; otherwise, running tests is unreliable due to security restrictions on the JVM about overriding classes in the `java` namespace. The packages are renamed at build time. In practice this is transparent to end users. Just reuse the code on JVM/JS using `java.time`.
Is it possible to standardize hijacked classes in a SIP? They seem like a higher-performance replacement for value classes, and the syntax of hijacked classes is more concise than opaque types + implicit conversions.
Hijacked classes are similar to Haxe's `abstract`, which is actually Haxe's approach to building a library whose API is independent of any specific platform. Haxe references the underlying value in an `abstract` using the `this` keyword, and so does Scala.js's hijacked class.
Despite the huge effort of migration, hijacked classes also show a possibility of creating a new core library as a replacement for `java.lang` for Scala on the JVM.
I'm guessing many contributors are keen to create a hijacked core library of a new design, considering the Scala community has created Scalaz and many other libraries redesigning fundamental features that are already implemented in the Scala standard library.
So the whole process can be split into three steps:
- Standardize the feature of hijacked classes or opaque classes.
- Create a new hijacked core library for each backend.
- Port `scala-library` to the hijacked core library instead of `java.lang`.
@yangbo I think you misunderstood what a hijacked class is.
First, since you mention Haxe's abstract types (docs), they are actually much closer to the Opaque type aliases proposal, combined with some extension methods.
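A rough sketch of that combination, using the Dotty-proposal syntax for opaque type aliases (the `Meters` example is invented):

```scala
// Opaque type alias + extension methods: outside the `units` object,
// Meters is a distinct type; inside it, Meters is known to be a Double.
object units {
  opaque type Meters = Double

  object Meters {
    def apply(d: Double): Meters = d
  }

  extension (m: Meters) {
    def toDouble: Double = m
    def +(that: Meters): Meters = m + that
  }
}
```

Unlike a hijacked class, this is purely a compile-time device: there is no claim that `Meters` is the canonical runtime representative of `Double`, which is the part that only the Scala.js IR can express.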
Scala.js' hijacked class is an IR-level device with some very weird properties:
- The instances of a hijacked class are actually primitive values of the appropriate type. For example, in Scala.js, instances of `java.lang.Double` are primitive JavaScript `number`s. There is an actual subtyping relationship in the IR. This is meaningless in Scala/JVM.
- They are the canonical class-level representative for their underlying primitive values, e.g., `x.getClass` will return `classOf[java.lang.Double]` if `x` is a primitive JavaScript `number`.
- Since all instances of a hijacked class are primitives, you can cast them down, which means that `this.asInstanceOf[Int]` is meaningful in the body of `java.lang.Integer` (and does not involve calling `intValue()` as it would on the JVM).
Hijacked classes are necessary to provide interoperability with JavaScript, while keeping (most of) the portability with existing Scala/JVM code that expects things like `(x: Any) match { case _: java.lang.Integer => "yes"; case _ => "no" }` to answer `"yes"`.
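On the JVM that expectation holds because boxing an `Int` produces a real `java.lang.Integer`; a minimal runnable illustration (the object name is invented):

```scala
object HijackDemo {
  // Mirrors the pattern match from the post above: on the JVM, a boxed
  // scala.Int is an actual java.lang.Integer instance.
  def classify(x: Any): String = x match {
    case _: java.lang.Integer => "yes"
    case _                    => "no"
  }

  def main(args: Array[String]): Unit = {
    println(classify(42))    // boxed Int: prints "yes"
    println(classify("str")) // prints "no"
  }
}
```

Scala.js preserves this behaviour even though, there, the value is just a JavaScript `number` at runtime, which is exactly what the hijacked-class machinery is for.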
Thanks, this was my biggest issue with scala-java-time: it was using `org.threeten.bp`. Didn't realise there was a technical reason why.