Alternative scalajs/scala-native distribution mechanisms

A post was split to a new topic: Finding way to engage with more FLOSS Scala maintainers

Ah, so basically you’re not interested in reducing the maintenance burden overall? For the sake of argument, let’s say maintaining Scala.js cross-compilation was as easy as maintaining a different Scala major version (i.e., ++-like easy); would you still be fighting to get rid of it?

1 Like

We’ve had these for, literally, years by this point. If you maintain a library that people want to use on Scala.js, just tell people you don’t have time to support it and someone else will fork and publish it.

But if your concern is “people can’t say no when random internet strangers ask them to do more work”, this clearly won’t work at all.

Do you have any scala.js/scala-native projects you maintain whose builds you can point me at, to show how troublesome it is? That would definitely help people understand your concern.

I think the overall point is that nowadays you can’t just build a useful Scala library and publish one major version at a time.

Instead, you’ve got to deal with supporting multiple backends and multiple major versions of Scala, and all in all, that’s a burden that falls squarely on the maintainers of libraries.

I think his point is that he worries no one is paying attention to the burden falling on maintainers of popular open source libraries as a result of this.

As a result, one could argue that it becomes more difficult for people maintaining libraries to actually maintain them, thereby potentially having a negative effect on the overall community.

I don’t think it’s a point that you can just toss aside and discount.

1 Like

My main pain point is this issue here: Support multiple scala-js/scala-native versions · Issue #47 · portable-scala/sbt-crossproject · GitHub . I am attempting to release an ultra stable library as part of the new Scala Platform (scalajson) and I found out that I am unable to build against different versions of Scala.js (and also scala-native). A big part of the Scala.js community is still on 0.6.x, and I also want to publish for 1.0.0. Same deal with scala-native.

Turns out the only way to accomplish this is with environment variable hacks, and at this point I want to explore using CBT (although apparently I can’t, because the plugin for using the Scala Platform will only support SBT).

In my opinion, the first step to solve this problem is to stop treating different platforms (such as scala-js or scala-native) as sbt plugins. Instead they should be first class backends for Scala, so it’s easy to have a simple alternative to crossScalaVersions for scala-native/scala-js (ideally as a setting in sbt).
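To make the idea concrete, here is a purely hypothetical sketch of what such a first-class setting could look like (crossPlatforms, JVM, JS and Native are invented names for illustration; only crossScalaVersions exists in sbt today):

```scala
// Hypothetical: declaring target platforms the same way a build
// already declares Scala versions. None of the platform values
// below exist in real sbt; this only illustrates the proposal.
lazy val scalajson = project
  .settings(
    crossScalaVersions := Seq("2.11.11", "2.12.3"),
    crossPlatforms     := Seq(JVM, JS("0.6.x"), JS("1.0.x"), Native("0.3.x"))
  )
```

The point is that switching backends would then be a matter of data in one setting, rather than enabling and disabling whole plugins.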

Another problem is the issue with standard libraries, my personal pet hate: the Scala ecosystem appears to be porting Java libs rather than creating idiomatic Scala versions that cross compile to all platforms (even if they just wrap existing libraries on the JVM platform). Once everyone settles on a scala._ packaged library (or wherever the library happens to live), it’s incredibly easy to deal with migrations and cross compile code. A very real example of this is Scala’s Future: it was supported almost instantly in Scala.js, and thanks to this basically all libraries which used Future for asynchronous/concurrency problems had no issues porting to JavaScript.

Compare this to the chaos we have with date libraries. In the Scala world we now have to deal with 4 date libraries: JodaTime, java.time from JDK8, scala-java-time (GitHub - cquiroz/scala-java-time: Implementation of the `java.time` API in scala. Especially useful for scala.js), and nscala-time (GitHub - nscala-time/nscala-time: A new Scala wrapper for Joda Time based on scala-time). In order to cross build stuff at work, I had to use type aliases (i.e. a DateTime alias which points to java.time on the JVM and points to the scala-java-time DateTime on Scala.js, because for some reason it sits in a different package, org.threeten.bp). Is it so hard to just release a date library that compiles on all platforms, is included by default, and is actually part of Scala (and not just Java ports, which usually fail for non-trivial things in areas like io and dates, because the JVM makes a lot of assumptions)? The other thing is that Scala is not Java; we don’t have to have a library frozen in stone for ages due to backwards compatibility issues. Even if there are problems with these ports, they can be solved iteratively over time (we are also a strongly typed static language, and we have scalafix).
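The type-alias workaround mentioned above can be sketched roughly like this (JVM side only; the object and alias names are illustrative, and on Scala.js the alias would instead point at the scala-java-time class under org.threeten.bp):

```scala
// JVM side of a hypothetical cross-platform date alias. Shared code
// depends only on compat.DateTime; each platform's source directory
// supplies its own definition of the alias.
object compat {
  type DateTime = java.time.LocalDateTime
}

object Demo {
  def main(args: Array[String]): Unit = {
    val d: compat.DateTime = java.time.LocalDateTime.of(2017, 9, 1, 12, 0)
    println(d.getYear) // prints 2017
  }
}
```

Shared code that only ever mentions compat.DateTime then cross-compiles unchanged, which is exactly the alias trick described above.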

In summary (tl;dr version)

  • Make SBT cross publish to other platforms (scala-js/scala-native) just as easily as it does for different Scala versions now. Scala should also stop assuming the JVM is the “default backend”. This probably means that zinc should stop relying on .class files (which are JVM specific) and maybe work directly on trees
  • Make and provide common libraries, using the best libraries out there as inspiration, that are included by default on all platforms. We saw how successful this was with Future for Scala.js; if Future weren’t part of the Scala stdlib, the async story on JavaScript would have been a complete and utter nightmare

Also I like the idea of source distributions, with maybe a global internet cache of binaries to speed things up. A huge amount of this complexity is usually due to having to support binary compatibility between differing versions of Java/Scala (and now Scala.js with 0.6.x vs 1.0.x). I envy languages like Go and Crystal, which basically said “let’s forget about dynamically loadable libraries” (which require binary compatibility), and that has made publishing OSS libraries so much easier. Also, source compatibility is much easier to maintain than binary compatibility. Even though we have tools to help with binary compatibility (i.e. sbt-mima), this is only for the JVM (it’s also not installed/enabled by default), and no such tools exist for scala-js/scala-native (should they start caring about binary compatibility).
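For reference, wiring up sbt-mima is short; the complaint above is that it is opt-in rather than on by default. A minimal sketch, assuming sbt-mima-plugin is already added in project/plugins.sbt and with a hypothetical previous-version string:

```scala
// build.sbt — minimal MiMa setup. Compares the current artifacts
// against the declared previous release when you run
// mimaReportBinaryIssues; nothing runs unless you opt in.
mimaPreviousArtifacts := Set(organization.value %% name.value % "1.0.0")
```

Running `sbt mimaReportBinaryIssues` then fails the build on any backward binary incompatibility against that release.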

The best solution for this is either source distributions or making it as easy as possible to publish against everything with SBT. If we choose the latter, then:

  1. SBT in general needs to be faster, like, a lot faster. When you have complex builds that cross publish a lot of things, your build tool needs to be as fast as possible. Somewhat complex builds easily take minutes just to load in SBT now
  2. SBT also needs to be a lot easier to use, and a lot easier for cross publishing things (i.e. sbt-doge helped in certain cases, but it came a bit late and it’s still confusing to use)
  3. Stuff like MiMa needs to be enabled by default
  4. Scala/SBT/Zinc need to stop treating the JVM as the default backend, which also means things like scala-native and scala-js need to stop being “sbt plugins”, to enable things like crossScalaVersions for different backends

On the surface, I think adding in really good support for source distributions is probably easier.

I do, see Explore CBT · Issue #34 · mdedetrich/scalajson · GitHub and Support multiple scala-js/scala-native versions · Issue #47 · portable-scala/sbt-crossproject · GitHub .

In general it’s just becoming a massive PITA, and builds are becoming really complex with in ThisBuild vs common settings and stuff like sbt-doge. I really hope CBT can simplify things in this area, or maybe just make source distributions the default (the current dependsOn syntax for source inclusion is terrible and, ironically, also doesn’t work that well with crossProject).

3 Likes

Let me start by correcting a few misconceptions.

That is a common misconception. Source compatibility is virtually impossible to maintain; binary compatibility is therefore infinitely easier to maintain than source compatibility. I explained all of this here: A Guide on Binary Compatibility - need your input! - #2 by sjrd. Fortunately, source compatibility is less important than binary compatibility, because it does not suffer from the dependency diamond problem. I also explain that in the post mentioned.

Another misconception. sbt-mima is just as capable at checking binary compatibility for Scala.js libraries as for JVM libraries. We made sure of that when designing Scala.js.

Uh!!? Scala.js cares about binary compatibility very much! In fact, I think the 0.6.x series comes in second place for the longest streak of maintained backward binary compatibility in terms of elapsed time (2.5+ years so far, after sbt 0.13.x, which is 4 years in), and actually holds the record for the longest streak in terms of number of versions (21 binary compatible releases from 0.6.0 to the currently released 0.6.20).


OK, now my actual open-ended answers.

I don’t think so. Scala.js and Scala Native both emit reasonable .class files that tools such as zinc can use. See my first post in this thread. It has been designed this way since Scala.js 0.1 (I’m not kidding), so that all those tools would play nicely with Scala.js out of the box.

I very much doubt that. Sources in Scala are extremely volatile, due to some of its language features, most notably implicits. As I said earlier, maintaining source compatibility in Scala is not possible. Therefore dependency graphs based on source dependencies will break in horrifying ways, and nobody will be able to fix them.

Between source deps and @fommil’s suggestion of Scala.js mirrors everywhere, at least the latter stands a chance, because it’s technically sound. The former isn’t.

This is an idea worth exploring, but it would require strong buy-in from sbt core. I won’t fight that fight, but if other people in the community are willing to, be my guest. Note however that there is a fundamental difference between switching the Scala major version and switching the target, regardless of whether Scala.js is activated via a plugin or not. The former doesn’t change the set of defined settings and tasks at all. The latter, however, means that some settings and tasks are defined for some targets and not for others (obvious example: fastOptJS). This means, for example, that when you switch your project from Scala.js to Scala/JVM, if you have settings that read fastOptJS.value, your build breaks because the dependency graph cannot be built. This is fundamental.
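A concrete sketch of the breakage described above (the project wiring is invented for illustration; fastOptJS itself is a real Scala.js task):

```scala
// This build compiles while ScalaJSPlugin is enabled, because
// fastOptJS is a defined task. Switch the project back to plain
// JVM (remove the plugin) and the reference below can no longer be
// resolved, so sbt fails while constructing the task graph.
lazy val app = project
  .enablePlugins(ScalaJSPlugin)
  .settings(
    resourceGenerators in Compile += Def.task {
      // bundle the optimized JS output as a resource (illustrative)
      Seq((fastOptJS in Compile).value.data)
    }.taskValue
  )
```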

One of the “tweaks to SBT” I was thinking about was this boilerplate here:

lazy val effect = Project...
lazy val effectJVM = effect.jvm
lazy val effectJS  = effect.js
lazy val effectNative = effect.native

lazy val iteratee = Project...
lazy val iterateeJVM = iteratee.jvm
lazy val iterateeJS  = iteratee.js
lazy val iterateeNative = iteratee.native

Anyone know how easy it would be to let people do

lazy val (iterateeJVM, iterateeJS, iterateeNative) = Project...

?

Or even

lazy val (iteratee, iterateeJVM, iterateeJS, iterateeNative) = Project...

Where iteratee is a placeholder project that does nothing much other than aggregate the {JVM, JS, Native} projects, so you can easily e.g. run tests or compile against all of them.

Of all the things you have to do in SBT to make cross-building work, this is one of the parts which I find neither simple (e.g. converting project -> crossProject) nor unavoidable (e.g. configuring different dependencies for different platforms). It would be great if we could smooth it over.

2 Likes

This is actually false; plenty of other languages deal with source distribution (the most recent mainstream example is node.js). As you write, it does introduce the diamond dependency problem, but this is solvable (in tools like npm there is a command called shrinkwrap which generates a dependency map, see https://docs.npmjs.com/cli/shrinkwrap ). Another example of such a language is https://crystal-lang.org/ (which is statically compiled and also uses source dependencies).
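For the record, the npm workflow being referred to is roughly this (commands as documented by npm at the time):

```shell
# Resolve and install the dependency tree, then pin the exact
# resolved versions so later installs reproduce the same tree.
npm install
npm shrinkwrap   # writes npm-shrinkwrap.json
```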

Granted, in node this is slightly easier, because node simply inlines all packages in node_modules, so it’s easily possible to have 2 packages inside a project that have different versions. This is however also possible in Scala (or at least on the JVM, since OSGi allows it).

Alternately, maintaining source compatibility doesn’t imply ignoring binary compatibility; it can also simply mean that you don’t have to generate binaries for your platform, you just need to make sure that the binaries resulting from your source aren’t incompatible. You can simply use a tool which (before creating a new tag) compiles the source against a matrix of targets and verifies that the produced .class files are binary compatible (with MiMa, for example) compared to the previous version. The point is this can be done outside of SBT.

The point of making Scala work with source compatibility is that it shifts some maintenance burden from library authors to users, and if you look at languages like Node/Ruby/Python, this is helpful in creating a big ecosystem of libraries. Java managed to get away with this mainly because it’s one of the few static languages with which building is very simple, and it maintains binary compatibility very strongly.

In the end, the point is that there are other mainstream languages (both static and dynamic) that work with source distributions and they handle it fine. It does put a bigger burden on library users, but it also makes it a lot easier for library maintainers (which is arguably what we want if we want more libraries in our ecosystem).

And scala-native?

It’s not just binary compatibility, it’s also forward compatibility. I have had to go into projects and maintain separate forks to upgrade their Scala.js versions. I am not complaining here; Scala.js has done a wonderful job when it comes to backwards compatibility. However, we now have this issue of 0.6.x vs 1.0.x, and these kinds of issues will continue to happen. Like I said, I have this issue right now where there isn’t a sane way to cross compile between Scala.js 0.6.x and 1.0.x, and it will be the same deal with scala-native.

Of course, but this is a fundamental issue with how SBT is designed. It’s based around the JVM (or assumes the JVM as the main backend) when this shouldn’t be the case. SBT works on patched directed acyclic graphs, so this is definitely possible; it’s just that the settings have been designed in such a way that the JVM is assumed to be the main backend.

In those languages, it is actually possible to evolve a library in backward source compatible ways. As I said before, this is not the case for Scala: you can’t add a public method without breaking source compatibility. Your entire argument before that sentence is moot because of this fundamental difference between Scala and the languages you have mentioned.

Scala Native has the same design principle. However, since it is less mature, we still sometimes discover “design bugs” that cause NIR not to be binary compatible when it should. However, they are being fixed. Therefore, modulo bugs, sbt-mima also does the correct thing for Scala Native.

The problem you describe there is not forward compatibility. Between Scala.js 0.6.x and 1.x, we are actually breaking backward binary compatibility, which explains the difficulties. It has however nothing to do with forward compatibility.

Also, as I explained above, this problem will not continue to happen for Scala.js, since 1.x is supposed to be backward binary compatible forever.

As for Scala Native, it is less mature, so yes, these things will still happen for Scala Native. However, precisely because it is less mature, I would advise library authors against cross-compiling for different versions of Scala Native at the same time. This is mental. Just upgrade to the latest Scala Native version and only publish for that one. Forget about older versions. There isn’t a large ecosystem of things stuck on old versions of Scala Native that you still need to support.

Cross-compiling between Scala.js 0.6.x and 1.x is only necessary because Scala.js 0.6.x was already so mature and stable that there actually is a large ecosystem working on 0.6.x, including production applications. Scala Native does not have that problem yet.

I don’t think this is an accurate depiction of the problem. If sbt did not assume the JVM as default, you would have a ScalaJVMPlugin that would declare some JVM-specific settings. And then what? You still have exactly the same issue when switching back and forth between ScalaJVMPlugin and ScalaJSPlugin: some settings become undefined and your whole setting dependency graph can break. Anyway, there are exactly three sbt settings that do not apply to a Scala.js project: console (for which we warn), javaSource and javacOptions. Every single other sbt setting applies. So I would even question the assertion that sbt is too assuming of the JVM.

No, sbt does not assume the JVM. The problem is the dynamic nature of sbt. It is theoretically possible to create an sbt whose world is closed (global plugins are not added to the build, only a closed set of local plugins are) and that can be linked to native code or even JavaScript code. Of course, you would need to relink every time you add a local plugin, and you need the actual codebase to cross-compile.

There are other technical solutions to the problem of plugins only targeting a concrete backend (Scala Native vs Scala.js) version. One could imagine an sbt plugin that defines a public interface called by settings and tasks, but whose implementation classloads the backend version and calls concrete methods there. This has a few problems:

  • Dramatically complicates the design of the sbt plugin.
  • All the arguments of the methods invoked have to be either primitives or Java classes (e.g. String).
  • It may not actually be possible, depending on concrete technical details.

I’m just throwing this idea out there. I do not consider it feasible or worth the trouble; I assume it would require too much of @sjrd’s time to implement it this way.

I don’t believe this is true, especially for Crystal, which has features such as mixins/monkey patching that can break source compatibility just by adding public methods (and actually, everything in Crystal is public; they have no concept of private). In Ruby you can pretty much do anything you want (granted, it is a dynamic language); adding public methods can break any current code.

Sure, but this problem is still going to exist. It’s not just Scala.js; it will apply to other backends as well. We have companies/OSS projects still having to publish for Scala 2.10. And as specified before, this also forms a matrix, so if we have some common library (let’s say a date library) we end up having to form a matrix of all possible Scala/scala-js/scala-native configurations.

It’s not just about Scala.js, it’s about the Scala ecosystem in general.

I am just stating that, as @fommil has said, it has become quite difficult (for whatever reason) for people to maintain libraries, more so than what I suspect is typical of other mainstream languages (which is what Scala is competing against). I wasn’t trying to imply that making SBT agnostic to different Scala backends would be easy. In fact, as stated, it would likely be quite difficult in SBT due to the fact that all settings are global; this is likely a design issue though. scala-native should not even have to know about scala-js’s fastOptJS, or even be able to access it, because it’s irrelevant (I would actually argue this is one of the problems in SBT: having all settings global makes it more confusing for people using SBT, because you have access to certain settings in certain contexts which you shouldn’t even access, or even care about). Maybe the first step would be for SBT to implement a concept of private/public scoping so that not all settings are global?

I think for similar reasons (at least according to @cvogt) this is why cross compiling against Scala backends isn’t hard to do in CBT: CBT allows you to hide implementation details in traits with private (just like in any Scala code). At least to me, these kinds of problems just demonstrate design problems that SBT has, hence why they are not easy to fix.

I mean at the end of the day, especially with Dotty looming on the horizon, this problem is just going to get worse and worse.

I don’t like the fact that you’re blaming sbt’s design. In my view, it has nothing to do with this. You can effectively create private keys (tasks and settings) with sbt that are not accessible by other keys. This is why autoImport exists, and this is also why you can scope keys inside tasks. Ah, and don’t forget about local private keys, which also exist even though most people don’t know about them.

Why people don’t know about them or don’t use them correctly is another discussion.

CBT relies on source compatibility, and that’s okay. But let’s not pretend that source compatibility solves this problem, because it doesn’t. The main problem here is the tight coupling between Scala.js and the sbt plugin to use it — likewise for Scala Native.

I actually have zero experience maintaining a cross-compiled library, so that’s why I don’t go into the main discussion here. But so far we’re beating around the bush…

No, again, same problem: if you do that, then when you switch your project from JVM to JS or conversely, your .settings() don’t typecheck anymore (instead of failing to build the dependency graph). In fact that’s worse than the previous alternatives; with those, at least, you could write your .settings() with the appropriate dynamic dependencies so that your dependency graph would in fact be correct in all configurations. With private settings for platform-dependent things, you make this impossible.

There is a chance to break backward compatibility in the future: just align the breaking changes with Scala versions, since Scala.js 1.0 on Scala 2.13 is not backward-compatible with Scala.js 1.0 on Scala 2.12 anyway.

So it becomes a good opportunity to advertise Scala.js when someone creates a PR to add Scala.js support to a JVM library.

Since sbt supports creating subprojects in a plugin, crossProject may become unnecessary in the future.

I’m assuming he means that in Scala, any newly added public method breaks code that relied on that method not yet existing to trigger an implicit conversion.

For example:

class A

class B { def yo: Unit = println("Yo!") }

object B {
  implicit def aToB(a: A): B = new B
}

val a = new A
a.yo // prints "Yo!"

a.yo, because A does not have yo, triggers the implicit conversion to B, which has yo, so this prints “Yo!”. This breaks if someone adds a method called yo to A, because now there will be no implicit conversion to B:

class A { def yo: Unit = println("Yeah!") }

class B { def yo: Unit = println("Yo!") }

object B {
  implicit def aToB(a: A): B = new B
}

val a = new A
a.yo // prints "Yeah!"

I’m just going to put this right here…

This is valid. And we should not discount this viewpoint.

You said that earlier. I took note of it. I don’t think I have said anything in this thread that should even suggest that I am discounting it.

On the contrary, my very first post was all about getting more understanding of what makes it a burden to support Scala.js. While on that topic, I am still waiting for answers (any answer, actually) to the question I asked in that post. Without those answers, I cannot begin to think about how to discuss possible solutions.

1 Like

[quote=“jvican, post:18, topic:1166”] You can effectively create private keys (tasks and settings) with sbt that are not accessible by other keys. This is why autoImport exists, and this is also why you can scope keys inside tasks. Ah, and don’t forget about local private keys, which also exist even though most people don’t know about them.
Why people don’t know about them or don’t use them correctly is another discussion.
[/quote]

Sure, they may not be accessible, but they still create issues when they are being patched, as sjrd noted here, which brings me to my next point.

Honestly, I don’t see what the problem is. If settings are private to a project, they shouldn’t be patched from another project at all. Obviously this may not be strictly correct with respect to how SBT is designed, but at least to my eyes it’s semantically more correct with how a build tool should be treated.

I mean, effectively you are saying that a Scala backend is unable to properly encapsulate settings specific to its own platform (which shouldn’t be set from other platforms).

I am not trying to blame SBT for the sake of blaming SBT. It’s probably completely possible to represent this correctly, as you are stating. All I am trying to say is that right now, the maintenance burden for library maintainers is starting to get silly, and I don’t think it’s useful to just stick our heads in the sand.

I mean, people give Spark crap for using Maven and having to manually patch .pom files to cross build Scala versions, and now I am basically having to do the same thing with SBT, but with environment variables.

Afaik CBT doesn’t rely on anything specifically (are you talking about source compatibility for CBT itself, or for the projects it is building?). What I do know is that in CBT the entire DAG is represented in Scala. Like you said, maybe the issue with SBT is the tight coupling.