Alternative scalajs/scala-native distribution mechanisms

Since sbt supports creating subprojects in a plugin, crossProject may become unnecessary in the future.
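For context, this is roughly what the crossProject pattern looks like in a build.sbt today (names are placeholders, not taken from any particular project): one logical project that expands into two real subprojects, one per platform.

    // Rough sketch of today's crossProject usage (placeholder names).
    lazy val foo = crossProject.in(file("foo"))
      .settings(name := "foo")
      .jvmSettings(/* JVM-only settings */)
      .jsSettings(/* JS-only settings */)

    // crossProject really defines two subprojects under the hood:
    lazy val fooJVM = foo.jvm
    lazy val fooJS  = foo.js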

I’m assuming he means that in Scala, adding any new public method can break code that relied on that method not existing in order to trigger an implicit conversion.

For example:

    class A

    class B { def yo: Unit = println("Yo!") }

    object B {
      implicit def aToB(a: A): B = new B

      val a = new A
      a.yo // prints "Yo!"
    }
a.yo, because A does not have yo, triggers the implicit conversion to B, which does have yo, so this prints “Yo!”. This breaks if someone adds a method called yo to A, because then the implicit conversion to B is no longer triggered:

    class A { def yo: Unit = println("Yeah!") }

    class B { def yo: Unit = println("Yo!") }

    object B {
      implicit def aToB(a: A): B = new B

      val a = new A
      a.yo // prints "Yeah!"
    }

I’m just going to put this right here…

This is valid. And we should not discount this viewpoint.

You said that earlier. I took note of it. I don’t think I have said anything in this thread that should even suggest that I am discounting it.

On the contrary, my very first post was all about getting more understanding of what makes it a burden to support Scala.js. While on that topic, I am still waiting for answers (any answer, actually) to the question I asked in that post. Without those answers, I cannot begin to think about how to discuss possible solutions.


[quote=“jvican, post:18, topic:1166”] You can effectively create private keys (tasks and settings) with sbt that are not accessible by other keys. This is why autoImport exists, and this is also why you can scope keys inside tasks. Ah, and don’t forget about local private keys, which also exist, though most people don’t know about them.
Why people don’t know about them or don’t use them correctly is another discussion.
[/quote]
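For readers unfamiliar with the mechanism the quote refers to, here is a rough sketch of an sbt AutoPlugin (all names hypothetical): only keys placed in the plugin’s autoImport object are exposed by name to .sbt files, while other keys can stay internal to the plugin.

    import sbt._

    // Hypothetical plugin: `publicKey` is exposed via autoImport,
    // `internalKey` is only reachable from within the plugin itself.
    object MyPlugin extends AutoPlugin {
      object autoImport {
        val publicKey = settingKey[String]("A key meant for build authors")
      }
      import autoImport._

      private val internalKey = settingKey[String]("An implementation detail")

      override def projectSettings: Seq[Setting[_]] = Seq(
        internalKey := "internal value",
        publicKey   := internalKey.value + " (derived)"
      )
    }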

Sure, they may not be accessible, but they still create issues when they are patched, as sjrd noted here, which brings me to my next point.

Honestly, I don’t see what the problem is. If settings are private to a project, they shouldn’t be patched from another project at all. Obviously this may not be strictly correct with respect to how SBT is designed, but at least to my eyes it’s semantically more correct with how the build tool should be treated.

I mean, effectively you are saying that a Scala backend is unable to properly encapsulate settings specific to its platform (which shouldn’t be set from other platforms).

I am not trying to blame SBT for the sake of blaming SBT. It’s probably completely possible to represent this correctly, as you are stating. All I am trying to say is that right now, the maintenance burden for library maintainers is starting to get silly, and I don’t think it’s useful to just stick our heads in the sand.

I mean, people give Spark crap for using Maven and having to manually patch .pom files to cross-build Scala versions, and now I basically end up doing the same thing with SBT, just with environment variables.

Afaik CBT doesn’t rely on anything specifically (are you talking about source compatibility for CBT itself, or for the projects it’s building?). What I do know is that in CBT the entire DAG is represented in Scala. Like you said, maybe the issue with SBT is the tight coupling.

Like you want console (JVM-only) and fastOptJS (JS-only) to be private? This doesn’t make any sense.

I get the sense that this point (that multiple backends make maintaining libraries in the Scala ecosystem more difficult) hasn’t been accepted as a valid point. As a result, I see a lot of defending the status quo rather than contributing to brainstorming about what we could do differently.

I think what would be more constructive is if we could instead put our heads together to try to imagine ways to move more burden away from library maintainers. Right now I see only two individuals making suggestions in this direction. I think more of us could chip in here.


My point was that the tight coupling has nothing to do with sbt — i.e. it will happen in CBT too.

What I meant by “relying on” is that CBT’s fundamental rule is to cross-compile the build before using it. For cross-compiling it, you have source dependencies on both the CBT version you build and the plugins.

I don’t think I have said anything in this thread that should even suggest that I am discounting it.

I’d like to say your posts have been the most constructive posts in the whole thread. I don’t see why people want to so frantically start scrambling for solutions when there is not even consensus on what the basic facts on the ground are, never mind what the problem to solve is.

Seems like a recipe for spending lots of effort, not actually solving any problem, and ending up regretting the choices made.

For what it’s worth my questions, trying to dig into the unhappiness and surface the actual problems, have been more-or-less ignored as well. Honestly feels like they’ve been “tossed aside” or “discounted”. But perhaps this just isn’t the right group or place for a solid 5-why’s/root-cause analysis.

I completely agree. I feel like at this point we need to take an approach that focuses on specifics; it’s almost too tough to talk about the actual problem in general terms. For instance, Travis brought up build time increases. And he’s right, that sucks, so maybe the path to a better maintainer experience is finding these paper cuts and just being methodical about crushing them.

-Dan

If what you see from my posts is defending the status quo, look again. I have so far done nothing but

  • ask for clarification and identification of what makes life difficult for library maintainers (receiving no answer so far),
  • give the technical background necessary to be able to discuss things in a technically informed way, and
  • explain why some solutions proposed so far are not viable for technical reasons.

You can re-read every single paragraph I have written, and it fits in one of those three buckets. Now if those buckets are wrong to talk about to begin with, then I really don’t know what to do.

If the premise of the whole thing is “library maintainers must not touch Scala.js whatsoever”, then the answer is simple: stop touching it, and let a Scala.js user fork and publish the Scala.js version of your library. It has been done before; it works.

That definitely “moves more burden away from library maintainers”. But I hope we can do better than that: reduce the overall burden altogether. But for that we need to understand exactly what this burden is.


This is actually what we wanted to do in the beginning, but we couldn’t because sbt broke support for pattern-extracting (lazy) vals at the top-level of .sbt files in sbt 0.13.7. See the release notes.
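For context, a pattern-extracting val at the top level of an .sbt file is a definition of roughly this shape (makeCrossProject is a purely hypothetical helper here; the point is the tuple extraction on the left-hand side):

    // Destructure a cross-project into its two parts in one definition.
    // sbt 0.13.7 stopped accepting such pattern-extracting (lazy) vals
    // at the top level of .sbt files, so this no longer works there.
    lazy val (fooJS, fooJVM) = makeCrossProject("foo")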

I already linked 2 issues demonstrating my personal problems, and I have given plenty of feedback in other threads regarding issues. You can also look at my comment history on GitHub; I am not just complaining because I have nothing better to do.

To reiterate (in no particular order, and related to this topic):

  • You cannot right now cross-publish against non-JVM Scala backends, and this will become a bigger problem as the permutations expand and increase the size of our build matrix (i.e. it’s not sustainable).
  • SBT (for whatever reason) ends up becoming quite slow when you start supporting a lot of different backends. By slow, I mean the loading time of SBT is quite high; on some projects like Circe it takes about a minute to load (and this is before you can even compile anything).
  • There is still confusion about how to approach settings in cross-building, i.e. putting them in ThisBuild vs. putting them in a common variable and reusing it in separate projects.
  • There is also the general problem of libraries and the JVM. Right now Scala sits in a weird middle ground between being a language of its own (completely separate from Java) and being a “better Java”. See my original point above about DateTime: why, for example, are we porting Java IO libraries to certain platforms but not porting things like the JDK 8 DateTime API, even though it exists? Or why can’t we just make a Scala datetime package which people can use and which is available, by default, on all platforms? This last point is something I added on top of what @fommil suggested.

The thing is, we had a similar issue before in another area. For example, if you had a library being published for 2.10, 2.11 and 2.12 which depended on a lot of other modules, and some of those modules didn’t yet have a version for Scala 2.12, you would have to just wait until that module was published. This is now solved with sbt-doge (sbt/sbt-doge on GitHub, an sbt plugin to aggregate tasks across subprojects and their crossScalaVersions), which is included by default in SBT 1.0.0.

So the way I see it, we either make it a lot easier to deal with build matrices, or we explore other options like source distributions.

Completely agree here.

I mean, the issues have already been stated numerous times; what exactly is unclear?

This isn’t the premise, and the solution also isn’t an ideal, sustainable one. I don’t think that confusing people with a different package/artifact name just for Scala.js is a good idea.

You most certainly can: use an environment variable, as detailed in the release notes of Scala.js 1.0.0-M1. It might not be ideal compared to ++, but it’s definitely possible and has very little boilerplate. Also, things like Travis have very good support for running a matrix with different environment variables.
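As a sketch of that approach (the variable name SCALAJS_VERSION and the fallback version are assumptions here, not prescribed), project/plugins.sbt can pick the plugin version from the environment:

    // project/plugins.sbt
    // Choose the Scala.js sbt plugin version from an environment variable,
    // with a fallback default when it is not set.
    val scalaJSVersion =
      Option(System.getenv("SCALAJS_VERSION")).getOrElse("0.6.20")

    addSbtPlugin("org.scala-js" % "sbt-scalajs" % scalaJSVersion)

A CI matrix then simply runs the build once per value of SCALAJS_VERSION.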

Again, I acknowledge that it is not ideal compared to ++, but I cannot take “you cannot right now do this” as valid criticism. I’m willing to discuss how we can improve this, but I first need someone to give at least a starting idea of how it could be done better, taking into account the technical constraints. For example, just saying “there should be a command **1.0.0-M1 in sbt to switch to Scala.js 1.0.0-M1” does not take the technical constraints into account, because you need a different sbt plugin on the classpath of your build to be able to switch the version of Scala.js, and a command does not have the power to change the classpath of the build. I have also heard people suggest several times that sbt-scalajs should be a tiny shell, independent of the actual version of Scala.js, that loads the linker and everything else reflectively in a ClassLoader. This is viable, though quite complex, until you start considering other sbt plugins that depend on sbt-scalajs, such as scalajs-bundler or sbt-web-scalajs.

I will say it again: I don’t have any idea to solve this, but if someone does, I am all ears.

OK, that’s the first time I hear about this one. Let’s add it to the list. I’m not sure why it happens. I mean it would obviously take “twice the time” to resolve the setting dependency of a crossProject (which really is 2 projects) vs one project, but from your comment it sounds like this is more than 2x?

Is that really related to Scala.js cross-compilation, or to sbt multi-projects in general? I used to recommend that people always use a commonSettings, except for the very specific case of crossScalaVersions. That was easy to understand and apply (even if maybe not to understand why), but apparently it confuses some tools like Ensime. Now my recommendation would be: try in ThisBuild first for any common settings that you have, and if that has no effect for a particular setting, put it in commonSettings. An easy way to test whether it has any effect is to not put it in the build, start up sbt, and then set it dynamically. set will tell you how many other settings and tasks are affected by the change. If it says 0, chances are it does not have any effect.
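Concretely, that recommendation looks something like this (a hypothetical build.sbt sketch; the keys and project names are illustrative only):

    // Build-wide defaults go in ThisBuild...
    organization in ThisBuild := "com.example"
    scalaVersion in ThisBuild := "2.12.3"

    // ...and keys for which ThisBuild turns out to have no effect go in an
    // explicit commonSettings applied to each project.
    lazy val commonSettings = Seq(
      scalacOptions ++= Seq("-deprecation", "-feature")
    )

    lazy val core = project.in(file("core")).settings(commonSettings)
    lazy val util = project.in(file("util")).settings(commonSettings)

The quick check described above is then just typing, e.g., `set organization in ThisBuild := "org.test"` in the sbt shell and looking at how many settings and tasks sbt reports as affected.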

I did not comment on this earlier because it seemed to me that the DateTime questions were kind of rhetorical. Of course we can write a standard Scala date-time API. Why don’t we? Probably because it’s too big an endeavor to undertake given that java.time already exists and is “good enough”. So yes, we should definitely port java.time to all the platforms, and there are various efforts in this direction. The fact that there are 3 competing solutions is a real bummer. We first made scalajs-java-time, but then other people created competing solutions instead of contributing to the existing one. At that point I don’t know what to do anymore. Am I supposed to take an authoritarian position and declare one of the alternatives the right one, and encourage everyone to shun the other ones? What if the other ones do have some parts of the API that are not supported in the right one yet?

I’m genuinely asking those questions. The java.time situation bothers me.

To this particular point:

I don’t understand because, as mentioned above, the JDK 8 java.time API is ported (3 times).

So you’re asking: since sbt-doge solved that “similar” issue before, why haven’t we solved platform cross-compilation yet? I can answer that. I very much wish we had solved it already.

The fact is that sbt-doge had a much easier problem to solve. For starters, it was already definitely possible to publish some modules with 2.10 and 2.11 support while other modules supported 2.12 too. You just could not use +, and instead had to use ++ “manually”. ++ has always worked, way before sbt-doge came along. I know because I have used it since Scala.js first supported several Scala versions, and I am still doing so across 18 minor versions of Scala for about 20 Scala.js modules, which support different subsets of Scala versions.
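Concretely, “using ++ manually” looks something like this in the sbt shell (project names are placeholders), publishing each module only for the Scala versions it actually supports:

    > ++2.11.11
    > coreJVM/publishLocal
    > extrasJVM/publishLocal
    > ++2.12.3
    > coreJVM/publishLocal

A module that does not yet support 2.12 is simply not published after switching to it.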

All that sbt-doge solved was to take into account crossScalaVersions at the project level rather than at the build level, which means that it made + work in addition to ++ in such situations. It certainly was a worthwhile improvement, but not a fundamental one.

For the Scala.js cross-compilation issue, even though it can appear “similar” at a very cursory glance, it is very different. ++ itself doesn’t work, never mind +. And I have commented above about why we could not make ++ work for Scala.js cross-compilation (or at least haven’t managed to do so).

If someone comes up with a solution that makes some kind of ++ command work for Scala.js, I will jump on the solution and distribute it right away. I haven’t been able to come up with a solution for that myself, though. I hope someone finds something where I failed.

It is not ideal for downstream users of libraries, but it is ideal for the Scala/JVM library maintainer. You have to give that solution that much credit.

Package and artifact names can be the same; only the groupID has to change. This could potentially be addressed if the JVM maintainer allows a trusted JS maintainer to publish the Scala.js artifacts under the same groupID.
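For example (all coordinates here are hypothetical), a downstream user of the fork changes only the groupID in their dependency:

    // Upstream JVM-only artifact:
    libraryDependencies += "com.upstream" %% "somelib" % "1.2.3"

    // Community-published Scala.js build of the same code: same artifact
    // name and version, different groupID.
    libraryDependencies += "io.github.somefork" %%% "somelib" % "1.2.3"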


I know you can work around it right now; most problems can be worked around. The point I am making is that it’s not a real solution, and it’s also not sustainable (because if I also want to support Scala Native, I now have these permutations of environment variables to deal with, which get even more complicated once Scala versions are taken into account).

I know; I am just pointing out that it’s a problem and that we should at least investigate a way to solve it.

Like with most benchmarking, it’s hard to quantify this (whether it’s related directly to crossProjects or just to large SBT projects in general), but with the sharing of dependencies and settings amongst cross-projects, it really does seem to slow down. SBT 1.0.0 may have helped here, but I can’t use it due to the lack of IntelliJ support.

Yeah, this is precisely what I am talking about. I used to use commonSettings until the folks at Ensime complained that it was the “wrong approach”, and now it seems to be more of a trial-and-error thing.

I mean, the thing is, why can’t we just take what is already here (cquiroz/scala-java-time, an implementation of the `java.time` API in Scala, especially useful for Scala.js) and namespace it for Scala? The difficulty is that it takes cooperation both from the Scala team and from the community. This is what is frustrating: the solution is already here, but we ended up reimplementing it 3 or 4 times.

Yes, and this is the problem: it should be ported once, and in the scala namespace. We have 4 copies of the same thing with slight variations; this is the issue.

I guess this is what is frustrating: there isn’t an ideal solution. Also, I am not necessarily advocating for something exactly the same as crossScalaVersions; my point is that tinkering around with environment variables (or dynamically patching build files) is not a real solution, it’s a hack.

Sure, I just see it as a band-aid on a real problem.

What would be different if it were in the scala namespace? AFAICT, it would still be exactly the same situation, wouldn’t it?

To me it doesn’t appear as a hack but as a proper solution (although potentially not as convenient as something else could be). I’m not sure what in this strategy you consider a hack. Is it the usage of environment variables per se? If so, don’t you think most of Linux is a hack? Where is the line where things go from well designed to hack in this scenario, according to you?

XKCD summarized this one nicely. I don’t see that adding it to the scala namespace does anything other than make things more confusing. There is a standard; it is rather decent and widely used; we shouldn’t be changing that standard in incompatible ways. The implementation situation is a mess, yes, but that’s no reason to abandon the standard itself…

It sends a signal that “this is the standard date-time library you are meant to be using”. Also note that the intention is for it to be provided, by default, on all Scala platforms.

It’s a hack from the point of view that you are using environment variables to bypass a missing capability that the software should provide. Cross-building is basic functionality that is expected of a build tool. If we take this argument to its logical conclusion: if we had to set environment variables to specify library dependencies because of a shortcoming of the build tool, many people would agree that such a solution is a hack.

OK, fair enough. I was confused because what you describe is what I would call a workaround, not a hack.

To me, a hack is when you rely on the implementation details of a component rather than its public specification/contract, which exposes you to breakage if the component you are using changes its implementation in the future, even though it still complies with its specification (which it is allowed to do). Examples of hacks in the Scala.js codebase, by my definition, include here, here and here (note that these hacks don’t leak outside of the codebase).
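As a contrived sketch of that definition (not one of the linked examples; SomeLibraryType is a made-up name):

    // A hack by this definition: depending on the exact toString format of a
    // library type, i.e. an implementation detail rather than its documented
    // contract. It works today but may break on any library upgrade.
    def firstField(value: SomeLibraryType): String =
      value.toString.split(",").head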


Thanks for clarifying!

But yes, my main point is that this workaround isn’t sustainable as more Scala versions and backends get released.