How do I use the Scala Platform?

There have been micro-discussions of this issue here and there, so feel free to add pointers or just copy/paste. The point isn’t to start over, just to bring the discussion into focus.

I think there has been a bit of sidestepping of the issue of how the platform is intended to be used, and it’s certainly reasonable to kick details into the future, but I think at least a general statement of intended usage is important. If you don’t know what you’re delivering, it’s very easy to get off track.

So I will offer a possible starting point that seems reasonable to me. Here’s the lowball version:

  • Platform modules have a GitHub badge and a scala-platform metadata tag on Scaladex, and possibly a landing page like typelevel.org, but are otherwise no different from any other library from the end-user’s point of view.

This has the benefit of being very simple, but it leaves version hell as the end-user’s problem. In the general case you want the flexibility to pick and choose, but for beginners it’s a lot to bite off.

So a higher-end solution might also include:

  • An sbt plugin that provides a way to load up a given platform release, with settings that you can include for each module, with known compatible versions of everything, like Typelevel’s sbt-catalysts.

Given the “big-step” development process suggested in the proposal, something like this seems reasonable.
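For illustration, here is a rough usage sketch of what such a plugin could look like. To be clear, the plugin, setting, and helper names below are all invented, loosely modeled on Typelevel’s sbt-catalysts:

```scala
// project/plugins.sbt — hypothetical plugin, does not exist today:
// addSbtPlugin("org.scalaplatform" % "sbt-platform" % "0.1.0")

// build.sbt — neither the setting nor the helper exists; this is only a sketch.
// One platform release pins known-compatible versions of every module,
// so users pick modules by name rather than by (org, artifact, version) triple.
scalaPlatformRelease := "2017.1"

libraryDependencies ++= platformModules("json", "http-client", "xml")
```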

The discussion of white-listing and transitive closure for the platform hints at additional tooling that could be useful, such as a provided Scalastyle config that warns on imports from outside the platform (for those who intend to be strict about it), or even IDE plugins that can warn when your code has traversed off of a platform API into a white-listed dependency that’s not explicitly supported. This kind of thing seems fine to put off until later, but I think it’s useful to think about what the ideal user experience would look like.

2 Likes

A possible improvement on the lowball version could be to have individual meta packages with a scalaplatform org id, and a descriptive package id, where the version numbers are streamlined to a single platform version to alleviate dependency hell. For example, I’d know that scalaplatform-httpclient_2.13-1.0.1 has no transitive dependency conflicts with scalaplatform-json_2.13-1.0.1.
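To make that concrete in sbt terms (all coordinates here are hypothetical), the key property is that every meta package shares one platform version:

```scala
// build.sbt — hypothetical coordinates; the point is that a single
// platformVersion across all meta packages guarantees that mixing them
// never produces conflicting transitive dependencies.
val platformVersion = "1.0.1"

libraryDependencies ++= Seq(
  "org.scalaplatform" %% "scalaplatform-httpclient" % platformVersion,
  "org.scalaplatform" %% "scalaplatform-json"       % platformVersion
)
```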

In the possibly Glorious Future under package aliasing, having a package prefix of scala.platform for platform libraries would probably increase clarity. When I want an http client, I’d import something like scala.platform.Httpclient (or scala.platform.http.Client, or scala.platform.http.client._ or some other brand of bikeshed roofing)

Thanks for starting this thread, @tpolecat.

I can’t speak for the Scala Center, but my understanding was that one of the goals (possibly the most important one) was to ease the getting started experience. Say, building a simple REST client should be mostly about coding it, rather than picking and choosing libraries, preparing a build, etc.

  • default choices for common tasks (parsing, json, http, etc.)
  • simple or no sbt tinkering
  • work the same way in the REPL and a project

One simple solution is a meta-package and a tool à la giter8 to spawn a new project. That’s the way of the Play Framework. The REPL needs to be invoked through sbt, but otherwise it fits the bill (once you choose Play).
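For reference, the Play flow today is a single giter8 template invocation (this is the real, published template):

```shell
# Spawn a new Play project from the official seed template,
# then drop into its sbt console.
sbt new playframework/play-scala-seed.g8
cd my-play-app   # whatever project name you gave the template
sbt console
```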

Another one is inspired by scripting languages, and was tried a long time ago. Some may still remember Sbaz.

I can see the appeal of a “local” repository of libraries for getting started. In fact, if the platform is supposed to alleviate the pain of Scala modularization, this is the only way to “fix it”. Of course, there’s the downside of making builds dependent on the locally installed packages. These two approaches do not mix well, and that’s why Sbaz eventually lost.

2 Likes

Everything I’ve seen that tries to “just work” with locally installed packages only “just works” for a very low bar of what counts as working. I’ve seen people claim that in Python it just works. The reality is pip and virtualenv, and they’re not trivial to install on every platform. Similar things go for Ruby and Node.js.

I’d love to see a good solution to this problem, but the claim that other languages have a simple, working solution to the problem is just not true, and trying to have Scala be the first platform that manages to do that is a very ambitious goal. I’m not opposed to ambition, but let’s not pretend this is a solved problem.

In the sbt ecosystem, the Ivy cache is probably as close a thing to locally installed packages as we’re going to get. As a hack, pre-populating the Ivy cache with platform modules, to move download/install time from the first compile run in sbt to when you “install the Scala Platform”, could be an acceptable tradeoff.
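A rough present-day analogue of this hack would use the coursier CLI to pre-fetch artifacts into the local cache (coursier’s own cache rather than Ivy’s, strictly speaking). The module list below is purely illustrative, standing in for a hypothetical platform manifest:

```shell
# Pre-fetch an illustrative set of "platform" modules and all their
# transitive dependencies, so the first compile needs no downloads.
cs fetch \
  org.typelevel::cats-core:2.12.0 \
  org.scala-lang.modules::scala-xml:2.3.0
```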

1 Like

It seems SPP is largely focused on the getting started experience, which I agree is a goal worth pursuing, but it leaves me wondering what effect it will have on projects as they age. Will it change the experience of slowly migrating a project from one platform to another?

1 Like

Good question, especially in light of @tpolecat’s point about having several different-but-equal platforms with different philosophies. I like that approach, but I suspect it’ll have the downside that evolving from one platform to another, which is already pretty challenging (I’m ever-so-gradually moving Querki to be more functional), may come to look infeasible without a complete rewrite…

I think this is a good point that may hint at further requirements:

  • It must be trivial to convert a platform selection into a standard sbt dependency list.

In addition, it would be nice if you could take a standard sbt dependency list and see how well it matches against a given platform. “If you first upgrade your code to use org.chickenpants.atomblaster version 3.22.1, you can then replace the following 7 dependencies with Bazooka Platform version 1.2.3 …”
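The core of such a matching tool is small. Here is a self-contained sketch (all names and coordinates invented): given a platform manifest mapping module coordinates to pinned versions, partition a project’s dependencies into those the platform covers and those falling outside it:

```scala
// Sketch of a platform-matching check. The manifest format and coordinates
// are hypothetical; a real tool would also compare versions and suggest
// the upgrade path described above.
object PlatformMatch {
  // (organization, name) -> version pinned by the platform release
  type Manifest = Map[(String, String), String]

  final case class Dep(org: String, name: String, version: String)

  // Split deps into (covered by the platform, outside the platform).
  def matchAgainst(deps: List[Dep], platform: Manifest): (List[Dep], List[Dep]) =
    deps.partition(d => platform.contains((d.org, d.name)))

  def main(args: Array[String]): Unit = {
    val bazooka: Manifest = Map(
      ("org.example", "json") -> "2.1.0",
      ("org.example", "http") -> "0.9.3"
    )
    val deps = List(
      Dep("org.example", "json", "2.0.0"),              // replaceable by the platform pin
      Dep("org.chickenpants", "atomblaster", "3.22.1")  // outside the platform
    )
    val (covered, outside) = matchAgainst(deps, bazooka)
    println(s"covered by platform: ${covered.map(_.name).mkString(", ")}")
    println(s"outside platform:    ${outside.map(_.name).mkString(", ")}")
  }
}
```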

1 Like

There’s still another danger: if most people stick within a platform, libraries will take that for granted and make assumptions based on it, such as only working (or working better) with each other, and not working (or working worse) with outside libraries.

So I think it should be emphasized that it’s more about getting started quickly and having fewer things to worry about at the wrong time, than about siloing ecosystems.

1 Like

If it’s about getting started quickly we should examine the failure of Activator to fill this need. But the proposal seems much more ambitious to me.

Yeah, I’m curious whether anyone has done a “post-mortem” (not that it’s quite dead, but it seems to be moribund) analysis of why Activator didn’t really catch on. I was quite enthused about that project, and actually still use it on my dev machine…

1 Like

Just throwing in my $0.02 of where I see the value here (as a user).

Batteries included is good, and easy to get started is also good, but as @tpolecat says, there are lots of meanings for that. I don’t think there’s much value in being prescriptive about what batteries are included. For instance, I’d love to see Typelevel stuff distributed via the Scala Platform, in addition to Lightbend stuff, and anything else.

From my perspective, the unique thing that’s missing in the ecosystem is stability. Catalysts makes an attempt at this for Typelevel builds, but across Scala, getting a “working set” of stuff that works together is tricky. Conceivably, this would provide a stable enough base of “good stuff” that, for instance, Play and Spark would both build against v1 of the platform, so I don’t have to do class path gymnastics to write code that can run on both.

Knowing that everything in platform v1 can be on the class path with everything else, even if there are 3 different JSON parsers that may or may not share an AST, would be huge. Being able to pull in sets of batteries for sbt meta packages or something like it would be a nice value-add on top of that.

Given that, I think the biggest requirement for packages that want to be included is a fairly high level of stability, in keeping with whatever the release cycle of the platform is (6 months to a year?) and a set of transitively closed dependencies.

If we get the “everything works together” piece right, the “batteries included” part is just some tooling around coursier / sbt / giter8 / Activator-type templates, with the benefit that when you try to pull in some extra batteries, you have a safe place to get them from that won’t suddenly throw you into class path hell because you have 3 versions of Guava.

So, how do I use the platform? I use it as my only resolver, and stop worrying about what version of random low-level libraries I need, instead just pulling in the top level stuff I want. “Akka”. “Cats”. “Spark”. “Play”.
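In sbt terms, that usage model might look something like this. To be clear, the repository URL and coordinates below are invented; no such repository exists today:

```scala
// build.sbt — hypothetical sketch. Overriding externalResolvers replaces
// sbt's default resolvers entirely, so every artifact must come from the
// platform repository, which guarantees a mutually compatible working set.
externalResolvers := Seq(
  "Scala Platform" at "https://repo.scala-platform.example/releases"
)

// Pull in only top-level "batteries"; the platform pins everything below them.
libraryDependencies ++= Seq(
  "org.scalaplatform" %% "platform-cats" % "1.0.0",
  "org.scalaplatform" %% "platform-akka" % "1.0.0"
)
```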

2 Likes

I agree with this, and I think people are already headed in this direction. The sbt new command now accepts a giter8 template to set up a project, which is great for people wanting to go directly to sbt and/or Scala IDE (Eclipse) or IntelliJ. I have used Activator and think it is good, but I think it should be available via sbt activator or something. The third entry point is the scala and/or sbt console.

Perhaps sbt could even ship with a template, or a set of templates, that sets up the default libraries to use with core Scala. I’m envisioning a template that could be modified, so that sbt with that setup could be redistributed inside a corporation. This could also allow for a Typelevel set of libs, or for including by default what is now considered a non-core lib, such as XML, if that is needed.
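The sbt new flow mentioned above already works today with the official minimal seed template (this command and template are real):

```shell
# Create a minimal Scala project from the official seed template.
# sbt prompts for a project name and generates the directory structure.
sbt new scala/scala-seed.g8
```

A platform-blessed template in this style would just swap in a curated set of default dependencies.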

In terms of goals, I think another crucial one for the platform is to help facilitate interoperability for things that are very common and need stability.

Case in point is having a common String type. Although people can argue that the Java String type could be better (for some specific cases), having a common general String type provides immense benefits for the ecosystem. One only needs to look at Haskell to see the problems when you have a completely minimal core with no real support for common tasks (see the Text vs List[Char] vs ByteString issues in Haskell).

I think that the Scala Platform is also meant to facilitate such libraries. For example, there may be a good argument that scala-xml should not be part of the stdlib (and it has now been made a module), but its existence in the early days means that we now don’t have 8 slightly different implementations of scala-xml, which would make it hell for library/application authors that need to work with XML.

You can view this as a side effect of batteries included, but batteries included is much more than just making it easy for new users by providing some options; its biggest benefit, imho, is allowing for interoperability.

1 Like