Spark as a Scala gateway drug and the 2.12 failure


#42

Now that the 2.12 port is done, I was wondering if anyone on the Scala team (or anyone else) could comment on how things might look for 2.13 … I’m curious whether the recently completed work will make things easier next time around.


#43

In more detail, the main 2.12 changes impacting Spark were:

  • our new closure encoding actually made it easier to implement Spark’s “closure cleaner”, though it took some effort to convince ourselves of that (see the details in the doc linked above, and the first sketch after this list)
  • SAM types being compatible with function types resulted in a source incompatibility; this was resolved in 2.12.0 by improving type inference for overloaded higher-order methods (see the second sketch after this list). Further improvements are coming in 2.13. One corner case remains with Unit-returning functions.
  • we need a stable API for REPL users such as Spark. Help in coordinating between the various projects would be greatly appreciated.
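
On the closure point, here is a minimal, Spark-free sketch of why a “closure cleaner” is needed at all; `Driver` and `makeTask` are hypothetical stand-ins for user code, not Spark APIs. A lambda that reads a field captures its whole enclosing instance, so serializing it (as Spark must do to ship tasks to executors) fails unless that reference is removed or the enclosing object happens to be serializable.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for driver-side user code: the lambda reads a field,
// so it captures the entire (non-serializable) enclosing Driver instance.
class Driver {
  val factor = 2
  def makeTask(): Int => Int = x => x * factor // captures `this`
}

object ClosureCaptureSketch {
  def main(args: Array[String]): Unit = {
    val task = new Driver().makeTask()
    val out = new ObjectOutputStream(new ByteArrayOutputStream())
    try {
      out.writeObject(task) // fails: the captured Driver is not Serializable
    } catch {
      case e: NotSerializableException => println(s"capture problem: $e")
    }
  }
}
```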

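On the SAM point, here is a minimal sketch (not Spark’s actual API; `SamOverloadSketch` and `run` are illustrative names) of the overload pattern that caused trouble: a Scala-function overload next to a Java-SAM overload. Under 2.12 a function literal can also SAM-convert, so both overloads can become applicable to the same lambda; the calls below are written so they stay unambiguous.

```scala
object SamOverloadSketch {
  def run(body: () => Unit): Unit = body()    // Scala-facing overload
  def run(body: Runnable): Unit = body.run()  // Java-facing overload

  def main(args: Array[String]): Unit = {
    // A value with an explicit function type matches only the first overload,
    // because SAM conversion applies to function literals, not to arbitrary
    // expressions of a function type.
    val asFunction: () => Unit = () => println("picked () => Unit")
    run(asFunction)

    // Explicitly instantiating the SAM picks the Java-facing overload.
    run(new Runnable { def run(): Unit = println("picked Runnable") })
  }
}
```
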
For 2.13, I expect we’ll have this stable REPL API, but the impact of the new collections on the Spark code base is still somewhat of an unknown. If anyone would like to try, now is an excellent time, and it would greatly benefit both communities! Sadly, our team at Lightbend will likely not be able to get to this in time for M5.
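
As a rough illustration of the collections question, here is a hedged sketch of the kinds of 2.13 standard-library changes a large code base like Spark would have to absorb; nothing in it is Spark-specific, and `Collections213Sketch` is just an illustrative name.

```scala
import scala.collection.mutable

object Collections213Sketch {
  def main(args: Array[String]): Unit = {
    val buf = mutable.ArrayBuffer(1, 2, 3)

    // In 2.13, scala.Seq is an alias for immutable.Seq, so a mutable buffer
    // no longer satisfies a plain Seq[Int] without an explicit conversion.
    val xs: Seq[Int] = buf.toSeq

    // The old `xs.to[Vector]` syntax was replaced by `xs.to(Vector)`.
    val ys: Vector[Int] = xs.to(Vector)

    // mapValues now returns a lazy MapView; call .toMap for a strict Map.
    val doubled: Map[String, Int] = Map("a" -> 1).view.mapValues(_ * 2).toMap

    println((xs, ys, doubled))
  }
}
```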


#44

@SethTisue can Spark finally make it into the 2.12 community build?


#45

That would be very welcome, but it would likely be a significant effort. I don’t think we will have time to tackle this ourselves in the next 6 months.


#46

can Spark finally make it into the 2.12 community build?

We can discuss at https://github.com/scala/community-builds/issues/763; I’ve already put some thoughts there.