SIP: Curried varargs

Regardless of the implementation, does the committee think this feature should be part of the standard library, or a third-party library?

A third-party library. Anything that can be outside of the stdlib should prove its value as a third-party library before it can apply for inclusion in the stdlib.

This is rather disappointing, as it severely curtails the ability of those of us who are in favor of these proposals to rebut these concerns.

While the next meeting will be public, the situation has become asymmetric. We’ll be working against an already-formed position (whatever you may claim to the contrary, the SIP committee is made up of humans, and that’s how the human brain works), rather than participating in the formation of that opinion.

A reason why this proposal might not be suitable as a third-party library is that it could be a dependency of other core features, including HLists, collection initializers, and string interpolations.

The string interpolators of the stdlib are already intrinsified by the compiler, and HLists (aka tuples in Scala 3) already eliminate all the overhead based on inline and IIRC match types. So both are already as efficient as possible.

In order for tuples to stand in for varargs, there needs to be some form of auto-tupling (ideally, opt-in at the definition site), else you need to write double parentheses at the call site. I brought this use case up on the auto-tupling removal thread (Let's drop auto-tupling). Unfortunately, unless something has changed, the frontrunner solution for opting in was removed as “not pulling its own weight” (https://github.com/lampepfl/dotty/pull/4311#issuecomment-381112023), which I feel like might have been due to confusion between present usefulness and potential future usefulness.
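To illustrate the point above (a hypothetical sketch, not code from the thread): without some form of auto-tupling, a method that takes its arguments as a tuple forces double parentheses at the call site.

```scala
// Hypothetical: taking arguments as a tuple instead of varargs.
def sum3(args: (Int, Int, Int)): Int = args.toList.sum

sum3((1, 2, 3))   // double parentheses required at the call site
// sum3(1, 2, 3)  // would only compile with some form of auto-tupling
```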

I think you might have misunderstood my reply. I was specifically answering the parent comment, and only answering the parent comment. In particular I am not arguing that tuples are a replacement for this SIP. I was arguing that this SIP wouldn’t be used to implement tuples.

Ah, I can totally see how the comment I actually replied to was only claiming that. I simply clicked “reply” on the latest post from you, when I really meant to reply to your earlier statement:

It would still be nice to have efficient collection initializers.

One thing that annoys me is that there still doesn’t seem to be a nice way of efficiently constructing a singleton Iterator or Set in Scala. I have to write Iterator(x) or Set(x) and pay for allocating an array and its wrapper every time… or use Set.empty + x for the latter, which is more verbose.
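For reference, the cost comes from the varargs signature of the `apply` methods: every call goes through an `Array` (plus its wrapper), even for a single element. A sketch of the options mentioned above:

```scala
val x = 1
val it = Iterator(x)        // copies x into an Array and wraps it, just for one element
val s1 = Set(x)             // same varargs path, same allocations
val s2 = Set.empty[Int] + x // avoids the array, but is more verbose
```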

Unfortunately, there’s a difference between “can be implemented” and “can be implemented by someone with extensive background in Shapeless or macros”. One of the really nice things about this proposal is that it considerably lowers the difficulty of writing something like show"".

Can I interest you in Iterator.single(x)?

Right, I had missed this one, thanks. But there are plenty of collections that do not have a dedicated single-element constructor. The biggest offenders are Set and Map.

Wouldn’t a shapeless solution to the same problem also be much slower to compile?

Yes, and it would be considerably harder to debug.

Just a challenge to put out there: implement Cats’ show interpolator using just implicit evidence, union types, or HLists.

Because I’ve tried doing it, and the restrictions of the string interpolator API mean that you’re pretty much stuck with implicit conversions. In Dotty this means enabling implicit conversions on a project-wide basis (or adding it to the import tax), which is very close to a dealbreaker. If there’s a reasonable way to do this, it implies a rather large gap in the current Dotty documentation, because I couldn’t find anything hinting this was possible, short of resorting to implicit conversions.

However, if show"This is $a, and this is $b!" (which currently desugars to approximately new StringContext("This is", ", and this is", "!").show(WrappedArray(a, b))) would instead desugar to this:

new StringContext("This is", ", and this is", "!").show.applyBegin.applyNext(a).applyNext(b).applyEnd

The implementation would now be possible without enabling implicit conversions or resorting to a compiler plugin:

// Assumes a Show type class providing a `.show` extension method (e.g. Cats' Show)
class ShowInterpolatorState(parts: List[String], builder: StringBuilder):
  def applyBegin: ShowInterpolatorState = this
  def applyNext[A: Show](a: A): ShowInterpolatorState =
    parts match
      case p :: rest => new ShowInterpolatorState(rest, builder.append(p).append(a.show))
      case Nil => builder.append(a.show); this
  def applyEnd: String =
    parts.foreach(builder.append)
    builder.toString

extension (sc: StringContext)
  def show: ShowInterpolatorState =
    new ShowInterpolatorState(sc.parts.toList, new StringBuilder)

I just watched part of the 2019 November SIP Meeting at https://www.youtube.com/watch?v=jjEcYY2R9mU. @odersky mentioned that there would be a combinatorial explosion problem when overloading applyNext methods. That’s true, just as for any complex API that relies on method overloading. On the other hand, applyNext with type classes (like @morgen-peschke’s example) can easily solve the combinatorial explosion problem but is hard to inline.

Fortunately, this proposal gives the library author the freedom to choose either method overloading or type classes. Therefore, a sophisticated library author can take advantage of both approaches and avoid the pitfalls of each. For example, in html.scala, I use applyNext with type classes to avoid the combinatorial explosion problem, and also provide a few additional inline applyNext overloads to optimize the most frequent use cases.
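As a hypothetical illustration of that hybrid (the names are made up, not html.scala’s actual API): a single type-class-based applyNext keeps the API open to any element type, while an inline overload fast-paths the most common case.

```scala
trait Render[A]:
  def render(a: A): String

given Render[Int] with
  def render(a: Int): String = a.toString

class Builder(sb: StringBuilder = new StringBuilder):
  // Generic path via a type class: one method covers every A,
  // avoiding a combinatorial explosion of overloads.
  def applyNext[A](a: A)(using r: Render[A]): Builder =
    sb.append(r.render(a)); this
  // Specialized inline overload for a hot path (here: plain strings).
  inline def applyNext(s: String): Builder =
    sb.append(s); this
  def result: String = sb.toString

// Builder().applyNext(42).applyNext("!").result == "42!"
```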

The criticism was that the complexity of type-checking applications is already very high - we do not want to make the problem even more complicated and (most likely) slower to resolve by adding more choices that the compiler has to check.

Anyway, why not try the meta-programming approach for this?

Thinking about the complexity of type-checking is a good perspective. In fact, if we implement this as a built-in feature or an analyzer plugin, it will add almost no overhead on the type checker’s end compared to manually written builders, because it expands the vararg call before each parameter is type-checked. On the other hand, any meta-programming-based approach will be slower, especially for nested vararg calls, because they all have to type-check the parameters twice.

Unfortunately, analyzer plugins are not supported in Scala 3. That’s why I thought building this feature into the compiler might be a wise option.

I think analyzer plugins could be an approach to reducing the complexity of the type checker, because with their help we could move many features out of the core type checker and implement them as built-in compiler plugins instead.

Analyzer plugins might be helpful for better modularizing the type checker, even when they all run in one phase.

It’s a conscious decision that there will be no analyzer plugins, since they would lead to language fragmentation.