New syntax: applicative desugaring in for comprehensions (with PR)


This is true. It looks like you really want this feature, so you shouldn’t feel discouraged so quickly!

In my opinion, @nafg’s suggestion is perfect – it would let you achieve the desired functionality without any syntactic change. True, it wouldn’t have as big an impact as a language change would, but it would be a solution that others with the same need as you could use. I think it’s worth a try – especially if you’re displeased with the status quo. :slight_smile:


I may be the odd voice out here, but I’d go even further and think about a code rewrite that automatically generates the appropriate translation.

I don’t know the details of the compiler phase that does the for desugaring, nor whether it supports code introspection and backtracking, but what I picture in my mind is a desugaring that transparently uses the least powerful abstraction needed.

If the parser could somehow look ahead to see whether the result of an operation is needed as a binding in one of the successive generators, it could decide between product and flatMap.

Some use cases to illustrate the idea are

case 1

for {
  a <- fa
  b <- fb
} yield y(a, b)

since the needed abstraction is only applicative, it would desugar to

product(fa, fb).map { case (a, b) => y(a, b) }
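To make case 1 concrete, here is a runnable sketch over Option with a hand-rolled product (an illustration, not actual compiler output): when the generators are independent, the applicative translation agrees with today’s monadic one.

```scala
object Case1Sketch {
  // a minimal product for Option; any applicative functor can supply this
  def product[A, B](fa: Option[A], fb: Option[B]): Option[(A, B)] =
    (fa, fb) match {
      case (Some(a), Some(b)) => Some((a, b))
      case _                  => None
    }

  val fa: Option[Int] = Some(1)
  val fb: Option[Int] = Some(2)

  // the proposed applicative desugaring of case 1
  val viaProduct: Option[Int] = product(fa, fb).map { case (a, b) => a + b }

  // the current monadic desugaring of the same for comprehension
  val viaFlatMap: Option[Int] = fa.flatMap(a => fb.map(b => a + b))

  def main(args: Array[String]): Unit = {
    assert(viaProduct == Some(3))
    assert(viaProduct == viaFlatMap)
    println(viaProduct)
  }
}
```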

case 2

for {
  a <- fa
  b <- fb(a)
  c <- fc
} yield y(a, b, c)

since a is used to compute b, a monadic abstraction is needed, but the translation could be reduced to the bare minimum needed to do the job

fa.flatMap { a =>
  product(fb(a), fc).map { case (b, c) => y(a, b, c) }
}
and so on for more complex cases.
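Here is case 2 as a runnable sketch over Option (again with a hand-rolled product; fa, fb, fc are placeholder definitions): one flatMap covers the data dependency, and a single product covers the independent tail.

```scala
object Case2Sketch {
  // product derived from the monadic operations
  def product[A, B](fa: Option[A], fb: Option[B]): Option[(A, B)] =
    fa.flatMap(a => fb.map(b => (a, b)))

  def fa: Option[Int]         = Some(1)
  def fb(a: Int): Option[Int] = Some(a + 1)
  def fc: Option[Int]         = Some(10)

  // case 2: one monadic bind for the dependency on a,
  // then a single product for the independent generators
  val result: Option[Int] =
    fa.flatMap { a =>
      product(fb(a), fc).map { case (b, c) => a + b + c }
    }

  def main(args: Array[String]): Unit = {
    assert(result == Some(13)) // 1 + 2 + 10
    println(result)
  }
}
```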


  • to do this, there needs to be a check for whether an applicative is available for the operation currently being expanded; where it is not, but the monadic flatMap is defined, a fallback product().map implementation should be invoked, relying on the fact that any monad also provides applicative behavior
  • explicit usage of flatMap could be desired for various reasons, but this would probably need additional syntax at the call site. E.g. in case 2, it should be possible to write something like a <- fa.flatMap or a <- fa.monad
  • in general I see no problem in creating additional desugaring rules, even though I know the for-comprehension can be tricky to get when you first approach the language. YMMV
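The fallback in the first bullet could look roughly like this (a hypothetical type class for illustration, not a real compiler or library API): any monad recovers an applicative product from flatMap and map.

```scala
// hypothetical type class sketching the fallback described above
trait Monadish[F[_]] {
  def flatMap[A, B](fa: F[A])(f: A => F[B]): F[B]
  def map[A, B](fa: F[A])(f: A => B): F[B]
  // default applicative behavior, derived from the monadic operations
  def product[A, B](fa: F[A], fb: F[B]): F[(A, B)] =
    flatMap(fa)(a => map(fb)(b => (a, b)))
}

object MonadFallbackSketch {
  val optionMonadish: Monadish[Option] = new Monadish[Option] {
    def flatMap[A, B](fa: Option[A])(f: A => Option[B]): Option[B] = fa.flatMap(f)
    def map[A, B](fa: Option[A])(f: A => B): Option[B]             = fa.map(f)
  }

  def main(args: Array[String]): Unit = {
    // the derived product behaves like an applicative zip
    assert(optionMonadish.product(Some(1), Some("a")) == Some((1, "a")))
    assert(optionMonadish.product(Some(1), None) == None)
    println("ok")
  }
}
```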


The chorus probably knows that desugaring is done in the parser and wouldn’t want to have to do it in typer (which is complex enough as it is), but that’s the only phase where you could implement your proposal.


I already tried to implement this feature using macros, as @nafg suggested. I failed utterly, getting horrible compiler crashes. I think it all came down to the fact that modifying already typechecked trees (macro args) is pretty much impossible in current macros (and untypecheck doesn’t work). Maybe things will look somewhat better in scala.meta.


You can try if all else fails. And hope that it stops failing…


Hey @ghik, happy to see that you gave it a try!

untypecheck removes all typed attributes from trees that are typed, but it does not visit nodes of untyped trees, meaning that it stops whenever it sees an untyped tree (this is, at least, my understanding of it).

The compiler crashes you mention may be happening because you’re mixing typed and untyped trees in ways that make untypecheck not work. Making this work is no easy task, so I second your claim that this is difficult with the status quo of macros. It’s a difficult problem that Eugene would like to solve in Scala Meta; see this ticket for more information.

As I see it, you have two options:

  1. Wait and hope that Scala Meta solves your problem.
  2. Try different avenues to work around your current issues:
     • Avoid touching typed trees as much as possible.
     • Try to reset attributes aggressively (this is @Jasper-M’s suggestion).
     • Try to use typingTransform to manually transform attributes of trees.

My suggestion: if you like tinkering, go for the second option; if you’re not going to lose sleep over this, wait for Meta, and even consider helping out. It’s a good time since it recently started to implement the first bits of the semantic API.

Have a good day,


Hey @ghik, first of all, nice work!
As my comonadic comprehensions proposal is not going to be included in LBS in the near future, I want to push it into TLS.
I want to see if we can merge our efforts and provide a combined PR for comonadic and applicative comprehensions.
If you agree, let’s try to find a consistent approach to syntax and solution.


Yes, that would be cool. But I don’t yet clearly see how these two features would interact. What I personally cared about was integrating applicative syntax into already existing for-comprehensions so that it’s possible to mix monadic and applicative syntax. As far as I understand, your proposal is a completely independent syntax with separate desugaring.

So, how do you see these two proposals merging into one?


As i see it, we need to consider the following alternatives for consistency of syntax:

  1. Find a way to leverage the existing keyword (for) for both applicatives and comonads.
  2. Introduce a new syntax (a single new keyword) to both applicatives and comonads (and maybe even monads).
  3. Add different keywords for each comprehension kind.

I thought hard about using the existing keyword for comonadic comprehensions, but to no avail. Maybe you’ll have an idea how to do it. Also, I don’t think it is feasible to support combining comonads with the other abstractions the way you did in your PR.
Regarding option 2, I need to think about it.
As for option 3, apfor was floated as a candidate and I’m fine with it.


I would be very keen to see the for keyword reused for both if possible … that was a feature of the applicative for that I found particularly appealing (and the feature of the comonadic for proposal that I found the least appealing TBQH).


@milessabin do you think there would be any interest in typelevel community for my original proposal to be included into Typelevel scalac?


I also found the reuse of for appealing. However, regarding comonads I think it’s problematic.
An obvious approach is to use for (pattern1){..generators..} yield and check whether the first expression contains a generator or not in order to differentiate between monadic and comonadic comprehensions.
This, IMHO, would add more confusion than necessary.
Although, if you think it’s better than adding a new keyword, I can make the adjustments.
Moreover, cofor requires a call site (it can’t start an expression) and returns a function rather than a regular value. I think that reusing the same keyword, as said before, would introduce more unnecessary confusion.


@ghik yes, definitely.


@milessabin @ghik From my POV, the two PRs may have different goals. As applicatives appear quite often in projects, I see the sense in reusing for syntax for them. Regarding comonads, the picture is quite different. I personally think that a comprehension may help provide motivation to explore this abstraction in projects (I can even imagine a Coeff library or a CofreeStyle).
Overloading for syntax for comonads does not appeal to me, and I think it would add confusion. I think that shipping a version with the new keyword would serve this experimental feature better.
Of course, if many people object and ask for a single for keyword, I’ll make the adjustments. Unfortunately, not many have expressed their opinion on the matter (understatement).
My 2 cents.


Yes, to me it also seems that our two proposals are quite independent. For now I’ll probably just go ahead and resubmit my proposal to Typelevel Scala in order to gain more feedback.


I find your proposal the best for now, since it is the closest to how for desugaring is done at the moment.


How should the following be desugared?

for {
  _ <- writeFile("foo.txt")
  file <- readFile("foo.txt")
} yield file

From the point of view of the compiler the two lines of the for might be unrelated, but desugaring to product(writeFile("foo.txt"), readFile("foo.txt")).map … would not do what we want: we really want to do the effects of readFile after we’ve done the effects of writeFile.
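Future makes the hazard observable: its zip is effectively the applicative product, and both effects start as soon as the futures are constructed. (The writeFile/readFile below are just logging stand-ins for the operations in the question; the sleep makes the interleaving deterministic enough for illustration.)

```scala
import java.util.concurrent.ConcurrentLinkedQueue
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object OrderingSketch {
  val log = new ConcurrentLinkedQueue[String]

  // logging stand-ins for the effectful file operations
  def writeFile(name: String): Future[Unit] =
    Future { Thread.sleep(100); log.add(s"write $name"); () }
  def readFile(name: String): Future[String] =
    Future { log.add(s"read $name"); "contents" }

  def main(args: Array[String]): Unit = {
    // applicative product (zip): both effects start immediately,
    // so the read can run before the write has finished
    Await.result(writeFile("foo.txt").zip(readFile("foo.txt")), 2.seconds)
    assert(log.peek() == "read foo.txt")

    log.clear()

    // today's monadic desugaring: strictly sequential
    Await.result(writeFile("foo.txt").flatMap(_ => readFile("foo.txt")), 2.seconds)
    assert(log.peek() == "write foo.txt")
    println("ok")
  }
}
```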


There was a similar question at the end of Simon’s presentation. Their current practice is to disable applicative do in that case (meaning that applicative do should not be enabled by default to preserve existing programs). Or you can write:

writeFile("foo.txt") >> readFile("foo.txt")

to force sequential ordering.
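Scala’s standard library has no >>, but the same operator is easy to define as a flatMap that discards its argument (a sketch mirroring the cats/scalaz operator, on Future for concreteness):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object SequenceSketch {
  // >> sequences two effects and keeps the second result, like Haskell's >>
  implicit class SeqOps[A](fa: Future[A]) {
    def >>[B](fb: => Future[B]): Future[B] = fa.flatMap(_ => fb)
  }

  def run(): String = {
    val r = Future("foo.txt written") >> Future("contents of foo.txt")
    Await.result(r, 2.seconds)
  }

  def main(args: Array[String]): Unit = {
    assert(run() == "contents of foo.txt")
    println("ok")
  }
}
```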


Your argument is valid… the first thing that comes to my mind is to make the applicative builder explicit, or to make the monadic binding explicit.

Of the two, my personal opinion favours using specific syntax for the more specific case (which I assume would be the sequencing).

So my suggestion would be something like

for {
  _    <| writeFile("foo")
  file <- readFile("foo")
} yield file

or any variation that would better convey the intended sequentiality.

An alternative that combines applicatives similarly to how it’s done now would add almost nothing to the ergonomics of a “multipurpose” for comprehension:


for {
  (fa, fb) <- app1(arg1) |@| app2(arg2)
  res <- using(fa * fb)
} yield res

is pretty much available already

what’s your opinion?


At this stage of the game, I think it has to be the applicative that is explicit. The suggested desugaring would break a vast amount of code – in particular, it would break scads of side-effecting Futures code, where the for comprehension is being used specifically to sequentialize the Futures. (Indeed, I just wrote one of those yesterday: it’s a fairly common pattern for avoiding race conditions between Actors.)

If the Scala ecosystem was entirely (or even predominantly) pure functional code, then I might be able to buy the original suggestion. But as things stand, there is way too much code out there that would break, and I don’t see any plausible rewriting tool that could reliably detect which was which…