Right now it’s impossible for a simple val binding to appear right at the beginning of a for-expression.
For example:
def bar(arg: Int): Option[Int] = ???
def baz(arg: Int): Option[Int] = ???

def makeFoo(ark: Int, bark: Int): Option[Int] =
  for {
    arkAndThree = ark + 3 // compiler fails here
    a <- bar(arkAndThree)
    b <- baz(bark)
  } yield a + b
It is required to write it like this:
def makeFoo(ark: Int, bark: Int): Option[Int] = {
  val arkAndThree = ark + 3
  for {
    a <- bar(arkAndThree)
    b <- baz(bark)
  } yield a + b
}
which is the inconvenience here. If the first form were allowed, the makeFoo definition would look much better.
I understand that the current implementation of for-expressions simply desugars the expression into a chain of flatMaps, starting from the right-hand side of the first <- binding. But there is a simple and fairly obvious workaround to allow said behaviour: let the compiler pass over the for-expression twice; first it should move all of the = bindings outside, and then process the rest as usual. I guess that the first step is just a simple rewrite then.
Is this problem worth solving, and if so, is this a viable solution?
I have always wanted this, but thought of it as a personal preference rather than a problem to be solved. Glad to know there are others who prefer this, giving it a chance to be solved.
One possible solution would be if for could take a type parameter:
def makeFoo(ark: Int, bark: Int): Option[Int] = for[Option] {
  arkAndThree = ark + 3
  a <- bar(arkAndThree)
  b <- baz(bark)
} yield a + b
As a bonus, it makes the code more self-documenting in code blocks where the monad in use isn’t instantly obvious from context. It also gives another point where the compiler can type-check for you.
It’s frustrating how often I end up having to use hacks like this:
def makeFoo(ark: Int, bark: Int): IO[Int] = for {
  _ <- IO.unit
  arkAndThree = ark + 3
  a <- bar(arkAndThree)
  b <- baz(bark)
} yield a + b
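For what it’s worth, the same leading-generator trick works with any monad, not just IO. Here is a minimal runnable sketch of it using Option, with hypothetical bar/baz stubs (the thread leaves their bodies abstract):

```scala
def bar(arg: Int): Option[Int] = Some(arg * 2) // stub implementation
def baz(arg: Int): Option[Int] = Some(arg + 1) // stub implementation

// The leading `_ <- Some(())` gives the desugarer a first `<-` generator,
// which makes the `=` binding on the next line legal.
def makeFoo(ark: Int, bark: Int): Option[Int] = for {
  _ <- Some(())
  arkAndThree = ark + 3
  a <- bar(arkAndThree)
  b <- baz(bark)
} yield a + b
```

With these stubs, makeFoo(1, 2) evaluates arkAndThree = 4, a = 8, b = 3 and yields Some(11).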
I agree this would be really handy; however, it’s probably overkill for this, so it might be worth raising it separately to help it get seen.
I don’t think this will work in the general case, as = bindings often depend on the values further up in the for-comprehension. That being said, I also don’t think it’s needed because the ones causing the trouble are only the leading ones and they can be processed in a single pass.
Original
val r = for {
  a = calcA
  b <- foo(a)
  c = calcC(a, b)
  d <- bar(a, b, c)
} yield baz(a, b, c, d)
Desugared
val r = {
  val a = calcA
  foo(a)
    .map { b =>
      val c = calcC(a, b)
      (b, c)
    }
    .flatMap { case (b, c) =>
      bar(a, b, c).map { d =>
        baz(a, b, c, d)
      }
    }
}
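To sanity-check that the desugared block matches what you would write by hoisting the leading binding yourself, here is a runnable sketch with placeholder implementations (calcA, foo, calcC, bar and baz are assumptions; the thread leaves them abstract):

```scala
def calcA: Int = 1                                         // placeholder
def foo(a: Int): Option[Int] = Some(a + 10)                // placeholder
def calcC(a: Int, b: Int): Int = a + b                     // placeholder
def bar(a: Int, b: Int, c: Int): Option[Int] = Some(a * b * c) // placeholder
def baz(a: Int, b: Int, c: Int, d: Int): Int = a + b + c + d   // placeholder

// The hand-desugared form from above
val r = {
  val a = calcA
  foo(a)
    .map { b =>
      val c = calcC(a, b)
      (b, c)
    }
    .flatMap { case (b, c) =>
      bar(a, b, c).map(d => baz(a, b, c, d))
    }
}

// The same thing with the leading `=` binding hoisted manually,
// which is legal Scala today
val viaFor = {
  val a = calcA
  for {
    b <- foo(a)
    c = calcC(a, b)
    d <- bar(a, b, c)
  } yield baz(a, b, c, d)
}
```

With these placeholders both forms produce Some(156) (a = 1, b = 11, c = 12, d = 132), so the proposed single-pass desugaring lines up with the manual rewrite.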
Agreed. I don’t think this is needed to make the original request possible. But the idea of allowing a for to take the Monad as a type parameter (and treating it as type inference if it isn’t provided) is rather appealing – heaven knows I have often had head-scratching bugs that came down to that, and being able to slap in a type ascription easily seems like it would sometimes be useful.
(It also might be helpful pedagogically – I could totally see myself teaching for comprehension with the ascription spelled out explicitly, and then show that it can be inferred instead.)
That works, but it’s a little inefficient, invoking the constructor for no purpose save to flatMap over it.
It’s less inefficient than calling F.unit & F.flatMap to do the same thing. And for[Option] will essentially do the same anyway. Also, I raise you another example:
this will return just fine:
def makeFoo(ark: Int, bark: Int): Try[Int] = for {
  biggerArk <- Try(ark / 0)
  a <- bar(biggerArk)
  b <- baz(bark)
} yield a + b
and this will blow all over:
def makeFoo(ark: Int, bark: Int): Try[Int] = {
  val biggerArk = ark / 0
  for {
    a <- bar(biggerArk)
    b <- baz(bark)
  } yield a + b
}
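A runnable sketch of the contrast, with hypothetical bar/baz stubs (the originals are abstract): the first variant returns a Failure, while the second throws before the for-expression even starts.

```scala
import scala.util.{Success, Try}

def bar(arg: Int): Try[Int] = Success(arg * 2) // stub implementation
def baz(arg: Int): Try[Int] = Success(arg + 1) // stub implementation

// Division happens inside Try.apply, so the ArithmeticException
// is captured as a Failure and returned.
def makeFooSafe(ark: Int, bark: Int): Try[Int] = for {
  biggerArk <- Try(ark / 0)
  a <- bar(biggerArk)
  b <- baz(bark)
} yield a + b

// Division happens in a plain val, so the exception escapes
// before the for-expression is reached.
def makeFooUnsafe(ark: Int, bark: Int): Try[Int] = {
  val biggerArk = ark / 0
  for {
    a <- bar(biggerArk)
    b <- baz(bark)
  } yield a + b
}
```

Here makeFooSafe(1, 2) returns Failure(ArithmeticException), while makeFooUnsafe(1, 2) throws.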
And it obscures the plain value a little.
Nothing obscures a value more than an assignment in a for itself. You cannot even place a breakpoint at that line.
As soon as you are forced to write an assignment or something using a closure (like fold, traverse, etc.) in a for, it is easier and more readable to rewrite it with flatMaps from scratch.
We really need a point/pure/etc. method so the type author can avoid this sort of trouble. There can be pretty significant functional differences between the two operations (for example, Future.unit + Future#flatMap vs Future.successful).
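As a concrete illustration of the Future case: both spellings produce the same value, but Future.unit + flatMap schedules the continuation on the ExecutionContext, while Future.successful is already completed. A small runnable sketch (the value 3 is arbitrary):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Already completed: no ExecutionContext hop, no extra allocation.
val direct: Future[Int] = Future.successful(3)

// Same result, but the flatMap callback is submitted to the
// ExecutionContext: an extra allocation and a scheduling hop.
val viaUnit: Future[Int] = Future.unit.flatMap(_ => Future.successful(3))
```

Both yield 3; the difference is in scheduling behaviour and allocation, not the result, which is exactly why a compiler-inserted pure/point would want to pick the cheap one.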
That’s not a great example because the assignment isn’t a simple assignment, and needs the wrapper to capture the effect.
I agree with you, but just point/pure is not enough, as I’ll try to show below.
It’s not so much about an assignment (a ‘simple’ assignment like ark + 3 suddenly becomes not so simple if we replace the types with java.lang.Integer) as about consistency.
I think everybody here would agree that this should not throw under the suggested syntax:
def makeFoo(ark: Int, bark: Int): Try[Int] = for[Try] {
  biggerArk = ark / 0
  a <- bar(biggerArk)
  b <- baz(bark)
} yield a + b
It should not throw for at least one reason: the code below does not throw even now (2.12, 2.13, 3.*) if baz does not throw:
def makeFoo2(ark: Int, bark: Int): Try[Int] = for {
  b <- baz(bark)
  biggerArk = ark / 0
  a <- bar(biggerArk)
} yield a + b
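To make that concrete, here is a runnable sketch with hypothetical bar/baz stubs. The = binding desugars into the body of a Try#map call, and Try#map catches non-fatal exceptions, so the division by zero becomes a Failure instead of an uncaught exception:

```scala
import scala.util.{Success, Try}

def bar(arg: Int): Try[Int] = Success(arg * 2) // stub implementation
def baz(arg: Int): Try[Int] = Success(arg + 1) // stub implementation

def makeFoo2(ark: Int, bark: Int): Try[Int] = for {
  b <- baz(bark)
  biggerArk = ark / 0 // evaluated inside Try#map, so the exception is caught
  a <- bar(biggerArk)
} yield a + b
```

Calling makeFoo2(1, 2) does not throw; it returns a Failure wrapping the ArithmeticException.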
And a simple permutation of source code lines should not change this code’s behaviour.
But this is possible neither with the proposed “rewrite all assignments to an enclosing block” change nor with the proposed “wrap it in a strict type constructor” change.
It only works if assignments in a for were wrapped in a lazy thunk/Function0 method, like the .apply of Try, Future or IO. This would require one more duck-typing check for every assignment in a for and two more method calls: apply for the assignment itself and flatMap for the subsequent statements.
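A hand-written sketch of what that thunk-wrapping desugaring would produce for the for[Try] example above, with hypothetical bar/baz stubs: the leading = binding becomes a call to Try.apply (which takes its argument by name), followed by flatMap for the rest.

```scala
import scala.util.{Success, Try}

def bar(arg: Int): Try[Int] = Success(arg * 2) // stub implementation
def baz(arg: Int): Try[Int] = Success(arg + 1) // stub implementation

// Hand-written version of the proposed desugaring: `biggerArk = ark / 0`
// becomes Try(ark / 0), so the exception is captured lazily as a Failure.
def makeFoo(ark: Int, bark: Int): Try[Int] =
  Try(ark / 0).flatMap { biggerArk =>
    bar(biggerArk).flatMap(a => baz(bark).map(b => a + b))
  }
```

Under this desugaring makeFoo(1, 2) returns Failure(ArithmeticException) and nothing is thrown, which matches the behaviour of makeFoo2 above.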