I wasn’t aware of these details. How do you determine whether an implied for alias corresponds to an implicit val or an implicit def? Is this specified somewhere? Will the two following examples both generate implicit vals, or only the first?
val x = e
implied for T = x
implied for T = { println("hello"); x }
EDIT: from playing around with Dotty, it looks like both will generate (lazy) vals. That is, the side effect is only executed once, when the implicit is used for the first time, and the result is cached.
According to this description, you’ll get a def for both instances you wrote.
val x = e
implied for T = x
implied for T = { println("hello"); x }
In the first case, the def is simply a forwarder to the val. It would be useless and wasteful to create another field for this. In the second case, you’d get a def over a cached var. The def is needed since the expression is impure. To get something that behaves like a strict val, you have to follow the same pattern as in the first case.
Implied alias instances map to implicit methods. If an alias has neither type parameters nor a given clause, its right-hand side is cached in a variable. There are two cases that can be optimized
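Conceptually, the described expansion behaves roughly like the following plain Scala sketch (my own illustration of the semantics, not actual compiler output; names like `inst1`, `cache`, and `effects` are made up):

```scala
object Demo {
  // counts how many times the impure right-hand side ran (for demonstration)
  var effects = 0

  val x: Int = 1

  // First alias (`implied for T = x`): a plain forwarder to the val,
  // no extra field is created.
  implicit def inst1: Int = x

  // Second alias (`implied for T = { println("hello"); x }`): a def over
  // a cached var, because the right-hand side is impure. The side effect
  // runs once, on first use, and the result is cached afterwards.
  private var cache: Int = 0
  private var computed = false
  implicit def inst2: Int = {
    if (!computed) {
      effects += 1 // stands in for println("hello")
      cache = x
      computed = true
    }
    cache
  }
}
```

Using inst2 twice triggers the side effect only once, which matches the lazy-val-like behavior observed in the EDIT above.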
That explains a lot! But then the next question is: how can you encode implicit defs that have side effects every time the implicit is used? Maybe sometimes you don’t want to cache the results.
Of course you could always create a newtype that thunks the computation, but that’s quite roundabout. Are these usages of implicits you want to discourage?
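For reference, the “newtype that thunks the computation” workaround mentioned above might look like this sketch (the wrapper name Fresh and the counter are my own):

```scala
// A wrapper whose instance is cached, but whose payload is a thunk,
// so the effect runs on every use rather than once.
final case class Fresh[A](run: () => A)

object Fresh {
  // counts how many times the effect ran (for demonstration only)
  var effects = 0

  // the Fresh instance itself may be cached; the thunk re-runs per use
  implicit val freshInt: Fresh[Int] = Fresh { () =>
    effects += 1
    42
  }
}

object FreshDemo {
  // each caller forces the thunk explicitly, re-triggering the effect
  def use(implicit f: Fresh[Int]): Int = f.run()
}
```

As the question notes, this is quite roundabout: every call site has to unwrap the thunk by hand.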
I was not saying it does not work. I mentioned it because it now seems out of place within the new concept, which tries to get rid of the word implicit (and possibly also to improve the way we can control the generated error messages).
Additionally, what should we do about DummyImplicit? I feel that using DummyImplicit to create a unique definition signature (after erasure) is a bad workaround and a feature abuse. Can’t we handle it some other way, like introducing a soft keyword unique def?
I’ve never understood why core types usually don’t define natural implicits: Unit, Option, Either, and Tuple defining their natural implicits would be good in my view. I have often wanted (and implemented) versions of an implicit Option (prefer Some, fall back to None) and Either (prefer Right, fall back to Left).
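The “prefer Some, fall back to None” behavior described here is usually encoded with the low-priority-fallback pattern; a sketch (the object and method names are mine):

```scala
trait LowPriorityNaturalOption {
  // fallback: if no implicit A is in scope, provide None
  implicit def noneInstance[A]: Option[A] = None
}

object NaturalOption extends LowPriorityNaturalOption {
  // preferred: if an implicit A exists, wrap it in Some; being defined
  // in the subobject makes it more specific, so it wins when applicable
  implicit def someInstance[A](implicit a: A): Option[A] = Some(a)
}
```

With these in scope, `implicitly[Option[A]]` yields `Some(a)` when an implicit `A` is available and `None` otherwise.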
Afaik DummyImplicit is basically required due to how the JVM handles type erasure of method signatures. Apart from that there is no real use for it (from what I have seen).
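The classic use case is erased overloads: both methods below erase to foo(Seq), and the extra DummyImplicit parameter is what keeps their JVM signatures distinct (class and method names here are my own illustration):

```scala
class Printer {
  def foo(xs: Seq[Int]): String = "ints"

  // Without the DummyImplicit parameter this would not compile:
  // "have same type after erasure: (xs: Seq)String". The parameter is
  // always filled in automatically by DummyImplicit.dummyImplicit.
  def foo(xs: Seq[String])(implicit d: DummyImplicit): String = "strings"
}
```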
The compiler cannot do this automatically, because it alters the binary API; binary compatibility would therefore be extremely unstable.
Also, it can’t work when mixing into a class two such methods coming from two different parents, since they would have the same erased signature before the mixin.
This is much better, but is it possible to do this automatically with a @unique annotation, so that it acts like an @alpha(signature) whose string signature is derived from the type of the definition before erasure? That would allow preserving binary compatibility, no?
E.g. @unique def foo(arg1: Int, arg2: String, arg3: String*): Unit
will act like @alpha("foo : (Int, String, Seq[String]) => Unit") def foo(arg1: Int, arg2: String, arg3: String*): Unit
No, that won’t work in cases of overriding relationships. It’s possible for a method A.m to override B.m even though they have different types before erasure. An @unique that derives a name from the signature before erasure would not work in those situations.
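A minimal illustration of such an overriding relationship (my own example; the names Box and StringBox are made up):

```scala
trait Box[A] {
  def m(a: A): A
}

class StringBox extends Box[String] {
  // Before erasure this m has type String => String, while the parent's
  // has type A => A. After erasure the compiler bridges them so the
  // override works. A name derived from the pre-erasure signature would
  // give the two methods different names and break the override.
  override def m(a: String): String = a
}
```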
Say a class A disambiguates two overloaded foo methods with @alpha, and class B extends it, trying to override its methods. What would happen if the writers of B accidentally swap the method names in the alpha annotation?
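For context, the class A being extended would presumably look something like this (a reconstruction on my part, in the Dotty syntax of the time; the original post does not show it):

```scala
class A {
  @alpha("fooString") def foo(ps: String*) = ???
  @alpha("fooInt") def foo(ps: Int*) = ???
}
```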
class B extends A {
@alpha("fooInt") override def foo(ps: String*) = ???
@alpha("fooString") override def foo(ps: Int*) = ???
}
Correct me if I’m wrong, but if you then have code like this, it will call fooInt instead of fooString:
val b: A = new B()
b.foo("Will", "this", "call", "fooInt?")
Currently, class B doesn’t compile because the alpha annotation can’t be used on double definitions yet, but once that is fixed, it won’t have any issue with external and internal names being different, so it might actually compile and then fail at run time.