Updated Proposal: Revisiting Implicits

That’s exactly my point. Scala functions are simulated by other lower-level constructs, but that detail is mostly irrelevant to the developer, and having to think in those terms would likely hinder their work. A lot of clarity is gained when new constructs are designed for frequently used patterns that could’ve been implemented with existing lower-level constructs – that’s the core idea behind any non-assembly programming language.

1 Like

Agreed. But IMHO typeclasses don’t require a lot of extra ceremony on top of what’s already available in dotty. Adding too much extra sugar on top might confuse things more than anything else, but I know that opinions differ widely on this point. Anyway I came here to point out that the state of the art is already virtually equal to what @rgwilton suggested, modulo the keywords. I don’t think I’m the right person to defend all the given stuff, given that I wasn’t the biggest fan myself, though I must say it’s grown on me a little.
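For reference, here is a minimal sketch of what that state of the art looks like without any dedicated typeclass syntax. It is written in the given/using/extension form that Scala 3 eventually shipped, since the exact keywords were still in flux at the time of this thread; Show and the show method are just illustrative names.

// A typeclass is an ordinary parameterized trait...
trait Show[T]:
  def show(t: T): String

// ...an instance is an ordinary given value of that trait...
given Show[Int] with
  def show(t: Int): String = t.toString

// ...and method syntax on the governed type is an ordinary extension method.
extension [T](t: T)(using s: Show[T])
  def show: String = s.show(t)

@main def demo(): Unit = println(42.show)   // prints "42"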

2 Likes

Ah, what I’m actually hinting at is that we don’t need those additional low-level constructs. We are not using them in any way that is not specific to these design patterns; hence, we really only need the sugar.

Fair enough :slight_smile:

Yes, exactly this.

Quite a lot of things in Scala end up being syntactic sugar that makes code easier to read or write, even though they could equally be expressed using lower-level constructs. I would regard typeclasses as important enough that making them as simple as possible to read and write is worth a bit of extra sugar.

4 Likes

wrt. 0.22 plans:

given [T: Ordering] as Ordering[Tree]

Is it just me or does this really read absolutely nothing like what it actually does?

What’s the 0.21 equivalent, the => syntax? I think that’s orders of magnitude more intuitive. There was an issue where it could potentially be confused with given function types, and I kinda hoped there would be an improvement there, but I liked @odersky’s reasoning on this and thought it was pretty fine. To me this feels like a step backwards.

edit: struck through my likely incorrect understanding, but I still find the usage of as rather confusing. I guess my misunderstanding just demonstrates the fact :slight_smile:

1 Like

Speaking of the => syntax though:

given [T] with Ord[T] as Ord[List[T]] { ... }
given with (outer: Context) as Context = outer.withOwner(currentOwner)

Still prefer the 0.21 => syntax to its actual replacement. If we have to use with here though, wouldn’t

given [T] Ord[List[T]] with Ord[T] { ... }
given Context with (outer: Context) = outer.withOwner(currentOwner)

both read more naturally and give some intuition wrt. what they do, due to their similarity with given parameters?

(removed the as keyword too, as per my previous comment.)

I’m going to read “Brideshead Revisited” and wherever it says “Brideshead”, insert “Implicits”.

And for “World War 2”, understand “Scala 3”.

Wikipedia doesn’t always cover fiction with perspicacity, but it seems to me that this summation applies:

It occurs to him that the efforts of the builders – and, by extension, God’s efforts – were not in vain, although their purposes may have appeared, for a time, to have been frustrated.

Except where it says “God”, insert “Odersky”.

One can google “Brideshead implicit evidence”:

the evidence produced that there is implicit in Brideshead Revisited an heretical private religion

Perhaps someday an apostate will write a critical history of Scala.

What I find most distracting is with or given appearing outside the parameter block. I disagree with the benefit/drawback analysis here. One form of ambiguity is being traded for another (within a block vs. between blocks), and it puts what I find to be strange keywords in locations where I expect members – another stumbling block when parsing. It just feels like this will produce more “Scala Puzzlers” than placing with/given/implicit/bikeshed in the parameter block would.

Other than that, the most recent proposal is pretty clean. I would prefer : over as but understand why that has issues.

I was going to replace “Brideshead” with “Bikeshed”, but to each their own.

3 Likes

I was a big fan of the syntax in the 0.21.0-RC1 release; it just seems so expressive:

given [A,B]: (Show[A], Show[B]) => Show[(A,B)] =
  (a,b) => s"(${a.show}, ${b.show})"
2 Likes

What I’m sad about is that in the newest proposal, context bounds are back, à la

def maximum[T: Ord](xs: List[T]): T = xs.reduceLeft(max)

There just shouldn’t be more than one way to introduce a contextual/given/implicit parameter. That’s difficult to understand and explain, especially to newcomers.
Something like this should be enough:

def maximum[T](with Ord[T])(xs: List[T]): T = xs.reduceLeft(max)

And it also scales nicely when you realize you actually need to name the parameter:

def maximum[T](with T: Ord[T])(xs: List[T]): T = xs.reduceLeft(max)

Also, it works for multiple-parameter type classes, something context bounds just can’t do:

def asdf[A, B](with Convertor[A, B])(xs: List[A]): List[B] = ...
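For readers not familiar with context bounds, here is a rough sketch of the equivalence behind the argument above, written in the using syntax that Scala 3 eventually adopted (Ord and max are assumed to look roughly like this; in the snippet at the top of this post, max presumably comes from the Ord instance):

trait Ord[T]:
  def max(a: T, b: T): T

// With a context bound: [T: Ord] is shorthand for an extra contextual parameter.
def maximum[T: Ord](xs: List[T]): T =
  xs.reduceLeft(summon[Ord[T]].max)

// Roughly the same thing with the parameter spelled out and named, which is
// what the (with T: Ord[T]) form above makes explicit, modulo the keyword.
def maximumExplicit[T](xs: List[T])(using ord: Ord[T]): T =
  xs.reduceLeft(ord.max)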
1 Like

Note: they can, though it’s currently a bit awkward:

def asdf[A, B: [B] =>> Convertor[A, B]](xs: List[A]): List[B] = ...

which we’ll be able to write like this in the future:

def asdf[A, B: Convertor[A, _]](xs: List[A]): List[B] = ...

But otherwise I agree with your points.

1 Like

Note: Context bounds were always part of the proposal. The argument of extra syntax has to be weighed against the arguments of conciseness, convenience and backwards compatibility. Maybe if we never had had context bounds there would be no compelling reason to introduce them now, since the syntax of context parameters got so much lighter. But since context bounds exist, and are found useful, there’s less incentive to get rid of them. They are still a big improvement over context parameters for simple cases.

Note, in terms of my feedback for typeclasses, I’m not at all wedded to the existing implicit syntax (which I have generally avoided due to its perceived complexity), except that I superficially prefer the term “implicit” over “given”, but I understand the reasons why you cannot reuse that.

But what I am hoping for is a bit more syntactic sugar for defining typeclasses, and more uniformity for defining extensions. I would expect to use both of these features so having a clean, obvious syntax for defining them would be a win in my opinion.

Perhaps a specific syntax for defining typeclasses could mean that context bounds would mostly not be needed and potentially could be deprecated?

1 Like

A specific syntax for typeclasses has been proposed and pursued for years before the present implicit work started. In my opinion this is a dead end. It is a strength of Scala that typeclasses are regular types and instances are regular terms. You are free to disagree but you will never convince me otherwise.

4 Likes

The problem with this is that the compiler will search for an implicit of type Ord[T] before it has seen the parameter of type List[T] that would have constrained T to some specific type. (We could change the compiler to delay the implicit search until it has recorded constraints from all the non-implicit parameters, but that would prevent having a non-implicit parameter whose type depends on a previous implicit parameter, and it would also prevent intentionally searching for the implicit first to constrain the types of subsequent explicit arguments.)
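To make the ordering issue concrete, here is a small sketch (again in the using syntax Scala 3 eventually adopted; Ord and the method names are just illustrative) of why the position of the contextual parameter matters for inference:

trait Ord[T]:
  def max(a: T, b: T): T

// Contextual parameter first: per the explanation above, the search for Ord[T]
// runs before the List[T] argument has fixed T.
def maximumA[T](using ord: Ord[T])(xs: List[T]): T = xs.reduceLeft(ord.max)
// maximumA(List(1, 2, 3))   // T may still be unconstrained when Ord[T] is searched

// Contextual parameter last: List(1, 2, 3) fixes T = Int before the search runs.
def maximumB[T](xs: List[T])(using ord: Ord[T]): T = xs.reduceLeft(ord.max)
// maximumB(List(1, 2, 3))   // Ord[Int] is searched after T = Int is known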

1 Like

I don’t particularly mind the idea of typeclasses being regular types and instances, as it’s one less thing to remember. What I’d really like is some way to reduce the declaration site boilerplate involved in setting one up.

This would be analogous to the way for-comprehensions map to plain method calls, or the way the new enum syntax is effectively sugar for a sealed trait.
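As a small sketch of that analogy, both values below end up being computed through the same method calls:

val xs = List(1, 2)
val ys = List(10, 20)

// The for-comprehension...
val sugared = for (x <- xs; y <- ys) yield x + y

// ...is just sugar for plain flatMap/map calls:
val desugared = xs.flatMap(x => ys.map(y => x + y))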

2 Likes

What about my proposal? It seems to me that – semantically speaking – it has no major flaws. Perhaps there is an implementation detail there that is problematic (hard to do / increases compile time)?

Which part do you think is boilerplate, exactly? Looking at https://dotty.epfl.ch/docs/reference/contextual/typeclasses-new.html, the only thing which I’d say qualifies as boilerplate is the apply method in the companion object, which allows writing Foo instead of summon[Foo]:

object Monoid {
  def apply[T] with (m: Monoid[T]) = m
}

If this is really a problem, perhaps we could add a selection typing rule to type Foo.bla as summon[Foo].bla if everything else fails, but that would mean even more magic in the selection typing logic which is already quite complicated.
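To spell out what that apply buys you, here is a small sketch using the using/given keywords Scala 3 eventually shipped (the snippet above uses the with-parameter syntax under discussion instead; combine and unit are just illustrative method names):

trait Monoid[T]:
  def combine(x: T, y: T): T
  def unit: T

object Monoid:
  // The "boilerplate" method: it just returns whatever instance is in scope,
  // so that Monoid[T] can be written instead of summon[Monoid[T]].
  def apply[T](using m: Monoid[T]): Monoid[T] = m

given Monoid[String] with
  def combine(x: String, y: String): String = x + y
  def unit: String = ""

val a = Monoid[String].combine("foo", "bar")          // with the apply helper
val b = summon[Monoid[String]].combine("foo", "bar")  // without it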

1 Like

the only thing which I’d say qualifies as boilerplate is the apply method in the companion object

About that object Monoid I tried to rework the documentation here: [Documentation] Trying to rewrite typeclasses-new by aesteve · Pull Request #8147 · lampepfl/dotty · GitHub if some of you can give early feedback.

Thanks :slight_smile:

My perceptions of the boilerplate for defining typeclass implementations are:

  1. it doesn’t say/indicate what it is, i.e. the programmer’s intent is unclear. For comparison, this isn’t the case with OO classes, where the programmer’s intent is clearly identified by keywords such as “class”, “object”, “trait”, “extends”, etc.
  2. context bounds - my understanding is that these are only syntactic sugar and predominantly useful for typeclass instances.
  3. the object definition - I look at that and think to myself, what on earth is this for?
  4. that it apparently requires more complex extension syntax - Martin indicated that collective extension syntax alone isn’t sufficient because of type class definitions, so it seems to me that type class definitions make other parts of the language more complex as well.

For me, the complexity of the type class definition might be high enough that it would still make my code incomprehensible to a normal developer (i.e. perhaps someone smart who has a background in C, Java, Python, etc., but not Haskell or other functional languages).

There are aspects of Scala that I really love, and many of the features in Scala 3 look great. But I have to be honest: despite writing Scala as a hobby for the last 10-ish years, I am really wondering whether it is pragmatically the right language for me, and whether I would prefer programming in Rust, Go, or even just Java instead (which is gradually adding more FP constructs). The beginner surface of Scala seems simple, but there is a really deep underlying complexity. I was really hoping that Scala 3 was going to be a simpler language than Scala 2, but that is not my perception from reading some of the dotty documentation - if anything, it feels even more complex.

I could be completely wrong, but I think the FP experts will like Scala 3, although they might prefer Haskell or another pure FP language. But I do wonder what companies currently using Scala 2, and perhaps already concerned about complexity, will think of Scala 3.

I apologize if this comes across as a rant, and who am I to complain, given that I’m not doing any of the work updating the language, which I also appreciate is really hard. But hopefully feedback from someone who is predominantly concerned about readability for non-experts is still helpful in some way.

3 Likes