Pre-SIP: Equivalent of `inline implicit def`

Related to Use Cases for Implicit Conversion Blackbox Macros.

Use cases

As @lihaoyi showed in the discussion linked above, some libraries (usually ones using metaprogramming) such as sourcecode, Iron, Refined, Mill… use implicit conversions combined with macros.

For example, here is the conversion used by Iron:

implicit inline def autoRefine[A, C](inline value: A)(using inline constraint: Constraint[A, C]): A :| C =
  macros.assertCondition(value, constraint.test(value), constraint.message)
  IronType(value)

Here, `value` has to be `inline` so that it can be evaluated at compile time and analyzed by `macros.assertCondition`.
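For context, a compile-time assertion macro along these lines could look like the following sketch. This is a hypothetical reconstruction, not Iron's actual implementation of `macros.assertCondition`:

import scala.quoted.*

object macros:
  // Hypothetical sketch in the spirit of Iron's assertCondition: the condition
  // must be a compile-time constant, and compilation aborts when it is false.
  inline def assertCondition[A](inline value: A, inline cond: Boolean, inline msg: String): Unit =
    ${ assertConditionImpl('value, 'cond, 'msg) }

  private def assertConditionImpl[A: Type](value: Expr[A], cond: Expr[Boolean], msg: Expr[String])(using Quotes): Expr[Unit] =
    import quotes.reflect.report
    cond.value match
      case Some(true)  => '{ () } // constraint holds: the check erases completely
      case Some(false) => report.errorAndAbort(msg.valueOrAbort)
      case None        => report.errorAndAbort(s"Cannot evaluate condition at compile time for ${value.show}: ${cond.show}")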

Here is a more minimal example.

types.scala

opaque type PosInt = Int
object PosInt:
  inline def unsafe(inline value: Int): PosInt = value
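Note that `unsafe` performs no validation itself: the opaque type erases to `Int`, and the method merely re-types the value. All checking has to happen at compile time, in the macro below:

// unsafe does not check its argument: both lines compile on their own.
val ok: PosInt  = PosInt.unsafe(3)
val bad: PosInt = PosInt.unsafe(-3) // no error here! Safety comes from the macro-checked entry points.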

macros.scala

import scala.quoted.*

def autoPosIntImpl(expr: Expr[Int])(using Quotes): Expr[PosInt] =
  import quotes.reflect.report.errorAndAbort

  // expr.value yields Some(n) only when the expression is a statically-known constant
  expr.value match
    case Some(value) if value > 0 => '{PosInt.unsafe($expr)}
    case Some(value) => errorAndAbort(s"$value <= 0")
    case None => errorAndAbort(
      s"""Cannot retrieve value. Got:
         |---
         |${expr.show}
         |---""".stripMargin
    )

test.sc

//> using scala "3.4.0"
//> using file "types.scala"
//> using file "macros.scala"

import scala.language.implicitConversions

implicit inline def autoPosInt(inline value: Int): PosInt = ${autoPosIntImpl('value)}

val x: PosInt = 5 //Compiles
val y: PosInt = -5 //Error: -5 <= 0
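This works because `value` is an `inline` parameter: each call site is expanded with the actual argument substituted, so the macro receives a constant expression it can inspect. Conceptually:

// After inlining the conversion, the macro sees the literal:
val x: PosInt = 5  // autoPosIntImpl receives '{ 5 };  expr.value == Some(5)
val y: PosInt = -5 // autoPosIntImpl receives '{ -5 }; expr.value == Some(-5), so it aborts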

Now, using `Conversion`:

//> using scala "3.4.0"
//> using file "types.scala"
//> using file "macros.scala"
//> using file "conversion.scala"

given Conversion[Int, PosInt] with
  override inline def apply(value: Int): PosInt = ${autoPosIntImpl('value)}

val x: PosInt = 5

This does not work:

[error] Cannot retrieve value. Got:
[error] ---
[error] value
[error] ---
[error]   override inline def apply(value: Int): PosInt = ${autoPosIntImpl('value)}

And we cannot mark `value` as `inline`:

[error] Cannot override non-inline parameter with an inline parameter
[error]   override inline def apply(inline value: Int): PosInt = ${autoPosIntImpl('value)}
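This restriction is general: an override cannot strengthen a parameter to `inline`, because callers going through the supertype would pass an ordinary runtime value. A minimal illustration with hypothetical names:

trait Converter:
  def apply(value: Int): Int

class Doubler extends Converter:
  // error: Cannot override non-inline parameter with an inline parameter
  override inline def apply(inline value: Int): Int = value * 2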

As far as I know, there is no replacement for `implicit inline def`, and an alternative must be available before `implicit def` is deprecated.

Proposed solution: InlineConversion

We cannot change the definition of `Conversion#apply` to mark its parameter as `inline`, as that would break existing code. An alternative would be to add a similar mechanism, but inline:

trait InlineConversion[-T, +U]:
  inline def apply(inline value: T): U

I implemented it with an `inline implicit def` (which could be replaced by the same mechanism as `Conversion`):

conversion.scala

import scala.language.implicitConversions

trait InlineConversion[-T, +U]:
  inline def apply(inline value: T): U

implicit inline def convert[T, U](inline t: T)(using inline conversion: InlineConversion[T, U]): U = conversion(t)

test.sc

//> using scala "3.4.0"
//> using file "types.scala"
//> using file "macros.scala"
//> using file "conversion.scala"

inline given InlineConversion[Int, PosInt] with
  override inline def apply(inline value: Int): PosInt = ${autoPosIntImpl('value)}

val x: PosInt = 5 //Compiles
val y: PosInt = -5 //Error: -5 <= 0

There might be better alternatives, but I think this one is a good starting point.

4 Likes

It may be good to elevate this to Pre-SIP in the title.

Should I only prefix the title with “Pre-SIP” or may I also change the category to Scala Improvement Process?

Just Pre-SIP is fine.

1 Like

The unrestricted use of implicit conversions is a code smell, which makes several aspects of the Scala type system and type inference fragile and slow. In current Scala 3, conversions require a language import at each use site. I am in favor of tightening the screws further and making use of conversions without that language import an error instead of just a feature warning.

See Proposed Changes and Restrictions For Implicit Conversions

I plan to bring this up as a Pre-SIP soon. We had a long backup of other things that prevented this being proposed sooner. But it’s not forgotten.

In light of all this I am against making implicit conversions more powerful and more of a feature. In particular combining implicit conversions with macros makes me shudder.

Some libraries will have to be rewritten if that’s no longer possible, but that will be a good thing. We collectively need to wean ourselves off this implicit conversion drug. Maybe some specialized idioms will no longer be expressible. If we get a cleaner language with fewer footguns in return, I am willing to make that trade.

6 Likes

Perhaps, but that’s a fairly casual statement with deep implications. If “fewer footguns” means that popular and useful libraries no longer work as well, that may be a pretty bad trade for the language’s future. I’d recommend careful study of the ramifications before cutting features like this.

(Mind, I don’t know myself how deep the implications are in this specific case. But some of the tools linked above aren’t minor, and making them worse is a potentially significant consequence, especially when the language is already fighting for mindshare.)

11 Likes

I do not see how preventing libraries from existing is a good thing. There are valid use cases for implicit conversions, and if we do not at least try to provide alternatives for them, we are splitting the ecosystem without good cause, IMO.

9 Likes

I wonder if we should also investigate better foot-armor. For instance, use the code-rewriting tools we already have for Scala 2 to 3 to offer to rewrite implicit conversions (maybe triggered by a @desugar("conversions") annotation); or a compiler flag to yell more expressively when something doesn’t work and implicit conversions touch it.

I don’t think there are a very large number of cases where code subtly fails to produce the correct result because of implicit conversions, are there? Mostly it doesn’t work at all, but you have no idea why; or it does work, but you also have no idea why, because of the conversions.

This suggests to me that it’s worth at least investigating whether the pain points can be addressed without restricting the feature yet further.

2 Likes

In general it’s true that this could lead to pretty wild implicit conversions, but the motivating case for this Pre-SIP is Iron, where it is only used to introduce compile-time checks that restrict usage of the implicit conversion from an unrefined to a refined type, which sounds legitimate to me.

9 Likes

I agree with this at a high level, but the devil is in the details.

My first big worry is that this is fighting the last decade’s battles. It isn’t 2014 anymore, and people going crazy with implicit conversions isn’t really a thing. Sure, it was a thing in the past, but it isn’t a major problem now. Run a poll on Reddit, Twitter, or Hacker News and I’m sure you will find that implicit conversions are not top of mind (vs. things like IntelliJ, SBT, compile times, compatibility/stability/maturity, etc.).

We should first find the usages in the wild that we consider problematic, and the usages that aren’t, and then decide what language changes we need to make. The current proposal of “Martin decides what is bad, and all these libraries that people love are declared code-non-grata” is the tail wagging the dog.

If it turns out that the tradeoff is “we limit a feature that doesn’t affect anyone’s day to day, and wide swathes of the ecosystem break”, that seems like an exceedingly bad trade. Less “Python 3” and more “Elm 0.19” level of bad (Elm 0.19 limited use of a few key advanced features to core contributors only, without offering alternatives. This resulted in many large codebases being forever unable to upgrade without a huge rewrite, and the callous way the breakage was done damaged trust in the platform. The previously-vibrant community has largely disintegrated)

My second big worry is that there have been several threads here trying to spark discussion on current use cases for implicit conversions. People are genuinely willing to work together to find a compromise path forward, but those threads have largely not seen any engagement from the core Scala 3 team.

Without such discussion and shared understanding, it’s hard to see how we will come up with a good plan to move off implicit conversions. That’s not to say such a plan doesn’t exist, just that getting to it will require much more open collaboration than the “your code will break and that is a sacrifice I am willing to make” strategy described above.

That said, I am still in favor of limiting implicit conversions overall. It just needs to be done for the right reasons, and with the right approach, in order to be successful. The devil is in the details.

29 Likes

I can understand the general approach, we were all there for the last two decades, but for balance: Iron is today the most exciting, perhaps the only, pure Scala 3 library I have seen that made me think Scala 3 has huge benefits and not only costs. And the presentation on it at scala.io was a real showcase for Scala 3, with some really nice glimpses of how zero-cost abstractions can be done in Scala 3.

Having the message that the most exciting Scala 3 library I know of is not handling things correctly and thus shall have its wings clipped is extremely surprising, and even gloomy.

6 Likes

I don’t think implicit conversions with macros are as scary as they used to be in Scala 2.

In Scala 3, implicit conversions are already more restricted and cleaner than in Scala 2:

  • They require a language import
  • Orphan instances need to be imported explicitly, e.g. via a `given` import
  • They use the `Conversion` type class instead of the special `implicit def` syntax

The macro API is also cleaner, and in Scala 3 macros are less powerful (in a good way). Note that the proposed solution is for blackbox/non-transparent implicit macros, which pose fewer problems.

As far as I know, all you can do with blackbox macros is throw warning/error messages, check things at compile time (e.g. Iron/Refined constraints), and return a “custom” expression which still needs to satisfy the output type of the conversion. That does not sound like much magic to me.
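For contrast, the extra power that blackbox macros deliberately give up is that of transparent inline defs, which may narrow their declared result type at each call site. A small sketch, reusing `autoPosIntImpl` from above:

// Blackbox-style: the result type is fixed to PosInt, whatever the macro does.
inline def checked(inline i: Int): PosInt = ${autoPosIntImpl('i)}

// Transparent (whitebox-style): the declared type Any may be narrowed to the
// precise type of the expansion at each call site.
transparent inline def identityOf(inline i: Int): Any = i

val a: Int = identityOf(5) // typed as the literal type 5, narrower than Any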

Do you have a specific use in mind that is problematic?

5 Likes

`inline given Conversion[T, U]` might still be useful even after implicit conversions are removed. Indeed, the following will remain possible:

val x: into PosInt = 5 //Compiles
val y: into PosInt = -5 //Error: -5 <= 0
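For readers unfamiliar with it, `into` refers to the proposed marker that lets a conversion's target type opt in to implicit conversions, moving the language import from each use site to the definition. A hypothetical sketch of the parameter form:

// Hypothetical sketch, assuming the proposed `into` modifier: a parameter
// marked `into` accepts values adapted by a given Conversion without the
// caller needing `import scala.language.implicitConversions`.
def setCount(count: into PosInt): Unit =
  println(count)

setCount(5) // the Conversion[Int, PosInt] (and its compile-time check) applies here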

I agree that macros the way they are used in Iron or Refined are among the most well-behaved.

But in general, macros can take very long to compile and can produce arbitrary code. The least I want is to see a call in the source code that might cause this. So, if one argues for mixing macros and implicit conversions, one also needs a convincing story for how to prevent abuses. So far I have not seen that.

Even for refinement checking, is it really that much to ask to write (say) `e.narrow` if an expression `e` should be narrowed to a refinement type? Refinement type checking is probably several orders of magnitude slower than normal type checking, so this would be welcome as an indicator of where compilation time is spent. In our own research on refinement types we find explicit narrowing acceptable.

1 Like

Indeed this does not look so bad:

extension [T](inline t: T)
  inline def convert[U](using inline conversion: InlineConversion[T, U]): U = conversion(t)

given InlineConversion[Int, PosInt] with
  inline def apply(inline value: Int): PosInt = ${autoPosIntImpl('value)}

val x: PosInt = 5.convert
val y: PosInt = -5.convert //Error: -5 <= 0

Is that the right order of priorities?

Somewhere in my giant pile of buttons, I have one that reads, “There is not now, nor will there ever be, a programming language in which it is the least bit difficult to write bad code”. It’s blithe, but I’ve never found a reason to doubt it.

IMO, the issue isn’t whether it is possible to abuse a feature; the issue is whether a feature invites frequent abuse. It’s true that there was a time when Scala 2 and its idioms around implicit conversions invited that, but it seems to me like that time has passed – in practice, those bad patterns seem to be dying out even in Scala 2, much less Scala 3. The community has, by and large, learned better; that’s way more important than all the protections built into the language.

I suspect you’re fighting yesterday’s war here, and that it’s being a bit counter-productive…

10 Likes

It is, unfortunately. Scala has a history of being banned in certain companies because existing code is too obscure and hard to maintain. I wish Scala programmers were all responsible and would adhere to the principle of least power (Haoyi’s interpretation, not the finally tagless one, which is very clever but misses the point). But many of them are not. I am not talking about thought leaders, who are indeed now largely beyond it; I am talking about random Scala programmers in companies who want to show they are smart or indispensable by writing overly clever code. All the feedback I get is that this is currently problem #1.

Other languages don’t have both implicit conversions and macros. So if we want to have both it is our due diligence to show how abuses can be kept in check.

1 Like

As a person whose company is currently entirely dependent on Scala, I see it as much more likely that companies will ban Scala if it simply breaks their existing code. If you truly care about industry, then talk to and listen to industry. Implicit conversion “obscurity” is not what we worry about.

15 Likes

Where are you getting this feedback? The 2023 Scala Language Survey lists Tooling and Compile Speeds as the top two things that need improving, by an overwhelming margin, from a broad base of roughly 1,300 responses.

I can speak for my own part of industry at Databricks: thousands of engineers, millions of lines of code. Implicit conversions are not a problem for us. Migration difficulties are. Each and every breaking change materially impacts upgrade timelines, while obscure language features do not cause us difficulty. For Databricks folks, the Scala Survey priorities for improvement basically match our own: tooling and compile speeds are the top pain points. Some things we can improve on our own, while for others we need to rely on upstream projects. We haven’t encountered any specifically macro-related compile-time issues.

Perhaps other folks from industry can chip in if they have different experiences.

16 Likes

To emphasize the impact this has: we still regularly hear about customers stuck on 2.12 due to Spark. Spark has been released for Scala 2.13 since 3.2.0 (October 2021), but it’s not the default, and to this date both Databricks Runtime and Amazon EMR (I guess these are the main hosted Spark offerings) offer 2.12 only.

Hopefully they will move to 2.13 soon (the Spark 3.5 release notes say 4.0 will drop Scala 2.12), which should also enable Scala 3 (via `for3Use2_13`).

I can second that. 2.13 was a library release, so the issues in migrating to 2.13 are different from those of migrating to 3. But still, every breaking change is a lot of work; things like changing the `Seq` alias to immutable or the removal of `breakOut` in a 20 MLOC codebase mean a lot of work (even with automatic rewrites).

11 Likes