Implicits with inferred return types lead to undefined behavior?

Implicits that don’t have an explicit return type are special-cased in the compiler, leading to undefined behavior. Essentially, they are not in scope before their point of definition. Except that sometimes they are, and the exceptions are surprising enough to cause concern.

Here’s the relevant code from the compiler:

/** Should implicit definition symbol `sym` be considered for applicability testing?
 *  This is the case if one of the following holds:
 *   - the symbol's type is initialized
 *   - the symbol comes from a classfile
 *   - the symbol comes from a different sourcefile than the current one
 *   - the symbol and the accessed symbol's definitions come before, and do not contain the closest enclosing definition, // see #3373
 *   - the symbol's definition is a val, var, or def with an explicit result type
 *  The aim of this method is to prevent premature cyclic reference errors
 *  by computing the types of only those implicits for which one of these
 *  conditions is true.
 */
def isValid(sym: Symbol) = {
  def hasExplicitResultType(sym: Symbol) = ...
  def comesBefore(sym: Symbol, owner: Symbol) = ...

  sym.isInitialized ||
  sym.sourceFile == null ||
  (sym.sourceFile ne context.unit.source.file) ||
  hasExplicitResultType(sym) ||
  comesBefore(sym, context.owner)
}

Notice that this might still work if the symbol was already initialized (which, if I’m not mistaken, depends on the order in which files are passed to the compiler!). In short, any code that uses an implicit before its definition may or may not compile, depending on when the code gets compiled, the order in which files are reported by the file system or sbt, or how much code happened to be compiled before and already loaded the same type.
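
To make this concrete, here’s a minimal, self-contained sketch of the kind of code that hits this path (all names are made up for illustration): an implicit val with an inferred type, used implicitly earlier in the same file. As far as I can tell, compiled on its own this fails with “could not find implicit value”, but it can start compiling if something compiled earlier happens to force the symbol’s type first.

trait Show[A] { def show(a: A): String }

object Example {
  // This use comes *before* the definition of `intShow` below.
  def greeting: String = render(42)

  def render[A](a: A)(implicit s: Show[A]): String = s.show(a)

  // No explicit result type: `isValid` above excludes this symbol while
  // `greeting` is typechecked (not initialized, same source file, no explicit
  // result type, and it does not come before the use site).
  implicit val intShow = new Show[Int] { def show(a: Int) = a.toString }
}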

Implicit scope and resolution are already hard to reason about; adding nondeterminism on top can make it extremely frustrating for users. Are there good reasons for keeping the exceptions? Why not forbid using an implicit without an explicit type before its point of definition? I have vague memories of such a proposal, but I couldn’t find anything on Google.

Unfortunately there is code that relies on this unspecified behavior. For an example in the wild, here’s some code in Specs2 that uses an implicit right before defining it (without a return type). However, I don’t think it would be hard to “fix” such code.
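
I won’t paste the Specs2 file here; the following is only a simplified sketch of the shape of the code, assuming scalaz 7 on the classpath (the trait and instance names are loosely modelled on the real file, everything else is made up): a Reducer is built from an implicitly summoned Monoid, and that Monoid is an implicit val with an inferred type defined just below the use site.

import scalaz.{ Monoid, Reducer }

trait MatchResultMessages {
  sealed trait MatchResultMessage
  case object EmptySuccess extends MatchResultMessage

  // Use site: Reducer.unitReducer asks for an implicit Monoid[MatchResultMessage]...
  implicit def toMessageReducer: Reducer[String, MatchResultMessage] =
    Reducer.unitReducer((s: String) => (EmptySuccess: MatchResultMessage))

  // ...which is only defined here, without an explicit result type, so whether
  // it is considered depends on the `isValid` conditions quoted above. Compiled
  // in isolation this fails to find the Monoid; in a larger build it may work
  // if another file forces the instance's type first.
  implicit val MatchResultMessageMonoid = new Monoid[MatchResultMessage] {
    def zero = EmptySuccess
    def append(m1: MatchResultMessage, m2: => MatchResultMessage) = m1
  }
}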

Yes, and IIUC there’s consensus on fixing it. Off the top of my head I looked at WartRemover and Scalafix, and both agree.

Scala has trouble correctly resolving implicits when some of them lack explicit result types. To avoid this, all implicits should have explicit type ascriptions.

Scalafix also fixes this (with a couple of limitations) because Dotty requires the fix: Scalafix · Refactoring and linting tool for Scala

Dotty requires implicit vals and defs to explicitly annotate return types. The ExplicitImplicit rewrite inserts the inferred type from the compiler for implicit definitions that are missing an explicit return type.
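
As a rough before/after illustration of what such a rewrite does (my own sketch, not Scalafix’s actual output):

object Before {
  // Result type is inferred: Dotty rejects this, and Scalac's implicit search
  // may ignore it before its point of definition.
  implicit val intOrdering = new Ordering[Int] {
    def compare(x: Int, y: Int) = Integer.compare(x, y)
  }
}

object After {
  // The rewrite writes out the type the compiler inferred.
  implicit val intOrdering: Ordering[Int] = new Ordering[Int] {
    def compare(x: Int, y: Int) = Integer.compare(x, y)
  }
}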

I should probably also thank you for pointing out this code; I hadn’t seen a specific discussion of this issue before. Also, it’s probably a good idea to improve the semantics (under some flag, for compatibility) in the next releases of Scalac, and maybe earlier in Typelevel Scala (following their policy, which requires PRs on scala/scala first)? Ping @tpolecat @milessabin.

I should have been clearer in my original email. Specs2 fails to compile with incremental changes. Here’s a repro using Scala 2.12.1:

  • sbt matcher/compile

  • add a whitespace character in MatchResultMessages.scala

  • compile:

      [error] /Users/dragos/workspace/triplequote/tests/specs2/matcher/src/main/scala/org/specs2/matcher/MatchResultMessages.scala:13: could not find implicit value for parameter mm: scalaz.Monoid[Product with Serializable with MatchResultMessages.this.MatchResultMessage]
      [error]     Reducer.unitReducer { r: MatchResult[T] => r match {
      [error]                         ^
      [error] one error found
      [error] (matcher/compile:compileIncremental) Compilation failed

I agree. A first step is to agree on how Scala should handle such implicits, and here we need input from the maintainers. I see two options:

  • forbid implicits with inferred return types altogether (like Dotty)
  • forbid implicits with inferred types when used in the same class and before their point of definition

Since this is undefined behavior and already fails for incremental builds, I think the second option can be done in the 2.12 cycle already.
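
To spell out what the second option would and wouldn’t allow, here’s a hypothetical example (all names made up):

trait Codec[A] { def encode(a: A): String }

class Codecs {
  // Would be rejected with a dedicated error under option 2: the use precedes
  // the definition of `intCodec`, which has no explicit result type. (Today
  // this fails with a generic "could not find implicit value", or compiles by
  // accident if the symbol was already initialized.)
  def early: String = write(1)

  implicit val intCodec = new Codec[Int] { def encode(a: Int) = a.toString }

  // Would stay legal: the use comes after the definition.
  def late: String = write(2)

  def write[A](a: A)(implicit c: Codec[A]): String = c.encode(a)
}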

…and here’s the ticket I referred to: SI-801. According to Martin’s comment, the intention was to forbid it always, but it seems it’s not really enforced.

Edit: and I opened SI-10229

Dotty always demands the return type, with one exception: implicit vals (but not defs!) in a local scope don’t need a return type. The reason is that such vals cannot be referred to before they are defined anyway, so we don’t need their types beforehand.
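
If I read that right, it means roughly this (a sketch with made-up names; the comments describe the Dotty rule, the code itself is plain Scala 2):

trait Pretty[A] { def render(a: A): String }

object DottyRuleSketch {
  // Template level: Dotty demands that the result type be written out.
  implicit val intPretty: Pretty[Int] =
    new Pretty[Int] { def render(a: Int) = a.toString }

  def demo: String = {
    // Local scope: an implicit *val* may omit the result type, since nothing
    // can refer to it before this point anyway. A local implicit *def* would
    // still need an explicit result type.
    implicit val strPretty = new Pretty[String] { def render(a: String) = a }
    implicitly[Pretty[String]].render("ok")
  }
}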

+1 to nailing this down. I’ve run into this problem this week and spent a few hours questioning my own sanity.

If Scalac makes this illegal, how should it inform users of the change?
The most annoying part of the error you show is that it doesn’t point to the implicit that stopped being applicable. In the failing run that can’t be helped: Scalac doesn’t know that the implicit coming later would have the right type.
Somewhat surprisingly, your example does compile on the first run, so in that run Scalac knows the offending implicit is applicable, can tell it’s being considered, and could tell you to “add an explicit return type to the implicit definition on line XXX”.
I’d expect you’re not always so lucky.

We should at least check how many failures this change produces on the community build. If a clean compile with 2.12 breaks too much code that used to work, we’d need more care.

Maybe there could be a flag enabling this “misfeature” (defaulting to off, that is, to what you’re asking for), and any new error could suggest enabling this flag as an alternative. Hopefully I’m worrying too much.

You are right, but I’d expect such code to work for a clean compile; otherwise it wouldn’t survive for long. So in the runs where the implicit is applied, an additional warning (or error) can be issued. Since the message is essentially “hey, this may not always work”, it’s fine for it to show up only when the code “works”, and to fail with “implicit not found” when it doesn’t.

Sure, that’s a good first step. I would avoid adding a new flag, given that the transition to Dotty will involve adding explicit types for implicits anyway (except for locals), but I don’t feel strongly about it.

I don’t feel strongly either, and it’s not my call in the end; I only mean “we can do it even if we need to care about compatibility”. And I was imagining a -X or -Y flag that would be dropped soon.
Regarding Dotty, there’s at least 2.13 planned before the switch, so it’s not happening tomorrow. Still, “warn about all implicits without explicit return types” is an option; it just might be very noisy, so maybe it should wait for a more mature Scalafix.

That makes sense. I just couldn’t come up with an explanation of the observed behavior from the code you’ve shown, but now I sort of see it: compiling other files might have forced initialization of the offending symbol for some reason.
