Better type inference for Scala: send us your problematic cases!


#41

@aksharp Could you write down a complete example illustrating the issue?


#42

This is in Dotty

scala> if (true) 1 else 2f
val res0: Float = 1.0

scala> if (true) Some(1) else Some(2f)
val res1: Some[Int | Float] = Some(1)

There is an inconsistency: either the first case should have been inferred as Int | Float, or the second one as Option[Float].


#43

@ramnivas Currently in Dotty we always widen unions at the top level (that is, not inside a type parameter), because keeping unions tends to break existing code. It’s possible that we’ll change this behaviour, but doing so without breaking too much code is tricky; see https://github.com/lampepfl/dotty/pull/2330#issuecomment-298233273 for the last discussion we had on this subject. Your example also involves conversions between numeric types, which is another can of worms. The current rules are explained at http://dotty.epfl.ch/docs/reference/dropped/weak-conformance.html, but they may also need to be debated further (and probably go through the SIP process too?).
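
To make the distinction concrete, here is a small sketch of the current behaviour (my own illustration rather than an official example, so the exact inferred types may vary between Dotty versions):

// Inferred at the top level, the union is widened to its join:
val widened = if (true) 1 else "one"               // inferred as Any
// An explicit annotation keeps the union:
val kept: Int | String = if (true) 1 else "one"
// Inside a type parameter the union also survives, as in the Some example above:
val inside = if (true) Some(1) else Some("one")    // Some[Int | String]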


#44

Starting dotty REPL...
scala> List(1L, 1F)                                                             
val res0: List[Double] = List(1.0, 1.0)
scala> List[Long|Float](1L, 1F)                                                 
1 |List[Long|Float](1L, 1F)
  |                 ^^
  |                 found:    Double(1.0)
  |                 required: Long | Float
  |                 
1 |List[Long|Float](1L, 1F)
  |                     ^^
  |                     found:    Double(1.0)
  |                     required: Long | Float

#45

This post was flagged by the community and is temporarily hidden.


#46

In trying to make a PR with my test case, I noticed a regression. Dotty 0.7.0 correctly infers the type and compiles the example, but Dotty versions 0.8.0 and later (including master) regress to the behaviour of Scala 2.12 and fail to compile the example.


#47

Hi, I have an issue with type classes and type inference.
The code is something like this:

import java.util.UUID

object Test extends App {

  /**
    * Trait to convert a string to something
    */
  trait BuilderFromString[X] {
    def fromString(s: String): X
  }

  // function that uses BuilderFromString to do something
  def getInfo[T: BuilderFromString](key: String): T = ???

  // implicits to work with String and UUID
  implicit val stringToString: BuilderFromString[String] = s => s
  implicit val stringToUUID: BuilderFromString[UUID] = s => UUID.fromString(s)

  // this line compiles
  val info1 = getInfo[String]("test")

  // this line doesn't work
  // Error:(16, 29) ambiguous implicit values:
  // both value stringToString in object Test of type => Test.BuilderFromString[String]
  // and value stringToUUID in object Test of type => Test.BuilderFromString[java.util.UUID]
  // match expected type Test.BuilderFromString[T]
  //  val info: String = getInfo("test")
  val info: String = getInfo("test")

}


#48

@deblockt This is working as intended: the compiler cannot choose between stringToString and stringToUUID because neither has a type more precise than the other and they’re declared in the same scope. You can force a relative ordering by declaring your implicits in a hierarchy like this:

trait LowPriorityImplicits {
  implicit val stringToUUID: BuilderFromString[UUID] = s => UUID.fromString(s)
}
object MyImplicits extends LowPriorityImplicits {
  implicit val stringToString: BuilderFromString[String] = s => s
}
import MyImplicits._

#49

@smarter I don’t understand this part:

the compiler cannot choose between stringToString and stringToUUID because neither has a type more precise than the other

stringToString is a BuilderFromString[String], stringToUUID is a BuilderFromString[UUID].

On the line val info: String = getInfo("test"), getInfo must return a String, so the function needs a BuilderFromString[String] … No?


#50

Ah right, I missed that, so good news: your original program already works as is in Dotty :).


#51

Good news :slight_smile:


#52

As a mere Scala user (I haven’t followed Dotty development closely) I would like to know if Dotty:

  1. Still requires the use of the Aux pattern
  2. Still requires currying so that type inference can work across a function/method’s parameters, and whether the left-to-right rule still applies (see the sketch after this list)
  3. Still has “type erasure” issues when dealing with Scala code only (no Java mixed in)
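
For point 2, here is the kind of thing I mean, as it stands in Scala 2 (a minimal sketch with made-up helper names, not code from any particular library):

def foldUncurried[A, B](as: List[A], z: B, f: (B, A) => B): B = as.foldLeft(z)(f)
def foldCurried[A, B](as: List[A])(z: B)(f: (B, A) => B): B = as.foldLeft(z)(f)

// Scala 2 cannot infer the lambda's parameter types when everything sits in one parameter list:
// foldUncurried(List(1, 2, 3), 0, (acc, x) => acc + x)   // error: missing parameter type
// With currying, inference flows left to right across the parameter lists:
foldCurried(List(1, 2, 3))(0)((acc, x) => acc + x)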

Finally, are you also interested in examples that involve (for example) Shapeless-style macro magic?

TIA


#53

You can now refer to other parameters in the same parameter list, which should make the Aux pattern more or less obsolete. But type inference doesn’t seem to work for that yet:

trait Foo[In] { type Out }
trait Bar[In]

implicit def fooInt: Foo[Int]{ type Out = String } = ???
implicit def fooString: Foo[String]{ type Out = Boolean } = ???

implicit def barInt: Bar[Int] = ???
implicit def barBoolean: Bar[Boolean] = ???


def works[A, B, C](implicit f1: Foo[A] { type Out = B }, f2: Foo[B] { type Out = C }, b: Bar[C]): C = ???
// compiles, inferred a Boolean
works[A = Int]

def fails[A](implicit f1: Foo[A], f2: Foo[f1.Out], b: Bar[f2.Out]): f2.Out = ???
// doesn't compile: no implicit argument of type Foo[f1.Out] was found for parameter f2 of method fails
fails[Int] 

#54

@Jasper-M Thanks for the info. Hope that second ‘fails’ example will work in the future. Much cleaner syntax.


#55

In the case of Right(3) and "something" the common type is java.io.Serializable, so I think the issue is with how the inference candidate is chosen, not with subclassing itself.

As for interfacing with dynamic languages such as JavaScript, the benefit of using languages like Scala.JS and TypeScript is that they have a stronger type system to catch programming errors, not that it’s permissible for Option(2) to get mixed up with Int. If people wanted that, they would use JavaScript itself. In situations where List[Any] is actually wanted, we can ask people to write Option(2): Any.

This should also improve situations such as List(1, 2, 3).contains("wat") (see the Type-safe contains discussion).

In Scala 2.x that would have caused too many false positives for ADTs, but thanks to Eq, I don’t think the idea is that far-fetched.
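
To illustrate the kind of constraint I have in mind, here is a rough sketch using a hand-rolled type class instead of the compiler-provided Eq (the names CanCompare and safeContains are made up for illustration; the real Eq machinery is more sophisticated):

trait CanCompare[A, B]
object CanCompare {
  // only values of the same type are comparable in this toy version
  implicit def refl[A]: CanCompare[A, A] = new CanCompare[A, A] {}
}

def safeContains[A, B](xs: List[A])(elem: B)(implicit ev: CanCompare[A, B]): Boolean =
  xs.exists(_ == elem)

safeContains(List(1, 2, 3))(2)        // compiles: CanCompare[Int, Int] is found
// safeContains(List(1, 2, 3))("wat") // rejected: no CanCompare[Int, String] instance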


#56

Here is another example where scalac gives a compilation error: https://scastie.scala-lang.org/n8Oxft9xSVCmfg2gii87Vw. It seems that dotc already compiles that code, but do you think scalac could be fixed too?


#57

What about this sort of thing where the method call in yield determines the type parameters for the preceding decode calls?


#58

EDIT REDUX: My apologies for the hostile tone of my previous edit. I was having a rough day and took much greater offense than was warranted at my post being flagged as off-topic. Please indulge me in one final revision.

I have an example of some code which fails to compile, I believe due to a failure in type inference. It would have been difficult to eliminate the dependency on lift-json, because the failure seems to depend on some tricky implicits defined in their DSL for constructing JSON.

The code itself can be found in this Scastie snippet. Unfortunately I had some difficulty adding the dependency in Scastie – I'm not sure whether this was user error or a failure in the system (it did appear that scaladex was down at the time). Apologies for that. In order to see the actual compilation failure I am trying to demonstrate, the snippet requires this dependency:

"net.liftweb" % "lift-json_2.12" % "3.3.0"

in sbt parlance.

Note that if the second List is replaced with List[JObject], it does compile.