Proposal to drop Weak Conformance from the language

It doesn’t really work - I can’t make the list contain doubles, only ints:

scala> val list = List[Int&Double](1.asInstanceOf, 1.1.asInstanceOf)
val list: List[Int & Double] = List(1, 1)

scala> list.last
val res1: Int & Double = 1

ETA: which makes sense, because the type Int&Double doesn’t allow non-integer values. So I don’t see how this is useful, but maybe I’m missing your point.

In @Jasper-M’s example, the + isn’t really overloaded. The type Int&Double is a subtype of Int, so Dotty coerces the values to Ints and binds the + to Int.+. Illustrated by the exception here:

scala> val list = List[AnyVal](1, 1.1).asInstanceOf[List[Int&Double]] // Illegal cast
val list: List[Int & Double] = List(1, 1.1)

scala> list.map(_+1)                                                                                                                                                                                                                                          
java.lang.ClassCastException: java.lang.Double cannot be cast to java.lang.Integer

I’m not saying that this is working perfectly already. But apparently, once you correctly set up all the types, Dotty is already able to handle this (as far as the type system is concerned). And (list: List[Int']).map(_ + 1) does work.

The way some intersections get erased might need some tweaking indeed. But the casts in your example are unsound in this new scheme as well. As you said 1.1 is not a valid Int & Double in any scenario. So it’s expected that this blows up.

@propensive Here’s a problematic example:

    Int & Double <: X & Y

where X and Y are type variables. You can solve this by taking X=Int and Y=Double or
the other way around. The two solutions could have drastically different consequences downstream.

1 Like

But apparently, once you correctly set up all the types, Dotty is already able to handle this (as far as the type system is concerned). And (list: List[Int']).map(_ + 1) does work.

map(_+1) binds to a single concrete type’s + method. It doesn’t work if the List[Int'] contains values of more than one concrete type. That’s the important bit, and it’s missing. For similar reasons, you can’t actually construct such a list holding values of different concrete types without explicit casts, and if you construct it using asInstanceOf you won’t be able to apply any methods that don’t exist on Any.

@propensive proposed that:

every number literal is typed as the intersection of all of the numeric types that can accurately and precisely represent it.

But it’s not that way today, so Int&Double is an empty type. I’m afraid I don’t understand what the relevant value/behavior is that you think already exists and didn’t in Scala 2.x, beyond type intersections per se. Can you please explain?

Thanks, @odersky. I suppose it would be equally correct (more generally correct?) to choose X = Int | Double and Y = Int | Double, but I can easily believe that that could be the start of further problems downstream…

1 Like

Thanks for exploring this, Daniel! For any parameterized type, there would be a unique choice for which type from the intersection would be chosen for the erased type. From a Java point-of-view, that would be the end of the story, but Scala would also have access to the full type intersection. The fact that Scala has been tracking the exact literals would mean that the intersection type stored in TASTY would provide all the information the compiler needed to know that it could safely cast any of the values in the List (or whatever it is) to any one of the types in the intersection.
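To make the “safe cast to any member of the intersection” idea concrete, here’s a small runnable sketch. It only illustrates the concept, not the compiler mechanism: ExactLiteral is a hypothetical name, and toIntOption/toDoubleOption assume Scala 2.13+.

final case class ExactLiteral(digits: String) {
  // The tracked literal text can be read back at any type that represents it exactly.
  def asInt: Option[Int]       = digits.toIntOption
  def asDouble: Option[Double] = digits.toDoubleOption
}

val one = ExactLiteral("1")    // representable as both Int and Double
val pi  = ExactLiteral("3.14") // representable only as Double

one.asInt     // Some(1)
one.asDouble  // Some(1.0)
pi.asInt      // None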

I’m not sure why Dotty permits mapping using the overloaded + method like that, but I think it just makes a choice using weak conformance.

2 Likes

A more radical choice would be to give numeric literals no type at all. Any numeric literal which is typed without an expected numeric type would be an error. Unfortunately this includes statements such as val x = 1. Explicit suffixes for Int and Double can make this easier (val x = 1i) but you could also consider making an exception for variable definitions (and infer Int or Double, respectively).
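To illustrate, a minimal sketch of what that scheme could look like, using only suffixes and type annotations that exist today (the 1i suffix above is hypothetical):

val a: Int = 1    // fine: the expected type Int fixes the literal
val b = 1L        // fine: the L suffix fixes the type to Long
val c = 1d        // fine: the d suffix fixes the type to Double
// val d = 1      // would become an error (unless the variable-definition exception applied, inferring Int)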

I like @propensive’s proposal, but if that’s impractical, here’s another view:

Like @soronpo mentioned, there’s a view in which implicit widening and implicit conversion (which are currently separate) could be unified, by replacing the former with the latter.

Provide implicit conversions (maybe behind an import, or maybe in predef) where there’s no loss of precision – Byte => Int, Int => Long, Float => Double, Int => Double etc.
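For concreteness, a minimal sketch of those conversions using Scala 3’s scala.Conversion (whether they live in Predef or behind an import is left open, and the object name is just illustrative):

object LosslessConversions {
  // Each conversion goes from a narrower numeric type to one that represents it exactly.
  given Conversion[Byte, Int]     = (b: Byte)  => b.toInt
  given Conversion[Int, Long]     = (i: Int)   => i.toLong
  given Conversion[Int, Double]   = (i: Int)   => i.toDouble
  given Conversion[Float, Double] = (f: Float) => f.toDouble
}

These would then be ordinary implicit conversions, resolved through the normal machinery rather than through a special weak-conformance rule.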

Then, look at it from the “principle of least surprise”. Let’s say the List, Array, etc. constructors were “builders” instead:

val foo = Array().add(1).add(1.0).add(1.0f).build()
// instead of
val foo = Array(1, 1.0, 1.0f)

Then, you’d expect the type of the result to be decided by the first thing you add. So I’d propose that the first argument to the constructor should decide the type, and further arguments would have to be either the same type, or be implicitly converted to the same type. If that can’t happen, you get a reasonable compile error that it’s the wrong type.
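As a rough sketch of that rule with today’s language (firstDecides is a hypothetical helper, not a proposed library method): splitting the parameter lists makes the type parameter get fixed by the first argument before the remaining arguments are checked.

def firstDecides[A](first: A)(rest: A*): List[A] = first :: rest.toList

val ds = firstDecides(1.0)(2, 3)   // A = Double; the Int literals adapt to Double
// val xs = firstDecides(1)(2.0)   // error: 2.0 does not conform to the expected type Int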

If the conversions were placed in predef, this would render most usages of literal-based collection construction source compatible (except in cases like Array(0, 1.0, 3.0), in which case they would have to change the first element of the constructor. Hashtag sorrynotsorry)

This would also probably simplify the typechecker considerably, especially if it were generally applied to varargs methods. I think it’s a reasonable constraint that I personally wouldn’t mind being applied to all of my code. In particular, users of -Xfatal-warnings -Ywarn-numeric-widen, which is a thing that lots of people (including myself) frequently cargo-cult from Rob Norris, wouldn’t be affected at all. Others would get a reasonable, easily-corrected (possibly even automatically correctable?) compile-time error.

1 Like

Right - the idea of syntactic literals that can then be rendered into one of a range of concrete numeric types is where I would start, if I weren’t starting from here. The Haskell numerical support works a lot like this, and is quite nice to work with. So, very roughly:

1 : given Numeric[N] => N
3.14 : given Floating[F] => F

Then the numeric typeclasses would have some “under the hood” magic to actually build the representation from the parsed constant, which would, I hope, be applied at compilation time and fully inlined away.
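A rough, runnable sketch of that direction (the FromIntLiteral typeclass and the explicit lit helper are hypothetical stand-ins for what the compiler would do automatically with the parsed constant):

trait FromIntLiteral[N] { def fromDigits(digits: String): N }

object FromIntLiteral {
  implicit val forInt: FromIntLiteral[Int]       = digits => digits.toInt
  implicit val forLong: FromIntLiteral[Long]     = digits => digits.toLong
  implicit val forBigInt: FromIntLiteral[BigInt] = digits => BigInt(digits)
}

// Stands in for the compiler elaborating an integer literal against the expected type.
def lit[N](digits: String)(implicit N: FromIntLiteral[N]): N = N.fromDigits(digits)

val a: Int    = lit("1")
val b: BigInt = lit("12345678901234567890")

Making this fully disappear at compile time would rely on the constant being statically known, which is exactly the “under the hood” step above.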

But perhaps we can’t get to this from where we are now.

1 Like

Sorry, but I think that’s the worst variant of all presented here. Array(a, b, c) is precisely not Array().add(a).add(b).add(c). With that approach we wouldn’t need any type inference at all.

  1. I am in support of dropping weak conformance.
  2. I am against giving literal expressions a voodoo power.

A static type system is a lightweight proof engine that checks that your program behaves reasonably, and prevents calling a method like .eat(banana) on an Int, for instance. It’s a lightweight system because it makes the assumption that all terms of a type behave in the same way with regard to type checking.

In a world where we can’t tell the type of the literal expression 1 for sure, how would one prove its soundness?

1 Like

Let me try to elaborate a bit. What’s being proposed here is to drop referential transparency for numeric literals. People use 1 + 1 as a safe example of a pure expression because it is expected that both 1 + 1 and 2 evaluate to the Int type.

scala> def head[A](xs: A*): A = xs.head
head: [A](xs: A*)A

scala> head(1 + 1, 5.0f)
res0: Float = 2.0

scala> head(2, 5.0f)
res1: Float = 2.0

In Scala 3, res0 will evaluate to AnyVal. Ideally neither should compile, because Int and Float are incompatible types (Int.MaxValue can’t be represented exactly as a Float), but the two expressions getting different types is a different level of bad, in my opinion.

2 Likes

I would love to disallow weak conformance. Actually, I was happy with the OCaml behavior back in the day, so you still have a fair margin of strictness (OK, other people call that “user unfriendliness”) before I get fed up.

Some caveats with the current proposal:

  • I love @propensive’s idea about intersection types, and I feel sorry it doesn’t seem to work. Is it certain it is not the correct solution for the long term? (I saw Proposal to drop Weak Conformance from the language and the corresponding answers)
  • if we go that way, I REALLY need a compiler flag to error on inference to AnyVal. I’m totally happy to be forced to write val foo: List[AnyVal] = List(1L, 'x', 4.3) if that’s really what I want - I don’t want that, and if I ever do, I really, really want everybody to be sure that it’s a conscious decision, not an inference problem.
  • regarding user friendliness, I believe val foo = List(1, 2.0, 3) // oops, List[Double] is really, really surprising.

All in all, I believe that the proposal makes things better (stricter).

From all of the choices in this discussion, I prefer the original SIP proposal.

It seems to remove a bit of complexity from the language, but still retains the standard use case that is likely to be helpful in the mainline case.

I also don’t understand what the issue is with widening the type to AnyVal. In particular I’m not convinced that this will cause significant issues in practice.

Inference to AnyVal, Any, AnyRef, or Product with Serializable is always an error in practice, and if it is not, the reason you need it should REALLY, REALLY be documented.

That proposal adds a new, fairly simple trigger toward inference to AnyVal, so the need for a compiler flag that fails compilation when inference lands on one of a configurable list of types (at least all of the above by default) is even bigger.

It can be a warning, but at least on my code base it is as big an error as a non-exhaustive pattern match (well, OK, that one is also a warning - can we make it an error, please?).

1 Like

No compiler can ever guarantee that the logic of a program is correct. Instead, I think of the Scala compiler as a tool that helps minimize the number of logic errors, but there is always a trade-off between simplicity (in the code and/or the compiler) and completeness.

My assumption is that, most of the time you end up with a List[AnyVal], the code is likely to fail compilation very quickly when that list is used and doesn’t match the expected type.

If ending up with AnyVal, Any, AnyRef, etc., is always a bad idea, then perhaps that could be handled generically (as you say, by always requiring an explicit type annotation). But this issue seems somewhat orthogonal to the weak conformance discussion.

I agree. If writing head(1 + 1, 5.0f) silently infers AnyVal, that should be caught at compile time.
I tried to write a Scalafix rule for this, but it should be out-of-the-box behavior.
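For what it’s worth, here’s a small self-contained sketch of one way to get a compile-time failure on inference to AnyVal at specific call sites today, using the ambiguous-implicit trick (NotAnyVal and strictList are hypothetical names; a compiler flag or Scalafix rule would obviously be more ergonomic, and under the new rules a mix like head(1 + 1, 5.0f) would be caught the same way):

trait NotAnyVal[A]

object NotAnyVal {
  implicit def ok[A]: NotAnyVal[A] = new NotAnyVal[A] {}
  // Two competing instances for exactly AnyVal make the implicit search ambiguous there.
  implicit def ambig1: NotAnyVal[AnyVal] = new NotAnyVal[AnyVal] {}
  implicit def ambig2: NotAnyVal[AnyVal] = new NotAnyVal[AnyVal] {}
}

def strictList[A](xs: A*)(implicit ev: NotAnyVal[A]): List[A] = xs.toList

val good = strictList(1, 2, 3)   // compiles: A = Int
// val bad = strictList(1, true) // rejected: A is inferred as AnyVal, so the implicit search is ambiguous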

But would the following code similarly throw up a warning/error? I.e. why is this problem restricted to literal values?

 val ints = List(1, 2, 3)
 val doubles = List(1.0, 2.0, 3.0)
 (ints ++ doubles).head

Welcome to Scala 2.12.4 (OpenJDK 64-Bit Server VM, Java 1.8.0_191).
Type in expressions for evaluation. Or try :help.

scala> Seq(1, 2, 3, 4.0, 5.0, 6.0)
res0: Seq[Double] = List(1.0, 2.0, 3.0, 4.0, 5.0, 6.0)

scala> Seq(1, 2, 3) ++ Seq(4.0, 5.0, 6.0)
res2: Seq[AnyVal] = List(1, 2, 3, 4.0, 5.0, 6.0)