Proposal to drop Weak Conformance from the language

Sorry, I meant “language features” not “language flags”. My bad.

Could this use case be satisfied with the following?

import scala.language.harmonize

No, language features in scala.language._ are not appropriate for this use case. Importing a language feature never changes the meaning of a valid program; it only determines which programs are valid in the first place. If such imports could change semantics, every combination of them would need testing, causing a combinatorial explosion of untested configurations.
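For illustration, scala.language.postfixOps shows the pattern: the import gates a piece of syntax, but does not alter what any already-valid program means.

import scala.language.postfixOps

// Without the import, the next line is flagged by the compiler (a feature
// warning or error, depending on the Scala version); with the import, it
// compiles. Its meaning is the same either way.
val n = List(1, 2, 3) size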

To provide historical context: in Scala 2.7, the numeric conversion was done by the Predef.int2double implicit conversion, as part of the standard library. This was fine, since the conversion applied to the whole Int type (not just literal expressions), but it only kicked in when the expected type was already Double. So List(1.0, 1 - 1) or if (true) 1.0 else (n: Int) would be typed as List[AnyVal], AnyVal, etc.
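A minimal sketch of that 2.7-era behavior (int2double has the same signature it still carries in Predef):

// implicit def int2double(x: Int): Double = x.toDouble
val d: Double = 1 - 1 // expected type is Double, so the conversion fires: 0.0
List(1.0, 1 - 1)      // no expected type for the elements: lub(Double, Int) = AnyVal,
                      // hence List[AnyVal]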

We could’ve simply accepted inference to AnyVal, but instead the Scala 2.8.0 designers mirrored the behavior of Java’s primitive widening conversions, and made a language spec change to introduce the notion of numeric widening:

If e has a primitive number type which weakly conforms to the expected type, it is widened to the expected type using one of the numeric conversion methods toShort, toChar, toInt, toLong, toFloat, toDouble defined here.
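Concretely, a small sketch of what numeric widening accepts from Scala 2.8 on (unlike the 2.7 conversion, it also applies when computing a common element type):

val d: Double = 1 + 2 // the Int expression 3 is widened via .toDouble to 3.0
List(1.0, 1 - 1)      // now List[Double] = List(1.0, 0.0), no longer List[AnyVal]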

This desire to avoid AnyVal is confirmed in the Dotty documentation:

The principal motivation behind weak conformance was to make an expression like this have type List[Double] :

List(1.0, math.sqrt(3.0), 0, -3.3) // : List[Double]

It’s “obvious” that this should be a List[Double]. However, without some special provision, the least upper bound of the list’s element types (Double, Double, Int, Double) would be AnyVal, hence the list expression would be given type List[AnyVal].

This too was fine, since the conversion applied to the whole Int type (not just literal expressions). But because a secondary, non-subtyping hierarchy was bolted onto the type system, there were some odd inconsistencies, like #2841 reported by Paul:

scala> List(1, 1L)
res0: List[Long] = List(1, 1)

scala> List(List(1), List(1L))
res1: List[List[AnyVal]] = List(List(1), List(1))

and a modified example from Jon:

scala> List(0, 1, 2.0)
res4: List[Double] = List(0.0, 1.0, 2.0)

scala> 0 :: 1 :: 2.0 :: Nil
res5: List[AnyVal] = List(0, 1, 2.0)

The line of argument I am making is “if we drop weak conformance, then we must absolutely not treat literal expressions in a special way.”

There is more than one way to handle List(1.0, 0), List(1.0, 1 - 1), etc.:

  • Let it evaluate to List[AnyVal], and explain to newcomers what AnyVal is.
  • Forbid inference to AnyVal, so that compilation fails.
  • Introduce a subtype hierarchy for the numeric types?

I’d also advocate for consistent treatment of 2 and 1234567890.

I would love it if someone can address my proposal to expand the concept of implicit conversion and drop Weak Conformance.

What you propose is basically to take available implicit conversions into account when harmonizing (i.e., finding a common type T for varargs, as in the examples for this proposal). I believe this is too dangerous. It’s also unclear what should happen if there is an implicit conversion from Foo to Bar and one from Bar to Foo. What if there are three types involved, with circular or partial conversions? Besides, we don’t really have a concrete motivating example for such power.
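The ambiguity is easy to make concrete. A sketch with made-up types (Foo, Bar, and both conversions are hypothetical):

import scala.language.implicitConversions

case class Foo(n: Int)
case class Bar(n: Int)
implicit def fooToBar(f: Foo): Bar = Bar(f.n)
implicit def barToFoo(b: Bar): Foo = Foo(b.n)

// If harmonization consulted these conversions, List(Foo(1), Bar(2)) could be
// either List[Foo] or List[Bar], with no principled way to choose.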

The whole discussion is a lot more controversial than I anticipated.

To re-iterate: We must let List(1, 2.0) have type List[Double] instead of List[AnyVal]. Anything else is an unnecessary hurdle for newcomers. This was confirmed by Michael Lewis, who as far as I know is the only full-time teacher in this conversation apart from myself. So, I will stay firm on that point.

Weak conformance achieves this goal, but at the price of relatively high complexity and some surprises, with conversions happening where people did not expect them. The new proposal drops weak conformance, which as far as I can see is uncontroversial. It replaces it with the minimal change that makes the use case I referred to compile, and nothing else.

That’s it, for me. I don’t think we should consider as part of this proposal sweeping changes like intersection types for numbers or AnyVal avoidance in inference. That’s simply out of scope.

3 Likes

For motivation, I would rather have the following code’s result values end up with the same type and value:

// assuming arg: Boolean
val result1: Double = arg match {
  case true => 1.0
  case false => 0
}
val result2 = arg match {
  case true => 1.0
  case false => 0
}

So it’s not a matter of Weak Conformance for me. I consider both to be Double, and via the same language feature: implicit conversion. I don’t like language special cases.

Clear rules can be provided for such ambiguity (throw an ambiguity error).

1 Like

Fair enough. Simplicity is a virtue.

Shouldn’t we then limit it only to explicit untyped literals, that is, 198751 but not final val magicNumber = 198751 or 198751: Int? The latter two can spring surprises, because they require thinking carefully about types.

The former is all that’s needed for List(1, 2.5) to have type List[Double].

1 Like

198751: Int already gives a List[AnyVal].

I am open to treating inline val x = 1; List(x, 2.0) differently than List(1, 2.0). That’s a judgment call. The current representation of compiler types makes it easier to treat the two the same, since both x and 1 have type 1. But one could also implement the other scheme.
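For readers unfamiliar with literal types, a small sketch of what “both x and 1 have type 1” means in current Dotty/Scala 3:

inline val x = 1 // x gets the literal type 1, not just Int
val a: 1 = x     // compiles: x is statically known to be exactly 1
val b: 1 = 1     // compiles: the bare literal has the same type
// so, internally, List(x, 2.0) and List(1, 2.0) start from identical element types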

1 Like

I think you get the best illusion of the type system deferring judgment about the type of 1 if Int literal adaptation applies only to 1 (not even 1: Int, just 1). The implementation might be adaptation of Int, but from an external perspective it isn’t observably different from deferring the choice of type until the context is determined.

If you allow inline val x: Int = 1 to adapt, then it seems that you can inspect x, verify that its type is Int, and then you have to wonder what in the world it means that in this case, but not with val y = 1, it somehow turns into a Byte. But if only a bare 1 can do it, you can just assume that the context of inspection assumes Int unless it has some reason to do something else. So you can’t observe the “true” type of 1, in some sense.

So I think this way the rule can stay approximately the same, but the user doesn’t have to think about lists of expressions and sets of primitive numeric types and whether a variable is inline val or val or var. Instead the rule is: if you see a whole number typed out, it’s whatever type makes sense there, with Int as the default if it’s not otherwise clear.
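To make the proposed distinction concrete, a sketch of what would and would not adapt (whether named constants should adapt is exactly the judgment call discussed above):

val a: Byte = 1        // bare literal: adapts to the expected type Byte
val b: Byte = (1: Int) // ascribed: this is an Int expression, so it is rejected
val y = 1
// val c: Byte = y     // under the rule proposed here, a named value would not adapt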

3 Likes

@odersky
I guess I can see how a newcomer might find it confusing that Scala treats 1 and 1.0 differently. But if that’s the use case, from your teaching experience, could we not make the rules truly minimal and restrict them to just harmonising Int literals to Double?

For everything else, the newcomer has learnt that they need the following (illustrated in the snippet after this list):

  • L/l in order to define longs
  • F/f in order to define single-precision floating-point numbers
  • toByte in order to define bytes
  • toShort in order to define shorts
  • single quotes, ', in order to define characters
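For reference, a quick illustration of that standard literal syntax:

val l = 1L        // Long
val f = 1.5F      // Float
val b = 1.toByte  // Byte
val s = 1.toShort // Short
val c = 'a'       // Char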

Dale

Even this is arguably too specific. The reason this trips up newcomers is that they’ve quite clearly entered a bunch of numbers, and yet the resulting list is not a list of numbers… so we force it to be numeric by conforming everything to whichever type is capable of representing all the numbers.

So how about just being direct - instead of saying List(1, 2.0) is a List[Double], say it’s a List[Number] - not by using java.lang.Number, but by introducing our own new Number type for which the various primitives are refinements.

e.g.

Number { type Underlying = Int }

It also opens up the possibility of including BigInt/BigDecimal, rationals, fixedpoint, unsigned, and maybe even Complex numbers into the scheme.
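A minimal sketch of what such a scheme might look like (all names here are hypothetical, not a worked-out design):

trait Number { type Underlying }

// The primitives would appear as refinements of the common Number type:
type IntNumber    = Number { type Underlying = Int }
type DoubleNumber = Number { type Underlying = Double }

// List(1, 2.0) would then be typed as List[Number] rather than List[AnyVal].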

1 Like

I don’t understand what kind of newcomer that would have to be, who, after finding that List(1, 2.0) does not give the desired result, would not immediately try List(1.0, 2.0) and succeed.

If someone does not see that simple solution, the only explanation I can find is that they do not properly understand that there are different types of numbers in Scala. Without that understanding, you cannot get very far (“Why is 1.0 not a proper List index? Why does 1/2 give 0?”), so I would consider that understanding an absolute prerequisite taught in the first class session, and not try to optimize the language for usage by people who don’t understand that an Int is not a Double.

The more I follow this discussion, the more I am convinced that List(1, 2.0) should just be List[AnyVal] and nothing else. The only valid argument in favor of List[Double] is convenience, certainly not beginner-friendliness. A beginner, once they understand that there are Int and Double with common supertype AnyVal, will find it easy to understand why it would be List[AnyVal], while it is much harder to understand why it would be List[Double], whether via the existing rule or the one proposed here.

There is nothing you could do with a Number that you cannot also do with an AnyVal, so it might as well be a List[AnyVal]. OK, except maybe calling .toDouble and friends, but definitely not arithmetic operations. So such a Number type would be totally useless.

2 Likes

Will it be possible to write

val a = Array[Double](5, 5L, 5.5)
val v: Double = 5L

?

This proposal only changes what type is inferred if none is specified. In those cases where a type is already specified, nothing changes.

And thanks, this reminds me: the other simple solution to make sure you get a List[Double] is, of course, to say so, i.e. List[Double](1, 2.0) instead of List(1, 2.0).

5 should be inferable as a Natural, an Int, or a Double. I think there are very good reasons for having Ints and Doubles as the default numbers, one of which is that Double is a superset of Int. However I think it’s scandalous that in 2019 Naturals are not yet part of standard, mainstream Scala: 31-bit Naturals, storable as 32-bit Ints, Naturals or Nats hence being a subset of Ints. I’d like the changes to be oriented towards future inclusion of Nats and other refinements of the Int and Double types.
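Today’s Scala cannot express Nat as a true subtype of Int, but a value-class wrapper gives the flavor of the idea. A hypothetical sketch (the Nat name and API are made up):

// A 31-bit Natural, stored unboxed in a 32-bit Int via a value class.
final class Nat private (val toInt: Int) extends AnyVal
object Nat {
  def apply(n: Int): Nat = {
    require(n >= 0, s"$n is not a natural number")
    new Nat(n)
  }
}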

A Char should most definitely not be inferable as an Int.

While it may be practical to see List(1, 2.0) as List[Double], not giving 1 :: 2.0 :: Nil the same type makes no sense. They are taught as interchangeable, but this example shows they obviously aren’t.

2 Likes

Option.fold in its documentation has:

This is equivalent to scala.Option map f getOrElse ifEmpty.

but that’s false. Example:

object Main extends App {
  (null: Option[String]).map(List(_)).getOrElse(Nil) // compiles: getOrElse infers B = List[String]
  (null: Option[String]).fold(Nil)(List(_))          // doesn't compile: Nil fixes B = List[Nothing],
                                                     // which List(_): List[String] does not match
}

I’m not saying it’s good (IMO it’s bad, Option.fold is still frequently useless), but the equivalence or interchangeability is already a lie sometimes.