Is there a reason why value class's backing class can't be 'cast' to the value class?

A simple case:

    class ValueClass[T](val x: T) extends AnyVal
    val ints = Seq(42)
    val values = ints.asInstanceOf[Seq[ValueClass[Int]]]
    val h = values.head

Obviously, this is plain wrong and throws a ClassCastException. However, it would be very beneficial, and would make value classes considerably more useful, if it worked. Aside from the collection case, where the ‘only’ issue is performance, there is also the case of implementing a generic interface with a value class as the type argument, which results in a bytecode clash between the erased ‘bridge’ method signature inherited from the base type and the actual implementation. It simply cannot be done, AFAIK.

At first glance, however, I can’t see why it couldn’t work: why couldn’t the compiler, when inserting the cast from the erased object to a value class, verify that the object is not an instance of ValueClass and, if so, treat it as the backing value of that class? There is an argument that in a generic context (i.e., with an erased type instead of Int here), this wouldn’t offer any type safety whatsoever. But this delayed casting error is no different from how generics work on the JVM anyway.
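For reference, a self-contained version of the failing snippet (Scala 3 top-level syntax; the failing access is wrapped in Try just to make the error visible):

```scala
import scala.util.Try

class ValueClass[T](val x: T) extends AnyVal

val ints = Seq(42)
// The cast itself succeeds: at runtime this is still just a Seq of boxed Integers.
val values = ints.asInstanceOf[Seq[ValueClass[Int]]]

// Reading an element back fails: the compiler inserts a checkcast to ValueClass,
// but the element is a java.lang.Integer, not a boxed ValueClass.
val attempt = Try { val h = values.head; h.x }

@main def demo(): Unit =
  println(attempt.isFailure) // true: the head access threw ClassCastException
```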

The consensus is against you. It is generally accepted that the fact that the above doesn’t work is a feature of value classes.

However, if this is what you want to happen, I have good news for you: you can use an opaque type alias instead. The behavior you ask for would work with opaque type aliases, although again, the consensus is that this is a weakness of opaque type aliases (a necessary trade-off to get their benefits).
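For illustration, a minimal Scala 3 sketch (the names are mine, not from the thread): an opaque type alias erases to its underlying type, so the cast that fails for a value class is a no-op here.

```scala
object Prices {
  opaque type Price = Int

  object Price {
    def apply(i: Int): Price = i
  }

  // Inside Prices the alias is transparent, so a Price can be read as an Int.
  extension (p: Price) def amount: Int = p
}

import Prices.*

val ints = Seq(42, 7)
// At runtime a Price IS an Int, so this cast involves no boxing and no checkcast.
val prices = ints.asInstanceOf[Seq[Price]]
val first  = prices.head.amount

@main def show(): Unit =
  println(first) // 42
```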


Thanks, yes, I knew about opaque types but they are Scala 3, which isn’t even here yet, and migration is going to be huge and slow…
It’s hard to argue with some enlightened consensus, but static type safety is there for when you want it, and generics are a much bigger hole than this would be. I can’t imagine a single scenario where this would be detrimental - a Seq[Int] still needs to be cast to Seq[Opaque], you can’t do it by mistake, and, as things stand, value classes basically can’t be put into collections.

Casting is always the programmer saying ‘yes, I know what I’m doing, I want to switch to manual now’ and - for me, a non-academic with deadlines - often a great feature. Apart from saving time in critical scenarios, it allows implementing things which wouldn’t be possible without it (or at least would be much less efficient). It’s built into Scala anyway - even pattern matching on higher-kinded types can’t infer the correct type parameters of the narrowed type, requiring a cast in what is a very common and much more dangerous idiom. And when all is said and done, interoperability with Java and the JVM, which are huge selling points, basically eliminates the possibility of complete type safety.
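As a hedged illustration of that last point (the types here are hypothetical, a classic GADT-style match): in Scala 2 the compiler often does not refine the type parameter inside a case, so the result has to be cast back by hand.

```scala
sealed trait Expr[T]
final case class IntLit(i: Int)    extends Expr[Int]
final case class Neg(e: Expr[Int]) extends Expr[Int]

// In Scala 2 the match does not reliably prove T = Int inside the cases,
// so each branch casts its result back to T. (Scala 3's GADT inference
// makes the casts unnecessary, but they remain harmless.)
def eval[T](expr: Expr[T]): T = expr match {
  case IntLit(i) => i.asInstanceOf[T]
  case Neg(e)    => (-eval(e)).asInstanceOf[T]
}

val result = eval(Neg(IntLit(21)))

@main def evalDemo(): Unit =
  println(result) // -21
```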

In other scenarios this already works (Value here is an ordinary reference class - it cannot extend a value class, which is implicitly final):

    class Base[T](val t: T)

    class Value[T](val t: T)

    class Sub[T](v: Value[T]) extends Base[Value[T]](v)

    val subs = Seq(new Sub(new Value(42)))
    val bases = subs.asInstanceOf[Seq[Base[Int]]]
    bases.head // works

This means that an HMap can be cast to an HList of its values, a great and useful hack.
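A runnable sketch of the point above, under the assumption that Value was meant as an ordinary (reference) wrapper class: the element cast succeeds, and the type error only surfaces later, exactly like ordinary generics.

```scala
import scala.util.Try

class Base[T](val t: T)
class Value[T](val t: T)                            // ordinary wrapper, not a value class
class Sub[T](v: Value[T]) extends Base[Value[T]](v)

val subs  = Seq(new Sub(new Value(42)))
val bases = subs.asInstanceOf[Seq[Base[Int]]]

// head succeeds: the inserted checkcast is against erased Base, which Sub satisfies.
val first = bases.head

// The delayed error is still there, though: t is statically Int but actually a
// Value[Int], so unboxing it as an Int throws ClassCastException at the use site.
val unboxed = Try(first.t + 1)

@main def erasedCastDemo(): Unit =
  println(unboxed.isFailure) // true
```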

Saying that this behavior of opaque types was a necessary compromise for their benefits, while opposing the same behavior in what is basically their Scala 2 equivalent, both in features and in main use case, is simply illogical to me.

You realize that by saying that, you’re implying that I’m an academic without consideration for people’s deadlines, and that my answers must therefore come from some ivory-tower nonsense, don’t you? That’s not very nice, and certainly not a good way to make me, or others, change my mind.

Accepting the underlying types as values of their value classes during the downcasts added by the compiler in erasure (which is the mechanism you propose in your first message) must mean that they are also accepted during type tests. That means that

    class MyValueClass(val underlying: Int) extends AnyVal

    def foo(x: Any): String = x match {
      case x: MyValueClass => "it's the value class"
      case x: Int          => "it's an int"
    }
will match all incoming Ints in the first case, whereas currently, they are not, and are instead matched by the second case. I consider that detrimental.
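For reference, a self-contained check of the current behavior (with a default case added so the match is total over Any):

```scala
class MyValueClass(val underlying: Int) extends AnyVal

def foo(x: Any): String = x match {
  case _: MyValueClass => "it's the value class"
  case _: Int          => "it's an int"
  case _               => "something else"
}

// Today an Int stays an Int under the type test...
val intCase = foo(42)
// ...while a value class passed as Any is boxed, and the box matches its own class.
val boxedCase = foo(new MyValueClass(42))

@main def matchDemo(): Unit = {
  println(intCase)   // it's an int
  println(boxedCase) // it's the value class
}
```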


First, I am sorry, I didn’t mean to offend and certainly not to be patronizing or dismissive. What I meant is that, for one, casting is cutting corners, and science, especially in libraries and APIs, is much less tolerant of that; ‘it will be ready when it’s ready’ is much more acceptable and common in operating systems, computer games, etc. than in commercial projects. For another, dedication to the purity of functional languages is much more of a thing in those circles than in the industry, to which Scala’s success, I think, attests. I might have been put on edge, and made slightly confrontational, by the argument ‘the consensus is against you’, which carries much less weight in computer science than in, say, history.

Second, I did come here with a question and the full understanding that even a fair knowledge of the language cannot compare to the familiarity of the programmers of the compiler itself. I was interested in whether it had been considered, and what potential consequences that I cannot foresee it could have - like the one you gave here. I am still not entirely convinced that pattern matching must follow the quite singular case of ‘jumping out of hyperspeed’ (erasure), but I am much more willing to take your word for it, and I can see it would result in some inconsistency, perhaps violating the principle of least surprise. I could also believe that it could cause more problems and unclear semantics when the value class is ‘more deeply nested’ in a type; also, even a quick assessment shows that any non-covariant use could be problematic or impossible. This was the kind of response I was hoping for, assuming the matter had actually been considered and discussed.

Well, a good thing then that Scala 3 gives us choice. It will certainly sweeten the pain of taking some of my toys away.