I like sealed hierarchies because they prevent me from making mistakes when refactoring, not just because I can’t remember the names the first time around.
Interesting - what do sealed hierarchies give you that Scala 3.x enums don’t? They both do exhaustivity checking, right?
If you want it not to appear in autocomplete by default, the generated method could be `private[scala]`, and there could be extension methods that could be imported.
This would also resolve the issue of having both lower-level and safer versions without generating twice as much code. The extension methods are defined once and could include as many varieties as desired. Most people shouldn’t be working with ordinals anyway and can therefore simply not add the import.
Yes, as far as I know. I’m just saying that autocomplete is not a replacement for exhaustivity. Autocomplete saves you a little time while coding; exhaustivity checking lets you refactor with confidence that sneaky bugs are impossible.
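Concretely, exhaustivity checking means the compiler flags any match that misses a case - a minimal sketch, using an illustrative `Light` enum:

```scala
enum Light:
  case Red, Green

def describe(l: Light): String = l match
  case Light.Red   => "stop"
  case Light.Green => "go"
  // If a refactor added a new case (say Amber) to Light, this match would
  // become non-exhaustive and the compiler would warn right here.

@main def run(): Unit =
  assert(describe(Light.Red) == "stop")
  assert(describe(Light.Green) == "go")
```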
Yes, I agree completely. My point about autocomplete is just that if we add unsafe methods to the public API of every enum, people will use them without much care, because their IDE will suggest them - especially if they have an innocuous name. I hope that if we add such a method, it will be `unsafeFromOrdinal` or `accursedUnutterableNotConsistentUnderRefactoringFromOrdinal`. I also like @nafg’s suggestion.
Re: `fromOrdinal`
My apologies if this has been broached already. There is an implied type conversion from `Int` to `[0..N)`. The argument type of `fromOrdinal` should be `[0..N)`, not `Int`.
hell no! Nulls should not appear in any APIs unless explicitly mentioned in the name (such as `fromOrdinalOrNull`), in my opinion.
I see a lot of value in having a `fromOrdinal` that throws (for working efficiently due to the lack of `Option` wrapping), as long as there’s a `fromOrdinalOption` that returns an `Option`. That way we still provide a way for users to specify how that “missing int” is to be handled (instead of relying on the exception being thrown).
About the argument type being `[0..N)` as suggested by @gabrieljones - this makes it impossible to call when all you have is an `Int`. Users would be forced to do a type check, which could be built into the method we’re discussing.
Edit: By the above, I mean it shouldn’t be the only way to construct a value from some kind of `Int`. But I’d gladly welcome having both that (a strongly typed function that takes `[0..N)` and never throws or wraps into `Option`) and the more dynamic ones (`Int => Option[MyEnum]`, `Int => MyEnum`).
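Here is a minimal sketch of the two flavours under discussion, using an illustrative `Color` enum: the compiler-generated `Color.fromOrdinal` throws on bad input, while `fromOrdinalOption` is a hypothetical total wrapper of the kind proposed above:

```scala
enum Color:
  case Red, Green, Blue

// Hypothetical safe variant: total, but allocates an Option on every call.
def fromOrdinalOption(i: Int): Option[Color] =
  if i >= 0 && i < Color.values.length then Some(Color.fromOrdinal(i))
  else None

@main def run(): Unit =
  assert(Color.fromOrdinal(2) == Color.Blue) // throws on out-of-range input
  assert(fromOrdinalOption(1).contains(Color.Green))
  assert(fromOrdinalOption(99).isEmpty)
```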
Hey Jakub,
What’s your take on Finalising Enumerations for Scala 3 that Rob pointed out, and my expansion on this in Finalising Enumerations for Scala 3?
I feel it is the more central issue. There should be agreement on the circumstances under which it is correct at all to generate `fromOrdinal` and `toOrdinal`, and only after that should we discuss what their type signatures should be.
I think none of these methods (`fromOrdinal`, `toOrdinal`, `values`) should be generated when there are parameterized cases.
IMO: An enum is either an enumeration (equivalent to a sealed trait with just case objects implementing it) or something else.
I’ll skip over the fact that I find “enum” confusing when cases can be parameterized, because I don’t see this syntax changing at this point.
We agree in spirit.
- Ideally we wouldn’t even be having this discussion, we’d just have `fromOrdinal` (which throws) and `fromOrdinalOption`, but the problem is Martin keeps rejecting all this stuff for some reason, so now we need to think of… compromises.
- Don’t forget that we’re going to have safe null support in Scala 3, so the null-returning method would be `fromOrdinal(i: Int): E | Null`, with the null safely included in the type system, which IIUC would require some kind of null check before one could get to the `E`.
@japgolly - It is absolutely not ideal to have `fromOrdinal` which throws and `fromOrdinalOption` which doesn’t. You still have no high-performance method that works in the presence of errors!
Fair enough. I’m admittedly not giving this proper thought. I’ll shoosh.
On the bright side, if we didn’t, we probably wouldn’t have noticed that `ordinal` isn’t stable across refactors.
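The instability is easy to see: inserting a case shifts every subsequent ordinal. A sketch with illustrative before/after versions of the same enum:

```scala
enum StatusV1:   // before the refactor
  case Active, Closed

enum StatusV2:   // after the refactor adds Pending at the front
  case Pending, Active, Closed

@main def run(): Unit =
  assert(StatusV1.Active.ordinal == 0)
  assert(StatusV2.Active.ordinal == 1) // a persisted ordinal 0 now decodes to Pending
```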
My original resistance against `fromOrdinalOption` was that I assumed it’s trivial to check whether an index is in range. But that’s only true for simple enumerations. For general ADTs, some of the slots will not be defined.
So our choices seem to be the following:
- Don’t define ordinal numbers for ADTs. Not an option without a lot of changes and probably code bloat. Serialization methods already use `fromOrdinal` for simple cases of ADTs.
- Add a pre-check, i.e. `hasOrdinal(n: Int): Boolean`. This could be very efficient. Internally, the implementation maintains a `values` array that associates each ordinal with its enum value, so the test would amount to a bounds check and a null test.
- Add a `fromOrdinalOption` method. That one causes runtime overhead. I assume that `fromOrdinal` would typically be called in high-performance code, so going through an `Option` can be problematic. On the other hand, in most cases we might know that the ordinal exists, so the simple, throwing `fromOrdinal` would work. So maybe it’s not an issue.
- Let `fromOrdinal` return `null` in case of a bad input. That one I would veto. It’s really the same situation as `fromOrdinal` returning an option, only less safe but faster. I have already argued at length why I think that is the wrong model.
- Add to an enum companion `E` a method `def fromOrdinalOr[T](x: T): E | T`. That’s similar to `fromOrdinalOption`, but it does not need to box. Performance-sensitive low-level code could call it as `fromOrdinalOr(null)`, which would give the desired behavior while being explicit that `null` can be returned.
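The last option can be sketched as a standalone function over an illustrative enum (the actual proposal would generate it on the companion):

```scala
enum Color:
  case Red, Green, Blue

// Sketch of the proposed fromOrdinalOr: returns the fallback unboxed
// instead of wrapping the success case in an Option.
def fromOrdinalOr[T](x: T)(i: Int): Color | T =
  val vs = Color.values
  if i >= 0 && i < vs.length then vs(i) else x

@main def run(): Unit =
  assert(fromOrdinalOr(null)(1) == Color.Green)
  assert(fromOrdinalOr(null)(7) == null) // explicit that null can come back
```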
Overall, I think `fromOrdinalOption` + `fromOrdinal` is fine, if we can convince ourselves that high-performance code can always call `fromOrdinal` (@Ichoran WDYT?). If not, then either prevalidation with `hasOrdinal` or `fromOrdinalOr` could also be considered.
What is the use case of the ordinal methods?
Is it only for serialization purposes? I thought the consensus by now was that ordinals are suboptimal for serialization.
Is it for type class derivation? Does the derivation framework even require `fromOrdinal`? Is it necessary to expose that stuff to the user outside of the framework?
It’s true that if we don’t actually intend to use `fromOrdinal` for serialization, then perhaps we shouldn’t define it at all… (nor `ordinal` – we’d still get it for `java.lang.Enum`s by inheritance, but that’s fine). And instead we should have `fromLabelName`.
The type class derivation framework, as it stands, overlaps with the design of enum desugaring: the `Mirror` framework generates a `def ordinal(e: E): Int` method, where `E` is a sealed class, for fetching the offset of a case from the first sealed child of `E`. This is generated for any sealed class type `E` whose sealed children are all contained within its companion object `E` (anything that looks like the enum desugaring).
For traditional sealed case class hierarchies, that is implemented as a pattern match; for enums it is optimised to call the `def ordinal: Int` instance method on an enum case, which is also generated for parameterised enum cases, like `case Some[T](t: T)`. If we get rid of `ordinal` as an instance method on enum cases, then `ordinal` on the companion can be deoptimised back to a pattern match.
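The `Mirror`-generated `ordinal` described above can be observed on an ordinary sealed hierarchy (names here are illustrative):

```scala
import scala.deriving.Mirror

sealed trait Shape
case class Circle(r: Double) extends Shape
case object Square extends Shape

@main def run(): Unit =
  val m = summon[Mirror.SumOf[Shape]]
  assert(m.ordinal(Circle(1.0)) == 0) // offset among Shape's sealed children
  assert(m.ordinal(Square) == 1)
```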
It may make more sense for the mirror framework to rename `Sum.ordinal` to `caseOffset` or similar to avoid collisions, then rename `ordinal` in `scala.reflect.Enum` to the same.
This would make the path clearer for using the declared case label as the canonical value for serialisation which can survive reordering.
@odersky - I don’t see how high-performance code can always know that the argument it’s passing to `fromOrdinal` is certainly in range. Therefore, it would have to use `fromOrdinalOption` and lose performance (or catch exceptions and lose even more performance, unless out-of-range inputs are very rare).
It therefore seems to me that either `hasOrdinal` or `fromOrdinalOr` is necessary for a solution that is both robust and high performance.
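The pre-check alternative can be sketched with `hasOrdinal` written by hand over an illustrative enum (in the proposal it would be generated):

```scala
enum Color:
  case Red, Green, Blue

// Hypothetical pre-check: for a simple enumeration this reduces to a
// bounds check against the generated values array.
def hasOrdinal(n: Int): Boolean =
  n >= 0 && n < Color.values.length

// Callers that need raw speed can branch on hasOrdinal and call the
// throwing fromOrdinal directly; decode just demonstrates the pattern.
def decode(n: Int): Option[Color] =
  if hasOrdinal(n) then Some(Color.fromOrdinal(n)) else None

@main def run(): Unit =
  assert(hasOrdinal(2) && !hasOrdinal(3))
  assert(decode(0).contains(Color.Red))
  assert(decode(-1).isEmpty)
```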
We will be making lookup by case label always public as well.
Hello. Will Scala enums have a way to require an adopting type to be a sum type providing a certain set of cases? I came across this idea when I wanted to implement suspendable computations with finite state machines. I imagined it as something like a base enum with two cases, `begin` and `end`, that would be extended like so: `enum MyFSM extends AnyFSM`, where `AnyFSM` has at least two cases. Will this be possible in Scala?