Defaults are a little bit more complicated than many posts here imply.
They are intertwined with the uninitialized state.
Note that assignment to underscore results in the same value as an uninitialized val.
Critically, the contents of a fresh array are analogous to printing an uninitialized val of those types, or a var assigned to underscore. Any plan to allow defaults that differ from the uninitialized values is going to be complicated by arrays.
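A minimal illustration of that point, written in Java as a stand-in for the same JVM behavior the Scala underscore initializer exposes: uninitialized fields and freshly allocated arrays both hold the zero/null defaults.

```java
// Java stand-in for the Scala `var x: Int = _` behavior: JVM fields and
// fresh arrays both start out as all-zero bits (0 for numerics, null for
// references).
class Defaults {
    static int uninitInt;    // defaults to 0
    static Object uninitRef; // defaults to null

    public static void main(String[] args) {
        int[] fresh = new int[3];      // a fresh array is all zeros...
        String[] refs = new String[2]; // ...and all nulls for references
        System.out.println(uninitInt); // 0
        System.out.println(uninitRef); // null
        System.out.println(fresh[0]);  // 0
        System.out.println(refs[0]);   // null
    }
}
```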
Valhalla (last I checked; I have a month of mailing list to catch up on, though) decided that value types will not have custom defaults, for a variety of reasons. The default value will be the same as the uninitialized value (in line with ‘codes like a class, works like an int’), meaning numeric members will be 0 and references null. A class can write its accessors to interpret that differently.
A Date class might be encoded as a byte for the day, a byte for the month, and an int for the year – but if the desired default is January 1, 1970, the accessors would have to add 1, 1, and 1970 respectively.
Any other choice would make the array allocation path quite a bit slower, and class allocation as well, since there is a desire for the uninitialized state and the default to be the same. Getting all 0’s back from the allocator is cheap, as it clears memory in bulk; writing arbitrary user-defined bit patterns into fresh arrays or objects (or bits on the stack) is not going to be fast.
There was still some debate on the topic as of a few weeks ago, but the relationships between user-defined defaults, uninitialized values, and array allocation are the key things to take note of.