Context Bound Return Type

Hi there,

I have been dealing with a problem lately for which I have not been able to find a satisfying solution, and I would greatly appreciate your input.

Let’s say we have a trait and some implicit values:

trait SupportedType[T]

object SupportedType {
    implicit val floatSupportedType: SupportedType[Float] = new SupportedType[Float] {}
    implicit val doubleSupportedType: SupportedType[Double] = new SupportedType[Double] {}
}

And we have a class whose constructor uses a context bound with this trait:

sealed abstract class DataType[T: SupportedType] (...) { val cValue: Int }

object FloatDataType extends DataType[Float] { override val cValue: Int = 1 }
object DoubleDataType extends DataType[Double] { override val cValue: Int = 2 }

Now comes the problem. We want to convert an integer (say, one coming from a native library) representing each of these data types to the corresponding data type. So we have something like this:

def fromCValue(cValue: Int): DataType[_] = cValue match {
    case FloatDataType.cValue => FloatDataType
    case DoubleDataType.cValue => DoubleDataType
}

The problem is that with this signature the compiler does not know that the type parameter of the returned data type satisfies the SupportedType context bound, so I cannot use the functions I define for supported types. Given that the constructor of DataType has the context bound, is there any way of specifying that the return type of fromCValue will also satisfy it?
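To make the failure concrete, here is a minimal sketch (the consumer `process` is a made-up name for illustration) of what compiles and what does not:

```scala
trait SupportedType[T]
object SupportedType {
  implicit val floatSupportedType: SupportedType[Float] = new SupportedType[Float] {}
}
import SupportedType._

// A hypothetical consumer requiring the context bound:
def process[T: SupportedType](value: T): T = value

// With a concrete type, the evidence is resolved implicitly:
val ok = process(1.5f)

// But a DataType[_] result erases the type parameter, so there is no way
// to summon a SupportedType[_] for it; a call analogous to
//   process(elementOf(fromCValue(1)))
// fails to compile, which is exactly the problem described above.
```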

I am aware that context bounds work by adding an extra implicit evidence argument and thus can only be used with functions (and class constructors which are functions), but is there any solution to this problem that maybe does not involve context bounds?

Thank you,
Anthony

P.S. As a point of reference, for whoever may be interested, this is related to my other post, “Dotty-style Union Types in Scala”, and is still an attempt to resolve issues surrounding that idea, for the same application.

Hello,

What’s wrong with simply using inheritance?

trait DataType[T]

sealed trait SupportedDataType[T] extends DataType[T] { val cValue: Int }
object FloatDataType extends SupportedDataType[Float] { override val cValue: Int = 1 }
object DoubleDataType extends SupportedDataType[Double] { override val cValue: Int = 2 }

def fromCValue(cValue: Int): SupportedDataType[_] = cValue match {
  case FloatDataType.cValue => FloatDataType
  case DoubleDataType.cValue => DoubleDataType
}

Best, Oliver

In my code above, I define two versions of DataType: one for Float and one for Double. SupportedType[T] also has some methods defined in it (e.g., cast[S: SupportedType](dataType: DataType[S]): T and +[S: SupportedType](that: S)), and I define an implicit class SupportedTypeOps[T](val value: T) extends AnyVal that lets me enrich the functionality of Float, Double, and any other supported types I might have. Inheritance would require wrapping the primitive types, as discussed in “Dotty-style Union Types in Scala”.
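A minimal sketch of that enrichment pattern, assuming a made-up `plus` operation and a `|+|` alias (the original extends AnyVal; that is omitted here only so the sketch runs as a plain script, since value classes must be top-level or members of a statically accessible object):

```scala
trait SupportedType[T] {
  def plus(a: T, b: T): T  // stand-in for the real trait's methods
}

object SupportedType {
  implicit val floatSupportedType: SupportedType[Float] =
    new SupportedType[Float] { def plus(a: Float, b: Float): Float = a + b }
}

// Enriches Float, Double, and any other supported type without wrapping
// them in a class hierarchy:
implicit class SupportedTypeOps[T](val value: T) {
  def |+|(that: T)(implicit ev: SupportedType[T]): T = ev.plus(value, that)
}

val sum = 1.5f |+| 2.5f  // the SupportedType[Float] instance is found implicitly
```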

For an additional example that might help, let’s say we have another method in the DataType class:

sealed abstract class DataType[T: SupportedType] (...) {
    val cValue: Int
    
    def getElementFromBuffer(buffer: ByteBuffer): T
}

Then I want to be able to use cast and + on the returned value of getElementFromBuffer.

Hello,

Value classes can extend universal traits. Can’t you make your traits universal?

 Best, Oliver

What about using @specialized, like the Spire library does?

These answers still don’t cover the use case I am describing. Let me describe something that’s close to a solution but not just there yet. Let’s say our DataType is defined as a trait:

sealed trait DataType {
    type ScalaType
    implicit val supportedScalaType: SupportedType[ScalaType]
    
    def getElementFromBuffer(buffer: ByteBuffer): ScalaType
}

object FloatDataType extends DataType {
    override type ScalaType = Float
    override implicit val supportedScalaType = SupportedType.floatSupportedType
    ...
}

object DoubleDataType extends DataType {
    override type ScalaType = Double
    override implicit val supportedScalaType = SupportedType.doubleSupportedType
    ...
}

and SupportedType is defined as earlier:

trait SupportedType[T]

object SupportedType {
    implicit val floatSupportedType: SupportedType[Float] = new SupportedType[Float] {}
    implicit val doubleSupportedType: SupportedType[Double] = new SupportedType[Double] {}
}

Now it is obvious that for all possible data types the required implicit exists (thanks to the sealed keyword). However, whenever I call dataType.getElementFromBuffer for some dataType, the returned type is DataType#ScalaType, and the compiler cannot find a SupportedType implicit for that type. I have to explicitly write import dataType.supportedScalaType in order to be able to use the result as a ScalaType.
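A self-contained sketch of this situation (buffer plumbing simplified; `needsEvidence` is a made-up stand-in for the methods that require the bound):

```scala
import java.nio.ByteBuffer

trait SupportedType[T]
object SupportedType {
  implicit val floatSupportedType: SupportedType[Float] = new SupportedType[Float] {}
}

sealed trait DataType {
  type ScalaType
  implicit val supportedScalaType: SupportedType[ScalaType]
  def getElementFromBuffer(buffer: ByteBuffer): ScalaType
}

object FloatDataType extends DataType {
  override type ScalaType = Float
  override implicit val supportedScalaType: SupportedType[Float] =
    SupportedType.floatSupportedType
  override def getElementFromBuffer(buffer: ByteBuffer): Float = buffer.getFloat(0)
}

def needsEvidence[T: SupportedType](t: T): T = t

val dataType: DataType = FloatDataType
val buffer = ByteBuffer.allocate(4).putFloat(0, 1.5f)

// Without this import the call below fails to compile: the compiler cannot
// find a SupportedType[dataType.ScalaType] on its own.
import dataType.supportedScalaType
val element = needsEvidence(dataType.getElementFromBuffer(buffer))
```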

So, is there a way to provide an implicit for all possible values of DataType#ScalaType, given that DataType is sealed, and make the compiler aware of it?

Thank you!

In the general case if you want to prove a constraint as part of a return type you have to return the proof (i.e., the instance). Here you also have an existential so you could return something like this, where you don’t know A but you do know that whatever it is, it has a matching SupportedType instance that can be passed on down the line.

def fromCValue(cValue: Int): (DataType[A], SupportedType[A]) forSome { type A } = …

Note that in order to maintain the knowledge that it’s the same [unknown] A for both values you will have to pattern-match on the returned pair. Tuple projections (._1 etc.) won’t work.
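A sketch of that consumption pattern (Scala 2 syntax; `use` and `fromSomewhere` are invented names for illustration):

```scala
trait SupportedType[T]
trait DataType[T]

def use[A](dt: DataType[A])(implicit ev: SupportedType[A]): String = "ok"

def fromSomewhere(): (DataType[A], SupportedType[A]) forSome { type A } =
  (new DataType[Float] {}, new SupportedType[Float] {})

// Pattern matching binds one skolem type shared by both components,
// so the evidence lines up with the data type:
val result = fromSomewhere() match {
  case (dt, st) => use(dt)(st)
}

// By contrast, ._1 and ._2 would each get an unrelated wildcard type,
// so use(pair._1)(pair._2) would not compile.
```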

Or you could just add val supportedType: SupportedType[A] as a field on DataType which lets you get a handle on it directly, since you’re closing over it anyway. That’s probably easier.

My current solution, mentioned in my previous comment on this post, resembles your proposal of adding val supportedType: SupportedType[A] as a field on DataType. The problem in this case is that when getElementFromBuffer is called, the returned type is dataType.ScalaType, for that particular dataType instance. Even though I made sure all data type instances have an implicit value for SupportedType internally, there is currently no way to expose that implicit to the caller of getElementFromBuffer. Is there a way to do that? My previous comment provides a more detailed explanation of this problem.

I may misunderstand. You would need to pass the instance explicitly or re-introduce it to implicit scope.

// A method abstracting over SupportedType
def someMethod[A: SupportedType](a: A) = 42

// Given a datatype with existential ScalaType …
val x: DataType = ...

// Pass explicitly
someMethod(x.getElementFromBuffer(null))(x.supportedScalaType)

// or reintroduce to implicit scope
implicit val proof = x.supportedScalaType
someMethod(x.getElementFromBuffer(null))

That is correct and it is what I am currently doing! I’m just wondering if there is a way to avoid reintroducing the implicit or passing it explicitly, since we know it must exist (meaning that we know there exists a SupportedType[T] for T being the return type of getElementFromBuffer). I was wondering if it may be doable with macros but I haven’t found a way to do it and I’m new to scalameta.

@eaplatanios It looks like you’re trying to write a Scala wrapper for TensorFlow. If so, I’d be happy to join forces and help with this question and more. I’m on a plane now, but feel free to direct message me, @refried on gitter.im. We should be able to get a long way without macros, apart from what’s already in shapeless.

That is correct and it is what I am currently doing! I’m just
wondering if there is a way to avoid reintroducing the implicit or
passing it explicitly, since we know it must exist (meaning that we
know there exists a SupportedType[T] for T being the return type of
getElementFromBuffer).

I think your underlying problem is that your dataType must always be supplied as an existence proof. You can always write /types/ like DataType { type ScalaType = ClassLoader }, even if you have never allowed a DataType of that type to exist. So unless you exhaustively define SupportedType, which would be unwise, you cannot conclude that a SupportedType exists for every ScalaType; it only holds for every ScalaType associated with an inhabited DataType subtype. (NB: there exists no SupportedType[DataType#ScalaType].)

To illustrate, here is a method that extracts the SupportedType from a data type by means of the existence proof (i.e., the argument dt):

import SupportedType._

def universalSupportedType(dt: DataType): SupportedType[dt.ScalaType] = {
  def go[ST](dt: DataType { type ScalaType = ST }): SupportedType[ST] =
    dt match {
      case FloatDataType  => floatSupportedType
      case DoubleDataType => doubleSupportedType
    }
  go(dt)
}

Mind your path stability, or lift your type members.

(tested Scala 2.12.2)
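Putting that together as a self-contained sketch (the DataType hierarchy is reduced to its type members for brevity):

```scala
trait SupportedType[T]
object SupportedType {
  implicit val floatSupportedType: SupportedType[Float] = new SupportedType[Float] {}
  implicit val doubleSupportedType: SupportedType[Double] = new SupportedType[Double] {}
}
import SupportedType._

sealed trait DataType { type ScalaType }
object FloatDataType extends DataType { override type ScalaType = Float }
object DoubleDataType extends DataType { override type ScalaType = Double }

def universalSupportedType(dt: DataType): SupportedType[dt.ScalaType] = {
  def go[ST](dt: DataType { type ScalaType = ST }): SupportedType[ST] =
    dt match {
      case FloatDataType  => floatSupportedType
      case DoubleDataType => doubleSupportedType
    }
  go(dt)
}

// The recovered instance can then be re-introduced to implicit scope:
val x: DataType = FloatDataType
implicit val proof: SupportedType[x.ScalaType] = universalSupportedType(x)
```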


Hello Anthony,

It might help if you could state your problem not just in terms of how
you want the code to look, but more importantly, in terms of what useful
tasks it should accomplish.

While my understanding is very limited regarding what you want (or of
all the things that can be done with implicit arguments), I think what
you want is simply impossible. Or perhaps it is possible using macros, but
only with a lot of effort and little benefit.

Maybe I misunderstand you, but the compiler’s handling of implicit
arguments is not very smart. Rather, it is fairly simple, but designed so
that programmers can do clever things with it. The Scala compiler is not
going to remove method arguments, be they explicit or implicit, just
because it is somehow obvious from the code what those arguments will be.
Likewise, if you have an argument of type A but can only ever call the
method with arguments of type B, the compiler will not switch the argument
type from A to B.

Implicit arguments make the most sense in cases where there are instances
that are not known in advance. In your case, however, it seems you already
know all the instances, so I don’t know why you need them at all.

 Best, Oliver