Pre-SIP: adapters to simplify testing with closed classes

Consider an example:

// class you have no access to (e.g. some java class)
final class Foo() {
  def foo(x: Int): Unit = ()
}

// write an adapter
trait FooAdapter:
  def foo(x: Int): Unit

object FooAdapter:
  def apply(impl: Foo): FooAdapter = new FooAdapter { export impl.* }

The problem here is that you have to write the interface by hand, which creates a lot of boilerplate for huge classes, and there is no solution for this yet.
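For illustration, here is a minimal sketch of how such an adapter helps in practice (the names Service, FooStub, and adapterDemo are hypothetical; the trait is repeated so the snippet is self-contained, with foo taking an Int to mirror Foo):

```scala
// Hypothetical adapter trait, repeated here so the sketch compiles on
// its own; it mirrors the public interface of the closed class Foo.
trait FooAdapter:
  def foo(x: Int): Unit

// A layer that depends on Foo only through the adapter.
class Service(adapter: FooAdapter):
  def run(): Unit = adapter.foo(42)

// In a test, stub the adapter instead of mocking the final class.
class FooStub extends FooAdapter:
  var lastArg: Option[Int] = None
  def foo(x: Int): Unit = lastArg = Some(x)

@main def adapterDemo(): Unit =
  val stub = FooStub()
  Service(stub).run()
  println(s"lastArg=${stub.lastArg}")
```

The layer above the closed class becomes testable without touching Foo itself.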

I propose to create a language feature.

Consider a keyword adapts that generates the code above.

trait FooAdapter adapts Foo

I guess an alternative solution here is a macro annotation, but there is no way to create a trait or abstract class with the Quotes API.

@adapter
trait FooAdapter:
  type Adapted = Foo

@experimental
class adapter extends MacroAnnotation {
  override def transform(using quotes: Quotes)(
    tree: quotes.reflect.Definition
  ): List[quotes.reflect.Definition] =
    import quotes.reflect.*
    tree match
      case cls @ ClassDef(name, constr, parents, self, stmts) =>
        val tpt = stmts
          .collectFirst { case TypeDef("Adapted", tpt: TypeTree) => tpt }
          .getOrElse(report.errorAndAbort("@adapter requires an Adapted type member"))
        
        val methodSymbols = 
          (tpt.tpe.typeSymbol.methodMembers.toSet -- TypeRepr.of[Object].typeSymbol.methodMembers).toList
          
        // Bind the new class symbol first so the method DefDefs can refer
        // to the symbols declared inside it (not to the adapted class's
        // symbols, which have the wrong owner).
        val clsSym = Symbol.newClass(
          parent = Symbol.spliceOwner,
          name = name,
          parents = List(TypeRepr.of[Object]),
          decls = classSym =>
            methodSymbols.map { sym =>
              Symbol.newMethod(
                parent = classSym,
                name = sym.name,
                tpe = sym.info
              )
            },
          selfType = None
        )

        List(
          ClassDef(
            cls = clsSym,
            parents = List(TypeTree.of[Object]),
            body = clsSym.declaredMethods.map(DefDef(_, _ => None))
          ),
          // companion definition
        )
      case _ =>
        report.errorAndAbort("@adapter can only annotate a trait or class")

Imho the right solution to such problems would be to finally support code generation, and not creating very specific language features with only a very narrow use-case.

Otherwise this would open a very big can of worms, as there are many other such cases where one needs code generation. Creating special cases for any such cases would quickly pollute the language.

It has been said a million times already that code generation is a vital feature of any serious programming language. Even very “minimal” languages like C, Go, or LISP have this feature, as it’s so incredibly important for so many use-cases!

At this point I think it makes no sense to complain further about the lack of code-gen. This fact is broadly known and has reached the relevant people. Now it’s their turn.

But I’m a bit skeptical about the idea here in general. But maybe I don’t get it fully, so please correct me if I’m wrong! Wouldn’t this make it very easy to circumvent access restrictions? Which would make the point of access restrictions moot?

“Testing” private implementation details is imho wrong already on the conceptual level anyway. But I know there are diverging opinions on that topic… So maybe let’s not get too deep into that part.

Or is this “adapter” thingy just some shortcut for delegation, with automatic facade generation, so it doesn’t expose private parts?

Having some more support for delegation built atop exports would in fact be nice, even if it didn’t come with automatic facade generation. For example, Kotlin has its “by” mechanism [also see the link at the bottom there], which isn’t bad. But I’m not sure how exactly it would / should look in Scala.
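As a rough sketch of what this already looks like today: Scala 3’s export clause can forward all members of a wrapped instance, much like Kotlin’s “by” clause (the names Base, BaseImpl, and Derived below are just illustrative):

```scala
// Kotlin:  class Derived(b: Base) : Base by b
// Scala 3 approximation via an export clause:
trait Base:
  def greet(): String

class BaseImpl extends Base:
  def greet(): String = "hello"

// All eligible members of Base are forwarded to the wrapped instance b;
// the export aliases implement the deferred member greet.
class Derived(b: Base) extends Base:
  export b.*

@main def delegationDemo(): Unit =
  println(Derived(BaseImpl()).greet())
```

What export does not do, compared to Kotlin's by, is let you selectively override forwarded members as easily, which is presumably where extra language support would come in.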

Let’s see what the relevant people think!

No, it is not about testing private implementation details. It is about delegation and automatic facade generation without exposing anything private, as you said. It just ‘extracts’ the public interface from the desired class and gives you a simple way to create an instance. So you can, for example, easily test the layer above, which uses this closed class. I guess it is not only about testing, but that comes to mind first.

I’m no expert on those names; probably this is actually a ‘facade’ and not an ‘adapter’.

“Testing” private implementation details is imho wrong already on the conceptual level anyway. But I know there are diverging opinions on that topic… So maybe let’s not get too deep into that part.

I somewhat agree with you in general, but not fully. In my opinion this exact pattern is quite common, though indeed not used that often, usually when you write some interop for poorly written Java code. (Exports look something like this.)

Hmm. What is the usage you would put these adapters to? I sort of get what you’re suggesting, but I don’t understand the rationale for why this is helpful.

(I’m specifically trying to understand whether you’re asking for anything that ScalaMock doesn’t already provide – while I dislike mocking myself, that’s a fairly powerful library for that sort of thing.)

i guess the example in the first post, though it looks insufficient, somewhat explains the problem:

does it have functionality like PowerMock (github.com/powermock/powermock, “a Java framework that allows you to unit test code normally regarded as untestable”)?

note that i dislike mocking too. however, mimicking classes is sometimes (though actually rarely imho) useful in tests

Right, I understand the problem – what I don’t grok is why it matters. Mocking is the only use case I can think of, but I’m not sure from the given code whether that’s actually the intent, and I think it would be helpful to understand what the end goal is.

(And I’ll note that I don’t know ScalaMock well – since I mostly avoid mocking like the plague, I haven’t done a ton with it. It seems to be way better for Scala than Mockito, but that’s the extent of my knowledge.)

This is also somewhat related to ScalaMock. The current way of mocking classes is very dirty (passing nulls to constructors, not always working, a lot of workarounds, etc.); I want to restrict it and allow mocking only interfaces.

Without some solution for creating an interface from a class, it will be painful for users.

But the same thing can be needed not only for ScalaMock; e.g. you can create stubs by hand.

I now think this is a very bad idea, and I need to at least rethink it. Thanks to all!

That’s funny. I think nobody fully got or explored the idea, but now it’s bad?

I mean, “extracting interfaces from classes” sounds actually interesting.

Couldn’t such a feature aid in mimicking prototype-based inheritance in Scala? To use an object as a prototype you need to know its interface, as this would also be the interface of a derived object. Do I get this right?

Just thinking out loud…

Thanks for sharing! Didn’t know this one.

I’m not a fan of mocking, but good to know there is a lib for when it gets really hairy.

I think it is bad as a language feature for now. I want to try it as a macro solution first, waiting for blockers to be resolved.

What are the blockers you’re waiting to be resolved?

For a macro solution we would need the ability to create new definitions (in this case, traits) outside of the macro scope, right?

Such a feature is not planned, AFAIK. That was also the reason for the snarky remark in my first comment. There is currently fundamental opposition to such a feature as part of the language, even though it should be clear by now that people really need proper code generation.

One needs code-gen for all kinds of “glue code”, or to keep things DRY. Be it protocol endpoint stubs, glue between UI and its logic, FFI facades, application or service templates, or just to avoid the need to write repetitive maniacal definitions, like for example all the FunctionN or TupleN traits in the compiler, or this here.

But currently it doesn’t look like we would get any compiler support for something like that.

I mean, one can read TASTy, transform it, and “pretty print” it to disk. But that’s tedious, as it’s not declarative but only works programmatically, and there is no real tooling support for this kind of metaprogramming.

Why outside of macro scope?

I need to generate a trait body and its companion object; as I understood, this is possible with a macro annotation, which can be expanded into multiple definitions.

@interface
trait WhateverInterface:
  type Impl = java.util.Whatever

This is a blocker. I still haven’t tried to generate the companion, though.

macros that generate types don’t look very convincing to me. they aren’t compatible with static analysis and make things harder for IDEs (probably intellij would have major problems with macros that generate types).

instead, a language feature for extracting interfaces from classes could be cool. i haven’t yet thought about potential limitations, so i don’t know if it would really make sense or not (i.e. the feature from this topic overall).

Given that IntelliJ has trouble with sbt plugins that generate sources (sbt-buildinfo regularly confuses IntelliJ), macros would either be worse, or force them to fix whatever architectural quirk is causing it to fail builds because a generated file either exists or does not exist (depending on the atmospheric forecast).

aren’t the generated sources generated during project import? the way i use them is that i reimport the project in intellij to get an up-to-date set of generated files, and it works. maybe that requires some extra wiring up in build.sbt to function properly, though.

To be honest, I still don’t understand.

I don’t get what such generated definitions would be good for. You couldn’t use them, or actually even see them, in your program.

You can only implement some already existing trait members with macros; you can’t generate “a trait body”, or a companion object, outside of the macro, as this would introduce new definitions into the program, which is impossible on principle.

Once again: Anything generated in a macro besides implementations for already defined things is not reachable outside of the macro. This is a fundamental limitation.

So even if it’s possible to create classes or objects inside a macro, such entities won’t be visible in your regular program. They are only visible inside the macro scope and can’t be seen or reached from regular code.

A Scala 3 macro can’t add new definitions. It can only transform already existing code. This never changes, adds, or removes any existing signatures / definitions (visible globally). (The only small exception is transparent inline, which allows specializing a signature further, as long as it’s still compatible with the original signature.)
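A tiny sketch of that transparent inline exception (the names zero and transparentDemo are made up): the declared result type can be narrowed at each call site, but no new definitions appear anywhere.

```scala
// The declared result type is Any, but because the definition is
// transparent inline, each call site sees the more precise type of the
// inlined body (here the literal type 0, a subtype of Int).
transparent inline def zero: Any = 0

@main def transparentDemo(): Unit =
  val x: Int = zero // compiles only thanks to the specialized result type
  println(s"x=$x")
```

With a plain (non-transparent) inline def, the val x: Int = zero line would not compile, since the call site would only see the declared type Any.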

Scala 3 macros are unusable for typical macro use-cases as seen in all other programming languages with macro support: you can’t really generate new code, which is actually by far the number one use-case for macros. You can only transform already existing code, in quite limited ways, just changing the implementation details.

“Code generation” in Scala 3 always means stitching together strings in some build script. There is no language support for that. (This is actually even more primitive than CPP “macros”, as those at least work on preprocessor tokens, not raw strings, and actually have quite decent IDE support, including inline expansions on hover, navigation support, and even some limited code intelligence!)

At the same time almost no professional real-world codebase works without some code generation. But the compiler team is completely blind to this fact. Which is of course ironic, as the compiler also uses “code generation” (of course through some wonky string templates, as there is no other way in Scala 3).

I could really go mad on this topic as I don’t get what’s the *** problem here!

I could understand if they said: “We currently don’t have the resources to tackle this problem. But we are open to helping make code-gen in Scala 3 happen.”

But to my knowledge the status quo is still fundamental opposition to the core idea of generating new definitions, not only implementations, through macros (and ideally rendering them to disk). As long as this does not change, this whole topic is blocked indefinitely.

Usually what happens for me is that, if the file doesn’t already exist, compile fails until I drop down to sbt compile and generate the file myself.

Reloading the sbt config in IntelliJ doesn’t seem to fix it. Fully re-importing the project might fix the problem, but running sbt compile is a lot quicker so I haven’t tried that yet.

It would probably be good to remind ourselves why this decision was made, and to re-examine the consequences of allowing new definitions.
