Proposal to add top-level definitions (and replace package objects)


Hi Scala community!

This is a request for comments on the proposal to add top-level definitions in Scala 3 as a simpler replacement for package objects. The Dotty implementation is described here.


Any source file may contain top-level definitions that go into the enclosing package:

package p

type Labelled[T] = (String, T)
val a: Labelled[Int] = ("count", 1)
def b = a._2

case class C()

implicit object Cops {
  def (x: C) pair (y: C) = (x, y)
}

The compiler generates synthetic objects that wrap top-level definitions falling into one of the following categories:

  • all pattern, value, method, and type definitions,
  • implicit classes and objects,
  • companion objects of opaque types.

If a source file src.scala contains such top-level definitions, they will be put in a synthetic object named src$package. The wrapping is transparent, however: the definitions in src.scala can still be accessed as members of the enclosing package.
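To make the wrapping concrete, here is a sketch of the desugaring. The wrapper name and the use of a package block in a single snippet are illustrative only:

```scala
// A file src.scala in package p containing the top-level definitions
//   type Labelled[T] = (String, T)
//   val a: Labelled[Int] = ("count", 1)
//   def b = a._2
// would conceptually be compiled as:
package p {
  object `src$package` {
    type Labelled[T] = (String, T)
    val a: Labelled[Int] = ("count", 1)
    def b: Int = a._2
  }
}
```

Because the wrapping is transparent, user code refers to `p.a` and `p.b` directly; the synthetic object is an implementation detail of the classfile layout.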


Package objects have proven fragile and have several downsides:

  • only one package object definition per package, even though some top-level definitions would make sense to live in different files
  • a trait or class defined inside a package object is different from one defined inside the package itself and can lead to surprising behavior
  • unclear semantics w.r.t. the scope of what can be inherited (see #441)

Top-level definitions solve these downsides, but they lack the ability to inherit definitions from another trait or class. That can easily be worked around by using a regular object and importing all its members.
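For instance, the workaround might look like this (a sketch with made-up names):

```scala
// A trait whose members we want available "at package level"
trait CommonOps {
  def twice(n: Int): Int = n * 2
}

// A regular object takes the place of `package object ... extends CommonOps`
object commonOps extends CommonOps

// Call sites then import its members explicitly:
//   import commonOps._
//   twice(21)
```

The one-line import at each call site is the cost of giving up inheritance in the package object itself.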


I want to understand something. Say I have:

// File myLibTop.scala
trait JustALib {
  val foo: Int = 1
}
object myLib extends JustALib

// File myLib.scala
package myLib
val foo2: Int = foo

Will the above code compile? If not, why not?


@sjrd Do you think this feature will work well with Scala.js JS interop? This is on my mind since I just read


I dislike this idea because now I will be unable to make “parameterized packages”: generic traits with some exportable symbols that I need to be importable from all my packages.

So I wouldn’t be able to make a generic trait with a type definition using type parameters and two packages inheriting it.

Though you may partially fix this in your proposal if you add “visible” imports. Like

package my.pkg0
visible import my.pkg1._

So a “visible import” would make all the pkg1 symbols available for import from pkg0.

Still no solution for “parameterized packages”


I find this a really useful change. Particularly placing opaque types and high-visibility implied declarations at the top level. We need two extra bits of functionality to make it really work well - import-for-reexport, and altering import implied to pull in not just the implied declarations, but everything else as well.


Combined with an implementation on Scala.js’ side, yes, it will work very well :slight_smile:


If we’re going to do re-exports from other packages, I’d prefer syntax like

package my.pkg0 with my.pkg1._

This would create aliases only, so my.pkg0.Foo would resolve to my.pkg1.Foo.

This solves the re-export issue, and if you allow things like with my.pkg1.{Foo => Fooh, _} you can rename also.

It doesn’t solve the boilerplate code problem that arises when you want several different packages to offer the same underlying code maybe with different type parameters.
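The proposed `with my.pkg1.{Foo => Fooh, _}` mirrors the renaming already available on imports today. A minimal sketch (names are made up, and an object stands in for a package so it fits one snippet):

```scala
object pkg1 {
  class Foo
  val bar = 42
}

object demo {
  // Today's import syntax already supports renaming plus a wildcard:
  import pkg1.{Foo => Fooh, _}
  val f: Fooh = new Fooh // Fooh is an alias for pkg1.Foo in this scope
  val b: Int = bar       // bar comes in via the wildcard
}
```

The `package ... with ...` proposal would, in effect, apply this at the package boundary so every downstream user sees the aliases without writing the import themselves.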


Both packages and objects inhabit the same namespace, so this is a double definition error. Scala 2 also flags this as an error:

package test

object mylib
package mylib {}
pkg.scala:4: error: mylib is already defined as object mylib
package mylib {}
one error found


Can an object definition be a reasonable substitute?


Can’t we make it so that such cases are accepted by the compiler, as syntactic sugar, to allow inheritance instead of duplicating code?


Not really. Objects aren’t open for extension while packages are.
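The openness difference can be shown in a few lines (package blocks stand in for separate files here):

```scala
// Packages are open: separate compilation units (modelled here as
// separate package blocks) can each contribute members to package q.
package q { object A { val x = 1 } }
package q { object B { val y = 2 } }

// Objects are closed: once `object O { ... }` is compiled, no other
// file can add members to it; you can only define something new that
// extends or wraps it.
```

This is why a plain object cannot fully replace a package: its member list is fixed at its single definition site.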


I wrote “reasonable” and I meant it.

You mentioned parameterizing package objects by inheriting from a common trait. For that particular use case I don’t think openness of packages has much impact. You can certainly define an object and import its members, as you would with a package. I guess there’s more to it though, so I probably missed something.
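Concretely, the plain-object substitute for the parameterized-package-object example might look like this (method bodies are invented to complete the sketch):

```scala
trait X[T] {
  def a(t: T): Int
  def b(t1: T, t2: T): T
}

// Plain objects in place of `package object IntX extends X[Int]` etc.
object IntX extends X[Int] {
  def a(t: Int): Int = t
  def b(t1: Int, t2: Int): Int = t1 + t2
}

object StringX extends X[String] {
  def a(t: String): Int = t.length
  def b(t1: String, t2: String): String = t1 + t2
}

// Users write `import IntX._` where they previously relied on the
// package object's members being in scope automatically.
```

The trade-off is exactly the one under discussion: the members are all there, but they no longer arrive in scope for free.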


Not sure. What would be the rules of desugaring?


OK, let me clarify what I think will satisfy all issues with this proposal.

Proposal: $autoimport objects

Inside a package we can define an $autoimport object, and the compiler will automatically import it for us for every scope of the package and scope of the package users:

// File a.scala
package myLib

object $autoimport {
  val foo: Int = 0
}
import $autoimport._ // automatically generated by the compiler

val topFoo1 = foo // we can access all public fields of $autoimport

// File b.scala
package myLib
import $autoimport._ // automatically generated by the compiler

val topFoo2 = foo // we can access all public fields of $autoimport

// File user.scala
import myLib._
import myLib.$autoimport._ // automatically generated by the compiler

val user = foo // we can access all public fields of $autoimport
val user1 = topFoo1
val user2 = topFoo2

Of course, $autoimport can extend any trait or class without any special rules.
So we have:
:white_check_mark: Added top level definitions
:white_check_mark: Removed package objects
:white_check_mark: Still support package objects-like inheritance
:white_check_mark: No more “fragile” compilation

We can also write private[myLib] object $autoimport if we don’t want it auto-imported for the library’s users.


I think, though, we can still keep package objects and instead change their semantics to what I referred to as $autoimport objects.


What happens when separately compiled modules both define any of these top-level things? How can I break things by defining top-level items that appear mutually exclusive but in fact collide due to hidden synthetic collisions, etc.?

I would like to see an explanation of how collisions are avoided, or are inconsequential, when non-overlapping syntactic definitions are placed in files with the same name and package in separately compiled projects.

Sure, this should be avoided, but accidents happen and there are some with malicious intent.


I love this! It’s really annoying having to come up with a specific object to place every little method/value in.

Would this proposal allow, or would it be possible to allow, creating an application without an enclosing object?

package p

def main(args: Array[String]): Unit = { ??? }


If I understand you correctly, your use case is something like this

trait X[T] {
  def a(t: T): Int
  def b(t1: T, t2: T): T
}

package object IntX extends X[Int] { /* ... */ }
package object StringX extends X[String] { /* ... */ }


Do package objects then provide something that you can’t get from a plain object?

For instance,

package x.y

package object z extends X

could be replaced with

package x.y

object z extends X

… provided a package and an object can share namespace.


Personally I’m strongly in favour of this proposal. It helps with scripting but also at the prototype stage where I basically write the code like in a worksheet and then refactor when things start to solidify.

I am not sure about the import implied thing I’ve read about somewhere else, but as long as it’s not part of the proposal (it doesn’t seem to be?) I have no objections whatsoever.


If Scala doesn’t wrangle the bytecode needlessly, it Should Just Work™; after all, the generated artifact is just a class with static members.

edit: Hmm, I’m starting to think that Scala actually gives special treatment to main methods inside objects. They have to be generated as static forwarders in the bytecode, I imagine, unlike other methods, which become members of the singleton instance.

I guess then that it depends on where in the compilation chain this happens. If the “package object” is generated as a Scala artifact and then compiled along with the rest of the code, it should generate the same static forwarder.
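For reference, here is the setup in question. The method is defined on the singleton instance, and scalac additionally emits a static forwarder in the `Main` classfile so the JVM entry point `java Main` can find a static `main(String[])`:

```scala
// `main` inside an object: callable as a normal member of the
// singleton, and also reachable via the emitted static forwarder.
object Main {
  def main(args: Array[String]): Unit =
    println("Hello world!")
}
```

If top-level definitions compile down through the same object-wrapping path, a top-level `main` would get the same forwarder treatment as this hand-written object.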