The documentation on Programmatic Structural Types says:

> Some usecases, such as modelling database access, are more awkward in statically typed languages than in dynamically typed languages: With dynamically typed languages, it’s quite natural to model a row as a record or object, and to select entries with simple dot notation.
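For reference, the dynamic pattern the quoted page describes can be sketched like this in Scala 3 (`DynRecord`/`DynPerson` are my names for the docs' `Selectable` pattern; the per-row map lookup is exactly what makes it slow and typo-prone):

```scala
// A map-backed record with dot access, following the Programmatic
// Structural Types pattern. Every field access goes through
// selectDynamic at runtime, with a map lookup per access.
class DynRecord(elems: (String, Any)*) extends Selectable {
  private val fields = elems.toMap
  def selectDynamic(name: String): Any = fields(name)
}

type DynPerson = DynRecord { val name: String; val age: Int }

@main def dynDemo(): Unit = {
  val person = DynRecord("name" -> "Emma", "age" -> 42).asInstanceOf[DynPerson]
  // the compiler rewrites this to selectDynamic("name").asInstanceOf[String]
  println(person.name)
}
```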
Unfortunately, the dynamic approach has serious disadvantages:
- Poor performance on large datasets
- It is error-prone: the compiler cannot catch typos
Is it possible to solve these disadvantages at all?
I think it is, if we move the dynamics from a single row to the whole dataset.
That makes it possible to improve performance significantly and to minimize errors.
Unfortunately, I don't have the experience to write a good SIP. I can only propose an idea. Please take it into consideration.
First, we need a row abstraction:
```scala
trait Record extends Product {
  def set(i: Int, v: Any): Record
  def get(i: Int): Any
}

class Row extends Record { ... }

object Record {
  def create(): Record = ...
}

type Person = Row {
  val id: Long
  val name: String
  val age: Int
  val checkDate: Date
}
```
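To make the intent concrete, here is a minimal runnable sketch of such a positional, array-backed row in current Scala 3, using `Selectable` so that dot access still works (`ArrayRow`, the `meta` map, and the field order are illustrative assumptions, not a proposed API):

```scala
// Positional storage: fields live in an Array[Any]; the name -> index
// map exists once per record *type*, not once per row.
class ArrayRow(meta: Map[String, Int]) extends Selectable {
  private val cells = new Array[Any](meta.size)
  def set(i: Int, v: Any): ArrayRow = { cells(i) = v; this }
  def get(i: Int): Any = cells(i)
  def selectDynamic(name: String): Any = cells(meta(name))
}

type PersonRow = ArrayRow { val name: String; val age: Int }

@main def rowDemo(): Unit = {
  val personMeta = Map("id" -> 0, "name" -> 1, "age" -> 2)
  // roughly what `(name = "name1", age = 1)` could desugar to:
  val p = ArrayRow(personMeta).set(1, "name1").set(2, 1).asInstanceOf[PersonRow]
  println(p.name) // dot access resolves to an array index
}
```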
So we would be able to write:

```scala
val p: Person = (name = "name1", age = 1)
```

which could desugar to:

```scala
val p: Person = Record.create().set(1, "name1").set(2, 1).asInstanceOf[Person]
```
In practice, any data factory would use dynamic binding to create rows.
The performance improvement comes from doing that extra work once per dataset rather than once per row.
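The once-per-dataset bookkeeping can be illustrated in isolation: column names are resolved to positions a single time, and the per-row hot loop does only integer indexing (the result set is simulated with arrays; all names here are illustrative):

```scala
// Column order as delivered by the (simulated) query result:
val columnsInResult = Vector("age", "id", "name")
// Column order expected by the record type:
val recordOrder = Vector("id", "name", "age")

// Name -> position resolution happens ONCE per dataset, not per row:
val mapArray: Array[Int] = recordOrder.map(columnsInResult.indexOf).toArray

// The per-row hot loop is pure integer indexing:
val resultRows = Seq(Array[Any](33, 1L, "Ann"), Array[Any](40, 2L, "Bob"))
val records = resultRows.map { src =>
  val r = new Array[Any](mapArray.length)
  var i = 0
  while (i < mapArray.length) { r(i) = src(mapArray(i)); i += 1 }
  r
}
// records.head now holds (1L, "Ann", 33), i.e. record order (id, name, age)
```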
```scala
object Dao {
  def queryAll[T <: Record]()(implicit meta: RecordMeta[T]): Traversable[T] =
    new Traversable[T] {
      def foreach[U](body: T => U): Unit = {
        // open the dataset
        val s = ConnectionContext.connection.executeQuery(...)
        // build the column-index map once per dataset
        val mapArray = new Array[Int](meta.size)
        ...
        var hasRow = s.next()
        while (hasRow) {
          val r = Record.create().asInstanceOf[T]
          var i = 0
          while (i < mapArray.length) {
            // positional access seems about 70 times cheaper than access
            // by key, and a zero-copy approach is possible on top of it
            r.set(i, s.get(mapArray(i)))
            i += 1
          }
          body(r)
          hasRow = s.next()
        }
      }
    }

  def updateAll[T <: Record](tableName: String)(data: ArrayBuilder[T] => Unit)(implicit meta: RecordMeta[T]): Unit = ...
}
```
So we would be able to write:

```scala
Dao.updateAll[(id: Long, checkDate: Date)]("person") { b =>
  for (p <- Dao.queryAll[Person]()) {
    println(p.age)
    b += (id = p.id, checkDate = sysdate)
  }
}
```

which could desugar to:

```scala
Dao.updateAll[(id: Long, checkDate: Date)]("person") { b =>
  for (p <- Dao.queryAll[Person]()) {
    println(p.get(2).asInstanceOf[Int])
    b += Record.create().set(0, p.get(0)).set(1, sysdate)
  }
}
```
I think that when the number of columns is greater than five, this approach improves readability significantly and makes the code less error-prone, at the same runtime speed.
It would be great if there were a way to transform record types with match types (but that is far in the future, so here is only a very short example):

```scala
val q = for (c <- Persons) yield (name = c.name, age = c.age)
// q: Seq[(name: Column[String], age: Column[Int])]
for (r <- q.result) {
  // r: (name: String, age: Int)
}
```
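For plain (unnamed) tuples, this kind of type transformation already works today with a match type; the named-tuple version is the long-future part. A sketch (`Column` here is a stand-in, and the standard library's `Tuple.InverseMap` does the same kind of stripping):

```scala
// A stand-in query-column wrapper; only the type parameter matters here.
case class Column[T](name: String)

// Match type turning (Column[A], Column[B], ...) into (A, B, ...).
type StripColumns[T <: Tuple] <: Tuple = T match {
  case Column[h] *: t => h *: StripColumns[t]
  case EmptyTuple     => EmptyTuple
}

// The compiler reduces StripColumns[(Column[String], Column[Int])]
// to (String, Int), so this line type-checks:
val row: StripColumns[(Column[String], Column[Int])] = ("name1", 1)
```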
See also: