implicits

Trouble getting Scala type inference to work

Submitted by 谁都会走 on 2020-01-17 14:02:44
Question: The general goal: suppose, for instance, that I want to develop a very pluggable issue tracker. Its core implementation might support only a ticket id and a description. Other extensions might add support for various other fields, yet those fields might live in the same database table. Even if they don't, the number of queries to the database should not need to grow with the number of extensions; extensions should instead be able to contribute to the definition of the query. Item[A, B, R[_]] would
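A type-class-based design along these lines might look like the following. This is only a minimal sketch of the general approach, not the asker's actual Item[A, B, R[_]] abstraction; FieldSet, CoreTicket, and runQuery are hypothetical names invented for illustration.

```scala
// Each extension describes the columns it needs and how to decode them,
// so one query can serve any combination of extensions.
trait FieldSet[A] {
  def columns: List[String]
  def decode(row: Map[String, String]): A
}

case class CoreTicket(id: Long, description: String)

implicit val coreFields: FieldSet[CoreTicket] = new FieldSet[CoreTicket] {
  def columns = List("id", "description")
  def decode(row: Map[String, String]) =
    CoreTicket(row("id").toLong, row("description"))
}

// A single round-trip to the database, whatever A turns out to be.
def fetch[A](id: Long)(implicit fs: FieldSet[A]): A =
  fs.decode(runQuery(fs.columns, id))

def runQuery(cols: List[String], id: Long): Map[String, String] = ???
```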

Scala - Co/Contra-Variance as applied to implicit parameter selection

Submitted by 你离开我真会死。 on 2020-01-14 07:13:29
Question: I've got a trait like this:

    trait CanFold[-T, R] {
      def sum(acc: R, elem: T): R
      def zero: R
    }

with a function that works with it like this:

    def sum[A, B](list: Traversable[A])(implicit adder: CanFold[A, B]): B =
      list.foldLeft(adder.zero)((acc, e) => adder.sum(acc, e))

The intention is to do something like this:

    implicit def CanFoldSeqs[A] = new CanFold[Traversable[A], Traversable[A]] {
      def sum(x: Traversable[A], y: Traversable[A]) = x ++ y
      def zero = Traversable()
    }

    sum(List(1, 2, 3) :: List(4,
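To make the role of the contravariant -T concrete, here is a minimal sketch, assuming the CanFold and sum definitions above; the addIntColls instance is invented for illustration:

```scala
// CanFold is contravariant in T, so CanFold[Traversable[Int], Int]
// conforms to CanFold[List[Int], Int] and is selected when the list
// elements are Lists.
implicit val addIntColls: CanFold[Traversable[Int], Int] =
  new CanFold[Traversable[Int], Int] {
    def sum(acc: Int, elem: Traversable[Int]) = acc + elem.sum
    def zero = 0
  }

sum(List(List(1, 2), List(3)))  // 6
```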

Workaround for importing spark implicits everywhere

Submitted by *爱你&永不变心* on 2020-01-02 04:45:08
Question: I'm new to Spark 2.0 and to using Datasets in our code base. I'm noticing that I need to import spark.implicits._ everywhere in our code. For example:

File A:

    class A {
      def job(spark: SparkSession) = {
        import spark.implicits._
        // create dataset ds
        val b = new B(spark)
        b.doSomething(ds)
        doSomething(ds)
      }

      private def doSomething(ds: Dataset[Foo], spark: SparkSession) = {
        import spark.implicits._
        ds.map(e => 1)
      }
    }

File B:

    class B(spark: SparkSession) {
      def doSomething(ds: Dataset[Foo]) = {
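One commonly cited workaround is to expose the implicits once from an object extending Spark's SQLImplicits, and import that object everywhere instead of a per-method spark.implicits._. A sketch, assuming a single application-wide session (AppImplicits is an invented name; Spark's own test suites use a similar trick):

```scala
import org.apache.spark.sql.{SQLContext, SQLImplicits, SparkSession}

object AppImplicits extends SQLImplicits {
  // SQLImplicits only needs an SQLContext; pull it lazily from the
  // current (or newly built) session.
  protected override def _sqlContext: SQLContext =
    SparkSession.builder.getOrCreate().sqlContext
}

// elsewhere: import AppImplicits._
```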

Import implicit conversions without an instance of SparkSession

Submitted by 混江龙づ霸主 on 2019-12-30 06:59:13
Question: My Spark code is cluttered with code like this:

    object Transformations {
      def selectI(df: DataFrame): DataFrame = {
        // needed to use $ to generate a ColumnName
        import df.sparkSession.implicits._
        df.select($"i")
      }
    }

or alternatively:

    object Transformations {
      def selectI(df: DataFrame)(implicit spark: SparkSession): DataFrame = {
        // needed to use $ to generate a ColumnName
        import spark.implicits._
        df.select($"i")
      }
    }

I don't really understand why we need an instance of SparkSession just to
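For the $ interpolator specifically, no live session is actually required: in SQLImplicits, $ just builds a ColumnName from a StringContext. A sketch of a session-free alternative (ColumnSyntax is an invented name; org.apache.spark.sql.functions.col("i") also works):

```scala
import org.apache.spark.sql.{ColumnName, DataFrame}

object ColumnSyntax {
  // Mirrors the StringToColumn implicit from SQLImplicits,
  // without needing a SparkSession in scope.
  implicit class StringToColumn(val sc: StringContext) extends AnyVal {
    def $(args: Any*): ColumnName = new ColumnName(sc.s(args: _*))
  }
}

object Transformations {
  import ColumnSyntax._
  def selectI(df: DataFrame): DataFrame = df.select($"i")
}
```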

How does the Scala compiler synthesize implicit evidence with `<:<`?

Submitted by 三世轮回 on 2019-12-25 07:02:21
Question: Given this (admittedly contrived) code fragment in Scala:

    object Main extends App {
      class X { def foo = 1 }

      def f[A](value: A)(implicit ev: A <:< X) = {
        value.foo
      }

      println(f(new X()))
    }

What does the Scala compiler do to make this pass? I have looked at some code in Predef, but I don't understand the implementation. Please give a detailed step-by-step explanation.

Answer 1: Call site. Let's look at what the type inferencer does when you write:

    f(new X())

It first has to figure out what the template
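As a rough sketch of the machinery the answer goes on to describe, simplified from scala.Predef (where the real <:< class and its implicit live):

```scala
// Contravariant in From, covariant in To, and usable as a conversion.
sealed abstract class <:<[-From, +To] extends (From => To)

// The only way to summon evidence: an A is always an A. Thanks to the
// variance annotations, a value of type `A <:< A` also satisfies
// `A <:< B` whenever A really is a subtype of B.
implicit def conforms[A]: A <:< A =
  new (A <:< A) { def apply(a: A): A = a }
```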

Why am I getting a “diverging implicit expansion” error when trying to sort instances of an ordered class?

Submitted by 陌路散爱 on 2019-12-24 19:36:38
Question: There are a lot of questions on this subject, but after an hour of reading I still cannot understand what I am doing wrong. Here is a minimal example of the code I have (Scala 2.11):

    object Anomalies {
      sealed abstract class AnomalyType(val priority: Int) extends Ordered[AnomalyType] {
        override def compare(that: AnomalyType): Int = this.priority - that.priority
        def name = toString()
      }

      case object CRITICAL extends AnomalyType(0)
      case object SERIOUS extends AnomalyType(1)
      case object WARN
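One common fix for this class of error, sketched under the assumption of the code above, is to give the sealed base type a single explicit Ordering so implicit search does not have to recurse through the Ordered conversion:

```scala
object AnomalyType {
  // One unambiguous Ordering for all the case objects.
  implicit val ordering: Ordering[AnomalyType] =
    Ordering.by[AnomalyType, Int](_.priority)
}

// List(WARN, CRITICAL).sorted now yields List(CRITICAL, WARN).
```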

Other programming languages that support implicits “a la Scala”

Submitted by 点点圈 on 2019-12-21 07:12:42
Question: Scala implicits are very powerful. I'm curious whether they are a new/unique feature of Scala, or whether the concept already existed in other programming languages. Thanks. EDIT: To clarify my question: yes, I'm talking about this concrete implementation. Having "implicit things" all around seemed strange at first, but having used them for a while and seen how others use them, I'm impressed by how well they work.

Answer 1: It looks like the inspiration was Haskell's type classes. At least one blog article claims
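For readers unfamiliar with the correspondence the answer alludes to: a Haskell type class maps onto a Scala trait plus implicit instances. A minimal sketch (Show, showInt, and describe are illustrative names):

```scala
// Haskell: class Show a where show :: a -> String
trait Show[A] { def show(a: A): String }

// Haskell: instance Show Int where show = ...
implicit val showInt: Show[Int] = new Show[Int] {
  def show(a: Int): String = a.toString
}

// Haskell: describe :: Show a => a -> String
def describe[A](a: A)(implicit s: Show[A]): String = s.show(a)

describe(42)  // "42"
```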

Conditions under which compiler will not define implicits (constructor, destructor, copy constructor, copy assignment) [duplicate]

Submitted by 北战南征 on 2019-12-18 10:01:52
Question: This question already has answers here: Conditions for automatic generation of default/copy/move ctor and copy/move assignment operator? (3 answers). Closed 6 years ago.

This is supposed to be a trivial question, but I could not find it answered explicitly on Stack Overflow. The following will be defined implicitly if not provided by the user:

- default (parameterless) constructor
- copy constructor
- copy assignment operator
- destructor

But I have read somewhere (which I can't seem to find now) that there

What is a diverging implicit expansion error?

Submitted by 房东的猫 on 2019-12-17 23:26:30
Question: While trying to find a solution to another question ([1]), I came across a diverging implicit expansion error. I'm looking for an explanation of what this means. Here's the use case:

    scala> implicit def ordering[T](implicit conv: T => Ordered[T], res: Ordering[Ordered[T]]) = Ordering.by(conv)
    ordering: [T](implicit conv: (T) => Ordered[T],implicit res: Ordering[Ordered[T]])scala.math.Ordering[T]

    scala> def foo[T <% Ordered[T]](s: Seq[T]) = s.sorted
    <console>:6: error: diverging implicit
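What "diverging" refers to, in a nutshell: each step of the implicit search demands an instance for a strictly larger type, so the search can never bottom out. A minimal, self-contained sketch of such a loop (the Wrap trait is invented for illustration):

```scala
trait Wrap[A]

// To build Wrap[A] the compiler needs Wrap[Wrap[A]], which needs
// Wrap[Wrap[Wrap[A]]], and so on: the type grows at every step,
// so scalac gives up with "diverging implicit expansion".
implicit def diverge[A](implicit deeper: Wrap[Wrap[A]]): Wrap[A] =
  new Wrap[A] {}

// implicitly[Wrap[Int]]  // error: diverging implicit expansion for type Wrap[Int]
```

The ordering definition in the question diverges the same way: resolving Ordering[T] requires Ordering[Ordered[T]], which requires Ordering[Ordered[Ordered[T]]], and so on.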

Can't prove that singleton types are singleton types while generating type class instance

Submitted by 寵の児 on 2019-12-17 08:55:22
Question: Suppose I've got a type class that proves that all the types in a Shapeless coproduct are singleton types:

    import shapeless._

    trait AllSingletons[A, C <: Coproduct] {
      def values: List[A]
    }

    object AllSingletons {
      implicit def cnilSingletons[A]: AllSingletons[A, CNil] =
        new AllSingletons[A, CNil] {
          def values = Nil
        }

      implicit def coproductSingletons[A, H <: A, T <: Coproduct](implicit
        tsc: AllSingletons[A, T],
        witness: Witness.Aux[H]
      ): AllSingletons[A, H :+: T] = new AllSingletons[A, H :+: T]
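For context, a usage sketch under the assumption that the instances above resolve as intended (Direction, North, and South are invented for illustration):

```scala
sealed trait Direction
case object North extends Direction
case object South extends Direction

type Dirs = North.type :+: South.type :+: CNil

// coproductSingletons needs a Witness.Aux for each head type, which
// exists only when that head really is a singleton type:
val dirs: List[Direction] = implicitly[AllSingletons[Direction, Dirs]].values
// dirs == List(North, South)
```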