scala-reflect

Is there a way to get the direct parents of a ClassSymbol in macro context?

Question: I'm trying to get the direct superclasses/traits of a ClassSymbol. The method baseClasses() does not work for me, as it also includes the indirect (super-super-...) ancestors. java.lang.Class.getSuperclass() and java.lang.Class.getInterfaces() would actually be sufficient for my use case, but I can't find a way to go from a ClassSymbol to a java.lang.Class in a macro context!

Answer 1: If you use a macro, you can't obtain a runtime Class object for a class that does not exist (is not loaded) at compile time (so you can't have
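
A minimal sketch of the compile-time alternative (object and method names here are made up): stay inside the macro universe and read the direct parents off the symbol's ClassInfoType, rather than trying to reach a java.lang.Class at all.

    import scala.language.experimental.macros
    import scala.reflect.macros.blackbox

    object DirectParents {
      def parentsOf[T]: List[String] = macro impl[T]

      def impl[T: c.WeakTypeTag](c: blackbox.Context): c.Tree = {
        import c.universe._
        val cls = weakTypeOf[T].typeSymbol.asClass
        // A class symbol's typeSignature is a ClassInfoType whose parents
        // field lists only the direct supertypes, unlike baseClasses,
        // which walks the entire linearization.
        val parents = cls.typeSignature match {
          case ClassInfoType(ps, _, _) => ps
          case _                       => Nil
        }
        q"${parents.map(_.toString)}"
      }
    }

Invoked as DirectParents.parentsOf[SomeClass], this expands to a list of the direct parent types rendered as strings.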

Scala: Dynamically generating match clauses for case classes

Question: I want to use the power of Scala's pattern matching within a set of 'condition-action' rules. These rules are not known in advance, but rather are generated at runtime according to some complex criteria. The algorithmic generation mechanism can be considered completely separate and is not part of this question, which is concerned with how to express this via Scala reflection/quasiquotes. Concretely, I'm looking to generate case definitions (of the general form case v0@x(v1,_,v2): X => f
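
As a sketch of the quasiquote machinery involved (the rule contents here are placeholders): cq"..." builds an individual CaseDef tree, and q"... case ..$cases" splices a runtime-generated list of them into a match expression that a toolbox can then compile.

    import scala.reflect.runtime.currentMirror
    import scala.reflect.runtime.universe._
    import scala.tools.reflect.ToolBox

    object DynamicMatch {
      def main(args: Array[String]): Unit = {
        val tb = currentMirror.mkToolBox()

        // Each cq"..." is one generated condition-action rule.
        val cases: List[CaseDef] = List(
          cq"""i: Int    => "got an Int: " + i""",
          cq"""s: String => "got a String: " + s""",
          cq"""other     => "no rule matched: " + other"""
        )

        // Splice the generated cases into a match and compile it.
        val matcher = tb.eval(q"(x: Any) => x match { case ..$cases }")
          .asInstanceOf[Any => String]

        println(matcher(42))    // got an Int: 42
        println(matcher("ab"))  // got a String: ab
      }
    }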

Scala: Can't unquote List[scala.collection.immutable.HashMap[String,Double]]

Question: I am using Scala reflection and a toolbox to evaluate a dynamic function given as a string, and I am trying to evaluate it with data of type List[HashMap[String, Double]]. But it gives the error Can't unquote List[scala.collection.immutable.HashMap[String,Double]], consider using ... at the line

    val dataAfterFunctionApplied = tb.eval(q"$functionSymbol.function($data)")

The code which I am using is as below:

    package test

    import scala.collection
    import scala.collection.immutable.HashMap
    import scala.reflect.runtime
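
The usual way out (a sketch, with made-up names): only values with a Liftable instance can be unquoted into a tree, and there is none for List[HashMap[String, Double]]. So instead of splicing the data into the quasiquote, compile the function once and apply it to the data in ordinary Scala:

    import scala.collection.immutable.HashMap
    import scala.reflect.runtime.currentMirror
    import scala.tools.reflect.ToolBox

    object EvalThenApply {
      def main(args: Array[String]): Unit = {
        val tb = currentMirror.mkToolBox()
        val data: List[HashMap[String, Double]] =
          List(HashMap("a" -> 1.0), HashMap("a" -> 2.0))

        // Compile the dynamic function once...
        val src =
          """(xs: List[scala.collection.immutable.HashMap[String, Double]]) =>
            |  xs.map(m => m("a")).sum""".stripMargin
        val fn = tb.eval(tb.parse(src))
          .asInstanceOf[List[HashMap[String, Double]] => Double]

        // ...then apply it as a plain Scala function; nothing is unquoted.
        println(fn(data)) // 3.0
      }
    }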

Scala: How to instantiate an interpreter that inherits the current context?

In Scala code, I'd like to create an interpreter that will evaluate some strings of Scala code, e.g., using ScriptEngine. But I'd like to pass the current variable and type definitions to it so that the code in the strings can use them, as if the new interpreter were forked from the current one. With ScriptEngine I could use the "put" method to put bindings into it, but this has to be done explicitly, one variable at a time, and there's no way to pass a class definition, a method, etc. Is there a way, or am I misunderstanding something? The purpose is to let dynamic code
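
For reference, this is the explicit-bindings route being described, a sketch that assumes scala-compiler is on the classpath so that a "scala" ScriptEngine is available:

    import javax.script.ScriptEngineManager

    object ForkedEval {
      def main(args: Array[String]): Unit = {
        val engine = new ScriptEngineManager().getEngineByName("scala")

        val x = 42
        val greeting = "hello "

        // Every value must be handed over individually via put; there is
        // no analogous way to hand over a local class or method definition.
        engine.put("x", x)
        engine.put("greeting", greeting)

        // Bound values surface with static type Object, hence the casts.
        println(engine.eval("greeting.asInstanceOf[String] + x.asInstanceOf[Int]"))
      }
    }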

Preserving type arguments in Akka receive

This question has kind of been answered by Roland Kuhn in this post, however, despite several comments asking for detail, he didn't bother to share the complete answer. Here's what I want to do: I have a wrapper class case class Event[T](t: T) whose instances I send to an Akka actor. In the receive method of that actor, I then want to distinguish between Event[Int] and Event[String], which obviously isn't so simple due to type erasure. What Roland Kuhn shares in the mentioned post is that "there is exactly one way to do it", that is, embodying the type information within the message. So I
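
A sketch of that approach (the actor name is made up): carry a ClassTag inside the message so the actor can recover the erased type; nested type arguments would need a full TypeTag instead.

    import akka.actor.Actor
    import scala.reflect.ClassTag

    // The implicit ClassTag is captured at the call site, so Event(42)
    // remembers that its payload is an Int even after erasure.
    case class Event[T](t: T)(implicit val tag: ClassTag[T])

    class EventActor extends Actor {
      def receive: Receive = {
        case e: Event[_] =>
          e.tag.runtimeClass match {
            case c if c == classOf[Int]    => println(s"Int event: ${e.t}")
            case c if c == classOf[String] => println(s"String event: ${e.t}")
            case other                     => println(s"unhandled payload type: $other")
          }
      }
    }

Sending eventActor ! Event(42) then reaches the Int branch, while eventActor ! Event("hi") reaches the String one.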

Define spark udf by reflection on a String

I am trying to define a UDF in Spark (2.0) from a string containing a Scala function definition. Here is the snippet:

    val universe: scala.reflect.runtime.universe.type = scala.reflect.runtime.universe
    import universe._
    import scala.reflect.runtime.currentMirror
    import scala.tools.reflect.ToolBox

    val toolbox = currentMirror.mkToolBox()
    val f = udf(toolbox.eval(toolbox.parse("(s:String) => 5")).asInstanceOf[String => Int])
    sc.parallelize(Seq("1","5")).toDF.select(f(col("value"))).show

This gives me an error:

    Caused by: java.lang.ClassCastException: cannot assign instance of scala.collection
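
One commonly suggested workaround, sketched here with made-up names: ship only the source string and compile it lazily inside a top-level object, so each executor JVM builds the function locally instead of deserializing the toolbox-generated closure from the driver (deserializing that closure is what the ClassCastException points at).

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, udf}
    import scala.reflect.runtime.currentMirror
    import scala.tools.reflect.ToolBox

    // A top-level object is never serialized, only re-initialized per JVM,
    // so the lazy val compiles the string on each executor at first use.
    object CompiledFn {
      val src = "(s: String) => 5"
      lazy val fn: String => Int = {
        val tb = currentMirror.mkToolBox()
        tb.eval(tb.parse(src)).asInstanceOf[String => Int]
      }
    }

    object Main {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[*]").appName("toolbox-udf").getOrCreate()
        import spark.implicits._
        // The closure passed to udf references CompiledFn only by name.
        val f = udf((s: String) => CompiledFn.fn(s))
        Seq("1", "5").toDF("value").select(f(col("value"))).show()
      }
    }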

Spark/scala create empty dataset using generics in a trait

I have a trait that takes a type parameter, and one of its methods needs to be able to create an empty typed dataset.

    trait MyTrait[T] {
      val sparkSession: SparkSession
      val spark = sparkSession.session
      val sparkContext = spark.sparkContext

      def createEmptyDataset(): Dataset[T] = {
        import spark.implicits._ // to access .toDS() function
        // DOESN'T WORK.
        val emptyRDD = sparkContext.parallelize(Seq[T]())
        val accumulator = emptyRDD.toDS()
        ...
      }
    }

So far I have not gotten it to work. It complains no ClassTag for T, and that value toDS is not a member of org.apache.spark.rdd.RDD[T]. Any help would be appreciated.
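
A sketch of one common fix (the concrete class names are hypothetical): have the concrete class supply an Encoder[T] and build the empty Dataset via spark.emptyDataset, which sidesteps both missing pieces of evidence, the ClassTag needed by parallelize and the Encoder behind .toDS():

    import org.apache.spark.sql.{Dataset, Encoder, Encoders, SparkSession}

    trait MyTrait[T] {
      val sparkSession: SparkSession
      // Evidence the trait cannot conjure up for an abstract T;
      // each concrete class provides it.
      implicit def enc: Encoder[T]

      def createEmptyDataset(): Dataset[T] =
        sparkSession.emptyDataset[T]
    }

    case class Person(name: String, age: Int)

    class PersonSource(val sparkSession: SparkSession) extends MyTrait[Person] {
      // Encoders.product derives an encoder for any case class.
      implicit def enc: Encoder[Person] = Encoders.product[Person]
    }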