scala

How to stream data from Kafka topic to Delta table using Spark Structured Streaming

核能气质少年 submitted on 2021-01-29 15:11:34
Question: I'm trying to understand Databricks Delta and am thinking of doing a POC using Kafka. Basically, the plan is to consume data from Kafka and insert it into a Databricks Delta table. These are the steps that I did:

Create a Delta table on Databricks:

```sql
%sql
CREATE TABLE hazriq_delta_trial2 (
  value STRING
)
USING delta
LOCATION '/delta/hazriq_delta_trial2'
```

Consume data from Kafka:

```scala
import org.apache.spark.sql.types._

val kafkaBrokers = "broker1:port,broker2:port,broker3:port"
val kafkaTopic = "kafkapoc"
```
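A minimal sketch of the step the question is heading toward, assuming the standard Spark Structured Streaming Kafka source and Delta sink; the checkpoint path and offset option are illustrative, not from the post:

```scala
// Read the Kafka topic as a streaming DataFrame.
val inputDf = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", kafkaBrokers)
  .option("subscribe", kafkaTopic)
  .option("startingOffsets", "latest") // assumption: start from new records
  .load()

// Kafka delivers the value column as binary; cast it to match the table
// schema, then stream it into the Delta table created above.
inputDf
  .selectExpr("CAST(value AS STRING) AS value")
  .writeStream
  .format("delta")
  .option("checkpointLocation", "/delta/hazriq_delta_trial2/_checkpoints") // hypothetical path
  .start("/delta/hazriq_delta_trial2")
```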

Implicit conversion for multiple parameters

青春壹個敷衍的年華 submitted on 2021-01-29 15:07:10
Question: Is it possible to implement in Scala an implicit conversion for a group of parameters (without defining them as some class member), like:

```scala
implicit def triple2One(x: Int, s: String, d: Double) = x // just as an example
```

so that I would be able to call it in the code like:

```scala
val x: Int = (1, "test", 2.0)
```

Answer 1: It is possible:

```scala
scala> implicit def iFromISD(isd: (Int, String, Double)): Int = isd._1
iFromISD: (isd: (Int, String, Double))Int

scala> val x: Int = (1, "two", 3.0)
x: Int = 1
```

Naturally,
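The point worth making explicit: an implicit conversion must take exactly one parameter, so the "group of parameters" has to be packed into a single tuple value, which is what the answer does. A self-contained sketch of the same idea (the feature import silences the compiler's implicit-conversion warning):

```scala
import scala.language.implicitConversions

// One parameter of tuple type, not three separate parameters.
implicit def iFromISD(isd: (Int, String, Double)): Int = isd._1

val x: Int = (1, "test", 2.0) // the tuple literal is converted via iFromISD; x == 1
```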

Why can't the compiler select the correct String.contains method when using this lambda shorthand?

牧云@^-^@ submitted on 2021-01-29 14:30:00
Question: Say I want to check if a string contains any of the letters in "cory":

```scala
def hasCory(input: String): Boolean = {
  val myName = "cory"
  input.exists(myName.contains)
}
```

The compiler complains with:

```
error: type mismatch;
 found   : CharSequence => Boolean
 required: Char => Boolean
```

Scala provides the Char-accepting method I want in StringOps, but it appears that the compiler cannot see this method unless I change the code to one of:

```scala
input.exists(myName.contains(_))
input.exists(c => myName.contains(c))
```
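A sketch of why the rewritten forms compile, offered as an explanation rather than as the thread's accepted answer: contains is overloaded. java.lang.String declares contains(CharSequence), while the Char-accepting variant comes from the implicit StringOps wrapper. With the bare method reference myName.contains, the compiler eta-expands the member found on String itself, producing CharSequence => Boolean; with an explicit lambda, the expected type Char => Boolean drives overload resolution toward StringOps:

```scala
def hasCory(input: String): Boolean = {
  val myName = "cory"
  // c is typed as Char, so the Char-accepting contains from StringOps is selected.
  input.exists(c => myName.contains(c))
}

hasCory("hello") // true: 'o' appears in "cory"
```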

Constraining type parameters on case classes and traits

本秂侑毒 submitted on 2021-01-29 14:27:07
Question: I'm a bit confused about how to constrain type parameters so that they will only accept types that have implemented a particular typeclass. Here's some rather contrived sample code:

```scala
// I want to tag my favourite types of number with this trait
trait MyFavouriteTypesOfNumber[A]

// implicits necessary because I cannot directly extend Int or Double
implicit def iLikeInts(i: Int): MyFavouriteTypesOfNumber[Int] =
  new MyFavouriteTypesOfNumber[Int] {}
implicit def iAlsoLikeFloats(f: Float):
```
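For contrast, a minimal sketch of the conventional typeclass encoding, which is likely where the question is heading (this is the standard pattern, not the thread's answer): instances are implicit values rather than conversions, and the constraint is a context bound, so only types with an instance in scope compile:

```scala
trait MyFavouriteTypesOfNumber[A]

object MyFavouriteTypesOfNumber {
  // Implicit instances, not implicit conversions.
  implicit val forInt: MyFavouriteTypesOfNumber[Int] = new MyFavouriteTypesOfNumber[Int] {}
  implicit val forFloat: MyFavouriteTypesOfNumber[Float] = new MyFavouriteTypesOfNumber[Float] {}
}

// A accepts only types with a MyFavouriteTypesOfNumber instance in scope.
case class Favourite[A: MyFavouriteTypesOfNumber](value: A)

Favourite(1)       // compiles: instance for Int exists
Favourite(2.5f)    // compiles: instance for Float exists
// Favourite("no") // rejected: no instance for String
```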

How can I reflect on a field annotation (Java) in a Scala program?

浪子不回头ぞ submitted on 2021-01-29 14:21:35
Question: I'm using Scala 2.13, and I know a lot has been deprecated since older versions. I've got this annotation:

```java
@Inherited
@Target({ElementType.PARAMETER, ElementType.METHOD, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
public @interface Foo {
    int index() default 0;
}
```

(I know... I've got lots of ElementTypes there, but I'm struggling to see where this pops up in reflection, so I wanted to maximize my chances of a hit!) Used like this:

```scala
case class Person(name: String, @Foo(index = 3) age:
```
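One detail worth knowing here, with a hedged sketch (a general Scala 2 mechanism, not necessarily the thread's resolution): an annotation on a case-class constructor parameter stays on the parameter by default, so it never reaches the backing field that Java reflection inspects. The meta-annotation scala.annotation.meta.field redirects it:

```scala
import scala.annotation.meta.field

// @(Foo @field) places the annotation on the generated field rather than the parameter.
case class Person(name: String, @(Foo @field)(index = 3) age: Int)

// With RUNTIME retention, plain Java reflection can now see it.
val foo = classOf[Person].getDeclaredField("age").getAnnotation(classOf[Foo])
println(foo.index()) // 3
```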

Extract Embedded AWS Glue Connection Credentials Using Scala

我的未来我决定 submitted on 2021-01-29 14:17:51
Question: I have a Glue job that reads directly from Redshift, and to do that one has to provide connection credentials. I have created an embedded Glue connection and can extract the credentials with the following PySpark code. Is there a way to do this in Scala?

```python
glue = boto3.client('glue', region_name='us-east-1')

response = glue.get_connection(
    Name='name-of-embedded-connection',
    HidePassword=False
)

table = spark.read.format(
    'com.databricks.spark.redshift'
).option(
    'url',
    'jdbc:redshift://prod
```
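A sketch of the equivalent call in Scala, assuming the AWS SDK for Java v1 Glue client (the same GetConnection API that boto3 wraps); the connection name and property keys are illustrative:

```scala
import com.amazonaws.services.glue.AWSGlueClientBuilder
import com.amazonaws.services.glue.model.GetConnectionRequest

val glue = AWSGlueClientBuilder.standard().withRegion("us-east-1").build()

// Same parameters as the boto3 call above.
val request = new GetConnectionRequest()
  .withName("name-of-embedded-connection")
  .withHidePassword(false)

val properties = glue.getConnection(request).getConnection.getConnectionProperties

// Glue connection properties carry the credentials and the JDBC URL.
val username = properties.get("USERNAME")
val password = properties.get("PASSWORD")
val jdbcUrl  = properties.get("JDBC_CONNECTION_URL")
```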

How to parse new line in Scala grammar with flex/bison?

∥☆過路亽.° submitted on 2021-01-29 14:15:52
Question: I want to parse Scala grammar with flex and bison, but I don't know how to handle the newline token in the Scala grammar. Suppose I lex a newline as a token T_NL. Here's Toy.l, for example:

```
...
[a-zA-Z_][a-zA-Z0-9_]*  { yylval->literal = strdup(yytext); return T_ID; }
\n                      { yylval->token = T_NL; return T_NL; }
[ \t\v\f\r]             { /* skip whitespace */ }
...
```

And here's Toy.y, for example:

```
function_def: 'def' T_ID '(' argument_list ')' return_expression '=' expression T_NL
            ;

argument_list: argument
             |
```

Def Macro, pass parameter from a value

若如初见. submitted on 2021-01-29 13:52:21
Question: I have a working macro, i.e.:

```scala
object Main extends App {
  println("Testing assert macro...")
  val result = Asserts.assert(false, "abc")
}
```

and

```scala
import scala.reflect.macros.blackbox.Context
import scala.language.experimental.macros

object Asserts {
  val assertionsEnabled: Boolean = true

  def assert(cond: Boolean, msg: String): Unit = macro assertImpl

  def assertImpl(c: Context)(cond: c.Expr[Boolean], msg: c.Expr[String]): c.Expr[Unit] = {
    import c.universe._
    cond.tree match {
      case Literal(Constant(cond
```
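The pattern match above is the crux: Literal(Constant(...)) only matches when the argument is a compile-time literal, which is exactly why passing a value (the question's title) needs a second branch. A hedged sketch of how such a macro body is commonly completed; this is not the poster's truncated code:

```scala
def assertImpl(c: Context)(cond: c.Expr[Boolean], msg: c.Expr[String]): c.Expr[Unit] = {
  import c.universe._
  cond.tree match {
    case Literal(Constant(condValue: Boolean)) =>
      // The condition is a literal: it can be decided at compile time.
      if (condValue) c.Expr[Unit](q"()")
      else c.Expr[Unit](q"throw new AssertionError($msg)")
    case _ =>
      // The condition is a value (e.g. a val): emit a runtime check instead.
      c.Expr[Unit](q"if (!$cond) throw new AssertionError($msg)")
  }
}
```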

How to set expiry time for all Ignite caches?

[亡魂溺海] submitted on 2021-01-29 13:08:26
Question: I am starting Ignite with a specific configuration in which I specified an expiry policy, but expiration is not working. When I specify a cache name in that property, it works fine. I added configuration like below:

```xml
<property name="expiryPolicyFactory">
  <bean class="javax.cache.expiry.CreatedExpiryPolicy" factory-method="factoryOf">
    <constructor-arg>
      <bean class="javax.cache.expiry.Duration">
        <constructor-arg value="MINUTES"/>
        <constructor-arg value="5"/>
      </bean>
      <
```
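One common way to apply an expiry policy to every cache is a cache template named "*", which Ignite applies to caches created without an explicit configuration. A hedged sketch of that approach in Scala (an assumption about the fix, expressed programmatically rather than in Spring XML):

```scala
import java.util.concurrent.TimeUnit
import javax.cache.expiry.{CreatedExpiryPolicy, Duration}
import org.apache.ignite.Ignition
import org.apache.ignite.configuration.{CacheConfiguration, IgniteConfiguration}

// The "*" name makes this a wildcard template rather than a concrete cache.
val template = new CacheConfiguration[Any, Any]("*")
  .setExpiryPolicyFactory(CreatedExpiryPolicy.factoryOf(new Duration(TimeUnit.MINUTES, 5)))

val ignite = Ignition.start(new IgniteConfiguration().setCacheConfiguration(template))

// Caches created afterwards inherit the 5-minute created-entry expiry.
val cache = ignite.getOrCreateCache[String, String]("myCache")
```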

How to extract types from a tuple that implements a typeclass

谁都会走 submitted on 2021-01-29 13:06:05
Question: Function a can receive a single argument or a tuple; these arguments need to be members of the typeclass StringIdentifiable. How can I extract and decompose the tuple type into types that also have instances of the typeclass?

```scala
@typeclass trait StringIdentifiable[M] {
  def identify(id: M): String
}

def a[K: StringIdentifiable](k: K) {
  k match {
    case (k1) =>
      implicitly[StringIdentifiable[K]].identify(k1)
    case (k1, k2) =>
      implicitly[StringIdentifiable[k1.type]].identify(k1)
      implicitly[StringIdentifiable[k2.type]]
```
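A hedged sketch of the usual way out, rather than the thread's accepted answer: instead of decomposing K inside the pattern match (where the component types are lost), derive an instance for pairs whose components already have instances. A plain trait stands in for simulacrum's @typeclass to keep the sketch self-contained:

```scala
trait StringIdentifiable[M] {
  def identify(id: M): String
}

object StringIdentifiable {
  implicit val intId: StringIdentifiable[Int]       = id => id.toString
  implicit val stringId: StringIdentifiable[String] = id => id

  // A pair is identifiable whenever both of its components are.
  implicit def pairId[A, B](implicit
      idA: StringIdentifiable[A],
      idB: StringIdentifiable[B]
  ): StringIdentifiable[(A, B)] =
    pair => s"${idA.identify(pair._1)},${idB.identify(pair._2)}"
}

def a[K: StringIdentifiable](k: K): String =
  implicitly[StringIdentifiable[K]].identify(k)

a(42)          // "42"
a((1, "user")) // "1,user"
```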