scala

How to understand the two sentences about “Covariance” and “Contravariance”?

耗尽温柔 submitted on 2021-02-19 02:33:11
Question: I'm reading the first section of "Scala in Depth", which contains two sentences about "covariance" and "contravariance": "Covariance (+T or ? extends T) is when a type can be coerced down the inheritance hierarchy. Contravariance (-T or ? super T) is when a type can be coerced up the inheritance hierarchy." I have read some documents about covariance and contravariance, but I can't understand the phrases "coerced down" and "coerced up" in this context. Answer 1: [TOP / ABSTRACT]
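To make "coerced down/up" concrete, here is a minimal Scala sketch (the Animal/Dog names are my own illustration, not from the book): the covariant List[+A] lets a List[Dog] stand in for a List[Animal], while the contravariant argument of Function1[-T, +R] lets an Animal => Unit stand in for a Dog => Unit.

```scala
class Animal
class Dog extends Animal

// Covariance: List is declared List[+A], so List[Dog] <: List[Animal].
val dogs: List[Dog] = List(new Dog)
val animals: List[Animal] = dogs // accepted where a List[Animal] is expected

// Contravariance: Function1 is declared Function1[-T, +R], so a function
// that can handle any Animal also counts as a function that handles Dogs.
val feedAnimal: Animal => Unit = _ => println("fed an animal")
val feedDog: Dog => Unit = feedAnimal // accepted where Dog => Unit is expected
```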

What is the difference between Null, Nil and Nothing?

旧街凉风 submitted on 2021-02-19 01:48:06
Question: I am learning Scala and am a bit confounded by the difference between the following types: Null, Nil and Nothing. Can someone please help explain the difference to me? From what I gather, Nil is used to describe an empty list. Answer 1: Nothingness in Scala There are 7 cases where you want to represent the concept of nothingness in Scala. Nil - used for representing empty lists, or collections of zero length. For sets you can use Set.empty. None - one of two subclasses of the Option type, the
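A small REPL-style sketch of the distinction (the value names here are mine):

```scala
// Nil: the empty List. Its type is List[Nothing], which conforms to any
// List[A] because List is covariant.
val noInts: List[Int] = Nil

// None: the Option case that represents the absence of a value.
val noName: Option[String] = None

// Null: the type of the null reference; a subtype of all reference types.
val nothingHere: String = null

// Nothing: the bottom type, which has no values at all; it is the inferred
// type of expressions that never return normally, such as a thrown exception.
def boom(msg: String): Nothing = throw new RuntimeException(msg)
```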

restapi(0) - Platform data maintenance: a foreword

a 夏天 submitted on 2021-02-19 01:46:01
Driven by cloud computing, software systems are developing toward platformization. Cloud platform systems are generally distributed cluster systems built with big-data technologies, and akka provides fairly complete development support in this area. In my previous blog series on CQRS, I introduced some of akka's development techniques as dictated by practical application requirements. The CQRS pattern focuses on operational flow control and mainly deals with managing transaction data. So how should the foundational data that plays a validating role while transaction data is produced, such as user information, product information, and payment-type information, be maintained? First, foundational data should also live at the platform level, but it is collected and maintained at the front end of the system, for example through web interfaces, so a platform's foundational-data maintenance system combines front end and back end. An open platform system should be able to accommodate front-end systems of every kind. The usual approach is for the platform to integrate with front-end systems through a defined set of APIs. These APIs must follow industry standards and use widely adopted, general-purpose technology so that they can support the development of heterogeneous front-end systems. Against these requirements, REST-style http integration is acceptable to more developers than gRPC or GraphQL. In the CQRS series I covered http communication only piecemeal, as practical needs arose, with akka-http as one of several system-integration tools. In this restapi series I want to use akka-http systematically to build a complete set of REST-style data-maintenance and data-exchange APIs, covering not only CRUD but also network security, file exchange, and other functions. My plan is to use akka-http to build a REST-CRUD framework for a platform data-maintenance API.
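As a taste of what such a framework looks like, here is a minimal akka-http route sketch (the route names and port are my own placeholders, not part of the series yet):

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

object RestApiDemo extends App {
  // akka-http 10.2+ accepts a typed ActorSystem via ClassicActorSystemProvider.
  implicit val system: ActorSystem[Nothing] =
    ActorSystem(Behaviors.empty, "restapi")

  // A minimal CRUD-flavored route: GET lists, POST creates.
  val route =
    pathPrefix("api" / "users") {
      concat(
        get { complete("list users") },
        post { complete("create user") }
      )
    }

  Http().newServerAt("localhost", 8080).bind(route)
}
```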

Scala pattern matching: How to match on an element inside a list?

末鹿安然 submitted on 2021-02-18 22:28:20
Question: Is it possible to rewrite the following code using Scala pattern matching? val ls: List[String] = ??? // some list of strings val res = if (ls.contains("foo")) FOO else if (ls.contains("bar")) BAR else SOMETHING_ELSE Answer 1: You could implement this using a function like def onContains[T](xs: Seq[String], actionMappings: (String, T)*): Option[T] = { actionMappings collectFirst { case (str, v) if xs contains str => v } } and use it like this: val x = onContains(items, "foo" -> FOO, "bar" -> BAR)
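For completeness, here is a self-contained version of the answer's helper (the FOO/BAR/SOMETHING_ELSE values are stand-ins I made up, since the question leaves them abstract):

```scala
sealed trait Action
case object FOO extends Action
case object BAR extends Action
case object SOMETHING_ELSE extends Action

// collectFirst applies the partial function to the first mapping whose
// key is contained in xs, mirroring the original if/else-if chain.
def onContains[T](xs: Seq[String], actionMappings: (String, T)*): Option[T] =
  actionMappings.collectFirst { case (str, v) if xs.contains(str) => v }

val ls = List("baz", "bar")
val res = onContains(ls, "foo" -> FOO, "bar" -> BAR).getOrElse(SOMETHING_ELSE)
// res == BAR
```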

How do I ignore decoding failures in a JSON array?

我们两清 submitted on 2021-02-18 22:15:30
Question: Suppose I want to decode some values from a JSON array into a case class with circe. The following works just fine: scala> import io.circe.generic.auto._, io.circe.jawn.decode import io.circe.generic.auto._ import io.circe.jawn.decode scala> case class Foo(name: String) defined class Foo scala> val goodDoc = """[{ "name": "abc" }, { "name": "xyz" }]""" goodDoc: String = [{ "name": "abc" }, { "name": "xyz" }] scala> decode[List[Foo]](goodDoc) res0: Either[io.circe.Error,List[Foo]] = Right(List
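One way to get the lenient behavior the title asks about is to decode each element as raw Json first and keep only the elements that decode successfully. A sketch (the helper name is mine, and this is one approach among several circe supports):

```scala
import io.circe.{Decoder, Json}
import io.circe.generic.auto._
import io.circe.jawn.decode

case class Foo(name: String)

// Decode the array as raw Json values, then drop the elements that fail
// to decode as A instead of failing the whole document.
def decodeLenient[A: Decoder](doc: String): Either[io.circe.Error, List[A]] =
  decode[List[Json]](doc).map(_.flatMap(_.as[A].toOption))

val mixedDoc = """[{ "name": "abc" }, { "oops": 1 }, { "name": "xyz" }]"""
decodeLenient[Foo](mixedDoc) // Right(List(Foo(abc), Foo(xyz)))
```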

Spark Streaming MQTT

心已入冬 submitted on 2021-02-18 19:12:20
Question: I've been using Spark to stream data from Kafka, and it's pretty easy. I thought using the MQTT utils would also be easy, but for some reason it is not. I'm trying to execute the following piece of code: val sparkConf = new SparkConf(true).setAppName("amqStream").setMaster("local") val ssc = new StreamingContext(sparkConf, Seconds(10)) val actorSystem = ActorSystem() implicit val kafkaProducerActor = actorSystem.actorOf(Props[KafkaProducerActor]) MQTTUtils.createStream(ssc, "tcp://localhost
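The original snippet is cut off after "tcp://localhost". Below is a minimal, self-contained sketch of what a working stream typically looks like with the spark-streaming-mqtt connector (the broker port and topic are my assumptions):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.mqtt.MQTTUtils

// local[2]: the MQTT receiver occupies one core, so "local" alone would
// leave no core for processing and the stream would appear to hang.
val sparkConf = new SparkConf(true).setAppName("amqStream").setMaster("local[2]")
val ssc = new StreamingContext(sparkConf, Seconds(10))

// Broker URL and topic are placeholders; adjust to your MQTT setup.
val lines = MQTTUtils.createStream(
  ssc, "tcp://localhost:1883", "sensors/temperature", StorageLevel.MEMORY_ONLY)

lines.print()
ssc.start()
ssc.awaitTermination()
```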

How to use type-level functions to create static types, dynamically?

做~自己de王妃 submitted on 2021-02-18 17:39:14
Question: In TypeScript, there are type-level functions that allow creating new types based on given literal types/specifications (see Mapped Types, Conditional Types, etc.). For instance, here is such a function, let's say provided by a library author: type FromSpec<S> = { [K in keyof S]: S[K] extends "foo" ? ExampleType : never }; Its purpose is, given a specification S in the form of a map of string keys and arbitrary literals, to create a new type in the form of a map with the same set of keys and with
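The closest counterpart in this page's language to TypeScript's conditional and mapped types is Scala 3's match types, which likewise compute a type from the shape of another type. A minimal sketch (the example type is mine, not from the question):

```scala
// Scala 3 match type: a type-level function that inspects its argument
// and produces a different type for each case.
type Elem[X] = X match
  case String   => Char
  case Array[t] => t
  case List[t]  => t

// These compile because the match type reduces at compile time:
val c: Elem[String] = 'a'    // Elem[String] reduces to Char
val n: Elem[List[Int]] = 1   // Elem[List[Int]] reduces to Int
```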