json4s

Cannot deserialize a tuple

Submitted by 泄露秘密 on 2021-02-10 05:23:54
Question: If I do the following:

    import org.json4s.DefaultFormats
    import org.json4s.jackson.Serialization.{read, write}

    implicit val formats = DefaultFormats
    val tuple = (5.0, 5.0)
    val json = write(tuple)
    println("Write: " + json)
    println("Read: " + read[(Double, Double)](json))

I get the following output:

    Write: {"_1$mcD$sp":5.0,"_2$mcD$sp":5.0}
    Exception in thread "main" org.json4s.package$MappingException: No usable value for _1
    Did not find value which can be converted into double at org.json4s
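The `_1$mcD$sp` field names come from Scala's `@specialized` encoding of `Tuple2[Double, Double]`, which json4s cannot map back onto `_1`/`_2` when reading. A common workaround is to serialize a case class instead of a raw tuple; a minimal sketch, using a hypothetical `Point` wrapper not in the original question:

```scala
import org.json4s.DefaultFormats
import org.json4s.jackson.Serialization.{read, write}

// Hypothetical wrapper: case class fields keep their plain names,
// so the round trip works where the specialized tuple fields did not
case class Point(x: Double, y: Double)

object TupleWorkaround extends App {
  implicit val formats = DefaultFormats
  val json = write(Point(5.0, 5.0))
  println("Write: " + json)             // {"x":5.0,"y":5.0}
  println("Read: " + read[Point](json)) // Point(5.0,5.0)
}
```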

EMR 5.21 , Spark 2.4 - Json4s Dependency broken

Submitted by 删除回忆录丶 on 2021-02-09 20:57:35
Question: In EMR 5.21, the Spark-HBase integration is broken: df.write.options().format().save() fails. The reason is json4s-jackson version 3.5.3 in Spark 2.4 / EMR 5.21; it works fine in EMR 5.11.2 / Spark 2.2 with json4s-jackson version 3.2.11. The problem is that this is EMR, so I can't rebuild Spark with a lower json4s. Is there any workaround?

Error:

    py4j.protocol.Py4JJavaError: An error occurred while calling o104.save.
    : java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse(Lorg/json4s/JsonInput;Z
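One workaround that avoids rebuilding Spark is to shade json4s inside the application jar, so the connector's 3.2.x copy never collides with the 3.5.3 that Spark 2.4 puts on the classpath. A sketch assuming the job is built with sbt and the sbt-assembly plugin:

```scala
// build.sbt (requires the sbt-assembly plugin)
// Rename org.json4s packages inside the fat jar so the application's
// own json4s version is loaded instead of Spark's bundled 3.5.3.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shadedjson4s.@1").inAll
)
```

The `spark.driver.userClassPathFirst` / `spark.executor.userClassPathFirst` settings are sometimes suggested instead, but they are marked experimental and can introduce other classpath conflicts.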

Can't find ScalaSig for class java.lang.Object

Submitted by 自古美人都是妖i on 2021-02-08 06:39:38
Question: I didn't find the answer in these topics: first, second. I have the following problem. I have a case class named Foo:

    case class Foo(a: Int, b: List[Int])

When I need to make an AST of this class I invoke Extraction.decompose(<instance of Foo>) and get an AST representation of the foo instance. But if I make field b private:

    case class Foo(a: Int, private val b: List[Int])

I get an org.json4s.package$MappingException: Can't find ScalaSig for class java.lang.Object exception. This is only true for private
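json4s discovers case class fields through reflection on the compiler-emitted ScalaSig, and a `private val` constructor parameter hides the accessor it looks for, which is consistent with the error above. The simplest workaround is to keep the field public and restrict mutation at the API level instead; a minimal sketch:

```scala
import org.json4s.{DefaultFormats, Extraction}

// Keeping b public lets json4s find its accessor via the ScalaSig;
// if b must stay hidden, a custom serializer for Foo is the alternative
case class Foo(a: Int, b: List[Int])

object DecomposeDemo extends App {
  implicit val formats = DefaultFormats
  val ast = Extraction.decompose(Foo(1, List(2, 3)))
  println(ast) // a JObject with fields a and b
}
```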

Spark with json4s, parse function raise java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse$default$3()Z

Submitted by 一个人想着一个人 on 2021-01-27 23:07:54
Question: I wrote a function to process a stream with Spark Streaming, and I encountered:

    java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse$default$3()Z

I have checked the Spark version (1.6.0) and Scala version (2.10.5). They are consistent with the json4s jar version (json4s-jackson_2.10-3.3.0.jar). I cannot figure out what happened. The following is the function code:

    import org.json4s._
    import org.json4s.jackson.Serialization.{read => JsonRead}
    import org.json4s.jackson.JsonMethods._
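The `parse$default$3()Z` symbol is the synthetic accessor for a third default parameter of `JsonMethods.parse` that exists in json4s 3.3.0 but not in the 3.2.x line that Spark 1.6 ships. Code compiled against 3.3.0 therefore fails at runtime, when Spark's older jar wins on the classpath. The usual fix is to compile against the json4s version Spark provides; a build.sbt sketch:

```scala
// build.sbt -- pin json4s to the 3.2.x line that Spark 1.6 bundles,
// so the compiled call sites match the jar loaded at runtime
libraryDependencies += "org.json4s" %% "json4s-jackson" % "3.2.11" % "provided"
```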

Structural equality affected by location of case class definition after deserialisation

Submitted by 拟墨画扇 on 2020-08-07 06:00:02
Question: Why is structural equality comparison affected, after deserialisation to a case class instance, by whether the case class definition is inside or outside another class? For example, the assertion in the following snippet:

    package example

    import org.json4s.DefaultFormats
    import org.json4s.native.JsonMethods.parse

    class Foo {
      case class Person(name: String)
      def bar = {
        implicit val formats = DefaultFormats
        val expected = Person(name = "picard")
        val actual = parse("""{"name": "picard"}""")
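A likely explanation is the hidden `$outer` reference that every inner class carries: to extract an inner `Person`, json4s must fabricate an enclosing instance, and the two `Person` values can then differ in their outer pointers even though `name` matches. Defining the case class at the top level removes the outer reference entirely; a sketch of the working arrangement, reusing the names from the snippet above:

```scala
package example

import org.json4s.DefaultFormats
import org.json4s.native.JsonMethods.parse

// Top-level definition: no $outer field, so equality depends
// only on the declared fields
case class Person(name: String)

class Foo {
  def bar: Boolean = {
    implicit val formats = DefaultFormats
    val expected = Person(name = "picard")
    val actual = parse("""{"name": "picard"}""").extract[Person]
    expected == actual // true when Person is top-level
  }
}
```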

Producing json in a Scala app using json4s

Submitted by 别说谁变了你拦得住时间么 on 2020-01-24 09:33:06
Question: I am attempting to produce JSON in a Scala app using json4s. Fairly straightforward. Here's a sample value I put together to test it in my Scalatra app:

    import org.json4s._
    import org.json4s.JsonDSL._

    object JsonStub {
      val getPeople =
        ("people" ->
          ("person_id" -> 5) ~
          ("test_count" -> 5))
    }

In my controller, I simply have:

    import org.json4s._
    import org.json4s.JsonDSL._
    import org.json4s.{DefaultFormats, Formats}

    class FooController(mongoDb: MongoClient)(implicit val swagger: Swagger)
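The DSL above builds a `JValue`; to turn it into a JSON string, render it and serialize with `compact` (or `pretty`). A minimal sketch:

```scala
import org.json4s.JsonDSL._
import org.json4s.jackson.JsonMethods.{compact, render}

object JsonStubDemo extends App {
  val getPeople = ("people" -> ("person_id" -> 5) ~ ("test_count" -> 5))
  // render turns the DSL value into a JValue; compact serializes it
  println(compact(render(getPeople))) // {"people":{"person_id":5,"test_count":5}}
}
```

In Scalatra specifically, mixing `JacksonJsonSupport` into the controller and returning the `JValue` from the route body also works, since the framework then handles the rendering.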

Flat nested JSON to header level using scala

Submitted by 心不动则不痛 on 2020-01-16 09:15:40
Question: Below is my sample JSON, which can be nested arbitrarily deep:

    {
      "key1": { "keyA": "valueI" },
      "key2": { "keyB": "valueII" },
      "key3": [
        { "a": 1, "b": 2 },
        { "a": 1, "b": 2 }
      ]
    }

This should be split into 2 JSONs because key3 has 2 array elements. The output should look like this:

    JSON1 = {
      "key1_keyA": "valueI",
      "key2_keyB": "valueII",
      "key3_a": 1,
      "key3_b": 2
    }

    JSON2 = {
      "key1_keyA": "valueI",
      "key2_keyB": "valueII",
      "key3_a": 1,
      "key3_b": 2
    }

I am getting this kind of JSON from the source and reading it from
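A sketch of the flattening step over json4s `JValue`s, joining nested object keys with `_` as in the desired output. This is only the recursive flatten; producing one flat object per `key3` element would be a second pass over each `JArray`, as noted in the comments:

```scala
import org.json4s._

object Flatten {
  // Recursively flatten nested objects, prefixing keys with parent_key_.
  // Arrays are kept as-is here; a second pass would emit one flattened
  // object per element of each JArray to produce JSON1, JSON2, ...
  def flatten(jv: JValue, prefix: String = ""): List[(String, JValue)] =
    jv match {
      case JObject(fields) =>
        fields.flatMap { case (k, v) =>
          flatten(v, if (prefix.isEmpty) k else s"${prefix}_$k")
        }
      case other => List(prefix -> other)
    }
}
```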