typesafe-config

Passing external yml file in my spark-job/code not working throwing “Can't construct a java object for tag:yaml.org,2002”

流过昼夜 submitted on 2019-12-17 14:32:27
Question: I am using Spark 2.4.1 and Java 8. I am trying to load an external property file while submitting my Spark job with spark-submit, using the Typesafe dependency below:

```xml
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.3.1</version>
```

In my Spark driver class MyDriver.java I load the YML file as follows:

```java
String ymlFilename = args[1].toString();
Optional<QueryEntities> entities = InputYamlProcessor.process(ymlFilename);
```

I have all the code here …
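That "Can't construct a java object for tag:yaml.org,2002:…" message comes from SnakeYAML and usually means the class named by the YAML tag cannot be instantiated reflectively. A minimal sketch, assuming SnakeYAML 1.x; `QueryEntities` and `InputYamlProcessor` are names from the question, but the bean field here is hypothetical:

```scala
import org.yaml.snakeyaml.Yaml
import org.yaml.snakeyaml.constructor.Constructor
import scala.io.Source

// Hypothetical bean; SnakeYAML needs a no-arg constructor and setters.
class QueryEntities {
  @scala.beans.BeanProperty var entities: java.util.List[String] =
    new java.util.ArrayList[String]()
}

object InputYamlProcessor {
  def process(ymlFilename: String): Option[QueryEntities] = {
    val source = Source.fromFile(ymlFilename)
    try {
      // An explicit Constructor tells SnakeYAML which class to build,
      // instead of resolving the !! tag reflectively at load time.
      val yaml = new Yaml(new Constructor(classOf[QueryEntities]))
      Option(yaml.loadAs(source.mkString, classOf[QueryEntities]))
    } finally source.close()
  }
}
```

If the file is shipped with `spark-submit --files application.yml`, resolve its local path with `org.apache.spark.SparkFiles.get("application.yml")` rather than the original path.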

Scala Standalone JAR with a conf Folder

旧城冷巷雨未停 submitted on 2019-12-13 12:48:38
Question: I'm using the sbt-assembly plugin to create a standalone jar file. My project folder structure looks like this:

```
MyProject
- src
  - main
    - scala
      - mypackages and source files
- conf      // contains application.conf, application.test.conf and so on
- test
- project   // contains all the build related files
- README.md
```

I now want to be able to run the fat jar that I produce against a version of application.conf that I specify as a system property. So here is what I do in my unit test: System…
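Typesafe config already supports this kind of override out of the box. A sketch relying on the standard `config.file` / `config.resource` system properties; `some.key` is a placeholder:

```scala
import com.typesafe.config.ConfigFactory

object Main {
  def main(args: Array[String]): Unit = {
    // ConfigFactory.load() honours the standard system-property overrides:
    //   java -Dconfig.file=conf/application.test.conf -jar myproject.jar
    //   java -Dconfig.resource=application.test.conf  -jar myproject.jar
    // With no override it falls back to application.conf on the classpath.
    val config = ConfigFactory.load()
    println(config.getString("some.key")) // hypothetical key
  }
}
```

Inside a unit test the same effect needs `System.setProperty("config.file", …)` followed by `ConfigFactory.invalidateCaches()`, since `load()` caches the default config.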

Passing typesafe config conf files to DataProcSparkOperator

℡╲_俬逩灬. submitted on 2019-12-11 12:42:37
Question: I am using Google Dataproc to submit Spark jobs and Google Cloud Composer to schedule them. Unfortunately, I am facing difficulties. I rely on .conf files (Typesafe config files) to pass arguments to my Spark jobs. I am using the following Python code for the Airflow Dataproc operator:

```python
t3 = dataproc_operator.DataProcSparkOperator(
    task_id='execute_spark_job_cluster_test',
    dataproc_spark_jars='gs://snapshots/jars/pubsub-assembly-0.1.14-SNAPSHOT.jar',
    cluster_name='cluster',
    main_class='com…
```
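On the job side, one common pattern (a sketch, not the operator's documented contract) is to ship the .conf file alongside the job and point the driver JVM at it, e.g. `spark-submit --files job.conf --conf spark.driver.extraJavaOptions=-Dconfig.file=job.conf`; the Scala job then needs nothing special:

```scala
import com.typesafe.config.{Config, ConfigFactory}

object PubSubJob {
  def main(args: Array[String]): Unit = {
    // With -Dconfig.file pointing at the shipped .conf, load() reads it;
    // otherwise it falls back to reference.conf / application.conf in the jar.
    val config: Config = ConfigFactory.load()
    println(config.getString("job.subscription")) // hypothetical key
  }
}
```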

How to remove datatype from value in config file with typesafe?

让人想犯罪 __ submitted on 2019-12-11 04:37:14
Question: I have a config file beam-template.conf which has different properties like:

```
beam.agentsim.agents.rideHail.keepMaxTopNScores = "int | 1"
beam.agentsim.agents.rideHail.minScoreThresholdForRepositioning = "double | 0.1"
```

I am trying to get the property values like this:

```scala
ConfigFactory.parseFile(new File(s"$path/beam-template.conf")).entrySet().asScala.foreach { entry =>
  if (!userConf.hasPathOrNull(entry.getKey)) {
    logString += "\nKey= " + entry.getKey + " ,Value= " + entry.getValue.render
  }
}
```

So…
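Since `"int | 1"` is just a quoted string as far as HOCON is concerned, one hedged sketch (Scala 2.13 converters; `cleanValue` is a hypothetical helper of mine) is to strip the "type |" prefix after rendering:

```scala
import java.io.File
import com.typesafe.config.ConfigFactory
import scala.jdk.CollectionConverters._

object StripTypePrefix {
  // "\"int | 1\"" -> "1": drop the surrounding quotes, keep the last
  // '|'-separated segment, trim whitespace.
  def cleanValue(rendered: String): String =
    rendered.stripPrefix("\"").stripSuffix("\"").split('|').last.trim

  def main(args: Array[String]): Unit = {
    val conf = ConfigFactory.parseFile(new File("beam-template.conf"))
    conf.entrySet().asScala.foreach { entry =>
      println(s"Key= ${entry.getKey} ,Value= ${cleanValue(entry.getValue.render)}")
    }
  }
}
```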

*Ordered* objects in Play (Scala) typesafe config

…衆ロ難τιáo~ submitted on 2019-12-11 04:14:50
Question: How do I access the ordered list of customers in the following .conf in Play 2.6.x (Scala)?

```
customers {
  "cust1" {
    env1 { att1: "str1", att2: "str2" }
    env2 { att1: "str3", att2: "str5" }
    env3 { att1: "str2", att2: "str6" }
    env4 { att1: "str1", att2: "str2" }
  }
  "cust2" {
    env1 { att1: "faldfjalfj", att2: "reqwrewrqrq" }
    env2 { att1: "falalfj", att2: "reqwrrq" }
  }
  "cust3" {
    env3 { att1: "xvcbzxbv", att2: "hello" }
  }
}
```

The expected result is List("cust1", "cust2", "cust3") in this example.

Answer 1: The following example should…
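HOCON objects are unordered maps, so the key order inside `customers { … }` is not guaranteed to survive parsing. A sketch of the usual restructuring, moving the data into a HOCON list; the `name` field is my assumption, not from the question:

```scala
import com.typesafe.config.ConfigFactory
import scala.jdk.CollectionConverters._

object OrderedCustomers {
  // customers = [
  //   { name = "cust1", env1 { att1 = "str1", att2 = "str2" } },
  //   { name = "cust2", env1 { att1 = "faldfjalfj", att2 = "reqwrewrqrq" } },
  //   { name = "cust3", env3 { att1 = "xvcbzxbv", att2 = "hello" } }
  // ]
  def main(args: Array[String]): Unit = {
    val config = ConfigFactory.load()
    // Lists, unlike objects, preserve element order in HOCON.
    val ordered: List[String] =
      config.getConfigList("customers").asScala.toList.map(_.getString("name"))
    println(ordered) // List(cust1, cust2, cust3), in file order
  }
}
```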

scala typesafe config - How to load conf file from classpath not from top level resources

两盒软妹~` submitted on 2019-12-11 03:55:50
Question: I am using Scala Typesafe config (version 1.2.1) in one of my projects to read the application.conf file. My project depends on multiple other projects, and I create a jar-with-dependencies to run them together. Problem: those projects also use Typesafe config and have application.conf files at the top level of their resources, and my Maven jar-with-dependencies picks up only one application.conf on the classpath and drops the rest (I tried using the Maven Shade plugin to merge these conf files…
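One workaround sketch, assuming each module can move its config to a namespaced resource path (`mymodule/application.conf` is a hypothetical name) so that fat-jar packaging cannot collapse the duplicates:

```scala
import com.typesafe.config.{Config, ConfigFactory}

object NamespacedConfig {
  // parseResources loads from an explicit classpath location, so it does
  // not race with whichever top-level application.conf survived the merge.
  val moduleConf: Config = ConfigFactory.parseResources("mymodule/application.conf")
  // Layer the module config over whatever application.conf won the race.
  val config: Config = moduleConf.withFallback(ConfigFactory.load()).resolve()
}
```

For libraries, the usual convention is to ship defaults in reference.conf instead, which shading tools can concatenate (e.g. the Maven Shade AppendingTransformer), leaving a single application.conf to the application.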

java.lang.ClassNotFoundException: scala.Int when using akka 2.5.6 with sbt version 1.0.2

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-10 13:06:54
Question: I am trying to use akka-remoting version 2.5.4 with the latest sbt, 1.0.2. When I use sbt version 0.13.15 or 0.13.16 it works very well. But when I use sbt version 1.0.2 as here, I get the runtime exception below:

```
[error] (run-main-0) java.lang.ClassNotFoundException: scala.Int
[error] java.lang.ClassNotFoundException: scala.Int
[error]     at sbt.internal.inc.classpath.ClasspathFilter.loadClass(ClassLoaders.scala:74)
[error]     at java.lang.ClassLoader.loadClass…
```
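A frequently cited workaround for this class of error under sbt 1.x (a sketch, not a guaranteed fix) is to fork the run, so Akka's reflective serialization sees a plain JVM classpath instead of sbt's layered class loaders:

```scala
// build.sbt
fork in run := true
```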

akka java programmatic override configuration

蹲街弑〆低调 submitted on 2019-12-10 11:27:13
Question: The few topics I can find about this are for Scala rather than Java, and none address remote actors. I have a base configuration file (SERVER_CONFIG_FILE):

```
include "akka-common"

TheSystem {
  akka {
    actor {
      provider = "akka.remote.RemoteActorRefProvider"
      deployment {
        /OtherSupervisor {
          remote = "akka://OtherSystem@127.0.0.1:8553"
        }
      }
    }
    remote {
      transport = "akka.remote.netty.NettyRemoteTransport"
      netty {
        hostname = "127.0.0.1"
        port = 8552
      }
    }
  }
}
```

I want to load it in my program, and then…
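The typical layering, sketched here in Scala (the Java ConfigFactory calls are identical; `serverConfigFile` stands in for the question's SERVER_CONFIG_FILE), parses the overrides first and then falls back to the file, because the left side of `withFallback` wins:

```scala
import java.io.File
import akka.actor.ActorSystem
import com.typesafe.config.ConfigFactory

object ServerMain {
  def main(args: Array[String]): Unit = {
    val serverConfigFile = "server.conf" // hypothetical path

    // Values parsed here override the same paths from the file below.
    val overrides = ConfigFactory.parseString(
      "TheSystem.akka.remote.netty.port = 8554")

    val config = overrides
      .withFallback(ConfigFactory.parseFile(new File(serverConfigFile)))
      .resolve()

    // The file's root is TheSystem { akka { ... } }, so hand the actor
    // system the subtree whose root contains the akka block.
    val system = ActorSystem("TheSystem", config.getConfig("TheSystem"))
  }
}
```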

Typesafe/Hocon config: variable substitution: reference path

瘦欲@ submitted on 2019-12-10 01:08:34
Question: We have a project with huge configuration files built using HOCON. There is an intention to use variables where possible, to create a template_section and set some values in the template based on certain options. The problem is that while using variables in this config, I have to refer to the absolute path all the time. Is it somehow possible to use a relative reference (if the properties are located on the same level)? Example:

```
foo {
  bar = 4
  baz = ${foo.bar}    // works perfectly
  baz = ${[this].bar} // can I do smth…
```
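As far as the HOCON spec goes, substitutions always resolve against the root of the whole tree; there is no `${this.…}` relative form. A small sketch showing the consequence (the parseString content is illustrative):

```scala
import com.typesafe.config.ConfigFactory

object RelativeSubstitution {
  def main(args: Array[String]): Unit = {
    val conf = ConfigFactory.parseString(
      """
        |template { bar = 4, baz = ${template.bar} }
        |foo = ${template}
        |foo { bar = 7 }
      """.stripMargin).resolve()

    // baz was written against the absolute path template.bar, so copying
    // the template does not re-point it at foo.bar:
    println(conf.getInt("foo.bar")) // 7
    println(conf.getInt("foo.baz")) // 4, not 7
  }
}
```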