How to operate numPartitions, lowerBound, upperBound in the spark-jdbc connection?
Question: I am trying to read a table from a Postgres database using spark-jdbc. For that I have come up with the following code:

```scala
object PartitionRetrieval {
  var conf = new SparkConf().setAppName("Spark-JDBC")
    .set("spark.executor.heartbeatInterval", "120s")
    .set("spark.network.timeout", "12000s")
    .set("spark.default.parallelism", "20")
  val log = LogManager.getLogger("Spark-JDBC Program")
  Logger.getLogger("org").setLevel(Level.ERROR)
  val conFile = "/home/myuser/ReconTest/inputdir/testconnection.properties"
  val
```
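For context, the options the question's title asks about (`numPartitions`, `lowerBound`, `upperBound`, together with `partitionColumn`) are standard Spark JDBC read options. Below is a minimal hedged sketch of how they are typically passed; the URL, table, column, and credential values are placeholders, not taken from the question:

```scala
import org.apache.spark.sql.SparkSession

object JdbcPartitionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Spark-JDBC")
      .getOrCreate()

    // Spark splits the read into numPartitions parallel queries on
    // partitionColumn, with stride (upperBound - lowerBound) / numPartitions.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://dbhost:5432/mydb") // placeholder URL
      .option("dbtable", "my_table")                       // placeholder table
      .option("user", "myuser")                            // placeholder credentials
      .option("password", "mypassword")
      .option("partitionColumn", "id") // must be numeric, date, or timestamp
      .option("lowerBound", "1")       // sets the partition stride only
      .option("upperBound", "100000")
      .option("numPartitions", "20")
      .load()

    println(df.rdd.getNumPartitions)
    spark.stop()
  }
}
```

Note that `lowerBound` and `upperBound` do not filter rows: values outside the range still end up in the first or last partition; the bounds only decide how the partition ranges are sliced.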