spark-java

Sparkjava redirect while keeping the browser URL

Submitted by 被刻印的时光 ゝ on 2019-12-11 04:52:42
Question: I have a SparkJava server app running; it serves a static HTML page using this line: staticFiles.location("/public"); If you go to http://example.com, you will see the HTML page. Now I want to redirect users from other paths to my homepage while keeping the browser URL. For example, if you visit http://example.com/message/123, you will still see the HTML page, while the browser URL stays http://example.com/message/123, so redirect.get() won't work here.

Answer 1: In order to serve the same file
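A minimal sketch of the approach the truncated answer points at: register a wildcard route that returns the contents of the same index page instead of issuing an HTTP redirect, so the address bar is untouched. The /public/index.html resource path and the SpaFallback class name are assumptions, not from the post.

```java
import static spark.Spark.*;

public class SpaFallback {
    public static void main(String[] args) {
        staticFiles.location("/public");

        // Instead of res.redirect(...), return the index page's markup directly,
        // so the browser URL (e.g. /message/123) is left untouched.
        get("/message/*", (req, res) -> {
            res.type("text/html");
            try (java.io.InputStream in =
                     SpaFallback.class.getResourceAsStream("/public/index.html")) {
                return new String(in.readAllBytes());
            }
        });
    }
}
```

Because no 3xx response is ever sent, the client never learns about a different URL; it simply receives the index markup at whatever path it asked for.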

Parse POST body to Java object using Spark

Submitted by 夙愿已清 on 2019-12-10 13:06:09
Question: I migrated from Spring to Spark a while ago and now I'm stuck on something basic. When I make a POST request sending data in the body, I want to get the Java object back in the controller. In Spring I used to write @RequestBody User user and it was "filled" automatically. With Spark I have the method request.body(), but that gives me a URL-encoded string like this:

    id=7&name=Pablo+Mat%C3%ADas&lastname=Gomez&githubUsername=pablomatiasgomez

So how can I get the User DTO? Of course, the User
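Since request.body() returns an application/x-www-form-urlencoded string, one framework-independent option is to decode it into a map with the standard library and populate the DTO from that. This is a sketch; FormBody and parse are hypothetical names.

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class FormBody {
    // Decode an application/x-www-form-urlencoded body (what request.body()
    // returns for a plain form POST) into a key/value map. A User DTO can then
    // be populated from the map, field by field or via a bean mapper.
    public static Map<String, String> parse(String body) {
        Map<String, String> params = new HashMap<>();
        for (String pair : body.split("&")) {
            String[] kv = pair.split("=", 2);
            String key = URLDecoder.decode(kv[0], StandardCharsets.UTF_8);
            String value = kv.length > 1
                    ? URLDecoder.decode(kv[1], StandardCharsets.UTF_8)
                    : "";
            params.put(key, value);
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parse(
                "id=7&name=Pablo+Mat%C3%ADas&lastname=Gomez&githubUsername=pablomatiasgomez");
        System.out.println(p); // name decodes to "Pablo Matías"
    }
}
```

Alternatively, posting the data as JSON from the client allows deserializing request.body() with a JSON library such as Gson or Jackson in one call.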

How to read data sent by the client with Spark?

Submitted by 谁说我不能喝 on 2019-12-10 07:30:56
Question: I have to read some data sent by the client using Spark (a web framework for Java). This is the client's POST request, using jQuery:

    $.post("/insertElement", {item: item.value, value: value.value, dimension: dimension.value});

The server code:

    post(new Route("/insertElement") {
        @Override
        public Object handle(Request request, Response response) {
            String item = (String) request.attribute("item");
            String value = (String) request.attribute("value");
            String dimension = (String) request
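For the record, form fields posted this way arrive as request parameters, not request attributes, so they are read with queryParams(...). A sketch using Spark's newer lambda-style API (an assumption, since the excerpt uses the older Route class):

```java
import static spark.Spark.*;

public class InsertElement {
    public static void main(String[] args) {
        // $.post sends the fields as application/x-www-form-urlencoded
        // parameters, so read them with queryParams, not attribute.
        post("/insertElement", (request, response) -> {
            String item = request.queryParams("item");
            String value = request.queryParams("value");
            String dimension = request.queryParams("dimension");
            return "inserted " + item + "/" + value + "/" + dimension;
        });
    }
}
```

request.attribute(...) returns server-side attributes set during request handling (for example by filters), which is why it comes back null for posted form fields.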

sparkjava: Do routes have to be in main method?

Submitted by [亡魂溺海] on 2019-12-09 13:07:26
Question: I am new to SparkJava and like it overall. However, do new routes/endpoints have to be defined in the main method? For any significant web application this would result in a very long main method, or I would need multiple main methods (and therefore split server resources among multiple instances). These two SparkJava documentation pages seem to define routes in the main method: http://sparkjava.com/documentation.html#routes and http://sparkjava.com/documentation.html#getting-started.
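Routes only have to be registered before the server handles traffic; they do not have to be written inline in main. A hedged sketch (the class and route names below are made up for illustration):

```java
import static spark.Spark.*;

public class Application {
    public static void main(String[] args) {
        // main stays short: it just asks each module to register its routes.
        UserRoutes.register();
        MessageRoutes.register();
    }
}

class UserRoutes {
    static void register() {
        get("/users/:id", (req, res) -> "user " + req.params(":id"));
    }
}

class MessageRoutes {
    static void register() {
        get("/messages/:id", (req, res) -> "message " + req.params(":id"));
    }
}
```

All registrations target the same embedded server instance, so splitting them across classes does not split server resources.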

Spark performance tuning

Submitted by 寵の児 on 2019-12-07 15:33:27
Questions:
1. Which resources should be allocated?
2. Where are these resources allocated?
3. Why does performance improve when more of these resources are allocated?

Which resources? The number of executors, CPU cores per executor, memory per executor, and driver memory.

Where are they allocated? In production, Spark jobs are submitted with the spark-submit shell script, and the corresponding parameters are adjusted there:

    /usr/local/spark/bin/spark-submit \
    --class cn.spark.sparktest.core.WordCountCluster \
    --num-executors 3 \            number of executors
    --driver-memory 100m \         driver memory (big impact)
    --executor-memory 100m \       memory per executor
    --executor-cores 3 \           CPU cores per executor
    /usr/local/SparkTest-0.0.1-SNAPSHOT-jar-with-dependencies.jar

How large can these values reasonably be pushed? First case: Spark Standalone. With a Spark cluster built on the company's own machines, you should have a clear idea of roughly how much memory and how many CPU cores each machine can still give you; then, when setting the values

Apache Shiro with Embedded-Jetty or Spark-Java - Is it possible?

Submitted by 痞子三分冷 on 2019-12-06 14:47:57
Does anyone have an example project showing how to integrate Shiro with Spark-Java / embedded Jetty? From http://sparkjava.com/documentation#filters it looks like filters must be the way to do it, but I'm not sure what the smartest approach would be according to https://shiro.apache.org/web.html. If you have any examples, I'd much appreciate it!

Source: https://stackoverflow.com/questions/54835994/apache-shiro-with-embedded-jetty-or-spark-java-is-it-possible
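The question has no accepted answer in this excerpt, but a minimal standalone sketch is possible, assuming an INI-based realm on the classpath and Spark's before-filter for enforcement. Note this is an assumption-laden outline: in a real servlet deployment, Shiro's per-request Subject binding needs more care than a static SecurityManager provides.

```java
import static spark.Spark.*;

import org.apache.shiro.SecurityUtils;
import org.apache.shiro.mgt.DefaultSecurityManager;
import org.apache.shiro.realm.text.IniRealm;

public class ShiroSparkSketch {
    public static void main(String[] args) {
        // Hypothetical wiring: build a SecurityManager from shiro.ini
        // (assumed to exist on the classpath) and install it globally.
        SecurityUtils.setSecurityManager(
                new DefaultSecurityManager(new IniRealm("classpath:shiro.ini")));

        // Enforce authentication with a Spark before-filter, as the
        // sparkjava.com filters documentation suggests.
        before("/secure/*", (req, res) -> {
            if (!SecurityUtils.getSubject().isAuthenticated()) {
                halt(401, "Not authenticated");
            }
        });

        get("/secure/hello", (req, res) -> "hello");
    }
}
```

For production use, Shiro's ShiroFilter registered on the embedded Jetty servlet context (per shiro.apache.org/web.html) is the more complete route, since it handles session and Subject lifecycle per request.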

Computing second-degree friend relationships in two ways: Spark GraphX and Spark DataFrames

Submitted by 萝らか妹 on 2019-12-05 19:38:43
For example, suppose we have this data:

    10010 95555 2016-11-11 15:55:54
    10010 95556 2016-11-11 15:55:54
    10010 95557 2016-11-11 15:55:54
    10086 95555 2016-11-11 15:55:54
    10086 95558 2016-11-11 15:55:54
    10000 95555 2016-11-11 15:55:54
    10000 95558 2016-11-11 15:55:54

The first column is the user's phone number and the second is one of that user's friends' phone numbers; the task is to compute how many common friend numbers each pair of users shares. The GraphX code is as follows:

    package spark_graph

    import org.apache.spark.graphx.{Edge, _}
    import org.apache.spark.rdd.RDD
    import org.apache.spark.{SparkConf, SparkContext}

    /**
     * Created by dongdong on 18/1/18.
     */
    object Spark_Contact_Test {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("Spark
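The GraphX code above is cut off, but the underlying join logic is simple: group users by the friend they share, then count each user pair once per shared friend. A plain-Java sketch of that logic (no Spark; class and method names are made up), using the sample data above:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class CommonFriends {
    // edges are (user, friend) pairs; the result maps "userA,userB"
    // (users sorted lexicographically) to the number of shared friends.
    public static Map<String, Integer> commonFriendCounts(List<String[]> edges) {
        // Invert the edge list: friend -> sorted set of users listing that friend.
        Map<String, Set<String>> byFriend = new HashMap<>();
        for (String[] e : edges) {
            byFriend.computeIfAbsent(e[1], k -> new TreeSet<>()).add(e[0]);
        }
        // Every pair of users under the same friend shares that friend.
        Map<String, Integer> counts = new HashMap<>();
        for (Set<String> users : byFriend.values()) {
            List<String> u = new ArrayList<>(users);
            for (int i = 0; i < u.size(); i++) {
                for (int j = i + 1; j < u.size(); j++) {
                    counts.merge(u.get(i) + "," + u.get(j), 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String[]> edges = Arrays.asList(
                new String[]{"10010", "95555"}, new String[]{"10010", "95556"},
                new String[]{"10010", "95557"}, new String[]{"10086", "95555"},
                new String[]{"10086", "95558"}, new String[]{"10000", "95555"},
                new String[]{"10000", "95558"});
        // 10000 and 10086 share 95555 and 95558; the other pairs share only 95555.
        System.out.println(commonFriendCounts(edges));
    }
}
```

The DataFrame variant mentioned in the title is the same idea expressed as a self-join on the friend column followed by a group-by count.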

Spark - java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDDLike

Submitted by 邮差的信 on 2019-12-05 09:04:19
Question:

    public class SparkDemo {
        @SuppressWarnings({ "resource" })
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("Spark APP").setMaster("spark://xxx.xxx.xxx.xx:7077");
            JavaSparkContext sc = new JavaSparkContext(conf);
            JavaRDD<String> lines = sc.textFile("/Users/mchaurasia/file.txt");
            JavaRDD<String> words = lines.flatMap((String s) -> {
                return Arrays.asList(s.split(" "));
            });
            JavaPairRDD<String, Integer> pairs = words.mapToPair((String s) -> {
                return new Tuple2
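The usual cause of this SerializedLambda ClassCastException is that the class defining the lambdas is not on the executors' classpath when setMaster points at a remote cluster. A hedged sketch of the common fix, shipping the application jar via SparkConf.setJars (the jar path below is a placeholder, not from the question):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkDemoFixed {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("Spark APP")
                .setMaster("spark://xxx.xxx.xxx.xx:7077")
                // Ship the application jar so executors can deserialize the
                // lambdas defined in this class; the path is a placeholder.
                .setJars(new String[] { "/path/to/your-app.jar" });
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... the flatMap/mapToPair pipeline from the question goes here ...
        sc.stop();
    }
}
```

Submitting with spark-submit (which distributes the application jar automatically) avoids the problem as well. Separately, note that in Spark 2.x flatMap expects an Iterator, so the Arrays.asList result would need .iterator().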

SparkJava custom error page

Submitted by 扶醉桌前 on 2019-12-04 23:56:11
Question: Does anyone know how to override the existing 404 error page when using the Spark micro web framework? The default error page is:

    <html>
    <head>
    <meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
    <title>Error 404 </title>
    </head>
    <body>
    <h2>HTTP ERROR: 404</h2>
    <p>Problem accessing /strangepage. Reason:
    <pre>    Not Found</pre></p>
    <hr /><i><small>Powered by Jetty://</small></i>
    </body>
    </html>

I want to replace this error page (or maybe redirect it to another route): get
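Recent Spark versions (2.6 and later) have a notFound hook for exactly this; a minimal sketch:

```java
import static spark.Spark.*;

public class CustomErrorPages {
    public static void main(String[] args) {
        get("/hello", (req, res) -> "hello");

        // Replaces Jetty's default 404 body for any unmatched path.
        notFound((req, res) -> {
            res.type("text/html");
            return "<html><body><h2>Custom 404 - page not found</h2></body></html>";
        });
    }
}
```

On older versions, a catch-all route registered last (e.g. get("*", ...)) that sets status 404 and returns the custom markup achieves a similar effect.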

spark java: how to handle multipart/form-data input?

Submitted by 早过忘川 on 2019-12-04 18:11:08
Question: I am using Spark to develop a web application; the problem occurs when I want to upload a file:

    public final class SparkTesting {
        public static void main(final String... args) {
            Spark.staticFileLocation("/site");
            Spark.port(8080);
            Spark.post("/upload", (request, response) -> {
                final Part uploadedFile = request.raw().getPart("uploadedFile");
                final Path path = Paths.get("/tmp/meh");
                try (final InputStream in = uploadedFile.getInputStream()) {
                    Files.copy(in, path);
                }
                response.redirect("/");
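The widely cited fix (not shown in the truncated excerpt) is that Jetty only parses multipart/form-data bodies when a MultipartConfigElement request attribute is present before getPart is called. A sketch completing the code above under that assumption:

```java
import static spark.Spark.*;

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import javax.servlet.MultipartConfigElement;

public class UploadSketch {
    public static void main(String[] args) {
        staticFileLocation("/site");
        port(8080);
        post("/upload", (request, response) -> {
            // Without this attribute, Jetty refuses to parse the multipart
            // body and getPart(...) fails. "/tmp" is the temp-file location.
            request.attribute("org.eclipse.jetty.multipartConfig",
                    new MultipartConfigElement("/tmp"));

            javax.servlet.http.Part uploadedFile =
                    request.raw().getPart("uploadedFile");
            Path path = Paths.get("/tmp/meh");
            try (InputStream in = uploadedFile.getInputStream()) {
                Files.copy(in, path, StandardCopyOption.REPLACE_EXISTING);
            }
            response.redirect("/");
            return null;
        });
    }
}
```

The attribute key is the one Jetty's multipart filter looks up; on Jakarta-based stacks the imports would use jakarta.servlet instead of javax.servlet.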