spark-java

How can a native Servlet Filter be used when using Spark web framework?

I'm playing around with Spark (the Java web framework, not Apache Spark). I find it really nice and easy to define routes and filters. However, I want to apply a native servlet filter to my routes and can't seem to find a way to do that. More specifically, I would like to use Jetty's DoSFilter, which is a servlet filter (in contrast with Spark's own Filter type). Since Spark uses embedded Jetty, I don't have a web.xml in which to register the DoSFilter. And since Spark doesn't expose the server instance, I can't find an elegant way of registering the filter programmatically either. Is there a way to achieve this?
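One approach that stays within Spark's documented features is to run Spark through its spark.servlet.SparkFilter inside a Jetty server you construct yourself; that gives you a ServletContextHandler where the DoSFilter can be registered ahead of Spark. A minimal sketch, assuming a hypothetical com.example.MyApp that implements spark.servlet.SparkApplication and defines the routes in its init() method:

    import java.util.EnumSet;

    import javax.servlet.DispatcherType;

    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.servlet.FilterHolder;
    import org.eclipse.jetty.servlet.ServletContextHandler;
    import org.eclipse.jetty.servlets.DoSFilter;

    import spark.servlet.SparkFilter;

    public class EmbeddedJettyWithDoSFilter {
        public static void main(String[] args) throws Exception {
            Server server = new Server(8080);
            ServletContextHandler context = new ServletContextHandler(ServletContextHandler.SESSIONS);
            context.setContextPath("/");

            // Native Jetty filter, registered first so it throttles every request.
            FilterHolder dos = new FilterHolder(DoSFilter.class);
            dos.setInitParameter("maxRequestsPerSec", "25");
            context.addFilter(dos, "/*", EnumSet.of(DispatcherType.REQUEST));

            // SparkFilter dispatches to the routes defined by the
            // SparkApplication named below (hypothetical class name).
            FilterHolder spark = new FilterHolder(SparkFilter.class);
            spark.setInitParameter("applicationClass", "com.example.MyApp");
            context.addFilter(spark, "/*", EnumSet.of(DispatcherType.REQUEST));

            server.setHandler(context);
            server.start();
            server.join();
        }
    }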

How to return a static html page with Spark Java?

Question: A hello world with Spark:

    get(new Route("/hello") {
        @Override
        public Object handle(Request request, Response response) {
            response.type("text/html");
            return "<h1>Hello Spark MVC Framework!</h1>";
        }
    });

How can I return a static file, index.html, instead? Notes: I need this index.html to be inside the jar; in the spirit of Spark's simplicity, I'd like to avoid going through templates as much as possible, since that would be overkill for a static page.

Answer 1: You can do so by passing the absolute path to
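Setting the truncated answer aside, one common way to satisfy both notes is to let Spark serve static files from the classpath, so index.html ships inside the jar and no template engine is involved. A minimal sketch, assuming the page lives at src/main/resources/public/index.html:

    import spark.Spark;

    public class StaticPageServer {
        public static void main(String[] args) {
            // Serve everything under the classpath folder /public;
            // GET /index.html then returns the bundled page.
            Spark.staticFileLocation("/public");
        }
    }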

Spark - java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDDLike

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    public class SparkDemo {
        @SuppressWarnings({ "resource" })
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("Spark APP")
                    .setMaster("spark://xxx.xxx.xxx.xx:7077");
            JavaSparkContext sc = new JavaSparkContext(conf);
            JavaRDD<String> lines = sc.textFile("/Users/mchaurasia/file.txt");
            // Split each line into words (Spark 1.x flatMap returns an Iterable).
            JavaRDD<String> words = lines.flatMap((String s) -> Arrays.asList(s.split(" ")));
            // Pair each word with a count of 1, then sum the counts per word.
            JavaPairRDD<String, Integer> pairs =
                    words.mapToPair((String s) -> new Tuple2<String, Integer>(s, 1));
            JavaPairRDD<String, Integer> counts = pairs.reduceByKey((a, b) -> a + b);
        }
    }
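This ClassCastException typically means the executors cannot deserialize the job's lambdas because the application's own classes were never shipped to them. When the job is launched directly against a standalone master (rather than via spark-submit), the usual fix is to point the SparkConf at the application jar. A sketch, with the jar path as an assumption:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkDemoFixed {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("Spark APP")
                    .setMaster("spark://xxx.xxx.xxx.xx:7077")
                    // Ship this application's jar to the executors so the
                    // serialized lambdas can be resolved there (example path).
                    .setJars(new String[] { "target/spark-demo.jar" });
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... same RDD pipeline as above ...
            sc.close();
        }
    }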

sparkjava: Do routes have to be in main method?

I am new to sparkjava and like it overall. However, do new routes/endpoints have to be defined in the main method? For any significant web application this would result in a very long main method, or force me to run multiple main methods (and therefore split server resources among multiple instances). These two sparkjava documentation pages both define routes in the main method: http://sparkjava.com/documentation.html#routes and http://sparkjava.com/documentation.html#getting-started . Is there another way to do this that I'm not seeing? Cursory Google searching hasn't shown me a better approach.
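Nothing requires routes to be defined textually inside main; they only need to be registered before the first request hits them, so main can delegate to other classes. A sketch of one such layout (the controller classes and paths are hypothetical):

    import static spark.Spark.get;

    public class Application {
        public static void main(String[] args) {
            // main() only wires the route groups together.
            UserRoutes.register();
            HealthRoutes.register();
        }
    }

    class UserRoutes {
        static void register() {
            get("/users/:id", (req, res) -> "user " + req.params(":id"));
        }
    }

    class HealthRoutes {
        static void register() {
            get("/health", (req, res) -> "OK");
        }
    }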

spark java: how to handle multipart/form-data input?

I am using Spark to develop a web application; the problem occurs when I want to upload a file:

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    import javax.servlet.http.Part;

    import spark.Spark;

    public final class SparkTesting {
        public static void main(final String... args) {
            Spark.staticFileLocation("/site");
            Spark.port(8080);
            Spark.post("/upload", (request, response) -> {
                final Part uploadedFile = request.raw().getPart("uploadedFile");
                final Path path = Paths.get("/tmp/meh");
                try (final InputStream in = uploadedFile.getInputStream()) {
                    Files.copy(in, path);
                }
                response.redirect("/");
                return "OK";
            });
        }
    }

But I get this error:

    [qtp509057984-36] ERROR spark.webserver.MatcherFilter - java
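The widely reported fix is to attach a MultipartConfigElement to the raw request before calling getPart(); embedded Jetty will not parse multipart bodies until one is set. A sketch of the handler with that change (using "/tmp" as an example buffer directory):

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    import javax.servlet.MultipartConfigElement;

    import spark.Spark;

    public class UploadFixed {
        public static void main(String[] args) {
            Spark.post("/upload", (request, response) -> {
                // Tell Jetty where and how to buffer multipart data
                // before any call to getPart().
                request.raw().setAttribute("org.eclipse.jetty.multipartConfig",
                        new MultipartConfigElement("/tmp"));
                try (InputStream in = request.raw().getPart("uploadedFile").getInputStream()) {
                    Files.copy(in, Paths.get("/tmp/meh"), StandardCopyOption.REPLACE_EXISTING);
                }
                return "OK";
            });
        }
    }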

How to get the request parameters using get in Spark Java framework?

I'm new to sparkjava. I want to read my request params using Spark Java, but I'm not able to find the correct syntax. Please help me out. Below are my route method and the client call to it.

My client request URL: /smartapp/getDataViewModelConfig?collId=123

Route method:

    get("smartapp/getDataViewModelConfig/:id", "application/json", (request, response) -> {
        String id = request.params(":id");
    });

The 'id' field is returning null here. Any suggestions as to what went wrong?

Laercio Metzner: If you have to work with a URL like /smartapp/getDataViewModelConfig?collId=123, you have to deal with query parameters rather than a named path parameter.
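Under that reading, the fix is to drop the :id path segment and read the query string with request.queryParams, which matches the client URL above. A sketch:

    import static spark.Spark.get;

    public class QueryParamRoute {
        public static void main(String[] args) {
            get("/smartapp/getDataViewModelConfig", "application/json", (request, response) -> {
                String collId = request.queryParams("collId"); // "123" for the URL above
                return "{\"collId\": \"" + collId + "\"}";
            });
        }
    }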

Use https in spark-java

How can I use a custom SSLContext and a custom SSLServerSocketFactory in the spark-java framework? I've searched in SparkServerImpl but have no idea how to inject the SSL factory; any suggestions?

You can pass the keystore parameters directly into Spark, like this:

    Spark.secure(keyStorePath, keyStorePassword, trustStorePath, trustStorePassword);

Those are all strings; for example, the keystore values could be:

    String keyStorePath = "/home/user/keys/private-key.jks";
    String keyStorePassword = "password";

This way, Spark will have the parameters to create the SslContextFactory internally, as you can see in the Spark source.
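A self-contained sketch assembled from the answer (the path, password, and /hello route are examples); passing null for the truststore arguments is allowed when only a keystore is used:

    import spark.Spark;

    public class HttpsServer {
        public static void main(String[] args) {
            String keyStorePath = "/home/user/keys/private-key.jks"; // example path
            String keyStorePassword = "password";                    // example password
            // secure() must be called before any route is mapped.
            Spark.secure(keyStorePath, keyStorePassword, null, null);
            Spark.get("/hello", (req, res) -> "hello over https");
        }
    }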

Reloading the static files in Spark/Jetty-server

Question: I have a problem similar to the one described here: Refresh static files served by SparkJava. In my application, users can upload content to a folder that is also served back to them via the Spark.staticFileLocation("/public") feature. As I understand it, SparkJava reads the 'static' content from that folder only once at startup and is not aware of later changes there. Is it possible to ask Spark (or Jetty via Spark) to reload the changes in the static folder?

Answer 1: Move to externalStaticFileLocation(
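Following that answer, a sketch that serves the upload folder straight from disk, so files are resolved at request time and changes appear without a restart (the directory path is an example):

    import spark.Spark;

    public class ExternalStaticFiles {
        public static void main(String[] args) {
            // Unlike staticFileLocation (classpath, fixed at startup),
            // this reads from the filesystem on each request.
            Spark.externalStaticFileLocation("/var/myapp/public");
        }
    }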