How to run Spark Application as daemon

Submitted by 两盒软妹~ on 2019-12-23 06:06:29

Question


I have a basic question about running a Spark application.

I have a Java client that sends me requests to query data residing in HDFS.

The requests arrive as REST calls over HTTP; I need to interpret each request, form Spark SQL queries, and return the response to the client.

I cannot work out how to make my Spark application a daemon that waits for requests and can execute the queries using a pre-instantiated SQL context.


Answer 1:


You can have a thread that runs in an infinite loop and does the calculation with Spark.

while (true) {
  // take() blocks until a request arrives (poll() would return null on an empty queue)
  val request = incomingQueue.take()
  // Process the request with Spark, e.g. run a Spark SQL query
  val result = ... // e.g. sqlContext.sql(request.query).collect()
  outgoingQueue.put(result)
}

Then, in the thread that handles the REST request, you put the request in the incomingQueue and wait for the result from the outgoingQueue.

// Create the request from the REST call
val request = ...
incomingQueue.put(request)
val result = outgoingQueue.take() // blocks until the worker thread replies
return result
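
For completeness, here is a minimal sketch of how the two snippets above could be wired together. The Request type, the queue element types, the use of java.util.concurrent.LinkedBlockingQueue, and the SparkSession-based setup (a SQLContext would work the same way) are all assumptions for illustration; the answer does not specify them.

import java.util.concurrent.LinkedBlockingQueue
import org.apache.spark.sql.{Row, SparkSession}

object SparkDaemon {
  // Hypothetical request type; the original answer leaves this unspecified
  case class Request(query: String)

  // Shared, thread-safe queues connecting the REST handler and the Spark worker
  val incomingQueue = new LinkedBlockingQueue[Request]()
  val outgoingQueue = new LinkedBlockingQueue[Array[Row]]()

  def main(args: Array[String]): Unit = {
    // Instantiate the Spark session (and its SQL context) once, at daemon startup
    val spark = SparkSession.builder().appName("spark-daemon").getOrCreate()

    // Worker thread running the infinite loop from the answer
    val worker = new Thread(new Runnable {
      override def run(): Unit = {
        while (true) {
          val request = incomingQueue.take()
          val result = spark.sql(request.query).collect()
          outgoingQueue.put(result)
        }
      }
    })
    worker.start() // non-daemon thread, so the JVM stays alive to serve requests
  }
}

The REST layer (whatever HTTP framework is in use) would then call incomingQueue.put(...) and outgoingQueue.take() from its handler, as in the second snippet. Note that with a single shared outgoingQueue, concurrent requests can receive each other's results; in practice you would use a per-request response queue or a Future instead.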



Answer 2:


The best option I've seen for this use case is Spark Job Server, which acts as the daemon app, with your driver code deployed to it as a named application.

This option also gives you additional features, such as persistence.

With the job server you don't need to code your own daemon, and your client apps can send REST requests directly to it; the server in turn runs the Spark jobs for you, taking the place of manual spark-submit invocations.
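
As a rough illustration, a job deployed to Spark Job Server implements its SparkJob trait, so the server can run it inside a long-lived context it manages. This is a minimal sketch following the spark-jobserver README's classic job API; the trait and validation types may differ across versions, and MyQueryJob and the "query" parameter are hypothetical names.

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

// Hypothetical job; Spark Job Server calls runJob with a context it manages
object MyQueryJob extends SparkJob {

  // Reject requests that don't carry a "query" parameter
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    if (config.hasPath("query")) SparkJobValid
    else SparkJobInvalid("No 'query' parameter supplied")

  // The returned value is serialized into the REST response by the server
  override def runJob(sc: SparkContext, config: Config): Any = {
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    sqlContext.sql(config.getString("query")).collect()
  }
}

Clients then interact purely over HTTP: upload the jar, create (or reuse) a named context, and submit jobs against it; see the spark-jobserver documentation for the exact REST endpoints.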



Source: https://stackoverflow.com/questions/38326881/how-to-run-spark-application-as-daemon
