How to use the Hadoop MapReduce framework for an OpenCL application?

Submitted by 我的梦境 on 2019-12-04 06:07:28

Question


I am developing an application in OpenCL whose basic objective is to implement a data mining algorithm on a GPU platform. I want to use the Hadoop Distributed File System and to execute the application on multiple nodes. I am using the MapReduce framework, and I have divided my basic algorithm into two parts, 'Map' and 'Reduce'.

I have never worked with Hadoop before, so I have some questions:

  1. Do I have to write my application in Java only to use the Hadoop and MapReduce framework?
  2. I have written kernel functions for map and reduce in OpenCL. Is it possible to use HDFS as a file system for a non-Java GPU-computing application? (Note: I don't want to use JavaCL or Aparapi.)

Answer 1:


You could use Hadoop Streaming; with it you can write mappers and reducers in any language you want, as long as your code can read from stdin and write back to stdout. For inspiration, you can take a look at examples of how R is used with Hadoop Streaming.
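
As a minimal sketch (the file name and the key/value layout below are illustrative, not part of the original answer), a streaming mapper is just an executable that reads input records line by line from stdin and writes tab-separated key/value pairs to stdout; your OpenCL host code would run inside that executable:

    // streaming_mapper.cpp -- illustrative Hadoop Streaming mapper skeleton.
    // Hadoop Streaming feeds input splits line by line on stdin and expects
    // "key<TAB>value" records on stdout; everything in between (including
    // OpenCL kernel launches) is up to the executable.
    #include <iostream>
    #include <string>

    int main() {
        std::string line;
        while (std::getline(std::cin, line)) {
            // Hypothetical processing step: hand 'line' (or a batch of
            // lines) to your OpenCL kernels here and emit the results.
            std::cout << line << "\t" << 1 << "\n";
        }
        return 0;
    }

The compiled mapper and reducer binaries are then shipped to the cluster with the hadoop-streaming jar, using its -mapper, -reducer, and -file options, so no Java code is required on your side.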




Answer 2:


HDFS is a file system; you can use the HDFS file system with any language.

HDFS data is distributed over multiple machines and is highly available, which makes it a good fit for feeding data to GPU computing.
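
If you want your native code to talk to HDFS directly (rather than letting Hadoop Streaming pipe the data in), Hadoop ships a C client library, libhdfs (hdfs.h), which can be called from C or C++. A rough sketch, assuming a default-configured namenode and a hypothetical input path:

    // hdfs_read.cpp -- illustrative read of an HDFS file via libhdfs.
    // Link against libhdfs; a JVM and the Hadoop classpath must be available.
    #include <hdfs.h>     // libhdfs, shipped with Hadoop
    #include <fcntl.h>    // O_RDONLY
    #include <cstdio>

    int main() {
        // Connect to the default filesystem configured in core-site.xml.
        hdfsFS fs = hdfsConnect("default", 0);
        if (!fs) { std::fprintf(stderr, "cannot connect to HDFS\n"); return 1; }

        // "/user/me/input.dat" is a placeholder path.
        hdfsFile in = hdfsOpenFile(fs, "/user/me/input.dat", O_RDONLY, 0, 0, 0);
        if (!in) { std::fprintf(stderr, "cannot open input file\n"); return 1; }

        char buf[4096];
        tSize n;
        while ((n = hdfsRead(fs, in, buf, sizeof(buf))) > 0) {
            // Hypothetical step: copy 'buf' into an OpenCL buffer and
            // launch the data-mining kernels on it.
        }

        hdfsCloseFile(fs, in);
        hdfsDisconnect(fs);
        return 0;
    }

In practice, Hadoop Streaming already handles the HDFS reads and input splitting for you, so direct libhdfs access is only needed if you bypass the streaming layer.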

For more information, refer to Hadoop Streaming.



Source: https://stackoverflow.com/questions/15495698/how-to-use-hadoop-mapreuce-framework-for-an-opencl-application
