How can I use GPU with Java programming

Submitted by 折月煮酒 on 2019-12-07 12:14:38

Question


I have been using CUDA C to access the GPU, but now my advisor has asked me to work with Java and the GPU. I searched the Internet and found that Rootbeer is the best option for this, but I am not able to understand how to run a program using Rootbeer. Can someone tell me the steps for using Rootbeer?
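From the Rootbeer project's README, the basic workflow appears to be: implement its Kernel interface, create one kernel instance per work item, and hand the list to the Rootbeer runtime, which serializes the objects, launches them on the GPU, and copies the results back. A minimal sketch of that pattern (untested here; the package name and exact API differ between Rootbeer versions, so check the project's README):

```java
import java.util.ArrayList;
import java.util.List;

import org.trifort.rootbeer.runtime.Kernel;
import org.trifort.rootbeer.runtime.Rootbeer;

// A Rootbeer kernel is a plain Java class implementing Kernel;
// Rootbeer cross-compiles gpuMethod() to CUDA. Each instance
// handles one output element, indexed by a field set at construction.
public class VectorAddKernel implements Kernel {
    private int[] a;
    private int[] b;
    private int[] out;
    private int index;

    public VectorAddKernel(int[] a, int[] b, int[] out, int index) {
        this.a = a;
        this.b = b;
        this.out = out;
        this.index = index;
    }

    @Override
    public void gpuMethod() {
        out[index] = a[index] + b[index];
    }

    public static void main(String[] args) {
        int n = 1024;
        int[] a = new int[n];
        int[] b = new int[n];
        int[] out = new int[n];
        for (int i = 0; i < n; i++) { a[i] = i; b[i] = 2 * i; }

        // One Kernel instance per output element.
        List<Kernel> jobs = new ArrayList<Kernel>();
        for (int i = 0; i < n; i++) {
            jobs.add(new VectorAddKernel(a, b, out, i));
        }

        // The runtime serializes state, runs the kernels on the GPU,
        // and deserializes the results back into the Java arrays.
        Rootbeer rootbeer = new Rootbeer();
        rootbeer.run(jobs);

        System.out.println("out[10] = " + out[10]); // expect 30
    }
}
```

Per the README, you then compile this into a jar as usual and run it through the Rootbeer cross-compiler (something like `java -jar Rootbeer.jar App.jar App-GPU.jar`) before executing the generated jar.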


Answer 1:


Mark Harris from Nvidia gave a nice talk about the future of CUDA at SC14. You can watch it here.

The main thing that may interest you is the part where he talks about programming languages, and Java in particular. IBM is working on CUDA4J, and there are plans to use Java 8 features, especially lambdas, for GPU programming. However, I am not a Java user and I can't answer your question regarding Rootbeer (besides the taste), but maybe CUDA4J will be something that suits you, especially if you already know how to write CUDA C and want a solution backed by a company like IBM.
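To illustrate the lambda direction the talk mentions: the code below is plain Java 8, where the per-element "kernel" is expressed as a lambda over an index range. On a GPU-aware JVM (IBM's J9 reportedly recognizes simple parallel IntStream loops like this when started with a JIT option such as -Xjit:enableGPU), such a loop is a candidate for automatic GPU offload. This is a sketch of the programming style, not a confirmed IBM API:

```java
import java.util.stream.IntStream;

public class GpuStreamSketch {
    public static void main(String[] args) {
        int n = 1 << 20;
        float[] a = new float[n];
        float[] b = new float[n];
        float[] c = new float[n];
        for (int i = 0; i < n; i++) {
            a[i] = i;
            b[i] = 2.0f * i;
        }

        // An element-wise vector add as a parallel stream over indices.
        // The lambda body plays the role of the GPU kernel: a GPU-aware
        // JIT can map each index to a GPU thread instead of a CPU worker,
        // without any change to the source code.
        IntStream.range(0, n)
                 .parallel()
                 .forEach(i -> c[i] = a[i] + b[i]);

        System.out.println("c[10] = " + c[10]); // expect 30.0
    }
}
```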



Source: https://stackoverflow.com/questions/27165909/how-can-i-use-gpu-with-java-programming
