How to use GPU for mathematics [closed]

Submitted by 落爺英雄遲暮 on 2020-01-20 13:32:25

Question


I am looking at utilising the GPU for crunching some equations but cannot figure out how I can access it from C#. I know that the XNA and DirectX frameworks allow you to use shaders in order to access the GPU, but how would I go about accessing it without these frameworks?


Answer 1:


I haven't done it from C#, but basically you use the CUDA SDK and CUDA Toolkit (assuming you're using an nVidia card here, of course) to pull it off.

nVidia has ported (or written?) a BLAS implementation, cuBLAS, for use on CUDA-capable devices. They've provided plenty of examples of how to do number crunching, although you'll have to figure out how you're going to pull it off from C#. My bet is that you're going to have to write some stuff in unmanaged C or C++ and link against it.
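As a rough sketch of what that linking step looks like from the C# side: you compile your CUDA code into a native DLL and P/Invoke it. Everything here (the gpumath.dll name and the SolveLinearSystem export) is hypothetical, standing in for whatever wrapper you write around the CUDA SDK:

    using System.Runtime.InteropServices;

    static class NativeGpu
    {
        // Hypothetical export from your own C/C++ wrapper DLL, which would
        // call into the CUDA runtime / cuBLAS internally and be built with nvcc.
        [DllImport("gpumath.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern int SolveLinearSystem(double[] a, double[] b,
                                                   double[] x, int n);
    }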

If you're not hung up on using C#, take a look at Theano. It might be a bit of overkill for your needs, since they're building a framework for doing machine learning on GPUs from Python, but ... it works, and works very well.




Answer 2:


If your GPU is NVidia, you can use CUDA.

There is an example here that explains the whole chain, including some C/C++ code: CUDA integration with C#

And there is a library called CUDA.NET available here: CUDA.NET

If your GPU is ATI, then there is ATI Stream. .NET support for it is less clear to me. Maybe the Open Toolkit library has it, through its OpenCL support.

And finally, there is a Microsoft Research project called "Accelerator" which has a managed wrapper that should work on any hardware (provided it supports DirectX 9).




Answer 3:


How about Brahma (LINQ to GPU)?

Gotta love LINQ!




Answer 4:


I'm afraid that my knowledge of using the GPU is rather theoretical beyond writing shaders for DirectX / XNA and dabbling a little with CUDA (NVidia specific). However, I have heard quite a lot about OpenCL (Open Computing Language), which lets you write algorithms that OpenCL will intelligently push out to your graphics card, or run on the CPU if you don't have a compatible GPU.

The code you run on the GPU will have to be written specifically in OpenCL's subset of C99 (apologies if this does not meet your requirements, as you've asked how to use it from C#), but beyond your number-crunching algorithms, you can write the rest of your application in C# and have it all work together nicely by using The Open Toolkit:

http://www.opentk.com/
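To make that division of labor concrete, here is what such a kernel looks like, embedded as a string the way a C# host program would typically carry it. The kernel body is standard OpenCL C (it adds two float buffers element-wise); the host-side setup through The Open Toolkit is omitted:

    // Kernel source handed to the OpenCL compiler at runtime.
    // Each work-item computes one element of the result.
    const string KernelSource = @"
    __kernel void vector_add(__global const float* a,
                             __global const float* b,
                             __global float*       result)
    {
        int i = get_global_id(0);
        result[i] = a[i] + b[i];
    }";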




Answer 5:


There are two options if you don't want to mess with P/Invoke stuff and unmanaged code:

  1. Use the CUDA.NET library mentioned above. It works very well, but it targets CUDA, so nVidia cards only. If you'd like to solve more complex problems you'll have to learn CUDA, write your own kernel (in C...), compile it with nvcc, and execute it from C# via this library.
  2. Use Microsoft Research Accelerator. It's a nice library built by MS Research that runs your code on anything that has lots of cores (many-core nVidia/ATI GPUs and multi-core processors), and it's completely platform independent. I've used it and I'm pretty impressed with the results. There is also a very good tutorial on using Accelerator in C#, and a minimal sketch below.

The second option is the one I'd recommend, but if you have no problem with sticking to nVidia GPUs only, the first would probably be faster.
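For illustration, here is a minimal sketch of option 2 under the Accelerator v2 API as I remember it from the tutorial; the Microsoft.ParallelArrays namespace and the FloatParallelArray/DX9Target names may differ between releases:

    using System;
    using Microsoft.ParallelArrays;
    using PA = Microsoft.ParallelArrays.ParallelArrays;

    class AcceleratorDemo
    {
        static void Main()
        {
            // Ordinary CPU-side input data.
            var xs = new float[1000];
            var ys = new float[1000];
            for (int i = 0; i < xs.Length; i++) { xs[i] = i; ys[i] = 2 * i; }

            // Wrapping the arrays builds an expression graph; nothing
            // executes until a target evaluates it.
            var a = new FloatParallelArray(xs);
            var b = new FloatParallelArray(ys);
            FloatParallelArray sum = PA.Add(a, b);

            // The DX9 target compiles the graph to a pixel shader and runs
            // it on whatever GPU DirectX 9 can see.
            var target = new DX9Target();
            float[] result = target.ToArray1D(sum);
            Console.WriteLine(result[10]); // 10 + 20 = 30
        }
    }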




Answer 6:


I have done it in C# by leveraging NVIDIA's CUDA libraries and .NET's P/Invoke. This requires some careful memory management and a good, detailed understanding of the CUDA libraries. The same technique can be used in conjunction with any custom GPU/CUDA kernels you would like to create in C, so it's a very powerful and flexible approach.
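For a flavor of that P/Invoke layer, here is a minimal sketch against the CUDA driver API (nvcuda.dll on Windows, libcuda on Linux). It only initializes the driver and enumerates devices, but memory copies and kernel launches are declared the same way:

    using System;
    using System.Runtime.InteropServices;
    using System.Text;

    static class CudaDriver
    {
        // CUDA driver API entry points; each returns a CUresult (0 == success).
        [DllImport("nvcuda")]
        public static extern int cuInit(uint flags);

        [DllImport("nvcuda")]
        public static extern int cuDeviceGetCount(out int count);

        [DllImport("nvcuda")]
        public static extern int cuDeviceGetName(StringBuilder name, int len, int device);
    }

    class Program
    {
        static void Main()
        {
            if (CudaDriver.cuInit(0) != 0)
            {
                Console.WriteLine("CUDA driver initialization failed.");
                return;
            }
            CudaDriver.cuDeviceGetCount(out int count);
            Console.WriteLine($"CUDA devices: {count}");
            for (int i = 0; i < count; i++)
            {
                var name = new StringBuilder(256);
                CudaDriver.cuDeviceGetName(name, name.Capacity, i);
                Console.WriteLine($"  [{i}] {name}");
            }
        }
    }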

If you would like to save yourself a lot of effort, you could buy NMath Premium from CenterSpace Software (who I work for) and be running large problems on your NVIDIA GPU in minutes from C#. NMath Premium is a large C#/.NET math library that can run much of LAPACK and FFTs on the GPU, but falls back to the CPU if the hardware isn't available or the problem size doesn't justify a round trip to the GPU.



Source: https://stackoverflow.com/questions/5894696/how-to-use-gpu-for-mathematics
