graph algorithms on GPU
Question: Current GPU threads are limited in several ways (per-thread memory, restricted data structures, no recursion, ...). Do you think it would be feasible to implement graph-theory problems on a GPU, for example vertex cover, dominating set, independent set, or max clique? Is it also feasible to run branch-and-bound algorithms on GPUs? Recursive backtracking?

Answer 1: You will be interested in "Exploring the Limits of GPUs With Parallel Graph Algorithms" and "Accelerating large graph algorithms on the GPU using CUDA".
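On the recursion concern raised in the question: the usual workaround is to rewrite the recursive search as a loop over an explicit stack, which also fits the branch-and-bound structure the question asks about. Below is a minimal, hedged sketch in Python (not GPU code, and not from the cited papers) of a branch-and-bound maximum-independent-set search using bitmasks and an explicit stack; the graph and function name are illustrative. The same stack-based shape is what one would port into a per-thread GPU kernel.

```python
def max_independent_set(n, adj):
    """Branch-and-bound maximum independent set, no recursion.

    n      -- number of vertices, labelled 0..n-1
    adj[v] -- bitmask of v's neighbours
    Returns the size of a maximum independent set.
    """
    best = 0
    # Each stack frame: (bitmask of remaining candidate vertices,
    #                    size of the current partial independent set)
    stack = [((1 << n) - 1, 0)]
    while stack:
        cand, size = stack.pop()
        if cand == 0:
            best = max(best, size)      # leaf: record the completed set
            continue
        # Bound: even taking every remaining candidate cannot beat `best`.
        if size + bin(cand).count("1") <= best:
            continue
        v = (cand & -cand).bit_length() - 1   # lowest-numbered candidate
        # Branch 1: include v; drop v and all its neighbours.
        stack.append((cand & ~((1 << v) | adj[v]), size + 1))
        # Branch 2: exclude v.
        stack.append((cand & ~(1 << v), size))
    return best

# Example: the 5-cycle 0-1-2-3-4-0; its maximum independent set has size 2.
adj = {0: 0b10010, 1: 0b00101, 2: 0b01010, 3: 0b10100, 4: 0b01001}
print(max_independent_set(5, adj))  # → 2
```

Bitmask adjacency is deliberate: it keeps each frame to two integers, which is the kind of flat, fixed-size state that GPU threads handle well, in contrast to pointer-based recursive frames.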