PySpark program for nested loop

Submitted by 大兔子大兔子 on 2019-12-08 11:09:16

Question


I am new to PySpark and I am trying to understand how to write multiple nested for loops in PySpark. A rough, high-level example is below. Any help will be appreciated.

for (i = 0; i < 10; i++)
    for (j = 0; j < 10; j++)
        for (k = 0; k < 10; k++) {
            print i "." j "." k
        }

Answer 1:


In a non-distributed setting, for loops are rewritten using the foreach combinator, but due to Spark's distributed nature, map and flatMap are a better choice:

# Each flatMap pairs every element with the values 0..9, so two
# applications build ((i, j), k) triples from the initial range.
a_loop = lambda x: ((x, y) for y in range(10))
print_me = lambda pair: print("{0}.{1}.{2}".format(pair[0][0], pair[0][1], pair[1]))

(sc
    .parallelize(range(10))
    .flatMap(a_loop)
    .flatMap(a_loop)
    .foreach(print_me))

Or using itertools.product:

from itertools import product

# product(range(10), repeat=3) yields all (i, j, k) tuples directly.
sc.parallelize(product(range(10), repeat=3)).foreach(print)
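One caveat worth noting: foreach runs on the executors, so the printed output lands in the worker logs rather than on the driver in a real cluster. The equivalence between the two approaches can be checked without a cluster at all; the sketch below reproduces the double flatMap with plain Python generators (chain.from_iterable plays the role of flatMap here) and compares it against itertools.product:

```python
from itertools import chain, product

# Plain-Python sketch of what the two Spark pipelines compute (no cluster needed).
# The double flatMap over a_loop builds ((i, j), k) triples; itertools.product
# builds flat (i, j, k) tuples. Both enumerate the same 1000 combinations.
a_loop = lambda x: ((x, y) for y in range(10))

# chain.from_iterable stands in for RDD.flatMap: apply a_loop, then flatten.
pairs = chain.from_iterable(a_loop(i) for i in range(10))          # (i, j)
triples = chain.from_iterable(a_loop(p) for p in pairs)            # ((i, j), k)
formatted_a = ["{0}.{1}.{2}".format(ij[0], ij[1], k) for ij, k in triples]

formatted_b = ["{0}.{1}.{2}".format(i, j, k)
               for i, j, k in product(range(10), repeat=3)]

assert formatted_a == formatted_b
print(len(formatted_a))  # 1000 combinations, "0.0.0" through "9.9.9"
```

Because both pipelines enumerate indices in the same lexicographic order, the two result lists are identical element for element.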


Source: https://stackoverflow.com/questions/31513823/pyspark-program-for-nested-loop
