PySpark: loading multiple partitioned files in a single load
I am trying to load multiple files in a single load. They are all partitioned files. When I tried it with one file it works, but when I listed 24 files it gives me an error, and I could not find any documentation of the limitation or a workaround other than doing a union after the load. Are there any alternatives?

Code below to re-create the problem:

```python
basepath = '/file/'
paths = ['/file/df201601.orc', '/file/df201602.orc', '/file/df201603.orc',
         '/file/df201604.orc', '/file/df201605.orc', '/file/df201606.orc',
         '/file/df201604.orc', '/file/df201605.orc', '/file/df201606.orc',
         '/file
```
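One documented alternative to unioning after the load: `DataFrameReader.load` accepts a list of paths, so all files can be read into a single DataFrame in one call. A minimal sketch follows, assuming the `/file/df2016MM.orc` naming from the question (the `load_all` helper name and the month range are hypothetical; the `basePath` option is only needed if the files are under a partitioned directory layout):

```python
# Sketch: build the path list, then pass the whole list to a single load() call.
# The naming pattern /file/df2016MM.orc mirrors the question; adjust as needed.
basepath = '/file/'
paths = [f'/file/df2016{month:02d}.orc' for month in range(1, 7)]


def load_all(spark, base, file_paths):
    """Read many ORC files into one DataFrame with a single load() call."""
    return (spark.read
            .format('orc')
            .option('basePath', base)   # keeps partition-column discovery consistent
            .load(file_paths))          # load() accepts a list of paths


if __name__ == '__main__':
    # Spark session is only created when run as a script.
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()
    df = load_all(spark, basepath, paths)
    df.printSchema()
```

If the files all sit under one root and partition columns are encoded in the directory names, `spark.read.orc(basepath)` with partition discovery may be simpler than enumerating paths at all.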