Scala 2.11 is out, and the 22-field limit for case classes seems to be fixed (Scala Issue, Release Notes).
This has been an issue for me for a while because I use case classes. There are cases where case classes might not work; one of these is that case classes cannot take more than 22 fields. Another is that you may not know the schema beforehand. In such cases, the schema can be specified programmatically: the data is loaded as an RDD of Row objects, the schema is created separately using StructType and StructField objects (which represent a table and a field respectively), and the schema is then applied to the row RDD to create a DataFrame in Spark.
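As a rough illustration, here is a minimal sketch of that approach using the Spark 2.x SparkSession API; the sample data, column names, and types are made up for the example (on Spark 1.x the equivalent createDataFrame call is available on SQLContext):

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object ProgrammaticSchemaExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ProgrammaticSchemaExample")
      .master("local[*]")
      .getOrCreate()

    // Load the data as an RDD of Row objects (hypothetical sample data).
    val rowRDD = spark.sparkContext.parallelize(Seq(
      Row("Alice", 29),
      Row("Bob", 35)
    ))

    // Build the schema separately: StructType represents the table,
    // and each StructField represents a single column.
    val schema = StructType(Seq(
      StructField("name", StringType, nullable = true),
      StructField("age", IntegerType, nullable = true)
    ))

    // Apply the schema to the row RDD to create a DataFrame.
    val df = spark.createDataFrame(rowRDD, schema)
    df.show()

    spark.stop()
  }
}
```

Because the StructFields are plain objects built at runtime, the schema can be as wide as you need and can even be constructed from metadata you only discover when the job runs, which is exactly where the case-class approach falls short.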