I am using Spark SQL 2.4.1 and Java 8.
val country_df = Seq( ("us",2001), ("fr",2002), ("jp",2002), ("in",2001), ("fr",2003),
Using pivot, you can get the distinct values as column names directly, like this:
val selectCols = col_df.groupBy().pivot($"country").agg(lit(null)).columns
data_df.select(selectCols.head, selectCols.tail: _*)
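For context, here is a minimal self-contained sketch of the same approach, wired up with the question's sample data. The SparkSession setup, the column names "country" and "year", and the assumption that data_df has one column per country value are mine, not from the original post:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit

val spark = SparkSession.builder()
  .appName("pivot-column-names")
  .master("local[*]")   // assumed: local run for illustration
  .getOrCreate()
import spark.implicits._

// Sample data from the question (column names assumed).
val country_df = Seq(("us", 2001), ("fr", 2002), ("jp", 2002), ("in", 2001), ("fr", 2003))
  .toDF("country", "year")

// Pivot with an empty groupBy and a constant aggregate: the distinct
// country values ("fr", "in", "jp", "us") become the column names of
// the pivoted frame, so .columns yields them directly.
val selectCols = country_df.groupBy().pivot($"country").agg(lit(null)).columns

// Assumed: data_df is a DataFrame whose columns include those country values.
// data_df.select(selectCols.head, selectCols.tail: _*).show()
```

Note that pivot accepting a Column argument (pivot($"country")) was added in Spark 2.4, so this works on 2.4.1; on older versions pass the column name as a String instead.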