I'm using Spark 2.0.2. I have a DataFrame that has an alias on it, and I'd like to be able to retrieve that. A simplified example of why I'd want that is below.
For Java:
As @veinhorn mentioned, it is also possible to get the alias in Java. Here is a utility method example:
import java.util.Optional;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan;
import org.apache.spark.sql.catalyst.plans.logical.SubqueryAlias;

public static Optional<String> getAlias(Dataset<?> dataset) {
    // The analyzer records an alias as a SubqueryAlias node at the root of the analyzed plan
    final LogicalPlan analyzed = dataset.queryExecution().analyzed();
    if (analyzed instanceof SubqueryAlias) {
        SubqueryAlias subqueryAlias = (SubqueryAlias) analyzed;
        return Optional.of(subqueryAlias.alias());
    }
    return Optional.empty();
}
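For example (a minimal sketch, assuming a SparkSession named spark is available and the helper above is in scope), an alias set with Dataset.alias can then be read back:

    Dataset<Row> df = spark.range(3).toDF("id").alias("my_alias");
    getAlias(df).ifPresent(alias -> System.out.println(alias)); // prints "my_alias"
    getAlias(spark.range(3).toDF("id")); // no alias set, returns Optional.empty()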