Reasons to use Azure Data Lake Analytics vs. a traditional ETL approach
I've been studying Data Lake technologies for the last few weeks and am comparing them with the traditional SSIS ETL scenarios I have worked with for many years. I think of Data Lake as something closely tied to big data, but where is the line between using Data Lake technologies and SSIS?

Is there any advantage to using Data Lake technologies with files in the 25 MB to 300 MB range? Parallelism? Flexibility? Future extensibility? Is there any performance gain when the files to be loaded are not as big as U-SQL's best-case scenario?

What are your thoughts?