azure-data-lake

Azure Data Flow taking minutes to trigger the next pipeline

人盡茶涼 · Submitted on 2020-01-03 03:09:07

Question: Azure Data Factory transfers the data into the DB in about 10 milliseconds, but it then waits several minutes before triggering the next pipeline, so the full run ends up taking 40 minutes even though every pipeline transfers its data in under 20 ms. I have used debug mode, and I have also triggered the ADF pipeline from a Logic App without debug mode. Is there any way to optimize this? We want to move from SSIS to Data Flow, but we have this timing issue, and 40 minutes is far too much…

Stream Analytics Job -> DataLake output

99封情书 · Submitted on 2020-01-02 10:03:58

Question: I want to set up CI/CD (an ARM template) for a Stream Analytics job whose output is set to a Data Lake Store. https://docs.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs/outputs#microsoftdatalakeaccounts The issue comes with refreshToken: "It is recommended to put a dummy string value here when creating the data source and then going to the Azure Portal to authenticate the data source which will update this property with a valid refresh token" Furthermore, after 90 days…
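Based on the template reference linked above, the Data Lake output datasource looks roughly like the sketch below. This is a hedged illustration only: the account name, path prefix, and user values are placeholders, and refreshToken carries the dummy string the docs say to replace by authenticating in the portal afterwards.

```json
{
  "name": "DataLakeOutput",
  "properties": {
    "datasource": {
      "type": "Microsoft.DataLake/Accounts",
      "properties": {
        "accountName": "mydatalakestore",
        "filePathPrefix": "output/{date}/{time}",
        "dateFormat": "yyyy/MM/dd",
        "timeFormat": "HH",
        "tenantId": "00000000-0000-0000-0000-000000000000",
        "refreshToken": "dummytoken",
        "tokenUserPrincipalName": "user@contoso.com",
        "tokenUserDisplayName": "User Name"
      }
    }
  }
}
```

The awkward part for CI/CD is exactly the quoted caveat: the template can only deploy the dummy token, and a valid refresh token still has to be produced interactively in the portal after each deployment.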


Azure Function exception could not load ActiveDirectory file or assembly

℡╲_俬逩灬. · Submitted on 2020-01-02 08:54:54

Question: I am trying to write an Azure timer function that writes files to Azure Data Lake, but after adding the needed NuGet packages I get an error when I start the host:

[21/5/2018 8:36:20 AM] Executed 'NWPimFeederFromAws' (Failed, Id=03395101-41a5-44ef-96d8-f69c5d73eca7)
[21/5/2018 8:36:20 AM] System.Private.CoreLib: Exception while executing function: NWPimFeederFromAws. NWPimFeeder: Could not load file or assembly 'Microsoft.IdentityModel.Clients.ActiveDirectory,…
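A common cause of this kind of failure is a version mismatch: the Functions host resolves a different Microsoft.IdentityModel.Clients.ActiveDirectory (ADAL) assembly than the one the Data Lake SDK was compiled against. One hedged fix is to reference the ADAL package explicitly in the function project so a single version wins; the package names are real, but the version numbers below are illustrative only.

```xml
<!-- Sketch: pin ADAL explicitly alongside the Data Lake SDK so the
     Functions host loads one consistent version. Versions are examples,
     not a verified combination. -->
<ItemGroup>
  <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="1.0.13" />
  <PackageReference Include="Microsoft.Azure.DataLake.Store" Version="1.1.9" />
  <PackageReference Include="Microsoft.IdentityModel.Clients.ActiveDirectory" Version="3.14.2" />
</ItemGroup>
```

If pinning alone does not help, checking which ADAL version the Data Lake package actually depends on (via its NuGet dependency list) and matching it exactly is the usual next step.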

Data Lake Analytics U-SQL EXTRACT speed (Local vs Azure)

情到浓时终转凉″ · Submitted on 2020-01-02 07:51:10

Question: I have been looking into using Azure Data Lake Analytics to manipulate some gzipped XML data I have stored in Azure Blob Storage, but I'm running into an interesting issue. When I use U-SQL locally to process 500 of these XML files, the processing is extremely quick: roughly 40 seconds using 1 AU locally (which appears to be the limit). However, when we run the same job in Azure using 5 AUs, the processing takes 17+ minutes. We are…

Parse json file in U-SQL

不羁岁月 · Submitted on 2020-01-02 02:53:06

Question: I'm trying to parse the JSON file below using U-SQL but keep getting an error.

JSON file:
{"dimBetType_SKey":1,"BetType_BKey":1,"BetTypeName":"Test1"}
{"dimBetType_SKey":2,"BetType_BKey":2,"BetTypeName":"Test2"}
{"dimBetType_SKey":3,"BetType_BKey":3,"BetTypeName":"Test3"}

Below is the U-SQL script I'm using to extract the data from the file above:
REFERENCE ASSEMBLY [Newtonsoft.Json];
REFERENCE ASSEMBLY [Microsoft.Analytics.Samples.Formats];
DECLARE @Full_Path string = "adl://xxxx.azuredatalakestore.net…
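For line-delimited JSON like this (one object per line, not a single JSON document), one workable sketch is to extract each line as a raw string and convert the fields afterwards with JsonFunctions.JsonTuple from the samples assembly already referenced above. Paths here are placeholders, not the asker's real ones.

```sql
REFERENCE ASSEMBLY [Newtonsoft.Json];
REFERENCE ASSEMBLY [Microsoft.Analytics.Samples.Formats];

USING Microsoft.Analytics.Samples.Formats.Json;

DECLARE @in string = "/input/BetTypes.json";   // hypothetical path

// Each input line is a complete JSON object, so pull whole lines in as text;
// a delimiter that never occurs in the data keeps each line intact.
@lines =
    EXTRACT jsonLine string
    FROM @in
    USING Extractors.Text(delimiter : '\b', quoting : false);

// JsonTuple turns one JSON object into a SqlMap of name/value strings.
@rows =
    SELECT JsonFunctions.JsonTuple(jsonLine) AS j
    FROM @lines;

@result =
    SELECT Int32.Parse(j["dimBetType_SKey"]) AS dimBetType_SKey,
           Int32.Parse(j["BetType_BKey"])    AS BetType_BKey,
           j["BetTypeName"]                  AS BetTypeName
    FROM @rows;

OUTPUT @result TO "/output/BetTypes.csv" USING Outputters.Csv();
```

A frequent error with this kind of script is pointing a JSON extractor at the whole file as if it were a single document; treating each line as its own object, as above, sidesteps that.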

How can I log something in USQL UDO?

两盒软妹~` · Submitted on 2020-01-01 15:42:08

Question: I have a custom extractor, and I'm trying to log some messages from it. I've tried obvious things like Console.WriteLine, but I cannot find where the output goes. However, I found some system logs in adl://<my_DLS>.azuredatalakestore.net/system/jobservice/jobs/Usql/.../<my_job_id>/. How can I log something? Is it possible to specify a log file somewhere on the Data Lake Store or a Blob Storage account?

Answer 1: A recent release of U-SQL has added diagnostic logging for UDOs. See the release notes here. // Enable…

How to use Azure Data Lake Store as an input data set for Azure ML?

痴心易碎 · Submitted on 2019-12-31 05:28:27

Question: I am moving data into Azure Data Lake Store and processing it using Azure Data Lake Analytics. The data is XML, and I am reading it through an XML extractor. Now I want to access this data from Azure ML, and it looks like Azure Data Lake Store is not directly supported at the moment. What are the possible ways to use Azure Data Lake Store with Azure ML?

Answer 1: Right now, Azure Data Lake Store is not a supported source, as you note. That said, Azure Data Lake Analytics can also be used to…

Azure Data lake analytics CI/CD

核能气质少年 · Submitted on 2019-12-31 03:46:07

Question: I'm trying to build CI/CD for Azure Data Lake Analytics U-SQL code, and when I build the code using the Visual Studio Build task in VSTS (on a private agent) I get the error below:

C:\Users\a.sivananthan\AppData\Roaming\Microsoft\DataLake\MsBuild\1.0\Usql.targets(33,5): Error MSB4062: The "Microsoft.Cosmos.ScopeStudio.VsExtension.CompilerTask.USqlCompilerTask" task could not be loaded from the assembly Microsoft.Cosmos.ScopeStudio.VsExtension.CompilerTask. Could not…

U-SQL Split a CSV file to multiple files based on Distinct values in file

不打扰是莪最后的温柔 · Submitted on 2019-12-30 10:35:27

Question: I have data in Azure Data Lake Store, and I am processing it with an Azure Data Lake Analytics job written in U-SQL. I have several CSV files which contain spatial data, similar to this:

File_20170301.csv

longtitude | lattitude | date       | hour | value1
-----------+-----------+------------+------+-------
45.121     | 21.123    | 2017-03-01 | 01   | 20
45.121     | 21.123    | 2017-03-01 | 02   | 10
45.121     | 21.123    | 2017-03-01 | 03   | 50
48.121     | 35.123    | 2017-03-01 | 01   | 60
48.121     | 35.123    | 2017-03-01 | …
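Splitting by distinct column values maps onto U-SQL's partitioned OUTPUT (file sets in the output path). The sketch below assumes the partitioned-output preview feature is available on the account; paths are placeholders, the column names (including the original misspellings) come from the sample file, and the feature-flag string reflects the preview documentation and may have changed since.

```sql
// Sketch only: partitioned OUTPUT was a per-script preview feature.
SET @@FeaturePreviews = "DataPartitionedOutput:on";

DECLARE @in  string = "/input/File_20170301.csv";             // hypothetical path
DECLARE @out string = "/output/{longtitude}_{lattitude}.csv";  // one file per distinct pair

@rows =
    EXTRACT longtitude string,
            lattitude  string,
            date       string,
            hour       string,
            value1     int
    FROM @in
    USING Extractors.Csv(skipFirstNRows : 1);

// The {longtitude} and {lattitude} tokens in @out partition the rows,
// producing one output file per distinct coordinate pair.
OUTPUT @rows
TO @out
USING Outputters.Csv(outputHeader : true);
```

Without the partitioned-output feature, the fallback is one OUTPUT statement per known distinct value with a WHERE filter, which only works when the set of values is fixed in advance.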