cortana-intelligence

AzureML: “Train Matchbox Recommender” is not working and does not describe the error

邮差的信 submitted on 2019-12-10 21:03:51
Question: I tried to create my own experiment using the module, but failed to make it work. Here is the exception I got: Error 0018: Training dataset of user-item-rating triples contains invalid data. [Critical] {"InputParameters":{"DataTable":[{"Rows":14,"Columns":3,"estimatedSize":12668928,"ColumnTypes":{"System.String":1,"System.Int32":1,"System.Double":1},"IsComplete":true,"Statistics":{"0":[10,0],"1":[5422.0,5999.0,873.0,6616.0,1758.0582820478173,7.0,0.0],"2":[1.0,1.0,1.0,1.0,0.0,1.0,0.0]}},{"Rows
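Error 0018 generally points at rows that cannot be read as (user, item, rating) triples, for example missing values or a non-numeric rating column. A minimal pre-check sketch that could run in an Execute Python Script module before training; the column handling below is an assumption about the dataset, not taken from the failing experiment:

import pandas as pd

def azureml_main(dataframe1=None, dataframe2=None):
    # Train Matchbox Recommender expects exactly three columns:
    # user identifier, item identifier, numeric rating.
    df = dataframe1.iloc[:, :3].copy()
    df.columns = ['user', 'item', 'rating']

    # Coerce ratings to numbers; anything unparseable becomes NaN.
    df['rating'] = pd.to_numeric(df['rating'], errors='coerce')

    # Drop rows with a missing user, item, or rating, which the
    # module would otherwise reject as invalid triples.
    df = df.dropna(subset=['user', 'item', 'rating'])

    return df,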

Azure Data Factory and stored procedure

不想你离开。 submitted on 2019-12-08 09:35:11
Question: I've got a problem with Azure Data Factory and a stored procedure. I've set the SP as the sink for the input data:

"sink": {
    "type": "SqlSink",
    "sqlWriterStoredProcedureName": "spAddProducts",
    "storedProcedureParameters": {
        "stringProductData": { "value": "str1" }
    }
},

There are about 200k records to process, but after some limited number of rows (about 10k) the run fails with this error: Copy activity met invalid parameters: ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared
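For comparison, the documented ADF (v1) shape for feeding a copy activity into a stored procedure passes the rows as a table-valued parameter via sqlWriterTableType. A sketch of that shape, where ProductType is a hypothetical SQL table type that spAddProducts would need to accept as a READONLY parameter (not something stated in the question); it may also be relevant that a SQL sink's default writeBatchSize is 10,000 rows, which roughly matches the point where this copy fails:

"sink": {
    "type": "SqlSink",
    "sqlWriterStoredProcedureName": "spAddProducts",
    "sqlWriterTableType": "ProductType",
    "storedProcedureParameters": {
        "stringProductData": { "value": "str1", "type": "String" }
    }
}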

Azure Data Lake Analytics: Combine overlapping time duration using U-SQL

不羁岁月 submitted on 2019-12-02 04:31:32
Question: I want to remove overlapping time durations from CSV data placed in Azure Data Lake Store using U-SQL and combine those rows. The data set contains a start time and end time, with several other attributes, for each record. Here is an example:

Start Time - End Time - User Name
5:00 AM - 6:00 AM - ABC
5:00 AM - 6:00 AM - XYZ
8:00 AM - 9:00 AM - ABC
8:00 AM - 10:00 AM - ABC
10:00 AM - 2:00 PM - ABC
7:00 AM - 11:00 AM - ABC
9:00 AM - 11:00 AM - ABC
11:00 AM - 11:30 AM - ABC

After removing overlap, output
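The question asks for U-SQL, but the core of any answer is the classic merge-overlapping-intervals pass: sort each user's rows by start time, then fold a row into the previous span whenever it starts before that span ends. A sketch of that logic in Python, only to illustrate the algorithm (names are illustrative):

from datetime import datetime

def merge_spans(rows):
    # rows: list of (start, end, user) tuples with comparable times.
    merged = {}
    for start, end, user in sorted(rows, key=lambda r: (r[2], r[0])):
        spans = merged.setdefault(user, [])
        if spans and start <= spans[-1][1]:
            # Overlaps (or touches) the previous span: extend it.
            spans[-1] = (spans[-1][0], max(spans[-1][1], end))
        else:
            spans.append((start, end))
    return merged

t = lambda s: datetime.strptime(s, '%I:%M %p')
print(merge_spans([(t('5:00 AM'), t('6:00 AM'), 'ABC'),
                   (t('8:00 AM'), t('9:00 AM'), 'ABC'),
                   (t('8:00 AM'), t('10:00 AM'), 'ABC')]))
# For 'ABC', 8:00-9:00 and 8:00-10:00 merge into one 8:00-10:00 span.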

Using Azure Data Factory to get data from a REST API

余生长醉 submitted on 2019-11-30 09:50:35
Is it possible to use Azure Data Factory to get data from a REST API and insert it into an Azure database table? Data Factory now has an HTTP connector, allowing you to do a GET or POST (with a body) to an HTTP endpoint. For example:

{
    "name": "HttpLinkedService",
    "properties": {
        "type": "Http",
        "typeProperties": {
            "authenticationType": "Anonymous",
            "url": "https://en.wikipedia.org/wiki/"
        }
    }
}

I have done this using Custom .NET Activities. I had a need to pull data from the Salesforce API. I have a write-up on how to do this here: http://eatcodelive.com/2016/02/06/accessing-azure-data-lake-store-from
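For the custom-activity route, the activity body boils down to "call the API, then insert the rows". A rough sketch of that flow, shown in Python for brevity; the endpoint, connection string, and table names are placeholders, not taken from the write-up:

import requests
import pyodbc

# Pull records from the REST endpoint (placeholder URL).
resp = requests.get('https://api.example.com/records', timeout=30)
resp.raise_for_status()

# Insert them into an Azure SQL Database table (placeholder names).
conn = pyodbc.connect('Driver={ODBC Driver 17 for SQL Server};'
                      'Server=myserver.database.windows.net;'
                      'Database=mydb;Uid=myuser;Pwd=mypassword')
with conn:  # commits on clean exit
    cursor = conn.cursor()
    for rec in resp.json():
        cursor.execute('INSERT INTO dbo.Records (Id, Name) VALUES (?, ?)',
                       rec['id'], rec['name'])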

Can one have multiple queries in a streaming analytics job?

三世轮回 submitted on 2019-11-29 10:54:09
As the title says, can you have more than one query in an Azure Streaming Analytics job? If so, how should that be structured? Vignesh Chandramohan: Yes, you can have multiple queries in a Stream Analytics job. You would do something like the below:

SELECT * INTO type1Output FROM inputSource WHERE type = 1
SELECT * INTO type2Output FROM inputSource WHERE type = 2

The job has two outputs defined, called type1Output and type2Output. Each query writes to a different output. Source: https://stackoverflow.com/questions/36287058/can-one-have-multiple-queries-in-streaming-analytics-job

Access Azure blob storage from within an Azure ML experiment

Deadly submitted on 2019-11-27 08:25:05
Azure ML Experiments provide ways to read and write CSV files to Azure blob storage through the Reader and Writer modules. However, I need to write a JSON file to blob storage. Since there is no module to do so, I'm trying to do it from within an Execute Python Script module.

# Import the necessary items
from azure.storage.blob import BlobService

def azureml_main(dataframe1 = None, dataframe2 = None):
    account_name = 'mystorageaccount'
    account_key = 'mykeyhere=='
    json_string = '{jsonstring here}'
    blob_service = BlobService(account_name, account_key)
    # Legacy SDK signature: put_block_blob_from_text(container, blob_name, text).
    # The blob name here is illustrative; the original snippet was cut off.
    blob_service.put_block_blob_from_text("upload", "output.json", json_string)
    # Execute Python Script must return a sequence of DataFrames.
    return dataframe1,
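One assumption worth checking (not stated in the excerpt): in the legacy SDK, put_block_blob_from_text fails with a 404 if the target container does not exist, so the container may need to be created first:

# Create the container if it is not already there; with the default
# fail_on_exist=False this is safe to call on every run.
blob_service.create_container("upload")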
