30Mb limit uploading to Azure DataLake using DataLakeStoreFileSystemManagementClient

Submitted by 回眸只為那壹抹淺笑 on 2019-11-29 14:04:25

Please try using DataLakeStoreUploader to upload a file or directory to Data Lake; for more demo code, refer to the github sample. I tested the demo and it works correctly for me. You can get the Microsoft.Azure.Management.DataLake.Store and Microsoft.Azure.Management.DataLake.StoreUploader SDKs from NuGet. The following are my detailed steps:
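
Both packages can be installed from the NuGet Package Manager Console; the `-Pre` flag is needed because, as the packages.config below shows, they were preview releases at the time:

```shell
Install-Package Microsoft.Azure.Management.DataLake.Store -Pre
Install-Package Microsoft.Azure.Management.DataLake.StoreUploader -Pre
```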

  1. Create a C# console application
  2. Add the following code

     using Microsoft.Azure.Management.DataLake.Store;
     using Microsoft.Azure.Management.DataLake.StoreUploader;
     using Microsoft.Rest.Azure.Authentication;

     var applicationId = "your application Id";
     var secretKey = "secret key";
     var tenantId = "your tenant Id";
     var adlsAccountName = "adls account name";
     // Authenticate with an Azure AD service principal
     var creds = ApplicationTokenProvider.LoginSilentAsync(tenantId, applicationId, secretKey).Result;
     var adlsFileSystemClient = new DataLakeStoreFileSystemManagementClient(creds);
     var inputFilePath = @"c:\tom\ForDemoCode.zip";
     var targetStreamPath = "/mytempdir/ForDemoCode.zip"; // path inside the Data Lake Store, not the local full path
     // The default maxSegmentLength is 256 MB; we can set it ourselves (512 MB here).
     var parameters = new UploadParameters(inputFilePath, targetStreamPath, adlsAccountName, isOverwrite: true, maxSegmentLength: 268435456 * 2);
     var frontend = new DataLakeStoreFrontEndAdapter(adlsAccountName, adlsFileSystemClient);
     var uploader = new DataLakeStoreUploader(parameters, frontend);
     uploader.Execute();
    
  3. Debug the application.

  4. Check the result in the Azure portal.

For SDK version info, refer to the packages.config file:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Azure.Management.DataLake.Store" version="1.0.2-preview" targetFramework="net452" />
  <package id="Microsoft.Azure.Management.DataLake.StoreUploader" version="1.0.0-preview" targetFramework="net452" />
  <package id="Microsoft.IdentityModel.Clients.ActiveDirectory" version="3.13.8" targetFramework="net452" />
  <package id="Microsoft.Rest.ClientRuntime" version="2.3.2" targetFramework="net452" />
  <package id="Microsoft.Rest.ClientRuntime.Azure" version="3.3.2" targetFramework="net452" />
  <package id="Microsoft.Rest.ClientRuntime.Azure.Authentication" version="2.2.0-preview" targetFramework="net452" />
  <package id="Newtonsoft.Json" version="9.0.2-beta1" targetFramework="net452" />
</packages>

This was answered here.

Currently there is a size limit of 30,000,000 bytes per request. You can work around it by creating an initial file and then appending to it, keeping each stream below the limit.
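
The create-then-append workaround can be sketched with the `FileSystem` operations on the same `adlsFileSystemClient` from the snippet above. `Create` and `Append` exist in this SDK, but the exact overloads vary between preview releases, so treat this as a sketch rather than a definitive implementation:

```csharp
// Sketch: upload a large file in chunks, each below the ~30,000,000-byte request limit.
// Assumes adlsFileSystemClient, adlsAccountName, inputFilePath and targetStreamPath
// are defined as in the earlier snippet.
const int chunkSize = 20 * 1024 * 1024; // 20 MB, safely under the limit
var buffer = new byte[chunkSize];

using (var input = File.OpenRead(inputFilePath))
{
    // The first chunk creates the file...
    int read = input.Read(buffer, 0, chunkSize);
    using (var ms = new MemoryStream(buffer, 0, read))
        adlsFileSystemClient.FileSystem.Create(adlsAccountName, targetStreamPath, ms, overwrite: true);

    // ...and every subsequent chunk is appended.
    while ((read = input.Read(buffer, 0, chunkSize)) > 0)
    {
        using (var ms = new MemoryStream(buffer, 0, read))
            adlsFileSystemClient.FileSystem.Append(adlsAccountName, targetStreamPath, ms);
    }
}
```

This is essentially what DataLakeStoreUploader does internally, with segmenting and retries handled for you, which is why the uploader is the recommended route.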
