Approach for archiving big files in Azure Storage

Submitted by 可紊 on 2019-12-11 18:28:33

Question


I'm interested in the approach you take to storing sensitive data in the cloud in your projects.

I want to prepare a proof of concept for a feature that has to archive files in Azure. We are talking about something around 30 GB of new files per day, at 10–2000 MB per file.

My first idea was to use Azure Storage to store those files. Files should be sent to storage via an Azure App Service, so I have to prepare some Web API.

Based on this idea, I am curious whether there will be any problems with sending such big files to the Web API. Any tips on what else I should consider?


Answer 1:


The default maximum request size for ASP.NET is 4 MB, so you'll need to increase that limit if you want to allow uploading and downloading of larger files.

You can do this by setting the maxRequestLength and maxAllowedContentLength values in your web.config file.

Note: the maxRequestLength value is in kilobytes, but the maxAllowedContentLength value is in bytes.
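For example, a 2000 MB limit works out to 2000 × 1024 = 2,048,000 KB for maxRequestLength, and 2000 × 1024 × 1024 = 2,097,152,000 bytes for maxAllowedContentLength.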

Here is an example for increasing the request limits to 2000 MB:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
    <system.web>
        <!-- 2000 MB in kilobytes -->
        <httpRuntime maxRequestLength="2048000" />
    </system.web>

    <system.webServer>
        <security>
            <requestFiltering>
                <!-- 2000 MB in bytes -->
                <requestLimits maxAllowedContentLength="2097152000" />
            </requestFiltering>
        </security>
    </system.webServer>
</configuration>
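One caveat worth knowing: maxAllowedContentLength is an unsigned 32-bit value in IIS, so it tops out just under 4 GB. If individual files could ever exceed that, a single Web API request won't work and you would need chunked uploads or a direct-to-Blob-Storage approach instead.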

You could use async Task actions, mainly for better support when handling large files.
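As a rough illustration, here is a minimal sketch of such an async upload action using the classic WindowsAzure.Storage SDK. The controller name, container name, and app setting key are assumptions for the example, not anything prescribed above:

using System.Configuration;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class ArchiveController : ApiController
{
    // POST api/archive?fileName=report.zip
    [HttpPost]
    public async Task<IHttpActionResult> Upload(string fileName)
    {
        // "StorageConnectionString" and the "archive" container are
        // placeholders; point them at your own storage account.
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        var container = account.CreateCloudBlobClient()
            .GetContainerReference("archive");
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlockBlobReference(fileName);

        // Stream the request body straight into Blob Storage instead of
        // buffering the whole file in server memory.
        using (var stream = await Request.Content.ReadAsStreamAsync())
        {
            await blob.UploadFromStreamAsync(stream);
        }

        return Ok(blob.Uri);
    }
}

The key point is that the request body is streamed to the blob rather than read into a byte array, which matters once files approach the 2000 MB upper bound.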

Here is an article about uploading and downloading large files with Web API and Azure Blob Storage that you could refer to.
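For completeness, a caller could stream the file to such an endpoint with HttpClient and StreamContent, so the client side avoids buffering the whole file in memory too. The URL and file path below are placeholders:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class UploadClient
{
    static async Task Main()
    {
        // Generous timeout, since a multi-gigabyte upload can take a while.
        using (var client = new HttpClient { Timeout = TimeSpan.FromMinutes(30) })
        using (var file = File.OpenRead(@"C:\data\sample.bin")) // placeholder path
        {
            // StreamContent sends the file as a raw request body without
            // loading it fully into memory.
            var response = await client.PostAsync(
                "https://yourapp.azurewebsites.net/api/archive?fileName=sample.bin",
                new StreamContent(file));
            Console.WriteLine(response.StatusCode);
        }
    }
}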



Source: https://stackoverflow.com/questions/49520167/approach-for-archive-big-files-in-azure-storage
