Uploading large files to Sharepoint 365

Submitted by 馋奶兔 on 2019-12-24 18:25:15

Question


I'm using the CSOM to upload files to a Sharepoint 365 site.

I've logged in successfully with claims-based authentication using the methods found here: http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx

But using SaveBinaryDirect on the ClientContext fails with a 405, because the cookies are attached to the request too late.

Another method of uploading files with CSOM is shown below. But with SP 365, this limits the file size to about 3 MB.

var newFileFromComputer = new FileCreationInformation
{
    Content = fileContents,
    Url = Path.GetFileName(sourceUrl)
};

Microsoft.SharePoint.Client.File uploadedFile = customerFolder.Files.Add(newFileFromComputer);
context.Load(uploadedFile);
context.ExecuteQuery();

Is there ANY way to do this using CSOM, SP 365, and file sizes up to, say, 100 MB?


Answer 1:


Indeed, when trying to upload a file to SharePoint Online whose size exceeds the 250 MB limit, the following exception occurs:

Response received was -1, Microsoft.SharePoint.Client.InvalidClientQueryException: The request message is too big. The server does not allow messages larger than 262144000 bytes.

To circumvent this error, chunked file upload methods (StartUpload, ContinueUpload, and FinishUpload) have been introduced, which support uploading files larger than 250 MB. The linked article includes a sample that demonstrates how to use them via the SharePoint CSOM API.

Supported versions:

  • SharePoint Online
  • SharePoint On-Premises 2016 or later
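Since the question asks specifically about CSOM: the same chunked-upload methods are exposed as StartUpload, ContinueUpload, and FinishUpload on Microsoft.SharePoint.Client.File. Below is a minimal sketch, assuming an already-authenticated ClientContext; the folder URL parameter and chunk size are illustrative, not part of the original answer:

```csharp
using System;
using Microsoft.SharePoint.Client;

static class CsomUploader
{
    // Sketch: chunked upload via CSOM (SharePoint Online / 2016 and later).
    // Assumes "context" is an authenticated ClientContext.
    public static void CsomChunkedUpload(ClientContext context, string serverRelativeFolderUrl,
        string sourcePath, int chunkSizeBytes)
    {
        var fileName = System.IO.Path.GetFileName(sourcePath);
        var folder = context.Web.GetFolderByServerRelativeUrl(serverRelativeFolderUrl);

        using (var fs = System.IO.File.OpenRead(sourcePath))
        {
            // Files that fit in a single chunk don't need an upload session at all.
            if (fs.Length <= chunkSizeBytes)
            {
                var small = new FileCreationInformation
                {
                    ContentStream = fs,
                    Url = fileName,
                    Overwrite = true
                };
                folder.Files.Add(small);
                context.ExecuteQuery();
                return;
            }

            Microsoft.SharePoint.Client.File uploadFile = null;
            var uploadId = Guid.NewGuid();
            long offset = 0;
            var buffer = new byte[chunkSizeBytes];
            int bytesRead;

            while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                using (var chunk = new System.IO.MemoryStream(buffer, 0, bytesRead))
                {
                    if (offset == 0)
                    {
                        // First chunk: create an empty placeholder file,
                        // then open the upload session against it.
                        var info = new FileCreationInformation
                        {
                            ContentStream = new System.IO.MemoryStream(),
                            Url = fileName,
                            Overwrite = true
                        };
                        uploadFile = folder.Files.Add(info);
                        uploadFile.StartUpload(uploadId, chunk);
                    }
                    else if (fs.Position == fs.Length)
                    {
                        // Final chunk commits the file.
                        uploadFile.FinishUpload(uploadId, offset, chunk);
                    }
                    else
                    {
                        uploadFile.ContinueUpload(uploadId, offset, chunk);
                    }
                    context.ExecuteQuery();
                }
                offset += bytesRead;
            }
        }
    }
}
```

Each ExecuteQuery sends one chunk over the wire, so memory usage stays bounded by the chunk size regardless of the file size.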

The following example demonstrates how to use the chunked file upload methods via the SharePoint REST API:

using System;
using System.Net;
using Newtonsoft.Json.Linq;

class FileUploader
{
    public static void ChunkedFileUpload(string webUrl, ICredentials credentials, string sourcePath, string targetFolderUrl, int chunkSizeBytes, Action<long, long> chunkUploaded)
    {
        using (var client = new WebClient())
        {
            client.BaseAddress = webUrl;
            client.Credentials = credentials;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            var formDigest = RequestFormDigest(webUrl, credentials);
            client.Headers.Add("X-RequestDigest", formDigest);

            // Create an empty file first.
            var fileName = System.IO.Path.GetFileName(sourcePath);
            var createFileRequestUrl = string.Format("/_api/web/getfolderbyserverrelativeurl('{0}')/files/add(url='{1}',overwrite=true)", targetFolderUrl, fileName);
            client.UploadString(createFileRequestUrl, "POST");

            // Build the server-relative URL with "/" (Path.Combine would insert "\" on Windows).
            var targetUrl = targetFolderUrl.TrimEnd('/') + "/" + fileName;
            var firstChunk = true;
            var uploadId = Guid.NewGuid();
            var offset = 0L;

            using (var inputStream = System.IO.File.OpenRead(sourcePath))
            {
                var buffer = new byte[chunkSizeBytes];
                int bytesRead;
                while ((bytesRead = inputStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // WebClient drops custom headers after each request, so re-apply the digest.
                    client.Headers["X-RequestDigest"] = formDigest;

                    // Only send the bytes actually read; the last read usually fills
                    // less than a full buffer.
                    var chunk = buffer;
                    if (bytesRead < buffer.Length)
                    {
                        chunk = new byte[bytesRead];
                        Array.Copy(buffer, chunk, bytesRead);
                    }

                    if (firstChunk)
                    {
                        // Opens the upload session with the first chunk. Note: a file that
                        // fits in a single chunk should instead be uploaded directly via
                        // files/add, since the session is only committed by finishupload.
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/startupload(uploadId=guid'{1}')", targetUrl, uploadId);
                        client.UploadData(endpointUrl, chunk);
                        firstChunk = false;
                    }
                    else if (inputStream.Position == inputStream.Length)
                    {
                        // The final chunk commits the upload.
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/finishupload(uploadId=guid'{1}',fileOffset={2})", targetUrl, uploadId, offset);
                        client.UploadData(endpointUrl, chunk);
                    }
                    else
                    {
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/continueupload(uploadId=guid'{1}',fileOffset={2})", targetUrl, uploadId, offset);
                        client.UploadData(endpointUrl, chunk);
                    }
                    offset += bytesRead;
                    chunkUploaded(offset, inputStream.Length);
                }
            }
        }
    }

    public static string RequestFormDigest(string webUrl, ICredentials credentials)
    {
        using (var client = new WebClient())
        {
            client.BaseAddress = webUrl;
            client.Credentials = credentials;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            client.Headers.Add("Accept", "application/json; odata=verbose");
            var endpointUrl = "/_api/contextinfo";
            var content = client.UploadString(endpointUrl, "POST");
            var data = JObject.Parse(content);
            return data["d"]["GetContextWebInformation"]["FormDigestValue"].ToString();
        }
    }
}

Source code: FileUploader.cs

Usage

var userCredentials = GetCredentials(userName, password);
var sourcePath = @"C:\temp\jellyfish-25-mbps-hd-hevc.mkv"; // local file path
var targetFolderUrl = "/Shared Documents"; // library-relative URL
FileUploader.ChunkedFileUpload(webUrl,
       userCredentials,
       sourcePath, 
       targetFolderUrl, 
       1024 * 1024 * 5, //5MB
       (offset, size) =>
       {
            Console.WriteLine("{0:P} completed", (offset / (float)size));
       });

References

Always use File Chunking to Upload Files > 250 MB to SharePoint Online




Answer 2:


Well, I haven't found a way to do it with the CSOM and that is truly infuriating.

A workaround was posted by SEvans in the comments on http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx .

Basically, just do an HTTP PUT and attach the cookie collection from the claims-based authentication. SEvans's posted workaround is below:


Great piece of code, Wictor. As others have noted, SaveBinaryDirect does not work correctly, as the FedAuth cookies never get attached to the HTTP PUT request that the method generates.

Here's my workaround - hope this helps some of you:

// "url" is the full destination path (including filename, i.e. https://mysite.sharepoint.com/Documents/Test.txt) 

// "cookie" is the CookieContainer generated from Wictor's code 
// "data" is the byte array containing the files contents (used a FileStream to load) 

System.Net.ServicePointManager.Expect100Continue = false;
HttpWebRequest request = HttpWebRequest.Create(url) as HttpWebRequest;
request.Method = "PUT";
request.Accept = "*/*";
request.ContentType = "multipart/form-data; charset=utf-8";
request.CookieContainer = cookie;
request.AllowAutoRedirect = false;
request.UserAgent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)";
request.Headers.Add("Accept-Language", "en-us");
request.Headers.Add("Translate", "F");
request.Headers.Add("Cache-Control", "no-cache");
request.ContentLength = data.Length;

using (Stream req = request.GetRequestStream())
{
    req.Write(data, 0, data.Length);
}

HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream res = response.GetResponseStream();
StreamReader rdr = new StreamReader(res);
string rawResponse = rdr.ReadToEnd();
rdr.Close();
response.Close();


Source: https://stackoverflow.com/questions/15077305/uploading-large-files-to-sharepoint-365
