filestream

Pass Base64 encoded string using JsonTextReader value as new stream

Submitted by 爱⌒轻易说出口 on 2019-12-07 00:51:47
We are consuming large JSON streams from an HTTP POST request. The goal is to stream the incoming body as JSON using JsonTextReader and extract the embedded base64-encoded binary files to disk. In XML, an equivalent method would be XmlReader.ReadElementContentAsBase64Async. Using JSON.NET, as we iterate, how do we send each item of the encodedImages array into a FileStream without holding the whole string in memory? Example JSON object: { "company":"{clientCompany}", "batchName":"{clientBatchName}", "fileType":"{clientFileType}", "encodedImages":[ "{base64encodedimage}", "
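A minimal sketch of one approach with Newtonsoft.Json's JsonTextReader, assuming each array element is small enough to materialize as a single string token (truly chunked decoding of one huge string would need a lower-level reader); the output file naming is purely illustrative:

```csharp
using System;
using System.IO;
using Newtonsoft.Json;

static void ExtractImages(Stream requestBody, string outputDirectory)
{
    using (var sr = new StreamReader(requestBody))
    using (var reader = new JsonTextReader(sr))
    {
        int index = 0;
        while (reader.Read())
        {
            // Find the "encodedImages" property, then walk its array.
            if (reader.TokenType == JsonToken.PropertyName &&
                (string)reader.Value == "encodedImages")
            {
                reader.Read(); // consume StartArray
                while (reader.Read() && reader.TokenType != JsonToken.EndArray)
                {
                    // Each element is still read as one string token before decoding.
                    byte[] bytes = Convert.FromBase64String((string)reader.Value);
                    string path = Path.Combine(outputDirectory, $"image_{index++}.bin");
                    using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write))
                    {
                        fs.Write(bytes, 0, bytes.Length);
                    }
                }
            }
        }
    }
}
```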

How to read byte blocks into a struct

Submitted by 非 Y 不嫁゛ on 2019-12-06 13:35:11
Question: I have a resource file which I need to process; it packs a set of files. First, the resource file lists all the files contained within, plus some other data, in a struct like this: struct FileEntry{ byte Value1; char Filename[12]; byte Value2; byte FileOffset[3]; float whatever; } So I need to read blocks of exactly this size. I am using the Read function of FileStream, but how can I specify the size of the struct? I used: int sizeToRead = Marshal.SizeOf(typeof(Header)); and then
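A minimal sketch of one common approach, assuming the on-disk layout is byte-packed (the Pack = 1 setting is an assumption about the file format): read exactly Marshal.SizeOf bytes into a buffer and marshal it into the struct.

```csharp
using System.IO;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1, CharSet = CharSet.Ansi)]
struct FileEntry
{
    public byte Value1;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 12)]
    public string Filename;               // char Filename[12]
    public byte Value2;
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 3)]
    public byte[] FileOffset;             // byte FileOffset[3]
    public float Whatever;
}

static FileEntry ReadEntry(FileStream fs)
{
    int size = Marshal.SizeOf(typeof(FileEntry));   // size of one packed record
    byte[] buffer = new byte[size];
    fs.Read(buffer, 0, size);

    // Pin the buffer and reinterpret it as a FileEntry.
    GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
    try
    {
        return Marshal.PtrToStructure<FileEntry>(handle.AddrOfPinnedObject());
    }
    finally
    {
        handle.Free();
    }
}
```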

Reading a file and monitoring new lines

Submitted by £可爱£侵袭症+ on 2019-12-06 09:06:11
Question: I'm looking to create a console application that will read a file and monitor every new line, since the file is being written to by another process every 0.5 seconds. How can I achieve that in a console app using .NET 4.5? Answer 1: As @Sudhakar mentioned, FileSystemWatcher is useful when you want to be notified when a file updates sporadically, and polling at regular intervals is useful when you want to be constantly processing information from an always-growing file (such as a busy log file). I'd like
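A minimal polling sketch, assuming the writer only appends and that FileShare.ReadWrite is enough to open the file while it is being written: remember the last read position, reopen, seek past it, and read any new lines.

```csharp
using System;
using System.IO;
using System.Threading;

static void TailFile(string path)
{
    long lastPosition = 0;
    while (true)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var sr = new StreamReader(fs))
        {
            fs.Seek(lastPosition, SeekOrigin.Begin);   // skip what we've already seen
            string line;
            while ((line = sr.ReadLine()) != null)
                Console.WriteLine(line);               // process each new line
            lastPosition = fs.Position;                // everything up to here has been consumed
        }
        Thread.Sleep(500); // the writer appends roughly every half second
    }
}
```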

Adding to Azure blob storage with a stream

Submitted by 天大地大妈咪最大 on 2019-12-06 07:43:31
I am trying to add an IFormFile received via a .NET Core web API to Azure Blob Storage. These are the properties I have set up: static internal CloudStorageAccount StorageAccount => new CloudStorageAccount(new StorageCredentials(AccountName, AccessKey, AccessKeyName), true); // Create a blob client. static internal CloudBlobClient BlobClient => StorageAccount.CreateCloudBlobClient(); // Get a reference to a container static internal CloudBlobContainer Container(string ContainerName) => BlobClient.GetContainerReference(ContainerName); static internal CloudBlobContainer ProfilePicContainer =>
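A minimal sketch, assuming the classic WindowsAzure.Storage client used in the question; ProfilePicContainer is the container property from the setup above, and the method name is illustrative.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.WindowsAzure.Storage.Blob;

static internal async Task UploadProfilePicAsync(IFormFile file, string blobName)
{
    CloudBlobContainer container = ProfilePicContainer;   // from the properties above
    await container.CreateIfNotExistsAsync();

    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
    blob.Properties.ContentType = file.ContentType;

    using (Stream stream = file.OpenReadStream())          // stream the upload, no temp file
    {
        await blob.UploadFromStreamAsync(stream);
    }
}
```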

File is being used by another process in C#

Submitted by 纵然是瞬间 on 2019-12-06 04:19:15
I am trying to delete a file in C#; however, I am receiving a message that the file is in use by another process. What I want to do is check whether the file exists and close it. I am using the following function to check whether the file is open: public static bool IsFileInUse(string path) { if (string.IsNullOrEmpty(path)) throw new ArgumentException("'path' cannot be null or empty.", "path"); try { using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read)) { } } catch (IOException) { return true; } return false; } and when the file is in use I am trying to close it: bool
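One thing to keep in mind: a process cannot force-close a handle that another process holds. A minimal sketch of the usual workaround, assuming it is acceptable to retry until the other process releases the file (the attempt count and delay are illustrative):

```csharp
using System;
using System.IO;
using System.Threading;

public static bool TryDelete(string path, int attempts = 5, int delayMs = 500)
{
    if (string.IsNullOrEmpty(path))
        throw new ArgumentException("'path' cannot be null or empty.", nameof(path));

    for (int i = 0; i < attempts; i++)
    {
        try
        {
            if (File.Exists(path))
                File.Delete(path);
            return true;
        }
        catch (IOException)
        {
            Thread.Sleep(delayMs); // still locked by another process; wait and retry
        }
    }
    return false;
}
```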

Possible reasons for FileStream.Write() to throw an OutOfMemoryException?

Submitted by ☆樱花仙子☆ on 2019-12-06 02:46:41
I have 10 threads writing thousands of small buffers (16-30 bytes each) to a huge file at random positions. Some of the threads throw an OutOfMemoryException on the FileStream.Write() operation. What is causing the OutOfMemoryException? What should I look for? I'm using the FileStream like this (for every written item; this code runs from 10 different threads): using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write, FileShare.ReadWrite, BigBufferSizeInBytes, FileOptions.SequentialScan)) { ... fs.Write(); } I suspect that all the buffers allocated inside the FileStream don't
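A likely contributor, based on the code shown: every write opens a new FileStream with BigBufferSizeInBytes, so ten threads can each be allocating very large internal buffers at once (and FileOptions.SequentialScan is a read hint that does not fit random-position writes). A minimal sketch of one mitigation using a small internal buffer; the 4096 value and the RandomAccess hint are assumptions, not the question's code:

```csharp
using System.IO;

static void WriteItem(string path, long position, byte[] buffer)
{
    // Small internal buffer: FileStream won't allocate a huge block per write.
    using (var fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write,
                                   FileShare.ReadWrite, 4096, FileOptions.RandomAccess))
    {
        fs.Seek(position, SeekOrigin.Begin);
        fs.Write(buffer, 0, buffer.Length);
    }
}
```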

Seeking and writing files bigger than 2GB in C#

Submitted by 别等时光非礼了梦想. on 2019-12-06 02:05:40
In C#, the FileStream methods Read/Write/Seek take integer parameters. In a previous post, I saw a good solution for reading/writing files that are bigger than the virtual memory allocated to a process. That solution works if you want to write the data from beginning to end, but in my case the chunks of data I am receiving arrive in no particular order. I have code that works for files smaller than 2GB: private void WriteChunk(byte[] data, int position, int chunkSize, int count, string path) { FileStream destination = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write)
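For what it's worth, FileStream.Seek already takes a long offset; the 2GB ceiling in code like the above usually comes from declaring the position as int. A minimal sketch with a 64-bit position (parameter names follow the question's method; the chunkSize parameter is dropped for brevity):

```csharp
using System.IO;

private static void WriteChunk(byte[] data, long position, int count, string path)
{
    using (var destination = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
    {
        destination.Seek(position, SeekOrigin.Begin); // long offset, so positions past 2GB work
        destination.Write(data, 0, count);            // the byte count per write still fits in an int
    }
}
```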

Convert FileItem to File

Submitted by 拈花ヽ惹草 on 2019-12-05 22:48:07
I'm trying to upload an XML file (text.xml) using a simple HTML upload form, read it as a FileItem in the servlet as usual, and then get the ACTUAL file (text.xml) so I can print it, save it, etc. Is there an educated way to do this simply? Some people told me to use FileItem's getInputStream property... is there an example somewhere? Isn't there a shorter way? Thanks. BalusC: Just use the FileItem#write() method, exactly as demonstrated in the User Guide of Apache Commons FileUpload. File file = new File("/path/to/text.xml"); fileItem.write(file); That's all. See also: How do I set the folder for storing

Full-text index returning no results from PDF FILESTREAM

Submitted by 蓝咒 on 2019-12-05 20:16:15
I have a FILESTREAM table running on SQL Server 2012 on a Windows 8.1 x64 machine, which already has a few PDF and TXT files stored, so I decided to create a full-text index to search through these files using the following commands: CREATE FULLTEXT CATALOG FileStreamFTSCatalog AS DEFAULT; CREATE FULLTEXT INDEX ON storage (FileName Language 1046, File TYPE COLUMN FileExtension Language 1046) KEY INDEX PK__storage__3214EC077DADCE3C ON FileStreamFTSCatalog WITH CHANGE_TRACKING AUTO; Then I ran these commands after reading about people having the same problem as me: EXEC sp_fulltext_service

How do I zip files in Xamarin for Android?

Submitted by 元气小坏坏 on 2019-12-05 19:48:16
I have a function that creates a zip file from a string array of files passed to it. The function does succeed in creating the zip file and the zip-entry files inside it, but these zip entries are empty. I've tried a couple of different methods; the function code below is the closest I've gotten to something working: public static bool ZipFile(string[] arrFiles, string sZipToDirectory, string sZipFileName) { if (Directory.Exists(sZipToDirectory)) { FileStream fNewZipFileStream; ZipOutputStream zos; try { fNewZipFileStream = File.Create(sZipToDirectory + sZipFileName); zos = new ZipOutputStream
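A minimal sketch using SharpZipLib's ZipOutputStream (which the code appears to use); empty entries usually mean the source bytes were never copied into the entry, or the stream was not finished before disposal. The names here mirror the question's parameters:

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

public static bool ZipFile(string[] arrFiles, string sZipToDirectory, string sZipFileName)
{
    if (!Directory.Exists(sZipToDirectory))
        return false;

    using (var fsOut = File.Create(Path.Combine(sZipToDirectory, sZipFileName)))
    using (var zos = new ZipOutputStream(fsOut))
    {
        zos.SetLevel(5); // compression level 0-9

        foreach (string file in arrFiles)
        {
            var entry = new ZipEntry(Path.GetFileName(file));
            zos.PutNextEntry(entry);

            using (var fsIn = File.OpenRead(file))
                fsIn.CopyTo(zos);        // copy the file's bytes into the entry

            zos.CloseEntry();
        }

        zos.Finish(); // write the central directory before the stream is disposed
    }
    return true;
}
```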