filestream

Reading multiple files in a Stream

有些话、适合烂在心里 submitted on 2019-12-10 11:43:36
Question: Hi! How can I read multiple text files at once? What I want to do is read a series of files and append all of them to one big file. Currently I am doing this: open each file with a StreamReader, read the StreamReader completely into a StringBuilder and append it to the current StringBuilder, then check whether the memory limit is exceeded and, if so, write the StringBuilder to the end of the output file and empty the StringBuilder. Unfortunately, I observed that the average reading speed is only 4 MB/sec. I…
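A common way to avoid the StringBuilder round-trip described above is to copy raw bytes from each input stream straight into the output stream with `Stream.CopyTo`, which keeps memory use flat and lets the OS buffer do the work. A minimal sketch (the file paths and buffer size are placeholders, not from the question):

```csharp
using System;
using System.IO;

class ConcatFiles
{
    static void Main()
    {
        // Placeholder paths; substitute the real series of files.
        string[] sources = { "part1.txt", "part2.txt", "part3.txt" };

        using (var output = new FileStream("combined.txt", FileMode.Create,
                                           FileAccess.Write, FileShare.None,
                                           bufferSize: 1 << 20)) // 1 MB buffer
        {
            foreach (string path in sources)
            {
                using (var input = File.OpenRead(path))
                {
                    // Streams bytes directly into the output file;
                    // no per-file string allocation at all.
                    input.CopyTo(output);
                }
            }
        }
    }
}
```

With sequential buffered copies like this, throughput is typically bounded by the disk rather than by managed-heap churn.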

C# Casting MemoryStream to FileStream

烂漫一生 submitted on 2019-12-09 17:54:05
Question: My code is this: byte[] byteArray = Encoding.ASCII.GetBytes(someText); MemoryStream stream = new MemoryStream(byteArray); StreamReader reader = new StreamReader(stream); FileStream file = (FileStream)reader.BaseStream; Later I'm using file.Name. I'm getting an InvalidCastException with the message "Unable to cast object of type 'System.IO.MemoryStream' to type 'System.IO.FileStream'." I read somewhere that I should just change FileStream to Stream. Is there something else I should do? Answer 1: …
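The cast fails because `reader.BaseStream` really is a `MemoryStream` here, and a `MemoryStream` has no file name. One way to handle it is to test the runtime type and fall back when the stream is not file-backed. A sketch (the fallback label is an invented placeholder):

```csharp
using System;
using System.IO;
using System.Text;

class StreamNameExample
{
    static void Main()
    {
        byte[] byteArray = Encoding.ASCII.GetBytes("some text");
        using (var stream = new MemoryStream(byteArray))
        using (var reader = new StreamReader(stream))
        {
            // A hard cast to FileStream throws here, because BaseStream
            // is a MemoryStream. Pattern-match on the type instead.
            string name = reader.BaseStream is FileStream fs
                ? fs.Name
                : "(in-memory stream)"; // placeholder fallback

            Console.WriteLine(name);
        }
    }
}
```

If the rest of the code only needs `Read`/`Seek`, declaring the variable as the base `Stream` type (as the question mentions) avoids the cast entirely.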

FILESTREAM/FILETABLE Clarifications for Implementation

时光毁灭记忆、已成空白 submitted on 2019-12-09 13:11:32
Question: Recently our team was looking at FILESTREAM to expand the capabilities of our proprietary application. The main purpose of this app is managing the various PDFs, images and documents for all of the parts we manufacture. Our ASP application uses a few third-party tools to allow viewing of these files. We currently have 980 GB of data on the file server. We also have around 200 GB of binary data in SQL Server that we would like to extract, since it is not performing well; hence FILESTREAM seems to be a…

How to create fast and efficient filestream writes on large sparse files

北城以北 submitted on 2019-12-09 09:50:00
Question: I have an application that writes large files in multiple segments. I use FileStream.Seek to position each write. It appears that when I call FileStream.Write at a deep position in a sparse file, the write triggers a "backfill" operation (writing zeros) over all preceding bytes, which is slow. Is there a more efficient way of handling this situation? The code below demonstrates the problem; the initial write takes about 370 ms on my machine. public void WriteToStream() { DateTime dt; using…
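On NTFS, the backfill can be avoided by marking the file as sparse before seeking past the end, so the filesystem records the gap as a hole instead of writing zeros. This requires a `DeviceIoControl` P/Invoke with the `FSCTL_SET_SPARSE` control code; the sketch below is Windows/NTFS-specific and the file name and offsets are illustrative only:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

class SparseWrite
{
    // FSCTL_SET_SPARSE tells NTFS to store unwritten regions as holes,
    // so a deep seek-and-write does not backfill zeros.
    const int FSCTL_SET_SPARSE = 0x000900C4;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool DeviceIoControl(SafeFileHandle hDevice, int dwIoControlCode,
        IntPtr inBuffer, int nInBufferSize, IntPtr outBuffer, int nOutBufferSize,
        out int bytesReturned, IntPtr overlapped);

    static void Main()
    {
        using (var fs = new FileStream("big.dat", FileMode.Create, FileAccess.Write))
        {
            if (!DeviceIoControl(fs.SafeFileHandle, FSCTL_SET_SPARSE,
                                 IntPtr.Zero, 0, IntPtr.Zero, 0,
                                 out _, IntPtr.Zero))
                throw new IOException("Could not mark file as sparse");

            fs.Seek(1L << 30, SeekOrigin.Begin); // jump 1 GB into the file
            fs.WriteByte(1);                     // no zero backfill before this byte
        }
    }
}
```

Note that the sparse attribute must be set before the far write; setting it afterwards does not reclaim already-allocated regions.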

Looping through lines of txt file uploaded via FileUpload control

杀马特。学长 韩版系。学妹 submitted on 2019-12-09 09:48:00
Question: I want to select a simple .txt file that contains lines of strings using a FileUpload control. But instead of actually saving the file, I want to loop through each line of text and display each line in a ListBox control. Example of a text file (test.txt): 123jhg345 182bdh774 473ypo433 129iiu454 What is the best way to accomplish this? What I have so far: private void populateListBox() { FileUpload fu = FileUpload1; if (fu.HasFile) { //Loop through txt file and add lines to ListBox1 } } Answer 1: …
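The uploaded file never has to touch disk: `FileUpload.FileContent` exposes the posted bytes as a `Stream`, which a `StreamReader` can walk line by line. A sketch filling in the question's stub (the page class name is invented; the control IDs match the question):

```csharp
using System.IO;
using System.Web.UI.WebControls;

public partial class UploadPage : System.Web.UI.Page // hypothetical page class
{
    private void PopulateListBox()
    {
        FileUpload fu = FileUpload1;
        if (!fu.HasFile) return;

        // Read straight from the request stream -- no temp file needed.
        using (var reader = new StreamReader(fu.FileContent))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                ListBox1.Items.Add(line);
            }
        }
    }
}
```

Each `ReadLine` call yields one entry such as `123jhg345`, which lands directly in the ListBox.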

Upload a file with encoding using FTP in C#

大兔子大兔子 submitted on 2019-12-09 05:24:59
Question: The following code is good for uploading text files, but it fails to upload JPEG files (not completely: the file name is good, but the image is corrupted): private void up(string sourceFile, string targetFile) { try { string ftpServerIP = ConfigurationManager.AppSettings["ftpIP"]; string ftpUserID = ConfigurationManager.AppSettings["ftpUser"]; string ftpPassword = ConfigurationManager.AppSettings["ftpPass"]; //string ftpURI = ""; string filename = "ftp://" + ftpServerIP + "//" + targetFile;…
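Corruption of JPEGs with an intact file name is the classic symptom of an FTP transfer running in text (ASCII) mode, which rewrites line-ending bytes. With `FtpWebRequest`, setting `UseBinary = true` and copying raw bytes avoids it. A sketch reusing the question's config keys:

```csharp
using System.Configuration;
using System.IO;
using System.Net;

class FtpUpload
{
    static void Upload(string sourceFile, string targetFile)
    {
        string ftpServerIP = ConfigurationManager.AppSettings["ftpIP"];
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://" + ftpServerIP + "/" + targetFile);

        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential(
            ConfigurationManager.AppSettings["ftpUser"],
            ConfigurationManager.AppSettings["ftpPass"]);

        // The crucial line for images and other binary files:
        // ASCII mode mangles bytes that look like line endings.
        request.UseBinary = true;

        using (var source = File.OpenRead(sourceFile))
        using (var target = request.GetRequestStream())
        {
            source.CopyTo(target); // raw byte copy, no text conversion
        }
    }
}
```

Text files upload fine either way, which is why the bug only surfaces with JPEGs.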

What's the best way to write a short[] array to a file in C#?

早过忘川 submitted on 2019-12-08 21:53:06
Question: I have an array of shorts (short[]) that I need to write out to a file. What's the quickest way to do this? Answer 1: Use the BinaryWriter: static void WriteShorts(short[] values, string path) { using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write)) { using (BinaryWriter bw = new BinaryWriter(fs)) { foreach (short value in values) { bw.Write(value); } } } } Answer 2: Following up on Jon B's answer, if your file contains any other data, you might want to prefix the data…
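As a counterpart to the `BinaryWriter` answer above, the same file can be read back with a `BinaryReader`, sizing the array from the file length (two bytes per short). This sketch assumes the file contains only the shorts, with no length prefix:

```csharp
using System.IO;

class ShortIo
{
    static short[] ReadShorts(string path)
    {
        using (var fs = File.OpenRead(path))
        using (var br = new BinaryReader(fs))
        {
            // sizeof(short) == 2, so the file length fixes the count.
            var values = new short[fs.Length / sizeof(short)];
            for (int i = 0; i < values.Length; i++)
                values[i] = br.ReadInt16();
            return values;
        }
    }
}
```

If the file holds other data as well, Answer 2's suggestion of a length prefix is the safer layout, since the file length alone no longer determines the element count.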

Difference between basic_istream<>::tellg() and basic_ostream<>::tellp() [closed]

巧了我就是萌 submitted on 2019-12-08 15:27:30
Question: Closed 6 years ago. I was just wondering why the member functions tellg(), defined in the basic_istream<> class, and tellp(), defined in the basic_ostream<> class, have different names.

save 2 kinds of data using one filestream?

天大地大妈咪最大 submitted on 2019-12-08 10:32:15
Question: I'm saving two kinds of data (text and image). Is it possible to save both types of data at the same time using one FileStream? Example: var myTextFileObj:Object = new Object(); myTextFileObj["Name"] = "George Borming"; var imgByteData:ByteArray = jpgEncoder.encode(myBitmapData); //let's assume I have bitmap data here var folder:File = File.documentsDirectory.resolvePath("myApp/images/image.jpg"); var folder2:File = File.documentsDirectory.resolvePath("myApp/textfiles/mytext.txt"); fileStream = new…

Blob data in huge SQL Server database

给你一囗甜甜゛ submitted on 2019-12-08 09:39:22
Question: We generate 20,000,000 text files every year, with an average size of approximately 250 KB each (35 KB zipped). We must put these files in some kind of archive for 10 years. There is no need to search inside the text files, but we must be able to find one textfile by searching on 5-10 metadata fields such as "productname", "creationdate", etc. I'm considering zipping each file and storing them in a SQL Server database with 5-10 searchable (indexed) columns and a varbinary(MAX) column for the zipped file data. The…