Read large txt file and use it in chunks
NickName: bLAZ  Ask DateTime: 2012-10-24T03:49:28

I'm a newbie in C#, but I've done some research on reading/writing large txt files. The biggest might be 8 GB, but if that is too much I will consider splitting it, maybe into 1 GB pieces. It must be fast, up to 30 MB/s. I've found three approaches: FileStream or StreamReader/StreamWriter for sequential access, and MemoryMappedFiles for random access. First I'd like to read the file. Here is an example of code that works:

FileStream fileStream = new FileStream(@"C:\Users\Guest4\Desktop\data.txt", FileMode.Open, FileAccess.Read);
byte[] buffer;   // holds the whole file
int sum;         // total number of bytes read so far
try
{
    int length = (int)fileStream.Length;  // get file length
    buffer = new byte[length];            // create buffer
    int count;                            // actual number of bytes read
    sum = 0;

    // read until Read method returns 0 (end of the stream has been reached)
    while ((count = fileStream.Read(buffer, sum, length - sum)) > 0)
        sum += count;  // sum is the buffer offset for the next read
}
finally
{
    fileStream.Close();
}

Do you think this is a good way to read big files quickly?

After reading, I need to resend the file. It must be sent in 16384-byte chunks, chunk by chunk, until all the data has been transmitted, and each chunk has to be of type string. Could you suggest how to do this: split the data and convert it to string? I suppose the best way is to send each string chunk not after reading the whole file, but as soon as at least 16384 bytes have been read, along the lines of the sketch below.
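To make the intent concrete, here is a minimal sketch of what I have in mind. SendChunk is a hypothetical placeholder for my actual transmit call, and I'm assuming the file is plain ASCII text (otherwise a multi-byte character could be split across two chunks):

using System;
using System.IO;
using System.Text;

class ChunkSender
{
    const int ChunkSize = 16384;

    // Hypothetical placeholder for the real transmit call.
    static void SendChunk(string chunk)
    {
        Console.WriteLine("sending {0} characters", chunk.Length);
    }

    static void Main()
    {
        using (FileStream fileStream = new FileStream(
            @"C:\Users\Guest4\Desktop\data.txt", FileMode.Open, FileAccess.Read))
        {
            byte[] buffer = new byte[ChunkSize];
            int count;

            // Read at most 16384 bytes at a time and send each chunk as soon
            // as it has been read, instead of loading the whole file first.
            while ((count = fileStream.Read(buffer, 0, ChunkSize)) > 0)
            {
                // The last chunk may be shorter than 16384 bytes, hence count.
                // Assumes ASCII text; a different Encoding may be needed.
                string chunk = Encoding.ASCII.GetString(buffer, 0, count);
                SendChunk(chunk);
            }
        }
    }
}

The idea is that reading and sending overlap, so memory use stays at one 16384-byte buffer regardless of file size. Is that the right direction, or is there a better pattern for this?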

Copyright Notice: Content Author: 「bLAZ」, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/13038112/read-large-txt-file-and-use-it-in-chunks
