Reading and writing a file in a number of ByteBuffer chunks of smaller length
Asked by andi99 on 2017-07-25 05:55:59

I am trying to read a file into ByteBuffer chunks of fixed length, store them in a list of ByteBuffers, and then, after some operations, read those chunks back in sequential order to reconstruct the file. The problem is that while writing the output file, the channel position is not increasing. I do not want to use byte arrays, as they are of fixed length and the file reconstruction does not work properly. So I would like to know how to advance the output file channel's position, or any other way to do this operation. Sample code would be appreciated. Here are my code snippets:

file = new File(fileName);  // hello.txt - 20 MB size
fis = new FileInputStream(file);
inChannel = fis.getChannel();
double maxChunkSequenceNoFloat = ((int)inChannel.size()) / chunkSize;
int maxChunkSequenceNo = 1;
if(maxChunkSequenceNoFloat%10 > 0) {
    maxChunkSequenceNo = ((int)maxChunkSequenceNoFloat)+1;
} else if(maxChunkSequenceNoFloat%10 < 0) {
    maxChunkSequenceNo = 1;
} else {
    maxChunkSequenceNo = (int)maxChunkSequenceNoFloat;
}
maxChunkSequenceNo = (maxChunkSequenceNo == 0) ? 1 : maxChunkSequenceNo;            
ByteBuffer buffer = ByteBuffer.allocate(chunkSize);
buffer.clear();

while(inChannel.read(buffer) > 0) {
    buffer.flip();
    bufferList.add(buffer);
    buffer.clear();
    chunkSequenceNo++;
}
maxChunkSequenceNo = chunkSequenceNo;

// write
File file1 = new File("hello2.txt");
buffer.clear();
FileOutputStream fos = new FileOutputStream(file1);
FileChannel outChannel = fos.getChannel();
chunkSequenceNo = 1;
for(ByteBuffer test : bufferList) {
    writeByteCount += outChannel.write(test);
    //outChannel.position() += writeByteCount;
    System.out.println("main - channelPosition: "+outChannel.position()
                        +" tempBuffer.Position: "+test.position()
                        +" limit: "+test.limit()
                        +" remaining: "+test.remaining()
                        +" capacity: "+test.capacity());              
}
BufferedReader br = new BufferedReader(new FileReader(file1));
String line = null;
while ((line = br.readLine()) != null) {
    System.out.println(line);
}
outChannel.close();
fos.close();

The ByteBuffer position is correct, but the outChannel position stays at 1048, which is the chunk size.
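For context, the read loop above adds the same ByteBuffer instance to bufferList on every iteration and then clears it, so every list entry ends up referring to one shared, emptied buffer; the channel does advance its position automatically on each write, but after the first chunk there is nothing left to write, which would explain why the position stays at one chunk size. Below is a minimal sketch of one way to read a file into separate chunks and write them back in order, allocating a fresh buffer per chunk. The class name ChunkCopy, the file names, and the 1 MB chunk size are placeholders, not taken from the original code.

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.List;

public class ChunkCopy {
    public static void main(String[] args) throws IOException {
        int chunkSize = 1024 * 1024;               // placeholder: 1 MB per chunk
        List<ByteBuffer> bufferList = new ArrayList<>();

        // Read: allocate a new buffer for every chunk so each list entry
        // keeps its own data instead of sharing one cleared buffer.
        try (FileChannel inChannel = FileChannel.open(
                Paths.get("hello.txt"), StandardOpenOption.READ)) {
            ByteBuffer buffer = ByteBuffer.allocate(chunkSize);
            while (inChannel.read(buffer) > 0) {
                buffer.flip();                     // make the filled region readable
                bufferList.add(buffer);
                buffer = ByteBuffer.allocate(chunkSize); // fresh buffer for the next chunk
            }
        }

        // Write: FileChannel.write advances the channel position by itself,
        // so no manual position bookkeeping is needed.
        try (FileChannel outChannel = FileChannel.open(
                Paths.get("hello2.txt"),
                StandardOpenOption.CREATE,
                StandardOpenOption.WRITE,
                StandardOpenOption.TRUNCATE_EXISTING)) {
            for (ByteBuffer chunk : bufferList) {
                while (chunk.hasRemaining()) {     // write() may not drain a buffer in one call
                    outChannel.write(chunk);
                }
            }
        }
    }
}

With try-with-resources the channels are closed even if a write fails, and the last chunk naturally comes out shorter than chunkSize because flip() limits the buffer to the bytes actually read, so the reconstructed file ends up the same length as the original.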

Copyright notice: content author 「andi99」, reproduced under the CC BY-SA 4.0 license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/45290717/reading-and-writing-file-in-number-of-byte-buffer-chunks-of-smaller-length
