Linux split big files by chunks


I have a big file (15GB) located in my host.

I want to split this file into chunks of 200MB.

Currently, I do it using:

split -a 3 -d -b 200MB my_big_file /tmp/chunk_

The problem is that I currently have only 10GB of free space, so I want to split the file by offset: first read 7GB from the big file and split it with split, then remove those chunks, and finally split the remainder (from offset 7GB to 15GB).

How can I do it?

CodePudding user response:

Use the dd command to read the file, specifying a block size (bs) of 1 and a count equal to exactly half the number of bytes in the file, so that dd reads the first half; then pipe dd's output into split, like this:

(Assumption: big_file is the name of your 15GB file and its size is exactly 15GB):

# dd if=big_file bs=1 count=8053063680 | split -a 3 -d -b 200MB - /tmp/chunk_

This splits the first half of the file into chunks of 200MB.

Note that 8053063680 is half the number of bytes in 15GB (16106127360 bytes, taking 1GB = 1024³ bytes).
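Note that bs=1 forces dd to issue one read per byte, which is very slow for gigabytes of data; a larger block size with a proportionally smaller count reads the same span much faster. Here is a scaled-down sketch of the same pipeline, assuming GNU coreutils; the 16 MiB /tmp/big_file is a stand-in for the real 15GB file, and the 2M chunk size stands in for 200MB:

```shell
# Create a small stand-in for the big file (16 MiB of zeros).
dd if=/dev/zero of=/tmp/big_file bs=1M count=16 status=none

# First half: read 8 MiB as eight 1 MiB blocks (instead of bs=1),
# and split the stream into 2 MiB chunks named /tmp/chunk_000, _001, ...
dd if=/tmp/big_file bs=1M count=8 status=none | split -a 3 -d -b 2M - /tmp/chunk_
```

This produces /tmp/chunk_000 through /tmp/chunk_003, each 2 MiB. The larger bs only works cleanly when the byte offsets you need are multiples of the block size; otherwise fall back to bs=1 or compute byte-exact offsets as shown below the second-half command.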

For the second half:

# dd if=big_file bs=1 skip=8053063680 count=8053063680 | split -a 3 -d -b 200MB - /tmp/chunk_

Again, be sure of the exact size of your file in bytes, and set count and skip based on that.
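Rather than hard-coding the byte counts, you can query the exact size with stat and derive the skip offset with shell arithmetic. A minimal sketch, again on a 16 MiB stand-in file and assuming GNU coreutils (stat -c %s and dd's iflag=skip_bytes are GNU extensions; on BSD/macOS use stat -f %z and bs=1 instead):

```shell
# Stand-in for the real 15GB file.
dd if=/dev/zero of=/tmp/big_file bs=1M count=16 status=none

# Exact size in bytes, and its midpoint.
size=$(stat -c %s /tmp/big_file)
half=$((size / 2))

# Second half: iflag=skip_bytes makes skip= a byte count even though
# bs=1M, so the offset need not be a multiple of the block size.
# A distinct prefix avoids clobbering the first half's chunks.
dd if=/tmp/big_file bs=1M iflag=skip_bytes skip="$half" status=none \
  | split -a 3 -d -b 2M - /tmp/chunk2_
```

With the 16 MiB demo file this yields /tmp/chunk2_000 through /tmp/chunk2_003, each 2 MiB; on the real file, substitute the 200MB chunk size and your own output prefix.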
