
How to split a file up into many smaller files and then reconstruct them? [duplicate]

I need to upload a file of about 1 GB to an online server to get it to a friend. There seems to be a problem with my internet connection, because if I try to upload a file larger than 20 MB, the upload just stops and won't start again for some reason.

So I would like an application or script(s) (there may need to be two: one that creates the part files and one that reconstructs them) which can do the following:

  • Split the file up into smaller files, where the max size for each file can be set by the user so that this can be used for other cases

  • Is able to reconstruct the file at the other end and verify integrity by checking its SHA512SUM

I would prefer it not to compress anything. I am running Ubuntu GNOME 16.04.1 with GNOME 3.20. Is there a way of doing this?


2 Answers

Check out the suggested answers before posting. The first suggested answer is this one: Split a large file into smaller files and then integrate them to get the original file

And here are the commands:

split -b 20M -d bigfile bigfile-part
cat bigfile-part* > bigfile

Edit:

... and for the hashsum part, you can generate a SHA512 checksum file from the original ("big") file:

sha512sum bigfile > sha512.txt

After putting all the small parts together again, run sha512sum on the new file and compare the values (or simply run sha512sum -c sha512.txt next to the reassembled file).
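Put together, the whole round trip sketched in this answer looks like the short shell session below; bigfile and the 200 kB part size are stand-ins for the demo (the question's real case would use -b 20M):

```shell
# stand-in for the real ~1 GB file (1 MB here, just for the demo)
head -c 1M /dev/urandom > bigfile
sha512sum bigfile > sha512.txt          # record the original's checksum
split -b 200K -d bigfile bigfile-part   # numbered parts: bigfile-part00, 01, ...
# pretend the parts and sha512.txt were transferred, then reassemble:
rm bigfile
cat bigfile-part* > bigfile             # the shell sorts the glob, so order is kept
sha512sum -c sha512.txt                 # prints "bigfile: OK" on success
```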


I've coded two small scripts that can be used for that. First, put the script (name it 'splitter' or anything you like) in a separate folder together with the file to be split, then run in bash:

./splitter FILE SIZE

where FILE is the file to be split and SIZE is the part size in MB. After that, you will see all the split blocks (SPLITTED_FILEaa, SPLITTED_FILEab, and so on), a checksum file (SPLITTED_CHECK_SHA256SUM) and a tar file with all those files inside (SPLITTED_TAR.tar). Delete all but the tar file. When you need to send it, extract it and send the individual files. Once they arrive, put them again in a tar file with the same name (SPLITTED_TAR.tar) and put it in a separate folder with the second script ('mergefile' or whatever you name it). Run:

./mergefile

It should extract all the files, merge them into a file with the same name as the original, and check the sha256sum.

Note that it's a very rough script, so use it exactly the way I described, in a separate folder with nothing else in it. That is, if you want to use it at all; I think it's easier to just do it by hand on the command line, but I wanted to practice some shell scripting!

splitter script:

#!/bin/bash
# Usage: splitter [FILENAME] [SIZE]
PROGRAMNAME=$(basename "$0")
PREFIX='SPLITTED_FILE'
if [[ $# != 2 ]]; then
    echo "Usage: $PROGRAMNAME [FILENAME] [SIZE]"; exit 1
fi
FILENAME=$1
SIZE=$2
if [[ ! -f $FILENAME ]]; then
    echo "$PROGRAMNAME: Invalid filename"; exit 1
fi
if [[ ! $SIZE =~ ^[0-9]+$ || $SIZE == 0 ]]; then
    echo "$PROGRAMNAME: Invalid size"; exit 1
fi
sha256sum "$FILENAME" > SPLITTED_CHECK_SHA256SUM    # checksum of the original
split -b "${SIZE}M" "$FILENAME" "$PREFIX"           # split into SIZE-MB parts
tar -cf SPLITTED_TAR.tar ${PREFIX}* SPLITTED_CHECK_SHA256SUM
echo "Done."

mergefile script:

#!/bin/bash
# Usage: mergefile (no arguments; SPLITTED_TAR.tar in the current folder is merged)
PROGRAMNAME=$(basename "$0")
PREFIX='SPLITTED_FILE'
if [[ $# != 0 ]]; then
    echo "Usage: $PROGRAMNAME. No arguments (SPLITTED_TAR.tar will be merged)"; exit 1
fi
if [[ ! -f SPLITTED_TAR.tar ]]; then
    echo "$PROGRAMNAME: SPLITTED_TAR.tar not found."; exit 1
fi
tar -xf SPLITTED_TAR.tar
# the second field of the checksum file is the original file name
FILENAME=$(awk '{print $2}' SPLITTED_CHECK_SHA256SUM)
cat ./${PREFIX}* > "$FILENAME"
# verify the merged file against the stored hash
if sha256sum --status -c SPLITTED_CHECK_SHA256SUM; then
    echo "SHASUM Checks!"
else
    echo "File corrupted (SHASUM doesn't check!)"; exit 1
fi
echo "Done."