How to Split Large Files by Size in Python?
Handling a large file all at once can strain your computer's memory, causing it to slow down or freeze. Breaking the file into smaller parts keeps memory usage low and lets different parts be processed independently, which can speed things up. Smaller files are also easier to manage, share, and transfer, so if you're uploading files to a server, splitting them into smaller pieces makes transfers more manageable. In this tutorial, you will learn how to split large files by size in Python.
Chunking large files by size is a common task in data processing, especially when a dataset is too large to fit in memory. The example code below divides a large file into smaller ones, each at most 1 MB in size:
```python
# Replace "path/to/your/file.txt" with your file location
file_location = "path/to/your/file.txt"  # File to open and break apart

chunk = 0
chunk_size = 1000000  # 1 MB in bytes

# Open the source file in binary mode and read it chunk by chunk
with open(file_location, "rb") as fileR:
    byte = fileR.read(chunk_size)
    while byte:
        # Write the current chunk to its own numbered file
        fileN = "chunk" + str(chunk) + ".txt"
        with open(fileN, "wb") as fileT:
            fileT.write(byte)
        # Read the next chunk_size bytes
        byte = fileR.read(chunk_size)
        chunk += 1
```
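As a quick sanity check, you can concatenate the chunks back together and compare the result against the original file. The sketch below wraps the same loop in a hypothetical `split_file` helper (the function name, the `prefix` parameter, and the `sample.bin` demo file are illustrative, not part of the original code) and verifies that nothing was lost:

```python
import os

def split_file(path, chunk_size=1000000, prefix="chunk"):
    """Split the file at `path` into numbered pieces of at most `chunk_size` bytes."""
    names = []
    chunk = 0
    with open(path, "rb") as src:
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            # Same naming scheme as the loop above: chunk0.txt, chunk1.txt, ...
            name = prefix + str(chunk) + ".txt"
            with open(name, "wb") as dst:
                dst.write(data)
            names.append(name)
            chunk += 1
    return names

# Demo: create a 2.5 MB file of random bytes, split it, then rejoin and compare
with open("sample.bin", "wb") as f:
    f.write(os.urandom(2500000))

parts = split_file("sample.bin")
print(len(parts))  # 2.5 MB splits into 3 chunks: 1 MB + 1 MB + 0.5 MB

rejoined = b"".join(open(p, "rb").read() for p in parts)
assert rejoined == open("sample.bin", "rb").read()
```

Because the file is read in binary mode, this works for any file type, not just text; only the last chunk may be smaller than `chunk_size`.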