# Uploading Big Files
In the Uploading Data tutorial we learned how to upload documents using the Parcel SDK and assign them to an app. There we uploaded short strings that fit comfortably in main memory, just to get familiar with the concept. Real data, however, is often hundreds of megabytes or even gigabytes in size and is stored on disk.
To tackle this issue, Node.js introduced Streams, and the Parcel SDK fully supports them. To upload and download big files, streams split the data into chunks and read or write only a single chunk at a time, so a complete file never has to fit into memory (a short sketch of this chunking follows the list below). Processing one chunk at a time also makes it easy to update the user interface with the upload or download progress. Namely, for uploading the file, we hook into the:
- `open` event to read the size of the file we are uploading, and
- `data` event to update the upload progress.
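To see what this chunking looks like in practice, here is a minimal sketch, separate from the tutorial code, that simply logs every chunk a `ReadStream` delivers. The file name is a placeholder:

```ts
import fs from 'fs';

// Read a local file chunk by chunk; 'large-file.bin' is a placeholder name.
const stream = fs.createReadStream('large-file.bin');
stream.on('open', (fd) => {
  console.log(`File opened, descriptor: ${fd}`);
});
stream.on('data', (chunk) => {
  // Each chunk holds only a small slice of the file (64 KiB by default),
  // so the whole file never sits in memory at once.
  console.log(`Read a chunk of ${chunk.length} bytes`);
});
stream.on('end', () => {
  console.log('Finished reading the file.');
});
```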
Let's write a helper function which listens to these two events, given an instance of `fs.ReadStream`:
```ts
import fs from 'fs';

function printUploadProgress(readStream: fs.ReadStream) {
  let stat: fs.Stats;
  readStream.on('open', (fd) => {
    stat = fs.fstatSync(fd);
  });
  readStream.on('data', () => {
    const percent = (100 * readStream.bytesRead) / stat.size;
    process.stdout.write(`Uploading... ${percent.toFixed(1)}% (${readStream.bytesRead}) \r`);
  });
}
```
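Note that the helper reads the file size inside the `open` handler, since that is when the stream hands us a file descriptor. If the file path is available anyway, an equivalent approach is to stat the path up front and skip the `open` listener entirely; the following is a sketch with a hypothetical helper name, not part of the tutorial:

```ts
// Alternative sketch: compute the total size from the path before the
// stream is opened. The helper name is hypothetical.
function printUploadProgressFromPath(path: string, readStream: fs.ReadStream) {
  const totalBytes = fs.statSync(path).size;
  readStream.on('data', () => {
    const percent = (100 * readStream.bytesRead) / totalBytes;
    process.stdout.write(`Uploading... ${percent.toFixed(1)}%\r`);
  });
}
```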
For downloading the file, we use the `drain` event of the `WriteStream` to update the progress. The size of the document in bytes can be read from the Parcel document's `size` attribute. Let's go ahead and write the helper:
```ts
function printDownloadProgress(writeStream: fs.WriteStream, documentSize: number) {
  writeStream.on('drain', () => {
    const percent = (100 * writeStream.bytesWritten) / documentSize;
    process.stdout.write(`Downloading... ${percent.toFixed(1)}% (${writeStream.bytesWritten}) \r`);
  });
}
```
Finally, we call `createReadStream()` to create a read stream on top of the local file, register the `ReadStream` events as described above, and upload the file to Parcel by calling `uploadDocument()`. Similarly, we call `createWriteStream()` to obtain a stream for downloading the file, register the events used for showing the download progress, and download the document from Parcel by calling `download()` on the document and piping the result into the write stream:
```ts
const readStream = fs.createReadStream(`${filename}-upload`);
printUploadProgress(readStream);
const document = await parcel.uploadDocument(readStream, {
  // Replace with your app ID, e.g. "AXstH3HzQoEhESWzTqxup9d"
  toApp: process.env.ACME_APP_ID! as AppId,
}).finished;
console.log(`\nUploading ${document.id} complete.`);

const download = document.download();
const saver = fs.createWriteStream(`${filename}-download`);
printDownloadProgress(saver, document.size);
await download.pipeTo(saver);
console.log(`\nDownloading ${document.id} complete.`);
```
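The snippet above assumes that `filename`, the `parcel` client, and the `AppId` type are already in scope. A minimal setup might look like the sketch below; the import path, credential variable, and file prefix are assumptions about typical Parcel SDK usage, not part of this tutorial:

```ts
// Assumed setup for the snippet above; environment variable names and the
// file prefix are placeholders.
import fs from 'fs';
import Parcel, { AppId } from '@oasislabs/parcel';

// Authenticate with an API token (or whichever credential your setup uses).
const parcel = new Parcel(process.env.PARCEL_API_TOKEN!);

// Prefix used for the `${filename}-upload` and `${filename}-download` files.
const filename = 'big-file';
```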
**EXAMPLE**

You can view the full example, which uploads and downloads a randomly generated 100 MB file, in the Parcel Examples repository.