Nodestream

Storage-agnostic streaming library for binary data transfers


This library aims to provide a unified API for all the major storage systems out there (filesystem, AWS S3, Google Cloud Storage etc.). It also provides an easy way to manipulate data streams as they are uploaded to or downloaded from those storage systems (compression, checksum calculation, encryption etc.).

Use cases

  • Single API to rule them all
  • Easy way to transform incoming/outgoing data
  • Work with filesystem storage during development, AWS S3 in production without changing code
  • Insert your idea here

Available adapters

| S3        | GridFS | GCS                  | Filesystem       |
|-----------|--------|----------------------|------------------|
| Amazon S3 | GridFS | Google Cloud Storage | Local Filesystem |

Available transforms

See Pipelines and Transforms section for more info.

| checksum            | compress              | progress         | crypto (WIP)           |
|---------------------|-----------------------|------------------|------------------------|
| Checksum calculator | Stream (de)compressor | Progress monitor | Stream (en/de)cryption |



The first step is to install nodestream into your project:

npm install --save nodestream

The next step is to decide which adapter you want to use. An adapter is the interface through which nodestream interacts with a particular storage system. Let's use the local filesystem for a start:

npm install --save nodestream-filesystem


Let’s create and configure a nodestream instance with which your application can then interact:

```js
// Require the main Nodestream class
const Nodestream = require('nodestream')
const nodestream = new Nodestream({
  // This tells nodestream which storage system it should interact with
  // Under the hood, it will try to require the `nodestream-filesystem` module
  adapter: 'filesystem',
  // This object is always specific to your adapter of choice - always check
  // the documentation for that adapter for available options
  config: {
    // The `filesystem` adapter requires a `root` configuration option, so let's provide one
    root: [__dirname, '.storage']
  }
})
```

Great! At this point, nodestream is ready to transfer some bytes!



You can upload any kind of readable stream. Nodestream does not care where that stream comes from, whether it's an HTTP upload, a file from your filesystem or something totally different.

For this example, we will upload a file from our filesystem.

We will be uploading the file to our local filesystem as well as reading it from the same filesystem. Normally you would probably use a source different from the target storage, but Nodestream does not really care.

```js
const fs = require('fs')

// This is the file we will upload - create a readable stream of that file
const profilePic = fs.createReadStream('/users/me/pictures/awesome-pic.png')

nodestream.upload(profilePic, {
  // directory and name are supported by all storage adapters, but each
  // adapter might have additional options you can use
  directory: 'avatars',
  name: 'user-123.png'
})
.then(results => {
  // results can contain several properties, but the most interesting
  // and always-present is location - you should definitely save this
  // somewhere, you will need it to retrieve this file later!
  console.log(results.location)
})
.catch(err => {
  // U-oh, something blew up
  console.error(err)
})
```
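When you later want the file back, the saved `location` is what you hand to the download side of the API. Here is a minimal sketch, assuming a `download(location, destination)` counterpart that pipes the stored file into a writable stream of your choosing - check the API docs for the exact signature and options:

```js
const fs = require('fs')

// Anything writable will do - here we save the file locally
const destination = fs.createWriteStream('/tmp/user-123.png')

// `location` is the value you saved from the upload's results
nodestream.download(location, destination)
.then(() => {
  // The destination stream has been fully written and closed
  console.log('download complete')
})
.catch(err => {
  // The storage system could not deliver the file
  console.error(err)
})
```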
