react-s3-uploader

React component that renders an <input type="file"/> and automatically uploads to an S3 bucket

MIT License




$ npm install react-s3-uploader

From Browser

var ReactS3Uploader = require('react-s3-uploader');

<ReactS3Uploader
    signingUrl="/s3/sign"
    signingUrlMethod="GET"
    preprocess={this.onUploadStart}
    onProgress={this.onUploadProgress}
    onError={this.onUploadError}
    onFinish={this.onUploadFinish}
    signingUrlHeaders={{ additional: headers }}
    signingUrlQueryParams={{ additional: query-params }}
    uploadRequestHeaders={{ 'x-amz-acl': 'public-read' }}
    contentDisposition="auto"
    server="" />

The above example shows the most commonly used props. For uploadRequestHeaders, the default ACL is shown.

This expects a request to /s3/sign to return JSON with a signedUrl property that can be used to PUT the file in S3.
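For reference, a minimal signing response might look like the following sketch (the bucket name and key are illustrative placeholders; only the signedUrl property is required):

```javascript
// Hypothetical example of the JSON a signing endpoint returns.
// The component issues a PUT to signedUrl with the file as the request body.
const signingResponse = {
  signedUrl: 'https://my-bucket.s3.amazonaws.com/uploads/photo.jpg?X-Amz-Signature=abc123'
};

console.log(typeof signingResponse.signedUrl); // 'string'
```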

contentDisposition is optional and can be one of inline, attachment or auto. If given, the Content-Disposition header will be set accordingly with the file's original filename. If it is auto, the disposition type will be set to inline for images and attachment for all other files.
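The auto rule described above can be sketched as follows (the helper name is hypothetical, not the library's internal code):

```javascript
// Sketch of the contentDisposition="auto" rule: inline for images,
// attachment for everything else.
function resolveDisposition(mode, file) {
  if (mode !== 'auto') return mode; // 'inline' or 'attachment' pass through unchanged
  return /^image\//.test(file.type) ? 'inline' : 'attachment';
}

console.log(resolveDisposition('auto', { type: 'image/png' }));       // 'inline'
console.log(resolveDisposition('auto', { type: 'application/pdf' })); // 'attachment'
```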

server is optional and can be used to specify the location of the server which is running the ReactS3Uploader server component if it is not the same as the one from which the client is served.

The resulting DOM is essentially:

<input type="file" onChange={this.uploadFile} />

The preprocess(file, next) prop provides an opportunity to do something before the file upload begins, modify the file (scaling the image for example), or abort the upload by not calling next(file).
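For example, a preprocess hook might reject oversized files before the upload starts (the 10 MB limit here is an arbitrary illustration):

```javascript
// Hypothetical preprocess hook: aborts uploads over 10 MB by never calling
// next(file); otherwise passes the file through unchanged.
function preprocess(file, next) {
  const MAX_BYTES = 10 * 1024 * 1024;
  if (file.size > MAX_BYTES) {
    console.log('Upload aborted: file too large');
    return; // not calling next(file) aborts the upload
  }
  next(file);
}
```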

When a file is chosen, it will immediately be uploaded to S3. You can listen for progress (and create a status bar, for example) by providing an onProgress function to the component.
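A minimal progress handler might format a status-bar label like this (the percent and message argument names are assumed for illustration):

```javascript
// Hypothetical onProgress helper: builds a status-bar label from the
// percentage and status message the component reports.
function formatProgress(percent, message) {
  return `${message}: ${percent}%`;
}

console.log(formatProgress(42, 'Uploading')); // 'Uploading: 42%'
```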

Extra props

You can pass any extra props to <ReactS3Uploader /> and these will be passed down to the final <input />, which means that if you give the ReactS3Uploader a className or a name prop, the input will have those as well.

Using custom function to get signedUrl

You can provide a custom function to supply the signedUrl directly to the uploader by adding a getSignedUrl prop. The function you provide should take file and callback arguments. The callback should be called with an object containing a signedUrl key.

import ApiClient from './ApiClient';

function getSignedUrl(file, callback) {
  const client = new ApiClient();
  const params = {
    objectName: file.name,
    contentType: file.type
  };

  client.get('/my/signing/server', { params })
    .then(data => {
      callback(data);
    })
    .catch(error => {
      console.error(error);
    });
}


Bundled router

You can use the Express router that is bundled with this module to answer calls to /s3/sign

app.use('/s3', require('react-s3-uploader/s3router')({
    bucket: "MyS3Bucket",
    region: 'us-east-1', // optional
    signatureVersion: 'v4', // optional (required by some AWS regions, e.g. Frankfurt)
    headers: {'Access-Control-Allow-Origin': '*'}, // optional
    ACL: 'private' // this is the default
}));

This also provides two additional endpoints: GET /s3/img/(.*) and GET /s3/uploads/(.*). These create a temporary URL that provides access to the uploaded file (files are uploaded privately by default). The request is then redirected to that URL, so the file is served to the client.

To use this you will need to include the express module in your package.json dependencies.

Access/Secret Keys

The aws-sdk must be configured with your account's Access Key and Secret Access Key. There are a number of ways to provide these, but setting up environment variables is the quickest. You just have to configure environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, and AWS automatically picks them up.
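For example (the values shown are placeholders; substitute your own credentials):

```shell
# Placeholder credentials -- replace with your account's actual keys
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
```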

Other Types of Servers

Boto for Python, in a Django project

import boto
import mimetypes
import json

from django.http import HttpResponse

conn = boto.connect_s3('AWS_KEY', 'AWS_SECRET')

def sign_s3_upload(request):
    object_name = request.GET['objectName']
    content_type = mimetypes.guess_type(object_name)[0]

    signed_url = conn.generate_url(
        300,            # expiry in seconds
        'PUT',          # the upload method the component uses
        'BUCKET_NAME',  # placeholder bucket name
        'FOLDER_NAME' + object_name,
        headers = {'Content-Type': content_type, 'x-amz-acl': 'public-read'})

    return HttpResponse(json.dumps({'signedUrl': signed_url}))

Ruby on Rails, assuming FOG usage

# Usual fog config, set as an initializer
storage = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

# In the controller
options = {path_style: true}
headers = {"Content-Type" => params[:contentType], "x-amz-acl" => "public-read"}

url = storage.put_object_url(ENV['S3_BUCKET_NAME'], "user_uploads/#{params[:objectName]}", 15.minutes.from_now.to_time.to_i, headers, options)

respond_to do |format|
  format.json { render json: {signedUrl: url} }
end

Other Servers

If you do some work on another server, and would love to contribute documentation, please send us a PR!

Changelog (Starting at 1.2.0)

  • Added an optional preprocess hook, which supports asynchronous operations such as resizing an image before upload [#79 #72]
  • Fix uglify warning [#77]
  • Avoid react warning by not passing unnecessary props to Dom.input [#75]
  • Allow custom getSignedUrl() function to be provided [#22]
  • Replace unsafe characters (per AWS docs) with underscores [#69]
  • Support signatureVersion option
  • Not passing non-JSON response text to error handlers
  • Fixes issue where URL would include "undefined" if this.server was not specified
  • Using react-dom
  • Breaking Change [Fixes #52] Removing express as a peerDependency. Projects should explicitly depend on express to use the bundled router
  • [Fixes #51] url encode the contentType
  • [Fixes #48] Only setting the AWS region for the S3 client, not the global default
  • Added server prop to ReactS3Uploader to support running the signing server on a different domain
  • Added headers option to s3router to support specifying 'Access-Control-Allow-Origin' header (or any others)
  • [Fixes #44] Using unorm.nfc(str) in favor of str.normalize()
  • Added dependencies unorm and latinize for uploading files with non-latin characters.
  • Filenames are normalized, latinized, and whitespace is stripped before uploading


Top Contributors

seanadkinson njj StanBoyet rogerso pholisma alexprice91 clarkie jakerichan jakubrohleder shalstvedt sfrdmn dpretty ani-u alex-tan DylanGriffith CoericK ilyakatz jonbrennecke Kenny-House mndvns anathematic vjustov zeke angkec davearata a-koka ugputu18


Dependencies

package              version
aws-sdk              2.x
create-react-class   ^15.5.2
object-assign        ^2.0.0
prop-types           ^15.5.8
uuid                 ^3.1.0

Peer dependencies

react                *
react-dom            *

