Azure Storage Blob client library for JavaScript

Azure Storage Blob is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that does not adhere to a particular data model or definition, such as text or binary data.

This project provides a client library in JavaScript that makes it easy to consume Microsoft Azure Storage Blob service.

Use the client library in this package to:

  • Get/Set Blob Service Properties
  • Create/List/Delete Containers
  • Create/Read/List/Update/Delete Block Blobs
  • Create/Read/List/Update/Delete Page Blobs
  • Create/Read/List/Update/Delete Append Blobs

Source code | Package (npm) | API Reference Documentation | Product documentation | Samples | Azure Storage Blob REST APIs

Getting started

Prerequisites: You must have an Azure subscription and a Storage Account to use this package. If you are using this package in a Node.js application, then Node.js version 8.0.0 or higher is required.

Install the package

The preferred way to install the Azure Storage Blob client library for JavaScript is to use the npm package manager. Type the following into a terminal window:

npm install @azure/storage-blob

Authenticate the client

Azure Storage supports several ways to authenticate. In order to interact with the Azure Blob Storage service, you'll need to create an instance of a Storage client - for example BlobServiceClient, ContainerClient, or BlobClient. See the samples for creating the BlobServiceClient to learn more about authentication.

Azure Active Directory

The Azure Blob Storage service supports the use of Azure Active Directory to authenticate requests to its APIs. The @azure/identity package provides a variety of credential types that your application can use to do this. Please see the README for @azure/identity for more details and samples to get you started.

Compatibility

This library is compatible with Node.js and browsers, and validated against LTS Node.js versions (>=8.16.0) and latest versions of Chrome, Firefox and Edge.

Compatible with IE11

You need polyfills to make this library work with IE11. The easiest way is to use @babel/polyfill or a polyfill service (see the sketch after the feature list below).

You can also load separate polyfills for the missing ES feature(s). This library depends on the following ES features, which need external polyfills loaded:

  • Promise
  • String.prototype.startsWith
  • String.prototype.endsWith
  • String.prototype.repeat
  • String.prototype.includes
  • Array.prototype.includes
  • Object.assign
  • Object.keys (override IE11's Object.keys with the ES6 polyfill to force ES6 behavior)
  • Symbol
  • Symbol.iterator
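
For example, a minimal sketch of loading the polyfills, assuming @babel/polyfill has been installed via npm, is to import it once at the very top of your application's entry point, before any other imports:

// Hypothetical entry point (e.g. index.js); load all polyfills before the SDK
require("@babel/polyfill");

const { BlobServiceClient } = require("@azure/storage-blob");
// ... rest of the application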

Differences between Node.js and browsers

There are differences between the Node.js and browser runtimes. When getting started with this library, pay attention to APIs or classes marked with "ONLY AVAILABLE IN NODE.JS RUNTIME" or "ONLY AVAILABLE IN BROWSERS".

Features, interfaces, classes or functions only available in Node.js
  • Shared Key Authorization based on account name and account key
    • StorageSharedKeyCredential
  • Shared Access Signature (SAS) generation
    • generateAccountSASQueryParameters()
    • generateBlobSASQueryParameters()
  • Parallel uploading and downloading (see the sketch below)
    • BlockBlobClient.uploadFile()
    • BlockBlobClient.uploadStream()
    • BlobClient.downloadToBuffer()
    • BlobClient.downloadToFile()
Features, interfaces, classes or functions only available in browsers
  • Parallel uploading and downloading
    • BlockBlobClient.uploadBrowserData()
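
As an illustration of the Node.js-only parallel transfer helpers, here is a minimal sketch using BlockBlobClient.uploadFile() and downloadToFile(); the account, container, blob, and local file paths are placeholders:

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

async function main() {
  const containerClient = blobServiceClient.getContainerClient("<container name>");
  const blockBlobClient = containerClient.getBlockBlobClient("<blob name>");

  // Upload a local file in parallel blocks (Node.js only)
  await blockBlobClient.uploadFile("<local file path>", { concurrency: 20 });

  // Download the blob back to a local file in parallel (Node.js only)
  await blockBlobClient.downloadToFile("<destination file path>");
}

main();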

JavaScript Bundle

To use this client library in the browser, you need to use a bundler. For details on how to do this, please refer to our bundling documentation.
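
As a rough sketch only (the bundling documentation is the authoritative reference), a hypothetical minimal webpack configuration could look like the following, assuming webpack and webpack-cli are installed as development dependencies:

// webpack.config.js (hypothetical minimal configuration)
const path = require("path");

module.exports = {
  mode: "production",
  entry: "./src/index.js",
  output: {
    filename: "bundle.js",
    path: path.resolve(__dirname, "dist"),
  },
};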

CORS

You need to set up Cross-Origin Resource Sharing (CORS) rules for your storage account if you develop for browsers. In the Azure portal or Azure Storage Explorer, find your storage account and create new CORS rules for the blob/queue/file/table service(s).

For example, you can create the following CORS settings for debugging, but please customize them carefully to match your requirements in a production environment. The rules can also be set programmatically, as in the sketch after this list.

  • Allowed origins: *
  • Allowed verbs: DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT
  • Allowed headers: *
  • Exposed headers: *
  • Maximum age (seconds): 86400
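
Here is a minimal sketch of applying the same debugging rules with BlobServiceClient.setProperties(); it reads the current service properties and replaces only the CORS rules (the account name is a placeholder):

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  new DefaultAzureCredential()
);

async function main() {
  // Fetch the current service properties so other settings are preserved
  const properties = await blobServiceClient.getProperties();

  // Replace the CORS rules with the debugging settings listed above
  properties.cors = [
    {
      allowedOrigins: "*",
      allowedMethods: "DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT",
      allowedHeaders: "*",
      exposedHeaders: "*",
      maxAgeInSeconds: 86400,
    },
  ];

  await blobServiceClient.setProperties(properties);
}

main();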

Key concepts

Blob storage is designed for:

  • Serving images or documents directly to a browser.
  • Storing files for distributed access.
  • Streaming video and audio.
  • Writing to log files.
  • Storing data for backup and restore, disaster recovery, and archiving.
  • Storing data for analysis by an on-premises or Azure-hosted service.

Blob storage offers three types of resources, each accessed through its own client (see the sketch after this list):

  • The storage account used via BlobServiceClient
  • A container in the storage account used via ContainerClient
  • A blob in a container used via BlobClient
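
These clients form a hierarchy, so a typical flow is to start from a BlobServiceClient and drill down; a minimal sketch (container and blob names are placeholders):

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  new DefaultAzureCredential()
);

// Scoped to one container in the storage account
const containerClient = blobServiceClient.getContainerClient("<container name>");

// Scoped to one blob in that container
const blobClient = containerClient.getBlobClient("<blob name>");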

Examples

Import the package

To use the clients, import the package into your file:

const AzureStorageBlob = require("@azure/storage-blob");

Alternatively, selectively import only the types you need:

const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

Create the blob service client

The BlobServiceClient requires a URL to the blob service and an access credential. It also optionally accepts some settings in the options parameter.

with DefaultAzureCredential from the @azure/identity package

Recommended way to instantiate a BlobServiceClient

Setup: Reference - Authorize access to blobs and queues with Azure Active Directory from a client application - https://docs.microsoft.com/azure/storage/common/storage-auth-aad-app

  • Register a new AAD application and give permissions to access Azure Storage on behalf of the signed-in user

    • Register a new application in Azure Active Directory (in the Azure portal) - https://docs.microsoft.com/azure/active-directory/develop/quickstart-register-app
    • In the API permissions section, select Add a permission and choose Microsoft APIs.
    • Pick Azure Storage and select the checkbox next to user_impersonation, then click Add permissions. This allows the application to access Azure Storage on behalf of the signed-in user.
  • Grant access to Azure Blob data with RBAC in the Azure Portal

  • Environment setup for the sample

    • From the overview page of your AAD Application, note down the CLIENT ID and TENANT ID. In the "Certificates & Secrets" tab, create a secret and note that down.
    • Make sure you have AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET set as environment variables to successfully execute the sample (they can be read via process.env).
    const { DefaultAzureCredential } = require("@azure/identity");
    const { BlobServiceClient } = require("@azure/storage-blob");

    // Enter your storage account name
    const account = "<account>";
    const defaultAzureCredential = new DefaultAzureCredential();

    const blobServiceClient = new BlobServiceClient(
      `https://${account}.blob.core.windows.net`,
      defaultAzureCredential
    );

    See the Azure AD Auth sample for a complete example using this method.

    [Note - The above steps apply only to Node.js]

with StorageSharedKeyCredential

Alternatively, you can instantiate a BlobServiceClient with a StorageSharedKeyCredential by passing the account name and account key as arguments. (The account name and account key can be obtained from the Azure portal.) [ONLY AVAILABLE IN NODE.JS RUNTIME]

  const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

  // Enter your storage account name and shared key
  const account = "<account>";
  const accountKey = "<accountkey>";

  // Use StorageSharedKeyCredential with storage account and account key
  // StorageSharedKeyCredential is only available in Node.js runtime, not in browsers
  const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
  const blobServiceClient = new BlobServiceClient(
    `https://${account}.blob.core.windows.net`,
    sharedKeyCredential
  );

with SAS Token

You can also instantiate a BlobServiceClient with a shared access signature (SAS). You can get the SAS token from the Azure portal or generate one using generateAccountSASQueryParameters() (see the sketch after the following example).

const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account name>";
const sas = "<service Shared Access Signature Token>";

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net${sas}`
);
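
For illustration, here is a minimal sketch of generating an account SAS with generateAccountSASQueryParameters() [ONLY AVAILABLE IN NODE.JS RUNTIME]; the permissions, services, and resource types shown are example choices:

const {
  AccountSASPermissions,
  AccountSASResourceTypes,
  AccountSASServices,
  BlobServiceClient,
  StorageSharedKeyCredential,
  generateAccountSASQueryParameters
} = require("@azure/storage-blob");

const account = "<account name>";
const accountKey = "<accountkey>";
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);

// Account SAS valid for one hour, with read and list permissions on the blob service
const sas = generateAccountSASQueryParameters(
  {
    expiresOn: new Date(Date.now() + 60 * 60 * 1000),
    permissions: AccountSASPermissions.parse("rl"),
    services: AccountSASServices.parse("b").toString(),
    resourceTypes: AccountSASResourceTypes.parse("sco").toString()
  },
  sharedKeyCredential
).toString();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net?${sas}`
);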

Create a new container

Use BlobServiceClient.getContainerClient() to get a container client instance, then create a new container resource.

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

async function main() {
  // Create a container
  const containerName = `newcontainer${new Date().getTime()}`;
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const createContainerResponse = await containerClient.create();
  console.log(`Created container ${containerName} successfully`, createContainerResponse.requestId);
}

main();

List the containers

Use the BlobServiceClient.listContainers() function to iterate over the containers, with the new for-await-of syntax:

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

async function main() {
  let i = 1;
  let iter = blobServiceClient.listContainers();
  for await (const container of iter) {
    console.log(`Container ${i++}: ${container.name}`);
  }
}

main();

Alternatively without using for-await-of:

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

async function main() {
  let i = 1;
  let iter = blobServiceClient.listContainers();
  let containerItem = await iter.next();
  while (!containerItem.done) {
    console.log(`Container ${i++}: ${containerItem.value.name}`);
    containerItem = await iter.next();
  }
}

main();

In addition, pagination is supported for listing, via byPage():

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

async function main() {
  let i = 1;
  for await (const response of blobServiceClient.listContainers().byPage({ maxPageSize: 20 })) {
    if (response.containerItems) {
      for (const container of response.containerItems) {
        console.log(`Container ${i++}: ${container.name}`);
      }
    }
  }
}

main();

For a complete sample on iterating containers please see samples/iterators-containers.ts.

Create a blob by uploading data

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

const containerName = "<container name>";

async function main() {
  const containerClient = blobServiceClient.getContainerClient(containerName);

  const content = "Hello world!";
  const blobName = "newblob" + new Date().getTime();
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
  console.log(`Uploaded block blob ${blobName} successfully`, uploadBlobResponse.requestId);
}

main();

List blobs inside a container

Listing blobs is similar to listing containers.

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

const containerName = "<container name>";

async function main() {
  const containerClient = blobServiceClient.getContainerClient(containerName);

  let i = 1;
  let iter = containerClient.listBlobsFlat();
  for await (const blob of iter) {
    console.log(`Blob ${i++}: ${blob.name}`);
  }
}

main();

For a complete sample on iterating blobs please see samples/iterators-blobs.ts.

Download a blob and convert it to a string (Node.js)

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account>";
const defaultAzureCredential = new DefaultAzureCredential();

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  defaultAzureCredential
);

const containerName = "<container name>";
const blobName = "<blob name>";

async function main() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blobClient = containerClient.getBlobClient(blobName);

  // Get blob content from position 0 to the end
  // In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
  const downloadBlockBlobResponse = await blobClient.download();
  const downloaded = await streamToString(downloadBlockBlobResponse.readableStreamBody);
  console.log("Downloaded blob content:", downloaded);

  // [Node.js only] A helper method used to read a Node.js readable stream into string
  async function streamToString(readableStream) {
    return new Promise((resolve, reject) => {
      const chunks = [];
      readableStream.on("data", (data) => {
        chunks.push(data.toString());
      });
      readableStream.on("end", () => {
        resolve(chunks.join(""));
      });
      readableStream.on("error", reject);
    });
  }
}

main();

Download a blob and convert it to a string (Browsers)

Please refer to the JavaScript Bundle section for more information on using this library in the browser.

const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account name>";
const sas = "<service Shared Access Signature Token>";
const containerName = "<container name>";
const blobName = "<blob name>";

const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net${sas}`
);

async function main() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blobClient = containerClient.getBlobClient(blobName);

  // Get blob content from position 0 to the end
  // In browsers, get downloaded data by accessing downloadBlockBlobResponse.blobBody
  const downloadBlockBlobResponse = await blobClient.download();
  const downloaded = await blobToString(await downloadBlockBlobResponse.blobBody);
  console.log(
    "Downloaded blob content",
    downloaded
  );

  // [Browsers only] A helper method used to convert a browser Blob into string.
  async function blobToString(blob) {
    const fileReader = new FileReader();
    return new Promise((resolve, reject) => {
      fileReader.onloadend = (ev) => {
        resolve(ev.target.result);
      };
      fileReader.onerror = reject;
      fileReader.readAsText(blob);
    });
  }
}

main();

A complete example of basic scenarios is at samples/basic.ts.

Troubleshooting

Enabling logging may help uncover useful information about failures. In order to see a log of HTTP requests and responses, set the AZURE_LOG_LEVEL environment variable to info. Alternatively, logging can be enabled at runtime by calling setLogLevel from @azure/logger:

import { setLogLevel } from "@azure/logger";

setLogLevel("info");

Next steps

More code samples:

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
