
Class BlockBlobClient

BlockBlobClient defines a set of operations applicable to block blobs.

export

Hierarchy

  BlobClient
    ↳ BlockBlobClient

Constructors

constructor

  • Creates an instance of BlockBlobClient. (A construction sketch follows the overload list below.)

    memberof

    BlockBlobClient

    Parameters

    • connectionString: string

      Account connection string or a SAS connection string of an Azure storage account. [Note: an account connection string can only be used in the Node.js runtime.]
      Account connection string example: DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=accountKey;EndpointSuffix=core.windows.net
      SAS connection string example: BlobEndpoint=https://myaccount.blob.core.windows.net/;QueueEndpoint=https://myaccount.queue.core.windows.net/;FileEndpoint=https://myaccount.file.core.windows.net/;TableEndpoint=https://myaccount.table.core.windows.net/;SharedAccessSignature=sasString

    • containerName: string

      Container name.

    • blobName: string

      Blob name.

    • Optional options: StoragePipelineOptions

    Returns BlockBlobClient

  • Creates an instance of BlockBlobClient. This method accepts an encoded URL or a non-encoded URL pointing to a block blob. An encoded URL string will NOT be escaped twice; only special characters in the URL path will be escaped. If a blob name includes ? or %, the blob name must be encoded in the URL.

    memberof

    BlockBlobClient

    Parameters

    Returns BlockBlobClient

  • Creates an instance of BlockBlobClient. This method accepts an encoded URL or a non-encoded URL pointing to a block blob. An encoded URL string will NOT be escaped twice; only special characters in the URL path will be escaped. If a blob name includes ? or %, the blob name must be encoded in the URL.

    memberof

    BlockBlobClient

    Parameters

    Returns BlockBlobClient
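
A minimal construction sketch. The connection-string overload is the one documented above; the URL-plus-credential overload and DefaultAzureCredential from @azure/identity are assumptions based on the credential property described below, and the account, container, and blob names are placeholders.

    import { BlockBlobClient } from "@azure/storage-blob";
    import { DefaultAzureCredential } from "@azure/identity";

    // Connection string + container name + blob name (Node.js runtime only).
    const fromConnectionString = new BlockBlobClient(
      "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=accountKey;EndpointSuffix=core.windows.net",
      "mycontainer",
      "myblob.txt"
    );

    // Assumed URL overload: blob URL plus a TokenCredential from @azure/identity.
    const fromUrl = new BlockBlobClient(
      "https://myaccount.blob.core.windows.net/mycontainer/myblob.txt",
      new DefaultAzureCredential()
    );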

Properties

accountName

accountName: string

credential

credential: StorageSharedKeyCredential | AnonymousCredential | TokenCredential

A credential such as AnonymousCredential, StorageSharedKeyCredential, or any credential from the @azure/identity package, used to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.

memberof

StorageClient

Protected isHttps

isHttps: boolean

memberof

StorageClient

Protected storageClientContext

storageClientContext: StorageClientContext

storageClientContext is a reference to the protocol-layer operations entry point, which is generated by the AutoRest generator.

memberof

StorageClient

url

url: string

Encoded URL string value.

memberof

StorageClient

Accessors

containerName

  • get containerName(): string

name

  • get name(): string

Methods

abortCopyFromURL

beginCopyFromURL

  • Asynchronously copies a blob to a destination within the storage account. This method returns a long running operation poller that allows you to wait indefinitely until the copy is completed. You can also cancel a copy before it is completed by calling cancelOperation on the poller. Note that the onProgress callback will not be invoked if the operation completes in the first request, and attempting to cancel a completed copy will result in an error being thrown.

    In version 2012-02-12 and later, the source for a Copy Blob operation can be a committed blob in any Azure storage account. Beginning with version 2015-02-21, the source for a Copy Blob operation can be an Azure file in any Azure storage account. Only storage accounts created on or after June 7th, 2012 allow the Copy Blob operation to copy from another storage account.

    see

    https://docs.microsoft.com/en-us/rest/api/storageservices/copy-blob

    Example using automatic polling:

    const copyPoller = await blobClient.beginCopyFromURL('url');
    const result = await copyPoller.pollUntilDone();

    Example using manual polling:

    const copyPoller = await blobClient.beginCopyFromURL('url');
    while (!copyPoller.isDone()) {
       await copyPoller.poll();
    }
    const result = copyPoller.getResult();

    Example using progress updates:

    const copyPoller = await blobClient.beginCopyFromURL('url', {
      onProgress(state) {
        console.log(`Progress: ${state.copyProgress}`);
      }
    });
    const result = await copyPoller.pollUntilDone();

    Example changing the polling interval (default 15 seconds):

    const copyPoller = await blobClient.beginCopyFromURL('url', {
      intervalInMs: 1000 // poll blob every 1 second for copy progress
    });
    const result = await copyPoller.pollUntilDone();

    Example using copy cancellation:

    const copyPoller = await blobClient.beginCopyFromURL('url');
    // cancel operation after starting it.
    try {
      await copyPoller.cancelOperation();
      // calls to get the result now throw PollerCancelledError
      await copyPoller.getResult();
    } catch (err) {
      if (err.name === 'PollerCancelledError') {
        console.log('The copy was cancelled.');
      }
    }

    Parameters

    Returns Promise<PollerLike<PollOperationState<BlobBeginCopyFromURLResponse>, BlobBeginCopyFromURLResponse>>

commitBlockList

createSnapshot

delete

deleteIfExists

download

  • Reads or downloads a blob from the system, including its metadata and properties. You can also call Get Blob to read a snapshot.

    • In Node.js, data returns in a Readable stream readableStreamBody
    • In browsers, data returns in a promise blobBody

    see

    https://docs.microsoft.com/en-us/rest/api/storageservices/get-blob

    memberof

    BlobClient

    Example usage (Node.js):

    // Download and convert a blob to a string
    const downloadBlockBlobResponse = await blobClient.download();
    const downloaded = await streamToBuffer(downloadBlockBlobResponse.readableStreamBody);
    console.log("Downloaded blob content:", downloaded.toString());
    
    async function streamToBuffer(readableStream) {
      return new Promise((resolve, reject) => {
        const chunks = [];
        readableStream.on("data", (data) => {
          chunks.push(data instanceof Buffer ? data : Buffer.from(data));
        });
        readableStream.on("end", () => {
          resolve(Buffer.concat(chunks));
        });
        readableStream.on("error", reject);
      });
    }

    Example usage (browser):

    // Download and convert a blob to a string
    const downloadBlockBlobResponse = await blobClient.download();
    const downloaded = await blobToString(await downloadBlockBlobResponse.blobBody);
    console.log(
      "Downloaded blob content",
      downloaded
    );
    
    async function blobToString(blob: Blob): Promise<string> {
      const fileReader = new FileReader();
      return new Promise<string>((resolve, reject) => {
        fileReader.onloadend = (ev: any) => {
          resolve(ev.target!.result);
        };
        fileReader.onerror = reject;
        fileReader.readAsText(blob);
      });
    }

    Parameters

    • Default value offset: number = 0
    • Optional count: undefined | number
    • Default value options: BlobDownloadOptions = {}

    Returns Promise<BlobDownloadResponseParsed>

downloadToBuffer

  • downloadToBuffer(offset?: undefined | number, count?: undefined | number, options?: BlobDownloadToBufferOptions): Promise<Buffer>
  • downloadToBuffer(buffer: Buffer, offset?: undefined | number, count?: undefined | number, options?: BlobDownloadToBufferOptions): Promise<Buffer>
  • ONLY AVAILABLE IN NODE.JS RUNTIME.

    Downloads an Azure Blob in parallel to a buffer. Offset and count are optional; the entire blob is downloaded if they are not provided. A usage sketch follows after the overloads.

    Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For blobs larger than this size, consider downloadToFile.

    export

    Parameters

    • Optional offset: undefined | number

      From which position of the block blob to download, in bytes

    • Optional count: undefined | number
    • Optional options: BlobDownloadToBufferOptions

    Returns Promise<Buffer>

  • ONLY AVAILABLE IN NODE.JS RUNTIME.

    Downloads an Azure Blob in parallel to a buffer. Offset and count are optional; the entire blob is downloaded if they are not provided.

    Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For blobs larger than this size, consider downloadToFile.

    export

    Parameters

    • buffer: Buffer

      Buffer to be filled; it must have a length larger than count

    • Optional offset: undefined | number

      From which position of the block blob to download, in bytes

    • Optional count: undefined | number
    • Optional options: BlobDownloadToBufferOptions

    Returns Promise<Buffer>
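
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient and the blob contains UTF-8 text):

    // Download the whole blob into a newly allocated buffer (offset and count omitted).
    const buffer = await blockBlobClient.downloadToBuffer();
    console.log("Blob content:", buffer.toString());

    // Or fill a preallocated buffer with the first 1024 bytes of the blob.
    const preallocated = Buffer.alloc(2048);
    await blockBlobClient.downloadToBuffer(preallocated, 0, 1024);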

downloadToFile

  • ONLY AVAILABLE IN NODE.JS RUNTIME.

    Downloads an Azure Blob to a local file. Fails if the given file path already exists. Offset and count are optional; pass 0 and undefined respectively to download the entire blob. A usage sketch follows at the end of this section.

    memberof

    BlobClient

    Parameters

    • filePath: string
    • Default value offset: number = 0
    • Optional count: undefined | number
    • Default value options: BlobDownloadOptions = {}

    Returns Promise<BlobDownloadResponseParsed>

    The response data for the blob download operation, with readableStreamBody set to undefined since its content has already been read and written into a local file at the specified path.
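
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient and that the local file paths do not already exist):

    // Download the entire blob to a local file (Node.js only).
    await blockBlobClient.downloadToFile("local-copy.txt");

    // Download only the first 1024 bytes to another file.
    await blockBlobClient.downloadToFile("first-kilobyte.bin", 0, 1024);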

exists

  • Returns true if the Azure blob resource represented by this client exists; false otherwise.

    NOTE: use this function with care, since an existing blob might be deleted by other clients or applications. Conversely, new blobs might be added by other clients or applications after this function completes. A usage sketch follows below.

    memberof

    BlobClient

    Parameters

    Returns Promise<boolean>
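
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient):

    // Check for existence before downloading; note the result can become stale
    // if other clients create or delete the blob concurrently.
    if (await blockBlobClient.exists()) {
      const downloadResponse = await blockBlobClient.download();
      console.log("Blob exists, content type:", downloadResponse.contentType);
    } else {
      console.log("Blob does not exist.");
    }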

generateSasUrl

getAppendBlobClient

getBlobLeaseClient

getBlockBlobClient

getBlockList

getPageBlobClient

getProperties

getTags

query

  • ONLY AVAILABLE IN NODE.JS RUNTIME.

    Quick query for a JSON or CSV formatted blob.

    Example usage (Node.js):

    // Query and convert a blob to a string
    const queryBlockBlobResponse = await blockBlobClient.query("select * from BlobStorage");
    const downloaded = (await streamToBuffer(queryBlockBlobResponse.readableStreamBody)).toString();
    console.log("Query blob content:", downloaded);
    
    async function streamToBuffer(readableStream) {
      return new Promise((resolve, reject) => {
        const chunks = [];
        readableStream.on("data", (data) => {
          chunks.push(data instanceof Buffer ? data : Buffer.from(data));
        });
        readableStream.on("end", () => {
          resolve(Buffer.concat(chunks));
        });
        readableStream.on("error", reject);
      });
    }

    memberof

    BlockBlobClient

    Parameters

    Returns Promise<BlobDownloadResponseModel>

setAccessTier

setHTTPHeaders

setMetadata

setTags

  • Sets tags on the underlying blob. A blob can have up to 10 tags. Tag keys must be between 1 and 128 characters. Tag values must be between 0 and 256 characters. Valid tag key and value characters include lower and upper case letters, digits (0-9), space (' '), plus ('+'), minus ('-'), period ('.'), forward slash ('/'), colon (':'), equals ('='), and underscore ('_'). A usage sketch follows below.

    memberof

    BlobClient

    Parameters

    Returns Promise<BlobSetTagsResponse>
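
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient; the tag names are hypothetical):

    // Set two tags on the blob, then read them back.
    await blockBlobClient.setTags({ project: "demo", status: "draft" });

    const tagsResponse = await blockBlobClient.getTags();
    console.log("Current tags:", tagsResponse.tags); // { project: "demo", status: "draft" }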

stageBlock

stageBlockFromURL

syncCopyFromURL

syncUploadFromURL

undelete

upload

  • Creates a new block blob, or updates the content of an existing block blob. Updating an existing block blob overwrites any existing metadata on the blob. Partial updates are not supported; the content of the existing blob is overwritten with the new content. To perform a partial update of a block blob's content, use stageBlock and commitBlockList (a staged-upload sketch follows at the end of this section).

    This is a non-parallel upload method; use uploadFile, uploadStream, or uploadBrowserData for better performance with concurrent uploads.

    see

    https://docs.microsoft.com/rest/api/storageservices/put-blob

    memberof

    BlockBlobClient

    Example usage:

    const content = "Hello world!";
    const uploadBlobResponse = await blockBlobClient.upload(content, content.length);

    Parameters

    • body: HttpRequestBody

      Blob, string, ArrayBuffer, ArrayBufferView, or a function that returns a new Readable stream whose offset is from the beginning of the data source.

    • contentLength: number

      Length of body in bytes. Use Buffer.byteLength() to calculate the body length for a string that includes non-Base64/Hex-encoded characters.

    • Default value options: BlockBlobUploadOptions = {}

    Returns Promise<BlockBlobUploadResponse>

    Response data for the Block Blob Upload operation.
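
    Example of a staged upload (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient; the chunking is purely illustrative, and block IDs must be base64-encoded strings of equal length):

    const chunks = ["Hello ", "world!"];
    const blockIds: string[] = [];

    // Stage each chunk as a block. Every block ID must be a base64 string, and
    // all IDs must have the same length before encoding.
    for (let i = 0; i < chunks.length; i++) {
      const blockId = Buffer.from(String(i).padStart(6, "0")).toString("base64");
      await blockBlobClient.stageBlock(blockId, chunks[i], Buffer.byteLength(chunks[i]));
      blockIds.push(blockId);
    }

    // Commit the staged blocks in order to form the final blob content.
    await blockBlobClient.commitBlockList(blockIds);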

uploadBrowserData

  • ONLY AVAILABLE IN BROWSERS.

    Uploads a browser Blob/File/ArrayBuffer/ArrayBufferView object to a block blob.

    When the buffer length is <= 256 MB, this method uses a single upload call to finish the upload. Otherwise, it calls stageBlock to upload blocks and finally calls commitBlockList to commit the block list.

    deprecated

    Use uploadData instead.

    export
    memberof

    BlockBlobClient

    Parameters

    • browserData: Blob | ArrayBuffer | ArrayBufferView

      Blob, File, ArrayBuffer or ArrayBufferView

    • Default value options: BlockBlobParallelUploadOptions = {}

    Returns Promise<BlobUploadCommonResponse>

    Response data for the Blob Upload operation.

uploadData

uploadFile

  • ONLY AVAILABLE IN NODE.JS RUNTIME.

    Uploads a local file in blocks to a block blob.

    When the file size is <= 256 MB, this method uses a single upload call to finish the upload. Otherwise, it calls stageBlock to upload blocks and finally calls commitBlockList to commit the block list. A usage sketch follows at the end of this section.

    memberof

    BlockBlobClient

    Parameters

    Returns Promise<BlobUploadCommonResponse>

    Response data for the Blob Upload operation.
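
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient and a local file at ./data/report.csv; the block size and concurrency values are illustrative):

    const uploadResponse = await blockBlobClient.uploadFile("./data/report.csv", {
      blockSize: 4 * 1024 * 1024, // 4 MB blocks
      concurrency: 20,
      onProgress: (progress) => console.log(`Uploaded ${progress.loadedBytes} bytes`)
    });
    console.log("Upload request id:", uploadResponse.requestId);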

uploadStream

  • ONLY AVAILABLE IN NODE.JS RUNTIME.

    Uploads a Node.js Readable stream into a block blob. A usage sketch follows at the end of this section.

    PERFORMANCE IMPROVEMENT TIPS:

    • Set the input stream's highWaterMark to the same value as the bufferSize parameter; this avoids Buffer.concat() operations.

    memberof

    BlockBlobClient

    Parameters

    • stream: Readable

      Node.js Readable stream

    • Default value bufferSize: number = DEFAULT_BLOCK_BUFFER_SIZE_BYTES

      Size of every buffer allocated; this is also the block size in the uploaded block blob. Default value is 8 MB

    • Default value maxConcurrency: number = 5

      Max concurrency indicates the maximum number of buffers that can be allocated; it is positively correlated with the maximum upload concurrency. Default value is 5

    • Default value options: BlockBlobUploadStreamOptions = {}

    Returns Promise<BlobUploadCommonResponse>

    Response data for the Blob Upload operation.
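
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient and a local file at ./data/large-file.bin; the buffer size, concurrency, and content type are illustrative):

    import * as fs from "fs";

    const stream = fs.createReadStream("./data/large-file.bin", {
      highWaterMark: 4 * 1024 * 1024 // match bufferSize to avoid Buffer.concat() work
    });

    await blockBlobClient.uploadStream(stream, 4 * 1024 * 1024, 5, {
      blobHTTPHeaders: { blobContentType: "application/octet-stream" }
    });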

withSnapshot

  • Creates a new BlockBlobClient object identical to the source but with the specified snapshot timestamp. Providing "" removes the snapshot and returns a URL to the base blob. A usage sketch follows below.

    memberof

    BlockBlobClient

    Parameters

    • snapshot: string

      The snapshot timestamp.

    Returns BlockBlobClient

    A new BlockBlobClient object identical to the source but with the specified snapshot timestamp.
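
    Example usage (a minimal sketch; assumes blockBlobClient is an existing BlockBlobClient running in Node.js):

    // Take a snapshot, then read the blob as it existed at that point in time.
    const snapshotResponse = await blockBlobClient.createSnapshot();
    const snapshotClient = blockBlobClient.withSnapshot(snapshotResponse.snapshot!);

    const snapshotBuffer = await snapshotClient.downloadToBuffer();
    console.log("Snapshot content:", snapshotBuffer.toString());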

withVersion

  • Creates a new BlobClient object pointing to a version of this blob. Providing "" removes the versionId and returns a client to the base blob.

    memberof

    BlobClient

    Parameters

    • versionId: string

      The versionId.

    Returns BlobClient

    A new BlobClient object pointing to the version of this blob.

Generated using TypeDoc