Creates an instance of DataLakeServiceClient from url.
A URL string pointing to the Azure Storage Data Lake service, such as "https://myaccount.dfs.core.windows.net". You can append a SAS if using AnonymousCredential, such as "https://myaccount.dfs.core.windows.net?sasString".
Creates an instance of DataLakeServiceClient from url and pipeline.
A URL string pointing to the Azure Storage Data Lake service, such as "https://myaccount.dfs.core.windows.net". You can append a SAS if using AnonymousCredential, such as "https://myaccount.dfs.core.windows.net?sasString".
Call newPipeline() to create a default pipeline, or provide a customized pipeline.
Encoded URL string value for corresponding blob endpoint.
Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.
Encoded URL string value for corresponding dfs endpoint.
StorageClient is a reference to the protocol-layer operations entry point, generated by the AutoRest generator.
storageClientContextWithBlobEndpoint is a reference to the protocol-layer operations entry point, generated by the AutoRest generator, with its URL pointing to the Blob endpoint.
Encoded URL string value.
Only available for DataLakeServiceClient constructed with a shared key credential.
Generates an account Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.
Optional. The time at which the shared access signature becomes invalid. Defaults to one hour from now if not specified.
An account SAS URI consisting of the URI to the resource represented by this client, followed by the generated SAS token.
Creates a DataLakeFileSystemClient object.
File system name.
ONLY AVAILABLE WHEN USING BEARER TOKEN AUTHENTICATION (TokenCredential).
Retrieves a user delegation key for the Data Lake service. This is only a valid operation when using bearer token authentication.
The start time for the user delegation SAS. Must be within 7 days of the current time.
The end time for the user delegation SAS. Must be within 7 days of the current time.
Returns an async iterable iterator to list all the file systems under the specified account.
.byPage() returns an async iterable iterator to list the file systems in pages.
Example using for await syntax:
let i = 1;
for await (const fileSystem of serviceClient.listFileSystems()) {
  console.log(`FileSystem ${i++}: ${fileSystem.name}`);
}
Example using iter.next():
let i = 1;
const iter = serviceClient.listFileSystems();
let fileSystemItem = await iter.next();
while (!fileSystemItem.done) {
  console.log(`FileSystem ${i++}: ${fileSystemItem.value.name}`);
  fileSystemItem = await iter.next();
}
Example using byPage():
// passing optional maxPageSize in the page settings
let i = 1;
for await (const response of serviceClient.listFileSystems().byPage({ maxPageSize: 20 })) {
  if (response.fileSystemItems) {
    for (const fileSystem of response.fileSystemItems) {
      console.log(`FileSystem ${i++}: ${fileSystem.name}`);
    }
  }
}
Example using paging with a marker:
let i = 1;
let iterator = serviceClient.listFileSystems().byPage({ maxPageSize: 2 });
let response = (await iterator.next()).value;
// Prints 2 file system names
if (response.fileSystemItems) {
  for (const fileSystem of response.fileSystemItems) {
    console.log(`FileSystem ${i++}: ${fileSystem.name}`);
  }
}
// Gets next marker
let marker = response.continuationToken;
// Passing next marker as continuationToken
iterator = serviceClient
  .listFileSystems()
  .byPage({ continuationToken: marker, maxPageSize: 10 });
response = (await iterator.next()).value;
// Prints 10 file system names
if (response.fileSystemItems) {
  for (const fileSystem of response.fileSystemItems) {
    console.log(`FileSystem ${i++}: ${fileSystem.name}`);
  }
}
DataLakeServiceClient allows you to manipulate Azure Data Lake service resources and file systems. The storage account provides the top-level namespace for the Data Lake service.