Azure DataLake service client library for Python¶
This preview package for Python includes ADLS Gen2-specific API support made available in the Storage SDK. This includes:
New directory-level operations (Create, Rename, Delete) for hierarchical namespace (HNS) enabled storage accounts. For HNS-enabled accounts, the rename/move operations are atomic.
Permission-related operations (Get/Set ACLs) for HNS-enabled accounts. A short sketch of both follows this list.
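For example, a minimal sketch of an atomic rename and an ACL round-trip might look like the following; the connection string, file system, and directory names are placeholders:

from azure.storage.filedatalake import DataLakeDirectoryClient

# Placeholder connection string and names, for illustration only
directory = DataLakeDirectoryClient.from_connection_string(
    "my_connection_string", file_system_name="myfilesystem", directory_name="mydir")

# Atomic rename/move on an HNS-enabled account; the new name is
# given as "<file system>/<full path>"
directory = directory.rename_directory("myfilesystem/mydir-renamed")

# Get and set POSIX-style ACLs (HNS-enabled accounts only)
acl_props = directory.get_access_control()
directory.set_access_control(acl="user::rwx,group::r-x,other::r--")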
Install the package¶
Install the Azure DataLake Storage client library for Python with pip:
pip install azure-storage-file-datalake --pre
Create a storage account¶
# Create a new resource group to hold the storage account -
# if using an existing resource group, skip this step
az group create --name my-resource-group --location westus2

# Install the extension 'Storage-Preview'
az extension add --name storage-preview

# Create the storage account
az storage account create --name my-storage-account-name --resource-group my-resource-group --sku Standard_LRS --kind StorageV2 --hierarchical-namespace true
Authenticate the client¶
Interaction with DataLake Storage starts with an instance of the DataLakeServiceClient class. You need an existing storage account, its URL, and a credential to instantiate the client object.
To authenticate the client you have a few options:
Use a SAS token string
Use an account shared access key
Use a token credential from azure.identity
Alternatively, you can authenticate with a storage connection string using the
from_connection_string method. See example: Client creation with a connection string.
You can omit the credential if your account URL already has a SAS token.
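For example, a token credential from azure.identity (a separate package) can be constructed as below; the shared-key and SAS options are plain strings, shown here as hypothetical placeholders:

# Requires the separate azure-identity package
from azure.identity import DefaultAzureCredential

# Option 1: a token credential from azure.identity
credential = DefaultAzureCredential()

# Option 2: an account shared access key (shown as a placeholder string)
# credential = "<account-access-key>"

# Option 3: a SAS token string (can also be appended to the account URL instead)
# credential = "<sas-token>"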
Once you have your account URL and credentials ready, you can create the DataLakeServiceClient:
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(account_url="https://<my-storage-account-name>.dfs.core.windows.net/", credential=credential)
DataLake storage offers four types of resources:
The storage account
A file system in the storage account
A directory under the file system
A file in the file system or under a directory
The DataLake Storage SDK provides four different clients to interact with the DataLake Service:
- DataLakeServiceClient - this client interacts with the DataLake Service at the account level.
It provides operations to retrieve and configure the account properties, as well as to list, create, and delete file systems within the account. For operations relating to a specific file system, directory, or file, clients for those entities can also be retrieved using the get_file_system_client, get_directory_client, or get_file_client functions.
- FileSystemClient - this client represents interaction with a specific
file system, even if that file system does not exist yet. It provides operations to create, delete, or configure a file system, to list the paths under it, and to upload or delete files and directories within it. For operations relating to a specific file, the client can also be retrieved using the get_file_client function. For operations relating to a specific directory, the client can be retrieved using the get_directory_client function.
- DataLakeDirectoryClient - this client represents interaction with a specific
directory, even if that directory does not exist yet. It provides operations to create, delete, and rename the directory, and to get and set its properties.
- DataLakeFileClient - this client represents interaction with a specific
file, even if that file does not exist yet. It provides operations to create, delete, and read the file, and to append and flush data to it.
- DataLakeLeaseClient - this client represents lease interactions with a FileSystemClient, DataLakeDirectoryClient
or DataLakeFileClient. It provides operations to acquire, renew, release, change, and break leases on the resources.
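As a sketch of how these clients relate (all resource names below are illustrative, and credential is any of the options from the Authenticate section):

from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(account_url="https://<my-storage-account-name>.dfs.core.windows.net/", credential=credential)

# Drill down from the account to a file system, directory, and file
file_system = service.get_file_system_client("myfilesystem")
directory = file_system.get_directory_client("mydir")
file = directory.get_file_client("myfile")

# A lease can then be taken on the file (or on the file system or directory)
lease = file.acquire_lease()
lease.release()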
The following sections provide several code snippets covering some of the most common DataLake Storage tasks, including:
Client creation with a connection string¶
Create the DataLakeServiceClient using the connection string to your Azure Storage account.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient.from_connection_string(conn_str="my_connection_string")
Uploading a file¶
Upload a file to your file system.
from azure.storage.filedatalake import DataLakeFileClient

data = b"abc"
file = DataLakeFileClient.from_connection_string("my_connection_string", file_system_name="myfilesystem", file_path="myfile")
file.create_file()
file.append_data(data, offset=0, length=len(data))
file.flush_data(len(data))
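For small payloads, the upload_data method can usually replace the create/append/flush sequence with a single call:

# Reusing `file` from the snippet above; upload_data creates the file,
# appends the data, and flushes it in one call
file.upload_data(b"abc", overwrite=True)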
Downloading a file¶
Download a file from your file system.
from azure.storage.filedatalake import DataLakeFileClient

file = DataLakeFileClient.from_connection_string("my_connection_string", file_system_name="myfilesystem", file_path="myfile")

with open("./BlockDestination.txt", "wb") as my_file:
    download = file.download_file()
    download.readinto(my_file)
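To read the contents into memory instead of writing to a local file, the downloader also exposes readall:

# Reusing `file` from the snippet above
download = file.download_file()
content = download.readall()  # returns bytes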
Enumerating paths¶
List the paths in your file system.
from azure.storage.filedatalake import FileSystemClient

file_system = FileSystemClient.from_connection_string("my_connection_string", file_system_name="myfilesystem")

paths = file_system.get_paths()
for path in paths:
    print(path.name + '\n')
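get_paths also accepts optional path and recursive keyword arguments, for example to scope the listing to a single directory:

# Reusing `file_system` from the snippet above: list only the entries
# directly under a hypothetical "mydir" directory, without recursing
paths = file_system.get_paths(path="mydir", recursive=False)
for path in paths:
    print(path.name)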
Optional configuration¶
Optional keyword arguments can be passed in at the client and per-operation level.
Retry Policy configuration¶
Use the following keyword arguments when instantiating a client to configure the retry policy:
retry_total (int): Total number of retries to allow. Takes precedence over other counts. Pass in retry_total=0 if you do not want to retry on requests. Defaults to 10.
retry_connect (int): How many connection-related errors to retry on. Defaults to 3.
retry_read (int): How many times to retry on read errors. Defaults to 3.
retry_status (int): How many times to retry on bad status codes. Defaults to 3.
retry_to_secondary (bool): Whether the request should be retried to the secondary endpoint, if able. This should only be enabled if RA-GRS accounts are used and potentially stale data can be handled. Defaults to False.
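For example, a client with a tightened retry policy might be created like this (the values are illustrative, and every retry keyword is optional):

from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient.from_connection_string(
    "my_connection_string",
    retry_total=5,
    retry_connect=2,
    retry_read=2,
    retry_status=2,
)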
Other client / per-operation configuration¶
Other optional configuration keyword arguments that can be specified on the client or per-operation.
Client keyword arguments:
connection_timeout (int): Optionally sets the connect and read timeout value, in seconds.
transport (Any): User-provided transport to send the HTTP request.
Per-operation keyword arguments:
raw_response_hook (callable): The given callback uses the response returned from the service.
raw_request_hook (callable): The given callback uses the request before being sent to service.
client_request_id (str): Optional user specified identification of the request.
user_agent (str): Appends the custom value to the user-agent header to be sent with the request.
logging_enable (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at the client level to enable it for all requests.
logging_body (bool): Enables logging the request and response body. Defaults to False. Can also be passed in at the client level to enable it for all requests.
headers (dict): Pass in custom headers as key, value pairs. E.g. headers={'CustomValue': value}.
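Several of these can be combined on a single operation. For example, the hook and user-agent value below are illustrative:

from azure.storage.filedatalake import DataLakeServiceClient

def on_response(response):
    # Called with the pipeline response returned from the service
    print(response.http_response.status_code)

service = DataLakeServiceClient.from_connection_string("my_connection_string")

# All keyword arguments shown here are optional
file_systems = list(service.list_file_systems(
    raw_response_hook=on_response,
    user_agent="my-app/1.0",
    logging_enable=True,
))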
Troubleshooting¶
DataLake Storage clients raise exceptions defined in Azure Core. This list can be used for reference to catch thrown exceptions. To get the specific error code of the exception, use the error_code attribute, i.e., exception.error_code.
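For example, catching a duplicate-creation error might look like this (the specific error code shown is illustrative):

from azure.core.exceptions import ResourceExistsError
from azure.storage.filedatalake import FileSystemClient

file_system = FileSystemClient.from_connection_string("my_connection_string", file_system_name="myfilesystem")

try:
    file_system.create_file_system()
except ResourceExistsError as e:
    # Storage exceptions carry the service error code, e.g. "FileSystemAlreadyExists"
    print(e.error_code)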
Logging¶
This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.
Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument:
import sys
import logging
from azure.storage.filedatalake import DataLakeServiceClient

# Create a logger for the 'azure.storage.filedatalake' SDK
logger = logging.getLogger('azure.storage')
logger.setLevel(logging.DEBUG)

# Configure a console output
handler = logging.StreamHandler(stream=sys.stdout)
logger.addHandler(handler)

# This client will log detailed information about its HTTP sessions, at DEBUG level
service_client = DataLakeServiceClient.from_connection_string("your_connection_string", logging_enable=True)
logging_enable can enable detailed logging for a single operation,
even when it isn’t enabled for the client:
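For example:

from azure.storage.filedatalake import DataLakeServiceClient

# A client created without logging_enable; only this one call logs at DEBUG level
service_client = DataLakeServiceClient.from_connection_string("your_connection_string")
file_systems = list(service_client.list_file_systems(logging_enable=True))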
More sample code¶
Get started with our Azure DataLake samples.
Several DataLake Storage Python SDK samples are available to you in the SDK’s GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with DataLake Storage:
`datalake_samples_access_control.py <https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.6.0b2/sdk/storage/azure-storage-file-datalake/samples/datalake_samples_access_control.py>`_ - Examples for common DataLake Storage tasks:
Set up a file system
Create a directory
Set/Get access control for the directory
Create files under the directory
Set/Get access control for each file
Delete file system
`datalake_samples_upload_download.py <https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.6.0b2/sdk/storage/azure-storage-file-datalake/samples/datalake_samples_upload_download.py>`_ - Examples for common DataLake Storage tasks:
Set up a file system
Append data to the file
Flush data to the file
Download the uploaded data
Delete file system
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.