.. role:: raw-html-m2r(raw)
   :format: html

Azure Schema Registry Avro Serializer client library for Python
================================================================

Azure Schema Registry is a schema repository service hosted by Azure Event Hubs, providing schema storage, versioning, and management. This package provides an Avro serializer capable of serializing and deserializing payloads containing Schema Registry schema identifiers and Avro-encoded data.

`Source code `_ | `Package (PyPi) `_ | `API reference documentation `_ | `Samples `_ | `Changelog `_

*Disclaimer*
----------------

*Azure SDK Python packages' support for Python 2.7 ends 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691*

Getting started
---------------

Install the package
^^^^^^^^^^^^^^^^^^^

Install the Azure Schema Registry Avro Serializer client library and Azure Identity client library for Python with `pip `_\ :

.. code-block:: Bash

   pip install azure-schemaregistry-avroserializer azure-identity

Prerequisites:
^^^^^^^^^^^^^^

To use this package, you must have:

* Azure subscription - `Create a free account `_
* `Azure Schema Registry `_
* Python 2.7, 3.6 or later - `Install Python `_

Authenticate the client
^^^^^^^^^^^^^^^^^^^^^^^

Interaction with the Schema Registry Avro Serializer starts with an instance of the AvroSerializer class, which takes the schema group name and the `Schema Registry Client `_ class. The client constructor takes the Event Hubs fully qualified namespace and an Azure Active Directory credential:

* The fully qualified namespace of the Schema Registry instance should follow the format: ``.servicebus.windows.net``.
* An AAD credential that implements the `TokenCredential `_ protocol should be passed to the constructor. There are implementations of the ``TokenCredential`` protocol available in the `azure-identity package `_. To use the credential types provided by ``azure-identity``\ , please install the Azure Identity client library for Python with `pip `_\ :

  .. code-block:: Bash

     pip install azure-identity

* Additionally, to use the async API supported on Python 3.6+, you must first install an async transport, such as `aiohttp `_\ :

  .. code-block:: Bash

     pip install aiohttp

**Create AvroSerializer using the azure-schemaregistry library:**

.. code-block:: python

   from azure.schemaregistry import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer import AvroSerializer
   from azure.identity import DefaultAzureCredential

   credential = DefaultAzureCredential()

   # Namespace should be similar to: '.servicebus.windows.net'
   fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>'
   group_name = '<< GROUP NAME OF THE SCHEMA >>'
   schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential)
   serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

Key concepts
------------

AvroSerializer
^^^^^^^^^^^^^^

Provides an API to serialize to and deserialize from Avro Binary Encoding, plus a header containing the schema ID. Uses `SchemaRegistryClient `_ to get schema IDs from schema content or vice versa.

Message format
^^^^^^^^^^^^^^

The same format is used by schema registry serializers across Azure SDK languages. Messages are encoded as follows (a parsing sketch follows the list):

* 4 bytes: Format Indicator

  * Currently always zero to indicate the format below.

* 32 bytes: Schema ID

  * UTF-8 hexadecimal representation of a GUID.
  * 32 hex digits, no hyphens.
  * Same format and byte order as the string from the Schema Registry service.

* Remaining bytes: Avro payload (in general, format-specific payload)

  * Avro Binary Encoding
  * NOT an Avro Object Container File, which includes the schema and defeats the purpose of this serializer: to move the schema out of the message payload and into the schema registry.
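To make this layout concrete, the following is a minimal sketch (not part of the library's API) that splits a serialized payload into the three components above. The helper name ``split_message`` is hypothetical, and the byte order of the format indicator is assumed only for illustration, since its value is currently always zero.

.. code-block:: python

   def split_message(payload: bytes):
       """Split a serialized payload into (format_indicator, schema_id, avro_bytes)."""
       if len(payload) < 36:
           raise ValueError("Payload is too short to contain the 36-byte header.")
       # The format indicator is currently always zero, so byte order does not matter yet;
       # big-endian is assumed here purely for illustration.
       format_indicator = int.from_bytes(payload[:4], "big")
       # 32-byte schema ID: UTF-8 hexadecimal representation of a GUID, no hyphens.
       schema_id = payload[4:36].decode("utf-8")
       # Remaining bytes: Avro Binary Encoding (the schema itself is not included).
       avro_bytes = payload[36:]
       return format_indicator, schema_id, avro_bytes

In practice you would pass the whole payload to ``AvroSerializer.deserialize`` rather than parsing the header yourself; the sketch only illustrates the documented byte layout.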
Examples
--------

The following sections provide several code snippets covering some of the most common Schema Registry tasks, including:

* `Serialization <#serialization>`_
* `Deserialization <#deserialization>`_
* `Event Hubs Sending Integration <#event-hubs-sending-integration>`_
* `Event Hubs Receiving Integration <#event-hubs-receiving-integration>`_

Serialization
^^^^^^^^^^^^^

Use the ``AvroSerializer.serialize`` method to serialize dict data with the given Avro schema. The method uses a schema previously registered with the Schema Registry service and keeps the schema cached for future serialization. It is also possible to skip pre-registering the schema and have the ``serialize`` method register it automatically by instantiating the ``AvroSerializer`` with the keyword argument ``auto_register_schemas=True``.

.. code-block:: python

   import os
   from azure.schemaregistry import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer import AvroSerializer
   from azure.identity import DefaultAzureCredential

   token_credential = DefaultAzureCredential()
   fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
   group_name = ""
   name = "example.avro.User"
   format = "Avro"

   definition = """
   {"namespace": "example.avro",
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "favorite_number", "type": ["int", "null"]},
        {"name": "favorite_color", "type": ["string", "null"]}
    ]
   }"""

   schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
   schema_registry_client.register_schema(group_name, name, definition, format)
   serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

   with serializer:
       dict_data = {"name": "Ben", "favorite_number": 7, "favorite_color": "red"}
       encoded_bytes = serializer.serialize(dict_data, schema=definition)

Deserialization
^^^^^^^^^^^^^^^

Use the ``AvroSerializer.deserialize`` method to deserialize raw bytes into dict data. The method automatically retrieves the schema from the Schema Registry service and keeps it cached for future deserialization.

.. code-block:: python

   import os
   from azure.schemaregistry import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer import AvroSerializer
   from azure.identity import DefaultAzureCredential

   token_credential = DefaultAzureCredential()
   fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
   group_name = ""
   schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
   serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

   with serializer:
       encoded_bytes = b''
       decoded_data = serializer.deserialize(encoded_bytes)
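If you are using the async API (which requires the ``aiohttp`` transport noted under `Authenticate the client`_), the same calls are awaited. The snippet below is a minimal sketch, assuming the async ``SchemaRegistryClient`` and ``AvroSerializer`` are exposed under the corresponding ``aio`` subpackages and mirror the synchronous method signatures:

.. code-block:: python

   import asyncio
   import os

   from azure.identity.aio import DefaultAzureCredential
   # Assumed import paths for the async clients, following the azure-sdk ``aio`` convention.
   from azure.schemaregistry.aio import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer.aio import AvroSerializer


   async def main():
       credential = DefaultAzureCredential()
       fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
       group_name = ""

       schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential)
       serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

       async with serializer:
           encoded_bytes = b''
           # Async variants of serialize/deserialize are coroutines and must be awaited.
           decoded_data = await serializer.deserialize(encoded_bytes)

       await credential.close()

   asyncio.run(main())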
Event Hubs Sending Integration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Integration with `Event Hubs `_ to send serialized Avro dict data as the body of an ``EventData`` message.

.. code-block:: python

   import os
   from azure.eventhub import EventHubProducerClient, EventData
   from azure.schemaregistry import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer import AvroSerializer
   from azure.identity import DefaultAzureCredential

   token_credential = DefaultAzureCredential()
   fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
   group_name = ""
   eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR']
   eventhub_name = os.environ['EVENT_HUB_NAME']

   definition = """
   {"namespace": "example.avro",
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "favorite_number", "type": ["int", "null"]},
        {"name": "favorite_color", "type": ["string", "null"]}
    ]
   }"""

   schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
   avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name, auto_register_schemas=True)

   eventhub_producer = EventHubProducerClient.from_connection_string(
       conn_str=eventhub_connection_str,
       eventhub_name=eventhub_name
   )

   with eventhub_producer, avro_serializer:
       event_data_batch = eventhub_producer.create_batch()
       dict_data = {"name": "Bob", "favorite_number": 7, "favorite_color": "red"}
       payload_bytes = avro_serializer.serialize(dict_data, schema=definition)
       event_data_batch.add(EventData(body=payload_bytes))
       eventhub_producer.send_batch(event_data_batch)

Event Hubs Receiving Integration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Integration with `Event Hubs `_ to receive ``EventData`` and deserialize the raw bytes into Avro dict data.

.. code-block:: python

   import os
   from azure.eventhub import EventHubConsumerClient
   from azure.schemaregistry import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer import AvroSerializer
   from azure.identity import DefaultAzureCredential

   token_credential = DefaultAzureCredential()
   fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
   group_name = ""
   eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR']
   eventhub_name = os.environ['EVENT_HUB_NAME']

   schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
   avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

   eventhub_consumer = EventHubConsumerClient.from_connection_string(
       conn_str=eventhub_connection_str,
       consumer_group='$Default',
       eventhub_name=eventhub_name,
   )

   def on_event(partition_context, event):
       # Reassemble the event body and deserialize it back into a dict.
       bytes_payload = b"".join(b for b in event.body)
       deserialized_data = avro_serializer.deserialize(bytes_payload)

   with eventhub_consumer, avro_serializer:
       eventhub_consumer.receive(on_event=on_event, starting_position="-1")

Troubleshooting
---------------

General
^^^^^^^

The Azure Schema Registry Avro Serializer raises exceptions defined in `Azure Core `_.
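For example, a failed schema lookup or registration during ``serialize`` surfaces as one of these errors. The snippet below is a minimal handling sketch that reuses ``serializer``, ``dict_data``, and ``definition`` from the serialization example above; the exception types shown are common ``azure.core`` errors, not an exhaustive list.

.. code-block:: python

   from azure.core.exceptions import ClientAuthenticationError, HttpResponseError

   try:
       with serializer:
           encoded_bytes = serializer.serialize(dict_data, schema=definition)
   except ClientAuthenticationError:
       # The AAD credential could not authenticate against the service.
       raise
   except HttpResponseError as error:
       # Any other service error, e.g. the schema group or schema could not be found.
       print("Schema Registry request failed: {}".format(error.message))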
Logging
^^^^^^^

This library uses the standard `logging `_ library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level. Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the ``logging_enable`` argument:

.. code-block:: python

   import sys
   import logging
   from azure.schemaregistry import SchemaRegistryClient
   from azure.schemaregistry.serializer.avroserializer import AvroSerializer
   from azure.identity import DefaultAzureCredential

   # Create a logger for the SDK
   logger = logging.getLogger('azure.schemaregistry')
   logger.setLevel(logging.DEBUG)

   # Configure a console output
   handler = logging.StreamHandler(stream=sys.stdout)
   logger.addHandler(handler)

   credential = DefaultAzureCredential()
   schema_registry_client = SchemaRegistryClient("", credential, logging_enable=True)
   # This client will log detailed information about its HTTP sessions, at DEBUG level
   serializer = AvroSerializer(client=schema_registry_client, group_name="")

Similarly, ``logging_enable`` can enable detailed logging for a single operation, even when it isn't enabled for the client:

.. code-block:: py

   serializer.serialize(dict_data, schema=schema_definition, logging_enable=True)

Next steps
----------

More sample code
^^^^^^^^^^^^^^^^

Please find further examples in the `samples `_ directory demonstrating common Azure Schema Registry Avro Serializer scenarios.

Contributing
------------

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the `Microsoft Open Source Code of Conduct `_. For more information see the `Code of Conduct FAQ `_ or contact `opencode@microsoft.com `_ with any additional questions or comments.

Indices and tables
------------------

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

.. toctree::
   :maxdepth: 5
   :glob:
   :caption: Developer Documentation

   azure.schemaregistry.serializer.rst