The Azure Monitor Ingestion client library is used to send custom logs to Azure Monitor.
This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in a Log Analytics workspace. You can even extend the schema of built-in tables with custom columns.
Install the Azure Monitor Ingestion client library for JS with npm:
npm install @azure/monitor-ingestion
An authenticated client is required to ingest data. To authenticate, create an instance of a TokenCredential class (see @azure/identity for DefaultAzureCredential and other TokenCredential implementations). Pass it to the constructor of your client class.
To authenticate, the following example uses DefaultAzureCredential from the @azure/identity package:
import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";
import * as dotenv from "dotenv";
dotenv.config();
const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT || "logs_ingestion_endpoint";
const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential);
Data Collection Endpoints (DCEs) allow you to uniquely configure ingestion settings for Azure Monitor. For an overview of DCEs, including their contents and structure and how to create and work with them, see Data collection endpoints in Azure Monitor.
Data collection rules (DCRs) define the data collected by Azure Monitor and specify how and where that data should be sent or stored. The REST API call must specify a DCR to use. A single DCE can support multiple DCRs, so you can specify a different DCR for different sources and target tables.
The DCR must understand the structure of the input data and the structure of the target table. If the two don't match, it can use a transformation to convert the source data to match the target table. You may also use the transform to filter source data and perform any other calculations or conversions.
For more details, refer to Data collection rules in Azure Monitor.
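As an illustration of how a DCR ties a stream to a destination, its data flow section pairs an input stream with a destination table and an optional KQL transformation. The stream, workspace, and table names in this sketch are hypothetical; see the DCR documentation for the full schema:

```json
{
  "properties": {
    "dataFlows": [
      {
        "streams": ["Custom-MyTableRawData"],
        "destinations": ["myLogAnalyticsWorkspace"],
        "transformKql": "source | extend TimeGenerated = todatetime(Time) | project TimeGenerated, Computer, AdditionalContext",
        "outputStream": "Custom-MyTable_CL"
      }
    ]
  }
}
```

Here transformKql reshapes the incoming records so they match the target table's columns; it can also filter records or perform other calculations.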
Custom logs can send data to any custom table that you create and to certain built-in tables in your Log Analytics workspace. The target table must exist before you can send data to it. For the list of currently supported built-in tables, see the Azure Monitor documentation for the Logs Ingestion API.
You can familiarize yourself with different APIs using Samples.
You can create a client and call the client's upload method. Take note of the data ingestion limits.
const { DefaultAzureCredential } = require("@azure/identity");
const { LogsIngestionClient } = require("@azure/monitor-ingestion");
require("dotenv").config();
async function main() {
const logsIngestionEndpoint = process.env.LOGS_INGESTION_ENDPOINT || "logs_ingestion_endpoint";
const ruleId = process.env.DATA_COLLECTION_RULE_ID || "data_collection_rule_id";
const streamName = process.env.STREAM_NAME || "data_stream_name";
const credential = new DefaultAzureCredential();
const client = new LogsIngestionClient(logsIngestionEndpoint, credential);
const logs = [
{
Time: "2021-12-08T23:51:14.1104269Z",
Computer: "Computer1",
AdditionalContext: "context-2",
},
{
Time: "2021-12-08T23:51:14.1104269Z",
Computer: "Computer2",
AdditionalContext: "context",
},
];
const result = await client.upload(ruleId, streamName, logs);
if (result.uploadStatus !== "Success") {
console.log("Some logs have failed to complete ingestion. Upload status=", result.uploadStatus);
for (const errors of result.errors) {
console.log(`Error - ${JSON.stringify(errors.responseError)}`);
console.log(`Log - ${JSON.stringify(errors.failedLogs)}`);
}
}
}
main().catch((err) => {
console.error("The sample encountered an error:", err);
process.exit(1);
});
module.exports = { main };
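Because uploads are subject to service ingestion limits, a large log array may need to be split into smaller requests. The helper below is a local sketch, not part of the library; batchLogs and the 1 MB budget are illustrative, and the current per-request limit should be checked against the Azure Monitor service limits documentation:

```javascript
// Sketch: split a log array into batches that stay under an approximate
// per-request size budget before calling client.upload.
// The 1 MB default is illustrative; verify the current service limits.
function batchLogs(logs, maxBytes = 1024 * 1024) {
  const batches = [];
  let current = [];
  let size = 0;
  for (const log of logs) {
    // Approximate the serialized size of this record in bytes.
    const logSize = Buffer.byteLength(JSON.stringify(log), "utf8");
    if (current.length > 0 && size + logSize > maxBytes) {
      batches.push(current);
      current = [];
      size = 0;
    }
    current.push(log);
    size += logSize;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}

module.exports = { batchLogs };
```

You could then upload each batch in turn, for example: for (const batch of batchLogs(logs)) { await client.upload(ruleId, streamName, batch); }.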
You can verify that your data has been uploaded correctly by using the @azure/monitor-query library. Run the Upload custom logs sample first before verifying the logs.
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.
/**
* @summary Demonstrates how to run a query against a Log Analytics workspace to verify whether the logs were uploaded
*/
const { DefaultAzureCredential } = require("@azure/identity");
const { LogsQueryClient } = require("@azure/monitor-query");
const monitorWorkspaceId = process.env.MONITOR_WORKSPACE_ID || "workspace_id";
const tableName = process.env.TABLE_NAME || "table_name";
require("dotenv").config();
async function main() {
const credential = new DefaultAzureCredential();
const logsQueryClient = new LogsQueryClient(credential);
const queriesBatch = [
{
workspaceId: monitorWorkspaceId,
query: tableName + " | count;",
timespan: { duration: "P1D" },
},
];
const result = await logsQueryClient.queryBatch(queriesBatch);
if (result[0].status === "Success") {
console.log("Table entry count: ", JSON.stringify(result[0].tables));
} else {
console.log(
`Some error encountered while retrieving the count. Status = ${result[0].status}`,
JSON.stringify(result[0])
);
}
}
main().catch((err) => {
console.error("The sample encountered an error:", err);
process.exit(1);
});
module.exports = { main };
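Each successful query result exposes its tables as parallel arrays of column descriptors and rows. As a small sketch for easier inspection, you could reshape rows into objects keyed by column name; tableToObjects is a hypothetical helper, and the exact result shape should be confirmed against the @azure/monitor-query documentation:

```javascript
// Sketch: convert a LogsTable-shaped result ({ columnDescriptors, rows })
// into plain objects keyed by column name.
// tableToObjects is a hypothetical helper, not part of @azure/monitor-query.
function tableToObjects(table) {
  return table.rows.map((row) =>
    Object.fromEntries(
      row.map((value, i) => [table.columnDescriptors[i].name, value])
    )
  );
}

module.exports = { tableToObjects };
```

For example, a count query result with a single Count column would yield objects such as { Count: n } for each row.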
Enabling logging may help uncover useful information about failures. To see a log of HTTP requests and responses, set the AZURE_LOG_LEVEL environment variable to info. Alternatively, logging can be enabled at runtime by calling setLogLevel from the @azure/logger package:
import { setLogLevel } from "@azure/logger";
setLogLevel("info");
For detailed instructions on how to enable logs, see the @azure/logger package docs.
To learn more about Azure Monitor, see the Azure Monitor service documentation.
If you'd like to contribute to this library, please read the contributing guide to learn more about how to build and test the code.