Azure Cognitive Language Services Conversations client library for .NET
Azure Conversations - the new version of Language Understanding (LUIS) - is a cloud-based conversational AI service that applies custom machine-learning intelligence to a user's conversational, natural-language text to predict overall meaning and pull out relevant, detailed information. The service uses state-of-the-art technology to create natively multilingual models, which means users can train a model in one language and predict in others.
Source code | Package (NuGet) | API reference documentation | Product documentation | Samples
Getting started
Install the package
Install the Azure Cognitive Language Services Conversations client library for .NET with NuGet:
dotnet add package Azure.AI.Language.Conversations --prerelease
Prerequisites
- An Azure subscription
- An existing Text Analytics resource
Note: the new unified Cognitive Language Services are not currently available for deployment.
Authenticate the client
In order to interact with the Conversations service, you'll need to create an instance of the ConversationAnalysisClient class. You will need an endpoint and an API key to instantiate a client object. For more information regarding authenticating with Cognitive Services, see Authenticate requests to Azure Cognitive Services.
Get an API key
You can get the endpoint and an API key from the Cognitive Services resource in the Azure Portal.
Alternatively, use the Azure CLI command shown below to get the API key from the Cognitive Services resource.
az cognitiveservices account keys list --resource-group <resource-group-name> --name <resource-name>
Create a ConversationAnalysisClient
Once you've determined your endpoint and API key, you can instantiate a ConversationAnalysisClient:
Uri endpoint = new Uri("https://myaccount.api.cognitive.microsoft.com");
AzureKeyCredential credential = new AzureKeyCredential("{api-key}");
ConversationAnalysisClient client = new ConversationAnalysisClient(endpoint, credential);
Key concepts
ConversationAnalysisClient
The ConversationAnalysisClient is the primary interface for making predictions using your deployed Conversations models. It provides both synchronous and asynchronous APIs to submit queries.
Thread safety
We guarantee that all client instance methods are thread-safe and independent of each other (guideline). This ensures that the recommendation of reusing client instances is always safe, even across threads.
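Because client instances are thread-safe, a single client can be shared across concurrent operations rather than constructing one per request. A minimal sketch, assuming the endpoint, credential, and the "Menu" project and "production" deployment used in the examples below:

```csharp
using System;
using System.Threading.Tasks;
using Azure;
using Azure.AI.Language.Conversations;

Uri endpoint = new Uri("https://myaccount.api.cognitive.microsoft.com");
AzureKeyCredential credential = new AzureKeyCredential("{api-key}");

// One client instance, reused safely from multiple concurrent tasks.
ConversationAnalysisClient client = new ConversationAnalysisClient(endpoint, credential);

Task<Response<AnalyzeConversationResult>>[] pending =
{
    client.AnalyzeConversationAsync("Menu", "production", "One seaweed salad, please."),
    client.AnalyzeConversationAsync("Menu", "production", "Two miso soups to go.")
};
await Task.WhenAll(pending);
```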
Additional concepts
Client options | Accessing the response | Long-running operations | Handling failures | Diagnostics | Mocking | Client lifetime
Examples
The Azure.AI.Language.Conversations client library provides both synchronous and asynchronous APIs.
The following examples show common scenarios using the client [created above](#create-a-conversationanalysisclient).
Analyze a conversation
To analyze a conversation, call the AnalyzeConversation() method, which takes the project name, deployment name, and query as parameters.
Response<AnalyzeConversationResult> response = client.AnalyzeConversation(
"Menu",
"production",
"We'll have 2 plates of seared salmon nigiri.");
Console.WriteLine($"Top intent: {response.Value.Prediction.TopIntent}");
The same parameters can also be used to initialize an AnalyzeConversationOptions instance. You can then call AnalyzeConversation() with the options object as a parameter, as shown below.
AnalyzeConversationOptions options = new AnalyzeConversationOptions(
"Menu",
"production",
"We'll have 2 plates of seared salmon nigiri.");
Response<AnalyzeConversationResult> response = client.AnalyzeConversation(options);
Console.WriteLine($"Top intent: {response.Value.Prediction.TopIntent}");
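The client also exposes asynchronous counterparts of these methods. A sketch using AnalyzeConversationAsync, assuming the same "Menu" project and "production" deployment as above:

```csharp
using System;
using Azure;
using Azure.AI.Language.Conversations;

Response<AnalyzeConversationResult> response = await client.AnalyzeConversationAsync(
    "Menu",
    "production",
    "We'll have 2 plates of seared salmon nigiri.");
Console.WriteLine($"Top intent: {response.Value.Prediction.TopIntent}");
```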
Analyze a conversation in a different language
The Language property on AnalyzeConversationOptions can be used to specify the language of the conversation.
AnalyzeConversationOptions options = new AnalyzeConversationOptions(
"Menu",
"production",
"Tendremos 2 platos de nigiri de salmón braseado.")
{
Language = "es"
};
Response<AnalyzeConversationResult> response = client.AnalyzeConversation(options);
Console.WriteLine($"Top intent: {response.Value.Prediction.TopIntent}");
Other optional properties can also be set, such as verbosity and whether service logging is enabled.
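As a sketch of those optional settings - assuming the Verbose and IsLoggingEnabled property names from the preview API - they can be set in the same object initializer as Language:

```csharp
using System;
using Azure;
using Azure.AI.Language.Conversations;

AnalyzeConversationOptions options = new AnalyzeConversationOptions(
    "Menu",
    "production",
    "We'll have 2 plates of seared salmon nigiri.")
{
    Verbose = true,          // ask the service for more detailed prediction output
    IsLoggingEnabled = false // opt out of service-side logging of the query
};
Response<AnalyzeConversationResult> response = client.AnalyzeConversation(options);
Console.WriteLine($"Top intent: {response.Value.Prediction.TopIntent}");
```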
Troubleshooting
General
When you interact with the Cognitive Language Services Conversations client library using the .NET SDK, errors returned by the service correspond to the same HTTP status codes returned for REST API requests.
For example, if you submit a query to a nonexistent project, a 400 error is returned indicating "Bad Request".
try
{
Response<AnalyzeConversationResult> response = client.AnalyzeConversation(
"invalid-project",
"production",
"We'll have 2 plates of seared salmon nigiri.");
}
catch (RequestFailedException ex)
{
Console.WriteLine(ex.ToString());
}
You will notice that additional information is logged, like the client request ID of the operation.
Azure.RequestFailedException: The input parameter is invalid.
Status: 400 (Bad Request)
ErrorCode: InvalidArgument
Content:
{
"error": {
"code": "InvalidArgument",
"message": "The input parameter is invalid.",
"innerError": {
"code": "InvalidArgument",
"message": "The input parameter \"payload\" cannot be null or empty."
}
}
}
Headers:
Transfer-Encoding: chunked
pragma: no-cache
request-id: 0303b4d0-0954-459f-8a3d-1be6819745b5
apim-request-id: 0303b4d0-0954-459f-8a3d-1be6819745b5
x-envoy-upstream-service-time: 15
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
x-content-type-options: nosniff
Cache-Control: no-store, proxy-revalidate, no-cache, max-age=0, private
Content-Type: application/json
Setting up console logging
The simplest way to see the logs is to enable console logging. To create an Azure SDK log listener that outputs messages to the console, use the AzureEventSourceListener.CreateConsoleLogger method.
// Setup a listener to monitor logged events.
using AzureEventSourceListener listener = AzureEventSourceListener.CreateConsoleLogger();
To learn more about other logging mechanisms see here.
Next steps
- View our samples.
- Read about the different features of the Conversations service.
- Try our service demos.
Contributing
See the CONTRIBUTING.md for details on building, testing, and contributing to this library.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.