Created AI-powered-insights.md #1679

101 changes: 101 additions & 0 deletions doc-output/configure-the-report-engine/aiclient-element.md
---
title: AIClient Element
page_title: AIClient Element Configuration
description: "Learn how to utilize the AIClient Element to configure the AI model used for GenAI-powered insights during report preview"
slug: telerikreporting/aiclient-element
tags: aiclient, element, ai
published: True
position: 13
---

<style>
table th:first-of-type {
width: 10%;
}
table th:nth-of-type(2) {
width: 90%;
}
</style>

# AIClient Element Overview

The `AIClient` element specifies the configuration settings for the GenAI-powered insights functionality of Telerik Reporting. It connects the Reporting engine to a local or remote LLM and configures the behavior of the built-in Reporting AI capabilities.

## Attributes and Elements

__`<AIClient>` element__

| | |
| ------ | ------ |
|Attributes|<ul><li>__friendlyName__ - Required string attribute. Specifies the name that corresponds to the type of AI client to be used. The currently supported AI client types are: `MicrosoftExtensionsAzureAIInference`, `MicrosoftExtensionsAzureOpenAI`, `MicrosoftExtensionsOllama`, and `MicrosoftExtensionsOpenAI`.</li><li>__model__ - Required string attribute. Specifies the AI model to be used for generating responses. For example, setting the model to "gpt-4o-mini" selects the GPT-4o mini model.</li><li>__endpoint__ - Optional string attribute. If set, specifies the URL of the AI service endpoint.</li><li>__credential__ - Optional string attribute. If set, specifies the authentication credentials used to access the AI service.</li><li>__requireConsent__ - Optional boolean attribute _(true by default)_. Determines whether users must explicitly consent to the use of AI services before the AI report insights features can be used within the application.</li><li>__allowCustomPrompts__ - Optional boolean attribute _(true by default)_. Determines whether users can freely communicate with the AI model. When set to false, custom queries are forbidden and only the predefined prompts can be used.</li></ul>|
|Child Elements|<ul><li>__predefinedPrompts__ - Optional element. Defines a list of predefined prompts that the AI client can use.</li></ul>|
|Parent Element|__Telerik.Reporting__ - Configures all settings that the Telerik Reporting Engine uses.|

__`<predefinedPrompts>` element__

| | |
| ------ | ------ |
|Attributes|None|
|Child Elements|<ul><li>__add__ - Optional element. Adds a prompt to the list of predefined prompts.</li></ul>|
|Parent Element|__AIClient__|

__`<add>` element__

| | |
| ------ | ------ |
|Attributes|__text__ - The text of a predetermined AI prompt.|
|Child Elements|None|
|Parent Element|__predefinedPrompts__|

## Example

The following code example demonstrates how to configure the Reporting engine with an Azure OpenAI client that uses the `gpt-4o-mini` model. In addition, the AI functionality is restricted to two predefined prompts for summarizing and translating the report.

XML-based configuration file:

````XML
<?xml version="1.0"?>
<configuration>
<configSections>
<section name="Telerik.Reporting" type="Telerik.Reporting.Configuration.ReportingConfigurationSection, Telerik.Reporting" allowLocation="true" allowDefinition="Everywhere" />
</configSections>
<Telerik.Reporting>
<AIClient
friendlyName="MicrosoftExtensionsAzureOpenAI"
model="gpt-4o-mini"
endpoint="https://ai-explorations.openai.azure.com/"
credential="..."
requireConsent="true"
allowCustomPrompts="false">
<predefinedPrompts>
<add text ="Generate an executive summary of this report."/>
<add text ="Translate the document in German."/>
</predefinedPrompts>
</AIClient>
</Telerik.Reporting>
...
</configuration>
````

JSON-based configuration file:

````JSON
"telerikReporting": {
"AIClient": {
"friendlyName": "MicrosoftExtensionsAzureOpenAI",
"model": "gpt-4o-mini",
"endpoint": "https://ai-explorations.openai.azure.com/",
"credential": "...",
"requireConsent": true,
"allowCustomPrompts": false,
"predefinedPrompts": [
{ "text": "Generate an executive summary of this report." },
{ "text": "Translate the document in German." }
]
}
}
````

> When adding the `Telerik.Reporting` section manually, do not forget to register it in the `configSections` element of the configuration file. Failing to do so results in a [ConfigurationErrorsException](https://learn.microsoft.com/en-us/dotnet/api/system.configuration.configurationerrorsexception?view=dotnet-plat-ext-7.0) with the following text: *Configuration system failed to initialize*.

## See Also
96 changes: 96 additions & 0 deletions interactivity/AI-powered-insights.md
---
title: AI-powered insights in Report Preview
page_title: AI-powered insights in Report Preview
description: "Learn how to implement a prompt UI as part of the Web report viewer"
slug: telerikreporting/designing-reports/adding-interactivity-to-reports/ai-powered-insights
tags: telerik, reporting, ai
published: True
position: 1
---

# AI-powered insights Overview

The AI-powered insights in Report Preview provide comprehensive capabilities, including response generation, prompt creation, AI output interaction, and execution of predefined commands.

## Capabilities

* Ask AI: This functionality enables users to pose questions to the AI, facilitating interactive and dynamic responses based on the provided document context.

* Output: This feature generates outputs from the AI, including summaries, highlights, and other predefined commands, enhancing the overall productivity and efficiency of the report viewer.

## Configure the AI

| Setting | Description |
| ------ | ------ |
|friendlyName|Specifies the name corresponding to the type of AI client to use. For example, setting `friendlyName` to "MicrosoftExtensionsAzureOpenAI" indicates that the Azure OpenAI client is used.|
|model|Specifies the AI model to be used for generating responses. For example, setting the model to "gpt-4o-mini" selects the GPT-4o mini model.|
|endpoint|Specifies the URL of the AI service endpoint.|
|credential|Specifies the authentication credentials required to access the AI service. It ensures that the AI client can securely connect to the specified endpoint.|
|requireConsent|A boolean switch that determines whether users must explicitly consent to the use of AI models before the AI report insights features can be used within the application.|
|allowCustomPrompts|A boolean switch that is `true` by default. When set to `false`, only the predefined prompts can be used; any custom query (for example, "Hi") results in an exception.|
|predefinedPrompts|Specifies a list of predefined prompts that the AI client can use. Each prompt is defined by a `text` attribute that contains the prompt's content.|

__AI clients__

There are four available options for the `friendlyName` setting. The following table lists each supported value and the client library it maps to, and the JSON and XML examples after it demonstrate a complete `AIClient` configuration.

| Client library | `friendlyName` value |
| ------ | ------ |
|Microsoft.Extensions.AI.AzureAIInference|"MicrosoftExtensionsAzureAIInference"|
|Microsoft.Extensions.AI.OpenAI + Azure.AI.OpenAI|"MicrosoftExtensionsAzureOpenAI"|
|Microsoft.Extensions.AI.Ollama|"MicrosoftExtensionsOllama"|
|Microsoft.Extensions.AI.OpenAI|"MicrosoftExtensionsOpenAI"|

````JSON
{
"telerikReporting": {
"AIClient": {
"friendlyName": "MicrosoftExtensionsAzureOpenAI",
"model": "gpt-4o-mini",
"endpoint": "https://ai-explorations.openai.azure.com/",
"credential": "...",
"requireConsent": false,
"allowCustomPrompts": false,
"predefinedPrompts": [
{ "text": "Prompt 1" },
{ "text": "Prompt 2" }
]
}
}
}
````

````XML
<Telerik.Reporting>
<AIClient
friendlyName="MicrosoftExtensionsAzureOpenAI"
model="gpt-4o-mini"
endpoint="https://ai-explorations.openai.azure.com/"
credential="..."
requireConsent="false"
allowCustomPrompts="false">
<predefinedPrompts>
<add text="Prompt 1" />
<add text="Prompt 2" />
</predefinedPrompts>
</AIClient>
</Telerik.Reporting>
````

## Extensibility

* If necessary, the Reporting engine can use a custom `Telerik.Reporting.AI.IClient` implementation, registered through the `AIClientFactory` property in the Reporting REST Service configuration:

````C#
builder.Services.TryAddSingleton<IReportServiceConfiguration>(sp => new ReportServiceConfiguration
{
	HostAppId = "MyApp",
	// Plug in the factory method that returns the custom AI client.
	AIClientFactory = GetCustomAIClient,
	...
});

static Telerik.Reporting.AI.IClient GetCustomAIClient()
{
	// Return the custom Telerik.Reporting.AI.IClient implementation.
	return new MyCustomAIClient(...);
}
````

* The configured predefined prompts can be modified at runtime by overriding the `UpdateAIPrompts` method of the `ReportsController` class, as illustrated by the sketch after this list.
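
A minimal sketch of such an override is shown below. The method signature and the base class constructor used here are assumptions and may differ between Telerik Reporting versions, so consult the `ReportsControllerBase` API reference for the actual contract.

````C#
// A minimal sketch only: the UpdateAIPrompts signature is an assumption and may not
// match your Telerik Reporting version. Telerik using directives are omitted.
using System.Collections.Generic;

public class ReportsController : ReportsControllerBase
{
	public ReportsController(IReportServiceConfiguration reportServiceConfiguration)
		: base(reportServiceConfiguration)
	{
	}

	// Replaces the prompts coming from the configuration with a runtime-defined set.
	protected override void UpdateAIPrompts(IList<string> prompts)
	{
		prompts.Clear();
		prompts.Add("Generate an executive summary of this report.");
		prompts.Add("List the three most important trends in this report.");
	}
}
````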