Large Language Models (LLMs)
Overview
The Xshield platform supports integration with Large Language Models (LLMs) to enhance its capabilities with Generative AI features. By connecting to external LLM providers, Xshield can leverage advanced natural language processing for various security and operational tasks.
SaaS Default Behavior
For SaaS tenants, Generative AI capabilities are enabled by default using a platform-provided LLM. No additional configuration is required to use these standard features.
Bring Your Own LLM (BYO-LLM) & On-Premises
This integration page is intended for customers who wish to:
- Override the default SaaS LLM with their own provider.
- Enable Generative AI for On-Premises deployments where a default LLM is not provided.
- Enforce strict privacy or governance policies by using a private or self-hosted model (e.g., via Azure OpenAI).
Disabling Generative AI
Generative AI capabilities can be disabled at the tenant level. If your organization's compliance or privacy policies prohibit the use of Generative AI, please contact your administrator or support to disable this feature for your tenant.
Supported Providers
Currently, Xshield supports integration with the following providers for BYO-LLM scenarios:
- OpenAI
- Azure OpenAI
Prerequisites
Before configuring the integration, ensure you have the necessary accounts and credentials from your LLM provider.
OpenAI
- An active OpenAI account.
- A valid API Key.
- The name of the Model you wish to use (e.g., gpt-4, gpt-3.5-turbo); the sketch below shows one way to list the models your key can access.
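If you want to confirm the key works and find the exact model name before configuring Xshield, the following is a minimal sketch. It assumes the official openai Python package, which is not required by Xshield itself; the key shown is a placeholder.

```python
# Minimal sketch: verify an OpenAI API key and list the models it can access.
# Assumes the official `openai` package (pip install openai); this is only an
# optional convenience check, not part of the Xshield configuration.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # placeholder; use your real API key

# A successful call confirms the key is valid; the printed IDs are the model
# names you can enter in Xshield (e.g., gpt-4, gpt-3.5-turbo).
for model in client.models.list():
    print(model.id)
```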
Azure OpenAI
- An active Azure subscription with OpenAI Service enabled.
- A valid Azure API Key.
- The Azure Endpoint URL.
- The Azure Deployment name.
- The Azure API Version. The sketch below shows how these four values combine into a single request.
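For reference, here is how the four Azure values fit together in one request to your deployment. This is a minimal sketch using the requests package with placeholder values; it is only an optional check and is not part of the Xshield configuration itself.

```python
# Minimal sketch: how the Azure OpenAI values combine into a single request.
# Uses the `requests` package (pip install requests); all values are placeholders.
import requests

endpoint = "https://<your-resource>.openai.azure.com"  # Azure Endpoint URL
deployment = "<your-deployment-name>"                  # Azure Deployment name
api_version = "<api-version>"                          # Azure API Version
api_key = "<your-azure-api-key>"                       # Azure API Key

resp = requests.post(
    f"{endpoint}/openai/deployments/{deployment}/chat/completions",
    params={"api-version": api_version},
    headers={"api-key": api_key},
    json={"messages": [{"role": "user", "content": "ping"}], "max_tokens": 5},
    timeout=30,
)
# A 401 or 404 here usually points to a wrong key, endpoint, or deployment name.
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```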
Integration
The following procedure outlines how to configure the LLM integration in Xshield.
- Go to the Integrations page in Xshield.
- Navigate to the GenAI category of integrations.
- Select the provider you wish to configure (OpenAI or Azure OpenAI).
Configuring OpenAI
- Select OpenAI as the provider.
- Enter the API Key generated from your OpenAI account dashboard.
- Specify the Model to be used.
- Click Save to validate the configuration and enable the integration. If validation fails, the optional pre-check below exercises the same key and model directly against OpenAI.
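The following is a minimal sketch of that pre-check, assuming the official openai Python package and placeholder values; it simply sends one short request with the same key and model you entered in the form.

```python
# Minimal sketch: test the exact key/model pair entered in Xshield.
# Assumes the official `openai` package; values are placeholders.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # the API Key entered in Xshield
resp = client.chat.completions.create(
    model="gpt-4",  # the Model entered in Xshield
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=5,
)
print(resp.choices[0].message.content)  # any reply confirms the key/model pair works
```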
Configuring Azure OpenAI
- Select Azure OpenAI as the provider.
- Enter the Azure API Key from your Azure portal.
- Enter the Azure Endpoint URL.
- Specify the Azure Deployment name corresponding to your deployed model.
- Specify the Azure API Version.
- Click Save to validate the configuration and enable the integration. If validation fails, the optional pre-check below exercises the same four values directly against Azure.
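A minimal sketch of that pre-check, assuming the official openai Python package and placeholder values. Note that the client expects the Deployment name, not the underlying base model name, in the model field.

```python
# Minimal sketch: exercise the same four Azure values entered in Xshield.
# Assumes the official `openai` package; all values are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<your-azure-api-key>",                             # Azure API Key
    azure_endpoint="https://<your-resource>.openai.azure.com",  # Azure Endpoint URL
    api_version="<api-version>",                                 # Azure API Version
)
resp = client.chat.completions.create(
    model="<your-deployment-name>",  # Azure Deployment name (not the base model name)
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=5,
)
print(resp.choices[0].message.content)  # any reply confirms the configured values
```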
Once the integration is enabled, Xshield uses the configured LLM provider for its Generative AI features.