Added in: 5.14.0

Prowler Lighthouse AI supports multiple Large Language Model (LLM) providers, offering the flexibility to choose the provider that best fits your infrastructure, compliance requirements, and cost considerations. This guide explains how to configure and use different LLM providers with Lighthouse AI.

Supported Providers

Lighthouse AI supports the following LLM providers:
  • OpenAI: Provides access to GPT models (GPT-4o, GPT-4, etc.)
  • Amazon Bedrock: Offers AWS-hosted access to Claude, Llama, Titan, and other models
  • OpenAI Compatible: Supports custom endpoints like OpenRouter, Ollama, or any OpenAI-compatible service
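For OpenAI-compatible services, the only integration difference is the base URL the requests are sent to. As a minimal illustration (the localhost Ollama URL, model name, and placeholder key below are assumptions for the sketch, not Lighthouse defaults), an OpenAI-style chat request can be prepared against a custom endpoint like this:

```python
import json
import urllib.request

# Hypothetical OpenAI-compatible endpoint (e.g. a local Ollama server).
# Lighthouse AI only requires that the service speak this request shape.
BASE_URL = "http://localhost:11434/v1"

payload = {
    "model": "llama3.1",  # assumed model name, for illustration only
    "messages": [{"role": "user", "content": "List my failing checks."}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Ollama ignores the key; other compatible services require a real one
        "Authorization": "Bearer dummy-key",
    },
    method="POST",
)

# The request is only constructed here, not sent.
print(request.full_url)  # http://localhost:11434/v1/chat/completions
```

Any service exposing this `/chat/completions` contract (OpenRouter, Ollama, a self-hosted gateway) can be connected the same way.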

Model Requirements

For Lighthouse AI to work properly, models must support all of the following capabilities:
  • Text input: Ability to receive text prompts.
  • Text output: Ability to generate text responses.
  • Tool calling: Ability to invoke tools and functions.
If any of these capabilities are missing, the model will not be compatible with Lighthouse AI.
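The three capabilities map directly onto the OpenAI-style chat completions request shape. Below is a minimal sketch of a request body that exercises all three; the `list_findings` tool and its schema are hypothetical, chosen only to illustrate the structure:

```python
import json

# A chat request exercising the three required capabilities:
# text input ("messages"), text output (the model's reply), and
# tool calling (the "tools" array the model may choose to invoke).
request_body = {
    "model": "gpt-5",  # the model this guide recommends
    "messages": [
        {"role": "user", "content": "Which accounts have critical findings?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "list_findings",  # hypothetical tool, for illustration
                "description": "Return findings filtered by severity.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "severity": {
                            "type": "string",
                            "enum": ["low", "medium", "high", "critical"],
                        }
                    },
                    "required": ["severity"],
                },
            },
        }
    ],
}

# A model without tool-calling support will reject or ignore "tools",
# which breaks Lighthouse AI's ability to fetch scan data.
print(len(json.dumps(request_body)))
```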

How Default Providers Work

All three providers can be configured for a tenant, but only one can be set as the default provider. The first configured provider automatically becomes the default. When visiting Lighthouse AI chat, the default provider's default model loads automatically. Users can switch to any available LLM model (including those from non-default providers) using the dropdown in chat.

Switch models in Lighthouse AI chat interface
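The default-provider rule above can be sketched as a small registry (this is an illustrative model, not Prowler's actual implementation, and the model identifiers are placeholders):

```python
class ProviderRegistry:
    """Sketch of the default-provider rule: the first connected
    provider becomes the default until one is set explicitly."""

    def __init__(self):
        self.providers = {}
        self.default_provider = None

    def connect(self, name, default_model):
        self.providers[name] = {"default_model": default_model}
        if self.default_provider is None:
            # The first configured provider automatically becomes the default
            self.default_provider = name

    def set_default(self, name):
        if name not in self.providers:
            raise ValueError(f"{name!r} is not a connected provider")
        self.default_provider = name


registry = ProviderRegistry()
registry.connect("openai", "gpt-5")
registry.connect("bedrock", "example-claude-model")  # hypothetical model id
print(registry.default_provider)  # openai, because it was connected first
registry.set_default("bedrock")
print(registry.default_provider)  # bedrock
```

Regardless of which provider is the default, any connected provider's models remain selectable from the chat dropdown.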

Configuring Providers

Navigate to Configuration → Lighthouse AI to see all three provider options, each with a Connect button.

Prowler Lighthouse Configuration

Connecting a Provider

To connect a provider:
  1. Click Connect under the desired provider
  2. Enter the required credentials
  3. Select a default model for that provider
  4. Click Connect to save

OpenAI

Required Information

  • API Key: OpenAI API key (starts with sk- or sk-proj-). API keys can be created from the OpenAI platform.
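As a quick sanity check before pasting a key, the expected prefixes can be verified locally. This only checks the format noted above; it does not confirm the key is active or funded (the helper name is illustrative, not part of Prowler):

```python
def looks_like_openai_key(key: str) -> bool:
    """Format-only check: OpenAI keys start with sk- or sk-proj-.
    A passing check does not mean the key is valid or has credits."""
    return key.startswith("sk-")  # sk-proj- keys also begin with sk-


print(looks_like_openai_key("sk-proj-abc123"))  # True
print(looks_like_openai_key("my-other-key"))    # False
```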

Before Connecting

  • Ensure the OpenAI account has sufficient credits.
  • Verify that the gpt-5 model (recommended for Lighthouse AI) is not blocked in the OpenAI organization settings.

Changing the Default Provider

To set a different provider as the default:
  1. Navigate to Configuration → Lighthouse AI
  2. Click Configure under the provider you want as the default
  3. Click Set as Default
Set default LLM provider

Updating Provider Credentials

To update credentials for a connected provider:
  1. Navigate to Configuration → Lighthouse AI
  2. Click Configure under the provider
  3. Enter the new credentials
  4. Click Update

Deleting a Provider

To remove a configured provider:
  1. Navigate to Configuration → Lighthouse AI
  2. Click Configure under the provider
  3. Click Delete

Model Recommendations

For best results with Lighthouse AI, the recommended model is gpt-5 from OpenAI. Models from other providers such as Amazon Bedrock and OpenAI Compatible endpoints can be connected and used, but performance is not guaranteed. Ensure that any selected model supports text input, text output, and tool calling capabilities.

Getting Help

For issues or suggestions, reach out through our Slack channel.