The SDK supports 9 built-in LLM providers and allows custom provider configuration for any OpenAI- or Anthropic-compatible endpoint.

Model String Format

All model strings use the format provider/model-id:
"anthropic/claude-sonnet-4-20250514"
"openai/gpt-4o"
"google/gemini-1.5-pro"

Built-in Providers

Anthropic

| Property | Value |
| --- | --- |
| Prefix | anthropic/ |
| Environment Variable | ANTHROPIC_API_KEY |

Models

| Model ID | Description |
| --- | --- |
| claude-sonnet-4-20250514 | Latest Sonnet (recommended) |
| claude-3-5-sonnet-20241022 | Claude 3.5 Sonnet |
| claude-3-haiku-20240307 | Fast, efficient |
| claude-3-opus-20240229 | Most capable |

Example

const agent = new TestAgent({
  tools,
  model: "anthropic/claude-sonnet-4-20250514",
  apiKey: process.env.ANTHROPIC_API_KEY,
});

OpenAI

| Property | Value |
| --- | --- |
| Prefix | openai/ |
| Environment Variable | OPENAI_API_KEY |

Models

| Model ID | Description |
| --- | --- |
| gpt-4o | Latest GPT-4 Omni (recommended) |
| gpt-4o-mini | Fast, cost-effective |
| gpt-4-turbo | GPT-4 Turbo |
| gpt-4 | Original GPT-4 |
| gpt-3.5-turbo | Fast, economical |

Example

const agent = new TestAgent({
  tools,
  model: "openai/gpt-4o",
  apiKey: process.env.OPENAI_API_KEY,
});

Azure OpenAI

| Property | Value |
| --- | --- |
| Prefix | azure/ |
| Environment Variables | AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT |

Environment Variables

| Variable | Required | Description |
| --- | --- | --- |
| AZURE_OPENAI_API_KEY | Yes | API key |
| AZURE_OPENAI_ENDPOINT | Yes | Endpoint URL |
| AZURE_OPENAI_API_VERSION | No | API version |

Example

// Set environment variables first:
// AZURE_OPENAI_API_KEY=...
// AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com

const agent = new TestAgent({
  tools,
  model: "azure/gpt-4o",
  apiKey: process.env.AZURE_OPENAI_API_KEY,
});
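
If your deployment requires a specific API version, the optional AZURE_OPENAI_API_VERSION variable can be set alongside the other two (the version value below is only an example; use whatever version your Azure resource supports):

// AZURE_OPENAI_API_KEY=...
// AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
// AZURE_OPENAI_API_VERSION=2024-02-01  // example value only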

Google (Gemini)

| Property | Value |
| --- | --- |
| Prefix | google/ |
| Environment Variable | GOOGLE_GENERATIVE_AI_API_KEY |

Models

| Model ID | Description |
| --- | --- |
| gemini-1.5-pro | Most capable (recommended) |
| gemini-1.5-flash | Fast, efficient |
| gemini-pro | Original Gemini Pro |

Example

const agent = new TestAgent({
  tools,
  model: "google/gemini-1.5-pro",
  apiKey: process.env.GOOGLE_GENERATIVE_AI_API_KEY,
});

Mistral

| Property | Value |
| --- | --- |
| Prefix | mistral/ |
| Environment Variable | MISTRAL_API_KEY |

Models

| Model ID | Description |
| --- | --- |
| mistral-large-latest | Most capable |
| mistral-medium-latest | Balanced |
| mistral-small-latest | Fast, economical |
| open-mistral-7b | Open source |
| open-mixtral-8x7b | Open source MoE |

Example

const agent = new TestAgent({
  tools,
  model: "mistral/mistral-large-latest",
  apiKey: process.env.MISTRAL_API_KEY,
});

DeepSeek

| Property | Value |
| --- | --- |
| Prefix | deepseek/ |
| Environment Variable | DEEPSEEK_API_KEY |

Models

| Model ID | Description |
| --- | --- |
| deepseek-chat | General chat |
| deepseek-coder | Code-focused |

Example

const agent = new TestAgent({
  tools,
  model: "deepseek/deepseek-chat",
  apiKey: process.env.DEEPSEEK_API_KEY,
});

Ollama (Local)

| Property | Value |
| --- | --- |
| Prefix | ollama/ |
| Default URL | http://localhost:11434 |

Models

Any locally installed model:
  • llama3
  • codellama
  • mistral
  • mixtral
  • etc.

Ensure Ollama is running locally before use. The API key parameter is required but not used; pass any string.

Example

const agent = new TestAgent({
  tools,
  model: "ollama/llama3",
  apiKey: "ollama", // Required but not used
});
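
Because the SDK talks to a locally running server, a quick preflight request can catch a missing Ollama instance early. A minimal sketch, assuming a global fetch and Ollama's /api/tags endpoint (which lists locally installed models):

// Illustrative preflight check against the default Ollama URL.
const ollamaUrl = "http://localhost:11434";
const res = await fetch(`${ollamaUrl}/api/tags`);
if (!res.ok) {
  throw new Error(`Ollama is not reachable at ${ollamaUrl}`);
}

const agent = new TestAgent({
  tools,
  model: "ollama/llama3",
  apiKey: "ollama", // Required but not used
});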

OpenRouter

| Property | Value |
| --- | --- |
| Prefix | openrouter/ |
| Environment Variable | OPENROUTER_API_KEY |

Models

Access many models through one API:

| Model ID | Description |
| --- | --- |
| anthropic/claude-3-opus | Claude 3 Opus via OpenRouter |
| openai/gpt-4-turbo | GPT-4 Turbo via OpenRouter |
| google/gemini-pro | Gemini via OpenRouter |
| meta-llama/llama-3-70b-instruct | Llama 3 70B |

Example

const agent = new TestAgent({
  tools,
  model: "openrouter/anthropic/claude-3-opus",
  apiKey: process.env.OPENROUTER_API_KEY,
});

xAI (Grok)

| Property | Value |
| --- | --- |
| Prefix | xai/ |
| Environment Variable | XAI_API_KEY |

Models

| Model ID | Description |
| --- | --- |
| grok-beta | Grok beta |

Example

const agent = new TestAgent({
  tools,
  model: "xai/grok-beta",
  apiKey: process.env.XAI_API_KEY,
});

Custom Providers

Add your own OpenAI- or Anthropic-compatible endpoints.

CustomProvider Type

interface CustomProvider {
  name: string;
  protocol: "openai-compatible" | "anthropic-compatible";
  baseUrl: string;
  modelIds: string[];
  useChatCompletions?: boolean;
  apiKeyEnvVar?: string;
}

Properties

| Property | Type | Required | Description |
| --- | --- | --- | --- |
| name | string | Yes | Provider identifier (used in model string) |
| protocol | string | Yes | "openai-compatible" or "anthropic-compatible" |
| baseUrl | string | Yes | API endpoint URL |
| modelIds | string[] | Yes | Available model IDs |
| useChatCompletions | boolean | No | Use /chat/completions endpoint |
| apiKeyEnvVar | string | No | Custom env var name for API key |

OpenAI-Compatible Example

const agent = new TestAgent({
  tools,
  model: "my-openai/gpt-4",
  apiKey: process.env.MY_API_KEY,
  customProviders: {
    "my-openai": {
      name: "my-openai",
      protocol: "openai-compatible",
      baseUrl: "https://api.my-provider.com/v1",
      modelIds: ["gpt-4", "gpt-3.5-turbo"],
    },
  },
});

Anthropic-Compatible Example

const agent = new TestAgent({
  tools,
  model: "my-anthropic/claude-3-sonnet",
  apiKey: process.env.MY_API_KEY,
  customProviders: {
    "my-anthropic": {
      name: "my-anthropic",
      protocol: "anthropic-compatible",
      baseUrl: "https://api.my-provider.com",
      modelIds: ["claude-3-sonnet", "claude-3-haiku"],
    },
  },
});

LiteLLM Example

const agent = new TestAgent({
  tools,
  model: "litellm/gpt-4",
  apiKey: process.env.LITELLM_API_KEY,
  customProviders: {
    litellm: {
      name: "litellm",
      protocol: "openai-compatible",
      baseUrl: "http://localhost:8000",
      modelIds: ["gpt-4", "claude-3-sonnet", "gemini-pro"],
      useChatCompletions: true, // Required for LiteLLM
    },
  },
});
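
The apiKeyEnvVar property from the table above is not used in these examples. A hypothetical sketch, assuming the SDK reads the key from the named environment variable so apiKey can be omitted; the provider name, URL, and variable name here are placeholders:

const agent = new TestAgent({
  tools,
  model: "my-gateway/gpt-4o",
  customProviders: {
    "my-gateway": {
      name: "my-gateway",
      protocol: "openai-compatible",
      baseUrl: "https://gateway.example.com/v1",
      modelIds: ["gpt-4o"],
      apiKeyEnvVar: "MY_GATEWAY_API_KEY", // assumed: key resolved from this variable
    },
  },
});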

Helper Functions

parseLLMString()

Parse a model string into provider and model ID.

import { parseLLMString } from "@mcpjam/sdk";

parseLLMString(modelString: string): { provider: string; modelId: string }

Example

const { provider, modelId } = parseLLMString("anthropic/claude-sonnet-4-20250514");
// provider: "anthropic"
// modelId: "claude-sonnet-4-20250514"

createModelFromString()

Create a Vercel AI SDK model directly.

import { createModelFromString } from "@mcpjam/sdk";

createModelFromString(
  llmString: string,
  options: CreateModelOptions
): ProviderLanguageModel

Example

const model = createModelFromString(
  "openai/gpt-4o",
  { apiKey: process.env.OPENAI_API_KEY }
);
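
Since the return value is a Vercel AI SDK model, it should work anywhere that SDK accepts a model. A minimal sketch, assuming the ai package's generateText function:

import { generateText } from "ai";
import { createModelFromString } from "@mcpjam/sdk";

const model = createModelFromString("openai/gpt-4o", {
  apiKey: process.env.OPENAI_API_KEY,
});

// Assumption: the model plugs directly into the Vercel AI SDK.
const { text } = await generateText({
  model,
  prompt: "Say hello in one sentence.",
});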

createCustomProvider()

Create a custom provider configuration object.

import { createCustomProvider } from "@mcpjam/sdk";

createCustomProvider(config: CustomProvider): CustomProvider

Example

const provider = createCustomProvider({
  name: "my-litellm",
  protocol: "openai-compatible",
  baseUrl: "http://localhost:8000",
  modelIds: ["gpt-4", "claude-3-sonnet"],
  useChatCompletions: true,
});
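
The returned configuration can then be passed to an agent through customProviders, mirroring the LiteLLM example above:

const agent = new TestAgent({
  tools,
  model: "my-litellm/gpt-4",
  apiKey: process.env.LITELLM_API_KEY,
  customProviders: {
    "my-litellm": provider, // the object created above
  },
});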

Environment Variables Summary

| Provider | Variable |
| --- | --- |
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| Azure | AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT |
| Google | GOOGLE_GENERATIVE_AI_API_KEY |
| Mistral | MISTRAL_API_KEY |
| DeepSeek | DEEPSEEK_API_KEY |
| OpenRouter | OPENROUTER_API_KEY |
| xAI | XAI_API_KEY |
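
A simple guard can fail fast when a required variable is missing (illustrative, not part of the SDK):

const apiKey = process.env.ANTHROPIC_API_KEY;
if (!apiKey) {
  throw new Error("ANTHROPIC_API_KEY is not set");
}

const agent = new TestAgent({
  tools,
  model: "anthropic/claude-sonnet-4-20250514",
  apiKey,
});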