Connect WordPress 7.0+ to Azure AI Foundry for text generation, image generation, embeddings, and more.
Works with WordPress 7 RC2. Tested with the WordPress AI features (regenerate title, regenerate summary, and generate new featured image); text-to-speech was tested with the Talking Head plugin.
- AI Client integration — registers as a WordPress 7.0 AI provider, usable via `wp_ai_client_prompt()` and Settings → Connectors.
- OpenAI-compatible — uses the Azure AI Foundry `/chat/completions` endpoint, which follows the OpenAI chat format.
- Capability detection — auto-detects deployed models and their capabilities (text generation, chat history, image generation, embeddings, text-to-speech) by probing the Azure endpoint.
- Multiple endpoint types — supports Azure AI Services (`.services.ai.azure.com`), Azure OpenAI (`.openai.azure.com`), and Cognitive Services (`.cognitiveservices.azure.com`).
- Auto-detection — discovers all deployed models via POST-based probing; no manual model name or API version configuration needed.
- Custom authentication — sends the `api-key` header required by Azure (instead of `Authorization: Bearer`).
- Endpoint validation — validates Azure endpoint URLs and shows inline errors for invalid ones.
- Environment variable fallback — every setting can be overridden via environment variables or `wp-config.php` constants.
- Connectors page UI — custom React-based connector on the Settings → Connectors page with fields for API key and endpoint URL; detected deployments and capabilities are displayed as read-only chips.
- How to Build an AI Provider Plugin for WordPress 7 — deep-dive into provider registration, settings, authentication, and the Connectors page UI.
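To illustrate the authentication difference noted above, here is a rough sketch of the request-argument shape an Azure call might use. The `build_azure_request_args()` helper is hypothetical (not part of the plugin), and the returned array follows the `wp_remote_post()` argument convention; Azure expects the key in an `api-key` header rather than `Authorization: Bearer`:

```php
// Hypothetical helper: builds wp_remote_post()-style arguments for an
// Azure /chat/completions call using the api-key header.
function build_azure_request_args( string $api_key, string $prompt ): array {
	return array(
		'headers' => array(
			'api-key'      => $api_key, // Azure-specific; no Authorization: Bearer
			'Content-Type' => 'application/json',
		),
		'body'    => json_encode( array(
			'messages' => array(
				array( 'role' => 'user', 'content' => $prompt ),
			),
		) ),
	);
}
```

The array returned by such a helper would then be passed to `wp_remote_post()` along with the deployment's endpoint URL.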
- WordPress 7.0 or later
- PHP 8.3+
- An Azure AI Foundry resource with a deployed model
- Download `ai-provider-for-azure-ai-foundry.zip`
- Upload via Plugins → Add New → Upload Plugin
- Activate via WordPress Admin → Plugins
From WordPress.org
- Go to Plugins → Add New
- Search for "AI Provider for Azure AI Foundry"
- Click Install Now and Activate
Then:
- Go to Settings → Connectors and configure the Azure AI Foundry connector:
  - API Key — your Azure AI Foundry API key.
  - Endpoint URL — e.g. `https://my-resource.services.ai.azure.com/api/projects/PROJECT-NAME`.
- Click Connect & Detect — the plugin probes your endpoint, discovers deployed models, and saves the configuration automatically.
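As a rough sketch of the endpoint validation mentioned in the features list, a check against the three supported Azure endpoint families might look like the following; `is_supported_azure_endpoint()` is illustrative, not the plugin's actual implementation:

```php
// Illustrative check: accepts only hosts under the three Azure endpoint
// families the plugin supports.
function is_supported_azure_endpoint( string $url ): bool {
	$host = parse_url( $url, PHP_URL_HOST );
	if ( ! is_string( $host ) ) {
		return false;
	}
	$suffixes = array(
		'.services.ai.azure.com',
		'.openai.azure.com',
		'.cognitiveservices.azure.com',
	);
	foreach ( $suffixes as $suffix ) {
		if ( str_ends_with( $host, $suffix ) ) {
			return true;
		}
	}
	return false;
}

var_dump( is_supported_azure_endpoint( 'https://my-resource.services.ai.azure.com/api/projects/demo' ) ); // bool(true)
var_dump( is_supported_azure_endpoint( 'https://example.com' ) ); // bool(false)
```

`str_ends_with()` requires PHP 8.0+, which is covered by the plugin's PHP 8.3+ requirement.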
Settings can also be provided via environment variables or constants in wp-config.php:
| Setting | Environment Variable | wp-config.php Constant |
|---|---|---|
| API Key | `AZURE_AI_FOUNDRY_API_KEY` | `AZURE_AI_FOUNDRY_API_KEY` |
| Endpoint | `AZURE_AI_FOUNDRY_ENDPOINT` | `AZURE_AI_FOUNDRY_ENDPOINT` |
| Model Names | `AZURE_AI_FOUNDRY_MODEL` | `AZURE_AI_FOUNDRY_MODEL` |
| Capabilities | `AZURE_AI_FOUNDRY_CAPABILITIES` | `AZURE_AI_FOUNDRY_CAPABILITIES` |
Model names and capabilities are normally auto-detected. Use these overrides only when you need to pin specific values. Model names accept comma-separated deployment names, e.g. `gpt-4.1,gpt-image-1`. Capabilities accept a comma-separated string, e.g. `text_generation,chat_history,image_generation`.
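For example, to pin everything via `wp-config.php` constants (the key and endpoint values below are placeholders):

```php
// In wp-config.php, above the "That's all, stop editing!" line.
define( 'AZURE_AI_FOUNDRY_API_KEY', 'your-api-key' );
define( 'AZURE_AI_FOUNDRY_ENDPOINT', 'https://my-resource.services.ai.azure.com/api/projects/PROJECT-NAME' );

// Optional overrides; both are normally auto-detected.
define( 'AZURE_AI_FOUNDRY_MODEL', 'gpt-4.1,gpt-image-1' );
define( 'AZURE_AI_FOUNDRY_CAPABILITIES', 'text_generation,chat_history,image_generation' );
```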
Once configured, the provider is available to any code using the WordPress AI Client:
```php
// Text generation
$text = wp_ai_client_prompt( 'Explain gravity in one sentence.' )->generate_text();
echo $text;

// Image generation
$image = wp_ai_client_prompt( 'A tiny blue cat on a cloud' )->generate_image();

// Text-to-speech
$audio = wp_ai_client_prompt( 'Hello world' )->convert_text_to_speech();
```

Development:

```sh
npm install
npm run build       # Production build
npm run start       # Watch mode
npm run test        # Run Vitest tests
npm run test:watch  # Interactive watch mode
```

File structure:

```
azure-ai-foundry/
├── ai-provider-for-azure-ai-foundry.php   ← Main plugin file
├── src/
│   ├── autoload.php                       ← PSR-4 autoloader
│   ├── Provider/                          ← AI Client provider
│   ├── Models/                            ← Text, image, embedding & TTS models
│   ├── Metadata/                          ← Model metadata & capabilities
│   ├── Http/                              ← api-key authentication
│   ├── Rest/                              ← REST API (capability detection)
│   ├── Settings/                          ← Connector settings + manager
│   └── js/connectors.js                   ← Connectors page UI (source)
├── build/connectors.js                    ← Compiled ESM module
├── tests/js/                              ← Vitest tests
├── webpack.config.js                      ← ESM output config
└── vitest.config.js                       ← Test config
```
GPL-2.0-or-later