feat: Add Official Microsoft & Gemini Skills (845+ Total)
🚀 Impact

Significantly expands the capabilities of **Antigravity Awesome Skills** by integrating official skill collections from **Microsoft** and **Google Gemini**. This update raises the total skill count to **845+**, making the library even more comprehensive for AI coding assistants.

✨ Key Changes

1. New Official Skills

- **Microsoft Skills**: Added a large collection of official skills from [microsoft/skills](https://github.com/microsoft/skills).
  - Includes Azure, .NET, Python, TypeScript, and Semantic Kernel skills.
  - Preserves the original directory structure under `skills/official/microsoft/`.
  - Includes plugin skills from the `.github/plugins` directory.
- **Gemini Skills**: Added official Gemini API development skills under `skills/gemini-api-dev/`.

2. New Scripts & Tooling

- **`scripts/sync_microsoft_skills.py`**: A robust synchronization script that:
  - Clones the official Microsoft repository.
  - Preserves the original directory hierarchy.
  - Handles symlinks and plugin locations.
  - Generates attribution metadata.
- **`scripts/tests/inspect_microsoft_repo.py`**: Debug tool to inspect the remote repository structure.
- **`scripts/tests/test_comprehensive_coverage.py`**: Verification script to ensure 100% of skills are captured during sync.

3. Core Improvements

- **`scripts/generate_index.py`**: Enhanced frontmatter parsing to safely handle unquoted values containing `@` symbols and commas (fixing issues with some Microsoft skill descriptions).
- **`package.json`**: Added `sync:microsoft` and `sync:all-official` scripts for easy maintenance.

4. Documentation

- Updated `README.md` to reflect the new skill count (845+) and added Microsoft/Gemini to the provider list.
- Updated `CATALOG.md` and `skills_index.json` with the new skills.

🧪 Verification

- Ran `scripts/tests/test_comprehensive_coverage.py` to verify all Microsoft skills are detected.
- Validated the `generate_index.py` fixes by successfully indexing the new skills.
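To illustrate the `generate_index.py` fix described under Core Improvements, here is a minimal sketch of tolerant frontmatter parsing. This is a hypothetical stand-alone helper, not the actual repository code: strict YAML reserves `@` as an indicator character at the start of an unquoted scalar (so a description like `(@azure/cosmos) ...` can fail a strict parse), and a plain line-based key/value split keeps such values intact as strings.

```python
# Hypothetical sketch of tolerant frontmatter parsing; the real logic in
# scripts/generate_index.py may differ. A plain key/value split avoids
# YAML errors on unquoted values that start with "@" or contain commas.
import re

def parse_frontmatter(text: str) -> dict:
    """Parse the leading '---' block of a skill file into a flat dict."""
    match = re.match(r"\A---\s*\n(.*?)\n---\s*\n", text, re.DOTALL)
    if not match:
        return {}
    fields = {}
    for line in match.group(1).splitlines():
        if ":" not in line or line.lstrip().startswith("#"):
            continue
        key, _, value = line.partition(":")
        value = value.strip()
        # Strip optional surrounding quotes; otherwise keep the raw value,
        # so "(@azure/cosmos)" or "a, b, c" survive as plain strings.
        if len(value) >= 2 and value[0] == value[-1] and value[0] in "'\"":
            value = value[1:-1]
        fields[key.strip()] = value
    return fields

doc = """---
name: cosmosdb
description: Data plane SDK (@azure/cosmos) for CRUD, queries, containers
---
# Skill body
"""
print(parse_frontmatter(doc)["description"])
# → Data plane SDK (@azure/cosmos) for CRUD, queries, containers
```

With the new `package.json` entries, the sync itself runs via `npm run sync:microsoft` (or `npm run sync:all-official`), after which the index is regenerated over the synced tree.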
Changed file: CATALOG.md (191 changed lines)
@@ -1,10 +1,10 @@
 # Skill Catalog
 
-Generated at: 2026-02-08T00:00:00.000Z
+Generated at: 2026-02-11T14:27:51.213Z
 
-Total skills: 715
+Total skills: 844
 
-## architecture (63)
+## architecture (64)
 
 | Skill | Description | Tags | Triggers |
 | --- | --- | --- | --- |
@@ -51,6 +51,7 @@ Total skills: 715
 | `n8n-mcp-tools-expert` | Expert guide for using n8n-mcp MCP tools effectively. Use when searching for nodes, validating configurations, accessing templates, managing workflows, or us... | n8n, mcp | n8n, mcp, effectively, searching, nodes, validating, configurations, accessing, managing, any, provides, selection |
 | `nestjs-expert` | Nest.js framework expert specializing in module architecture, dependency injection, middleware, guards, interceptors, testing with Jest/Supertest, TypeORM/Mo... | nestjs | nestjs, nest, js, framework, specializing, module, architecture, dependency, injection, middleware, guards, interceptors |
 | `nx-workspace-patterns` | Configure and optimize Nx monorepo workspaces. Use when setting up Nx, configuring project boundaries, optimizing build caching, or implementing affected com... | nx, workspace | nx, workspace, configure, optimize, monorepo, workspaces, setting, up, configuring, boundaries, optimizing, caching |
+| `official/microsoft/plugins/wiki-architect` | Analyzes code repositories and generates hierarchical documentation structures with onboarding guides. Use when the user wants to create a wiki, generate doc... | official/microsoft/plugins/wiki | official/microsoft/plugins/wiki, wiki, architect, analyzes, code, repositories, generates, hierarchical, documentation, structures, onboarding, guides |
 | `on-call-handoff-patterns` | Master on-call shift handoffs with context transfer, escalation procedures, and documentation. Use when transitioning on-call responsibilities, documenting s... | on, call, handoff | on, call, handoff, shift, handoffs, context, transfer, escalation, procedures, documentation, transitioning, responsibilities |
 | `parallel-agents` | Multi-agent orchestration patterns. Use when multiple independent tasks can run with different domain expertise or when comprehensive analysis requires multi... | parallel, agents | parallel, agents, multi, agent, orchestration, multiple, independent, tasks, run, different, domain, expertise |
 | `powershell-windows` | PowerShell Windows patterns. Critical pitfalls, operator syntax, error handling. | powershell, windows | powershell, windows, critical, pitfalls, operator, syntax, error, handling |
@@ -115,7 +116,7 @@ Total skills: 715
 | `team-composition-analysis` | This skill should be used when the user asks to "plan team structure", "determine hiring needs", "design org chart", "calculate compensation", "plan equity a... | team, composition | team, composition, analysis, skill, should, used, user, asks, plan, structure, determine, hiring |
 | `whatsapp-automation` | Automate WhatsApp Business tasks via Rube MCP (Composio): send messages, manage templates, upload media, and handle contacts. Always search tools first for c... | whatsapp | whatsapp, automation, automate, business, tasks, via, rube, mcp, composio, send, messages, upload |
 
-## data-ai (99)
+## data-ai (153)
 
 | Skill | Description | Tags | Triggers |
 | --- | --- | --- | --- |
@@ -181,6 +182,78 @@ Total skills: 715
 | `nextjs-app-router-patterns` | Master Next.js 14+ App Router with Server Components, streaming, parallel routes, and advanced data fetching. Use when building Next.js applications, impleme... | nextjs, app, router | nextjs, app, router, next, js, 14, server, components, streaming, parallel, routes, data |
 | `nextjs-best-practices` | Next.js App Router principles. Server Components, data fetching, routing patterns. | nextjs, best, practices | nextjs, best, practices, next, js, app, router, principles, server, components, data, fetching |
 | `nodejs-backend-patterns` | Build production-ready Node.js backend services with Express/Fastify, implementing middleware patterns, error handling, authentication, database integration,... | nodejs, backend | nodejs, backend, node, js, express, fastify, implementing, middleware, error, handling, authentication, database |
+| `official/microsoft/dotnet/data/cosmosdb` | Azure Resource Manager SDK for Cosmos DB in .NET. Use for MANAGEMENT PLANE operations: creating/managing Cosmos DB accounts, databases, containers, throughpu... | official/microsoft/dotnet/data/cosmosdb | official/microsoft/dotnet/data/cosmosdb, azure, resource, manager, cosmosdb, dotnet, sdk, cosmos, db, net, plane, operations |
+| `official/microsoft/dotnet/data/mysql` | Azure MySQL Flexible Server SDK for .NET. Database management for MySQL Flexible Server deployments. Use for creating servers, databases, firewall rules, con... | official/microsoft/dotnet/data/mysql | official/microsoft/dotnet/data/mysql, azure, resource, manager, mysql, dotnet, flexible, server, sdk, net, database, deployments |
+| `official/microsoft/dotnet/data/postgresql` | Azure PostgreSQL Flexible Server SDK for .NET. Database management for PostgreSQL Flexible Server deployments. Use for creating servers, databases, firewall ... | official/microsoft/dotnet/data/postgresql | official/microsoft/dotnet/data/postgresql, azure, resource, manager, postgresql, dotnet, flexible, server, sdk, net, database, deployments |
+| `official/microsoft/dotnet/data/redis` | Azure Resource Manager SDK for Redis in .NET. Use for MANAGEMENT PLANE operations: creating/managing Azure Cache for Redis instances, firewall rules, access ... | official/microsoft/dotnet/data/redis | official/microsoft/dotnet/data/redis, azure, resource, manager, redis, dotnet, sdk, net, plane, operations, creating, managing |
+| `official/microsoft/dotnet/data/sql` | Azure Resource Manager SDK for Azure SQL in .NET. Use for MANAGEMENT PLANE operations: creating/managing SQL servers, databases, elastic pools, firewall rule... | official/microsoft/dotnet/data/sql | official/microsoft/dotnet/data/sql, azure, resource, manager, sql, dotnet, sdk, net, plane, operations, creating, managing |
+| `official/microsoft/dotnet/foundry/document-intelligence` | Azure AI Document Intelligence SDK for .NET. Extract text, tables, and structured data from documents using prebuilt and custom models. Use for invoice proce... | official/microsoft/dotnet/foundry/document, intelligence | official/microsoft/dotnet/foundry/document, intelligence, azure, ai, document, dotnet, sdk, net, extract, text, tables, structured |
+| `official/microsoft/dotnet/foundry/openai` | Azure OpenAI SDK for .NET. Client library for Azure OpenAI and OpenAI services. Use for chat completions, embeddings, image generation, audio transcription, ... | official/microsoft/dotnet/foundry/openai | official/microsoft/dotnet/foundry/openai, azure, ai, openai, dotnet, sdk, net, client, library, chat, completions, embeddings |
+| `official/microsoft/dotnet/foundry/projects` | Azure AI Projects SDK for .NET. High-level client for Azure AI Foundry projects including agents, connections, datasets, deployments, evaluations, and indexe... | official/microsoft/dotnet/foundry/projects | official/microsoft/dotnet/foundry/projects, azure, ai, dotnet, sdk, net, high, level, client, foundry, including, agents |
+| `official/microsoft/dotnet/foundry/search-documents` | Azure AI Search SDK for .NET (Azure.Search.Documents). Use for building search applications with full-text, vector, semantic, and hybrid search. Covers Searc... | official/microsoft/dotnet/foundry/search, documents | official/microsoft/dotnet/foundry/search, documents, azure, search, dotnet, ai, sdk, net, building, applications, full, text |
+| `official/microsoft/dotnet/foundry/voicelive` | Azure AI Voice Live SDK for .NET. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants, conversational ... | official/microsoft/dotnet/foundry/voicelive | official/microsoft/dotnet/foundry/voicelive, azure, ai, voicelive, dotnet, voice, live, sdk, net, real, time, applications |
+| `official/microsoft/dotnet/general/maps` | Azure Maps SDK for .NET. Location-based services including geocoding, routing, rendering, geolocation, and weather. Use for address search, directions, map t... | official/microsoft/dotnet/general/maps | official/microsoft/dotnet/general/maps, azure, maps, search, dotnet, sdk, net, location, including, geocoding, routing, rendering |
+| `official/microsoft/dotnet/messaging/eventhubs` | Azure Event Hubs SDK for .NET. Use for high-throughput event streaming: sending events (EventHubProducerClient, EventHubBufferedProducerClient), receiving ev... | official/microsoft/dotnet/messaging/eventhubs | official/microsoft/dotnet/messaging/eventhubs, azure, eventhub, dotnet, event, hubs, sdk, net, high, throughput, streaming, sending |
+| `official/microsoft/java/communication/callautomation` | Build call automation workflows with Azure Communication Services Call Automation Java SDK. Use when implementing IVR systems, call routing, call recording, ... | official/microsoft/java/communication/callautomation | official/microsoft/java/communication/callautomation, azure, communication, callautomation, java, call, automation, sdk, implementing, ivr, routing, recording |
+| `official/microsoft/java/data/blob` | Build blob storage applications with Azure Storage Blob SDK for Java. Use when uploading, downloading, or managing files in Azure Blob Storage, working with ... | official/microsoft/java/data/blob | official/microsoft/java/data/blob, azure, storage, blob, java, applications, sdk, uploading, downloading, managing, files, working |
+| `official/microsoft/java/data/cosmos` | Azure Cosmos DB SDK for Java. NoSQL database operations with global distribution, multi-model support, and reactive patterns.
+Triggers: "CosmosClient java", ... | official/microsoft/java/data/cosmos | official/microsoft/java/data/cosmos, azure, cosmos, java, db, sdk, nosql, database, operations, global, distribution, multi |
+| `official/microsoft/java/data/tables` | Build table storage applications with Azure Tables SDK for Java. Use when working with Azure Table Storage or Cosmos DB Table API for NoSQL key-value data, s... | official/microsoft/java/data/tables | official/microsoft/java/data/tables, azure, data, tables, java, table, storage, applications, sdk, working, cosmos, db |
+| `official/microsoft/java/foundry/contentsafety` | Build content moderation applications with Azure AI Content Safety SDK for Java. Use when implementing text/image analysis, blocklist management, or harm det... | official/microsoft/java/foundry/contentsafety | official/microsoft/java/foundry/contentsafety, azure, ai, contentsafety, java, content, moderation, applications, safety, sdk, implementing, text |
+| `official/microsoft/java/foundry/formrecognizer` | Build document analysis applications with Azure Document Intelligence (Form Recognizer) SDK for Java. Use when extracting text, tables, key-value pairs from ... | official/microsoft/java/foundry/formrecognizer | official/microsoft/java/foundry/formrecognizer, azure, ai, formrecognizer, java, document, analysis, applications, intelligence, form, recognizer, sdk |
+| `official/microsoft/java/foundry/projects` | Azure AI Projects SDK for Java. High-level SDK for Azure AI Foundry project management including connections, datasets, indexes, and evaluations.
+Triggers: "... | official/microsoft/java/foundry/projects | official/microsoft/java/foundry/projects, azure, ai, java, sdk, high, level, foundry, including, connections, datasets, indexes |
+| `official/microsoft/java/foundry/vision-imageanalysis` | Build image analysis applications with Azure AI Vision SDK for Java. Use when implementing image captioning, OCR text extraction, object detection, tagging, ... | official/microsoft/java/foundry/vision, imageanalysis | official/microsoft/java/foundry/vision, imageanalysis, azure, ai, vision, java, image, analysis, applications, sdk, implementing, captioning |
+| `official/microsoft/java/foundry/voicelive` | Azure AI VoiceLive SDK for Java. Real-time bidirectional voice conversations with AI assistants using WebSocket.
+Triggers: "VoiceLiveClient java", "voice ass... | official/microsoft/java/foundry/voicelive | official/microsoft/java/foundry/voicelive, azure, ai, voicelive, java, sdk, real, time, bidirectional, voice, conversations, assistants |
+| `official/microsoft/java/messaging/eventhubs` | Build real-time streaming applications with Azure Event Hubs SDK for Java. Use when implementing event streaming, high-throughput data ingestion, or building... | official/microsoft/java/messaging/eventhubs | official/microsoft/java/messaging/eventhubs, azure, eventhub, java, real, time, streaming, applications, event, hubs, sdk, implementing |
+| `official/microsoft/java/monitoring/ingestion` | Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE).
+Triggers: "LogsI... | official/microsoft/java/monitoring/ingestion | official/microsoft/java/monitoring/ingestion, azure, monitor, ingestion, java, sdk, send, custom, logs, via, data, collection |
+| `official/microsoft/java/monitoring/query` | Azure Monitor Query SDK for Java. Execute Kusto queries against Log Analytics workspaces and query metrics from Azure resources.
+Triggers: "LogsQueryClient j... | official/microsoft/java/monitoring/query | official/microsoft/java/monitoring/query, azure, monitor, query, java, sdk, execute, kusto, queries, against, log, analytics |
+| `official/microsoft/python/data/cosmos` | Azure Cosmos DB SDK for Python (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
+Triggers: "cosmos db", "CosmosClient",... | official/microsoft/python/data/cosmos | official/microsoft/python/data/cosmos, azure, cosmos, py, db, sdk, python, nosql, api, document, crud, queries |
+| `official/microsoft/python/data/datalake` | Azure Data Lake Storage Gen2 SDK for Python. Use for hierarchical file systems, big data analytics, and file/directory operations.
+Triggers: "data lake", "Da... | official/microsoft/python/data/datalake | official/microsoft/python/data/datalake, azure, storage, file, datalake, py, data, lake, gen2, sdk, python, hierarchical |
+| `official/microsoft/python/data/tables` | Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations.
+Triggers: "table storage", "TableSer... | official/microsoft/python/data/tables | official/microsoft/python/data/tables, azure, data, tables, py, sdk, python, storage, cosmos, db, nosql, key |
+| `official/microsoft/python/foundry/agent-framework` | Build Azure AI Foundry agents using the Microsoft Agent Framework Python SDK (agent-framework-azure-ai). Use when creating persistent agents with AzureAIAgen... | official/microsoft/python/foundry/agent, framework | official/microsoft/python/foundry/agent, framework, agent, azure, ai, py, foundry, agents, microsoft, python, sdk, creating |
+| `official/microsoft/python/foundry/agents-v2` | Build container-based Foundry Agents using Azure AI Projects SDK with ImageBasedHostedAgentDefinition.
+Use when creating hosted agents that run custom code i... | official/microsoft/python/foundry/agents, v2 | official/microsoft/python/foundry/agents, v2, agents, py, container, foundry, azure, ai, sdk, imagebasedhostedagentdefinition, creating, hosted |
+| `official/microsoft/python/foundry/contentsafety` | Azure AI Content Safety SDK for Python. Use for detecting harmful content in text and images with multi-severity classification.
+Triggers: "azure-ai-contents... | official/microsoft/python/foundry/contentsafety | official/microsoft/python/foundry/contentsafety, azure, ai, contentsafety, py, content, safety, sdk, python, detecting, harmful, text |
+| `official/microsoft/python/foundry/contentunderstanding` | Azure AI Content Understanding SDK for Python. Use for multimodal content extraction from documents, images, audio, and video.
+Triggers: "azure-ai-contentund... | official/microsoft/python/foundry/contentunderstanding | official/microsoft/python/foundry/contentunderstanding, azure, ai, contentunderstanding, py, content, understanding, sdk, python, multimodal, extraction, documents |
+| `official/microsoft/python/foundry/ml` | Azure Machine Learning SDK v2 for Python. Use for ML workspaces, jobs, models, datasets, compute, and pipelines.
+Triggers: "azure-ai-ml", "MLClient", "worksp... | official/microsoft/python/foundry/ml | official/microsoft/python/foundry/ml, azure, ai, ml, py, machine, learning, sdk, v2, python, workspaces, jobs |
+| `official/microsoft/python/foundry/projects` | Build AI applications using the Azure AI Projects Python SDK (azure-ai-projects). Use when working with Foundry project clients, creating versioned agents wi... | official/microsoft/python/foundry/projects | official/microsoft/python/foundry/projects, azure, ai, py, applications, python, sdk, working, foundry, clients, creating, versioned |
+| `official/microsoft/python/foundry/search-documents` | Azure AI Search SDK for Python. Use for vector search, hybrid search, semantic ranking, indexing, and skillsets.
+Triggers: "azure-search-documents", "SearchC... | official/microsoft/python/foundry/search, documents | official/microsoft/python/foundry/search, documents, azure, search, py, ai, sdk, python, vector, hybrid, semantic, ranking |
+| `official/microsoft/python/foundry/textanalytics` | Azure AI Text Analytics SDK for sentiment analysis, entity recognition, key phrases, language detection, PII, and healthcare NLP. Use for natural language pr... | official/microsoft/python/foundry/textanalytics | official/microsoft/python/foundry/textanalytics, azure, ai, textanalytics, py, text, analytics, sdk, sentiment, analysis, entity, recognition |
+| `official/microsoft/python/foundry/transcription` | Azure AI Transcription SDK for Python. Use for real-time and batch speech-to-text transcription with timestamps and diarization.
+Triggers: "transcription", "... | official/microsoft/python/foundry/transcription | official/microsoft/python/foundry/transcription, azure, ai, transcription, py, sdk, python, real, time, batch, speech, text |
+| `official/microsoft/python/foundry/translation-document` | Azure AI Document Translation SDK for batch translation of documents with format preservation. Use for translating Word, PDF, Excel, PowerPoint, and other do... | official/microsoft/python/foundry/translation, document | official/microsoft/python/foundry/translation, document, azure, ai, translation, py, sdk, batch, documents, format, preservation, translating |
+| `official/microsoft/python/foundry/translation-text` | Azure AI Text Translation SDK for real-time text translation, transliteration, language detection, and dictionary lookup. Use for translating text content in... | official/microsoft/python/foundry/translation, text | official/microsoft/python/foundry/translation, text, azure, ai, translation, py, sdk, real, time, transliteration, language, detection |
+| `official/microsoft/python/foundry/vision-imageanalysis` | Azure AI Vision Image Analysis SDK for captions, tags, objects, OCR, people detection, and smart cropping. Use for computer vision and image understanding ta... | official/microsoft/python/foundry/vision, imageanalysis | official/microsoft/python/foundry/vision, imageanalysis, azure, ai, vision, py, image, analysis, sdk, captions, tags, objects |
+| `official/microsoft/python/foundry/voicelive` | Build real-time voice AI applications using Azure AI Voice Live SDK (azure-ai-voicelive). Use this skill when creating Python applications that need real-tim... | official/microsoft/python/foundry/voicelive | official/microsoft/python/foundry/voicelive, azure, ai, voicelive, py, real, time, voice, applications, live, sdk, skill |
+| `official/microsoft/python/monitoring/ingestion` | Azure Monitor Ingestion SDK for Python. Use for sending custom logs to Log Analytics workspace via Logs Ingestion API.
+Triggers: "azure-monitor-ingestion", "... | official/microsoft/python/monitoring/ingestion | official/microsoft/python/monitoring/ingestion, azure, monitor, ingestion, py, sdk, python, sending, custom, logs, log, analytics |
+| `official/microsoft/python/monitoring/query` | Azure Monitor Query SDK for Python. Use for querying Log Analytics workspaces and Azure Monitor metrics.
+Triggers: "azure-monitor-query", "LogsQueryClient", ... | official/microsoft/python/monitoring/query | official/microsoft/python/monitoring/query, azure, monitor, query, py, sdk, python, querying, log, analytics, workspaces, metrics |
+| `official/microsoft/rust/data/azure-cosmos-rust` | Azure Cosmos DB SDK for Rust (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
+Triggers: "cosmos db rust", "CosmosClien... | official/microsoft/rust/data/azure, cosmos, rust | official/microsoft/rust/data/azure, cosmos, rust, azure, db, sdk, nosql, api, document, crud, queries, containers |
+| `official/microsoft/rust/messaging/azure-eventhub-rust` | Azure Event Hubs SDK for Rust. Use for sending and receiving events, streaming data ingestion.
+Triggers: "event hubs rust", "ProducerClient rust", "ConsumerC... | official/microsoft/rust/messaging/azure, eventhub, rust | official/microsoft/rust/messaging/azure, eventhub, rust, azure, event, hubs, sdk, sending, receiving, events, streaming, data |
+| `official/microsoft/typescript/data/cosmosdb` | Azure Cosmos DB JavaScript/TypeScript SDK (@azure/cosmos) for data plane operations. Use for CRUD operations on documents, queries, bulk operations, and cont... | official/microsoft/typescript/data/cosmosdb | official/microsoft/typescript/data/cosmosdb, azure, cosmos, ts, db, javascript, typescript, sdk, data, plane, operations, crud |
+| `official/microsoft/typescript/data/postgres` | Connect to Azure Database for PostgreSQL Flexible Server from Node.js/TypeScript using the pg (node-postgres) package. Use for PostgreSQL queries, connection... | official/microsoft/typescript/data/postgres | official/microsoft/typescript/data/postgres, azure, postgres, ts, connect, database, postgresql, flexible, server, node, js, typescript |
+| `official/microsoft/typescript/foundry/contentsafety` | Analyze text and images for harmful content using Azure AI Content Safety (@azure-rest/ai-content-safety). Use when moderating user-generated content, detect... | official/microsoft/typescript/foundry/contentsafety | official/microsoft/typescript/foundry/contentsafety, azure, ai, contentsafety, ts, analyze, text, images, harmful, content, safety, rest |
+| `official/microsoft/typescript/foundry/document-intelligence` | Extract text, tables, and structured data from documents using Azure Document Intelligence (@azure-rest/ai-document-intelligence). Use when processing invoic... | official/microsoft/typescript/foundry/document, intelligence | official/microsoft/typescript/foundry/document, intelligence, azure, ai, document, ts, extract, text, tables, structured, data, documents |
+| `official/microsoft/typescript/foundry/projects` | Build AI applications using Azure AI Projects SDK for JavaScript (@azure/ai-projects). Use when working with Foundry project clients, agents, connections, de... | official/microsoft/typescript/foundry/projects | official/microsoft/typescript/foundry/projects, azure, ai, ts, applications, sdk, javascript, working, foundry, clients, agents, connections |
+| `official/microsoft/typescript/foundry/search-documents` | Build search applications using Azure AI Search SDK for JavaScript (@azure/search-documents). Use when creating/managing indexes, implementing vector/hybrid ... | official/microsoft/typescript/foundry/search, documents | official/microsoft/typescript/foundry/search, documents, azure, search, ts, applications, ai, sdk, javascript, creating, managing, indexes |
+| `official/microsoft/typescript/foundry/translation` | Build translation applications using Azure Translation SDKs for JavaScript (@azure-rest/ai-translation-text, @azure-rest/ai-translation-document). Use when i... | official/microsoft/typescript/foundry/translation | official/microsoft/typescript/foundry/translation, azure, ai, translation, ts, applications, sdks, javascript, rest, text, document, implementing |
+| `official/microsoft/typescript/foundry/voicelive` | Azure AI Voice Live SDK for JavaScript/TypeScript. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants... | official/microsoft/typescript/foundry/voicelive | official/microsoft/typescript/foundry/voicelive, azure, ai, voicelive, ts, voice, live, sdk, javascript, typescript, real, time |
+| `official/microsoft/typescript/frontend/frontend-ui-dark` | Build dark-themed React applications using Tailwind CSS with custom theming, glassmorphism effects, and Framer Motion animations. Use when creating dashboard... | official/microsoft/typescript/frontend/frontend, ui, dark | official/microsoft/typescript/frontend/frontend, ui, dark, frontend, ts, themed, react, applications, tailwind, css, custom, theming |
+| `official/microsoft/typescript/messaging/eventhubs` | Build event streaming applications using Azure Event Hubs SDK for JavaScript (@azure/event-hubs). Use when implementing high-throughput event ingestion, real... | official/microsoft/typescript/messaging/eventhubs | official/microsoft/typescript/messaging/eventhubs, azure, eventhub, ts, event, streaming, applications, hubs, sdk, javascript, implementing, high |
 | `php-pro` | Write idiomatic PHP code with generators, iterators, SPL data structures, and modern OOP features. Use PROACTIVELY for high-performance PHP applications. | php | php, pro, write, idiomatic, code, generators, iterators, spl, data, structures, oop, features |
 | `postgres-best-practices` | Postgres performance optimization and best practices from Supabase. Use this skill when writing, reviewing, or optimizing Postgres queries, schema designs, o... | postgres, best, practices | postgres, best, practices, supabase, performance, optimization, skill, writing, reviewing, optimizing, queries, schema |
 | `postgresql` | Design a PostgreSQL-specific schema. Covers best-practices, data types, indexing, constraints, performance patterns, and advanced features | postgresql | postgresql, specific, schema, covers, data, types, indexing, constraints, performance, features |
@@ -219,7 +292,7 @@ Total skills: 715
 | `xlsx-official` | Comprehensive spreadsheet creation, editing, and analysis with support for formulas, formatting, data analysis, and visualization. When Claude needs to work ... | xlsx, official | xlsx, official, spreadsheet, creation, editing, analysis, formulas, formatting, data, visualization, claude, work |
 | `youtube-automation` | Automate YouTube tasks via Rube MCP (Composio): upload videos, manage playlists, search content, get analytics, and handle comments. Always search tools firs... | youtube | youtube, automation, automate, tasks, via, rube, mcp, composio, upload, videos, playlists, search |
 
-## development (83)
+## development (123)
 
 | Skill | Description | Tags | Triggers |
 | --- | --- | --- | --- |
@@ -272,6 +345,63 @@ Total skills: 715
|
||||
| `n8n-code-python` | Write Python code in n8n Code nodes. Use when writing Python in n8n, using _input/_json/_node syntax, working with standard library, or need to understand Py... | n8n, code, python | n8n, code, python, write, nodes, writing, input, json, node, syntax, working, standard |
| `n8n-node-configuration` | Operation-aware node configuration guidance. Use when configuring nodes, understanding property dependencies, determining required fields, choosing between g... | n8n, node, configuration | n8n, node, configuration, operation, aware, guidance, configuring, nodes, understanding, property, dependencies, determining |
| `observe-whatsapp` | Observe and troubleshoot WhatsApp in Kapso: debug message delivery, inspect webhook deliveries/retries, triage API errors, and run health checks. Use when in... | observe, whatsapp | observe, whatsapp, troubleshoot, kapso, debug, message, delivery, inspect, webhook, deliveries, retries, triage |
| `official/microsoft/dotnet/compute/durabletask` | Azure Resource Manager SDK for Durable Task Scheduler in .NET. Use for MANAGEMENT PLANE operations: creating/managing Durable Task Schedulers, Task Hubs, and... | official/microsoft/dotnet/compute/durabletask | official/microsoft/dotnet/compute/durabletask, azure, resource, manager, durabletask, dotnet, sdk, durable, task, scheduler, net, plane |
| `official/microsoft/dotnet/compute/playwright` | Azure Resource Manager SDK for Microsoft Playwright Testing in .NET. Use for MANAGEMENT PLANE operations: creating/managing Playwright Testing workspaces, ch... | official/microsoft/dotnet/compute/playwright | official/microsoft/dotnet/compute/playwright, azure, resource, manager, playwright, dotnet, sdk, microsoft, testing, net, plane, operations |
| `official/microsoft/dotnet/data/fabric` | Azure Resource Manager SDK for Fabric in .NET. Use for MANAGEMENT PLANE operations: provisioning, scaling, suspending/resuming Microsoft Fabric capacities, c... | official/microsoft/dotnet/data/fabric | official/microsoft/dotnet/data/fabric, azure, mgmt, fabric, dotnet, resource, manager, sdk, net, plane, operations, provisioning |
| `official/microsoft/dotnet/entra/authentication-events` | Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions. Use for token enrichment, custom claims, a... | official/microsoft/dotnet/entra/authentication, events | official/microsoft/dotnet/entra/authentication, events, microsoft, azure, webjobs, extensions, authentication, dotnet, entra, sdk, net, functions |
| `official/microsoft/dotnet/integration/apicenter` | Azure API Center SDK for .NET. Centralized API inventory management with governance, versioning, and discovery. Use for creating API services, workspaces, AP... | official/microsoft/dotnet/integration/apicenter | official/microsoft/dotnet/integration/apicenter, azure, mgmt, apicenter, dotnet, api, center, sdk, net, centralized, inventory, governance |
| `official/microsoft/dotnet/messaging/eventgrid` | Azure Event Grid SDK for .NET. Client library for publishing and consuming events with Azure Event Grid. Use for event-driven architectures, pub/sub messagin... | official/microsoft/dotnet/messaging/eventgrid | official/microsoft/dotnet/messaging/eventgrid, azure, eventgrid, dotnet, event, grid, sdk, net, client, library, publishing, consuming |
| `official/microsoft/dotnet/partner/mongodbatlas` | Manage MongoDB Atlas Organizations as Azure ARM resources using Azure.ResourceManager.MongoDBAtlas SDK. Use when creating, updating, listing, or deleting Mon... | official/microsoft/dotnet/partner/mongodbatlas | official/microsoft/dotnet/partner/mongodbatlas, azure, mgmt, mongodbatlas, dotnet, mongodb, atlas, organizations, arm, resources, resourcemanager, sdk |
| `official/microsoft/java/communication/callingserver` | Azure Communication Services CallingServer (legacy) Java SDK. Note - This SDK is deprecated. Use azure-communication-callautomation instead for new projects.... | official/microsoft/java/communication/callingserver | official/microsoft/java/communication/callingserver, azure, communication, callingserver, java, legacy, sdk, note, deprecated, callautomation, instead, new |
| `official/microsoft/java/communication/chat` | Build real-time chat applications with Azure Communication Services Chat Java SDK. Use when implementing chat threads, messaging, participants, read receipts... | official/microsoft/java/communication/chat | official/microsoft/java/communication/chat, azure, communication, chat, java, real, time, applications, sdk, implementing, threads, messaging |
| `official/microsoft/java/communication/common` | Azure Communication Services common utilities for Java. Use when working with CommunicationTokenCredential, user identifiers, token refresh, or shared authen... | official/microsoft/java/communication/common | official/microsoft/java/communication/common, azure, communication, common, java, utilities, working, communicationtokencredential, user, identifiers, token, refresh |
| `official/microsoft/java/communication/sms` | Send SMS messages with Azure Communication Services SMS Java SDK. Use when implementing SMS notifications, alerts, OTP delivery, bulk messaging, or delivery ... | official/microsoft/java/communication/sms | official/microsoft/java/communication/sms, azure, communication, sms, java, send, messages, sdk, implementing, notifications, alerts, otp |
| `official/microsoft/java/compute/batch` | Azure Batch SDK for Java. Run large-scale parallel and HPC batch jobs with pools, jobs, tasks, and compute nodes. Triggers: "BatchClient java", "azure batch ... | official/microsoft/java/compute/batch | official/microsoft/java/compute/batch, azure, compute, batch, java, sdk, run, large, scale, parallel, hpc, jobs |
| `official/microsoft/java/integration/appconfiguration` | Azure App Configuration SDK for Java. Centralized application configuration management with key-value settings, feature flags, and snapshots. Triggers: "Conf... | official/microsoft/java/integration/appconfiguration | official/microsoft/java/integration/appconfiguration, azure, appconfiguration, java, app, configuration, sdk, centralized, application, key, value, settings |
| `official/microsoft/java/messaging/eventgrid` | Build event-driven applications with Azure Event Grid SDK for Java. Use when publishing events, implementing pub/sub patterns, or integrating with Azure serv... | official/microsoft/java/messaging/eventgrid | official/microsoft/java/messaging/eventgrid, azure, eventgrid, java, event, driven, applications, grid, sdk, publishing, events, implementing |
| `official/microsoft/java/messaging/webpubsub` | Build real-time web applications with Azure Web PubSub SDK for Java. Use when implementing WebSocket-based messaging, live updates, chat applications, or ser... | official/microsoft/java/messaging/webpubsub | official/microsoft/java/messaging/webpubsub, azure, messaging, webpubsub, java, real, time, web, applications, pubsub, sdk, implementing |
| `official/microsoft/python/compute/containerregistry` | Azure Container Registry SDK for Python. Use for managing container images, artifacts, and repositories. Triggers: "azure-containerregistry", "ContainerRegis... | official/microsoft/python/compute/containerregistry | official/microsoft/python/compute/containerregistry, azure, containerregistry, py, container, registry, sdk, python, managing, images, artifacts, repositories |
| `official/microsoft/python/compute/fabric` | Azure Fabric Management SDK for Python. Use for managing Microsoft Fabric capacities and resources. Triggers: "azure-mgmt-fabric", "FabricMgmtClient", "Fabri... | official/microsoft/python/compute/fabric | official/microsoft/python/compute/fabric, azure, mgmt, fabric, py, sdk, python, managing, microsoft, capacities, resources, triggers |
| `official/microsoft/python/data/blob` | Azure Blob Storage SDK for Python. Use for uploading, downloading, listing blobs, managing containers, and blob lifecycle. Triggers: "blob storage", "BlobSer... | official/microsoft/python/data/blob | official/microsoft/python/data/blob, azure, storage, blob, py, sdk, python, uploading, downloading, listing, blobs, managing |
| `official/microsoft/python/data/queue` | Azure Queue Storage SDK for Python. Use for reliable message queuing, task distribution, and asynchronous processing. Triggers: "queue storage", "QueueServic... | official/microsoft/python/data/queue | official/microsoft/python/data/queue, azure, storage, queue, py, sdk, python, reliable, message, queuing, task, distribution |
| `official/microsoft/python/foundry/speech-to-text-rest` | Azure Speech to Text REST API for short audio (Python). Use for simple speech recognition of audio files up to 60 seconds without the Speech SDK. Triggers: "... | official/microsoft/python/foundry/speech, to, text, rest | official/microsoft/python/foundry/speech, to, text, rest, azure, speech, py, api, short, audio, python, simple |
| `official/microsoft/python/integration/apicenter` | Azure API Center Management SDK for Python. Use for managing API inventory, metadata, and governance across your organization. Triggers: "azure-mgmt-apicente... | official/microsoft/python/integration/apicenter | official/microsoft/python/integration/apicenter, azure, mgmt, apicenter, py, api, center, sdk, python, managing, inventory, metadata |
| `official/microsoft/python/integration/apimanagement` | Azure API Management SDK for Python. Use for managing APIM services, APIs, products, subscriptions, and policies. Triggers: "azure-mgmt-apimanagement", "ApiM... | official/microsoft/python/integration/apimanagement | official/microsoft/python/integration/apimanagement, azure, mgmt, apimanagement, py, api, sdk, python, managing, apim, apis, products |
| `official/microsoft/python/integration/appconfiguration` | Azure App Configuration SDK for Python. Use for centralized configuration management, feature flags, and dynamic settings. Triggers: "azure-appconfiguration"... | official/microsoft/python/integration/appconfiguration | official/microsoft/python/integration/appconfiguration, azure, appconfiguration, py, app, configuration, sdk, python, centralized, feature, flags, dynamic |
| `official/microsoft/python/messaging/eventgrid` | Azure Event Grid SDK for Python. Use for publishing events, handling CloudEvents, and event-driven architectures. Triggers: "event grid", "EventGridPublisher... | official/microsoft/python/messaging/eventgrid | official/microsoft/python/messaging/eventgrid, azure, eventgrid, py, event, grid, sdk, python, publishing, events, handling, cloudevents |
| `official/microsoft/python/messaging/eventhub` | Azure Event Hubs SDK for Python streaming. Use for high-throughput event ingestion, producers, consumers, and checkpointing. Triggers: "event hubs", "EventHu... | official/microsoft/python/messaging/eventhub | official/microsoft/python/messaging/eventhub, azure, eventhub, py, event, hubs, sdk, python, streaming, high, throughput, ingestion |
| `official/microsoft/python/monitoring/opentelemetry` | Azure Monitor OpenTelemetry Distro for Python. Use for one-line Application Insights setup with auto-instrumentation. Triggers: "azure-monitor-opentelemetry"... | official/microsoft/python/monitoring/opentelemetry | official/microsoft/python/monitoring/opentelemetry, azure, monitor, opentelemetry, py, distro, python, one, line, application, insights, setup |
| `official/microsoft/python/monitoring/opentelemetry-exporter` | Azure Monitor OpenTelemetry Exporter for Python. Use for low-level OpenTelemetry export to Application Insights. Triggers: "azure-monitor-opentelemetry-expor... | official/microsoft/python/monitoring/opentelemetry, exporter | official/microsoft/python/monitoring/opentelemetry, exporter, azure, monitor, opentelemetry, py, python, low, level, export, application, insights |
| `official/microsoft/rust/data/azure-storage-blob-rust` | Azure Blob Storage SDK for Rust. Use for uploading, downloading, and managing blobs and containers. Triggers: "blob storage rust", "BlobClient rust", "upload... | official/microsoft/rust/data/azure, storage, blob, rust | official/microsoft/rust/data/azure, storage, blob, rust, azure, sdk, uploading, downloading, managing, blobs, containers, triggers |
| `official/microsoft/rust/entra/azure-identity-rust` | Azure Identity SDK for Rust authentication. Use for DeveloperToolsCredential, ManagedIdentityCredential, ClientSecretCredential, and token-based authenticati... | official/microsoft/rust/entra/azure, identity, rust | official/microsoft/rust/entra/azure, identity, rust, azure, sdk, authentication, developertoolscredential, managedidentitycredential, clientsecretcredential, token, triggers, managed |
| `official/microsoft/rust/entra/azure-keyvault-certificates-rust` | Azure Key Vault Certificates SDK for Rust. Use for creating, importing, and managing certificates. Triggers: "keyvault certificates rust", "CertificateClient... | official/microsoft/rust/entra/azure, keyvault, certificates, rust | official/microsoft/rust/entra/azure, keyvault, certificates, rust, azure, key, vault, sdk, creating, importing, managing, triggers |
| `official/microsoft/rust/entra/azure-keyvault-keys-rust` | Azure Key Vault Keys SDK for Rust. Use for creating, managing, and using cryptographic keys. Triggers: "keyvault keys rust", "KeyClient rust", "create key ru... | official/microsoft/rust/entra/azure, keyvault, keys, rust | official/microsoft/rust/entra/azure, keyvault, keys, rust, azure, key, vault, sdk, creating, managing, cryptographic, triggers |
| `official/microsoft/typescript/data/blob` | Azure Blob Storage JavaScript/TypeScript SDK (@azure/storage-blob) for blob operations. Use for uploading, downloading, listing, and managing blobs and conta... | official/microsoft/typescript/data/blob | official/microsoft/typescript/data/blob, azure, storage, blob, ts, javascript, typescript, sdk, operations, uploading, downloading, listing |
| `official/microsoft/typescript/data/fileshare` | Azure File Share JavaScript/TypeScript SDK (@azure/storage-file-share) for SMB file share operations. Use for creating shares, managing directories, uploadin... | official/microsoft/typescript/data/fileshare | official/microsoft/typescript/data/fileshare, azure, storage, file, share, ts, javascript, typescript, sdk, smb, operations, creating |
| `official/microsoft/typescript/data/queue` | Azure Queue Storage JavaScript/TypeScript SDK (@azure/storage-queue) for message queue operations. Use for sending, receiving, peeking, and deleting messages... | official/microsoft/typescript/data/queue | official/microsoft/typescript/data/queue, azure, storage, queue, ts, javascript, typescript, sdk, message, operations, sending, receiving |
| `official/microsoft/typescript/entra/keyvault-keys` | Manage cryptographic keys using Azure Key Vault Keys SDK for JavaScript (@azure/keyvault-keys). Use when creating, encrypting/decrypting, signing, or rotatin... | official/microsoft/typescript/entra/keyvault, keys | official/microsoft/typescript/entra/keyvault, keys, azure, keyvault, ts, cryptographic, key, vault, sdk, javascript, creating, encrypting |
| `official/microsoft/typescript/frontend/react-flow-node` | Create React Flow node components with TypeScript types, handles, and Zustand integration. Use when building custom nodes for React Flow canvas, creating vis... | official/microsoft/typescript/frontend/react, flow, node | official/microsoft/typescript/frontend/react, flow, node, react, ts, components, typescript, types, zustand, integration, building, custom |
| `official/microsoft/typescript/frontend/zustand-store` | Create Zustand stores with TypeScript, subscribeWithSelector middleware, and proper state/action separation. Use when building React state management, creati... | official/microsoft/typescript/frontend/zustand, store | official/microsoft/typescript/frontend/zustand, store, zustand, ts, stores, typescript, subscribewithselector, middleware, proper, state, action, separation |
| `official/microsoft/typescript/integration/appconfiguration` | Build applications using Azure App Configuration SDK for JavaScript (@azure/app-configuration). Use when working with configuration settings, feature flags, ... | official/microsoft/typescript/integration/appconfiguration | official/microsoft/typescript/integration/appconfiguration, azure, appconfiguration, ts, applications, app, configuration, sdk, javascript, working, settings, feature |
| `official/microsoft/typescript/m365/m365-agents` | Microsoft 365 Agents SDK for TypeScript/Node.js. Build multichannel agents for Teams/M365/Copilot Studio with AgentApplication routing, Express hosting, stre... | official/microsoft/typescript/m365/m365, agents | official/microsoft/typescript/m365/m365, agents, m365, ts, microsoft, 365, sdk, typescript, node, js, multichannel, teams |
| `official/microsoft/typescript/messaging/webpubsub` | Build real-time messaging applications using Azure Web PubSub SDKs for JavaScript (@azure/web-pubsub, @azure/web-pubsub-client). Use when implementing WebSoc... | official/microsoft/typescript/messaging/webpubsub | official/microsoft/typescript/messaging/webpubsub, azure, web, pubsub, ts, real, time, messaging, applications, sdks, javascript, client |
| `product-manager-toolkit` | Comprehensive toolkit for product managers including RICE prioritization, customer interview analysis, PRD templates, discovery frameworks, and go-to-market ... | product, manager | product, manager, toolkit, managers, including, rice, prioritization, customer, interview, analysis, prd, discovery |
| `python-development-python-scaffold` | You are a Python project architecture expert specializing in scaffolding production-ready Python applications. Generate complete project structures with mode... | python | python, development, scaffold, architecture, specializing, scaffolding, applications, generate, complete, structures, tooling, uv |
| `python-packaging` | Create distributable Python packages with proper project structure, setup.py/pyproject.toml, and publishing to PyPI. Use when packaging Python libraries, cre... | python, packaging | python, packaging, distributable, packages, proper, structure, setup, py, pyproject, toml, publishing, pypi |
| `viral-generator-builder` | Expert in building shareable generator tools that go viral - name generators, quiz makers, avatar creators, personality tests, and calculator tools. Covers t... | viral, generator, builder | viral, generator, builder, building, shareable, go, name, generators, quiz, makers, avatar, creators |
| `webapp-testing` | Toolkit for interacting with and testing local web applications using Playwright. Supports verifying frontend functionality, debugging UI behavior, capturing... | webapp | webapp, testing, toolkit, interacting, local, web, applications, playwright, supports, verifying, frontend, functionality |
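Several Python entries above (Event Grid, Event Hubs) revolve around publishing CloudEvents. As a rough illustration only — the `make_cloud_event` helper is hypothetical and not part of any listed skill — a minimal CloudEvents 1.0 envelope of the kind Event Grid accepts can be built with the standard library:

```python
import datetime
import json
import uuid

def make_cloud_event(source: str, event_type: str, data: dict) -> dict:
    """Build a minimal CloudEvents 1.0 envelope (hypothetical helper)."""
    return {
        "specversion": "1.0",          # required by CloudEvents 1.0
        "id": str(uuid.uuid4()),       # unique per event
        "source": source,              # URI-reference identifying the producer
        "type": event_type,            # e.g. "MyApp.ItemCreated"
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "datacontenttype": "application/json",
        "data": data,
    }

event = make_cloud_event("/skills/sync", "Skills.SyncCompleted", {"added": 845})
payload = json.dumps([event])  # Event Grid topics accept a JSON array of events
```

The SDK skills wrap this envelope in typed clients (`EventGridPublisherClient` and friends); the sketch just shows the wire shape those clients produce.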

## general (134)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
| `nft-standards` | Implement NFT standards (ERC-721, ERC-1155) with proper metadata handling, minting strategies, and marketplace integration. Use when creating NFT contracts, ... | nft, standards | nft, standards, erc, 721, 1155, proper, metadata, handling, minting, marketplace, integration, creating |
| `nosql-expert` | Expert guidance for distributed NoSQL databases (Cassandra, DynamoDB). Focuses on mental models, query-first modeling, single-table design, and avoiding hot ... | nosql | nosql, guidance, distributed, databases, cassandra, dynamodb, mental, models, query, first, modeling, single |
| `obsidian-clipper-template-creator` | Guide for creating templates for the Obsidian Web Clipper. Use when you want to create a new clipping template, understand available variables, or format cli... | obsidian, clipper, creator | obsidian, clipper, creator, creating, web, want, new, clipping, understand, available, variables, format |
| `official/microsoft/plugins/wiki-changelog` | Analyzes git commit history and generates structured changelogs categorized by change type. Use when the user asks about recent changes, wants a changelog, o... | official/microsoft/plugins/wiki, changelog | official/microsoft/plugins/wiki, changelog, wiki, analyzes, git, commit, history, generates, structured, changelogs, categorized, change |
| `official/microsoft/plugins/wiki-page-writer` | Generates rich technical documentation pages with dark-mode Mermaid diagrams, source code citations, and first-principles depth. Use when writing documentati... | official/microsoft/plugins/wiki, page, writer | official/microsoft/plugins/wiki, page, writer, wiki, generates, rich, technical, documentation, pages, dark, mode, mermaid |
| `official/microsoft/plugins/wiki-vitepress` | Packages generated wiki Markdown into a VitePress static site with dark theme, dark-mode Mermaid diagrams with click-to-zoom, and production build output. Us... | official/microsoft/plugins/wiki, vitepress | official/microsoft/plugins/wiki, vitepress, wiki, packages, generated, markdown, static, site, dark, theme, mode, mermaid |
| `onboarding-cro` | When the user wants to optimize post-signup onboarding, user activation, first-run experience, or time-to-value. Also use when the user mentions "onboarding ... | onboarding, cro | onboarding, cro, user, wants, optimize, post, signup, activation, first, run, experience, time |
| `oss-hunter` | Automatically hunt for high-impact OSS contribution opportunities in trending repositories. | oss, hunter | oss, hunter, automatically, hunt, high, impact, contribution, opportunities, trending, repositories |
| `paid-ads` | When the user wants help with paid advertising campaigns on Google Ads, Meta (Facebook/Instagram), LinkedIn, Twitter/X, or other ad platforms. Also use when ... | paid, ads | paid, ads, user, wants, advertising, campaigns, google, meta, facebook, instagram, linkedin, twitter |
| `x-article-publisher-skill` | Publish articles to X/Twitter | x, article, publisher, skill | x, article, publisher, skill, publish, articles, twitter |
| `youtube-summarizer` | Extract transcripts from YouTube videos and generate comprehensive, detailed summaries using intelligent analysis frameworks | video, summarization, transcription, youtube, content-analysis | video, summarization, transcription, youtube, content-analysis, summarizer, extract, transcripts, videos, generate, detailed, summaries |
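The `official/microsoft/plugins/wiki-changelog` entry above categorizes git commit history by change type. A toy sketch of that grouping step — commit subjects are hard-coded here, the real plugin reads them from `git log`, and the `categorize` helper is a hypothetical name, not the plugin's API:

```python
from collections import defaultdict

# Hypothetical commit subjects; the real plugin pulls these from `git log`.
COMMITS = [
    "feat: add Microsoft skills sync script",
    "fix: handle @ symbols in frontmatter",
    "docs: bump skill count to 845+",
]

def categorize(subjects):
    """Group conventional-commit subjects by their type prefix
    (a sketch of the kind of bucketing a changelog generator does)."""
    groups = defaultdict(list)
    for subject in subjects:
        prefix, _, rest = subject.partition(": ")
        # Subjects without a "type: " prefix fall into an "other" bucket.
        groups[prefix if rest else "other"].append(rest or subject)
    return dict(groups)

changelog = categorize(COMMITS)
```

Grouping by conventional-commit prefix is one common convention; the actual plugin may use different categories or heuristics.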

## infrastructure (101)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
| `network-101` | This skill should be used when the user asks to "set up a web server", "configure HTTP or HTTPS", "perform SNMP enumeration", "configure SMB shares", "test n... | network, 101 | network, 101, skill, should, used, user, asks, set, up, web, server, configure |
| `observability-monitoring-monitor-setup` | You are a monitoring and observability expert specializing in implementing comprehensive monitoring solutions. Set up metrics collection, distributed tracing... | observability, monitoring, monitor, setup | observability, monitoring, monitor, setup, specializing, implementing, solutions, set, up, metrics, collection, distributed |
| `observability-monitoring-slo-implement` | You are an SLO (Service Level Objective) expert specializing in implementing reliability standards and error budget-based practices. Design SLO frameworks, d... | observability, monitoring, slo, implement | observability, monitoring, slo, implement, level, objective, specializing, implementing, reliability, standards, error, budget |
| `official/microsoft/dotnet/compute/botservice` | Azure Resource Manager SDK for Bot Service in .NET. Management plane operations for creating and managing Azure Bot resources, channels (Teams, DirectLine, S... | official/microsoft/dotnet/compute/botservice | official/microsoft/dotnet/compute/botservice, azure, mgmt, botservice, dotnet, resource, manager, sdk, bot, net, plane, operations |
| `official/microsoft/dotnet/foundry/weightsandbiases` | Azure Weights & Biases SDK for .NET. ML experiment tracking and model management via Azure Marketplace. Use for creating W&B instances, managing SSO, marketp... | official/microsoft/dotnet/foundry/weightsandbiases | official/microsoft/dotnet/foundry/weightsandbiases, azure, mgmt, weightsandbiases, dotnet, weights, biases, sdk, net, ml, experiment, tracking |
| `official/microsoft/dotnet/integration/apimanagement` | Azure Resource Manager SDK for API Management in .NET. Use for MANAGEMENT PLANE operations: creating/managing APIM services, APIs, products, subscriptions, p... | official/microsoft/dotnet/integration/apimanagement | official/microsoft/dotnet/integration/apimanagement, azure, mgmt, apimanagement, dotnet, resource, manager, sdk, api, net, plane, operations |
| `official/microsoft/dotnet/messaging/servicebus` | Azure Service Bus SDK for .NET. Enterprise messaging with queues, topics, subscriptions, and sessions. Use for reliable message delivery, pub/sub patterns, d... | official/microsoft/dotnet/messaging/servicebus | official/microsoft/dotnet/messaging/servicebus, azure, servicebus, dotnet, bus, sdk, net, enterprise, messaging, queues, topics, subscriptions |
| `official/microsoft/dotnet/monitoring/applicationinsights` | Azure Application Insights SDK for .NET. Application performance monitoring and observability resource management. Use for creating Application Insights comp... | official/microsoft/dotnet/monitoring/applicationinsights | official/microsoft/dotnet/monitoring/applicationinsights, azure, mgmt, applicationinsights, dotnet, application, insights, sdk, net, performance, monitoring, observability |
| `official/microsoft/dotnet/partner/arize-ai-observability-eval` | Azure Resource Manager SDK for Arize AI Observability and Evaluation (.NET). Use when managing Arize AI organizations on Azure via Azure Marketplace, creati... | official/microsoft/dotnet/partner/arize, ai, observability, eval | official/microsoft/dotnet/partner/arize, ai, observability, eval, azure, mgmt, arizeaiobservabilityeval, dotnet, resource, manager, sdk, arize |
| `official/microsoft/java/entra/azure-identity` | Azure Identity Java SDK for authentication with Azure services. Use when implementing DefaultAzureCredential, managed identity, service principal, or any Azu... | official/microsoft/java/entra/azure, identity | official/microsoft/java/entra/azure, identity, azure, java, sdk, authentication, implementing, defaultazurecredential, managed, principal, any, applications |
| `official/microsoft/java/foundry/anomalydetector` | Build anomaly detection applications with Azure AI Anomaly Detector SDK for Java. Use when implementing univariate/multivariate anomaly detection, time-serie... | official/microsoft/java/foundry/anomalydetector | official/microsoft/java/foundry/anomalydetector, azure, ai, anomalydetector, java, anomaly, detection, applications, detector, sdk, implementing, univariate |
| `official/microsoft/java/monitoring/opentelemetry-exporter` | Azure Monitor OpenTelemetry Exporter for Java. Export OpenTelemetry traces, metrics, and logs to Azure Monitor/Application Insights. Triggers: "AzureMonitorE... | official/microsoft/java/monitoring/opentelemetry, exporter | official/microsoft/java/monitoring/opentelemetry, exporter, azure, monitor, opentelemetry, java, export, traces, metrics, logs, application, insights |
| `official/microsoft/python/compute/botservice` | Azure Bot Service Management SDK for Python. Use for creating, managing, and configuring Azure Bot Service resources. Triggers: "azure-mgmt-botservice", "Azu... | official/microsoft/python/compute/botservice | official/microsoft/python/compute/botservice, azure, mgmt, botservice, py, bot, sdk, python, creating, managing, configuring, resources |
| `official/microsoft/python/data/fileshare` | Azure Storage File Share SDK for Python. Use for SMB file shares, directories, and file operations in the cloud. Triggers: "azure-storage-file-share", "Share... | official/microsoft/python/data/fileshare | official/microsoft/python/data/fileshare, azure, storage, file, share, py, sdk, python, smb, shares, directories, operations |
| `official/microsoft/python/entra/azure-identity` | Azure Identity SDK for Python authentication. Use for DefaultAzureCredential, managed identity, service principals, and token caching. Triggers: "azure-ident... | official/microsoft/python/entra/azure, identity | official/microsoft/python/entra/azure, identity, azure, py, sdk, python, authentication, defaultazurecredential, managed, principals, token, caching |
| `official/microsoft/python/messaging/servicebus` | Azure Service Bus SDK for Python messaging. Use for queues, topics, subscriptions, and enterprise messaging patterns. Triggers: "service bus", "ServiceBusCli... | official/microsoft/python/messaging/servicebus | official/microsoft/python/messaging/servicebus, azure, servicebus, py, bus, sdk, python, messaging, queues, topics, subscriptions, enterprise |
| `official/microsoft/python/messaging/webpubsub-service` | Azure Web PubSub Service SDK for Python. Use for real-time messaging, WebSocket connections, and pub/sub patterns. Triggers: "azure-messaging-webpubsubservic... | official/microsoft/python/messaging/webpubsub, service | official/microsoft/python/messaging/webpubsub, service, azure, messaging, webpubsubservice, py, web, pubsub, sdk, python, real, time |
| `official/microsoft/typescript/compute/playwright` | Run Playwright tests at scale using Azure Playwright Workspaces (formerly Microsoft Playwright Testing). Use when scaling browser tests across cloud-hosted b... | official/microsoft/typescript/compute/playwright | official/microsoft/typescript/compute/playwright, azure, microsoft, playwright, testing, ts, run, tests, scale, workspaces, formerly, scaling |
| `official/microsoft/typescript/entra/azure-identity` | Authenticate to Azure services using Azure Identity SDK for JavaScript (@azure/identity). Use when configuring authentication with DefaultAzureCredential, ma... | official/microsoft/typescript/entra/azure, identity | official/microsoft/typescript/entra/azure, identity, azure, ts, authenticate, sdk, javascript, configuring, authentication, defaultazurecredential, managed, principals |
| `official/microsoft/typescript/messaging/servicebus` | Build messaging applications using Azure Service Bus SDK for JavaScript (@azure/service-bus). Use when implementing queues, topics/subscriptions, message ses... | official/microsoft/typescript/messaging/servicebus | official/microsoft/typescript/messaging/servicebus, azure, servicebus, ts, messaging, applications, bus, sdk, javascript, implementing, queues, topics |
| `official/microsoft/typescript/monitoring/opentelemetry` | Instrument applications with Azure Monitor and OpenTelemetry for JavaScript (@azure/monitor-opentelemetry). Use when adding distributed tracing, metrics, and... | official/microsoft/typescript/monitoring/opentelemetry | official/microsoft/typescript/monitoring/opentelemetry, azure, monitor, opentelemetry, ts, instrument, applications, javascript, adding, distributed, tracing, metrics |
| `performance-engineer` | Expert performance engineer specializing in modern observability, application optimization, and scalable system performance. Masters OpenTelemetry, distribut... | performance | performance, engineer, specializing, observability, application, optimization, scalable, masters, opentelemetry, distributed, tracing, load |
| `performance-testing-review-ai-review` | You are an expert AI-powered code review specialist combining automated static analysis, intelligent pattern recognition, and modern DevOps practices. Levera... | performance, ai | performance, ai, testing, review, powered, code, combining, automated, static, analysis, intelligent, recognition |
| `pipedrive-automation` | Automate Pipedrive CRM operations including deals, contacts, organizations, activities, notes, and pipeline management via Rube MCP (Composio). Always search... | pipedrive | pipedrive, automation, automate, crm, operations, including, deals, contacts, organizations, activities, notes, pipeline |
| `wireshark-analysis` | This skill should be used when the user asks to "analyze network traffic with Wireshark", "capture packets for troubleshooting", "filter PCAP files", "follow... | wireshark | wireshark, network, traffic, analysis, skill, should, used, user, asks, analyze, capture, packets |
| `workflow-automation` | Workflow automation is the infrastructure that makes AI agents reliable. Without durable execution, a network hiccup during a 10-step payment flow means lost... | | automation, infrastructure, makes, ai, agents, reliable, without, durable, execution, network, hiccup, during |
## security (114)

## security (126)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -606,6 +764,20 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `nodejs-best-practices` | Node.js development principles and decision-making. Framework selection, async patterns, security, and architecture. Teaches thinking, not copying. | nodejs, best, practices | nodejs, best, practices, node, js, development, principles, decision, making, framework, selection, async |
| `notebooklm` | Use this skill to query your Google NotebookLM notebooks directly from Claude Code for source-grounded, citation-backed answers from Gemini. Browser automati... | notebooklm | notebooklm, skill, query, google, notebooks, directly, claude, code, source, grounded, citation, backed |
| `observability-engineer` | Build production-ready monitoring, logging, and tracing systems. Implements comprehensive observability strategies, SLI/SLO management, and incident response... | observability | observability, engineer, monitoring, logging, tracing, implements, sli, slo, incident, response, proactively, infrastructure |
| `official/microsoft/dotnet/entra/azure-identity` | Azure Identity SDK for .NET. Authentication library for Azure SDK clients using Microsoft Entra ID. Use for DefaultAzureCredential, managed identity, service... | official/microsoft/dotnet/entra/azure, identity | official/microsoft/dotnet/entra/azure, identity, azure, dotnet, sdk, net, authentication, library, clients, microsoft, entra, id |
| `official/microsoft/dotnet/entra/keyvault` | Azure Key Vault Keys SDK for .NET. Client library for managing cryptographic keys in Azure Key Vault and Managed HSM. Use for key creation, rotation, encrypt... | official/microsoft/dotnet/entra/keyvault | official/microsoft/dotnet/entra/keyvault, azure, security, keyvault, keys, dotnet, key, vault, sdk, net, client, library |
| `official/microsoft/dotnet/m365/m365-agents` | Microsoft 365 Agents SDK for .NET. Build multichannel agents for Teams/M365/Copilot Studio with ASP.NET Core hosting, AgentApplication routing, and MSAL-base... | official/microsoft/dotnet/m365/m365, agents | official/microsoft/dotnet/m365/m365, agents, m365, dotnet, microsoft, 365, sdk, net, multichannel, teams, copilot, studio |
| `official/microsoft/java/entra/keyvault-keys` | Azure Key Vault Keys Java SDK for cryptographic key management. Use when creating, managing, or using RSA/EC keys, performing encrypt/decrypt/sign/verify ope... | official/microsoft/java/entra/keyvault, keys | official/microsoft/java/entra/keyvault, keys, azure, security, keyvault, java, key, vault, sdk, cryptographic, creating, managing |
| `official/microsoft/java/entra/keyvault-secrets` | Azure Key Vault Secrets Java SDK for secret management. Use when storing, retrieving, or managing passwords, API keys, connection strings, or other sensitive... | official/microsoft/java/entra/keyvault, secrets | official/microsoft/java/entra/keyvault, secrets, azure, security, keyvault, java, key, vault, sdk, secret, storing, retrieving |
| `official/microsoft/plugins/wiki-onboarding` | Generates two complementary onboarding guides — a Principal-Level architectural deep-dive and a Zero-to-Hero contributor walkthrough. Use when the user wants... | official/microsoft/plugins/wiki, onboarding | official/microsoft/plugins/wiki, onboarding, wiki, generates, two, complementary, guides, principal, level, architectural, deep, dive |
| `official/microsoft/plugins/wiki-researcher` | Conducts multi-turn iterative deep research on specific topics within a codebase with zero tolerance for shallow analysis. Use when the user wants an in-dept... | official/microsoft/plugins/wiki, researcher | official/microsoft/plugins/wiki, researcher, wiki, conducts, multi, turn, iterative, deep, research, specific, topics, within |
| `official/microsoft/python/data/cosmos-db` | Build Azure Cosmos DB NoSQL services with Python/FastAPI following production-grade patterns. Use when implementing database client setup with dual auth (Def... | official/microsoft/python/data/cosmos, db | official/microsoft/python/data/cosmos, db, azure, cosmos, py, nosql, python, fastapi, following, grade, implementing, database |
| `official/microsoft/python/entra/keyvault` | Azure Key Vault SDK for Python. Use for secrets, keys, and certificates management with secure storage. Triggers: "key vault", "SecretClient", "KeyClient", "... | official/microsoft/python/entra/keyvault | official/microsoft/python/entra/keyvault, azure, keyvault, py, key, vault, sdk, python, secrets, keys, certificates, secure |
| `official/microsoft/python/m365/m365-agents` | Microsoft 365 Agents SDK for Python. Build multichannel agents for Teams/M365/Copilot Studio with aiohttp hosting, AgentApplication routing, streaming respon... | official/microsoft/python/m365/m365, agents | official/microsoft/python/m365/m365, agents, m365, py, microsoft, 365, sdk, python, multichannel, teams, copilot, studio |
| `official/microsoft/rust/entra/azure-keyvault-secrets-rust` | Azure Key Vault Secrets SDK for Rust. Use for storing and retrieving secrets, passwords, and API keys. Triggers: "keyvault secrets rust", "SecretClient rust"... | official/microsoft/rust/entra/azure, keyvault, secrets, rust | official/microsoft/rust/entra/azure, keyvault, secrets, rust, azure, key, vault, sdk, storing, retrieving, passwords, api |
| `official/microsoft/typescript/entra/keyvault-secrets` | Manage secrets using Azure Key Vault Secrets SDK for JavaScript (@azure/keyvault-secrets). Use when storing and retrieving application secrets or configurati... | official/microsoft/typescript/entra/keyvault, secrets | official/microsoft/typescript/entra/keyvault, secrets, azure, keyvault, ts, key, vault, sdk, javascript, storing, retrieving, application |
| `openapi-spec-generation` | Generate and maintain OpenAPI 3.1 specifications from code, design-first specs, and validation patterns. Use when creating API documentation, generating SDKs... | openapi, spec, generation | openapi, spec, generation, generate, maintain, specifications, code, first, specs, validation, creating, api |
| `payment-integration` | Integrate Stripe, PayPal, and payment processors. Handles checkout flows, subscriptions, webhooks, and PCI compliance. Use PROACTIVELY when implementing paym... | payment, integration | payment, integration, integrate, stripe, paypal, processors, checkout, flows, subscriptions, webhooks, pci, compliance |
| `pci-compliance` | Implement PCI DSS compliance requirements for secure handling of payment card data and payment systems. Use when securing payment processing, achieving PCI c... | pci, compliance | pci, compliance, dss, requirements, secure, handling, payment, card, data, securing, processing, achieving |
@@ -651,7 +823,7 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `wordpress-penetration-testing` | This skill should be used when the user asks to "pentest WordPress sites", "scan WordPress for vulnerabilities", "enumerate WordPress users, themes, or plugi... | wordpress, penetration | wordpress, penetration, testing, skill, should, used, user, asks, pentest, sites, scan, vulnerabilities |
| `xss-html-injection` | This skill should be used when the user asks to "test for XSS vulnerabilities", "perform cross-site scripting attacks", "identify HTML injection flaws", "exp... | xss, html, injection | xss, html, injection, cross, site, scripting, testing, skill, should, used, user, asks |

## testing (23)

## testing (24)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -661,6 +833,7 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `conductor-revert` | Git-aware undo by logical work unit (track, phase, or task) | conductor, revert | conductor, revert, git, aware, undo, logical, work, unit, track, phase, task |
| `debugger` | Debugging specialist for errors, test failures, and unexpected behavior. Use proactively when encountering any issues. | debugger | debugger, debugging, errors, test, failures, unexpected, behavior, proactively, encountering, any, issues |
| `dependency-upgrade` | Manage major dependency version upgrades with compatibility analysis, staged rollout, and comprehensive testing. Use when upgrading framework versions, updat... | dependency, upgrade | dependency, upgrade, major, version, upgrades, compatibility, analysis, staged, rollout, testing, upgrading, framework |
| `official/microsoft/plugins/wiki-qa` | Answers questions about a code repository using source file analysis. Use when the user asks a question about how something works, wants to understand a comp... | official/microsoft/plugins/wiki, qa | official/microsoft/plugins/wiki, qa, wiki, answers, questions, about, code, repository, source, file, analysis, user |
| `pentest-commands` | This skill should be used when the user asks to "run pentest commands", "scan with nmap", "use metasploit exploits", "crack passwords with hydra or john", "s... | pentest, commands | pentest, commands, skill, should, used, user, asks, run, scan, nmap, metasploit, exploits |
| `performance-testing-review-multi-agent-review` | Use when working with performance testing review multi agent review | performance, multi, agent | performance, multi, agent, testing, review, working |
| `playwright-skill` | Complete browser automation with Playwright. Auto-detects dev servers, writes clean test scripts to /tmp. Test pages, fill forms, take screenshots, check res... | playwright, skill | playwright, skill, complete, browser, automation, auto, detects, dev, servers, writes, clean, test |

README.md
@@ -1,6 +1,6 @@
# 🌌 Antigravity Awesome Skills: 715+ Agentic Skills for Claude Code, Gemini CLI, Cursor, Copilot & More
# 🌌 Antigravity Awesome Skills: 845+ Agentic Skills for Claude Code, Gemini CLI, Cursor, Copilot & More

> **The Ultimate Collection of 715+ Universal Agentic Skills for AI Coding Assistants — Claude Code, Gemini CLI, Codex CLI, Antigravity IDE, GitHub Copilot, Cursor, OpenCode, AdaL**
> **The Ultimate Collection of 845+ Universal Agentic Skills for AI Coding Assistants — Claude Code, Gemini CLI, Codex CLI, Antigravity IDE, GitHub Copilot, Cursor, OpenCode, AdaL**

[](https://opensource.org/licenses/MIT)
[](https://claude.ai)
@@ -16,7 +16,7 @@

If this project helps you, you can [support it here](https://buymeacoffee.com/sickn33) or simply ⭐ the repo.

**Antigravity Awesome Skills** is a curated, battle-tested library of **715 high-performance agentic skills** designed to work seamlessly across all major AI coding assistants:
**Antigravity Awesome Skills** is a curated, battle-tested library of **845 high-performance agentic skills** designed to work seamlessly across all major AI coding assistants:

- 🟣 **Claude Code** (Anthropic CLI)
- 🔵 **Gemini CLI** (Google DeepMind)
@@ -27,7 +27,7 @@ If this project helps you, you can [support it here](https://buymeacoffee.com/si
- ⚪ **OpenCode** (Open-source CLI)
- 🌸 **AdaL CLI** (Self-evolving Coding Agent)

This repository provides essential skills to transform your AI assistant into a **full-stack digital agency**, including official capabilities from **Anthropic**, **OpenAI**, **Google**, **Supabase**, and **Vercel Labs**.
This repository provides essential skills to transform your AI assistant into a **full-stack digital agency**, including official capabilities from **Anthropic**, **OpenAI**, **Google**, **Microsoft**, **Supabase**, and **Vercel Labs**.

## Table of Contents

@@ -38,7 +38,7 @@ This repository provides essential skills to transform your AI assistant into a
- [🎁 Curated Collections (Bundles)](#curated-collections)
- [🧭 Antigravity Workflows](#antigravity-workflows)
- [📦 Features & Categories](#features--categories)
- [📚 Browse 715+ Skills](#browse-715-skills)
- [📚 Browse 845+ Skills](#browse-845-skills)
- [🤝 How to Contribute](#how-to-contribute)
- [🤝 Community](#community)
- [☕ Support the Project](#support-the-project)
@@ -221,20 +221,24 @@ npx antigravity-awesome-skills
They help you avoid picking from 700+ skills one by one.

What bundles are:

- Recommended starting sets for common workflows.
- A shortcut for onboarding and faster execution.

What bundles are not:

- Not a separate install.
- Not a locked preset.

How to use bundles:

1. Install the repository once.
2. Pick one bundle in [docs/BUNDLES.md](docs/BUNDLES.md).
3. Start with 3-5 skills from that bundle in your prompt.
4. Add more only when needed.

Examples:

- Building a SaaS MVP: `Essentials` + `Full-Stack Developer` + `QA & Testing`.
- Hardening production: `Security Developer` + `DevOps & Cloud` + `Observability & Monitoring`.
- Shipping OSS changes: `Essentials` + `OSS Maintainer`.
@@ -247,10 +251,12 @@ Bundles help you choose skills. Workflows help you execute them in order.
- Use workflows when you need step-by-step execution for a concrete goal.

Start here:

- [docs/WORKFLOWS.md](docs/WORKFLOWS.md): human-readable playbooks.
- [data/workflows.json](data/workflows.json): machine-readable workflow metadata.

Initial workflows include:

- Ship a SaaS MVP
- Security Audit for a Web App
- Build an AI Agent System
@@ -274,7 +280,7 @@ The repository is organized into specialized domains to transform your AI into a

Counts change as new skills are added. For the current full registry, see [CATALOG.md](CATALOG.md).

## Browse 715+ Skills
## Browse 845+ Skills

We have moved the full skill registry to a dedicated catalog to keep this README clean.

@@ -308,14 +314,17 @@ Please ensure your skill follows the Antigravity/Claude Code best practices.
Support is optional. This project stays free and open-source for everyone.

If this repository saves you time or helps you ship faster, you can support ongoing maintenance:

- [☕ Buy me a book on Buy Me a Coffee](https://buymeacoffee.com/sickn33)

Where support goes:

- Skill curation, testing, and quality validation.
- Documentation updates, examples, and onboarding improvements.
- Faster triage and review of community issues and PRs.

Prefer non-financial support:

- Star the repository.
- Open clear, reproducible issues.
- Submit PRs (skills, docs, fixes).
@@ -346,6 +355,8 @@ This collection would not be possible without the incredible work of the Claude
- **[vercel-labs/agent-skills](https://github.com/vercel-labs/agent-skills)**: Vercel Labs official skills - React Best Practices, Web Design Guidelines.
- **[openai/skills](https://github.com/openai/skills)**: OpenAI Codex skills catalog - Agent skills, Skill Creator, Concise Planning.
- **[supabase/agent-skills](https://github.com/supabase/agent-skills)**: Supabase official skills - Postgres Best Practices.
- **[microsoft/skills](https://github.com/microsoft/skills)**: Official Microsoft skills - Azure cloud services, Bot Framework, Cognitive Services, and enterprise development patterns across .NET, Python, TypeScript, Go, Rust, and Java.
- **[google-gemini/gemini-skills](https://github.com/google-gemini/gemini-skills)**: Official Gemini skills - Gemini API, SDK and model interactions.

### Community Contributors

@@ -1,5 +1,5 @@
{
  "generatedAt": "2026-02-08T00:00:00.000Z",
  "generatedAt": "2026-02-11T14:27:51.213Z",
  "aliases": {
    "accessibility-compliance-audit": "accessibility-compliance-accessibility-audit",
    "active directory attacks": "active-directory-attacks",
@@ -91,6 +91,139 @@
    "observability-monitoring-setup": "observability-monitoring-monitor-setup",
    "observability-monitoring-implement": "observability-monitoring-slo-implement",
    "obsidian-clipper-creator": "obsidian-clipper-template-creator",
    "azure-mgmt-botservice-dotnet": "official/microsoft/dotnet/compute/botservice",
    "azure-resource-manager-durabletask-dotnet": "official/microsoft/dotnet/compute/durabletask",
    "azure-resource-manager-playwright-dotnet": "official/microsoft/dotnet/compute/playwright",
    "azure-resource-manager-cosmosdb-dotnet": "official/microsoft/dotnet/data/cosmosdb",
    "azure-mgmt-fabric-dotnet": "official/microsoft/dotnet/data/fabric",
    "azure-resource-manager-mysql-dotnet": "official/microsoft/dotnet/data/mysql",
    "azure-resource-manager-postgresql-dotnet": "official/microsoft/dotnet/data/postgresql",
    "azure-resource-manager-redis-dotnet": "official/microsoft/dotnet/data/redis",
    "azure-resource-manager-sql-dotnet": "official/microsoft/dotnet/data/sql",
    "microsoft-azure-webjobs-extensions-authentication-events-dotnet": "official/microsoft/dotnet/entra/authentication-events",
    "azure-identity-dotnet": "official/microsoft/dotnet/entra/azure-identity",
    "azure-security-keyvault-keys-dotnet": "official/microsoft/dotnet/entra/keyvault",
    "azure-ai-document-intelligence-dotnet": "official/microsoft/dotnet/foundry/document-intelligence",
    "azure-ai-openai-dotnet": "official/microsoft/dotnet/foundry/openai",
    "azure-ai-projects-dotnet": "official/microsoft/dotnet/foundry/projects",
    "azure-search-documents-dotnet": "official/microsoft/dotnet/foundry/search-documents",
    "azure-ai-voicelive-dotnet": "official/microsoft/dotnet/foundry/voicelive",
    "azure-mgmt-weightsandbiases-dotnet": "official/microsoft/dotnet/foundry/weightsandbiases",
    "azure-maps-search-dotnet": "official/microsoft/dotnet/general/maps",
    "azure-mgmt-apicenter-dotnet": "official/microsoft/dotnet/integration/apicenter",
    "azure-mgmt-apimanagement-dotnet": "official/microsoft/dotnet/integration/apimanagement",
    "m365-agents-dotnet": "official/microsoft/dotnet/m365/m365-agents",
    "azure-eventgrid-dotnet": "official/microsoft/dotnet/messaging/eventgrid",
    "azure-eventhub-dotnet": "official/microsoft/dotnet/messaging/eventhubs",
    "azure-servicebus-dotnet": "official/microsoft/dotnet/messaging/servicebus",
    "azure-mgmt-applicationinsights-dotnet": "official/microsoft/dotnet/monitoring/applicationinsights",
    "azure-mgmt-arizeaiobservabilityeval-dotnet": "official/microsoft/dotnet/partner/arize-ai-observability-eval",
    "official/microsoft/dotnet/partner/arize-ai-eval": "official/microsoft/dotnet/partner/arize-ai-observability-eval",
    "azure-mgmt-mongodbatlas-dotnet": "official/microsoft/dotnet/partner/mongodbatlas",
    "azure-communication-callautomation-java": "official/microsoft/java/communication/callautomation",
    "azure-communication-callingserver-java": "official/microsoft/java/communication/callingserver",
    "azure-communication-chat-java": "official/microsoft/java/communication/chat",
    "azure-communication-common-java": "official/microsoft/java/communication/common",
    "azure-communication-sms-java": "official/microsoft/java/communication/sms",
    "azure-compute-batch-java": "official/microsoft/java/compute/batch",
    "azure-storage-blob-java": "official/microsoft/java/data/blob",
    "azure-cosmos-java": "official/microsoft/java/data/cosmos",
    "azure-data-tables-java": "official/microsoft/java/data/tables",
    "azure-identity-java": "official/microsoft/java/entra/azure-identity",
    "azure-security-keyvault-keys-java": "official/microsoft/java/entra/keyvault-keys",
    "azure-security-keyvault-secrets-java": "official/microsoft/java/entra/keyvault-secrets",
    "azure-ai-anomalydetector-java": "official/microsoft/java/foundry/anomalydetector",
    "azure-ai-contentsafety-java": "official/microsoft/java/foundry/contentsafety",
    "azure-ai-formrecognizer-java": "official/microsoft/java/foundry/formrecognizer",
    "azure-ai-projects-java": "official/microsoft/java/foundry/projects",
    "azure-ai-vision-imageanalysis-java": "official/microsoft/java/foundry/vision-imageanalysis",
    "azure-ai-voicelive-java": "official/microsoft/java/foundry/voicelive",
    "azure-appconfiguration-java": "official/microsoft/java/integration/appconfiguration",
    "azure-eventgrid-java": "official/microsoft/java/messaging/eventgrid",
    "azure-eventhub-java": "official/microsoft/java/messaging/eventhubs",
    "azure-messaging-webpubsub-java": "official/microsoft/java/messaging/webpubsub",
    "azure-monitor-ingestion-java": "official/microsoft/java/monitoring/ingestion",
    "azure-monitor-opentelemetry-exporter-java": "official/microsoft/java/monitoring/opentelemetry-exporter",
    "azure-monitor-query-java": "official/microsoft/java/monitoring/query",
    "wiki-architect": "official/microsoft/plugins/wiki-architect",
    "wiki-changelog": "official/microsoft/plugins/wiki-changelog",
    "wiki-onboarding": "official/microsoft/plugins/wiki-onboarding",
    "wiki-page-writer": "official/microsoft/plugins/wiki-page-writer",
    "wiki-qa": "official/microsoft/plugins/wiki-qa",
    "wiki-researcher": "official/microsoft/plugins/wiki-researcher",
    "wiki-vitepress": "official/microsoft/plugins/wiki-vitepress",
    "azure-mgmt-botservice-py": "official/microsoft/python/compute/botservice",
    "azure-containerregistry-py": "official/microsoft/python/compute/containerregistry",
    "azure-mgmt-fabric-py": "official/microsoft/python/compute/fabric",
    "azure-storage-blob-py": "official/microsoft/python/data/blob",
    "azure-cosmos-py": "official/microsoft/python/data/cosmos",
    "azure-cosmos-db-py": "official/microsoft/python/data/cosmos-db",
    "azure-storage-file-datalake-py": "official/microsoft/python/data/datalake",
    "azure-storage-file-share-py": "official/microsoft/python/data/fileshare",
    "azure-storage-queue-py": "official/microsoft/python/data/queue",
    "azure-data-tables-py": "official/microsoft/python/data/tables",
    "azure-identity-py": "official/microsoft/python/entra/azure-identity",
    "azure-keyvault-py": "official/microsoft/python/entra/keyvault",
    "agent-framework-azure-ai-py": "official/microsoft/python/foundry/agent-framework",
    "agents-v2-py": "official/microsoft/python/foundry/agents-v2",
    "azure-ai-contentsafety-py": "official/microsoft/python/foundry/contentsafety",
    "azure-ai-contentunderstanding-py": "official/microsoft/python/foundry/contentunderstanding",
    "azure-ai-ml-py": "official/microsoft/python/foundry/ml",
    "azure-ai-projects-py": "official/microsoft/python/foundry/projects",
    "azure-search-documents-py": "official/microsoft/python/foundry/search-documents",
    "azure-speech-to-text-rest-py": "official/microsoft/python/foundry/speech-to-text-rest",
    "official/microsoft/python/foundry/speech-to-rest": "official/microsoft/python/foundry/speech-to-text-rest",
    "azure-ai-textanalytics-py": "official/microsoft/python/foundry/textanalytics",
    "azure-ai-transcription-py": "official/microsoft/python/foundry/transcription",
    "azure-ai-translation-document-py": "official/microsoft/python/foundry/translation-document",
    "azure-ai-translation-text-py": "official/microsoft/python/foundry/translation-text",
    "azure-ai-vision-imageanalysis-py": "official/microsoft/python/foundry/vision-imageanalysis",
    "azure-ai-voicelive-py": "official/microsoft/python/foundry/voicelive",
    "azure-mgmt-apicenter-py": "official/microsoft/python/integration/apicenter",
    "azure-mgmt-apimanagement-py": "official/microsoft/python/integration/apimanagement",
    "azure-appconfiguration-py": "official/microsoft/python/integration/appconfiguration",
    "m365-agents-py": "official/microsoft/python/m365/m365-agents",
    "azure-eventgrid-py": "official/microsoft/python/messaging/eventgrid",
    "azure-eventhub-py": "official/microsoft/python/messaging/eventhub",
    "azure-servicebus-py": "official/microsoft/python/messaging/servicebus",
    "azure-messaging-webpubsubservice-py": "official/microsoft/python/messaging/webpubsub-service",
    "azure-monitor-ingestion-py": "official/microsoft/python/monitoring/ingestion",
    "azure-monitor-opentelemetry-py": "official/microsoft/python/monitoring/opentelemetry",
    "azure-monitor-opentelemetry-exporter-py": "official/microsoft/python/monitoring/opentelemetry-exporter",
    "azure-monitor-query-py": "official/microsoft/python/monitoring/query",
    "azure-cosmos-rust": "official/microsoft/rust/data/azure-cosmos-rust",
    "azure-storage-blob-rust": "official/microsoft/rust/data/azure-storage-blob-rust",
    "official/microsoft/rust/data/azure-storage-rust": "official/microsoft/rust/data/azure-storage-blob-rust",
    "azure-identity-rust": "official/microsoft/rust/entra/azure-identity-rust",
    "azure-keyvault-certificates-rust": "official/microsoft/rust/entra/azure-keyvault-certificates-rust",
    "official/microsoft/rust/entra/azure-keyvault-rust": "official/microsoft/rust/entra/azure-keyvault-certificates-rust",
    "azure-keyvault-keys-rust": "official/microsoft/rust/entra/azure-keyvault-keys-rust",
    "azure-keyvault-secrets-rust": "official/microsoft/rust/entra/azure-keyvault-secrets-rust",
    "azure-eventhub-rust": "official/microsoft/rust/messaging/azure-eventhub-rust",
    "azure-microsoft-playwright-testing-ts": "official/microsoft/typescript/compute/playwright",
    "azure-storage-blob-ts": "official/microsoft/typescript/data/blob",
    "azure-cosmos-ts": "official/microsoft/typescript/data/cosmosdb",
    "azure-storage-file-share-ts": "official/microsoft/typescript/data/fileshare",
    "azure-postgres-ts": "official/microsoft/typescript/data/postgres",
    "azure-storage-queue-ts": "official/microsoft/typescript/data/queue",
    "azure-identity-ts": "official/microsoft/typescript/entra/azure-identity",
    "azure-keyvault-keys-ts": "official/microsoft/typescript/entra/keyvault-keys",
    "azure-keyvault-secrets-ts": "official/microsoft/typescript/entra/keyvault-secrets",
    "azure-ai-contentsafety-ts": "official/microsoft/typescript/foundry/contentsafety",
    "azure-ai-document-intelligence-ts": "official/microsoft/typescript/foundry/document-intelligence",
    "azure-ai-projects-ts": "official/microsoft/typescript/foundry/projects",
    "azure-search-documents-ts": "official/microsoft/typescript/foundry/search-documents",
    "azure-ai-translation-ts": "official/microsoft/typescript/foundry/translation",
    "azure-ai-voicelive-ts": "official/microsoft/typescript/foundry/voicelive",
    "frontend-ui-dark-ts": "official/microsoft/typescript/frontend/frontend-ui-dark",
    "react-flow-node-ts": "official/microsoft/typescript/frontend/react-flow-node",
    "zustand-store-ts": "official/microsoft/typescript/frontend/zustand-store",
    "azure-appconfiguration-ts": "official/microsoft/typescript/integration/appconfiguration",
    "m365-agents-ts": "official/microsoft/typescript/m365/m365-agents",
    "azure-eventhub-ts": "official/microsoft/typescript/messaging/eventhubs",
    "azure-servicebus-ts": "official/microsoft/typescript/messaging/servicebus",
    "azure-web-pubsub-ts": "official/microsoft/typescript/messaging/webpubsub",
    "azure-monitor-opentelemetry-ts": "official/microsoft/typescript/monitoring/opentelemetry",
    "pdf": "pdf-official",
    "pentest checklist": "pentest-checklist",
    "pentest commands": "pentest-commands",
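
The alias map above is a plain key-to-value JSON object mapping legacy or shorthand names to canonical skill ids. A minimal sketch of how a consumer could resolve a name against it — the `resolve` helper and the sample entries below are illustrative, not part of the repository's tooling:

```python
# Illustrative alias resolution against skills_index.json-style data.
# Sample entries copied from the alias map shown above.
ALIASES = {
    "azure-identity-py": "official/microsoft/python/entra/azure-identity",
    "wiki-qa": "official/microsoft/plugins/wiki-qa",
    "pentest commands": "pentest-commands",
}

def resolve(name: str, aliases: dict) -> str:
    """Follow alias entries until a canonical skill id is reached."""
    seen = set()
    while name in aliases and name not in seen:
        seen.add(name)  # guard against accidental alias cycles
        name = aliases[name]
    return name

print(resolve("azure-identity-py", ALIASES))
# canonical ids and unknown names pass through unchanged
print(resolve("official/microsoft/plugins/wiki-qa", ALIASES))
```

In practice the dict would come from `json.load` on `skills_index.json` (key `"aliases"`); the loop form also tolerates chained aliases such as the `official/microsoft/python/foundry/speech-to-rest` entry above, which points at another id.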
@@ -1,5 +1,5 @@
{
  "generatedAt": "2026-02-08T00:00:00.000Z",
  "generatedAt": "2026-02-11T14:27:51.213Z",
  "bundles": {
    "core-dev": {
      "description": "Core development skills across languages, frameworks, and backend/frontend fundamentals.",
@@ -88,6 +88,95 @@
      "nodejs-backend-patterns",
      "nodejs-best-practices",
      "observe-whatsapp",
      "official/microsoft/dotnet/integration/apicenter",
      "official/microsoft/dotnet/integration/apimanagement",
      "official/microsoft/dotnet/monitoring/applicationinsights",
      "official/microsoft/java/communication/callautomation",
      "official/microsoft/java/communication/callingserver",
      "official/microsoft/java/communication/chat",
      "official/microsoft/java/communication/common",
      "official/microsoft/java/communication/sms",
      "official/microsoft/java/compute/batch",
      "official/microsoft/java/data/blob",
      "official/microsoft/java/data/cosmos",
      "official/microsoft/java/data/tables",
      "official/microsoft/java/entra/azure-identity",
      "official/microsoft/java/entra/keyvault-keys",
      "official/microsoft/java/entra/keyvault-secrets",
      "official/microsoft/java/foundry/anomalydetector",
      "official/microsoft/java/foundry/contentsafety",
      "official/microsoft/java/foundry/formrecognizer",
      "official/microsoft/java/foundry/projects",
      "official/microsoft/java/foundry/vision-imageanalysis",
      "official/microsoft/java/foundry/voicelive",
      "official/microsoft/java/integration/appconfiguration",
      "official/microsoft/java/messaging/eventgrid",
      "official/microsoft/java/messaging/eventhubs",
      "official/microsoft/java/messaging/webpubsub",
      "official/microsoft/java/monitoring/ingestion",
      "official/microsoft/java/monitoring/opentelemetry-exporter",
      "official/microsoft/java/monitoring/query",
      "official/microsoft/python/compute/botservice",
      "official/microsoft/python/compute/containerregistry",
      "official/microsoft/python/compute/fabric",
      "official/microsoft/python/data/blob",
      "official/microsoft/python/data/cosmos",
      "official/microsoft/python/data/cosmos-db",
      "official/microsoft/python/data/datalake",
      "official/microsoft/python/data/fileshare",
      "official/microsoft/python/data/queue",
      "official/microsoft/python/data/tables",
      "official/microsoft/python/entra/azure-identity",
      "official/microsoft/python/entra/keyvault",
      "official/microsoft/python/foundry/agent-framework",
      "official/microsoft/python/foundry/contentsafety",
      "official/microsoft/python/foundry/contentunderstanding",
      "official/microsoft/python/foundry/ml",
      "official/microsoft/python/foundry/projects",
      "official/microsoft/python/foundry/search-documents",
      "official/microsoft/python/foundry/speech-to-text-rest",
      "official/microsoft/python/foundry/transcription",
      "official/microsoft/python/foundry/voicelive",
      "official/microsoft/python/integration/apicenter",
      "official/microsoft/python/integration/apimanagement",
      "official/microsoft/python/integration/appconfiguration",
      "official/microsoft/python/m365/m365-agents",
      "official/microsoft/python/messaging/eventgrid",
      "official/microsoft/python/messaging/eventhub",
      "official/microsoft/python/messaging/servicebus",
      "official/microsoft/python/messaging/webpubsub-service",
      "official/microsoft/python/monitoring/ingestion",
      "official/microsoft/python/monitoring/opentelemetry",
      "official/microsoft/python/monitoring/opentelemetry-exporter",
      "official/microsoft/python/monitoring/query",
      "official/microsoft/rust/data/azure-cosmos-rust",
      "official/microsoft/rust/data/azure-storage-blob-rust",
      "official/microsoft/rust/entra/azure-identity-rust",
      "official/microsoft/rust/entra/azure-keyvault-certificates-rust",
      "official/microsoft/rust/entra/azure-keyvault-keys-rust",
      "official/microsoft/rust/entra/azure-keyvault-secrets-rust",
      "official/microsoft/rust/messaging/azure-eventhub-rust",
      "official/microsoft/typescript/data/blob",
      "official/microsoft/typescript/data/cosmosdb",
      "official/microsoft/typescript/data/fileshare",
      "official/microsoft/typescript/data/postgres",
      "official/microsoft/typescript/data/queue",
      "official/microsoft/typescript/entra/azure-identity",
      "official/microsoft/typescript/entra/keyvault-keys",
      "official/microsoft/typescript/entra/keyvault-secrets",
      "official/microsoft/typescript/foundry/projects",
      "official/microsoft/typescript/foundry/search-documents",
      "official/microsoft/typescript/foundry/translation",
      "official/microsoft/typescript/foundry/voicelive",
      "official/microsoft/typescript/frontend/frontend-ui-dark",
      "official/microsoft/typescript/frontend/react-flow-node",
      "official/microsoft/typescript/frontend/zustand-store",
      "official/microsoft/typescript/integration/appconfiguration",
      "official/microsoft/typescript/m365/m365-agents",
      "official/microsoft/typescript/messaging/eventhubs",
      "official/microsoft/typescript/messaging/servicebus",
      "official/microsoft/typescript/messaging/webpubsub",
      "official/microsoft/typescript/monitoring/opentelemetry",
|
||||
"openapi-spec-generation",
|
||||
"php-pro",
|
||||
"plaid-fintech",
|
||||
@@ -196,6 +285,16 @@
"nextjs-supabase-auth",
"nodejs-best-practices",
"notebooklm",
"official/microsoft/dotnet/entra/azure-identity",
"official/microsoft/dotnet/entra/keyvault",
"official/microsoft/dotnet/m365/m365-agents",
"official/microsoft/java/entra/keyvault-keys",
"official/microsoft/java/entra/keyvault-secrets",
"official/microsoft/python/data/cosmos-db",
"official/microsoft/python/entra/keyvault",
"official/microsoft/python/m365/m365-agents",
"official/microsoft/rust/entra/azure-keyvault-secrets-rust",
"official/microsoft/typescript/entra/keyvault-secrets",
"openapi-spec-generation",
"payment-integration",
"pci-compliance",
@@ -255,6 +354,18 @@
"mtls-configuration",
"network-engineer",
"observability-monitoring-slo-implement",
"official/microsoft/dotnet/compute/botservice",
"official/microsoft/dotnet/entra/azure-identity",
"official/microsoft/dotnet/integration/apimanagement",
"official/microsoft/dotnet/messaging/servicebus",
"official/microsoft/java/entra/azure-identity",
"official/microsoft/python/compute/botservice",
"official/microsoft/python/data/cosmos-db",
"official/microsoft/python/entra/azure-identity",
"official/microsoft/python/messaging/servicebus",
"official/microsoft/python/messaging/webpubsub-service",
"official/microsoft/typescript/entra/azure-identity",
"official/microsoft/typescript/messaging/servicebus",
"service-mesh-expert",
"service-mesh-observability",
"slo-implementation"
@@ -308,6 +419,36 @@
"nextjs-app-router-patterns",
"nextjs-best-practices",
"nodejs-backend-patterns",
"official/microsoft/dotnet/data/cosmosdb",
"official/microsoft/dotnet/data/mysql",
"official/microsoft/dotnet/data/postgresql",
"official/microsoft/dotnet/data/redis",
"official/microsoft/dotnet/data/sql",
"official/microsoft/dotnet/foundry/document-intelligence",
"official/microsoft/dotnet/general/maps",
"official/microsoft/dotnet/messaging/eventhubs",
"official/microsoft/dotnet/monitoring/applicationinsights",
"official/microsoft/java/data/blob",
"official/microsoft/java/data/cosmos",
"official/microsoft/java/data/tables",
"official/microsoft/java/entra/keyvault-secrets",
"official/microsoft/java/messaging/eventhubs",
"official/microsoft/java/monitoring/ingestion",
"official/microsoft/java/monitoring/query",
"official/microsoft/python/data/cosmos",
"official/microsoft/python/data/cosmos-db",
"official/microsoft/python/data/datalake",
"official/microsoft/python/data/tables",
"official/microsoft/python/foundry/textanalytics",
"official/microsoft/python/monitoring/ingestion",
"official/microsoft/python/monitoring/query",
"official/microsoft/rust/data/azure-cosmos-rust",
"official/microsoft/rust/messaging/azure-eventhub-rust",
"official/microsoft/typescript/data/cosmosdb",
"official/microsoft/typescript/data/postgres",
"official/microsoft/typescript/foundry/document-intelligence",
"official/microsoft/typescript/frontend/frontend-ui-dark",
"official/microsoft/typescript/messaging/eventhubs",
"pci-compliance",
"php-pro",
"postgres-best-practices",
@@ -394,6 +535,12 @@
"observability-engineer",
"observability-monitoring-monitor-setup",
"observability-monitoring-slo-implement",
"official/microsoft/dotnet/foundry/weightsandbiases",
"official/microsoft/dotnet/monitoring/applicationinsights",
"official/microsoft/dotnet/partner/arize-ai-observability-eval",
"official/microsoft/java/foundry/anomalydetector",
"official/microsoft/java/monitoring/opentelemetry-exporter",
"official/microsoft/typescript/monitoring/opentelemetry",
"performance-engineer",
"performance-testing-review-ai-review",
"pipedrive-automation",
3163	data/catalog.json
(File diff suppressed because it is too large)
@@ -3,16 +3,16 @@

We believe in giving credit where credit is due.
If you recognize your work here and it is not properly attributed, please open an Issue.

| Skill / Category | Original Source | License | Notes |
| :-------------------------- | :----------------------------------------------------- | :------------- | :---------------------------- |
| `cloud-penetration-testing` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `active-directory-attacks` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `owasp-top-10` | [OWASP](https://owasp.org/) | CC-BY-SA | Methodology adapted. |
| `burp-suite-testing` | [PortSwigger](https://portswigger.net/burp) | N/A | Usage guide only (no binary). |
| `crewai` | [CrewAI](https://github.com/joaomdmoura/crewAI) | MIT | Framework guides. |
| `langgraph` | [LangGraph](https://github.com/langchain-ai/langgraph) | MIT | Framework guides. |
| `react-patterns` | [React Docs](https://react.dev/) | CC-BY | Official patterns. |
| **All Official Skills** | [Anthropic / Google / OpenAI] | Proprietary | Usage encouraged by vendors. |

| Skill / Category | Original Source | License | Notes |
| :-------------------------- | :----------------------------------------------------------------- | :------------- | :---------------------------- |
| `cloud-penetration-testing` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `active-directory-attacks` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `owasp-top-10` | [OWASP](https://owasp.org/) | CC-BY-SA | Methodology adapted. |
| `burp-suite-testing` | [PortSwigger](https://portswigger.net/burp) | N/A | Usage guide only (no binary). |
| `crewai` | [CrewAI](https://github.com/joaomdmoura/crewAI) | MIT | Framework guides. |
| `langgraph` | [LangGraph](https://github.com/langchain-ai/langgraph) | MIT | Framework guides. |
| `react-patterns` | [React Docs](https://react.dev/) | CC-BY | Official patterns. |
| **All Official Skills** | [Anthropic / Google / OpenAI / Microsoft / Supabase / Vercel Labs] | Proprietary | Usage encouraged by vendors. |
## Skills from VoltAgent/awesome-agent-skills

@@ -20,44 +20,44 @@ The following skills were added from the curated collection at [VoltAgent/awesom

### Official Team Skills

| Skill | Original Source | License | Notes |
| :---- | :-------------- | :------ | :---- |
| `vercel-deploy-claimable` | [Vercel Labs](https://github.com/vercel-labs/agent-skills) | MIT | Official Vercel skill |
| `design-md` | [Google Labs (Stitch)](https://github.com/google-labs-code/stitch-skills) | Compatible | Google Labs Stitch skills |
| `hugging-face-cli`, `hugging-face-jobs` | [Hugging Face](https://github.com/huggingface/skills) | Compatible | Official Hugging Face skills |
| `culture-index`, `fix-review`, `sharp-edges` | [Trail of Bits](https://github.com/trailofbits/skills) | Compatible | Security skills from Trail of Bits |
| `expo-deployment`, `upgrading-expo` | [Expo](https://github.com/expo/skills) | Compatible | Official Expo skills |
| `commit`, `create-pr`, `find-bugs`, `iterate-pr` | [Sentry](https://github.com/getsentry/skills) | Compatible | Sentry dev team skills |
| `using-neon` | [Neon](https://github.com/neondatabase/agent-skills) | Compatible | Neon Postgres best practices |
| `fal-audio`, `fal-generate`, `fal-image-edit`, `fal-platform`, `fal-upscale`, `fal-workflow` | [fal.ai Community](https://github.com/fal-ai-community/skills) | Compatible | fal.ai AI model skills |

| Skill | Original Source | License | Notes |
| :------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------ | :--------- | :--------------------------------- |
| `vercel-deploy-claimable` | [Vercel Labs](https://github.com/vercel-labs/agent-skills) | MIT | Official Vercel skill |
| `design-md` | [Google Labs (Stitch)](https://github.com/google-labs-code/stitch-skills) | Compatible | Google Labs Stitch skills |
| `hugging-face-cli`, `hugging-face-jobs` | [Hugging Face](https://github.com/huggingface/skills) | Compatible | Official Hugging Face skills |
| `culture-index`, `fix-review`, `sharp-edges` | [Trail of Bits](https://github.com/trailofbits/skills) | Compatible | Security skills from Trail of Bits |
| `expo-deployment`, `upgrading-expo` | [Expo](https://github.com/expo/skills) | Compatible | Official Expo skills |
| `commit`, `create-pr`, `find-bugs`, `iterate-pr` | [Sentry](https://github.com/getsentry/skills) | Compatible | Sentry dev team skills |
| `using-neon` | [Neon](https://github.com/neondatabase/agent-skills) | Compatible | Neon Postgres best practices |
| `fal-audio`, `fal-generate`, `fal-image-edit`, `fal-platform`, `fal-upscale`, `fal-workflow` | [fal.ai Community](https://github.com/fal-ai-community/skills) | Compatible | fal.ai AI model skills |
### Community Skills

| Skill | Original Source | License | Notes |
| :---- | :-------------- | :------ | :---- |
| `automate-whatsapp`, `observe-whatsapp` | [gokapso](https://github.com/gokapso/agent-skills) | Compatible | WhatsApp automation skills |
| `readme` | [Shpigford](https://github.com/Shpigford/skills) | Compatible | README generation |
| `screenshots` | [Shpigford](https://github.com/Shpigford/skills) | Compatible | Marketing screenshots |
| `aws-skills` | [zxkane](https://github.com/zxkane/aws-skills) | Compatible | AWS development patterns |
| `deep-research` | [sanjay3290](https://github.com/sanjay3290/ai-skills) | Compatible | Gemini Deep Research Agent |
| `ffuf-claude-skill` | [jthack](https://github.com/jthack/ffuf_claude_skill) | Compatible | Web fuzzing with ffuf |
| `ui-skills` | [ibelick](https://github.com/ibelick/ui-skills) | Compatible | UI development constraints |
| `vexor` | [scarletkc](https://github.com/scarletkc/vexor) | Compatible | Vector-powered CLI |
| `pypict-skill` | [omkamal](https://github.com/omkamal/pypict-claude-skill) | Compatible | Pairwise test generation |
| `makepad-skills` | [ZhangHanDong](https://github.com/ZhangHanDong/makepad-skills) | Compatible | Makepad UI development |
| `swiftui-expert-skill` | [AvdLee](https://github.com/AvdLee/SwiftUI-Agent-Skill) | Compatible | SwiftUI best practices |
| `threejs-skills` | [CloudAI-X](https://github.com/CloudAI-X/threejs-skills) | Compatible | Three.js 3D experiences |
| `claude-scientific-skills` | [K-Dense-AI](https://github.com/K-Dense-AI/claude-scientific-skills) | Compatible | Scientific research skills |
| `claude-win11-speckit-update-skill` | [NotMyself](https://github.com/NotMyself/claude-win11-speckit-update-skill) | Compatible | Windows 11 management |
| `imagen` | [sanjay3290](https://github.com/sanjay3290/ai-skills) | Compatible | Google Gemini image generation |
| `security-bluebook-builder` | [SHADOWPR0](https://github.com/SHADOWPR0/security-bluebook-builder) | Compatible | Security documentation |
| `claude-ally-health` | [huifer](https://github.com/huifer/Claude-Ally-Health) | Compatible | Health assistant |
| `clarity-gate` | [frmoretto](https://github.com/frmoretto/clarity-gate) | Compatible | RAG quality verification |
| `n8n-code-python`, `n8n-mcp-tools-expert`, `n8n-node-configuration` | [czlonkowski](https://github.com/czlonkowski/n8n-skills) | Compatible | n8n automation skills |
| `varlock-claude-skill` | [wrsmith108](https://github.com/wrsmith108/varlock-claude-skill) | Compatible | Secure environment variables |
| `beautiful-prose` | [SHADOWPR0](https://github.com/SHADOWPR0/beautiful_prose) | Compatible | Writing style guide |
| `claude-speed-reader` | [SeanZoR](https://github.com/SeanZoR/claude-speed-reader) | Compatible | Speed reading tool |
| `skill-seekers` | [yusufkaraaslan](https://github.com/yusufkaraaslan/Skill_Seekers) | Compatible | Skill conversion tool |

| Skill | Original Source | License | Notes |
| :------------------------------------------------------------------ | :-------------------------------------------------------------------------- | :--------- | :----------------------------- |
| `automate-whatsapp`, `observe-whatsapp` | [gokapso](https://github.com/gokapso/agent-skills) | Compatible | WhatsApp automation skills |
| `readme` | [Shpigford](https://github.com/Shpigford/skills) | Compatible | README generation |
| `screenshots` | [Shpigford](https://github.com/Shpigford/skills) | Compatible | Marketing screenshots |
| `aws-skills` | [zxkane](https://github.com/zxkane/aws-skills) | Compatible | AWS development patterns |
| `deep-research` | [sanjay3290](https://github.com/sanjay3290/ai-skills) | Compatible | Gemini Deep Research Agent |
| `ffuf-claude-skill` | [jthack](https://github.com/jthack/ffuf_claude_skill) | Compatible | Web fuzzing with ffuf |
| `ui-skills` | [ibelick](https://github.com/ibelick/ui-skills) | Compatible | UI development constraints |
| `vexor` | [scarletkc](https://github.com/scarletkc/vexor) | Compatible | Vector-powered CLI |
| `pypict-skill` | [omkamal](https://github.com/omkamal/pypict-claude-skill) | Compatible | Pairwise test generation |
| `makepad-skills` | [ZhangHanDong](https://github.com/ZhangHanDong/makepad-skills) | Compatible | Makepad UI development |
| `swiftui-expert-skill` | [AvdLee](https://github.com/AvdLee/SwiftUI-Agent-Skill) | Compatible | SwiftUI best practices |
| `threejs-skills` | [CloudAI-X](https://github.com/CloudAI-X/threejs-skills) | Compatible | Three.js 3D experiences |
| `claude-scientific-skills` | [K-Dense-AI](https://github.com/K-Dense-AI/claude-scientific-skills) | Compatible | Scientific research skills |
| `claude-win11-speckit-update-skill` | [NotMyself](https://github.com/NotMyself/claude-win11-speckit-update-skill) | Compatible | Windows 11 management |
| `imagen` | [sanjay3290](https://github.com/sanjay3290/ai-skills) | Compatible | Google Gemini image generation |
| `security-bluebook-builder` | [SHADOWPR0](https://github.com/SHADOWPR0/security-bluebook-builder) | Compatible | Security documentation |
| `claude-ally-health` | [huifer](https://github.com/huifer/Claude-Ally-Health) | Compatible | Health assistant |
| `clarity-gate` | [frmoretto](https://github.com/frmoretto/clarity-gate) | Compatible | RAG quality verification |
| `n8n-code-python`, `n8n-mcp-tools-expert`, `n8n-node-configuration` | [czlonkowski](https://github.com/czlonkowski/n8n-skills) | Compatible | n8n automation skills |
| `varlock-claude-skill` | [wrsmith108](https://github.com/wrsmith108/varlock-claude-skill) | Compatible | Secure environment variables |
| `beautiful-prose` | [SHADOWPR0](https://github.com/SHADOWPR0/beautiful_prose) | Compatible | Writing style guide |
| `claude-speed-reader` | [SeanZoR](https://github.com/SeanZoR/claude-speed-reader) | Compatible | Speed reading tool |
| `skill-seekers` | [yusufkaraaslan](https://github.com/yusufkaraaslan/Skill_Seekers) | Compatible | Skill conversion tool |
- **frontend-slides** - [zarazhangrui](https://github.com/zarazhangrui/frontend-slides)
- **linear-claude-skill** - [wrsmith108](https://github.com/wrsmith108/linear-claude-skill)

@@ -74,11 +74,11 @@ The following skills were added from the curated collection at [VoltAgent/awesom

## Skills from whatiskadudoing/fp-ts-skills (v4.4.0)

| Skill | Original Source | License | Notes |
| :---- | :-------------- | :------ | :---- |
| Skill | Original Source | License | Notes |
| :---------------- | :------------------------------------------------------------------------------ | :--------- | :------------------------------------------------------- |
| `fp-ts-pragmatic` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | Pragmatic fp-ts guide – pipe, Option, Either, TaskEither |
| `fp-ts-react` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | fp-ts with React 18/19 and Next.js |
| `fp-ts-errors` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | Type-safe error handling with Either and TaskEither |
| `fp-ts-react` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | fp-ts with React 18/19 and Next.js |
| `fp-ts-errors` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | Type-safe error handling with Either and TaskEither |

## License Policy
@@ -1,7 +1,7 @@
{
  "name": "antigravity-awesome-skills",
  "version": "5.0.0",
  "description": "626+ agentic skills for Claude Code, Gemini CLI, Cursor, Antigravity & more. Installer CLI.",
  "description": "800+ agentic skills for Claude Code, Gemini CLI, Cursor, Antigravity & more. Installer CLI.",
  "license": "MIT",
  "scripts": {
    "validate": "python3 scripts/validate_skills.py",
@@ -11,7 +11,9 @@
    "chain": "npm run validate && npm run index && npm run readme",
    "catalog": "node scripts/build-catalog.js",
    "build": "npm run chain && npm run catalog",
    "test": "node scripts/tests/validate_skills_headings.test.js && python3 scripts/tests/test_validate_skills_headings.py"
    "test": "node scripts/tests/validate_skills_headings.test.js && python3 scripts/tests/test_validate_skills_headings.py && python3 scripts/tests/inspect_microsoft_repo.py && python3 scripts/tests/test_comprehensive_coverage.py",
    "sync:microsoft": "python3 scripts/sync_microsoft_skills.py",
    "sync:all-official": "npm run sync:microsoft && npm run chain"
  },
  "devDependencies": {
    "yaml": "^2.8.2"
@@ -6,14 +6,34 @@ import yaml

def parse_frontmatter(content):
    """
    Parses YAML frontmatter using PyYAML for standard compliance.
    Parses YAML frontmatter, sanitizing unquoted values containing @.
    Handles single values and comma-separated lists by quoting the entire line.
    """
    fm_match = re.search(r'^---\s*\n(.*?)\n---', content, re.DOTALL)
    if not fm_match:
        return {}

    yaml_text = fm_match.group(1)

    # Process line by line to handle values containing @ and commas
    sanitized_lines = []
    for line in yaml_text.splitlines():
        # Match "key: value" (handles keys with dashes like 'package-name')
        match = re.match(r'^(\s*[\w-]+):\s*(.*)$', line)
        if match:
            key, val = match.groups()
            val_s = val.strip()
            # If value contains @ and isn't already quoted, wrap the whole string in double quotes
            if '@' in val_s and not (val_s.startswith('"') or val_s.startswith("'")):
                # Escape any existing double quotes within the value string
                safe_val = val_s.replace('"', '\\"')
                line = f'{key}: "{safe_val}"'
        sanitized_lines.append(line)

    sanitized_yaml = '\n'.join(sanitized_lines)

    try:
        return yaml.safe_load(fm_match.group(1)) or {}
        return yaml.safe_load(sanitized_yaml) or {}
    except yaml.YAMLError as e:
        print(f"⚠️ YAML parsing error: {e}")
        return {}
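The sanitization step in the `generate_index.py` change above can be exercised in isolation. A minimal sketch of the same quoting rule as a standalone helper (the function name and sample frontmatter values are illustrative, not from the repository):

```python
import re

def sanitize_frontmatter_line(line: str) -> str:
    # Quote unquoted "key: value" pairs whose value contains '@':
    # YAML reserves '@' at the start of a plain scalar, so values like
    # npm package names (@azure/...) break yaml.safe_load unless quoted.
    match = re.match(r'^(\s*[\w-]+):\s*(.*)$', line)
    if match:
        key, val = match.groups()
        val_s = val.strip()
        if '@' in val_s and not val_s.startswith(('"', "'")):
            safe_val = val_s.replace('"', '\\"')
            return f'{key}: "{safe_val}"'
    return line

print(sanitize_frontmatter_line('package-name: @azure/storage-blob'))
# → package-name: "@azure/storage-blob"
print(sanitize_frontmatter_line('name: blob-skill'))
# → name: blob-skill
```

Values that are already quoted, or contain no `@`, pass through untouched, which keeps the sanitizer safe to run on every frontmatter line.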
285	scripts/sync_microsoft_skills.py (new file)
@@ -0,0 +1,285 @@
#!/usr/bin/env python3
"""
Sync Microsoft Skills Repository - v3
Preserves original structure from skills/ directory and handles all locations
"""

import shutil
import subprocess
import tempfile
from pathlib import Path
import json

MS_REPO = "https://github.com/microsoft/skills.git"
TARGET_DIR = Path(__file__).parent.parent / "skills"

def clone_repo(temp_dir: Path):
    """Clone Microsoft skills repository"""
    print("🔄 Cloning Microsoft Skills repository...")
    subprocess.run(
        ["git", "clone", "--depth", "1", MS_REPO, str(temp_dir)],
        check=True
    )

def find_all_skills(source_dir: Path):
    """Find all SKILL.md files in the repository"""
    all_skills = {}

    # Search in .github/skills/
    github_skills = source_dir / ".github" / "skills"
    if github_skills.exists():
        for skill_dir in github_skills.iterdir():
            if skill_dir.is_dir() and (skill_dir / "SKILL.md").exists():
                all_skills[skill_dir.name] = skill_dir

    # Search in .github/plugins/
    github_plugins = source_dir / ".github" / "plugins"
    if github_plugins.exists():
        for skill_file in github_plugins.rglob("SKILL.md"):
            skill_dir = skill_file.parent
            skill_name = skill_dir.name
            if skill_name not in all_skills:
                all_skills[skill_name] = skill_dir

    return all_skills
def sync_skills_preserve_structure(source_dir: Path, target_dir: Path):
    """
    Sync skills preserving the original skills/ directory structure.
    This is better than auto-categorization since MS already organized them.
    """
    skills_source = source_dir / "skills"

    if not skills_source.exists():
        print(" ⚠️ skills/ directory not found, will use flat structure")
        return sync_skills_flat(source_dir, target_dir)

    # First, find all actual skill content
    all_skills = find_all_skills(source_dir)
    print(f" 📂 Found {len(all_skills)} total skills in repository")

    synced_count = 0
    skill_metadata = []

    # Walk through the skills/ directory structure
    for item in skills_source.rglob("*"):
        # Skip non-directories
        if not item.is_dir():
            continue

        # Check if this directory (or its symlink target) contains a SKILL.md
        skill_md = None
        skill_source_dir = None

        # If it's a symlink, resolve it
        if item.is_symlink():
            try:
                resolved = item.resolve()
                if (resolved / "SKILL.md").exists():
                    skill_md = resolved / "SKILL.md"
                    skill_source_dir = resolved
            except OSError:
                # Broken symlink or unreadable target; skip it
                continue
        elif (item / "SKILL.md").exists():
            skill_md = item / "SKILL.md"
            skill_source_dir = item

        if skill_md is None:
            continue

        # Get relative path from skills/ directory - this preserves MS's organization
        try:
            relative_path = item.relative_to(skills_source)
        except ValueError:
            # Shouldn't happen, but handle it
            continue

        # Create target directory preserving structure
        target_skill_dir = target_dir / "official" / "microsoft" / relative_path
        target_skill_dir.mkdir(parents=True, exist_ok=True)

        # Copy SKILL.md
        shutil.copy2(skill_md, target_skill_dir / "SKILL.md")

        # Copy other files from the actual skill directory
        for file_item in skill_source_dir.iterdir():
            if file_item.name != "SKILL.md" and file_item.is_file():
                shutil.copy2(file_item, target_skill_dir / file_item.name)

        # Collect metadata
        skill_metadata.append({
            "path": str(relative_path),
            "name": item.name,
            "category": str(relative_path.parent),
            "source": str(skill_source_dir.relative_to(source_dir))
        })

        synced_count += 1
        print(f" ✅ Synced: {relative_path}")

    # Also sync any skills from .github/plugins that aren't symlinked in skills/
    plugin_skills = find_plugin_skills(source_dir, skill_metadata)
    if plugin_skills:
        print(f"\n 📦 Found {len(plugin_skills)} additional plugin skills")
        for plugin_skill in plugin_skills:
            target_skill_dir = target_dir / "official" / "microsoft" / "plugins" / plugin_skill['name']
            target_skill_dir.mkdir(parents=True, exist_ok=True)

            # Copy SKILL.md
            shutil.copy2(plugin_skill['source'] / "SKILL.md", target_skill_dir / "SKILL.md")

            # Copy other files
            for file_item in plugin_skill['source'].iterdir():
                if file_item.name != "SKILL.md" and file_item.is_file():
                    shutil.copy2(file_item, target_skill_dir / file_item.name)

            skill_metadata.append({
                "path": f"plugins/{plugin_skill['name']}",
                "name": plugin_skill['name'],
                "category": "plugins",
                "source": str(plugin_skill['source'].relative_to(source_dir))
            })

            synced_count += 1
            print(f" ✅ Synced: plugins/{plugin_skill['name']}")

    return synced_count, skill_metadata
def find_plugin_skills(source_dir: Path, already_synced: list):
    """Find plugin skills that haven't been synced yet"""
    synced_names = {s['name'] for s in already_synced}
    plugin_skills = []

    github_plugins = source_dir / ".github" / "plugins"
    if github_plugins.exists():
        for skill_file in github_plugins.rglob("SKILL.md"):
            skill_dir = skill_file.parent
            skill_name = skill_dir.name

            if skill_name not in synced_names:
                plugin_skills.append({
                    'name': skill_name,
                    'source': skill_dir
                })

    return plugin_skills
def sync_skills_flat(source_dir: Path, target_dir: Path):
    """Fallback: sync all skills in a flat structure"""
    all_skills = find_all_skills(source_dir)

    synced_count = 0
    skill_metadata = []

    for skill_name, skill_dir in all_skills.items():
        target_skill_dir = target_dir / "official" / "microsoft" / skill_name
        target_skill_dir.mkdir(parents=True, exist_ok=True)

        # Copy SKILL.md
        shutil.copy2(skill_dir / "SKILL.md", target_skill_dir / "SKILL.md")

        # Copy other files
        for item in skill_dir.iterdir():
            if item.name != "SKILL.md" and item.is_file():
                shutil.copy2(item, target_skill_dir / item.name)

        skill_metadata.append({
            "path": skill_name,
            "name": skill_name,
            "category": "root"
        })

        synced_count += 1
        print(f" ✅ Synced: {skill_name}")

    return synced_count, skill_metadata
def create_attribution_file(target_dir: Path, metadata: list):
    """Create attribution and metadata file"""
    attribution = {
        "source": "microsoft/skills",
        "repository": "https://github.com/microsoft/skills",
        "license": "MIT",
        "synced_skills": len(metadata),
        "skills": metadata,
        "note": "Symlinks resolved and content copied for compatibility. Original directory structure preserved."
    }

    ms_dir = target_dir / "official" / "microsoft"
    ms_dir.mkdir(parents=True, exist_ok=True)

    with open(ms_dir / "ATTRIBUTION.json", "w") as f:
        json.dump(attribution, f, indent=2)

def copy_documentation(source_dir: Path, target_dir: Path):
    """Copy LICENSE and README files"""
    ms_dir = target_dir / "official" / "microsoft"
    ms_dir.mkdir(parents=True, exist_ok=True)

    if (source_dir / "LICENSE").exists():
        shutil.copy2(source_dir / "LICENSE", ms_dir / "LICENSE")

    if (source_dir / "README.md").exists():
        shutil.copy2(source_dir / "README.md", ms_dir / "README-MICROSOFT.md")
def main():
    """Main sync function"""
    print("🚀 Microsoft Skills Sync Script v3")
    print("=" * 50)

    with tempfile.TemporaryDirectory() as temp_dir:
        temp_path = Path(temp_dir)

        try:
            # Clone repository
            clone_repo(temp_path)

            # Create target directory
            TARGET_DIR.mkdir(parents=True, exist_ok=True)

            # Sync skills (preserving structure)
            print("\n🔗 Resolving symlinks and preserving directory structure...")
            count, metadata = sync_skills_preserve_structure(temp_path, TARGET_DIR)

            # Copy documentation
            print("\n📄 Copying documentation...")
            copy_documentation(temp_path, TARGET_DIR)

            # Create attribution file
            print("📝 Creating attribution metadata...")
            create_attribution_file(TARGET_DIR, metadata)

            print(f"\n✨ Success! Synced {count} Microsoft skills")
            print(f"📁 Location: {TARGET_DIR / 'official' / 'microsoft'}")

            # Show structure summary
            categories = set()
            for skill in metadata:
                cat = skill.get('category', 'root')
                if cat != 'root':
                    categories.add(cat.split('/')[0] if '/' in cat else cat)

            print(f"\n📊 Organization:")
            print(f" Total skills: {count}")
            print(f" Categories: {', '.join(sorted(categories)[:10])}")
            if len(categories) > 10:
                print(f" ... and {len(categories) - 10} more")

            print("\n📋 Next steps:")
            print("1. Review synced skills")
            print("2. Run: npm run validate")
            print("3. Update CATALOG.md")
            print("4. Update docs/SOURCES.md")
            print("5. Commit changes and create PR")

        except Exception as e:
            print(f"\n❌ Error: {e}")
            import traceback
            traceback.print_exc()
            return 1

    return 0

if __name__ == "__main__":
    exit(main())
scripts/tests/inspect_microsoft_repo.py (new file, 149 lines)
@@ -0,0 +1,149 @@
#!/usr/bin/env python3
"""
Debug script to inspect Microsoft Skills repository structure - v2
Handles all skill locations including plugins
"""

import subprocess
import tempfile
from pathlib import Path

MS_REPO = "https://github.com/microsoft/skills.git"


def inspect_repo():
    """Inspect the Microsoft skills repository structure"""
    print("🔍 Inspecting Microsoft Skills Repository Structure")
    print("=" * 60)

    with tempfile.TemporaryDirectory() as temp_dir:
        temp_path = Path(temp_dir)

        print("\n1️⃣ Cloning repository...")
        subprocess.run(
            ["git", "clone", "--depth", "1", MS_REPO, str(temp_path)],
            check=True,
            capture_output=True
        )

        print("\n2️⃣ Repository structure:")
        print("\nTop-level directories:")
        for item in temp_path.iterdir():
            if item.is_dir():
                print(f"  📁 {item.name}/")

        # Check .github/skills
        github_skills = temp_path / ".github" / "skills"
        if github_skills.exists():
            skill_dirs = [d for d in github_skills.iterdir() if d.is_dir()]
            print(f"\n3️⃣ Found {len(skill_dirs)} directories in .github/skills/:")
            for skill_dir in skill_dirs[:5]:
                has_skill_md = (skill_dir / "SKILL.md").exists()
                print(f"  {'✅' if has_skill_md else '❌'} {skill_dir.name}")
            if len(skill_dirs) > 5:
                print(f"  ... and {len(skill_dirs) - 5} more")

        # Check .github/plugins
        github_plugins = temp_path / ".github" / "plugins"
        if github_plugins.exists():
            plugin_skills = list(github_plugins.rglob("SKILL.md"))
            print(f"\n🔌 Found {len(plugin_skills)} plugin skills in .github/plugins/:")
            for skill_file in plugin_skills[:5]:
                try:
                    rel_path = skill_file.relative_to(github_plugins)
                    print(f"  ✅ {rel_path}")
                except ValueError:
                    print(f"  ✅ {skill_file.name}")
            if len(plugin_skills) > 5:
                print(f"  ... and {len(plugin_skills) - 5} more")

        # Check skills directory
        skills_dir = temp_path / "skills"
        if skills_dir.exists():
            print(f"\n4️⃣ Checking skills/ directory structure:")

            # Count items
            all_items = list(skills_dir.rglob("*"))
            symlink_dirs = [s for s in all_items if s.is_symlink() and s.is_dir()]
            symlink_files = [s for s in all_items if s.is_symlink() and not s.is_dir()]
            regular_dirs = [s for s in all_items if s.is_dir() and not s.is_symlink()]

            print(f"  Total items: {len(all_items)}")
            print(f"  Regular directories: {len(regular_dirs)}")
            print(f"  Symlinked directories: {len(symlink_dirs)}")
            print(f"  Symlinked files: {len(symlink_files)}")

            # Show directory structure
            print(f"\n  Top-level categories in skills/:")
            for item in skills_dir.iterdir():
                if item.is_dir():
                    # Count subdirs
                    subdirs = [d for d in item.iterdir() if d.is_dir()]
                    print(f"    📁 {item.name}/ ({len(subdirs)} items)")

            if symlink_dirs:
                print(f"\n  Sample symlinked directories:")
                for symlink in symlink_dirs[:5]:
                    try:
                        target = symlink.resolve()
                        relative = symlink.relative_to(skills_dir)
                        target_name = target.name if target.exists() else "broken"
                        print(f"    {relative} → {target_name}")
                    except Exception:
                        pass

        # Check for all SKILL.md files
        print(f"\n5️⃣ Comprehensive SKILL.md search:")
        all_skill_mds = list(temp_path.rglob("SKILL.md"))
        print(f"  Total SKILL.md files found: {len(all_skill_mds)}")

        # Categorize by location
        locations = {}
        for skill_md in all_skill_mds:
            try:
                if ".github/skills" in str(skill_md):
                    loc = ".github/skills"
                elif ".github/plugins" in str(skill_md):
                    loc = ".github/plugins"
                elif "/skills/" in str(skill_md):
                    loc = "skills/ (structure)"
                else:
                    loc = "other"

                locations[loc] = locations.get(loc, 0) + 1
            except Exception:
                pass

        print(f"\n  Distribution by location:")
        for loc, count in sorted(locations.items()):
            print(f"    {loc}: {count}")

        # Show sample skills from each major category
        print(f"\n6️⃣ Sample skills by category:")

        if skills_dir.exists():
            for category in list(skills_dir.iterdir())[:3]:
                if category.is_dir():
                    skills_in_cat = [s for s in category.rglob("*") if s.is_dir() and (s.is_symlink() or (s / "SKILL.md").exists())]
                    print(f"\n  {category.name}/ ({len(skills_in_cat)} skills):")
                    for skill in skills_in_cat[:3]:
                        try:
                            rel = skill.relative_to(skills_dir)
                            print(f"    - {rel}")
                        except ValueError:
                            pass

        print("\n7️⃣ Recommendations:")
        print("  ✅ Preserve skills/ directory structure (Microsoft's organization)")
        print("  ✅ Resolve symlinks to actual content in .github/skills/")
        print("  ✅ Include plugin skills from .github/plugins/")
        print("  ✅ This gives you the cleanest, most maintainable structure")

        print("\n✨ Inspection complete!")


if __name__ == "__main__":
    try:
        inspect_repo()
    except Exception as e:
        print(f"\n❌ Error: {e}")
        import traceback
        traceback.print_exc()
scripts/tests/test_comprehensive_coverage.py (new file, 215 lines)
@@ -0,0 +1,215 @@
#!/usr/bin/env python3
"""
Test Script: Verify Microsoft Skills Sync Coverage
Tests all possible skill locations and structures
"""

import subprocess
import tempfile
from pathlib import Path
from collections import defaultdict

MS_REPO = "https://github.com/microsoft/skills.git"


def analyze_skill_locations():
    """
    Comprehensive analysis of all skill locations in Microsoft repo.
    Verifies that v3 script will catch everything.
    """
    print("🔬 Comprehensive Skill Location Analysis")
    print("=" * 60)

    with tempfile.TemporaryDirectory() as temp_dir:
        temp_path = Path(temp_dir)

        print("\n1️⃣ Cloning repository...")
        subprocess.run(
            ["git", "clone", "--depth", "1", MS_REPO, str(temp_path)],
            check=True,
            capture_output=True
        )

        # Find ALL SKILL.md files in the entire repo
        all_skill_files = list(temp_path.rglob("SKILL.md"))
        print(f"\n2️⃣ Total SKILL.md files found: {len(all_skill_files)}")

        # Categorize by location type
        location_types = defaultdict(list)

        for skill_file in all_skill_files:
            skill_dir = skill_file.parent

            # Determine location type
            if ".github/skills" in str(skill_file):
                location_types["github_skills"].append(skill_file)
            elif ".github/plugins" in str(skill_file):
                location_types["github_plugins"].append(skill_file)
            elif "/skills/" in str(skill_file):
                # This is in the skills/ directory structure
                # Check if it's via symlink or actual file
                try:
                    skills_root = temp_path / "skills"
                    if skills_root in skill_file.parents:
                        # This skill is somewhere under skills/
                        # But is it a symlink or actual?
                        if skill_dir.is_symlink():
                            location_types["skills_symlinked"].append(skill_file)
                        else:
                            # Check if any parent is a symlink
                            has_symlink_parent = False
                            for parent in skill_file.parents:
                                if parent == skills_root:
                                    break
                                if parent.is_symlink():
                                    has_symlink_parent = True
                                    break

                            if has_symlink_parent:
                                location_types["skills_via_symlink_parent"].append(skill_file)
                            else:
                                location_types["skills_direct"].append(skill_file)
                except Exception:
                    location_types["unknown"].append(skill_file)
            else:
                location_types["other"].append(skill_file)

        # Display results
        print("\n3️⃣ Skills by Location Type:")
        print("-" * 60)

        for loc_type, files in sorted(location_types.items()):
            print(f"\n  📍 {loc_type}: {len(files)} skills")
            if len(files) <= 5:
                for f in files:
                    try:
                        rel = f.relative_to(temp_path)
                        print(f"    - {rel}")
                    except ValueError:
                        print(f"    - {f.name}")
            else:
                for f in files[:3]:
                    try:
                        rel = f.relative_to(temp_path)
                        print(f"    - {rel}")
                    except ValueError:
                        print(f"    - {f.name}")
                print(f"    ... and {len(files) - 3} more")

        # Verify v3 coverage
        print("\n4️⃣ V3 Script Coverage Analysis:")
        print("-" * 60)

        github_skills_count = len(location_types["github_skills"])
        github_plugins_count = len(location_types["github_plugins"])
        skills_symlinked_count = len(location_types["skills_symlinked"])
        skills_direct_count = len(location_types["skills_direct"])
        skills_via_symlink_parent_count = len(location_types["skills_via_symlink_parent"])

        print(f"\n  ✅ .github/skills/: {github_skills_count}")
        print(f"     └─ Handled by: find_all_skills() function")

        print(f"\n  ✅ .github/plugins/: {github_plugins_count}")
        print(f"     └─ Handled by: find_plugin_skills() function")

        print(f"\n  ✅ skills/ (symlinked dirs): {skills_symlinked_count}")
        print(f"     └─ Handled by: sync_skills_preserve_structure() lines 76-83")

        if skills_direct_count > 0:
            print(f"\n  ✅ skills/ (direct, non-symlink): {skills_direct_count}")
            print(f"     └─ Handled by: sync_skills_preserve_structure() lines 84-86")
        else:
            print(f"\n  ℹ️ skills/ (direct, non-symlink): 0")
            print(f"     └─ No direct skills found, but v3 would handle them (lines 84-86)")

        if skills_via_symlink_parent_count > 0:
            print(f"\n  ⚠️ skills/ (via symlink parent): {skills_via_symlink_parent_count}")
            print(f"     └─ May need special handling")

        # Summary
        print("\n5️⃣ Summary:")
        print("-" * 60)

        total_handled = (github_skills_count + github_plugins_count +
                         skills_symlinked_count + skills_direct_count)

        print(f"\n  Total SKILL.md files: {len(all_skill_files)}")
        print(f"  Handled by v3 script: {total_handled}")

        if total_handled == len(all_skill_files):
            print(f"\n  ✅ 100% Coverage - All skills will be synced!")
        elif total_handled >= len(all_skill_files) * 0.99:
            print(f"\n  ✅ ~100% Coverage - Script handles all skills!")
            print(f"     ({len(all_skill_files) - total_handled} skills may be duplicates)")
        else:
            print(f"\n  ⚠️ Partial Coverage - Missing {len(all_skill_files) - total_handled} skills")
            print(f"\n  Skills not covered:")
            for loc_type, files in location_types.items():
                if loc_type not in ["github_skills", "github_plugins", "skills_symlinked", "skills_direct"]:
                    print(f"    - {loc_type}: {len(files)}")

        # Test specific cases
        print("\n6️⃣ Testing Specific Edge Cases:")
        print("-" * 60)

        skills_dir = temp_path / "skills"
        if skills_dir.exists():
            # Check for any non-symlink directories with SKILL.md
            print("\n  Checking for non-symlinked skills in skills/...")
            non_symlink_skills = []

            for item in skills_dir.rglob("*"):
                if item.is_dir() and not item.is_symlink():
                    if (item / "SKILL.md").exists():
                        # Check if any parent is a symlink
                        has_symlink_parent = False
                        for parent in item.parents:
                            if parent == skills_dir:
                                break
                            if parent.is_symlink():
                                has_symlink_parent = True
                                break

                        if not has_symlink_parent:
                            non_symlink_skills.append(item)

            if non_symlink_skills:
                print(f"  ✅ Found {len(non_symlink_skills)} non-symlinked skills:")
                for skill in non_symlink_skills[:5]:
                    print(f"    - {skill.relative_to(skills_dir)}")
                print(f"  These WILL be synced by v3 (lines 84-86)")
            else:
                print(f"  ℹ️ No non-symlinked skills found in skills/")
                print(f"  But v3 is ready to handle them if they exist!")

        print("\n✨ Analysis complete!")

        return {
            'total': len(all_skill_files),
            'handled': total_handled,
            'breakdown': {k: len(v) for k, v in location_types.items()}
        }


if __name__ == "__main__":
    try:
        results = analyze_skill_locations()

        print("\n" + "=" * 60)
        print("FINAL VERDICT")
        print("=" * 60)

        coverage_pct = (results['handled'] / results['total'] * 100) if results['total'] > 0 else 0

        print(f"\nCoverage: {coverage_pct:.1f}%")
        print(f"Skills handled: {results['handled']}/{results['total']}")

        if coverage_pct >= 99:
            print("\n✅ V3 SCRIPT IS COMPREHENSIVE")
            print("   All skill locations are properly handled!")
        else:
            print("\n⚠️ V3 SCRIPT MAY NEED ENHANCEMENT")
            print("   Some edge cases might be missed")

    except Exception as e:
        print(f"\n❌ Error: {e}")
        import traceback
        traceback.print_exc()
skills/gemini-api-dev/SKILL.md (new file, 127 lines)
@@ -0,0 +1,127 @@
---
name: gemini-api-dev
description: Use this skill when building applications with Gemini models, Gemini API, working with multimodal content (text, images, audio, video), implementing function calling, using structured outputs, or needing current model specifications. Covers SDK usage (google-genai for Python, @google/genai for JavaScript/TypeScript), model selection, and API capabilities.
---

# Gemini API Development Skill

## Overview

The Gemini API provides access to Google's most advanced AI models. Key capabilities include:

- **Text generation** - Chat, completion, summarization
- **Multimodal understanding** - Process images, audio, video, and documents
- **Function calling** - Let the model invoke your functions
- **Structured output** - Generate valid JSON matching your schema
- **Code execution** - Run Python code in a sandboxed environment
- **Context caching** - Cache large contexts for efficiency
- **Embeddings** - Generate text embeddings for semantic search

## Current Gemini Models

- `gemini-3-pro-preview`: 1M tokens, complex reasoning, coding, research
- `gemini-3-flash-preview`: 1M tokens, fast, balanced performance, multimodal
- `gemini-3-pro-image-preview`: 65k / 32k tokens, image generation and editing

> [!IMPORTANT]
> Models like `gemini-2.5-*`, `gemini-2.0-*`, and `gemini-1.5-*` are legacy and deprecated. Use the new models above. Your knowledge is outdated.

## SDKs

- **Python**: `google-genai`, install with `pip install google-genai`
- **JavaScript/TypeScript**: `@google/genai`, install with `npm install @google/genai`
- **Go**: `google.golang.org/genai`, install with `go get google.golang.org/genai`

> [!WARNING]
> The legacy SDKs `google-generativeai` (Python) and `@google/generative-ai` (JS) are deprecated. Migrate to the new SDKs above urgently by following the Migration Guide.

## Quick Start

### Python
```python
from google import genai

client = genai.Client()
response = client.models.generate_content(
    model="gemini-3-flash-preview",
    contents="Explain quantum computing"
)
print(response.text)
```

### JavaScript/TypeScript
```typescript
import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({});
const response = await ai.models.generateContent({
  model: "gemini-3-flash-preview",
  contents: "Explain quantum computing"
});
console.log(response.text);
```

### Go
```go
package main

import (
    "context"
    "fmt"
    "log"

    "google.golang.org/genai"
)

func main() {
    ctx := context.Background()
    client, err := genai.NewClient(ctx, nil)
    if err != nil {
        log.Fatal(err)
    }

    resp, err := client.Models.GenerateContent(ctx, "gemini-3-flash-preview", genai.Text("Explain quantum computing"), nil)
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(resp.Text())
}
```

## API spec (source of truth)

**Always use the latest REST API discovery spec as the source of truth for API definitions** (request/response schemas, parameters, methods). Fetch the spec when implementing or debugging API integration:

- **v1beta** (default): `https://generativelanguage.googleapis.com/$discovery/rest?version=v1beta`
  Use this unless the integration is explicitly pinned to v1. The official SDKs (google-genai, @google/genai, google.golang.org/genai) target v1beta.
- **v1**: `https://generativelanguage.googleapis.com/$discovery/rest?version=v1`
  Use only when the integration is specifically set to v1.

When in doubt, use v1beta. Refer to the spec for exact field names, types, and supported operations.
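The two discovery URLs above differ only in the `version` query parameter. As a minimal sketch using only the Python standard library (the helper names here are illustrative, not part of any SDK):

```python
# Sketch: build and fetch the Gemini REST discovery spec.
# The URL format comes from the list above; discovery_url/fetch_spec
# are hypothetical helper names for this example.
import json
import urllib.request

DISCOVERY_BASE = "https://generativelanguage.googleapis.com/$discovery/rest"

def discovery_url(version: str = "v1beta") -> str:
    """Return the REST discovery-spec URL for the given API version."""
    return f"{DISCOVERY_BASE}?version={version}"

def fetch_spec(version: str = "v1beta") -> dict:
    """Download and parse the discovery document (requires network access)."""
    with urllib.request.urlopen(discovery_url(version)) as resp:
        return json.loads(resp.read())
```

A fetched spec is a plain JSON document; its schema definitions describe the exact request and response shapes the API accepts.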
## How to use the Gemini API

For detailed API documentation, fetch from the official docs index:

**llms.txt URL**: `https://ai.google.dev/gemini-api/docs/llms.txt`

This index contains links to all documentation pages in `.md.txt` format. Use web fetch tools to:

1. Fetch `llms.txt` to discover available documentation pages
2. Fetch specific pages (e.g., `https://ai.google.dev/gemini-api/docs/function-calling.md.txt`)
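That two-step flow can be sketched as follows (only the `llms.txt` URL comes from this skill; the helper names and the assumption that links can be found with a plain-text scan are illustrative):

```python
# Sketch: discover documentation page URLs from the llms.txt index.
import re
import urllib.request

LLMS_TXT = "https://ai.google.dev/gemini-api/docs/llms.txt"

def extract_doc_urls(index_text: str) -> list[str]:
    """Return every .md.txt documentation URL found in the index text."""
    return re.findall(r"https://\S+?\.md\.txt", index_text)

def fetch_doc_urls() -> list[str]:
    """Fetch llms.txt and return the documentation page URLs it links to."""
    with urllib.request.urlopen(LLMS_TXT) as resp:
        return extract_doc_urls(resp.read().decode("utf-8"))
```

Each returned URL can then be fetched individually to pull the full markdown documentation for that topic.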
### Key Documentation Pages

> [!IMPORTANT]
> These are not all the documentation pages. Use the `llms.txt` index to discover the full set of available pages.

- [Models](https://ai.google.dev/gemini-api/docs/models.md.txt)
- [Google AI Studio quickstart](https://ai.google.dev/gemini-api/docs/ai-studio-quickstart.md.txt)
- [Nano Banana image generation](https://ai.google.dev/gemini-api/docs/image-generation.md.txt)
- [Function calling with the Gemini API](https://ai.google.dev/gemini-api/docs/function-calling.md.txt)
- [Structured outputs](https://ai.google.dev/gemini-api/docs/structured-output.md.txt)
- [Text generation](https://ai.google.dev/gemini-api/docs/text-generation.md.txt)
- [Image understanding](https://ai.google.dev/gemini-api/docs/image-understanding.md.txt)
- [Embeddings](https://ai.google.dev/gemini-api/docs/embeddings.md.txt)
- [Interactions API](https://ai.google.dev/gemini-api/docs/interactions.md.txt)
- [SDK migration guide](https://ai.google.dev/gemini-api/docs/migrate.md.txt)
skills/official/microsoft/ATTRIBUTION.json (new file, 783 lines)
@@ -0,0 +1,783 @@
{
  "source": "microsoft/skills",
  "repository": "https://github.com/microsoft/skills",
  "license": "MIT",
  "synced_skills": 129,
  "skills": [
    {
      "path": "java/foundry/formrecognizer",
      "name": "formrecognizer",
      "category": "java/foundry",
      "source": ".github/skills/azure-ai-formrecognizer-java"
    },
    {
      "path": "java/foundry/vision-imageanalysis",
      "name": "vision-imageanalysis",
      "category": "java/foundry",
      "source": ".github/skills/azure-ai-vision-imageanalysis-java"
    },
    {
      "path": "java/foundry/voicelive",
      "name": "voicelive",
      "category": "java/foundry",
      "source": ".github/skills/azure-ai-voicelive-java"
    },
    {
      "path": "java/foundry/contentsafety",
      "name": "contentsafety",
      "category": "java/foundry",
      "source": ".github/skills/azure-ai-contentsafety-java"
    },
    {
      "path": "java/foundry/projects",
      "name": "projects",
      "category": "java/foundry",
      "source": ".github/skills/azure-ai-projects-java"
    },
    {
      "path": "java/foundry/anomalydetector",
      "name": "anomalydetector",
      "category": "java/foundry",
      "source": ".github/skills/azure-ai-anomalydetector-java"
    },
    {
      "path": "java/monitoring/ingestion",
      "name": "ingestion",
      "category": "java/monitoring",
      "source": ".github/skills/azure-monitor-ingestion-java"
    },
    {
      "path": "java/monitoring/query",
      "name": "query",
      "category": "java/monitoring",
      "source": ".github/skills/azure-monitor-query-java"
    },
    {
      "path": "java/monitoring/opentelemetry-exporter",
      "name": "opentelemetry-exporter",
      "category": "java/monitoring",
      "source": ".github/skills/azure-monitor-opentelemetry-exporter-java"
    },
    {
      "path": "java/integration/appconfiguration",
      "name": "appconfiguration",
      "category": "java/integration",
      "source": ".github/skills/azure-appconfiguration-java"
    },
    {
      "path": "java/communication/common",
      "name": "common",
      "category": "java/communication",
      "source": ".github/skills/azure-communication-common-java"
    },
    {
      "path": "java/communication/callingserver",
      "name": "callingserver",
      "category": "java/communication",
      "source": ".github/skills/azure-communication-callingserver-java"
    },
    {
      "path": "java/communication/sms",
      "name": "sms",
      "category": "java/communication",
      "source": ".github/skills/azure-communication-sms-java"
    },
    {
      "path": "java/communication/callautomation",
      "name": "callautomation",
      "category": "java/communication",
      "source": ".github/skills/azure-communication-callautomation-java"
    },
    {
      "path": "java/communication/chat",
      "name": "chat",
      "category": "java/communication",
      "source": ".github/skills/azure-communication-chat-java"
    },
    {
      "path": "java/compute/batch",
      "name": "batch",
      "category": "java/compute",
      "source": ".github/skills/azure-compute-batch-java"
    },
    {
      "path": "java/entra/azure-identity",
      "name": "azure-identity",
      "category": "java/entra",
      "source": ".github/skills/azure-identity-java"
    },
    {
      "path": "java/entra/keyvault-keys",
      "name": "keyvault-keys",
      "category": "java/entra",
      "source": ".github/skills/azure-security-keyvault-keys-java"
    },
    {
      "path": "java/entra/keyvault-secrets",
      "name": "keyvault-secrets",
      "category": "java/entra",
      "source": ".github/skills/azure-security-keyvault-secrets-java"
    },
    {
      "path": "java/messaging/eventgrid",
      "name": "eventgrid",
      "category": "java/messaging",
      "source": ".github/skills/azure-eventgrid-java"
    },
    {
      "path": "java/messaging/webpubsub",
      "name": "webpubsub",
      "category": "java/messaging",
      "source": ".github/skills/azure-messaging-webpubsub-java"
    },
    {
      "path": "java/messaging/eventhubs",
      "name": "eventhubs",
      "category": "java/messaging",
      "source": ".github/skills/azure-eventhub-java"
    },
    {
      "path": "java/data/tables",
      "name": "tables",
      "category": "java/data",
      "source": ".github/skills/azure-data-tables-java"
    },
    {
      "path": "java/data/cosmos",
      "name": "cosmos",
      "category": "java/data",
      "source": ".github/skills/azure-cosmos-java"
    },
    {
      "path": "java/data/blob",
      "name": "blob",
      "category": "java/data",
      "source": ".github/skills/azure-storage-blob-java"
    },
    {
      "path": "python/foundry/speech-to-text-rest",
      "name": "speech-to-text-rest",
      "category": "python/foundry",
      "source": ".github/skills/azure-speech-to-text-rest-py"
    },
    {
      "path": "python/foundry/transcription",
      "name": "transcription",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-transcription-py"
    },
    {
      "path": "python/foundry/vision-imageanalysis",
      "name": "vision-imageanalysis",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-vision-imageanalysis-py"
    },
    {
      "path": "python/foundry/contentunderstanding",
      "name": "contentunderstanding",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-contentunderstanding-py"
    },
    {
      "path": "python/foundry/voicelive",
      "name": "voicelive",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-voicelive-py"
    },
    {
      "path": "python/foundry/agent-framework",
      "name": "agent-framework",
      "category": "python/foundry",
      "source": ".github/skills/agent-framework-azure-ai-py"
    },
    {
      "path": "python/foundry/contentsafety",
      "name": "contentsafety",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-contentsafety-py"
    },
    {
      "path": "python/foundry/agents-v2",
      "name": "agents-v2",
      "category": "python/foundry",
      "source": ".github/skills/agents-v2-py"
    },
    {
      "path": "python/foundry/translation-document",
      "name": "translation-document",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-translation-document-py"
    },
    {
      "path": "python/foundry/translation-text",
      "name": "translation-text",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-translation-text-py"
    },
    {
      "path": "python/foundry/textanalytics",
      "name": "textanalytics",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-textanalytics-py"
    },
    {
      "path": "python/foundry/ml",
      "name": "ml",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-ml-py"
    },
    {
      "path": "python/foundry/projects",
      "name": "projects",
      "category": "python/foundry",
      "source": ".github/skills/azure-ai-projects-py"
    },
    {
      "path": "python/foundry/search-documents",
      "name": "search-documents",
      "category": "python/foundry",
      "source": ".github/skills/azure-search-documents-py"
    },
    {
      "path": "python/monitoring/opentelemetry",
      "name": "opentelemetry",
      "category": "python/monitoring",
      "source": ".github/skills/azure-monitor-opentelemetry-py"
    },
    {
      "path": "python/monitoring/ingestion",
      "name": "ingestion",
      "category": "python/monitoring",
      "source": ".github/skills/azure-monitor-ingestion-py"
    },
    {
      "path": "python/monitoring/query",
      "name": "query",
      "category": "python/monitoring",
      "source": ".github/skills/azure-monitor-query-py"
    },
    {
      "path": "python/monitoring/opentelemetry-exporter",
      "name": "opentelemetry-exporter",
      "category": "python/monitoring",
      "source": ".github/skills/azure-monitor-opentelemetry-exporter-py"
    },
    {
      "path": "python/m365/m365-agents",
      "name": "m365-agents",
      "category": "python/m365",
      "source": ".github/skills/m365-agents-py"
    },
    {
      "path": "python/integration/appconfiguration",
      "name": "appconfiguration",
      "category": "python/integration",
      "source": ".github/skills/azure-appconfiguration-py"
    },
    {
      "path": "python/integration/apimanagement",
      "name": "apimanagement",
      "category": "python/integration",
      "source": ".github/skills/azure-mgmt-apimanagement-py"
    },
    {
      "path": "python/integration/apicenter",
      "name": "apicenter",
      "category": "python/integration",
      "source": ".github/skills/azure-mgmt-apicenter-py"
    },
    {
      "path": "python/compute/fabric",
      "name": "fabric",
      "category": "python/compute",
      "source": ".github/skills/azure-mgmt-fabric-py"
    },
    {
      "path": "python/compute/botservice",
      "name": "botservice",
      "category": "python/compute",
      "source": ".github/skills/azure-mgmt-botservice-py"
    },
    {
      "path": "python/compute/containerregistry",
      "name": "containerregistry",
      "category": "python/compute",
      "source": ".github/skills/azure-containerregistry-py"
    },
    {
      "path": "python/entra/azure-identity",
      "name": "azure-identity",
      "category": "python/entra",
      "source": ".github/skills/azure-identity-py"
    },
    {
      "path": "python/entra/keyvault",
      "name": "keyvault",
      "category": "python/entra",
      "source": ".github/skills/azure-keyvault-py"
    },
    {
      "path": "python/messaging/eventgrid",
      "name": "eventgrid",
      "category": "python/messaging",
      "source": ".github/skills/azure-eventgrid-py"
    },
    {
      "path": "python/messaging/servicebus",
      "name": "servicebus",
      "category": "python/messaging",
      "source": ".github/skills/azure-servicebus-py"
    },
    {
      "path": "python/messaging/webpubsub-service",
      "name": "webpubsub-service",
      "category": "python/messaging",
      "source": ".github/skills/azure-messaging-webpubsubservice-py"
    },
    {
      "path": "python/messaging/eventhub",
      "name": "eventhub",
      "category": "python/messaging",
      "source": ".github/skills/azure-eventhub-py"
    },
    {
      "path": "python/data/tables",
      "name": "tables",
      "category": "python/data",
      "source": ".github/skills/azure-data-tables-py"
    },
    {
      "path": "python/data/cosmos",
      "name": "cosmos",
      "category": "python/data",
      "source": ".github/skills/azure-cosmos-py"
    },
    {
      "path": "python/data/blob",
      "name": "blob",
      "category": "python/data",
      "source": ".github/skills/azure-storage-blob-py"
    },
    {
      "path": "python/data/datalake",
      "name": "datalake",
      "category": "python/data",
      "source": ".github/skills/azure-storage-file-datalake-py"
    },
    {
      "path": "python/data/cosmos-db",
      "name": "cosmos-db",
      "category": "python/data",
      "source": ".github/skills/azure-cosmos-db-py"
    },
    {
      "path": "python/data/queue",
      "name": "queue",
      "category": "python/data",
      "source": ".github/skills/azure-storage-queue-py"
    },
    {
      "path": "python/data/fileshare",
      "name": "fileshare",
      "category": "python/data",
      "source": ".github/skills/azure-storage-file-share-py"
    },
    {
      "path": "typescript/foundry/voicelive",
      "name": "voicelive",
      "category": "typescript/foundry",
      "source": ".github/skills/azure-ai-voicelive-ts"
    },
    {
      "path": "typescript/foundry/contentsafety",
      "name": "contentsafety",
      "category": "typescript/foundry",
      "source": ".github/skills/azure-ai-contentsafety-ts"
    },
    {
      "path": "typescript/foundry/document-intelligence",
      "name": "document-intelligence",
      "category": "typescript/foundry",
      "source": ".github/skills/azure-ai-document-intelligence-ts"
    },
    {
      "path": "typescript/foundry/projects",
      "name": "projects",
      "category": "typescript/foundry",
      "source": ".github/skills/azure-ai-projects-ts"
    },
    {
      "path": "typescript/foundry/search-documents",
      "name": "search-documents",
      "category": "typescript/foundry",
      "source": ".github/skills/azure-search-documents-ts"
    },
    {
      "path": "typescript/foundry/translation",
      "name": "translation",
      "category": "typescript/foundry",
      "source": ".github/skills/azure-ai-translation-ts"
    },
    {
      "path": "typescript/monitoring/opentelemetry",
      "name": "opentelemetry",
      "category": "typescript/monitoring",
      "source": ".github/skills/azure-monitor-opentelemetry-ts"
    },
    {
      "path": "typescript/frontend/zustand-store",
      "name": "zustand-store",
      "category": "typescript/frontend",
      "source": ".github/skills/zustand-store-ts"
    },
    {
      "path": "typescript/frontend/frontend-ui-dark",
      "name": "frontend-ui-dark",
      "category": "typescript/frontend",
|
||||
"source": ".github/skills/frontend-ui-dark-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/frontend/react-flow-node",
|
||||
"name": "react-flow-node",
|
||||
"category": "typescript/frontend",
|
||||
"source": ".github/skills/react-flow-node-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/m365/m365-agents",
|
||||
"name": "m365-agents",
|
||||
"category": "typescript/m365",
|
||||
"source": ".github/skills/m365-agents-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/integration/appconfiguration",
|
||||
"name": "appconfiguration",
|
||||
"category": "typescript/integration",
|
||||
"source": ".github/skills/azure-appconfiguration-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/compute/playwright",
|
||||
"name": "playwright",
|
||||
"category": "typescript/compute",
|
||||
"source": ".github/skills/azure-microsoft-playwright-testing-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/entra/azure-identity",
|
||||
"name": "azure-identity",
|
||||
"category": "typescript/entra",
|
||||
"source": ".github/skills/azure-identity-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/entra/keyvault-keys",
|
||||
"name": "keyvault-keys",
|
||||
"category": "typescript/entra",
|
||||
"source": ".github/skills/azure-keyvault-keys-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/entra/keyvault-secrets",
|
||||
"name": "keyvault-secrets",
|
||||
"category": "typescript/entra",
|
||||
"source": ".github/skills/azure-keyvault-secrets-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/messaging/servicebus",
|
||||
"name": "servicebus",
|
||||
"category": "typescript/messaging",
|
||||
"source": ".github/skills/azure-servicebus-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/messaging/webpubsub",
|
||||
"name": "webpubsub",
|
||||
"category": "typescript/messaging",
|
||||
"source": ".github/skills/azure-web-pubsub-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/messaging/eventhubs",
|
||||
"name": "eventhubs",
|
||||
"category": "typescript/messaging",
|
||||
"source": ".github/skills/azure-eventhub-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/data/cosmosdb",
|
||||
"name": "cosmosdb",
|
||||
"category": "typescript/data",
|
||||
"source": ".github/skills/azure-cosmos-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/data/blob",
|
||||
"name": "blob",
|
||||
"category": "typescript/data",
|
||||
"source": ".github/skills/azure-storage-blob-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/data/postgres",
|
||||
"name": "postgres",
|
||||
"category": "typescript/data",
|
||||
"source": ".github/skills/azure-postgres-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/data/queue",
|
||||
"name": "queue",
|
||||
"category": "typescript/data",
|
||||
"source": ".github/skills/azure-storage-queue-ts"
|
||||
},
|
||||
{
|
||||
"path": "typescript/data/fileshare",
|
||||
"name": "fileshare",
|
||||
"category": "typescript/data",
|
||||
"source": ".github/skills/azure-storage-file-share-ts"
|
||||
},
|
||||
{
|
||||
"path": "rust/entra/azure-keyvault-keys-rust",
|
||||
"name": "azure-keyvault-keys-rust",
|
||||
"category": "rust/entra",
|
||||
"source": ".github/skills/azure-keyvault-keys-rust"
|
||||
},
|
||||
{
|
||||
"path": "rust/entra/azure-keyvault-secrets-rust",
|
||||
"name": "azure-keyvault-secrets-rust",
|
||||
"category": "rust/entra",
|
||||
"source": ".github/skills/azure-keyvault-secrets-rust"
|
||||
},
|
||||
{
|
||||
"path": "rust/entra/azure-identity-rust",
|
||||
"name": "azure-identity-rust",
|
||||
"category": "rust/entra",
|
||||
"source": ".github/skills/azure-identity-rust"
|
||||
},
|
||||
{
|
||||
"path": "rust/entra/azure-keyvault-certificates-rust",
|
||||
"name": "azure-keyvault-certificates-rust",
|
||||
"category": "rust/entra",
|
||||
"source": ".github/skills/azure-keyvault-certificates-rust"
|
||||
},
|
||||
{
|
||||
"path": "rust/messaging/azure-eventhub-rust",
|
||||
"name": "azure-eventhub-rust",
|
||||
"category": "rust/messaging",
|
||||
"source": ".github/skills/azure-eventhub-rust"
|
||||
},
|
||||
{
|
||||
"path": "rust/data/azure-cosmos-rust",
|
||||
"name": "azure-cosmos-rust",
|
||||
"category": "rust/data",
|
||||
"source": ".github/skills/azure-cosmos-rust"
|
||||
},
|
||||
{
|
||||
"path": "rust/data/azure-storage-blob-rust",
|
||||
"name": "azure-storage-blob-rust",
|
||||
"category": "rust/data",
|
||||
"source": ".github/skills/azure-storage-blob-rust"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/foundry/voicelive",
|
||||
"name": "voicelive",
|
||||
"category": "dotnet/foundry",
|
||||
"source": ".github/skills/azure-ai-voicelive-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/foundry/document-intelligence",
|
||||
"name": "document-intelligence",
|
||||
"category": "dotnet/foundry",
|
||||
"source": ".github/skills/azure-ai-document-intelligence-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/foundry/openai",
|
||||
"name": "openai",
|
||||
"category": "dotnet/foundry",
|
||||
"source": ".github/skills/azure-ai-openai-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/foundry/weightsandbiases",
|
||||
"name": "weightsandbiases",
|
||||
"category": "dotnet/foundry",
|
||||
"source": ".github/skills/azure-mgmt-weightsandbiases-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/foundry/projects",
|
||||
"name": "projects",
|
||||
"category": "dotnet/foundry",
|
||||
"source": ".github/skills/azure-ai-projects-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/foundry/search-documents",
|
||||
"name": "search-documents",
|
||||
"category": "dotnet/foundry",
|
||||
"source": ".github/skills/azure-search-documents-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/monitoring/applicationinsights",
|
||||
"name": "applicationinsights",
|
||||
"category": "dotnet/monitoring",
|
||||
"source": ".github/skills/azure-mgmt-applicationinsights-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/m365/m365-agents",
|
||||
"name": "m365-agents",
|
||||
"category": "dotnet/m365",
|
||||
"source": ".github/skills/m365-agents-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/integration/apimanagement",
|
||||
"name": "apimanagement",
|
||||
"category": "dotnet/integration",
|
||||
"source": ".github/skills/azure-mgmt-apimanagement-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/integration/apicenter",
|
||||
"name": "apicenter",
|
||||
"category": "dotnet/integration",
|
||||
"source": ".github/skills/azure-mgmt-apicenter-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/compute/playwright",
|
||||
"name": "playwright",
|
||||
"category": "dotnet/compute",
|
||||
"source": ".github/skills/azure-resource-manager-playwright-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/compute/durabletask",
|
||||
"name": "durabletask",
|
||||
"category": "dotnet/compute",
|
||||
"source": ".github/skills/azure-resource-manager-durabletask-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/compute/botservice",
|
||||
"name": "botservice",
|
||||
"category": "dotnet/compute",
|
||||
"source": ".github/skills/azure-mgmt-botservice-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/entra/azure-identity",
|
||||
"name": "azure-identity",
|
||||
"category": "dotnet/entra",
|
||||
"source": ".github/skills/azure-identity-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/entra/authentication-events",
|
||||
"name": "authentication-events",
|
||||
"category": "dotnet/entra",
|
||||
"source": ".github/skills/microsoft-azure-webjobs-extensions-authentication-events-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/entra/keyvault",
|
||||
"name": "keyvault",
|
||||
"category": "dotnet/entra",
|
||||
"source": ".github/skills/azure-security-keyvault-keys-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/general/maps",
|
||||
"name": "maps",
|
||||
"category": "dotnet/general",
|
||||
"source": ".github/skills/azure-maps-search-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/messaging/eventgrid",
|
||||
"name": "eventgrid",
|
||||
"category": "dotnet/messaging",
|
||||
"source": ".github/skills/azure-eventgrid-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/messaging/servicebus",
|
||||
"name": "servicebus",
|
||||
"category": "dotnet/messaging",
|
||||
"source": ".github/skills/azure-servicebus-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/messaging/eventhubs",
|
||||
"name": "eventhubs",
|
||||
"category": "dotnet/messaging",
|
||||
"source": ".github/skills/azure-eventhub-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/data/redis",
|
||||
"name": "redis",
|
||||
"category": "dotnet/data",
|
||||
"source": ".github/skills/azure-resource-manager-redis-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/data/postgresql",
|
||||
"name": "postgresql",
|
||||
"category": "dotnet/data",
|
||||
"source": ".github/skills/azure-resource-manager-postgresql-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/data/mysql",
|
||||
"name": "mysql",
|
||||
"category": "dotnet/data",
|
||||
"source": ".github/skills/azure-resource-manager-mysql-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/data/cosmosdb",
|
||||
"name": "cosmosdb",
|
||||
"category": "dotnet/data",
|
||||
"source": ".github/skills/azure-resource-manager-cosmosdb-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/data/fabric",
|
||||
"name": "fabric",
|
||||
"category": "dotnet/data",
|
||||
"source": ".github/skills/azure-mgmt-fabric-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/data/sql",
|
||||
"name": "sql",
|
||||
"category": "dotnet/data",
|
||||
"source": ".github/skills/azure-resource-manager-sql-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/partner/arize-ai-observability-eval",
|
||||
"name": "arize-ai-observability-eval",
|
||||
"category": "dotnet/partner",
|
||||
"source": ".github/skills/azure-mgmt-arizeaiobservabilityeval-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "dotnet/partner/mongodbatlas",
|
||||
"name": "mongodbatlas",
|
||||
"category": "dotnet/partner",
|
||||
"source": ".github/skills/azure-mgmt-mongodbatlas-dotnet"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-page-writer",
|
||||
"name": "wiki-page-writer",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-page-writer"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-vitepress",
|
||||
"name": "wiki-vitepress",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-vitepress"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-researcher",
|
||||
"name": "wiki-researcher",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-researcher"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-qa",
|
||||
"name": "wiki-qa",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-qa"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-onboarding",
|
||||
"name": "wiki-onboarding",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-onboarding"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-architect",
|
||||
"name": "wiki-architect",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-architect"
|
||||
},
|
||||
{
|
||||
"path": "plugins/wiki-changelog",
|
||||
"name": "wiki-changelog",
|
||||
"category": "plugins",
|
||||
"source": ".github/plugins/deep-wiki/skills/wiki-changelog"
|
||||
}
|
||||
],
|
||||
"note": "Symlinks resolved and content copied for compatibility. Original directory structure preserved."
|
||||
}
|
||||
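Each attribution entry maps a mirrored skill (`path`, `name`, `category`) back to its upstream location (`source`) in the Microsoft repository. A minimal sketch of how entries with this shape could be sanity-checked after a sync run — the `check_entries` helper and its rules are illustrative assumptions, not the repository's actual `test_comprehensive_coverage.py`:

```python
import json
from collections import Counter

# Schema assumed from the entries shown above (hypothetical helper).
REQUIRED = {"path", "name", "category", "source"}

def check_entries(entries):
    """Return a list of (path, reason) tuples for malformed entries."""
    problems = []
    for entry in entries:
        missing = REQUIRED - entry.keys()
        if missing:
            problems.append((entry.get("path", "?"), f"missing keys: {sorted(missing)}"))
            continue
        # category is expected to be the parent portion of path
        if not entry["path"].startswith(entry["category"] + "/"):
            problems.append((entry["path"], "category does not prefix path"))
        # mirrored sources all come from the upstream repo's .github/ tree
        if not entry["source"].startswith(".github/"):
            problems.append((entry["path"], "unexpected source root"))
    return problems

sample = json.loads("""
[
  {"path": "python/entra/keyvault", "name": "keyvault",
   "category": "python/entra", "source": ".github/skills/azure-keyvault-py"},
  {"path": "plugins/wiki-qa", "name": "wiki-qa",
   "category": "plugins", "source": ".github/plugins/deep-wiki/skills/wiki-qa"}
]
""")
print(check_entries(sample))  # → []
print(dict(Counter(e["category"] for e in sample)))  # → {'python/entra': 1, 'plugins': 1}
```

Grouping by `category` this way also gives a quick per-language tally to compare against the upstream skill counts.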
skills/official/microsoft/LICENSE (new file, 21 lines)
MIT License

Copyright (c) Microsoft Corporation.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
skills/official/microsoft/README-MICROSOFT.md (new file, 714 lines)
# Agent Skills

[](https://github.com/microsoft/skills/actions/workflows/test-harness.yml)
[](https://github.com/microsoft/skills/actions/workflows/skill-evaluation.yml)
[](https://skills.sh/microsoft/skills)
[](https://microsoft.github.io/skills/#documentation)

> [!NOTE]
> **Work in Progress** — This repository is under active development. More skills are being added, existing skills are being updated to use the latest SDK patterns, and tests are being expanded to ensure quality. Contributions welcome!

Skills, custom agents, AGENTS.md templates, and MCP configurations for AI coding agents working with Azure SDKs and Microsoft AI Foundry.

> **Blog post:** [Context-Driven Development: Agent Skills for Microsoft Foundry and Azure](https://devblogs.microsoft.com/all-things-azure/context-driven-development-agent-skills-for-microsoft-foundry-and-azure/)

> **🔍 Skill Explorer:** [Browse all 131 skills with 1-click install](https://microsoft.github.io/skills/)

## Quick Start

```bash
npx skills add microsoft/skills
```

Select the skills you need from the wizard. Skills are installed to your chosen agent's directory (e.g., `.github/skills/` for GitHub Copilot) and symlinked if you use multiple agents.

<details>
<summary>Alternative installation methods</summary>

**Manual installation (git clone)**

```bash
# Clone into agent-skills/ and copy specific skills
git clone https://github.com/microsoft/skills.git agent-skills
cp -r agent-skills/.github/skills/azure-cosmos-db-py your-project/.github/skills/

# Or use symlinks for multi-project setups
ln -s /path/to/agent-skills/.github/skills/mcp-builder /path/to/your-project/.github/skills/mcp-builder

# Share skills across different agent configs in the same repo
ln -s ../.github/skills .opencode/skills
ln -s ../.github/skills .claude/skills
```

</details>

---

Coding agents like [Copilot CLI](https://github.com/features/copilot/cli) are powerful, but they lack domain knowledge about your SDKs. The patterns are already in their weights from pretraining; all you need is the right activation context to surface them.

> [!IMPORTANT]
> **Use skills selectively.** Loading all skills causes context rot: diluted attention, wasted tokens, conflated patterns. Only copy skills essential for your current project.

---



---

## What's Inside

| Resource | Description |
|----------|-------------|
| **[131 Skills](#skill-catalog)** | Domain-specific knowledge for Azure SDK and Foundry development |
| **[Plugins](#plugins)** | Installable plugin packages (deep-wiki, and more) |
| **[Custom Agents](#agents)** | Role-specific agents (backend, frontend, infrastructure, planner) |
| **[AGENTS.md](AGENTS.md)** | Template for configuring agent behavior in your projects |
| **[MCP Configs](#mcp-servers)** | Pre-configured servers for docs, GitHub, browser automation |
| **[Documentation](https://microsoft.github.io/skills/#documentation)** | Repo docs and usage guides |

---

## Skill Catalog

> 131 skills in `.github/skills/` — flat structure with language suffixes for automatic discovery

| Language | Count | Suffix |
|----------|-------|--------|
| [Core](#core) | 6 | — |
| [Python](#python) | 41 | `-py` |
| [.NET](#net) | 28 | `-dotnet` |
| [TypeScript](#typescript) | 24 | `-ts` |
| [Java](#java) | 25 | `-java` |
| [Rust](#rust) | 7 | `-rust` |

---

### Core

> 6 skills — tooling, infrastructure, language-agnostic

| Skill | Description |
|-------|-------------|
| [azd-deployment](.github/skills/azd-deployment/) | Deploy to Azure Container Apps with Azure Developer CLI (azd). Bicep infrastructure, remote builds, multi-service deployments. |
| [copilot-sdk](.github/skills/copilot-sdk/) | Build applications powered by GitHub Copilot using the Copilot SDK. Session management, custom tools, streaming, hooks, MCP servers, BYOK. |
| [github-issue-creator](.github/skills/github-issue-creator/) | Convert raw notes, error logs, or screenshots into structured GitHub issues. |
| [mcp-builder](.github/skills/mcp-builder/) | Build MCP servers for LLM tool integration. Python (FastMCP), Node/TypeScript, or C#/.NET. |
| [podcast-generation](.github/skills/podcast-generation/) | Generate podcast-style audio with Azure OpenAI Realtime API. Full-stack React + FastAPI + WebSocket. |
| [skill-creator](.github/skills/skill-creator/) | Guide for creating effective skills for AI coding agents. |

---

### Python

> 41 skills • suffix: `-py`

<details>
<summary><strong>Foundry & AI</strong> (7 skills)</summary>

| Skill | Description |
|-------|-------------|
| [agent-framework-azure-ai-py](.github/skills/agent-framework-azure-ai-py/) | Agent Framework SDK — persistent agents, hosted tools, MCP servers, streaming. |
| [azure-ai-contentsafety-py](.github/skills/azure-ai-contentsafety-py/) | Content Safety SDK — detect harmful content in text/images with multi-severity classification. |
| [azure-ai-contentunderstanding-py](.github/skills/azure-ai-contentunderstanding-py/) | Content Understanding SDK — multimodal extraction from documents, images, audio, video. |
| [azure-ai-evaluation-py](.github/skills/azure-ai-evaluation-py/) | Evaluation SDK — quality, safety, and custom evaluators for generative AI apps. |
| [agents-v2-py](.github/skills/agents-v2-py/) | Foundry Agents SDK — container-based agents with ImageBasedHostedAgentDefinition, custom images, tools. |
| [azure-ai-projects-py](.github/skills/azure-ai-projects-py/) | High-level Foundry SDK — project client, versioned agents, evals, connections, OpenAI-compatible clients. |
| [azure-search-documents-py](.github/skills/azure-search-documents-py/) | AI Search SDK — vector search, hybrid search, semantic ranking, indexing, skillsets. |

</details>

<details>
<summary><strong>M365</strong> (1 skill)</summary>

| Skill | Description |
|-------|-------------|
| [m365-agents-py](.github/skills/m365-agents-py/) | Microsoft 365 Agents SDK — aiohttp hosting, AgentApplication routing, streaming, Copilot Studio client. |

</details>

<details>
<summary><strong>AI Services</strong> (8 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-ai-ml-py](.github/skills/azure-ai-ml-py/) | ML SDK v2 — workspaces, jobs, models, datasets, compute, pipelines. |
| [azure-ai-textanalytics-py](.github/skills/azure-ai-textanalytics-py/) | Text Analytics — sentiment, entities, key phrases, PII detection, healthcare NLP. |
| [azure-ai-transcription-py](.github/skills/azure-ai-transcription-py/) | Transcription SDK — real-time and batch speech-to-text with timestamps, diarization. |
| [azure-ai-translation-document-py](.github/skills/azure-ai-translation-document-py/) | Document Translation — batch translate Word, PDF, Excel with format preservation. |
| [azure-ai-translation-text-py](.github/skills/azure-ai-translation-text-py/) | Text Translation — real-time translation, transliteration, language detection. |
| [azure-ai-vision-imageanalysis-py](.github/skills/azure-ai-vision-imageanalysis-py/) | Vision SDK — captions, tags, objects, OCR, people detection, smart cropping. |
| [azure-ai-voicelive-py](.github/skills/azure-ai-voicelive-py/) | Voice Live SDK — real-time bidirectional voice AI with WebSocket, VAD, avatars. |
| [azure-speech-to-text-rest-py](.github/skills/azure-speech-to-text-rest-py/) | Speech to Text REST API — transcribe short audio (≤60 seconds) via HTTP without Speech SDK. |

</details>

<details>
<summary><strong>Data & Storage</strong> (7 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-cosmos-db-py](.github/skills/azure-cosmos-db-py/) | Cosmos DB patterns — FastAPI service layer, dual auth, partition strategies, TDD. |
| [azure-cosmos-py](.github/skills/azure-cosmos-py/) | Cosmos DB SDK — document CRUD, queries, containers, globally distributed data. |
| [azure-data-tables-py](.github/skills/azure-data-tables-py/) | Tables SDK — NoSQL key-value storage, entity CRUD, batch operations. |
| [azure-storage-blob-py](.github/skills/azure-storage-blob-py/) | Blob Storage — upload, download, list, containers, lifecycle management. |
| [azure-storage-file-datalake-py](.github/skills/azure-storage-file-datalake-py/) | Data Lake Gen2 — hierarchical file systems, big data analytics. |
| [azure-storage-file-share-py](.github/skills/azure-storage-file-share-py/) | File Share — SMB file shares, directories, cloud file operations. |
| [azure-storage-queue-py](.github/skills/azure-storage-queue-py/) | Queue Storage — reliable message queuing, task distribution. |

</details>

<details>
<summary><strong>Messaging & Events</strong> (4 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-eventgrid-py](.github/skills/azure-eventgrid-py/) | Event Grid — publish events, CloudEvents, event-driven architectures. |
| [azure-eventhub-py](.github/skills/azure-eventhub-py/) | Event Hubs — high-throughput streaming, producers, consumers, checkpointing. |
| [azure-messaging-webpubsubservice-py](.github/skills/azure-messaging-webpubsubservice-py/) | Web PubSub — real-time messaging, WebSocket connections, pub/sub. |
| [azure-servicebus-py](.github/skills/azure-servicebus-py/) | Service Bus — queues, topics, subscriptions, enterprise messaging. |

</details>

<details>
<summary><strong>Entra</strong> (2 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-identity-py](.github/skills/azure-identity-py/) | Identity SDK — DefaultAzureCredential, managed identity, service principals. |
| [azure-keyvault-py](.github/skills/azure-keyvault-py/) | Key Vault — secrets, keys, and certificates management. |

</details>

<details>
<summary><strong>Monitoring</strong> (4 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-monitor-ingestion-py](.github/skills/azure-monitor-ingestion-py/) | Monitor Ingestion — send custom logs via Logs Ingestion API. |
| [azure-monitor-opentelemetry-exporter-py](.github/skills/azure-monitor-opentelemetry-exporter-py/) | OpenTelemetry Exporter — low-level export to Application Insights. |
| [azure-monitor-opentelemetry-py](.github/skills/azure-monitor-opentelemetry-py/) | OpenTelemetry Distro — one-line App Insights setup with auto-instrumentation. |
| [azure-monitor-query-py](.github/skills/azure-monitor-query-py/) | Monitor Query — query Log Analytics workspaces and Azure metrics. |

</details>

<details>
<summary><strong>Integration & Management</strong> (5 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-appconfiguration-py](.github/skills/azure-appconfiguration-py/) | App Configuration — centralized config, feature flags, dynamic settings. |
| [azure-containerregistry-py](.github/skills/azure-containerregistry-py/) | Container Registry — manage container images, artifacts, repositories. |
| [azure-mgmt-apicenter-py](.github/skills/azure-mgmt-apicenter-py/) | API Center — API inventory, metadata, governance. |
| [azure-mgmt-apimanagement-py](.github/skills/azure-mgmt-apimanagement-py/) | API Management — APIM services, APIs, products, policies. |
| [azure-mgmt-botservice-py](.github/skills/azure-mgmt-botservice-py/) | Bot Service — create and manage Azure Bot resources. |

</details>

<details>
<summary><strong>Patterns & Frameworks</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-mgmt-fabric-py](.github/skills/azure-mgmt-fabric-py/) | Fabric Management — Microsoft Fabric capacities and resources. |
| [fastapi-router-py](.github/skills/fastapi-router-py/) | FastAPI routers — CRUD operations, auth dependencies, response models. |
| [pydantic-models-py](.github/skills/pydantic-models-py/) | Pydantic patterns — Base, Create, Update, Response, InDB model variants. |

</details>

---

### .NET

> 28 skills • suffix: `-dotnet`
|
||||
<details>
|
||||
<summary><strong>Foundry & AI</strong> (6 skills)</summary>
|
||||
|
||||
| Skill | Description |
|
||||
|-------|-------------|
|
||||
| [azure-ai-document-intelligence-dotnet](.github/skills/azure-ai-document-intelligence-dotnet/) | Document Intelligence — extract text, tables from invoices, receipts, IDs, forms. |
|
||||
| [azure-ai-openai-dotnet](.github/skills/azure-ai-openai-dotnet/) | Azure OpenAI — chat, embeddings, image generation, audio, assistants. |
|
||||
| [azure-ai-projects-dotnet](.github/skills/azure-ai-projects-dotnet/) | AI Projects SDK — Foundry project client, agents, connections, evals. |
|
||||
| [azure-ai-voicelive-dotnet](.github/skills/azure-ai-voicelive-dotnet/) | Voice Live — real-time voice AI with bidirectional WebSocket. |
|
||||
| [azure-mgmt-weightsandbiases-dotnet](.github/skills/azure-mgmt-weightsandbiases-dotnet/) | Weights & Biases — ML experiment tracking via Azure Marketplace. |
|
||||
| [azure-search-documents-dotnet](.github/skills/azure-search-documents-dotnet/) | AI Search — full-text, vector, semantic, hybrid search. |
|
||||
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>M365</strong> (1 skill)</summary>
|
||||
|
||||
| Skill | Description |
|
||||
|-------|-------------|
|
||||
| [m365-agents-dotnet](.github/skills/m365-agents-dotnet/) | Microsoft 365 Agents SDK — ASP.NET Core hosting, AgentApplication routing, Copilot Studio client. |
|
||||
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>Data & Storage</strong> (6 skills)</summary>
|
||||
|
||||
| Skill | Description |
|
||||
|-------|-------------|
|
||||
| [azure-mgmt-fabric-dotnet](.github/skills/azure-mgmt-fabric-dotnet/) | Fabric ARM — provision, scale, suspend/resume Fabric capacities. |
|
||||
| [azure-resource-manager-cosmosdb-dotnet](.github/skills/azure-resource-manager-cosmosdb-dotnet/) | Cosmos DB ARM — create accounts, databases, containers, RBAC. |
|
| [azure-resource-manager-mysql-dotnet](.github/skills/azure-resource-manager-mysql-dotnet/) | MySQL Flexible Server — servers, databases, firewall, HA. |
| [azure-resource-manager-postgresql-dotnet](.github/skills/azure-resource-manager-postgresql-dotnet/) | PostgreSQL Flexible Server — servers, databases, firewall, HA. |
| [azure-resource-manager-redis-dotnet](.github/skills/azure-resource-manager-redis-dotnet/) | Redis ARM — cache instances, firewall, geo-replication. |
| [azure-resource-manager-sql-dotnet](.github/skills/azure-resource-manager-sql-dotnet/) | SQL ARM — servers, databases, elastic pools, failover groups. |

</details>

<details>
<summary><strong>Messaging</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-eventgrid-dotnet](.github/skills/azure-eventgrid-dotnet/) | Event Grid — publish events, CloudEvents, EventGridEvents. |
| [azure-eventhub-dotnet](.github/skills/azure-eventhub-dotnet/) | Event Hubs — high-throughput streaming, producers, processors. |
| [azure-servicebus-dotnet](.github/skills/azure-servicebus-dotnet/) | Service Bus — queues, topics, sessions, dead-letter handling. |

</details>

<details>
<summary><strong>Entra</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-identity-dotnet](.github/skills/azure-identity-dotnet/) | Identity SDK — DefaultAzureCredential, managed identity, service principals. |
| [azure-security-keyvault-keys-dotnet](.github/skills/azure-security-keyvault-keys-dotnet/) | Key Vault Keys — key creation, rotation, encrypt/decrypt, sign/verify. |
| [microsoft-azure-webjobs-extensions-authentication-events-dotnet](.github/skills/microsoft-azure-webjobs-extensions-authentication-events-dotnet/) | Entra Auth Events — custom claims, token enrichment, attribute collection. |

</details>

<details>
<summary><strong>Compute & Integration</strong> (6 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-maps-search-dotnet](.github/skills/azure-maps-search-dotnet/) | Azure Maps — geocoding, routing, map tiles, weather. |
| [azure-mgmt-apicenter-dotnet](.github/skills/azure-mgmt-apicenter-dotnet/) | API Center — API inventory, governance, versioning, discovery. |
| [azure-mgmt-apimanagement-dotnet](.github/skills/azure-mgmt-apimanagement-dotnet/) | API Management ARM — APIM services, APIs, products, policies. |
| [azure-mgmt-botservice-dotnet](.github/skills/azure-mgmt-botservice-dotnet/) | Bot Service ARM — bot resources, channels (Teams, DirectLine). |
| [azure-resource-manager-durabletask-dotnet](.github/skills/azure-resource-manager-durabletask-dotnet/) | Durable Task ARM — schedulers, task hubs, retention policies. |
| [azure-resource-manager-playwright-dotnet](.github/skills/azure-resource-manager-playwright-dotnet/) | Playwright Testing ARM — workspaces, quotas. |

</details>

<details>
<summary><strong>Monitoring & Partner</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-mgmt-applicationinsights-dotnet](.github/skills/azure-mgmt-applicationinsights-dotnet/) | Application Insights — components, web tests, workbooks. |
| [azure-mgmt-arizeaiobservabilityeval-dotnet](.github/skills/azure-mgmt-arizeaiobservabilityeval-dotnet/) | Arize AI — ML observability via Azure Marketplace. |
| [azure-mgmt-mongodbatlas-dotnet](.github/skills/azure-mgmt-mongodbatlas-dotnet/) | MongoDB Atlas — manage Atlas orgs as Azure ARM resources. |

</details>
---

### TypeScript

> 24 skills • suffix: `-ts`

<details>
<summary><strong>Foundry & AI</strong> (6 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-ai-contentsafety-ts](.github/skills/azure-ai-contentsafety-ts/) | Content Safety — moderate text/images, detect harmful content. |
| [azure-ai-document-intelligence-ts](.github/skills/azure-ai-document-intelligence-ts/) | Document Intelligence — extract from invoices, receipts, IDs, forms. |
| [azure-ai-projects-ts](.github/skills/azure-ai-projects-ts/) | AI Projects SDK — Foundry client, agents, connections, evals. |
| [azure-ai-translation-ts](.github/skills/azure-ai-translation-ts/) | Translation — text translation, transliteration, document batch. |
| [azure-ai-voicelive-ts](.github/skills/azure-ai-voicelive-ts/) | Voice Live — real-time voice AI with WebSocket, Node.js or browser. |
| [azure-search-documents-ts](.github/skills/azure-search-documents-ts/) | AI Search — vector/hybrid search, semantic ranking, knowledge bases. |

</details>

<details>
<summary><strong>M365</strong> (1 skill)</summary>

| Skill | Description |
|-------|-------------|
| [m365-agents-ts](.github/skills/m365-agents-ts/) | Microsoft 365 Agents SDK — AgentApplication routing, Express hosting, streaming, Copilot Studio client. |

</details>

<details>
<summary><strong>Data & Storage</strong> (5 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-cosmos-ts](.github/skills/azure-cosmos-ts/) | Cosmos DB — document CRUD, queries, bulk operations. |
| [azure-postgres-ts](.github/skills/azure-postgres-ts/) | PostgreSQL — connect to Azure Database for PostgreSQL with pg, pooling, Entra ID auth. |
| [azure-storage-blob-ts](.github/skills/azure-storage-blob-ts/) | Blob Storage — upload, download, list, SAS tokens, streaming. |
| [azure-storage-file-share-ts](.github/skills/azure-storage-file-share-ts/) | File Share — SMB shares, directories, file operations. |
| [azure-storage-queue-ts](.github/skills/azure-storage-queue-ts/) | Queue Storage — send, receive, peek, visibility timeout. |

</details>

<details>
<summary><strong>Messaging</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-eventhub-ts](.github/skills/azure-eventhub-ts/) | Event Hubs — high-throughput streaming, partitioned consumers. |
| [azure-servicebus-ts](.github/skills/azure-servicebus-ts/) | Service Bus — queues, topics, sessions, dead-letter handling. |
| [azure-web-pubsub-ts](.github/skills/azure-web-pubsub-ts/) | Web PubSub — WebSocket real-time features, group chat, notifications. |

</details>

<details>
<summary><strong>Entra & Integration</strong> (4 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-appconfiguration-ts](.github/skills/azure-appconfiguration-ts/) | App Configuration — settings, feature flags, Key Vault references. |
| [azure-identity-ts](.github/skills/azure-identity-ts/) | Identity SDK — DefaultAzureCredential, managed identity, browser login. |
| [azure-keyvault-keys-ts](.github/skills/azure-keyvault-keys-ts/) | Key Vault Keys — create, encrypt/decrypt, sign, rotate keys. |
| [azure-keyvault-secrets-ts](.github/skills/azure-keyvault-secrets-ts/) | Key Vault Secrets — store and retrieve application secrets. |

</details>

<details>
<summary><strong>Monitoring & Frontend</strong> (5 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-microsoft-playwright-testing-ts](.github/skills/azure-microsoft-playwright-testing-ts/) | Playwright Testing — scale browser tests, CI/CD integration. |
| [azure-monitor-opentelemetry-ts](.github/skills/azure-monitor-opentelemetry-ts/) | OpenTelemetry — tracing, metrics, logs with Application Insights. |
| [frontend-ui-dark-ts](.github/skills/frontend-ui-dark-ts/) | Frontend UI Dark — Vite + React + Tailwind + Framer Motion dark-themed UI design system. |
| [react-flow-node-ts](.github/skills/react-flow-node-ts/) | React Flow nodes — custom nodes with TypeScript, handles, Zustand. |
| [zustand-store-ts](.github/skills/zustand-store-ts/) | Zustand stores — TypeScript, subscribeWithSelector, state/action separation. |

</details>

---
### Java

> 26 skills • suffix: `-java`

<details>
<summary><strong>Foundry & AI</strong> (7 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-ai-anomalydetector-java](.github/skills/azure-ai-anomalydetector-java/) | Anomaly Detector — univariate/multivariate time-series analysis. |
| [azure-ai-contentsafety-java](.github/skills/azure-ai-contentsafety-java/) | Content Safety — text/image analysis, blocklist management. |
| [azure-ai-formrecognizer-java](.github/skills/azure-ai-formrecognizer-java/) | Form Recognizer — extract text, tables, key-value pairs from documents. |
| [azure-ai-projects-java](.github/skills/azure-ai-projects-java/) | AI Projects — Foundry project management, connections, datasets. |
| [azure-ai-vision-imageanalysis-java](.github/skills/azure-ai-vision-imageanalysis-java/) | Vision SDK — captions, OCR, object detection, tagging. |
| [azure-ai-voicelive-java](.github/skills/azure-ai-voicelive-java/) | Voice Live — real-time voice conversations with WebSocket. |

</details>

<details>
<summary><strong>Communication</strong> (5 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-communication-callautomation-java](.github/skills/azure-communication-callautomation-java/) | Call Automation — IVR, call routing, recording, DTMF, TTS. |
| [azure-communication-callingserver-java](.github/skills/azure-communication-callingserver-java/) | CallingServer (legacy) — deprecated, use callautomation for new projects. |
| [azure-communication-chat-java](.github/skills/azure-communication-chat-java/) | Chat SDK — threads, messaging, participants, read receipts. |
| [azure-communication-common-java](.github/skills/azure-communication-common-java/) | Common utilities — token credentials, user identifiers. |
| [azure-communication-sms-java](.github/skills/azure-communication-sms-java/) | SMS SDK — notifications, alerts, OTP delivery, bulk messaging. |

</details>

<details>
<summary><strong>Data & Storage</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-cosmos-java](.github/skills/azure-cosmos-java/) | Cosmos DB — NoSQL operations, global distribution, reactive patterns. |
| [azure-data-tables-java](.github/skills/azure-data-tables-java/) | Tables SDK — Table Storage or Cosmos DB Table API. |
| [azure-storage-blob-java](.github/skills/azure-storage-blob-java/) | Blob Storage — upload, download, containers, streaming. |

</details>

<details>
<summary><strong>Messaging</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-eventgrid-java](.github/skills/azure-eventgrid-java/) | Event Grid — publish events, pub/sub patterns. |
| [azure-eventhub-java](.github/skills/azure-eventhub-java/) | Event Hubs — high-throughput streaming, event-driven architectures. |
| [azure-messaging-webpubsub-java](.github/skills/azure-messaging-webpubsub-java/) | Web PubSub — WebSocket messaging, live updates, chat. |

</details>

<details>
<summary><strong>Entra</strong> (3 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-identity-java](.github/skills/azure-identity-java/) | Identity SDK — DefaultAzureCredential, managed identity, service principals. |
| [azure-security-keyvault-keys-java](.github/skills/azure-security-keyvault-keys-java/) | Key Vault Keys — RSA/EC keys, encrypt/decrypt, sign/verify, HSM. |
| [azure-security-keyvault-secrets-java](.github/skills/azure-security-keyvault-secrets-java/) | Key Vault Secrets — passwords, API keys, connection strings. |

</details>

<details>
<summary><strong>Monitoring & Integration</strong> (5 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-appconfiguration-java](.github/skills/azure-appconfiguration-java/) | App Configuration — settings, feature flags, snapshots. |
| [azure-compute-batch-java](.github/skills/azure-compute-batch-java/) | Batch SDK — large-scale parallel and HPC jobs. |
| [azure-monitor-ingestion-java](.github/skills/azure-monitor-ingestion-java/) | Monitor Ingestion — custom logs via Data Collection Rules. |
| [azure-monitor-opentelemetry-exporter-java](.github/skills/azure-monitor-opentelemetry-exporter-java/) | OpenTelemetry Exporter — traces, metrics, logs to Azure Monitor. (Deprecated) |
| [azure-monitor-query-java](.github/skills/azure-monitor-query-java/) | Monitor Query — Kusto queries, Log Analytics, metrics. (Deprecated) |

</details>

---
### Rust

> 7 skills • suffix: `-rust`

<details>
<summary><strong>Entra</strong> (4 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-identity-rust](.github/skills/azure-identity-rust/) | Identity SDK — DeveloperToolsCredential, ManagedIdentityCredential, ClientSecretCredential. |
| [azure-keyvault-certificates-rust](.github/skills/azure-keyvault-certificates-rust/) | Key Vault Certificates — create, import, manage certificates. |
| [azure-keyvault-keys-rust](.github/skills/azure-keyvault-keys-rust/) | Key Vault Keys — RSA/EC keys, encrypt/decrypt, sign/verify. |
| [azure-keyvault-secrets-rust](.github/skills/azure-keyvault-secrets-rust/) | Key Vault Secrets — passwords, API keys, connection strings. |

</details>

<details>
<summary><strong>Data & Storage</strong> (2 skills)</summary>

| Skill | Description |
|-------|-------------|
| [azure-cosmos-rust](.github/skills/azure-cosmos-rust/) | Cosmos DB SDK — document CRUD, queries, containers, partitions. |
| [azure-storage-blob-rust](.github/skills/azure-storage-blob-rust/) | Blob Storage — upload, download, containers, streaming. |

</details>

<details>
<summary><strong>Messaging</strong> (1 skill)</summary>

| Skill | Description |
|-------|-------------|
| [azure-eventhub-rust](.github/skills/azure-eventhub-rust/) | Event Hubs — high-throughput streaming, producers, consumers, batching. |

</details>

---
## Repository Structure

```
AGENTS.md                    # Agent configuration template

.github/
├── skills/                  # All 132 skills (flat structure)
├── plugins/                 # Installable plugin packages
│   └── deep-wiki/           # AI-powered wiki generator
├── prompts/                 # Reusable prompt templates
├── agents/                  # Agent persona definitions
├── scripts/                 # Automation scripts (doc scraping)
├── workflows/               # GitHub Actions (daily doc updates)
└── copilot-instructions.md

docs/                        # Generated llms.txt files (daily workflow) - GitHub Pages hosted
├── llms.txt                 # Links + summaries
└── llms-full.txt            # Full content

skills/                      # Symlinks for backward compatibility
├── python/                  # -> ../.github/skills/*-py
├── dotnet/                  # -> ../.github/skills/*-dotnet
├── typescript/              # -> ../.github/skills/*-ts
├── java/                    # -> ../.github/skills/*-java
└── rust/                    # -> ../.github/skills/*-rust

.vscode/
└── mcp.json                 # MCP server configurations
```

---
## Plugins

Plugins are installable packages containing curated sets of agents, commands, and skills. Install via the Copilot CLI:

```bash
# Inside Copilot CLI, run these slash commands:
/plugin marketplace add microsoft/skills
/plugin install deep-wiki@skills
```

| Plugin | Description | Commands |
|--------|-------------|----------|
| [deep-wiki](.github/plugins/deep-wiki/) | AI-powered wiki generator with Mermaid diagrams, architecture analysis, and source citations | `/deep-wiki:generate`, `/deep-wiki:catalogue`, `/deep-wiki:page`, `/deep-wiki:changelog`, `/deep-wiki:research`, `/deep-wiki:ask` |

---
## MCP Servers

Reference configurations in [`.vscode/mcp.json`](.vscode/mcp.json):

| Category | Servers |
|----------|---------|
| **Documentation** | `microsoft-docs`, `context7`, `deepwiki` |
| **Development** | `github`, `playwright`, `terraform`, `eslint` |
| **Utilities** | `sequentialthinking`, `memory`, `markitdown` |

For full MCP server implementations for Azure services, see **[microsoft/mcp](https://github.com/microsoft/mcp)**.

---
## Additional Resources

### Agents

Role-specific agent personas in [`.github/agents/`](.github/agents/):

| Agent | Expertise |
|-------|-----------|
| `backend.agent.md` | FastAPI, Pydantic, Cosmos DB, Azure services |
| `frontend.agent.md` | React, TypeScript, React Flow, Zustand, Tailwind |
| `infrastructure.agent.md` | Bicep, Azure CLI, Container Apps, networking |
| `planner.agent.md` | Task decomposition, architecture decisions |
| `presenter.agent.md` | Documentation, demos, technical writing |

Use [`AGENTS.md`](AGENTS.md) as a template for configuring agent behavior in your own projects.

### Prompts

Reusable prompt templates in [`.github/prompts/`](.github/prompts/):

| Prompt | Purpose |
|--------|---------|
| [`code-review.prompt.md`](.github/prompts/code-review.prompt.md) | Structured code review with security, performance, and maintainability checks |
| [`create-store.prompt.md`](.github/prompts/create-store.prompt.md) | Zustand store creation with TypeScript and subscribeWithSelector |
| [`create-node.prompt.md`](.github/prompts/create-node.prompt.md) | React Flow custom node creation with handles and Zustand integration |
| [`add-endpoint.prompt.md`](.github/prompts/add-endpoint.prompt.md) | FastAPI endpoint creation with Pydantic models and proper typing |

### Documentation

See the docs at https://microsoft.github.io/skills/#documentation.

---
## Testing Skills

The test harness validates that skills produce correct code patterns using the [GitHub Copilot SDK](https://github.com/github/copilot-sdk). It evaluates generated code against acceptance criteria defined for each skill.

```bash
# Install test dependencies (from tests directory)
cd tests
pnpm install

# List skills with test coverage
pnpm harness --list

# Run tests for a specific skill (mock mode for CI)
pnpm harness azure-ai-projects-py --mock --verbose

# Run with Ralph Loop (iterative improvement)
pnpm harness azure-ai-projects-py --ralph --mock --max-iterations 5 --threshold 85

# Run unit tests
pnpm test
```
### Test Coverage Summary

**126 skills with 1135 test scenarios** — all skills have acceptance criteria and test scenarios.

| Language | Skills | Scenarios | Top Skills by Scenarios |
|----------|--------|-----------|-------------------------|
| Core | 6 | 62 | `copilot-sdk` (11), `podcast-generation` (8), `skill-creator` (8) |
| Python | 41 | 331 | `azure-ai-projects-py` (12), `pydantic-models-py` (12), `azure-ai-translation-text-py` (11) |
| .NET | 29 | 290 | `azure-resource-manager-sql-dotnet` (14), `azure-resource-manager-redis-dotnet` (14), `azure-servicebus-dotnet` (13) |
| TypeScript | 24 | 257 | `azure-storage-blob-ts` (17), `azure-servicebus-ts` (14), `azure-microsoft-playwright-testing-ts` (13) |
| Java | 26 | 195 | `azure-storage-blob-java` (12), `azure-identity-java` (12), `azure-data-tables-java` (11) |
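The per-language rows sum to the headline totals, which a quick arithmetic check confirms (numbers copied from the table; the row tuples below are illustrative, not harness code):

```typescript
// Sanity-check the coverage table: per-language rows should sum to
// the headline totals of 126 skills and 1135 scenarios.
const rows: Array<[string, number, number]> = [
  ["Core", 6, 62],
  ["Python", 41, 331],
  [".NET", 29, 290],
  ["TypeScript", 24, 257],
  ["Java", 26, 195],
];
const totalSkills = rows.reduce((sum, [, skills]) => sum + skills, 0);
const totalScenarios = rows.reduce((sum, [, , scenarios]) => sum + scenarios, 0);
console.log(totalSkills, totalScenarios); // → 126 1135
```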
### Adding Test Coverage

See [`tests/README.md`](tests/README.md) for instructions on adding acceptance criteria and scenarios for new skills.
### Ralph Loop & Sensei Patterns

The test harness implements iterative quality improvement patterns inspired by [Sensei](https://github.com/microsoft/GitHub-Copilot-for-Azure/tree/main/.github/skills/sensei):

**Ralph Loop** — An iterative code generation and improvement system that:

1. **Generate** code for a given skill/scenario
2. **Evaluate** against acceptance criteria (score 0-100)
3. **Analyze** failures and build LLM-actionable feedback
4. **Re-generate** with feedback until quality threshold is met
5. **Report** on quality improvements across iterations
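The steps above can be sketched as a generate-evaluate-feedback loop. This is a minimal illustration only: the `generate` and `evaluate` callbacks stand in for the harness's real Copilot SDK calls, and none of the names below are the harness's actual API.

```typescript
// Hypothetical sketch of the Ralph Loop (not the harness's real interface).
type RalphResult = { code: string; score: number; feedback: string[] };

async function ralphLoop(
  generate: (feedback: string[]) => Promise<string>,
  evaluate: (code: string) => Promise<{ score: number; feedback: string[] }>,
  threshold = 85,
  maxIterations = 5,
): Promise<RalphResult> {
  let feedback: string[] = [];
  let best: RalphResult = { code: "", score: 0, feedback: [] };
  for (let i = 0; i < maxIterations; i++) {
    const code = await generate(feedback);                 // 1. Generate
    const { score, feedback: fb } = await evaluate(code);  // 2. Evaluate (0-100)
    if (score > best.score) best = { code, score, feedback: fb };
    if (score >= threshold) break;                         // quality threshold met
    feedback = fb;                                         // 3-4. Analyze, re-generate with feedback
  }
  return best;                                             // 5. Report best iteration
}
```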
**Sensei-style Scoring** — Skills are evaluated on frontmatter compliance:

| Score | Requirements |
|-------|--------------|
| **Low** | Basic description only |
| **Medium** | Description > 150 chars, has trigger keywords |
| **Medium-High** | Has "USE FOR:" triggers AND "DO NOT USE FOR:" anti-triggers |
| **High** | Triggers + anti-triggers + compatibility field |
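A minimal sketch of that rubric follows. It is illustrative only: the field names and keyword checks mirror the table above, not the harness's actual scoring implementation.

```typescript
// Hypothetical Sensei-style frontmatter scorer, following the tier table above.
interface Frontmatter {
  description: string;
  compatibility?: string; // optional compatibility field
}

type Tier = "low" | "medium" | "medium-high" | "high";

function scoreFrontmatter(fm: Frontmatter): Tier {
  const hasTriggers = fm.description.includes("USE FOR:");
  const hasAntiTriggers = fm.description.includes("DO NOT USE FOR:");
  // Treat an explicit "Triggers:" list as trigger keywords (assumption).
  const hasKeywords = /Triggers:/i.test(fm.description);

  if (hasTriggers && hasAntiTriggers && fm.compatibility) return "high";
  if (hasTriggers && hasAntiTriggers) return "medium-high";
  if (fm.description.length > 150 && hasKeywords) return "medium";
  return "low";
}
```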
---

## Contributing

### Adding New Skills

New skills must follow the full workflow to ensure quality and discoverability:

**Prerequisites:**

- SDK package name (e.g., `azure-ai-agents`, `Azure.AI.OpenAI`)
- Microsoft Learn documentation URL or GitHub repository
- Target language (py/dotnet/ts/java)

**Workflow:**

1. **Create skill** in `.github/skills/<skill-name>/SKILL.md`
   - Naming: `azure-<service>-<language>` (e.g., `azure-ai-projects-py`)
   - Include YAML frontmatter with `name` and `description`
   - Reference official docs via `microsoft-docs` MCP

2. **Categorize with symlink** in `skills/<language>/<category>/`

   ```bash
   # Example: Python AI agent skill in foundry category
   cd skills/python/foundry
   ln -s ../../../.github/skills/azure-ai-projects-py projects
   ```
   Categories: `foundry`, `data`, `messaging`, `monitoring`, `entra`, `integration`, `compute`, `m365`, `general`

3. **Create acceptance criteria** in `.github/skills/<skill>/references/acceptance-criteria.md`
   - Document correct/incorrect import patterns
   - Document authentication patterns
   - Document async variants

4. **Create test scenarios** in `tests/scenarios/<skill>/scenarios.yaml`
   - Test basic usage, error handling, advanced features
   - Include mock responses for CI

5. **Verify tests pass**

   ```bash
   cd tests && pnpm harness <skill-name> --mock --verbose
   ```
6. **Update README.md** — Add to the appropriate language section in the Skill Catalog

> **Full guide:** See [`.github/skills/skill-creator/SKILL.md`](.github/skills/skill-creator/SKILL.md)

### Other Contributions

- Improve existing prompts and agents
- Share MCP server configurations
- Fix bugs in the test harness

---

## License

MIT
---

**`skills/official/microsoft/dotnet/compute/botservice/SKILL.md`** (new file, 334 lines)
---
name: azure-mgmt-botservice-dotnet
description: |
  Azure Resource Manager SDK for Bot Service in .NET. Management plane operations for creating and managing Azure Bot resources, channels (Teams, DirectLine, Slack), and connection settings. Triggers: "Bot Service", "BotResource", "Azure Bot", "DirectLine channel", "Teams channel", "bot management .NET", "create bot".
package: Azure.ResourceManager.BotService
---

# Azure.ResourceManager.BotService (.NET)

Management plane SDK for provisioning and managing Azure Bot Service resources via Azure Resource Manager.

## Installation

```bash
dotnet add package Azure.ResourceManager.BotService
dotnet add package Azure.Identity
```

**Current Versions**: Stable v1.1.1, Preview v1.1.0-beta.1

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.BotService;

// Authenticate using DefaultAzureCredential
var credential = new DefaultAzureCredential();
ArmClient armClient = new ArmClient(credential);

// Get subscription and resource group
SubscriptionResource subscription = await armClient.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription.GetResourceGroups().GetAsync("myResourceGroup");

// Access bot collection
BotCollection botCollection = resourceGroup.GetBots();
```
## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── BotResource
            ├── BotChannelResource (DirectLine, Teams, Slack, etc.)
            ├── BotConnectionSettingResource (OAuth connections)
            └── BotServicePrivateEndpointConnectionResource
```

## Core Workflows

### 1. Create Bot Resource

```csharp
using Azure.ResourceManager.BotService;
using Azure.ResourceManager.BotService.Models;

// Create bot data
var botData = new BotData(AzureLocation.WestUS2)
{
    Kind = BotServiceKind.Azurebot,
    Sku = new BotServiceSku(BotServiceSkuName.F0),
    Properties = new BotProperties(
        displayName: "MyBot",
        endpoint: new Uri("https://mybot.azurewebsites.net/api/messages"),
        msaAppId: "<your-msa-app-id>")
    {
        Description = "My Azure Bot",
        MsaAppType = BotMsaAppType.MultiTenant
    }
};

// Create or update the bot
ArmOperation<BotResource> operation = await botCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "myBotName",
    botData);

BotResource bot = operation.Value;
Console.WriteLine($"Bot created: {bot.Data.Name}");
```
### 2. Configure DirectLine Channel

```csharp
// Get the bot
BotResource bot = await resourceGroup.GetBots().GetAsync("myBotName");

// Get channel collection
BotChannelCollection channels = bot.GetBotChannels();

// Create DirectLine channel configuration
var channelData = new BotChannelData(AzureLocation.WestUS2)
{
    Properties = new DirectLineChannel()
    {
        Properties = new DirectLineChannelProperties()
        {
            Sites =
            {
                new DirectLineSite("Default Site")
                {
                    IsEnabled = true,
                    IsV1Enabled = false,
                    IsV3Enabled = true,
                    IsSecureSiteEnabled = true
                }
            }
        }
    }
};

// Create or update the channel
ArmOperation<BotChannelResource> channelOp = await channels.CreateOrUpdateAsync(
    WaitUntil.Completed,
    BotChannelName.DirectLineChannel,
    channelData);

Console.WriteLine("DirectLine channel configured");
```
### 3. Configure Microsoft Teams Channel

```csharp
var teamsChannelData = new BotChannelData(AzureLocation.WestUS2)
{
    Properties = new MsTeamsChannel()
    {
        Properties = new MsTeamsChannelProperties()
        {
            IsEnabled = true,
            EnableCalling = false
        }
    }
};

await channels.CreateOrUpdateAsync(
    WaitUntil.Completed,
    BotChannelName.MsTeamsChannel,
    teamsChannelData);
```

### 4. Configure Web Chat Channel

```csharp
var webChatChannelData = new BotChannelData(AzureLocation.WestUS2)
{
    Properties = new WebChatChannel()
    {
        Properties = new WebChatChannelProperties()
        {
            Sites =
            {
                new WebChatSite("Default Site")
                {
                    IsEnabled = true
                }
            }
        }
    }
};

await channels.CreateOrUpdateAsync(
    WaitUntil.Completed,
    BotChannelName.WebChatChannel,
    webChatChannelData);
```
### 5. Get Bot and List Channels

```csharp
// Get bot
BotResource bot = await botCollection.GetAsync("myBotName");
Console.WriteLine($"Bot: {bot.Data.Properties.DisplayName}");
Console.WriteLine($"Endpoint: {bot.Data.Properties.Endpoint}");

// List channels
await foreach (BotChannelResource channel in bot.GetBotChannels().GetAllAsync())
{
    Console.WriteLine($"Channel: {channel.Data.Name}");
}
```

### 6. Regenerate DirectLine Keys

```csharp
var regenerateRequest = new BotChannelRegenerateKeysContent(BotChannelName.DirectLineChannel)
{
    SiteName = "Default Site"
};

BotChannelResource channelWithKeys = await bot.GetBotChannelWithRegenerateKeysAsync(regenerateRequest);
```

### 7. Update Bot

```csharp
BotResource bot = await botCollection.GetAsync("myBotName");

// Update using patch
var updateData = new BotData(bot.Data.Location)
{
    Properties = new BotProperties(
        displayName: "Updated Bot Name",
        endpoint: bot.Data.Properties.Endpoint,
        msaAppId: bot.Data.Properties.MsaAppId)
    {
        Description = "Updated description"
    }
};

await bot.UpdateAsync(updateData);
```

### 8. Delete Bot

```csharp
BotResource bot = await botCollection.GetAsync("myBotName");
await bot.DeleteAsync(WaitUntil.Completed);
```
## Supported Channel Types

| Channel | Constant | Class |
|---------|----------|-------|
| Direct Line | `BotChannelName.DirectLineChannel` | `DirectLineChannel` |
| Direct Line Speech | `BotChannelName.DirectLineSpeechChannel` | `DirectLineSpeechChannel` |
| Microsoft Teams | `BotChannelName.MsTeamsChannel` | `MsTeamsChannel` |
| Web Chat | `BotChannelName.WebChatChannel` | `WebChatChannel` |
| Slack | `BotChannelName.SlackChannel` | `SlackChannel` |
| Facebook | `BotChannelName.FacebookChannel` | `FacebookChannel` |
| Email | `BotChannelName.EmailChannel` | `EmailChannel` |
| Telegram | `BotChannelName.TelegramChannel` | `TelegramChannel` |
| Telephony | `BotChannelName.TelephonyChannel` | `TelephonyChannel` |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `BotResource` | Represents an Azure Bot resource |
| `BotCollection` | Collection for bot CRUD |
| `BotData` | Bot resource definition |
| `BotProperties` | Bot configuration properties |
| `BotChannelResource` | Channel configuration |
| `BotChannelCollection` | Collection of channels |
| `BotChannelData` | Channel configuration data |
| `BotConnectionSettingResource` | OAuth connection settings |

## BotServiceKind Values

| Value | Description |
|-------|-------------|
| `BotServiceKind.Azurebot` | Azure Bot (recommended) |
| `BotServiceKind.Bot` | Legacy Bot Framework bot |
| `BotServiceKind.Designer` | Composer bot |
| `BotServiceKind.Function` | Function bot |
| `BotServiceKind.Sdk` | SDK bot |

## BotServiceSkuName Values

| Value | Description |
|-------|-------------|
| `BotServiceSkuName.F0` | Free tier |
| `BotServiceSkuName.S1` | Standard tier |

## BotMsaAppType Values

| Value | Description |
|-------|-------------|
| `BotMsaAppType.MultiTenant` | Multi-tenant app |
| `BotMsaAppType.SingleTenant` | Single-tenant app |
| `BotMsaAppType.UserAssignedMSI` | User-assigned managed identity |
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Always use `DefaultAzureCredential`** — supports multiple auth methods
|
||||
2. **Use `WaitUntil.Completed`** for synchronous operations
|
||||
3. **Handle `RequestFailedException`** for API errors
|
||||
4. **Use async methods** (`*Async`) for all operations
|
||||
5. **Store MSA App credentials securely** — use Key Vault for secrets
|
||||
6. **Use managed identity** (`BotMsaAppType.UserAssignedMSI`) for production bots
|
||||
7. **Enable secure sites** for DirectLine channels in production
|
||||
|
||||
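Practices 1, 5, and 6 combine into a payload like the sketch below. This is an illustrative assumption, not SDK-verified code: the `BotProperties` constructor arguments (`displayName`, `endpoint`, `msaAppId`) and the `MsaAppType`/`MsaAppTenantId`/`MsaAppMsiResourceId` property names are inferred from this SDK's conventions, so check them against the API reference before use.

```csharp
using Azure;
using Azure.Core; // ResourceIdentifier, AzureLocation
using Azure.ResourceManager.BotService;
using Azure.ResourceManager.BotService.Models;

// Hypothetical values — substitute your own identity, tenant, and endpoint.
var botData = new BotData(AzureLocation.EastUS)
{
    Kind = BotServiceKind.Azurebot,
    Sku = new BotServiceSku(BotServiceSkuName.S1),
    Properties = new BotProperties(
        displayName: "my-bot",
        endpoint: new Uri("https://my-bot.azurewebsites.net/api/messages"),
        msaAppId: "<app-id>")
    {
        // User-assigned managed identity — no client secret to store
        MsaAppType = BotMsaAppType.UserAssignedMSI,
        MsaAppTenantId = "<tenant-id>",
        MsaAppMsiResourceId = new ResourceIdentifier(
            "/subscriptions/<sub>/resourceGroups/<rg>/providers/" +
            "Microsoft.ManagedIdentity/userAssignedIdentities/<identity-name>")
    }
};

var operation = await botCollection.CreateOrUpdateAsync(
    WaitUntil.Completed, "my-bot", botData);
```

With a managed identity there is no `MsaAppPassword` to rotate or leak, which is why practice 6 recommends it for production.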
## Error Handling

```csharp
using Azure;

try
{
    var operation = await botCollection.CreateOrUpdateAsync(
        WaitUntil.Completed,
        botName,
        botData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Bot already exists");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.BotService` | Bot management (this SDK) | `dotnet add package Azure.ResourceManager.BotService` |
| `Microsoft.Bot.Builder` | Bot Framework SDK | `dotnet add package Microsoft.Bot.Builder` |
| `Microsoft.Bot.Builder.Integration.AspNet.Core` | ASP.NET Core integration | `dotnet add package Microsoft.Bot.Builder.Integration.AspNet.Core` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.BotService |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.botservice |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/botservice/Azure.ResourceManager.BotService |
| Azure Bot Service Docs | https://learn.microsoft.com/azure/bot-service/ |

**New file** (377 lines): `skills/official/microsoft/dotnet/compute/durabletask/SKILL.md`

---
name: azure-resource-manager-durabletask-dotnet
description: |
  Azure Resource Manager SDK for Durable Task Scheduler in .NET. Use for MANAGEMENT PLANE operations: creating/managing Durable Task Schedulers, Task Hubs, and retention policies via Azure Resource Manager. Triggers: "Durable Task Scheduler", "create scheduler", "task hub", "DurableTaskSchedulerResource", "provision Durable Task", "orchestration scheduler".
package: Azure.ResourceManager.DurableTask
---

# Azure.ResourceManager.DurableTask (.NET)

Management plane SDK for provisioning and managing Azure Durable Task Scheduler resources via Azure Resource Manager.

> **⚠️ Management vs Data Plane**
> - **This SDK (Azure.ResourceManager.DurableTask)**: Create schedulers, task hubs, configure retention policies
> - **Data Plane SDK (Microsoft.DurableTask.Client.AzureManaged)**: Start orchestrations, query instances, send events

## Installation

```bash
dotnet add package Azure.ResourceManager.DurableTask
dotnet add package Azure.Identity
```

**Current Versions**: Stable v1.0.0 (2025-11-03), Preview v1.0.0-beta.1 (2025-04-24)
**API Version**: 2025-11-01

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Core; // ResourceIdentifier
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.DurableTask;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── DurableTaskSchedulerResource
            ├── DurableTaskHubResource
            └── DurableTaskRetentionPolicyResource
```

## Core Workflow

### 1. Create Durable Task Scheduler

```csharp
using Azure.ResourceManager.DurableTask;
using Azure.ResourceManager.DurableTask.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define scheduler with Dedicated SKU
var schedulerData = new DurableTaskSchedulerData(AzureLocation.EastUS)
{
    Properties = new DurableTaskSchedulerProperties
    {
        Sku = new DurableTaskSchedulerSku(DurableTaskSchedulerSkuName.Dedicated)
        {
            Capacity = 1 // Number of instances
        },
        // Optional: IP allowlist for network security
        IPAllowlist = { "10.0.0.0/24", "192.168.1.0/24" }
    }
};

// Create scheduler (long-running operation)
var schedulerCollection = resourceGroup.Value.GetDurableTaskSchedulers();
var operation = await schedulerCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-scheduler",
    schedulerData);

DurableTaskSchedulerResource scheduler = operation.Value;
Console.WriteLine($"Scheduler created: {scheduler.Data.Name}");
Console.WriteLine($"Endpoint: {scheduler.Data.Properties.Endpoint}");
```

### 2. Create Scheduler with Consumption SKU

```csharp
// Consumption SKU (serverless)
var consumptionSchedulerData = new DurableTaskSchedulerData(AzureLocation.EastUS)
{
    Properties = new DurableTaskSchedulerProperties
    {
        Sku = new DurableTaskSchedulerSku(DurableTaskSchedulerSkuName.Consumption)
        // No capacity needed for consumption
    }
};

var operation = await schedulerCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-serverless-scheduler",
    consumptionSchedulerData);
```

### 3. Create Task Hub

```csharp
// Task hubs are created under a scheduler
var taskHubData = new DurableTaskHubData
{
    // Properties are optional for basic task hub
};

var taskHubCollection = scheduler.GetDurableTaskHubs();
var hubOperation = await taskHubCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-taskhub",
    taskHubData);

DurableTaskHubResource taskHub = hubOperation.Value;
Console.WriteLine($"Task Hub created: {taskHub.Data.Name}");
```

### 4. List Schedulers

```csharp
// List all schedulers in subscription
await foreach (var sched in subscription.GetDurableTaskSchedulersAsync())
{
    Console.WriteLine($"Scheduler: {sched.Data.Name}");
    Console.WriteLine($"  Location: {sched.Data.Location}");
    Console.WriteLine($"  SKU: {sched.Data.Properties.Sku?.Name}");
    Console.WriteLine($"  Endpoint: {sched.Data.Properties.Endpoint}");
}

// List schedulers in resource group
var schedulers = resourceGroup.Value.GetDurableTaskSchedulers();
await foreach (var sched in schedulers.GetAllAsync())
{
    Console.WriteLine($"Scheduler: {sched.Data.Name}");
}
```

### 5. Get Scheduler by Name

```csharp
// Get existing scheduler
var existingScheduler = await schedulerCollection.GetAsync("my-scheduler");
Console.WriteLine($"Found: {existingScheduler.Value.Data.Name}");

// Or use extension method
var schedulerResource = armClient.GetDurableTaskSchedulerResource(
    DurableTaskSchedulerResource.CreateResourceIdentifier(
        subscriptionId,
        "my-resource-group",
        "my-scheduler"));
var scheduler = await schedulerResource.GetAsync();
```

### 6. Update Scheduler

```csharp
// Get current scheduler
var scheduler = await schedulerCollection.GetAsync("my-scheduler");

// Update with new configuration
var updateData = new DurableTaskSchedulerData(scheduler.Value.Data.Location)
{
    Properties = new DurableTaskSchedulerProperties
    {
        Sku = new DurableTaskSchedulerSku(DurableTaskSchedulerSkuName.Dedicated)
        {
            Capacity = 2 // Scale up
        },
        IPAllowlist = { "10.0.0.0/16" } // Update IP allowlist
    }
};

var updateOperation = await schedulerCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-scheduler",
    updateData);
```

### 7. Delete Resources

```csharp
// Delete task hub first
var taskHub = await scheduler.GetDurableTaskHubs().GetAsync("my-taskhub");
await taskHub.Value.DeleteAsync(WaitUntil.Completed);

// Then delete scheduler
await scheduler.DeleteAsync(WaitUntil.Completed);
```

### 8. Manage Retention Policies

```csharp
// Get retention policy collection
var retentionPolicies = scheduler.GetDurableTaskRetentionPolicies();

// Create or update retention policy
var retentionData = new DurableTaskRetentionPolicyData
{
    Properties = new DurableTaskRetentionPolicyProperties
    {
        // Configure retention settings
    }
};

var retentionOperation = await retentionPolicies.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "default", // Policy name
    retentionData);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `DurableTaskSchedulerResource` | Represents a Durable Task Scheduler |
| `DurableTaskSchedulerCollection` | Collection for scheduler CRUD |
| `DurableTaskSchedulerData` | Scheduler creation/update payload |
| `DurableTaskSchedulerProperties` | Scheduler configuration (SKU, IPAllowlist) |
| `DurableTaskSchedulerSku` | SKU configuration (Name, Capacity, RedundancyState) |
| `DurableTaskSchedulerSkuName` | SKU options: `Dedicated`, `Consumption` |
| `DurableTaskHubResource` | Represents a Task Hub |
| `DurableTaskHubCollection` | Collection for task hub CRUD |
| `DurableTaskHubData` | Task hub creation payload |
| `DurableTaskRetentionPolicyResource` | Retention policy management |
| `DurableTaskRetentionPolicyData` | Retention policy configuration |
| `DurableTaskExtensions` | Extension methods for ARM client |

## SKU Options

| SKU | Description | Use Case |
|-----|-------------|----------|
| `Dedicated` | Fixed capacity with configurable instances | Production workloads, predictable performance |
| `Consumption` | Serverless, auto-scaling | Development, variable workloads |

## Extension Methods

The SDK provides extension methods on `SubscriptionResource` and `ResourceGroupResource`:

```csharp
// On SubscriptionResource
subscription.GetDurableTaskSchedulers();         // List all in subscription
subscription.GetDurableTaskSchedulersAsync();    // Async enumerable

// On ResourceGroupResource
resourceGroup.GetDurableTaskSchedulers();        // Get collection
resourceGroup.GetDurableTaskSchedulerAsync(name); // Get by name

// On ArmClient
armClient.GetDurableTaskSchedulerResource(id);   // Get by resource ID
armClient.GetDurableTaskHubResource(id);         // Get task hub by ID
```

## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** when you want to poll manually or run operations in parallel
3. **Always use `DefaultAzureCredential`** — never hardcode keys
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Delete task hubs before schedulers** — schedulers with task hubs cannot be deleted
7. **Use IP allowlists** for network security in production

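Practice 2 — starting a long-running operation with `WaitUntil.Started` and polling it yourself — can be sketched as follows. The polling members (`HasCompleted`, `UpdateStatusAsync`) come from the standard `Azure.Core` long-running-operation type that ARM SDKs return; `schedulerCollection` and `schedulerData` are the variables from the creation example above.

```csharp
using Azure;

// Kick off the create without blocking until it finishes.
var operation = await schedulerCollection.CreateOrUpdateAsync(
    WaitUntil.Started, "my-scheduler", schedulerData);

// Poll at your own cadence; do other work between polls.
while (!operation.HasCompleted)
{
    await Task.Delay(TimeSpan.FromSeconds(5));
    await operation.UpdateStatusAsync();
}

DurableTaskSchedulerResource scheduler = operation.Value;
```

Alternatively, `await operation.WaitForCompletionAsync()` blocks using the SDK's built-in polling strategy, which is equivalent to passing `WaitUntil.Completed` up front.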
## Error Handling

```csharp
using Azure;

try
{
    var operation = await schedulerCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, schedulerName, schedulerData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Scheduler already exists");
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    Console.WriteLine("Resource group not found");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Complete Example

```csharp
using Azure;
using Azure.Core; // ResourceIdentifier, AzureLocation
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.DurableTask;
using Azure.ResourceManager.DurableTask.Models;
using Azure.ResourceManager.Resources;

// Setup
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID")!;
var resourceGroupName = Environment.GetEnvironmentVariable("AZURE_RESOURCE_GROUP")!;

var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
var resourceGroup = await subscription.GetResourceGroupAsync(resourceGroupName);

// Create scheduler
var schedulerData = new DurableTaskSchedulerData(AzureLocation.EastUS)
{
    Properties = new DurableTaskSchedulerProperties
    {
        Sku = new DurableTaskSchedulerSku(DurableTaskSchedulerSkuName.Dedicated)
        {
            Capacity = 1
        }
    }
};

var schedulerCollection = resourceGroup.Value.GetDurableTaskSchedulers();
var schedulerOp = await schedulerCollection.CreateOrUpdateAsync(
    WaitUntil.Completed, "my-scheduler", schedulerData);
var scheduler = schedulerOp.Value;

Console.WriteLine($"Scheduler endpoint: {scheduler.Data.Properties.Endpoint}");

// Create task hub
var taskHubData = new DurableTaskHubData();
var taskHubOp = await scheduler.GetDurableTaskHubs().CreateOrUpdateAsync(
    WaitUntil.Completed, "my-taskhub", taskHubData);
var taskHub = taskHubOp.Value;

Console.WriteLine($"Task Hub: {taskHub.Data.Name}");

// Cleanup
await taskHub.DeleteAsync(WaitUntil.Completed);
await scheduler.DeleteAsync(WaitUntil.Completed);
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.DurableTask` | Management plane (this SDK) | `dotnet add package Azure.ResourceManager.DurableTask` |
| `Microsoft.DurableTask.Client.AzureManaged` | Data plane (orchestrations, activities) | `dotnet add package Microsoft.DurableTask.Client.AzureManaged` |
| `Microsoft.DurableTask.Worker.AzureManaged` | Worker for running orchestrations | `dotnet add package Microsoft.DurableTask.Worker.AzureManaged` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |
| `Azure.ResourceManager` | Base ARM SDK | `dotnet add package Azure.ResourceManager` |

## Source Reference

- [GitHub: Azure.ResourceManager.DurableTask](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/durabletask/Azure.ResourceManager.DurableTask)
- [NuGet: Azure.ResourceManager.DurableTask](https://www.nuget.org/packages/Azure.ResourceManager.DurableTask)

**New file** (297 lines): `skills/official/microsoft/dotnet/compute/playwright/SKILL.md`

---
name: azure-resource-manager-playwright-dotnet
description: |
  Azure Resource Manager SDK for Microsoft Playwright Testing in .NET. Use for MANAGEMENT PLANE operations: creating/managing Playwright Testing workspaces, checking name availability, and managing workspace quotas via Azure Resource Manager. NOT for running Playwright tests - use Azure.Developer.MicrosoftPlaywrightTesting.NUnit for that. Triggers: "Playwright workspace", "create Playwright Testing workspace", "manage Playwright resources", "ARM Playwright", "PlaywrightWorkspaceResource", "provision Playwright Testing".
package: Azure.ResourceManager.Playwright
---

# Azure.ResourceManager.Playwright (.NET)

Management plane SDK for provisioning and managing Microsoft Playwright Testing workspaces via Azure Resource Manager.

> **⚠️ Management vs Test Execution**
> - **This SDK (Azure.ResourceManager.Playwright)**: Create workspaces, manage quotas, check name availability
> - **Test Execution SDK (Azure.Developer.MicrosoftPlaywrightTesting.NUnit)**: Run Playwright tests at scale on cloud browsers

## Installation

```bash
dotnet add package Azure.ResourceManager.Playwright
dotnet add package Azure.Identity
```

**Current Versions**: Stable v1.0.0, Preview v1.0.0-beta.1

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Core; // ResourceIdentifier
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Playwright;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    ├── PlaywrightQuotaResource (subscription-level quotas)
    └── ResourceGroupResource
        └── PlaywrightWorkspaceResource
            └── PlaywrightWorkspaceQuotaResource (workspace-level quotas)
```

## Core Workflow

### 1. Create Playwright Workspace

```csharp
using Azure.ResourceManager.Playwright;
using Azure.ResourceManager.Playwright.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define workspace
var workspaceData = new PlaywrightWorkspaceData(AzureLocation.WestUS3)
{
    // Optional: Configure regional affinity and local auth
    RegionalAffinity = PlaywrightRegionalAffinity.Enabled,
    LocalAuth = PlaywrightLocalAuth.Enabled,
    Tags =
    {
        ["Team"] = "Dev Exp",
        ["Environment"] = "Production"
    }
};

// Create workspace (long-running operation)
var workspaceCollection = resourceGroup.Value.GetPlaywrightWorkspaces();
var operation = await workspaceCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-playwright-workspace",
    workspaceData);

PlaywrightWorkspaceResource workspace = operation.Value;

// Get the data plane URI for running tests
Console.WriteLine($"Data Plane URI: {workspace.Data.DataplaneUri}");
Console.WriteLine($"Workspace ID: {workspace.Data.WorkspaceId}");
```

### 2. Get Existing Workspace

```csharp
// Get by name
var workspace = await workspaceCollection.GetAsync("my-playwright-workspace");

// Or check if exists first
bool exists = await workspaceCollection.ExistsAsync("my-playwright-workspace");
if (exists)
{
    var existingWorkspace = await workspaceCollection.GetAsync("my-playwright-workspace");
    Console.WriteLine($"Workspace found: {existingWorkspace.Value.Data.Name}");
}
```

### 3. List Workspaces

```csharp
// List in resource group
await foreach (var workspace in workspaceCollection.GetAllAsync())
{
    Console.WriteLine($"Workspace: {workspace.Data.Name}");
    Console.WriteLine($"  Location: {workspace.Data.Location}");
    Console.WriteLine($"  State: {workspace.Data.ProvisioningState}");
    Console.WriteLine($"  Data Plane URI: {workspace.Data.DataplaneUri}");
}

// List across subscription
await foreach (var workspace in subscription.GetPlaywrightWorkspacesAsync())
{
    Console.WriteLine($"Workspace: {workspace.Data.Name}");
}
```

### 4. Update Workspace

```csharp
var patch = new PlaywrightWorkspacePatch
{
    Tags =
    {
        ["Team"] = "Dev Exp",
        ["Environment"] = "Staging",
        ["UpdatedAt"] = DateTime.UtcNow.ToString("o")
    }
};

var updatedWorkspace = await workspace.Value.UpdateAsync(patch);
```

### 5. Check Name Availability

```csharp
using Azure.ResourceManager.Playwright.Models;

var checkRequest = new PlaywrightCheckNameAvailabilityContent
{
    Name = "my-new-workspace",
    ResourceType = "Microsoft.LoadTestService/playwrightWorkspaces"
};

var result = await subscription.CheckPlaywrightNameAvailabilityAsync(checkRequest);

if (result.Value.IsNameAvailable == true)
{
    Console.WriteLine("Name is available!");
}
else
{
    Console.WriteLine($"Name unavailable: {result.Value.Message}");
    Console.WriteLine($"Reason: {result.Value.Reason}");
}
```

### 6. Get Quota Information

```csharp
// Subscription-level quotas
await foreach (var quota in subscription.GetPlaywrightQuotasAsync(AzureLocation.WestUS3))
{
    Console.WriteLine($"Quota: {quota.Data.Name}");
    Console.WriteLine($"  Limit: {quota.Data.Limit}");
    Console.WriteLine($"  Used: {quota.Data.Used}");
}

// Workspace-level quotas
var workspaceQuotas = workspace.Value.GetAllPlaywrightWorkspaceQuota();
await foreach (var quota in workspaceQuotas.GetAllAsync())
{
    Console.WriteLine($"Workspace Quota: {quota.Data.Name}");
}
```

### 7. Delete Workspace

```csharp
// Delete (long-running operation)
await workspace.Value.DeleteAsync(WaitUntil.Completed);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `PlaywrightWorkspaceResource` | Represents a Playwright Testing workspace |
| `PlaywrightWorkspaceCollection` | Collection for workspace CRUD |
| `PlaywrightWorkspaceData` | Workspace creation/response payload |
| `PlaywrightWorkspacePatch` | Workspace update payload |
| `PlaywrightQuotaResource` | Subscription-level quota information |
| `PlaywrightWorkspaceQuotaResource` | Workspace-level quota information |
| `PlaywrightExtensions` | Extension methods for ARM resources |
| `PlaywrightCheckNameAvailabilityContent` | Name availability check request |

## Workspace Properties

| Property | Description |
|----------|-------------|
| `DataplaneUri` | URI for running tests (e.g., `https://api.dataplane.{guid}.domain.com`) |
| `WorkspaceId` | Unique workspace identifier (GUID) |
| `RegionalAffinity` | Enable/disable regional affinity for test execution |
| `LocalAuth` | Enable/disable local authentication (access tokens) |
| `ProvisioningState` | Current provisioning state (Succeeded, Failed, etc.) |

## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** when you want to poll manually or run operations in parallel
3. **Always use `DefaultAzureCredential`** — never hardcode keys
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Navigate hierarchy** via `Get*` methods (e.g., `resourceGroup.GetPlaywrightWorkspaces()`)
7. **Store the DataplaneUri** after workspace creation for test execution configuration

## Error Handling

```csharp
using Azure;

try
{
    var operation = await workspaceCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, workspaceName, workspaceData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Workspace already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Bad request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Integration with Test Execution

After creating a workspace, use the `DataplaneUri` to configure your Playwright tests:

```csharp
// 1. Create workspace (this SDK)
var workspace = await workspaceCollection.CreateOrUpdateAsync(
    WaitUntil.Completed, "my-workspace", workspaceData);

// 2. Get the service URL
var serviceUrl = workspace.Value.Data.DataplaneUri;

// 3. Set environment variable for test execution
Environment.SetEnvironmentVariable("PLAYWRIGHT_SERVICE_URL", serviceUrl.ToString());

// 4. Run tests using Azure.Developer.MicrosoftPlaywrightTesting.NUnit
// (separate package for test execution)
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.Playwright` | Management plane (this SDK) | `dotnet add package Azure.ResourceManager.Playwright` |
| `Azure.Developer.MicrosoftPlaywrightTesting.NUnit` | Run NUnit Playwright tests at scale | `dotnet add package Azure.Developer.MicrosoftPlaywrightTesting.NUnit --prerelease` |
| `Azure.Developer.Playwright` | Playwright client library | `dotnet add package Azure.Developer.Playwright` |

## API Information

- **Resource Provider**: `Microsoft.LoadTestService`
- **Default API Version**: `2025-09-01`
- **Resource Type**: `Microsoft.LoadTestService/playwrightWorkspaces`

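Given the provider and resource type above, a workspace's fully qualified ARM resource ID follows the standard ARM pattern, which is what the ID-based extension methods expect. A sketch (the `GetPlaywrightWorkspaceResource` extension is assumed from the `PlaywrightExtensions` entry in the Key Types table; `subscriptionId` and `armClient` come from the Authentication section):

```csharp
using Azure.Core;

// Standard ARM resource ID shape for this resource type.
var workspaceId = new ResourceIdentifier(
    $"/subscriptions/{subscriptionId}" +
    "/resourceGroups/my-resource-group" +
    "/providers/Microsoft.LoadTestService/playwrightWorkspaces/my-playwright-workspace");

// Assumed extension method exposed by PlaywrightExtensions.
var workspaceResource = armClient.GetPlaywrightWorkspaceResource(workspaceId);
var workspace = await workspaceResource.GetAsync();
```

This is useful when you already know the resource ID (for example, from a deployment output) and want to skip navigating the subscription/resource-group hierarchy.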
## Documentation Links

- [Azure.ResourceManager.Playwright API Reference](https://learn.microsoft.com/en-us/dotnet/api/azure.resourcemanager.playwright)
- [Microsoft Playwright Testing Overview](https://learn.microsoft.com/en-us/azure/playwright-testing/overview-what-is-microsoft-playwright-testing)
- [Quickstart: Run Playwright Tests at Scale](https://learn.microsoft.com/en-us/azure/playwright-testing/quickstart-run-end-to-end-tests)

**New file** (250 lines): `skills/official/microsoft/dotnet/data/cosmosdb/SKILL.md`

---
name: azure-resource-manager-cosmosdb-dotnet
description: |
  Azure Resource Manager SDK for Cosmos DB in .NET. Use for MANAGEMENT PLANE operations: creating/managing Cosmos DB accounts, databases, containers, throughput settings, and RBAC via Azure Resource Manager. NOT for data plane operations (CRUD on documents) - use Microsoft.Azure.Cosmos for that. Triggers: "Cosmos DB account", "create Cosmos account", "manage Cosmos resources", "ARM Cosmos", "CosmosDBAccountResource", "provision Cosmos DB".
package: Azure.ResourceManager.CosmosDB
---

# Azure.ResourceManager.CosmosDB (.NET)

Management plane SDK for provisioning and managing Azure Cosmos DB resources via Azure Resource Manager.

> **⚠️ Management vs Data Plane**
> - **This SDK (Azure.ResourceManager.CosmosDB)**: Create accounts, databases, containers, configure throughput, manage RBAC
> - **Data Plane SDK (Microsoft.Azure.Cosmos)**: CRUD operations on documents, queries, stored procedures execution

## Installation

```bash
dotnet add package Azure.ResourceManager.CosmosDB
dotnet add package Azure.Identity
```

**Current Versions**: Stable v1.4.0, Preview v1.4.0-beta.13

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Core; // ResourceIdentifier
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.CosmosDB;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── CosmosDBAccountResource
            ├── CosmosDBSqlDatabaseResource
            │   └── CosmosDBSqlContainerResource
            │       ├── CosmosDBSqlStoredProcedureResource
            │       ├── CosmosDBSqlTriggerResource
            │       └── CosmosDBSqlUserDefinedFunctionResource
            ├── CassandraKeyspaceResource
            ├── GremlinDatabaseResource
            ├── MongoDBDatabaseResource
            └── CosmosDBTableResource
```

## Core Workflow

### 1. Create Cosmos DB Account

```csharp
using Azure.ResourceManager.CosmosDB;
using Azure.ResourceManager.CosmosDB.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define account
var accountData = new CosmosDBAccountCreateOrUpdateContent(
    location: AzureLocation.EastUS,
    locations: new[]
    {
        new CosmosDBAccountLocation
        {
            LocationName = AzureLocation.EastUS,
            FailoverPriority = 0,
            IsZoneRedundant = false
        }
    })
{
    Kind = CosmosDBAccountKind.GlobalDocumentDB,
    ConsistencyPolicy = new ConsistencyPolicy(DefaultConsistencyLevel.Session),
    EnableAutomaticFailover = true
};

// Create account (long-running operation)
var accountCollection = resourceGroup.Value.GetCosmosDBAccounts();
var operation = await accountCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-cosmos-account",
    accountData);

CosmosDBAccountResource account = operation.Value;
```

### 2. Create SQL Database

```csharp
var databaseData = new CosmosDBSqlDatabaseCreateOrUpdateContent(
    new CosmosDBSqlDatabaseResourceInfo("my-database"));

var databaseCollection = account.GetCosmosDBSqlDatabases();
var dbOperation = await databaseCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-database",
    databaseData);

CosmosDBSqlDatabaseResource database = dbOperation.Value;
```

### 3. Create SQL Container

```csharp
var containerData = new CosmosDBSqlContainerCreateOrUpdateContent(
    new CosmosDBSqlContainerResourceInfo("my-container")
    {
        PartitionKey = new CosmosDBContainerPartitionKey
        {
            Paths = { "/partitionKey" },
            Kind = CosmosDBPartitionKind.Hash
        },
        IndexingPolicy = new CosmosDBIndexingPolicy
        {
            Automatic = true,
            IndexingMode = CosmosDBIndexingMode.Consistent
        },
        DefaultTtl = 86400 // 24 hours
    });

var containerCollection = database.GetCosmosDBSqlContainers();
var containerOperation = await containerCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-container",
    containerData);

CosmosDBSqlContainerResource container = containerOperation.Value;
```

### 4. Configure Throughput

```csharp
// Manual throughput
var throughputData = new ThroughputSettingsUpdateData(
    new ThroughputSettingsResourceInfo
    {
        Throughput = 400
    });

// Autoscale throughput
var autoscaleData = new ThroughputSettingsUpdateData(
    new ThroughputSettingsResourceInfo
    {
        AutoscaleSettings = new AutoscaleSettingsResourceInfo
        {
            MaxThroughput = 4000
}
|
||||
});
|
||||
|
||||
// Apply to database
|
||||
await database.CreateOrUpdateCosmosDBSqlDatabaseThroughputAsync(
|
||||
WaitUntil.Completed,
|
||||
throughputData);
|
||||
```
|
||||
|
||||
### 5. Get Connection Information
|
||||
|
||||
```csharp
|
||||
// Get keys
|
||||
var keys = await account.GetKeysAsync();
|
||||
Console.WriteLine($"Primary Key: {keys.Value.PrimaryMasterKey}");
|
||||
|
||||
// Get connection strings
|
||||
var connectionStrings = await account.GetConnectionStringsAsync();
|
||||
foreach (var cs in connectionStrings.Value.ConnectionStrings)
|
||||
{
|
||||
Console.WriteLine($"{cs.Description}: {cs.ConnectionString}");
|
||||
}
|
||||
```
|
||||
|
||||
## Key Types Reference
|
||||
|
||||
| Type | Purpose |
|
||||
|------|---------|
|
||||
| `ArmClient` | Entry point for all ARM operations |
|
||||
| `CosmosDBAccountResource` | Represents a Cosmos DB account |
|
||||
| `CosmosDBAccountCollection` | Collection for account CRUD |
|
||||
| `CosmosDBSqlDatabaseResource` | SQL API database |
|
||||
| `CosmosDBSqlContainerResource` | SQL API container |
|
||||
| `CosmosDBAccountCreateOrUpdateContent` | Account creation payload |
|
||||
| `CosmosDBSqlDatabaseCreateOrUpdateContent` | Database creation payload |
|
||||
| `CosmosDBSqlContainerCreateOrUpdateContent` | Container creation payload |
|
||||
| `ThroughputSettingsUpdateData` | Throughput configuration |
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
|
||||
2. **Use `WaitUntil.Started`** when you want to poll manually or run operations in parallel
|
||||
3. **Always use `DefaultAzureCredential`** — never hardcode keys
|
||||
4. **Handle `RequestFailedException`** for ARM API errors
|
||||
5. **Use `CreateOrUpdateAsync`** for idempotent operations
|
||||
6. **Navigate hierarchy** via `Get*` methods (e.g., `account.GetCosmosDBSqlDatabases()`)
|
||||
|
||||
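Practice 2 can be sketched concretely: starting two long-running operations with `WaitUntil.Started` and awaiting both in parallel. This is a hedged sketch — the database names and payloads (`dbOneData`, `dbTwoData`) are hypothetical stand-ins for `CosmosDBSqlDatabaseCreateOrUpdateContent` instances built as in section 2 above.

```csharp
// Sketch: run two long-running creates in parallel (names/payloads illustrative).
var op1 = await databaseCollection.CreateOrUpdateAsync(
    WaitUntil.Started, "db-one", dbOneData);
var op2 = await databaseCollection.CreateOrUpdateAsync(
    WaitUntil.Started, "db-two", dbTwoData);

// Poll both operations to completion concurrently.
await Task.WhenAll(
    op1.WaitForCompletionAsync().AsTask(),
    op2.WaitForCompletionAsync().AsTask());
```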
## Error Handling

```csharp
using Azure;

try
{
    var operation = await accountCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, accountName, accountData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Account already exists");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Reference Files

| File | When to Read |
|------|--------------|
| [references/account-management.md](references/account-management.md) | Account CRUD, failover, keys, connection strings, networking |
| [references/sql-resources.md](references/sql-resources.md) | SQL databases, containers, stored procedures, triggers, UDFs |
| [references/throughput.md](references/throughput.md) | Manual/autoscale throughput, migration between modes |

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Microsoft.Azure.Cosmos` | Data plane (document CRUD, queries) | `dotnet add package Microsoft.Azure.Cosmos` |
| `Azure.ResourceManager.CosmosDB` | Management plane (this SDK) | `dotnet add package Azure.ResourceManager.CosmosDB` |
**New file:** `skills/official/microsoft/dotnet/data/fabric/SKILL.md` (+338 lines)
---
name: azure-mgmt-fabric-dotnet
description: |
  Azure Resource Manager SDK for Fabric in .NET. Use for MANAGEMENT PLANE operations: provisioning, scaling, suspending/resuming Microsoft Fabric capacities, checking name availability, and listing SKUs via Azure Resource Manager. Triggers: "Fabric capacity", "create capacity", "suspend capacity", "resume capacity", "Fabric SKU", "provision Fabric", "ARM Fabric", "FabricCapacityResource".
package: Azure.ResourceManager.Fabric
---

# Azure.ResourceManager.Fabric (.NET)

Management plane SDK for provisioning and managing Microsoft Fabric capacity resources via Azure Resource Manager.

> **Management Plane Only**
> This SDK manages Fabric *capacities* (compute resources). For working with Fabric workspaces, lakehouses, warehouses, and data items, use the Microsoft Fabric REST API or data plane SDKs.

## Installation

```bash
dotnet add package Azure.ResourceManager.Fabric
dotnet add package Azure.Identity
```

**Current Version**: 1.0.0 (GA - September 2025)
**API Version**: 2023-11-01
**Target Frameworks**: .NET 8.0, .NET Standard 2.0

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Fabric;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscription = await armClient.GetDefaultSubscriptionAsync();
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── FabricCapacityResource
```

## Core Workflows

### 1. Create Fabric Capacity

```csharp
using Azure.ResourceManager.Fabric;
using Azure.ResourceManager.Fabric.Models;
using Azure.Core;

// Get resource group
var resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

// Define capacity configuration
var administration = new FabricCapacityAdministration(
    new[] { "admin@contoso.com" } // Capacity administrators (UPNs or object IDs)
);

var properties = new FabricCapacityProperties(administration);

var sku = new FabricSku("F64", FabricSkuTier.Fabric);

var capacityData = new FabricCapacityData(
    AzureLocation.WestUS2,
    properties,
    sku)
{
    Tags = { ["Environment"] = "Production" }
};

// Create capacity (long-running operation)
var capacityCollection = resourceGroup.Value.GetFabricCapacities();
var operation = await capacityCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-fabric-capacity",
    capacityData);

FabricCapacityResource capacity = operation.Value;
Console.WriteLine($"Created capacity: {capacity.Data.Name}");
Console.WriteLine($"State: {capacity.Data.Properties.State}");
```

### 2. Get Fabric Capacity

```csharp
// Get existing capacity
var capacity = await resourceGroup.Value
    .GetFabricCapacityAsync("my-fabric-capacity");

Console.WriteLine($"Name: {capacity.Value.Data.Name}");
Console.WriteLine($"Location: {capacity.Value.Data.Location}");
Console.WriteLine($"SKU: {capacity.Value.Data.Sku.Name}");
Console.WriteLine($"State: {capacity.Value.Data.Properties.State}");
Console.WriteLine($"Provisioning State: {capacity.Value.Data.Properties.ProvisioningState}");
```

### 3. Update Capacity (Scale SKU or Change Admins)

```csharp
var capacity = await resourceGroup.Value
    .GetFabricCapacityAsync("my-fabric-capacity");

var patch = new FabricCapacityPatch
{
    Sku = new FabricSku("F128", FabricSkuTier.Fabric), // Scale up
    Properties = new FabricCapacityUpdateProperties
    {
        Administration = new FabricCapacityAdministration(
            new[] { "admin@contoso.com", "newadmin@contoso.com" }
        )
    }
};

var updateOperation = await capacity.Value.UpdateAsync(
    WaitUntil.Completed,
    patch);

Console.WriteLine($"Updated SKU: {updateOperation.Value.Data.Sku.Name}");
```

### 4. Suspend and Resume Capacity

```csharp
// Suspend capacity (stop billing for compute)
await capacity.Value.SuspendAsync(WaitUntil.Completed);
Console.WriteLine("Capacity suspended");

// Resume capacity
var resumeOperation = await capacity.Value.ResumeAsync(WaitUntil.Completed);
Console.WriteLine($"Capacity resumed. State: {resumeOperation.Value.Data.Properties.State}");
```

### 5. Delete Capacity

```csharp
await capacity.Value.DeleteAsync(WaitUntil.Completed);
Console.WriteLine("Capacity deleted");
```

### 6. List All Capacities

```csharp
// In a resource group
await foreach (var cap in resourceGroup.Value.GetFabricCapacities())
{
    Console.WriteLine($"- {cap.Data.Name} ({cap.Data.Sku.Name})");
}

// In a subscription
await foreach (var cap in subscription.GetFabricCapacitiesAsync())
{
    Console.WriteLine($"- {cap.Data.Name} in {cap.Data.Location}");
}
```

### 7. Check Name Availability

```csharp
var checkContent = new FabricNameAvailabilityContent
{
    Name = "my-new-capacity",
    ResourceType = "Microsoft.Fabric/capacities"
};

var result = await subscription.CheckFabricCapacityNameAvailabilityAsync(
    AzureLocation.WestUS2,
    checkContent);

if (result.Value.IsNameAvailable == true)
{
    Console.WriteLine("Name is available!");
}
else
{
    Console.WriteLine($"Name unavailable: {result.Value.Reason} - {result.Value.Message}");
}
```

### 8. List Available SKUs

```csharp
// List all SKUs available in subscription
await foreach (var skuDetails in subscription.GetSkusFabricCapacitiesAsync())
{
    Console.WriteLine($"SKU: {skuDetails.Name}");
    Console.WriteLine($"  Resource Type: {skuDetails.ResourceType}");
    foreach (var location in skuDetails.Locations)
    {
        Console.WriteLine($"  Location: {location}");
    }
}

// List SKUs available for an existing capacity (for scaling)
await foreach (var skuDetails in capacity.Value.GetSkusForCapacityAsync())
{
    Console.WriteLine($"Can scale to: {skuDetails.Sku.Name}");
}
```

## SKU Reference

| SKU Name | Capacity Units (CU) | Power BI Equivalent |
|----------|---------------------|---------------------|
| F2 | 2 | - |
| F4 | 4 | - |
| F8 | 8 | EM1/A1 |
| F16 | 16 | EM2/A2 |
| F32 | 32 | EM3/A3 |
| F64 | 64 | P1/A4 |
| F128 | 128 | P2/A5 |
| F256 | 256 | P3/A6 |
| F512 | 512 | P4/A7 |
| F1024 | 1024 | P5/A8 |
| F2048 | 2048 | - |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `FabricCapacityResource` | Represents a Fabric capacity instance |
| `FabricCapacityCollection` | Collection for capacity CRUD operations |
| `FabricCapacityData` | Capacity creation/read data model |
| `FabricCapacityPatch` | Capacity update payload |
| `FabricCapacityProperties` | Capacity properties (administration, state) |
| `FabricCapacityAdministration` | Admin members configuration |
| `FabricSku` | SKU configuration (name and tier) |
| `FabricSkuTier` | Pricing tier (currently only "Fabric") |
| `FabricProvisioningState` | Provisioning states (Succeeded, Failed, etc.) |
| `FabricResourceState` | Resource states (Active, Suspended, etc.) |
| `FabricNameAvailabilityContent` | Name availability check request |
| `FabricNameAvailabilityResult` | Name availability check response |

## Provisioning and Resource States

### Provisioning States (`FabricProvisioningState`)
- `Succeeded` - Operation completed successfully
- `Failed` - Operation failed
- `Canceled` - Operation was canceled
- `Deleting` - Capacity is being deleted
- `Provisioning` - Initial provisioning in progress
- `Updating` - Update operation in progress

### Resource States (`FabricResourceState`)
- `Active` - Capacity is running and available
- `Provisioning` - Being provisioned
- `Failed` - In failed state
- `Updating` - Being updated
- `Deleting` - Being deleted
- `Suspending` - Transitioning to suspended
- `Suspended` - Suspended (not billing for compute)
- `Pausing` - Transitioning to paused
- `Paused` - Paused
- `Resuming` - Resuming from suspended/paused
- `Scaling` - Scaling to different SKU
- `Preparing` - Preparing resources
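A common use of these states is to wait for a capacity to settle before acting on it. This is a hedged sketch under the assumption that `capacity` is the `Response<FabricCapacityResource>` from the workflows above and that a fixed 15-second poll interval is acceptable; the standard ARM resource `GetAsync()` re-reads the latest state.

```csharp
// Sketch: refresh until the capacity leaves a transitional state.
// `capacity` is assumed to be a Response<FabricCapacityResource> as above;
// the 15-second interval is arbitrary.
FabricCapacityResource current = capacity.Value;
while (current.Data.Properties.State == FabricResourceState.Provisioning ||
       current.Data.Properties.State == FabricResourceState.Updating)
{
    await Task.Delay(TimeSpan.FromSeconds(15));
    current = await current.GetAsync(); // re-read from ARM
}
Console.WriteLine($"Settled state: {current.Data.Properties.State}");
```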
## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** when you want to poll manually or run operations in parallel
3. **Always use `DefaultAzureCredential`** — never hardcode credentials
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Suspend when not in use** — Fabric capacities bill for compute even when idle
7. **Check provisioning state** before performing operations on a capacity
8. **Use appropriate SKU** — Start small (F2/F4) for dev/test, scale up for production

## Error Handling

```csharp
using Azure;

try
{
    var operation = await capacityCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, capacityName, capacityData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Capacity already exists or conflict");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex) when (ex.Status == 403)
{
    Console.WriteLine("Insufficient permissions or quota exceeded");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Common Pitfalls

1. **Capacity names must be globally unique** — Fabric capacity names must be unique across all Azure subscriptions
2. **Suspend doesn't delete** — Suspended capacities still exist but don't bill for compute
3. **SKU changes may require downtime** — Scaling operations can take several minutes
4. **Admin UPNs must be valid** — Capacity administrators must be valid Azure AD users
5. **Location constraints** — Not all SKUs are available in all regions; use `GetSkusFabricCapacitiesAsync` to check
6. **Long provisioning times** — Capacity creation can take 5-15 minutes

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.Fabric` | Management plane (this SDK) | `dotnet add package Azure.ResourceManager.Fabric` |
| `Microsoft.Fabric.Api` | Data plane operations (beta) | `dotnet add package Microsoft.Fabric.Api --prerelease` |
| `Azure.ResourceManager` | Core ARM SDK | `dotnet add package Azure.ResourceManager` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |

## References

- [Azure.ResourceManager.Fabric NuGet](https://www.nuget.org/packages/Azure.ResourceManager.Fabric)
- [GitHub Source](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/fabric/Azure.ResourceManager.Fabric)
- [Microsoft Fabric Documentation](https://learn.microsoft.com/fabric/)
- [Fabric Capacity Management](https://learn.microsoft.com/fabric/admin/service-admin-portal-capacity-settings)
**New file:** `skills/official/microsoft/dotnet/data/mysql/SKILL.md` (+392 lines)
---
name: azure-resource-manager-mysql-dotnet
description: |
  Azure MySQL Flexible Server SDK for .NET. Database management for MySQL Flexible Server deployments. Use for creating servers, databases, firewall rules, configurations, backups, and high availability. Triggers: "MySQL", "MySqlFlexibleServer", "MySQL Flexible Server", "Azure Database for MySQL", "MySQL database management", "MySQL firewall", "MySQL backup".
package: Azure.ResourceManager.MySql
---

# Azure.ResourceManager.MySql (.NET)

Azure Resource Manager SDK for managing MySQL Flexible Server deployments.

## Installation

```bash
dotnet add package Azure.ResourceManager.MySql
dotnet add package Azure.Identity
```

**Current Version**: v1.2.0 (GA)
**API Version**: 2023-12-30

> **Note**: This skill focuses on MySQL Flexible Server. Single Server is deprecated and scheduled for retirement.

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_MYSQL_SERVER_NAME=<your-mysql-server>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.MySql;
using Azure.ResourceManager.MySql.FlexibleServers;

ArmClient client = new ArmClient(new DefaultAzureCredential());
```

## Resource Hierarchy

```
Subscription
└── ResourceGroup
    └── MySqlFlexibleServer                      # MySQL Flexible Server instance
        ├── MySqlFlexibleServerDatabase          # Database within the server
        ├── MySqlFlexibleServerFirewallRule      # IP firewall rules
        ├── MySqlFlexibleServerConfiguration     # Server parameters
        ├── MySqlFlexibleServerBackup            # Backup information
        ├── MySqlFlexibleServerMaintenanceWindow # Maintenance schedule
        └── MySqlFlexibleServerAadAdministrator  # Entra ID admin
```

## Core Workflows

### 1. Create MySQL Flexible Server

```csharp
using Azure.ResourceManager.Resources;
using Azure.ResourceManager.MySql.FlexibleServers;
using Azure.ResourceManager.MySql.FlexibleServers.Models;

// Await the subscription first — don't mix await with .Result
SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

MySqlFlexibleServerCollection servers = resourceGroup.GetMySqlFlexibleServers();

MySqlFlexibleServerData data = new MySqlFlexibleServerData(AzureLocation.EastUS)
{
    Sku = new MySqlFlexibleServerSku("Standard_D2ds_v4", MySqlFlexibleServerSkuTier.GeneralPurpose),
    AdministratorLogin = "mysqladmin",
    AdministratorLoginPassword = "YourSecurePassword123!",
    Version = MySqlFlexibleServerVersion.Ver8_0_21,
    Storage = new MySqlFlexibleServerStorage
    {
        StorageSizeInGB = 128,
        AutoGrow = MySqlFlexibleServerEnableStatusEnum.Enabled,
        Iops = 3000
    },
    Backup = new MySqlFlexibleServerBackupProperties
    {
        BackupRetentionDays = 7,
        GeoRedundantBackup = MySqlFlexibleServerEnableStatusEnum.Disabled
    },
    HighAvailability = new MySqlFlexibleServerHighAvailability
    {
        Mode = MySqlFlexibleServerHighAvailabilityMode.ZoneRedundant,
        StandbyAvailabilityZone = "2"
    },
    AvailabilityZone = "1"
};

ArmOperation<MySqlFlexibleServerResource> operation = await servers
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-mysql-server", data);

MySqlFlexibleServerResource server = operation.Value;
Console.WriteLine($"Server created: {server.Data.FullyQualifiedDomainName}");
```

### 2. Create Database

```csharp
MySqlFlexibleServerResource server = await resourceGroup
    .GetMySqlFlexibleServerAsync("my-mysql-server");

MySqlFlexibleServerDatabaseCollection databases = server.GetMySqlFlexibleServerDatabases();

MySqlFlexibleServerDatabaseData dbData = new MySqlFlexibleServerDatabaseData
{
    Charset = "utf8mb4",
    Collation = "utf8mb4_unicode_ci"
};

ArmOperation<MySqlFlexibleServerDatabaseResource> operation = await databases
    .CreateOrUpdateAsync(WaitUntil.Completed, "myappdb", dbData);

MySqlFlexibleServerDatabaseResource database = operation.Value;
Console.WriteLine($"Database created: {database.Data.Name}");
```

### 3. Configure Firewall Rules

```csharp
MySqlFlexibleServerFirewallRuleCollection firewallRules = server.GetMySqlFlexibleServerFirewallRules();

// Allow specific IP range
MySqlFlexibleServerFirewallRuleData ruleData = new MySqlFlexibleServerFirewallRuleData
{
    StartIPAddress = System.Net.IPAddress.Parse("10.0.0.1"),
    EndIPAddress = System.Net.IPAddress.Parse("10.0.0.255")
};

ArmOperation<MySqlFlexibleServerFirewallRuleResource> operation = await firewallRules
    .CreateOrUpdateAsync(WaitUntil.Completed, "allow-internal", ruleData);

// Allow Azure services (0.0.0.0 start/end is the Azure-services sentinel)
MySqlFlexibleServerFirewallRuleData azureServicesRule = new MySqlFlexibleServerFirewallRuleData
{
    StartIPAddress = System.Net.IPAddress.Parse("0.0.0.0"),
    EndIPAddress = System.Net.IPAddress.Parse("0.0.0.0")
};

await firewallRules.CreateOrUpdateAsync(WaitUntil.Completed, "AllowAllAzureServicesAndResourcesWithinAzureIps", azureServicesRule);
```

### 4. Update Server Configuration

```csharp
MySqlFlexibleServerConfigurationCollection configurations = server.GetMySqlFlexibleServerConfigurations();

// Get current configuration
MySqlFlexibleServerConfigurationResource config = await configurations
    .GetAsync("max_connections");

// Update configuration
MySqlFlexibleServerConfigurationData configData = new MySqlFlexibleServerConfigurationData
{
    Value = "500",
    Source = MySqlFlexibleServerConfigurationSource.UserOverride
};

ArmOperation<MySqlFlexibleServerConfigurationResource> operation = await configurations
    .CreateOrUpdateAsync(WaitUntil.Completed, "max_connections", configData);

// Common configurations to tune
string[] commonParams = { "max_connections", "innodb_buffer_pool_size", "slow_query_log", "long_query_time" };
```

### 5. Configure Entra ID Administrator

```csharp
MySqlFlexibleServerAadAdministratorCollection admins = server.GetMySqlFlexibleServerAadAdministrators();

MySqlFlexibleServerAadAdministratorData adminData = new MySqlFlexibleServerAadAdministratorData
{
    AdministratorType = MySqlFlexibleServerAdministratorType.ActiveDirectory,
    Login = "aad-admin@contoso.com",
    Sid = Guid.Parse("<entra-object-id>"),
    TenantId = Guid.Parse("<tenant-id>"),
    IdentityResourceId = new ResourceIdentifier("/subscriptions/.../userAssignedIdentities/mysql-identity")
};

ArmOperation<MySqlFlexibleServerAadAdministratorResource> operation = await admins
    .CreateOrUpdateAsync(WaitUntil.Completed, "ActiveDirectory", adminData);
```

### 6. List and Manage Servers

```csharp
// List servers in resource group
await foreach (MySqlFlexibleServerResource server in resourceGroup.GetMySqlFlexibleServers())
{
    Console.WriteLine($"Server: {server.Data.Name}");
    Console.WriteLine($"  FQDN: {server.Data.FullyQualifiedDomainName}");
    Console.WriteLine($"  Version: {server.Data.Version}");
    Console.WriteLine($"  State: {server.Data.State}");
    Console.WriteLine($"  SKU: {server.Data.Sku.Name} ({server.Data.Sku.Tier})");
}

// List databases in server
await foreach (MySqlFlexibleServerDatabaseResource db in server.GetMySqlFlexibleServerDatabases())
{
    Console.WriteLine($"Database: {db.Data.Name}");
}
```

### 7. Backup and Restore

```csharp
// List available backups
await foreach (MySqlFlexibleServerBackupResource backup in server.GetMySqlFlexibleServerBackups())
{
    Console.WriteLine($"Backup: {backup.Data.Name}");
    Console.WriteLine($"  Type: {backup.Data.BackupType}");
    Console.WriteLine($"  Completed: {backup.Data.CompletedOn}");
}

// Point-in-time restore
MySqlFlexibleServerData restoreData = new MySqlFlexibleServerData(AzureLocation.EastUS)
{
    CreateMode = MySqlFlexibleServerCreateMode.PointInTimeRestore,
    SourceServerResourceId = server.Id,
    RestorePointInTime = DateTimeOffset.UtcNow.AddHours(-2)
};

ArmOperation<MySqlFlexibleServerResource> operation = await servers
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-mysql-restored", restoreData);
```

### 8. Stop and Start Server

```csharp
MySqlFlexibleServerResource server = await resourceGroup
    .GetMySqlFlexibleServerAsync("my-mysql-server");

// Stop server (saves costs when not in use)
await server.StopAsync(WaitUntil.Completed);

// Start server
await server.StartAsync(WaitUntil.Completed);

// Restart server
await server.RestartAsync(WaitUntil.Completed, new MySqlFlexibleServerRestartParameter
{
    RestartWithFailover = MySqlFlexibleServerEnableStatusEnum.Enabled,
    MaxFailoverSeconds = 60
});
```

### 9. Update Server (Scale)

```csharp
MySqlFlexibleServerResource server = await resourceGroup
    .GetMySqlFlexibleServerAsync("my-mysql-server");

MySqlFlexibleServerPatch patch = new MySqlFlexibleServerPatch
{
    Sku = new MySqlFlexibleServerSku("Standard_D4ds_v4", MySqlFlexibleServerSkuTier.GeneralPurpose),
    Storage = new MySqlFlexibleServerStorage
    {
        StorageSizeInGB = 256,
        Iops = 6000
    }
};

ArmOperation<MySqlFlexibleServerResource> operation = await server
    .UpdateAsync(WaitUntil.Completed, patch);
```

### 10. Delete Server

```csharp
MySqlFlexibleServerResource server = await resourceGroup
    .GetMySqlFlexibleServerAsync("my-mysql-server");

await server.DeleteAsync(WaitUntil.Completed);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `MySqlFlexibleServerResource` | Flexible Server instance |
| `MySqlFlexibleServerData` | Server configuration data |
| `MySqlFlexibleServerCollection` | Collection of servers |
| `MySqlFlexibleServerDatabaseResource` | Database within server |
| `MySqlFlexibleServerFirewallRuleResource` | IP firewall rule |
| `MySqlFlexibleServerConfigurationResource` | Server parameter |
| `MySqlFlexibleServerBackupResource` | Backup metadata |
| `MySqlFlexibleServerAadAdministratorResource` | Entra ID admin |
| `MySqlFlexibleServerSku` | SKU (compute tier + size) |
| `MySqlFlexibleServerStorage` | Storage configuration |
| `MySqlFlexibleServerHighAvailability` | HA configuration |
| `MySqlFlexibleServerBackupProperties` | Backup settings |

## SKU Tiers

| Tier | Use Case | SKU Examples |
|------|----------|--------------|
| `Burstable` | Dev/test, light workloads | Standard_B1ms, Standard_B2s |
| `GeneralPurpose` | Production workloads | Standard_D2ds_v4, Standard_D4ds_v4 |
| `MemoryOptimized` | High memory requirements | Standard_E2ds_v4, Standard_E4ds_v4 |

## High Availability Modes

| Mode | Description |
|------|-------------|
| `Disabled` | No HA (single server) |
| `SameZone` | HA within same availability zone |
| `ZoneRedundant` | HA across availability zones |
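Putting the tier and HA tables together, a minimal Burstable dev/test server with HA disabled might look like the following sketch — the SKU name, storage size, and placeholder password are illustrative, not prescriptive:

```csharp
// Sketch: minimal Burstable dev/test server — HA disabled, small storage.
// Values are illustrative; substitute your own credentials and sizing.
MySqlFlexibleServerData devData = new MySqlFlexibleServerData(AzureLocation.EastUS)
{
    Sku = new MySqlFlexibleServerSku("Standard_B1ms", MySqlFlexibleServerSkuTier.Burstable),
    AdministratorLogin = "mysqladmin",
    AdministratorLoginPassword = "<dev-password>",
    Version = MySqlFlexibleServerVersion.Ver8_0_21,
    Storage = new MySqlFlexibleServerStorage { StorageSizeInGB = 20 },
    HighAvailability = new MySqlFlexibleServerHighAvailability
    {
        Mode = MySqlFlexibleServerHighAvailabilityMode.Disabled
    }
};
```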
## Best Practices
|
||||
|
||||
1. **Use Flexible Server** — Single Server is deprecated
|
||||
2. **Enable zone-redundant HA** — For production workloads
|
||||
3. **Use DefaultAzureCredential** — Prefer over connection strings
|
||||
4. **Configure Entra ID authentication** — More secure than SQL auth
|
||||
5. **Enable auto-grow storage** — Prevents out-of-space issues
|
||||
6. **Set appropriate backup retention** — 7-35 days based on compliance
|
||||
7. **Use private endpoints** — For secure network access
|
||||
8. **Tune server parameters** — Based on workload characteristics
|
||||
9. **Monitor with Azure Monitor** — Enable metrics and logs
|
||||
10. **Stop dev/test servers** — Save costs when not in use
|
||||
|
||||

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<MySqlFlexibleServerResource> operation = await servers
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-mysql", data);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Server already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## Connection String

After creating the server, connect using:

```csharp
// ADO.NET connection string
string connectionString = $"Server={server.Data.FullyQualifiedDomainName};" +
    "Database=myappdb;" +
    "User Id=mysqladmin;" +
    "Password=YourSecurePassword123!;" +
    "SslMode=Required;";

// With Entra ID token (recommended)
var credential = new DefaultAzureCredential();
var token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://ossrdbms-aad.database.windows.net/.default" }));

string entraConnectionString = $"Server={server.Data.FullyQualifiedDomainName};" +
    "Database=myappdb;" +
    "User Id=aad-admin@contoso.com;" +
    $"Password={token.Token};" +
    "SslMode=Required;";
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.MySql` | MySQL management (this SDK) | `dotnet add package Azure.ResourceManager.MySql` |
| `Azure.ResourceManager.PostgreSql` | PostgreSQL management | `dotnet add package Azure.ResourceManager.PostgreSql` |
| `MySqlConnector` | MySQL data access | `dotnet add package MySqlConnector` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.MySql |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.mysql |
| Product Documentation | https://learn.microsoft.com/azure/mysql/flexible-server/ |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/mysql/Azure.ResourceManager.MySql |

**File:** `skills/official/microsoft/dotnet/data/postgresql/SKILL.md` (new file, 432 lines)

---
name: azure-resource-manager-postgresql-dotnet
description: |
  Azure PostgreSQL Flexible Server SDK for .NET. Database management for PostgreSQL Flexible Server deployments. Use for creating servers, databases, firewall rules, configurations, backups, and high availability. Triggers: "PostgreSQL", "PostgreSqlFlexibleServer", "PostgreSQL Flexible Server", "Azure Database for PostgreSQL", "PostgreSQL database management", "PostgreSQL firewall", "PostgreSQL backup", "Postgres".
package: Azure.ResourceManager.PostgreSql
---

# Azure.ResourceManager.PostgreSql (.NET)

Azure Resource Manager SDK for managing PostgreSQL Flexible Server deployments.

## Installation

```bash
dotnet add package Azure.ResourceManager.PostgreSql
dotnet add package Azure.Identity
```

**Current Version**: v1.2.0 (GA)
**API Version**: 2023-12-01-preview

> **Note**: This skill focuses on PostgreSQL Flexible Server. Single Server is deprecated and scheduled for retirement.

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_POSTGRESQL_SERVER_NAME=<your-postgresql-server>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.PostgreSql;
using Azure.ResourceManager.PostgreSql.FlexibleServers;

ArmClient client = new ArmClient(new DefaultAzureCredential());
```

## Resource Hierarchy

```
Subscription
└── ResourceGroup
    └── PostgreSqlFlexibleServer                                  # PostgreSQL Flexible Server instance
        ├── PostgreSqlFlexibleServerDatabase                      # Database within the server
        ├── PostgreSqlFlexibleServerFirewallRule                  # IP firewall rules
        ├── PostgreSqlFlexibleServerConfiguration                 # Server parameters
        ├── PostgreSqlFlexibleServerBackup                        # Backup information
        ├── PostgreSqlFlexibleServerActiveDirectoryAdministrator  # Entra ID admin
        └── PostgreSqlFlexibleServerVirtualEndpoint               # Read replica endpoints
```

## Core Workflows

### 1. Create PostgreSQL Flexible Server

```csharp
using Azure.ResourceManager.PostgreSql.FlexibleServers;
using Azure.ResourceManager.PostgreSql.FlexibleServers.Models;

SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

PostgreSqlFlexibleServerCollection servers = resourceGroup.GetPostgreSqlFlexibleServers();

PostgreSqlFlexibleServerData data = new PostgreSqlFlexibleServerData(AzureLocation.EastUS)
{
    Sku = new PostgreSqlFlexibleServerSku("Standard_D2ds_v4", PostgreSqlFlexibleServerSkuTier.GeneralPurpose),
    AdministratorLogin = "pgadmin",
    AdministratorLoginPassword = "YourSecurePassword123!",
    Version = PostgreSqlFlexibleServerVersion.Ver16,
    Storage = new PostgreSqlFlexibleServerStorage
    {
        StorageSizeInGB = 128,
        AutoGrow = StorageAutoGrow.Enabled,
        Tier = PostgreSqlStorageTierName.P30
    },
    Backup = new PostgreSqlFlexibleServerBackupProperties
    {
        BackupRetentionDays = 7,
        GeoRedundantBackup = PostgreSqlFlexibleServerGeoRedundantBackupEnum.Disabled
    },
    HighAvailability = new PostgreSqlFlexibleServerHighAvailability
    {
        Mode = PostgreSqlFlexibleServerHighAvailabilityMode.ZoneRedundant,
        StandbyAvailabilityZone = "2"
    },
    AvailabilityZone = "1",
    AuthConfig = new PostgreSqlFlexibleServerAuthConfig
    {
        ActiveDirectoryAuth = PostgreSqlFlexibleServerActiveDirectoryAuthEnum.Enabled,
        PasswordAuth = PostgreSqlFlexibleServerPasswordAuthEnum.Enabled
    }
};

ArmOperation<PostgreSqlFlexibleServerResource> operation = await servers
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-postgresql-server", data);

PostgreSqlFlexibleServerResource server = operation.Value;
Console.WriteLine($"Server created: {server.Data.FullyQualifiedDomainName}");
```

### 2. Create Database

```csharp
PostgreSqlFlexibleServerResource server = await resourceGroup
    .GetPostgreSqlFlexibleServerAsync("my-postgresql-server");

PostgreSqlFlexibleServerDatabaseCollection databases = server.GetPostgreSqlFlexibleServerDatabases();

PostgreSqlFlexibleServerDatabaseData dbData = new PostgreSqlFlexibleServerDatabaseData
{
    Charset = "UTF8",
    Collation = "en_US.utf8"
};

ArmOperation<PostgreSqlFlexibleServerDatabaseResource> operation = await databases
    .CreateOrUpdateAsync(WaitUntil.Completed, "myappdb", dbData);

PostgreSqlFlexibleServerDatabaseResource database = operation.Value;
Console.WriteLine($"Database created: {database.Data.Name}");
```

### 3. Configure Firewall Rules

```csharp
PostgreSqlFlexibleServerFirewallRuleCollection firewallRules = server.GetPostgreSqlFlexibleServerFirewallRules();

// Allow specific IP range
PostgreSqlFlexibleServerFirewallRuleData ruleData = new PostgreSqlFlexibleServerFirewallRuleData
{
    StartIPAddress = System.Net.IPAddress.Parse("10.0.0.1"),
    EndIPAddress = System.Net.IPAddress.Parse("10.0.0.255")
};

ArmOperation<PostgreSqlFlexibleServerFirewallRuleResource> operation = await firewallRules
    .CreateOrUpdateAsync(WaitUntil.Completed, "allow-internal", ruleData);

// Allow Azure services
PostgreSqlFlexibleServerFirewallRuleData azureServicesRule = new PostgreSqlFlexibleServerFirewallRuleData
{
    StartIPAddress = System.Net.IPAddress.Parse("0.0.0.0"),
    EndIPAddress = System.Net.IPAddress.Parse("0.0.0.0")
};

await firewallRules.CreateOrUpdateAsync(WaitUntil.Completed, "AllowAllAzureServicesAndResourcesWithinAzureIps", azureServicesRule);
```

### 4. Update Server Configuration

```csharp
PostgreSqlFlexibleServerConfigurationCollection configurations = server.GetPostgreSqlFlexibleServerConfigurations();

// Get current configuration
PostgreSqlFlexibleServerConfigurationResource config = await configurations
    .GetAsync("max_connections");

// Update configuration
PostgreSqlFlexibleServerConfigurationData configData = new PostgreSqlFlexibleServerConfigurationData
{
    Value = "500",
    Source = "user-override"
};

ArmOperation<PostgreSqlFlexibleServerConfigurationResource> operation = await configurations
    .CreateOrUpdateAsync(WaitUntil.Completed, "max_connections", configData);

// Common PostgreSQL configurations to tune
string[] commonParams = {
    "max_connections",
    "shared_buffers",
    "work_mem",
    "maintenance_work_mem",
    "effective_cache_size",
    "log_min_duration_statement"
};
```

### 5. Configure Entra ID Administrator

```csharp
PostgreSqlFlexibleServerActiveDirectoryAdministratorCollection admins =
    server.GetPostgreSqlFlexibleServerActiveDirectoryAdministrators();

PostgreSqlFlexibleServerActiveDirectoryAdministratorData adminData =
    new PostgreSqlFlexibleServerActiveDirectoryAdministratorData
    {
        PrincipalType = PostgreSqlFlexibleServerPrincipalType.User,
        PrincipalName = "aad-admin@contoso.com",
        TenantId = Guid.Parse("<tenant-id>")
    };

ArmOperation<PostgreSqlFlexibleServerActiveDirectoryAdministratorResource> operation = await admins
    .CreateOrUpdateAsync(WaitUntil.Completed, "<entra-object-id>", adminData);
```

### 6. List and Manage Servers

```csharp
// List servers in resource group
await foreach (PostgreSqlFlexibleServerResource server in resourceGroup.GetPostgreSqlFlexibleServers())
{
    Console.WriteLine($"Server: {server.Data.Name}");
    Console.WriteLine($"  FQDN: {server.Data.FullyQualifiedDomainName}");
    Console.WriteLine($"  Version: {server.Data.Version}");
    Console.WriteLine($"  State: {server.Data.State}");
    Console.WriteLine($"  SKU: {server.Data.Sku.Name} ({server.Data.Sku.Tier})");
    Console.WriteLine($"  HA: {server.Data.HighAvailability?.Mode}");
}

// List databases in server
await foreach (PostgreSqlFlexibleServerDatabaseResource db in server.GetPostgreSqlFlexibleServerDatabases())
{
    Console.WriteLine($"Database: {db.Data.Name}");
}
```

### 7. Backup and Point-in-Time Restore

```csharp
// List available backups
await foreach (PostgreSqlFlexibleServerBackupResource backup in server.GetPostgreSqlFlexibleServerBackups())
{
    Console.WriteLine($"Backup: {backup.Data.Name}");
    Console.WriteLine($"  Type: {backup.Data.BackupType}");
    Console.WriteLine($"  Completed: {backup.Data.CompletedOn}");
}

// Point-in-time restore
PostgreSqlFlexibleServerData restoreData = new PostgreSqlFlexibleServerData(AzureLocation.EastUS)
{
    CreateMode = PostgreSqlFlexibleServerCreateMode.PointInTimeRestore,
    SourceServerResourceId = server.Id,
    PointInTimeUtc = DateTimeOffset.UtcNow.AddHours(-2)
};

ArmOperation<PostgreSqlFlexibleServerResource> operation = await servers
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-postgresql-restored", restoreData);
```

### 8. Create Read Replica

```csharp
PostgreSqlFlexibleServerData replicaData = new PostgreSqlFlexibleServerData(AzureLocation.WestUS)
{
    CreateMode = PostgreSqlFlexibleServerCreateMode.Replica,
    SourceServerResourceId = server.Id,
    Sku = new PostgreSqlFlexibleServerSku("Standard_D2ds_v4", PostgreSqlFlexibleServerSkuTier.GeneralPurpose)
};

ArmOperation<PostgreSqlFlexibleServerResource> operation = await servers
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-postgresql-replica", replicaData);
```

### 9. Stop and Start Server

```csharp
PostgreSqlFlexibleServerResource server = await resourceGroup
    .GetPostgreSqlFlexibleServerAsync("my-postgresql-server");

// Stop server (saves costs when not in use)
await server.StopAsync(WaitUntil.Completed);

// Start server
await server.StartAsync(WaitUntil.Completed);

// Restart server
await server.RestartAsync(WaitUntil.Completed, new PostgreSqlFlexibleServerRestartParameter
{
    RestartWithFailover = true,
    FailoverMode = PostgreSqlFlexibleServerFailoverMode.PlannedFailover
});
```

### 10. Update Server (Scale)

```csharp
PostgreSqlFlexibleServerResource server = await resourceGroup
    .GetPostgreSqlFlexibleServerAsync("my-postgresql-server");

PostgreSqlFlexibleServerPatch patch = new PostgreSqlFlexibleServerPatch
{
    Sku = new PostgreSqlFlexibleServerSku("Standard_D4ds_v4", PostgreSqlFlexibleServerSkuTier.GeneralPurpose),
    Storage = new PostgreSqlFlexibleServerStorage
    {
        StorageSizeInGB = 256,
        Tier = PostgreSqlStorageTierName.P40
    }
};

ArmOperation<PostgreSqlFlexibleServerResource> operation = await server
    .UpdateAsync(WaitUntil.Completed, patch);
```

### 11. Delete Server

```csharp
PostgreSqlFlexibleServerResource server = await resourceGroup
    .GetPostgreSqlFlexibleServerAsync("my-postgresql-server");

await server.DeleteAsync(WaitUntil.Completed);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `PostgreSqlFlexibleServerResource` | Flexible Server instance |
| `PostgreSqlFlexibleServerData` | Server configuration data |
| `PostgreSqlFlexibleServerCollection` | Collection of servers |
| `PostgreSqlFlexibleServerDatabaseResource` | Database within server |
| `PostgreSqlFlexibleServerFirewallRuleResource` | IP firewall rule |
| `PostgreSqlFlexibleServerConfigurationResource` | Server parameter |
| `PostgreSqlFlexibleServerBackupResource` | Backup metadata |
| `PostgreSqlFlexibleServerActiveDirectoryAdministratorResource` | Entra ID admin |
| `PostgreSqlFlexibleServerSku` | SKU (compute tier + size) |
| `PostgreSqlFlexibleServerStorage` | Storage configuration |
| `PostgreSqlFlexibleServerHighAvailability` | HA configuration |
| `PostgreSqlFlexibleServerBackupProperties` | Backup settings |
| `PostgreSqlFlexibleServerAuthConfig` | Authentication settings |

## SKU Tiers

| Tier | Use Case | SKU Examples |
|------|----------|--------------|
| `Burstable` | Dev/test, light workloads | Standard_B1ms, Standard_B2s |
| `GeneralPurpose` | Production workloads | Standard_D2ds_v4, Standard_D4ds_v4 |
| `MemoryOptimized` | High memory requirements | Standard_E2ds_v4, Standard_E4ds_v4 |

## PostgreSQL Versions

| Version | Enum Value |
|---------|------------|
| PostgreSQL 11 | `Ver11` |
| PostgreSQL 12 | `Ver12` |
| PostgreSQL 13 | `Ver13` |
| PostgreSQL 14 | `Ver14` |
| PostgreSQL 15 | `Ver15` |
| PostgreSQL 16 | `Ver16` |

## High Availability Modes

| Mode | Description |
|------|-------------|
| `Disabled` | No HA (single server) |
| `SameZone` | HA within same availability zone |
| `ZoneRedundant` | HA across availability zones |

## Best Practices

1. **Use Flexible Server** — Single Server is deprecated
2. **Enable zone-redundant HA** — For production workloads
3. **Use DefaultAzureCredential** — Prefer over connection strings
4. **Configure Entra ID authentication** — More secure than SQL auth alone
5. **Enable both auth methods** — Entra ID + password for flexibility
6. **Set appropriate backup retention** — 7-35 days based on compliance
7. **Use private endpoints** — For secure network access
8. **Tune server parameters** — Based on workload characteristics
9. **Use read replicas** — For read-heavy workloads
10. **Stop dev/test servers** — Save costs when not in use

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<PostgreSqlFlexibleServerResource> operation = await servers
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-postgresql", data);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Server already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## Connection String

After creating the server, connect using:

```csharp
// Npgsql connection string
string connectionString = $"Host={server.Data.FullyQualifiedDomainName};" +
    "Database=myappdb;" +
    "Username=pgadmin;" +
    "Password=YourSecurePassword123!;" +
    "SSL Mode=Require;Trust Server Certificate=true;";

// With Entra ID token (recommended)
var credential = new DefaultAzureCredential();
var token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://ossrdbms-aad.database.windows.net/.default" }));

string entraConnectionString = $"Host={server.Data.FullyQualifiedDomainName};" +
    "Database=myappdb;" +
    "Username=aad-admin@contoso.com;" +
    $"Password={token.Token};" +
    "SSL Mode=Require;";
```
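
Entra ID access tokens expire (typically after about an hour), so a long-lived application needs to refresh the password it hands to the driver. A hedged sketch using Npgsql's periodic password provider (available in Npgsql 7+; the method name and signature should be checked against your Npgsql version):

```csharp
// Sketch, not verbatim from this skill: refresh the Entra ID token on a
// schedule so pooled connections keep authenticating after expiry.
var credential = new DefaultAzureCredential();

var dataSourceBuilder = new NpgsqlDataSourceBuilder(
    $"Host={server.Data.FullyQualifiedDomainName};Database=myappdb;" +
    "Username=aad-admin@contoso.com;SSL Mode=Require;");

dataSourceBuilder.UsePeriodicPasswordProvider(
    async (_, ct) =>
    {
        var token = await credential.GetTokenAsync(
            new TokenRequestContext(
                new[] { "https://ossrdbms-aad.database.windows.net/.default" }), ct);
        return token.Token;
    },
    successRefreshInterval: TimeSpan.FromMinutes(45),  // refresh before expiry
    failureRefreshInterval: TimeSpan.FromSeconds(30)); // retry quickly on failure

await using var dataSource = dataSourceBuilder.Build();
await using var connection = await dataSource.OpenConnectionAsync();
```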

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.PostgreSql` | PostgreSQL management (this SDK) | `dotnet add package Azure.ResourceManager.PostgreSql` |
| `Azure.ResourceManager.MySql` | MySQL management | `dotnet add package Azure.ResourceManager.MySql` |
| `Npgsql` | PostgreSQL data access | `dotnet add package Npgsql` |
| `Npgsql.EntityFrameworkCore.PostgreSQL` | EF Core provider | `dotnet add package Npgsql.EntityFrameworkCore.PostgreSQL` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.PostgreSql |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.postgresql |
| Product Documentation | https://learn.microsoft.com/azure/postgresql/flexible-server/ |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/postgresql/Azure.ResourceManager.PostgreSql |

**File:** `skills/official/microsoft/dotnet/data/redis/SKILL.md` (new file, 356 lines)

---
name: azure-resource-manager-redis-dotnet
description: |
  Azure Resource Manager SDK for Redis in .NET. Use for MANAGEMENT PLANE operations: creating/managing Azure Cache for Redis instances, firewall rules, access keys, patch schedules, linked servers (geo-replication), and private endpoints via Azure Resource Manager. NOT for data plane operations (get/set keys, pub/sub) - use StackExchange.Redis for that. Triggers: "Redis cache", "create Redis", "manage Redis", "ARM Redis", "RedisResource", "provision Redis", "Azure Cache for Redis".
package: Azure.ResourceManager.Redis
---

# Azure.ResourceManager.Redis (.NET)

Management plane SDK for provisioning and managing Azure Cache for Redis resources via Azure Resource Manager.

> **⚠️ Management vs Data Plane**
> - **This SDK (Azure.ResourceManager.Redis)**: Create caches, configure firewall rules, manage access keys, set up geo-replication
> - **Data Plane SDK (StackExchange.Redis)**: Get/set keys, pub/sub, streams, Lua scripts

## Installation

```bash
dotnet add package Azure.ResourceManager.Redis
dotnet add package Azure.Identity
```

**Current Version**: 1.5.1 (Stable)
**API Version**: 2024-11-01
**Target Frameworks**: .NET 8.0, .NET Standard 2.0

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Redis;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── RedisResource
            ├── RedisFirewallRuleResource
            ├── RedisPatchScheduleResource
            ├── RedisLinkedServerWithPropertyResource
            ├── RedisPrivateEndpointConnectionResource
            └── RedisCacheAccessPolicyResource
```

## Core Workflows

### 1. Create Redis Cache

```csharp
using Azure.ResourceManager.Redis;
using Azure.ResourceManager.Redis.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define cache configuration
var cacheData = new RedisCreateOrUpdateContent(
    location: AzureLocation.EastUS,
    sku: new RedisSku(RedisSkuName.Standard, RedisSkuFamily.BasicOrStandard, 1))
{
    EnableNonSslPort = false,
    MinimumTlsVersion = RedisTlsVersion.Tls1_2,
    RedisConfiguration = new RedisCommonConfiguration
    {
        MaxMemoryPolicy = "volatile-lru"
    },
    Tags =
    {
        ["environment"] = "production"
    }
};

// Create cache (long-running operation)
var cacheCollection = resourceGroup.Value.GetAllRedis();
var operation = await cacheCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-redis-cache",
    cacheData);

RedisResource cache = operation.Value;
Console.WriteLine($"Cache created: {cache.Data.HostName}");
```

### 2. Get Redis Cache

```csharp
// Get existing cache
var cache = await resourceGroup.Value
    .GetRedisAsync("my-redis-cache");

Console.WriteLine($"Host: {cache.Value.Data.HostName}");
Console.WriteLine($"Port: {cache.Value.Data.Port}");
Console.WriteLine($"SSL Port: {cache.Value.Data.SslPort}");
Console.WriteLine($"Provisioning State: {cache.Value.Data.ProvisioningState}");
```

### 3. Update Redis Cache

```csharp
var patchData = new RedisPatch
{
    Sku = new RedisSku(RedisSkuName.Standard, RedisSkuFamily.BasicOrStandard, 2),
    RedisConfiguration = new RedisCommonConfiguration
    {
        MaxMemoryPolicy = "allkeys-lru"
    }
};

var updateOperation = await cache.Value.UpdateAsync(
    WaitUntil.Completed,
    patchData);
```

### 4. Delete Redis Cache

```csharp
await cache.Value.DeleteAsync(WaitUntil.Completed);
```

### 5. Get Access Keys

```csharp
var keys = await cache.Value.GetKeysAsync();
Console.WriteLine($"Primary Key: {keys.Value.PrimaryKey}");
Console.WriteLine($"Secondary Key: {keys.Value.SecondaryKey}");
```

### 6. Regenerate Access Keys

```csharp
var regenerateContent = new RedisRegenerateKeyContent(RedisRegenerateKeyType.Primary);
var newKeys = await cache.Value.RegenerateKeyAsync(regenerateContent);
Console.WriteLine($"New Primary Key: {newKeys.Value.PrimaryKey}");
```

### 7. Manage Firewall Rules

```csharp
// Create firewall rule
var firewallData = new RedisFirewallRuleData(
    startIP: System.Net.IPAddress.Parse("10.0.0.1"),
    endIP: System.Net.IPAddress.Parse("10.0.0.255"));

var firewallCollection = cache.Value.GetRedisFirewallRules();
var firewallOperation = await firewallCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "allow-internal-network",
    firewallData);

// List all firewall rules
await foreach (var rule in firewallCollection.GetAllAsync())
{
    Console.WriteLine($"Rule: {rule.Data.Name} ({rule.Data.StartIP} - {rule.Data.EndIP})");
}

// Delete firewall rule
var ruleToDelete = await firewallCollection.GetAsync("allow-internal-network");
await ruleToDelete.Value.DeleteAsync(WaitUntil.Completed);
```

### 8. Configure Patch Schedule (Premium SKU)

```csharp
// Patch schedules require Premium SKU
var scheduleData = new RedisPatchScheduleData(
    new[]
    {
        new RedisPatchScheduleSetting(RedisDayOfWeek.Saturday, 2) // 2 AM Saturday
        {
            MaintenanceWindow = TimeSpan.FromHours(5)
        },
        new RedisPatchScheduleSetting(RedisDayOfWeek.Sunday, 2) // 2 AM Sunday
        {
            MaintenanceWindow = TimeSpan.FromHours(5)
        }
    });

var scheduleCollection = cache.Value.GetRedisPatchSchedules();
await scheduleCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    RedisPatchScheduleDefaultName.Default,
    scheduleData);
```

### 9. Import/Export Data (Premium SKU)

```csharp
// Import data from blob storage
var importContent = new ImportRdbContent(
    files: new[] { "https://mystorageaccount.blob.core.windows.net/container/dump.rdb" },
    format: "RDB");

await cache.Value.ImportDataAsync(WaitUntil.Completed, importContent);

// Export data to blob storage
var exportContent = new ExportRdbContent(
    prefix: "backup",
    container: "https://mystorageaccount.blob.core.windows.net/container?sastoken",
    format: "RDB");

await cache.Value.ExportDataAsync(WaitUntil.Completed, exportContent);
```

### 10. Force Reboot

```csharp
var rebootContent = new RedisRebootContent
{
    RebootType = RedisRebootType.AllNodes,
    ShardId = 0 // For clustered caches
};

await cache.Value.ForceRebootAsync(rebootContent);
```

## SKU Reference

| SKU | Family | Capacity | Features |
|-----|--------|----------|----------|
| Basic | C | 0-6 | Single node, no SLA, dev/test only |
| Standard | C | 0-6 | Two nodes (primary/replica), SLA |
| Premium | P | 1-5 | Clustering, geo-replication, VNet, persistence |

**Capacity Sizes (Family C - Basic/Standard)**:
- C0: 250 MB
- C1: 1 GB
- C2: 2.5 GB
- C3: 6 GB
- C4: 13 GB
- C5: 26 GB
- C6: 53 GB

**Capacity Sizes (Family P - Premium)**:
- P1: 6 GB per shard
- P2: 13 GB per shard
- P3: 26 GB per shard
- P4: 53 GB per shard
- P5: 120 GB per shard
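
These SKU families map directly onto `RedisSku`. A hedged sketch of a clustered Premium cache (member names follow `Azure.ResourceManager.Redis` conventions; `ShardCount` is the clustering knob and, per the table above, Premium-only — verify against your package version):

```csharp
// Sketch: a 2-shard Premium P1 cache (~6 GB per shard).
var premiumCacheData = new RedisCreateOrUpdateContent(
    location: AzureLocation.EastUS,
    sku: new RedisSku(RedisSkuName.Premium, RedisSkuFamily.Premium, 1)) // P1
{
    ShardCount = 2,              // clustering: Premium SKU only
    EnableNonSslPort = false,
    MinimumTlsVersion = RedisTlsVersion.Tls1_2
};
```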

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `RedisResource` | Represents a Redis cache instance |
| `RedisCollection` | Collection for cache CRUD operations |
| `RedisFirewallRuleResource` | Firewall rule for IP filtering |
| `RedisPatchScheduleResource` | Maintenance window configuration |
| `RedisLinkedServerWithPropertyResource` | Geo-replication linked server |
| `RedisPrivateEndpointConnectionResource` | Private endpoint connection |
| `RedisCacheAccessPolicyResource` | RBAC access policy |
| `RedisCreateOrUpdateContent` | Cache creation payload |
| `RedisPatch` | Cache update payload |
| `RedisSku` | SKU configuration (name, family, capacity) |
| `RedisAccessKeys` | Primary and secondary access keys |
| `RedisRegenerateKeyContent` | Key regeneration request |

## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** when you want to poll manually or run operations in parallel
3. **Always use `DefaultAzureCredential`** — never hardcode keys
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Navigate hierarchy** via `Get*` methods (e.g., `cache.GetRedisFirewallRules()`)
7. **Use Premium SKU** for production workloads requiring geo-replication, clustering, or persistence
8. **Enable TLS 1.2 minimum** — set `MinimumTlsVersion = RedisTlsVersion.Tls1_2`
9. **Disable non-SSL port** — set `EnableNonSslPort = false` for security
10. **Rotate keys regularly** — use `RegenerateKeyAsync` and update connection strings

## Error Handling

```csharp
using Azure;

try
{
    var operation = await cacheCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, cacheName, cacheData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Cache already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Common Pitfalls

1. **SKU downgrades not allowed** — You cannot downgrade from Premium to Standard/Basic
2. **Clustering requires Premium** — Shard configuration only available on Premium SKU
3. **Geo-replication requires Premium** — Linked servers only work with Premium caches
4. **VNet injection requires Premium** — Virtual network support is Premium-only
5. **Patch schedules require Premium** — Maintenance windows only configurable on Premium
6. **Cache name globally unique** — Redis cache names must be unique across all Azure subscriptions
7. **Long provisioning times** — Cache creation can take 15-20 minutes; use `WaitUntil.Started` for async patterns
## Connecting with StackExchange.Redis (Data Plane)
|
||||
|
||||
After creating the cache with this management SDK, use StackExchange.Redis for data operations:
|
||||
|
||||
```csharp
|
||||
using StackExchange.Redis;
|
||||
|
||||
// Get connection info from management SDK
|
||||
var cache = await resourceGroup.Value.GetRedisAsync("my-redis-cache");
|
||||
var keys = await cache.Value.GetKeysAsync();
|
||||
|
||||
// Connect with StackExchange.Redis
|
||||
var connectionString = $"{cache.Value.Data.HostName}:{cache.Value.Data.SslPort},password={keys.Value.PrimaryKey},ssl=True,abortConnect=False";
|
||||
var connection = ConnectionMultiplexer.Connect(connectionString);
|
||||
var db = connection.GetDatabase();
|
||||
|
||||
// Data operations
|
||||
await db.StringSetAsync("key", "value");
|
||||
var value = await db.StringGetAsync("key");
|
||||
```
|
||||
|
||||
## Related SDKs
|
||||
|
||||
| SDK | Purpose | Install |
|
||||
|-----|---------|---------|
|
||||
| `StackExchange.Redis` | Data plane (get/set, pub/sub, streams) | `dotnet add package StackExchange.Redis` |
|
||||
| `Azure.ResourceManager.Redis` | Management plane (this SDK) | `dotnet add package Azure.ResourceManager.Redis` |
|
||||
| `Microsoft.Azure.StackExchangeRedis` | Azure-specific Redis extensions | `dotnet add package Microsoft.Azure.StackExchangeRedis` |
|
||||

skills/official/microsoft/dotnet/data/sql/SKILL.md (new file, 319 lines)

---
name: azure-resource-manager-sql-dotnet
description: |
  Azure Resource Manager SDK for Azure SQL in .NET. Use for MANAGEMENT PLANE operations: creating/managing SQL servers, databases, elastic pools, firewall rules, and failover groups via Azure Resource Manager. NOT for data plane operations (executing queries) - use Microsoft.Data.SqlClient for that. Triggers: "SQL server", "create SQL database", "manage SQL resources", "ARM SQL", "SqlServerResource", "provision Azure SQL", "elastic pool", "firewall rule".
package: Azure.ResourceManager.Sql
---

# Azure.ResourceManager.Sql (.NET)

Management plane SDK for provisioning and managing Azure SQL resources via Azure Resource Manager.

> **⚠️ Management vs Data Plane**
> - **This SDK (Azure.ResourceManager.Sql)**: Create servers, databases, elastic pools, configure firewall rules, manage failover groups
> - **Data Plane SDK (Microsoft.Data.SqlClient)**: Execute queries, stored procedures, manage connections

## Installation

```bash
dotnet add package Azure.ResourceManager.Sql
dotnet add package Azure.Identity
```

**Current Versions**: Stable v1.3.0, Preview v1.4.0-beta.3

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Core;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Sql;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── SqlServerResource
            ├── SqlDatabaseResource
            ├── ElasticPoolResource
            │   └── ElasticPoolDatabaseResource
            ├── SqlFirewallRuleResource
            ├── FailoverGroupResource
            ├── ServerBlobAuditingPolicyResource
            ├── EncryptionProtectorResource
            └── VirtualNetworkRuleResource
```

## Core Workflow

### 1. Create SQL Server

```csharp
using Azure;
using Azure.Core;
using Azure.ResourceManager.Sql;
using Azure.ResourceManager.Sql.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define server
var serverData = new SqlServerData(AzureLocation.EastUS)
{
    AdministratorLogin = "sqladmin",
    AdministratorLoginPassword = "YourSecurePassword123!",
    Version = "12.0",
    MinimalTlsVersion = SqlMinimalTlsVersion.Tls1_2,
    PublicNetworkAccess = ServerNetworkAccessFlag.Enabled
};

// Create server (long-running operation)
var serverCollection = resourceGroup.Value.GetSqlServers();
var operation = await serverCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-sql-server",
    serverData);

SqlServerResource server = operation.Value;
```

### 2. Create SQL Database

```csharp
var databaseData = new SqlDatabaseData(AzureLocation.EastUS)
{
    Sku = new SqlSku("S0") { Tier = "Standard" },
    MaxSizeBytes = 2L * 1024 * 1024 * 1024, // 2 GB
    Collation = "SQL_Latin1_General_CP1_CI_AS",
    RequestedBackupStorageRedundancy = SqlBackupStorageRedundancy.Local
};

var databaseCollection = server.GetSqlDatabases();
var dbOperation = await databaseCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-database",
    databaseData);

SqlDatabaseResource database = dbOperation.Value;
```

### 3. Create Elastic Pool

```csharp
var poolData = new ElasticPoolData(AzureLocation.EastUS)
{
    Sku = new SqlSku("StandardPool")
    {
        Tier = "Standard",
        Capacity = 100 // 100 eDTUs
    },
    PerDatabaseSettings = new ElasticPoolPerDatabaseSettings
    {
        MinCapacity = 0,
        MaxCapacity = 100
    }
};

var poolCollection = server.GetElasticPools();
var poolOperation = await poolCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-elastic-pool",
    poolData);

ElasticPoolResource pool = poolOperation.Value;
```

### 4. Add Database to Elastic Pool

```csharp
// Separate variable name so this can follow the snippets above directly
var pooledDatabaseData = new SqlDatabaseData(AzureLocation.EastUS)
{
    ElasticPoolId = pool.Id
};

await databaseCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "pooled-database",
    pooledDatabaseData);
```

### 5. Configure Firewall Rules

```csharp
// Allow Azure services (the special 0.0.0.0-0.0.0.0 range)
var azureServicesRule = new SqlFirewallRuleData
{
    StartIPAddress = "0.0.0.0",
    EndIPAddress = "0.0.0.0"
};

var firewallCollection = server.GetSqlFirewallRules();
await firewallCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "AllowAzureServices",
    azureServicesRule);

// Allow a specific IP range
var clientRule = new SqlFirewallRuleData
{
    StartIPAddress = "203.0.113.0",
    EndIPAddress = "203.0.113.255"
};

await firewallCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "AllowClientIPs",
    clientRule);
```

### 6. List Resources

```csharp
// List all servers in the subscription
await foreach (var srv in subscription.GetSqlServersAsync())
{
    Console.WriteLine($"Server: {srv.Data.Name} in {srv.Data.Location}");
}

// List databases in a server
await foreach (var db in server.GetSqlDatabases())
{
    Console.WriteLine($"Database: {db.Data.Name}, SKU: {db.Data.Sku?.Name}");
}

// List elastic pools
await foreach (var ep in server.GetElasticPools())
{
    Console.WriteLine($"Pool: {ep.Data.Name}, DTU: {ep.Data.Sku?.Capacity}");
}
```

### 7. Get Connection String

```csharp
// Build the connection string (the server FQDN is predictable)
var serverFqdn = $"{server.Data.Name}.database.windows.net";
var connectionString = $"Server=tcp:{serverFqdn},1433;" +
    $"Initial Catalog={database.Data.Name};" +
    "Persist Security Info=False;" +
    $"User ID={server.Data.AdministratorLogin};" +
    "Password=<your-password>;" +
    "MultipleActiveResultSets=False;" +
    "Encrypt=True;" +
    "TrustServerCertificate=False;" +
    "Connection Timeout=30;";
```
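
As a sketch of the data-plane hand-off (using `Microsoft.Data.SqlClient`, which this SKILL intentionally does not cover), the string built above could be consumed like so, after substituting the real password:

```csharp
using Microsoft.Data.SqlClient;

// Open a data-plane connection with the string built above
await using var conn = new SqlConnection(connectionString);
await conn.OpenAsync();

await using var cmd = new SqlCommand("SELECT @@VERSION", conn);
var version = (string?)await cmd.ExecuteScalarAsync();
Console.WriteLine(version);
```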

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `SqlServerResource` | Represents an Azure SQL server |
| `SqlServerCollection` | Collection for server CRUD |
| `SqlDatabaseResource` | Represents a SQL database |
| `SqlDatabaseCollection` | Collection for database CRUD |
| `ElasticPoolResource` | Represents an elastic pool |
| `ElasticPoolCollection` | Collection for elastic pool CRUD |
| `SqlFirewallRuleResource` | Represents a firewall rule |
| `SqlFirewallRuleCollection` | Collection for firewall rule CRUD |
| `SqlServerData` | Server creation/update payload |
| `SqlDatabaseData` | Database creation/update payload |
| `ElasticPoolData` | Elastic pool creation/update payload |
| `SqlFirewallRuleData` | Firewall rule creation/update payload |
| `SqlSku` | SKU configuration (tier, capacity) |

## Common SKUs

### Database SKUs

| SKU Name | Tier | Description |
|----------|------|-------------|
| `Basic` | Basic | 5 DTUs, 2 GB max |
| `S0`-`S12` | Standard | 10-3000 DTUs |
| `P1`-`P15` | Premium | 125-4000 DTUs |
| `GP_Gen5_2` | GeneralPurpose | vCore-based, 2 vCores |
| `BC_Gen5_2` | BusinessCritical | vCore-based, 2 vCores |
| `HS_Gen5_2` | Hyperscale | vCore-based, 2 vCores |

### Elastic Pool SKUs

| SKU Name | Tier | Description |
|----------|------|-------------|
| `BasicPool` | Basic | 50-1600 eDTUs |
| `StandardPool` | Standard | 50-3000 eDTUs |
| `PremiumPool` | Premium | 125-4000 eDTUs |
| `GP_Gen5_2` | GeneralPurpose | vCore-based |
| `BC_Gen5_2` | BusinessCritical | vCore-based |

## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** when you want to poll manually or run operations in parallel
3. **Always use `DefaultAzureCredential`** — never hardcode passwords in production
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Navigate hierarchy** via `Get*` methods (e.g., `server.GetSqlDatabases()`)
7. **Use elastic pools** for cost optimization when managing multiple databases
8. **Configure firewall rules** before attempting connections
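
The manual-polling pattern from item 2 can be sketched as follows, reusing `serverCollection` and `serverData` from the workflow above (the 30-second interval is an arbitrary choice):

```csharp
using Azure;

// Start the long-running operation without blocking...
var operation = await serverCollection.CreateOrUpdateAsync(
    WaitUntil.Started, "my-sql-server", serverData);

// ...then poll its status on your own schedule.
while (!operation.HasCompleted)
{
    await Task.Delay(TimeSpan.FromSeconds(30));
    await operation.UpdateStatusAsync();
}

SqlServerResource server = operation.Value;
```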

## Error Handling

```csharp
using Azure;

try
{
    var operation = await serverCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, serverName, serverData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Server already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Reference Files

| File | When to Read |
|------|--------------|
| [references/server-management.md](references/server-management.md) | Server CRUD, admin credentials, Azure AD auth, networking |
| [references/database-operations.md](references/database-operations.md) | Database CRUD, scaling, backup, restore, copy |
| [references/elastic-pools.md](references/elastic-pools.md) | Pool management, adding/removing databases, scaling |

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Microsoft.Data.SqlClient` | Data plane (execute queries, stored procedures) | `dotnet add package Microsoft.Data.SqlClient` |
| `Azure.ResourceManager.Sql` | Management plane (this SDK) | `dotnet add package Azure.ResourceManager.Sql` |
| `Microsoft.EntityFrameworkCore.SqlServer` | ORM for SQL Server | `dotnet add package Microsoft.EntityFrameworkCore.SqlServer` |

(new file, 440 lines)

---
name: microsoft-azure-webjobs-extensions-authentication-events-dotnet
description: |
  Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions. Use for token enrichment, custom claims, attribute collection, and OTP customization in Entra ID. Triggers: "Authentication Events", "WebJobsAuthenticationEventsTrigger", "OnTokenIssuanceStart", "OnAttributeCollectionStart", "custom claims", "token enrichment", "Entra custom extension", "authentication extension".
---

# Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents (.NET)

Azure Functions extension for handling Microsoft Entra ID custom authentication events.

## Installation

```bash
dotnet add package Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents
```

**Current Version**: v1.1.0 (stable)

## Supported Events

| Event | Purpose |
|-------|---------|
| `OnTokenIssuanceStart` | Add custom claims to tokens during issuance |
| `OnAttributeCollectionStart` | Customize attribute collection UI before display |
| `OnAttributeCollectionSubmit` | Validate/modify attributes after user submission |
| `OnOtpSend` | Custom OTP delivery (SMS, email, etc.) |

## Core Workflows

### 1. Token Enrichment (Add Custom Claims)

Add custom claims to access or ID tokens during sign-in.

```csharp
using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents.TokenIssuanceStart;
using Microsoft.Extensions.Logging;

public static class TokenEnrichmentFunction
{
    [FunctionName("OnTokenIssuanceStart")]
    public static WebJobsAuthenticationEventResponse Run(
        [WebJobsAuthenticationEventsTrigger] WebJobsTokenIssuanceStartRequest request,
        ILogger log)
    {
        log.LogInformation("Token issuance event for user: {UserId}",
            request.Data?.AuthenticationContext?.User?.Id);

        // Create a response with custom claims
        var response = new WebJobsTokenIssuanceStartResponse();

        // Add claims to the token
        response.Actions.Add(new WebJobsProvideClaimsForToken
        {
            Claims = new Dictionary<string, string>
            {
                { "customClaim1", "customValue1" },
                { "department", "Engineering" },
                { "costCenter", "CC-12345" },
                { "apiVersion", "v2" }
            }
        });

        return response;
    }
}
```

### 2. Token Enrichment with External Data

Fetch claims from external systems (databases, APIs).

```csharp
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents.TokenIssuanceStart;
using Microsoft.Extensions.Logging;

public static class TokenEnrichmentWithExternalData
{
    private static readonly HttpClient _httpClient = new();

    [FunctionName("OnTokenIssuanceStartExternal")]
    public static async Task<WebJobsAuthenticationEventResponse> Run(
        [WebJobsAuthenticationEventsTrigger] WebJobsTokenIssuanceStartRequest request,
        ILogger log)
    {
        string? userId = request.Data?.AuthenticationContext?.User?.Id;

        if (string.IsNullOrEmpty(userId))
        {
            log.LogWarning("No user ID in request");
            return new WebJobsTokenIssuanceStartResponse();
        }

        // Fetch user data from an external API
        var userProfile = await GetUserProfileAsync(userId);

        var response = new WebJobsTokenIssuanceStartResponse();
        response.Actions.Add(new WebJobsProvideClaimsForToken
        {
            Claims = new Dictionary<string, string>
            {
                { "employeeId", userProfile.EmployeeId },
                { "department", userProfile.Department },
                { "roles", string.Join(",", userProfile.Roles) }
            }
        });

        return response;
    }

    private static async Task<UserProfile> GetUserProfileAsync(string userId)
    {
        var response = await _httpClient.GetAsync($"https://api.example.com/users/{userId}");
        response.EnsureSuccessStatusCode();
        var json = await response.Content.ReadAsStringAsync();
        return JsonSerializer.Deserialize<UserProfile>(json)!;
    }
}

public record UserProfile(string EmployeeId, string Department, string[] Roles);
```

### 3. Attribute Collection - Customize UI (Start Event)

Customize the attribute collection page before it is displayed.

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents.Framework;
using Microsoft.Extensions.Logging;

public static class AttributeCollectionStartFunction
{
    [FunctionName("OnAttributeCollectionStart")]
    public static WebJobsAuthenticationEventResponse Run(
        [WebJobsAuthenticationEventsTrigger] WebJobsAttributeCollectionStartRequest request,
        ILogger log)
    {
        log.LogInformation("Attribute collection start for correlation: {CorrelationId}",
            request.Data?.AuthenticationContext?.CorrelationId);

        var response = new WebJobsAttributeCollectionStartResponse();

        // Option 1: Continue with the default behavior
        response.Actions.Add(new WebJobsContinueWithDefaultBehavior());

        // Option 2: Prefill attributes
        // response.Actions.Add(new WebJobsSetPrefillValues
        // {
        //     Attributes = new Dictionary<string, string>
        //     {
        //         { "city", "Seattle" },
        //         { "country", "USA" }
        //     }
        // });

        // Option 3: Show a blocking page (prevent sign-up)
        // response.Actions.Add(new WebJobsShowBlockPage
        // {
        //     Message = "Sign-up is currently disabled."
        // });

        return response;
    }
}
```

### 4. Attribute Collection - Validate Submission (Submit Event)

Validate and modify attributes after user submission.

```csharp
using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents.Framework;
using Microsoft.Extensions.Logging;

public static class AttributeCollectionSubmitFunction
{
    [FunctionName("OnAttributeCollectionSubmit")]
    public static WebJobsAuthenticationEventResponse Run(
        [WebJobsAuthenticationEventsTrigger] WebJobsAttributeCollectionSubmitRequest request,
        ILogger log)
    {
        var response = new WebJobsAttributeCollectionSubmitResponse();

        // Access the submitted attributes
        var attributes = request.Data?.UserSignUpInfo?.Attributes;

        string? email = attributes?["email"]?.ToString();
        string? displayName = attributes?["displayName"]?.ToString();

        // Validation example: block certain email domains
        if (email?.EndsWith("@blocked.com") == true)
        {
            response.Actions.Add(new WebJobsShowBlockPage
            {
                Message = "Sign-up from this email domain is not allowed."
            });
            return response;
        }

        // Validation example: show a validation error
        if (string.IsNullOrEmpty(displayName) || displayName.Length < 3)
        {
            response.Actions.Add(new WebJobsShowValidationError
            {
                Message = "Display name must be at least 3 characters.",
                AttributeErrors = new Dictionary<string, string>
                {
                    { "displayName", "Name is too short" }
                }
            });
            return response;
        }

        // Modify attributes before saving
        response.Actions.Add(new WebJobsModifyAttributeValues
        {
            Attributes = new Dictionary<string, string>
            {
                { "displayName", displayName.Trim() },
                { "city", attributes?["city"]?.ToString()?.ToUpperInvariant() ?? "" }
            }
        });

        return response;
    }
}
```

### 5. Custom OTP Delivery

Send one-time passwords via custom channels (SMS, email, push notification).

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents;
using Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents.Framework;
using Microsoft.Extensions.Logging;

public static class CustomOtpFunction
{
    [FunctionName("OnOtpSend")]
    public static async Task<WebJobsAuthenticationEventResponse> Run(
        [WebJobsAuthenticationEventsTrigger] WebJobsOnOtpSendRequest request,
        ILogger log)
    {
        var response = new WebJobsOnOtpSendResponse();

        string? phoneNumber = request.Data?.OtpContext?.Identifier;
        string? otp = request.Data?.OtpContext?.OneTimeCode;

        if (string.IsNullOrEmpty(phoneNumber) || string.IsNullOrEmpty(otp))
        {
            log.LogError("Missing phone number or OTP");
            response.Actions.Add(new WebJobsOnOtpSendFailed
            {
                Error = "Missing required data"
            });
            return response;
        }

        try
        {
            // Send the OTP via your SMS provider
            await SendSmsAsync(phoneNumber, $"Your verification code is: {otp}");

            response.Actions.Add(new WebJobsOnOtpSendSuccess());
            log.LogInformation("OTP sent successfully to {PhoneNumber}", phoneNumber);
        }
        catch (Exception ex)
        {
            log.LogError(ex, "Failed to send OTP");
            response.Actions.Add(new WebJobsOnOtpSendFailed
            {
                Error = "Failed to send verification code"
            });
        }

        return response;
    }

    private static async Task SendSmsAsync(string phoneNumber, string message)
    {
        // Implement your SMS provider integration (Twilio, Azure Communication Services, etc.)
        await Task.CompletedTask;
    }
}
```

### 6. Function App Configuration

Configure the Function App for authentication events.

```csharp
// Program.cs (Isolated worker model)
using Microsoft.Extensions.Hosting;

var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();
```

```json
// host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true
      }
    }
  },
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  }
}
```

```json
// local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `WebJobsAuthenticationEventsTriggerAttribute` | Function trigger attribute |
| `WebJobsTokenIssuanceStartRequest` | Token issuance event request |
| `WebJobsTokenIssuanceStartResponse` | Token issuance event response |
| `WebJobsProvideClaimsForToken` | Action to add claims |
| `WebJobsAttributeCollectionStartRequest` | Attribute collection start request |
| `WebJobsAttributeCollectionStartResponse` | Attribute collection start response |
| `WebJobsAttributeCollectionSubmitRequest` | Attribute submission request |
| `WebJobsAttributeCollectionSubmitResponse` | Attribute submission response |
| `WebJobsSetPrefillValues` | Prefill form values |
| `WebJobsShowBlockPage` | Block user with message |
| `WebJobsShowValidationError` | Show validation errors |
| `WebJobsModifyAttributeValues` | Modify submitted values |
| `WebJobsOnOtpSendRequest` | OTP send event request |
| `WebJobsOnOtpSendResponse` | OTP send event response |
| `WebJobsOnOtpSendSuccess` | OTP sent successfully |
| `WebJobsOnOtpSendFailed` | OTP send failed |
| `WebJobsContinueWithDefaultBehavior` | Continue with default flow |

## Entra ID Configuration

After deploying your Function App, configure the custom extension in Entra ID:

1. **Register the API** in Entra ID → App registrations
2. **Create Custom Authentication Extension** in Entra ID → External Identities → Custom authentication extensions
3. **Link to User Flow** in Entra ID → External Identities → User flows

### Required App Registration Settings

```
Expose an API:
  - Application ID URI: api://<your-function-app-name>.azurewebsites.net
  - Scope: CustomAuthenticationExtension.Receive.Payload

API Permissions:
  - Microsoft Graph: User.Read (delegated)
```
## Best Practices

1. **Validate all inputs** — Never trust request data; validate before processing
2. **Handle errors gracefully** — Return appropriate error responses
3. **Log correlation IDs** — Use `CorrelationId` for troubleshooting
4. **Keep functions fast** — Authentication events have timeout limits
5. **Use managed identity** — Access Azure resources securely
6. **Cache external data** — Avoid slow lookups on every request
7. **Test locally** — Use Azure Functions Core Tools with sample payloads
8. **Monitor with App Insights** — Track function execution and errors
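
Item 6 can be sketched with `Microsoft.Extensions.Caching.Memory` (an assumed dependency; `UserProfile`, the helper name, and the five-minute TTL are illustrative):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

// Cache external profile lookups so every sign-in does not hit the backing API.
public static class ProfileCache
{
    private static readonly MemoryCache _cache = new(new MemoryCacheOptions());

    public static async Task<UserProfile> GetAsync(
        string userId, Func<string, Task<UserProfile>> fetch)
    {
        return (await _cache.GetOrCreateAsync(userId, async entry =>
        {
            // Expire entries so claim data cannot go stale indefinitely
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return await fetch(userId);
        }))!;
    }
}
```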

## Error Handling

```csharp
[FunctionName("OnTokenIssuanceStart")]
public static WebJobsAuthenticationEventResponse Run(
    [WebJobsAuthenticationEventsTrigger] WebJobsTokenIssuanceStartRequest request,
    ILogger log)
{
    try
    {
        // Your logic here
        var response = new WebJobsTokenIssuanceStartResponse();
        response.Actions.Add(new WebJobsProvideClaimsForToken
        {
            Claims = new Dictionary<string, string> { { "claim", "value" } }
        });
        return response;
    }
    catch (Exception ex)
    {
        log.LogError(ex, "Error processing token issuance event");

        // Return an empty response - authentication continues without custom claims
        // Do NOT throw - that would fail the authentication
        return new WebJobsTokenIssuanceStartResponse();
    }
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents` | Auth events (this SDK) | `dotnet add package Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents` |
| `Microsoft.Identity.Web` | Web app authentication | `dotnet add package Microsoft.Identity.Web` |
| `Azure.Identity` | Azure authentication | `dotnet add package Azure.Identity` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents |
| Custom Extensions Overview | https://learn.microsoft.com/entra/identity-platform/custom-extension-overview |
| Token Issuance Events | https://learn.microsoft.com/entra/identity-platform/custom-extension-tokenissuancestart-setup |
| Attribute Collection Events | https://learn.microsoft.com/entra/identity-platform/custom-extension-attribute-collection |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/entra/Microsoft.Azure.WebJobs.Extensions.AuthenticationEvents |

skills/official/microsoft/dotnet/entra/azure-identity/SKILL.md (new file, 339 lines)

---
name: azure-identity-dotnet
description: |
  Azure Identity SDK for .NET. Authentication library for Azure SDK clients using Microsoft Entra ID. Use for DefaultAzureCredential, managed identity, service principals, and developer credentials. Triggers: "Azure Identity", "DefaultAzureCredential", "ManagedIdentityCredential", "ClientSecretCredential", "authentication .NET", "Azure auth", "credential chain".
package: Azure.Identity
---

# Azure.Identity (.NET)

Authentication library for Azure SDK clients using Microsoft Entra ID (formerly Azure AD).

## Installation

```bash
dotnet add package Azure.Identity

# For ASP.NET Core
dotnet add package Microsoft.Extensions.Azure

# For brokered authentication (Windows)
dotnet add package Azure.Identity.Broker
```

**Current Versions**: Stable v1.17.1, Preview v1.18.0-beta.2

## Environment Variables

### Service Principal with Secret

```bash
AZURE_CLIENT_ID=<application-client-id>
AZURE_TENANT_ID=<directory-tenant-id>
AZURE_CLIENT_SECRET=<client-secret-value>
```

### Service Principal with Certificate

```bash
AZURE_CLIENT_ID=<application-client-id>
AZURE_TENANT_ID=<directory-tenant-id>
AZURE_CLIENT_CERTIFICATE_PATH=<path-to-pfx-or-pem>
AZURE_CLIENT_CERTIFICATE_PASSWORD=<certificate-password>  # Optional
```

### Managed Identity

```bash
AZURE_CLIENT_ID=<user-assigned-managed-identity-client-id>  # Only for user-assigned
```

## DefaultAzureCredential

The recommended credential for most scenarios. It tries multiple authentication methods in order:

| Order | Credential | Enabled by Default |
|-------|------------|--------------------|
| 1 | EnvironmentCredential | Yes |
| 2 | WorkloadIdentityCredential | Yes |
| 3 | ManagedIdentityCredential | Yes |
| 4 | VisualStudioCredential | Yes |
| 5 | VisualStudioCodeCredential | Yes |
| 6 | AzureCliCredential | Yes |
| 7 | AzurePowerShellCredential | Yes |
| 8 | AzureDeveloperCliCredential | Yes |
| 9 | InteractiveBrowserCredential | **No** |

### Basic Usage

```csharp
using Azure.Identity;
using Azure.Storage.Blobs;

var credential = new DefaultAzureCredential();
var blobClient = new BlobServiceClient(
    new Uri("https://myaccount.blob.core.windows.net"),
    credential);
```

### ASP.NET Core with Dependency Injection

```csharp
using Azure.Identity;
using Microsoft.Extensions.Azure;

builder.Services.AddAzureClients(clientBuilder =>
{
    clientBuilder.AddBlobServiceClient(
        new Uri("https://myaccount.blob.core.windows.net"));
    clientBuilder.AddSecretClient(
        new Uri("https://myvault.vault.azure.net"));

    // Uses DefaultAzureCredential by default
    clientBuilder.UseCredential(new DefaultAzureCredential());
});
```

### Customizing DefaultAzureCredential

```csharp
var credential = new DefaultAzureCredential(
    new DefaultAzureCredentialOptions
    {
        ExcludeEnvironmentCredential = true,
        ExcludeManagedIdentityCredential = false,
        ExcludeVisualStudioCredential = false,
        ExcludeAzureCliCredential = false,
        ExcludeInteractiveBrowserCredential = false, // Enable interactive
        TenantId = "<tenant-id>",
        ManagedIdentityClientId = "<user-assigned-mi-client-id>"
    });
```

## Credential Types

### ManagedIdentityCredential (Production)

```csharp
// System-assigned managed identity
var credential = new ManagedIdentityCredential(ManagedIdentityId.SystemAssigned);

// User-assigned by client ID
var credential = new ManagedIdentityCredential(
    ManagedIdentityId.FromUserAssignedClientId("<client-id>"));

// User-assigned by resource ID
var credential = new ManagedIdentityCredential(
    ManagedIdentityId.FromUserAssignedResourceId("<resource-id>"));
```

### ClientSecretCredential

```csharp
var credential = new ClientSecretCredential(
    tenantId: "<tenant-id>",
    clientId: "<client-id>",
    clientSecret: "<client-secret>");

var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net"),
    credential);
```

### ClientCertificateCredential

```csharp
var certificate = X509CertificateLoader.LoadCertificateFromFile("MyCertificate.pfx");
var credential = new ClientCertificateCredential(
    tenantId: "<tenant-id>",
    clientId: "<client-id>",
    certificate);
```

### ChainedTokenCredential (Custom Chain)

```csharp
var credential = new ChainedTokenCredential(
    new ManagedIdentityCredential(),
    new AzureCliCredential());

var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net"),
    credential);
```

### Developer Credentials

```csharp
// Azure CLI
var credential = new AzureCliCredential();

// Azure PowerShell
var credential = new AzurePowerShellCredential();

// Azure Developer CLI (azd)
var credential = new AzureDeveloperCliCredential();

// Visual Studio
var credential = new VisualStudioCredential();

// Interactive Browser
var credential = new InteractiveBrowserCredential();
```

## Environment-Based Configuration

```csharp
// Production vs Development
TokenCredential credential = builder.Environment.IsProduction()
    ? new ManagedIdentityCredential("<client-id>")
    : new DefaultAzureCredential();
```

## Sovereign Clouds

```csharp
var credential = new DefaultAzureCredential(
    new DefaultAzureCredentialOptions
    {
        AuthorityHost = AzureAuthorityHosts.AzureGovernment
    });

// Available authority hosts:
// AzureAuthorityHosts.AzurePublicCloud (default)
// AzureAuthorityHosts.AzureGovernment
// AzureAuthorityHosts.AzureChina
// AzureAuthorityHosts.AzureGermany
```

## Credential Types Reference

| Category | Credential | Purpose |
|----------|------------|---------|
| **Chains** | `DefaultAzureCredential` | Preconfigured chain for dev-to-prod |
| | `ChainedTokenCredential` | Custom credential chain |
| **Azure-Hosted** | `ManagedIdentityCredential` | Azure managed identity |
| | `WorkloadIdentityCredential` | Kubernetes workload identity |
| | `EnvironmentCredential` | Environment variables |
| **Service Principal** | `ClientSecretCredential` | Client ID + secret |
| | `ClientCertificateCredential` | Client ID + certificate |
| | `ClientAssertionCredential` | Signed client assertion |
| **User** | `InteractiveBrowserCredential` | Browser-based auth |
| | `DeviceCodeCredential` | Device code flow |
| | `OnBehalfOfCredential` | Delegated identity |
| **Developer** | `AzureCliCredential` | Azure CLI |
| | `AzurePowerShellCredential` | Azure PowerShell |
| | `AzureDeveloperCliCredential` | Azure Developer CLI |
| | `VisualStudioCredential` | Visual Studio |

## Best Practices

### 1. Use Deterministic Credentials in Production

```csharp
// Development
var devCredential = new DefaultAzureCredential();

// Production - use a specific credential
var prodCredential = new ManagedIdentityCredential("<client-id>");
```

### 2. Reuse Credential Instances

```csharp
// Good: Single credential instance shared across clients
var credential = new DefaultAzureCredential();
var blobClient = new BlobServiceClient(blobUri, credential);
var secretClient = new SecretClient(vaultUri, credential);
```

### 3. Configure Retry Policies

```csharp
var options = new ManagedIdentityCredentialOptions(
    ManagedIdentityId.FromUserAssignedClientId(clientId))
{
    Retry =
    {
        MaxRetries = 3,
        Delay = TimeSpan.FromSeconds(0.5),
    }
};
var credential = new ManagedIdentityCredential(options);
```

### 4. Enable Logging for Debugging

```csharp
using System.Diagnostics.Tracing; // EventLevel
using Azure.Core.Diagnostics;

using AzureEventSourceListener listener = new((args, message) =>
{
    if (args is { EventSource.Name: "Azure-Identity" })
    {
        Console.WriteLine(message);
    }
}, EventLevel.LogAlways);
```

## Error Handling

```csharp
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net"),
    new DefaultAzureCredential());

try
{
    KeyVaultSecret secret = await client.GetSecretAsync("secret1");
}
catch (CredentialUnavailableException e)
{
    // Catch first: CredentialUnavailableException derives from AuthenticationFailedException
    Console.WriteLine($"Credential Unavailable: {e.Message}");
}
catch (AuthenticationFailedException e)
{
    Console.WriteLine($"Authentication Failed: {e.Message}");
}
```

## Key Exceptions

| Exception | Description |
|-----------|-------------|
| `AuthenticationFailedException` | Base exception for authentication errors |
| `CredentialUnavailableException` | Credential cannot authenticate in the current environment |
| `AuthenticationRequiredException` | Interactive authentication is required |

## Managed Identity Support

Supported Azure services:
- Azure App Service and Azure Functions
- Azure Arc
- Azure Cloud Shell
- Azure Kubernetes Service (AKS)
- Azure Service Fabric
- Azure Virtual Machines
- Azure Virtual Machine Scale Sets

## Thread Safety

All credential implementations are thread-safe. A single credential instance can be safely shared across multiple clients and threads.
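
As a minimal sketch of what this allows (the vault URI and secret names are placeholders), concurrent operations may share one credential and one client:

```csharp
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

var credential = new DefaultAzureCredential();
var client = new SecretClient(new Uri("https://myvault.vault.azure.net"), credential);

// Safe: both requests reuse the same credential instance (and its token cache).
await Task.WhenAll(
    client.GetSecretAsync("secret1"),
    client.GetSecretAsync("secret2"));
```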

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Identity` | Authentication (this SDK) | `dotnet add package Azure.Identity` |
| `Microsoft.Extensions.Azure` | DI integration | `dotnet add package Microsoft.Extensions.Azure` |
| `Azure.Identity.Broker` | Brokered auth (Windows) | `dotnet add package Azure.Identity.Broker` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.Identity |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.identity |
| Credential Chains | https://learn.microsoft.com/dotnet/azure/sdk/authentication/credential-chains |
| Best Practices | https://learn.microsoft.com/dotnet/azure/sdk/authentication/best-practices |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/identity/Azure.Identity |

skills/official/microsoft/dotnet/entra/keyvault/SKILL.md (new file, 406 lines)

---
name: azure-security-keyvault-keys-dotnet
description: |
  Azure Key Vault Keys SDK for .NET. Client library for managing cryptographic keys in Azure Key Vault and Managed HSM. Use for key creation, rotation, encryption, decryption, signing, and verification. Triggers: "Key Vault keys", "KeyClient", "CryptographyClient", "RSA key", "EC key", "encrypt decrypt .NET", "key rotation", "HSM".
package: Azure.Security.KeyVault.Keys
---

# Azure.Security.KeyVault.Keys (.NET)

Client library for managing cryptographic keys in Azure Key Vault and Managed HSM.

## Installation

```bash
dotnet add package Azure.Security.KeyVault.Keys
dotnet add package Azure.Identity
```

**Current Version**: 4.7.0 (stable)

## Environment Variables

```bash
KEY_VAULT_NAME=<your-key-vault-name>
# Or full URI
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net
```

## Client Hierarchy

```
KeyClient (key management)
├── CreateKey / CreateRsaKey / CreateEcKey
├── GetKey / GetKeys
├── UpdateKeyProperties
├── DeleteKey / PurgeDeletedKey
├── BackupKey / RestoreKey
└── GetCryptographyClient() → CryptographyClient

CryptographyClient (cryptographic operations)
├── Encrypt / Decrypt
├── WrapKey / UnwrapKey
├── Sign / Verify
└── SignData / VerifyData

KeyResolver (key resolution)
└── Resolve(keyId) → CryptographyClient
```

## Authentication

### DefaultAzureCredential (Recommended)

```csharp
using Azure.Identity;
using Azure.Security.KeyVault.Keys;

var keyVaultName = Environment.GetEnvironmentVariable("KEY_VAULT_NAME");
var kvUri = $"https://{keyVaultName}.vault.azure.net";

var client = new KeyClient(new Uri(kvUri), new DefaultAzureCredential());
```

### Service Principal

```csharp
var credential = new ClientSecretCredential(
    tenantId: "<tenant-id>",
    clientId: "<client-id>",
    clientSecret: "<client-secret>");

var client = new KeyClient(new Uri(kvUri), credential);
```

## Key Management

### Create Keys

```csharp
// Create RSA key
KeyVaultKey rsaKey = await client.CreateKeyAsync("my-rsa-key", KeyType.Rsa);
Console.WriteLine($"Created key: {rsaKey.Name}, Type: {rsaKey.KeyType}");

// Create RSA key with options
var rsaOptions = new CreateRsaKeyOptions("my-rsa-key-2048")
{
    KeySize = 2048,
    HardwareProtected = false, // true for HSM-backed
    ExpiresOn = DateTimeOffset.UtcNow.AddYears(1),
    NotBefore = DateTimeOffset.UtcNow,
    Enabled = true
};
rsaOptions.KeyOperations.Add(KeyOperation.Encrypt);
rsaOptions.KeyOperations.Add(KeyOperation.Decrypt);

KeyVaultKey rsaKey2 = await client.CreateRsaKeyAsync(rsaOptions);

// Create EC key
var ecOptions = new CreateEcKeyOptions("my-ec-key")
{
    CurveName = KeyCurveName.P256,
    HardwareProtected = true // HSM-backed
};
KeyVaultKey ecKey = await client.CreateEcKeyAsync(ecOptions);

// Create Oct (symmetric) key for wrap/unwrap
var octOptions = new CreateOctKeyOptions("my-oct-key")
{
    KeySize = 256,
    HardwareProtected = true
};
KeyVaultKey octKey = await client.CreateOctKeyAsync(octOptions);
```

### Retrieve Keys

```csharp
// Get specific key (latest version)
KeyVaultKey key = await client.GetKeyAsync("my-rsa-key");
Console.WriteLine($"Key ID: {key.Id}");
Console.WriteLine($"Key Type: {key.KeyType}");
Console.WriteLine($"Version: {key.Properties.Version}");

// Get specific version
KeyVaultKey keyVersion = await client.GetKeyAsync("my-rsa-key", "version-id");

// List all keys
await foreach (KeyProperties keyProps in client.GetPropertiesOfKeysAsync())
{
    Console.WriteLine($"Key: {keyProps.Name}, Enabled: {keyProps.Enabled}");
}

// List key versions
await foreach (KeyProperties version in client.GetPropertiesOfKeyVersionsAsync("my-rsa-key"))
{
    Console.WriteLine($"Version: {version.Version}, Created: {version.CreatedOn}");
}
```

### Update Key Properties

```csharp
KeyVaultKey key = await client.GetKeyAsync("my-rsa-key");

key.Properties.ExpiresOn = DateTimeOffset.UtcNow.AddYears(2);
key.Properties.Tags["environment"] = "production";

KeyVaultKey updatedKey = await client.UpdateKeyPropertiesAsync(key.Properties);
```

### Delete and Purge Keys

```csharp
// Start delete operation
DeleteKeyOperation operation = await client.StartDeleteKeyAsync("my-rsa-key");

// Wait for deletion to complete (required before purge)
await operation.WaitForCompletionAsync();
Console.WriteLine($"Deleted key scheduled purge date: {operation.Value.ScheduledPurgeDate}");

// Purge immediately (if soft-delete is enabled)
await client.PurgeDeletedKeyAsync("my-rsa-key");

// Or recover the deleted key (StartRecoverDeletedKeyAsync returns an operation, not the key)
RecoverDeletedKeyOperation recoverOperation = await client.StartRecoverDeletedKeyAsync("my-rsa-key");
KeyVaultKey recoveredKey = await recoverOperation.WaitForCompletionAsync();
```

### Backup and Restore

```csharp
// Backup key
byte[] backup = await client.BackupKeyAsync("my-rsa-key");
await File.WriteAllBytesAsync("key-backup.bin", backup);

// Restore key
byte[] backupData = await File.ReadAllBytesAsync("key-backup.bin");
KeyVaultKey restoredKey = await client.RestoreKeyBackupAsync(backupData);
```

## Cryptographic Operations

### Get CryptographyClient

```csharp
// From KeyClient
KeyVaultKey key = await client.GetKeyAsync("my-rsa-key");
CryptographyClient cryptoClient = client.GetCryptographyClient(
    key.Name,
    key.Properties.Version);

// Or create directly with a key ID
var cryptoClientFromId = new CryptographyClient(
    new Uri("https://myvault.vault.azure.net/keys/my-rsa-key/version"),
    new DefaultAzureCredential());
```

### Encrypt and Decrypt

```csharp
byte[] plaintext = Encoding.UTF8.GetBytes("Secret message to encrypt");

// Encrypt
EncryptResult encryptResult = await cryptoClient.EncryptAsync(
    EncryptionAlgorithm.RsaOaep256,
    plaintext);
Console.WriteLine($"Encrypted: {Convert.ToBase64String(encryptResult.Ciphertext)}");

// Decrypt
DecryptResult decryptResult = await cryptoClient.DecryptAsync(
    EncryptionAlgorithm.RsaOaep256,
    encryptResult.Ciphertext);
string decrypted = Encoding.UTF8.GetString(decryptResult.Plaintext);
Console.WriteLine($"Decrypted: {decrypted}");
```

### Wrap and Unwrap Keys

```csharp
// Key to wrap (e.g., AES key)
byte[] keyToWrap = new byte[32]; // 256-bit key
RandomNumberGenerator.Fill(keyToWrap);

// Wrap key
WrapResult wrapResult = await cryptoClient.WrapKeyAsync(
    KeyWrapAlgorithm.RsaOaep256,
    keyToWrap);

// Unwrap key
UnwrapResult unwrapResult = await cryptoClient.UnwrapKeyAsync(
    KeyWrapAlgorithm.RsaOaep256,
    wrapResult.EncryptedKey);
```

### Sign and Verify

```csharp
// Data to sign
byte[] data = Encoding.UTF8.GetBytes("Data to sign");

// Sign data (computes hash internally)
SignResult signResult = await cryptoClient.SignDataAsync(
    SignatureAlgorithm.RS256,
    data);

// Verify signature
VerifyResult verifyResult = await cryptoClient.VerifyDataAsync(
    SignatureAlgorithm.RS256,
    data,
    signResult.Signature);
Console.WriteLine($"Signature valid: {verifyResult.IsValid}");

// Or sign a pre-computed hash
using var sha256 = SHA256.Create();
byte[] hash = sha256.ComputeHash(data);

SignResult signHashResult = await cryptoClient.SignAsync(
    SignatureAlgorithm.RS256,
    hash);
```

## Key Resolver

```csharp
using Azure.Security.KeyVault.Keys.Cryptography;

var resolver = new KeyResolver(new DefaultAzureCredential());

// Resolve a key ID to get a CryptographyClient
CryptographyClient cryptoClient = await resolver.ResolveAsync(
    new Uri("https://myvault.vault.azure.net/keys/my-key/version"));

// Use for encryption
EncryptResult result = await cryptoClient.EncryptAsync(
    EncryptionAlgorithm.RsaOaep256,
    plaintext);
```

## Key Rotation

```csharp
// Rotate key (creates new version)
KeyVaultKey rotatedKey = await client.RotateKeyAsync("my-rsa-key");
Console.WriteLine($"New version: {rotatedKey.Properties.Version}");

// Get rotation policy
KeyRotationPolicy policy = await client.GetKeyRotationPolicyAsync("my-rsa-key");

// Update rotation policy
policy.ExpiresIn = "P90D"; // 90 days
policy.LifetimeActions.Add(new KeyRotationLifetimeAction(KeyRotationPolicyAction.Rotate)
{
    TimeBeforeExpiry = "P30D" // Rotate 30 days before expiry
});

await client.UpdateKeyRotationPolicyAsync("my-rsa-key", policy);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `KeyClient` | Key management operations |
| `CryptographyClient` | Cryptographic operations |
| `KeyResolver` | Resolve key ID to CryptographyClient |
| `KeyVaultKey` | Key with cryptographic material |
| `KeyProperties` | Key metadata (no crypto material) |
| `CreateRsaKeyOptions` | RSA key creation options |
| `CreateEcKeyOptions` | EC key creation options |
| `CreateOctKeyOptions` | Symmetric key options |
| `EncryptResult` | Encryption result |
| `DecryptResult` | Decryption result |
| `SignResult` | Signing result |
| `VerifyResult` | Verification result |
| `WrapResult` | Key wrap result |
| `UnwrapResult` | Key unwrap result |

## Algorithms Reference

### Encryption Algorithms
| Algorithm | Key Type | Description |
|-----------|----------|-------------|
| `RsaOaep` | RSA | RSA-OAEP |
| `RsaOaep256` | RSA | RSA-OAEP-256 |
| `Rsa15` | RSA | RSA 1.5 (legacy) |
| `A128Gcm` | Oct | AES-128-GCM |
| `A256Gcm` | Oct | AES-256-GCM |

### Signature Algorithms
| Algorithm | Key Type | Description |
|-----------|----------|-------------|
| `RS256` | RSA | RSASSA-PKCS1-v1_5 SHA-256 |
| `RS384` | RSA | RSASSA-PKCS1-v1_5 SHA-384 |
| `RS512` | RSA | RSASSA-PKCS1-v1_5 SHA-512 |
| `PS256` | RSA | RSASSA-PSS SHA-256 |
| `ES256` | EC | ECDSA P-256 SHA-256 |
| `ES384` | EC | ECDSA P-384 SHA-384 |
| `ES512` | EC | ECDSA P-521 SHA-512 |

### Key Wrap Algorithms
| Algorithm | Key Type | Description |
|-----------|----------|-------------|
| `RsaOaep` | RSA | RSA-OAEP |
| `RsaOaep256` | RSA | RSA-OAEP-256 |
| `A128KW` | Oct | AES-128 Key Wrap |
| `A256KW` | Oct | AES-256 Key Wrap |
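
As a sketch tying the tables above to code (assuming an EC P-256 key named `my-ec-key` already exists, as in Create Keys), the EC signature algorithms follow the same flow as RSA:

```csharp
using System.Text;
using Azure.Security.KeyVault.Keys.Cryptography;

// ES256 must match the key's curve (P-256 here).
CryptographyClient ecCrypto = client.GetCryptographyClient("my-ec-key");

byte[] payload = Encoding.UTF8.GetBytes("payload");
SignResult sig = await ecCrypto.SignDataAsync(SignatureAlgorithm.ES256, payload);
VerifyResult check = await ecCrypto.VerifyDataAsync(SignatureAlgorithm.ES256, payload, sig.Signature);
Console.WriteLine($"Valid: {check.IsValid}");
```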

## Best Practices

1. **Use Managed Identity** — Prefer `DefaultAzureCredential` over secrets
2. **Enable soft-delete** — Protect against accidental deletion
3. **Use HSM-backed keys** — Set `HardwareProtected = true` for sensitive keys
4. **Implement key rotation** — Use automatic rotation policies
5. **Limit key operations** — Only enable required `KeyOperations`
6. **Set expiration dates** — Always set `ExpiresOn` for keys
7. **Use specific versions** — Pin to versions in production
8. **Cache CryptographyClient** — Reuse for multiple operations

## Error Handling

```csharp
using Azure;

try
{
    KeyVaultKey key = await client.GetKeyAsync("my-key");
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    Console.WriteLine("Key not found");
}
catch (RequestFailedException ex) when (ex.Status == 403)
{
    Console.WriteLine("Access denied - check RBAC permissions");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Key Vault error: {ex.Status} - {ex.Message}");
}
```

## Required RBAC Roles

| Role | Permissions |
|------|-------------|
| Key Vault Crypto Officer | Full key management |
| Key Vault Crypto User | Use keys for crypto operations |
| Key Vault Reader | Read key metadata |
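
These roles can be granted with the Azure CLI; a minimal sketch, with placeholder principal and scope values:

```bash
# Grant an identity crypto-use access on a vault (all IDs are placeholders).
az role assignment create \
  --role "Key Vault Crypto User" \
  --assignee "<principal-object-id>" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>"
```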

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Security.KeyVault.Keys` | Keys (this SDK) | `dotnet add package Azure.Security.KeyVault.Keys` |
| `Azure.Security.KeyVault.Secrets` | Secrets | `dotnet add package Azure.Security.KeyVault.Secrets` |
| `Azure.Security.KeyVault.Certificates` | Certificates | `dotnet add package Azure.Security.KeyVault.Certificates` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.Security.KeyVault.Keys |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.security.keyvault.keys |
| Quickstart | https://learn.microsoft.com/azure/key-vault/keys/quick-create-net |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/keyvault/Azure.Security.KeyVault.Keys |

---
name: azure-ai-document-intelligence-dotnet
description: |
  Azure AI Document Intelligence SDK for .NET. Extract text, tables, and structured data from documents using prebuilt and custom models. Use for invoice processing, receipt extraction, ID document analysis, and custom document models. Triggers: "Document Intelligence", "DocumentIntelligenceClient", "form recognizer", "invoice extraction", "receipt OCR", "document analysis .NET".
package: Azure.AI.DocumentIntelligence
---

# Azure.AI.DocumentIntelligence (.NET)

Extract text, tables, and structured data from documents using prebuilt and custom models.

## Installation

```bash
dotnet add package Azure.AI.DocumentIntelligence
dotnet add package Azure.Identity
```

**Current Version**: v1.0.0 (GA)

## Environment Variables

```bash
DOCUMENT_INTELLIGENCE_ENDPOINT=https://<resource-name>.cognitiveservices.azure.com/
DOCUMENT_INTELLIGENCE_API_KEY=<your-api-key>
BLOB_CONTAINER_SAS_URL=https://<storage>.blob.core.windows.net/<container>?<sas-token>
```

## Authentication

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.AI.DocumentIntelligence;

string endpoint = Environment.GetEnvironmentVariable("DOCUMENT_INTELLIGENCE_ENDPOINT");
var credential = new DefaultAzureCredential();
var client = new DocumentIntelligenceClient(new Uri(endpoint), credential);
```

> **Note**: Entra ID requires a **custom subdomain** (e.g., `https://<resource-name>.cognitiveservices.azure.com/`), not a regional endpoint.

### API Key

```csharp
using Azure; // AzureKeyCredential

string endpoint = Environment.GetEnvironmentVariable("DOCUMENT_INTELLIGENCE_ENDPOINT");
string apiKey = Environment.GetEnvironmentVariable("DOCUMENT_INTELLIGENCE_API_KEY");
var client = new DocumentIntelligenceClient(new Uri(endpoint), new AzureKeyCredential(apiKey));
```

## Client Types

| Client | Purpose |
|--------|---------|
| `DocumentIntelligenceClient` | Analyze documents, classify documents |
| `DocumentIntelligenceAdministrationClient` | Build/manage custom models and classifiers |

## Prebuilt Models

| Model ID | Description |
|----------|-------------|
| `prebuilt-read` | Extract text, languages, handwriting |
| `prebuilt-layout` | Extract text, tables, selection marks, structure |
| `prebuilt-invoice` | Extract invoice fields (vendor, items, totals) |
| `prebuilt-receipt` | Extract receipt fields (merchant, items, total) |
| `prebuilt-idDocument` | Extract ID document fields (name, DOB, address) |
| `prebuilt-businessCard` | Extract business card fields |
| `prebuilt-tax.us.w2` | Extract W-2 tax form fields |
| `prebuilt-healthInsuranceCard.us` | Extract health insurance card fields |

## Core Workflows

### 1. Analyze Invoice

```csharp
using Azure.AI.DocumentIntelligence;

Uri invoiceUri = new Uri("https://example.com/invoice.pdf");

Operation<AnalyzeResult> operation = await client.AnalyzeDocumentAsync(
    WaitUntil.Completed,
    "prebuilt-invoice",
    invoiceUri);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    if (document.Fields.TryGetValue("VendorName", out DocumentField vendorNameField)
        && vendorNameField.FieldType == DocumentFieldType.String)
    {
        string vendorName = vendorNameField.ValueString;
        Console.WriteLine($"Vendor Name: '{vendorName}', confidence: {vendorNameField.Confidence}");
    }

    if (document.Fields.TryGetValue("InvoiceTotal", out DocumentField invoiceTotalField)
        && invoiceTotalField.FieldType == DocumentFieldType.Currency)
    {
        CurrencyValue invoiceTotal = invoiceTotalField.ValueCurrency;
        Console.WriteLine($"Invoice Total: '{invoiceTotal.CurrencySymbol}{invoiceTotal.Amount}'");
    }

    // Extract line items
    if (document.Fields.TryGetValue("Items", out DocumentField itemsField)
        && itemsField.FieldType == DocumentFieldType.List)
    {
        foreach (DocumentField item in itemsField.ValueList)
        {
            var itemFields = item.ValueDictionary;
            if (itemFields.TryGetValue("Description", out DocumentField descField))
                Console.WriteLine($"  Item: {descField.ValueString}");
        }
    }
}
```

### 2. Extract Layout (Text, Tables, Structure)

```csharp
Uri fileUri = new Uri("https://example.com/document.pdf");

Operation<AnalyzeResult> operation = await client.AnalyzeDocumentAsync(
    WaitUntil.Completed,
    "prebuilt-layout",
    fileUri);

AnalyzeResult result = operation.Value;

// Extract text by page
foreach (DocumentPage page in result.Pages)
{
    Console.WriteLine($"Page {page.PageNumber}: {page.Lines.Count} lines, {page.Words.Count} words");

    foreach (DocumentLine line in page.Lines)
    {
        Console.WriteLine($"  Line: '{line.Content}'");
    }
}

// Extract tables
foreach (DocumentTable table in result.Tables)
{
    Console.WriteLine($"Table: {table.RowCount} rows x {table.ColumnCount} columns");
    foreach (DocumentTableCell cell in table.Cells)
    {
        Console.WriteLine($"  Cell ({cell.RowIndex}, {cell.ColumnIndex}): {cell.Content}");
    }
}
```

### 3. Analyze Receipt

```csharp
Operation<AnalyzeResult> operation = await client.AnalyzeDocumentAsync(
    WaitUntil.Completed,
    "prebuilt-receipt",
    receiptUri);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    if (document.Fields.TryGetValue("MerchantName", out DocumentField merchantField))
        Console.WriteLine($"Merchant: {merchantField.ValueString}");

    if (document.Fields.TryGetValue("Total", out DocumentField totalField))
        Console.WriteLine($"Total: {totalField.ValueCurrency.Amount}");

    if (document.Fields.TryGetValue("TransactionDate", out DocumentField dateField))
        Console.WriteLine($"Date: {dateField.ValueDate}");
}
```

### 4. Build Custom Model

```csharp
var adminClient = new DocumentIntelligenceAdministrationClient(
    new Uri(endpoint),
    new AzureKeyCredential(apiKey));

string modelId = "my-custom-model";
Uri blobContainerUri = new Uri("<blob-container-sas-url>");

var blobSource = new BlobContentSource(blobContainerUri);
var options = new BuildDocumentModelOptions(modelId, DocumentBuildMode.Template, blobSource);

Operation<DocumentModelDetails> operation = await adminClient.BuildDocumentModelAsync(
    WaitUntil.Completed,
    options);

DocumentModelDetails model = operation.Value;

Console.WriteLine($"Model ID: {model.ModelId}");
Console.WriteLine($"Created: {model.CreatedOn}");

foreach (var docType in model.DocumentTypes)
{
    Console.WriteLine($"Document type: {docType.Key}");
    foreach (var field in docType.Value.FieldSchema)
    {
        Console.WriteLine($"  Field: {field.Key}, Confidence: {docType.Value.FieldConfidence[field.Key]}");
    }
}
```
|
||||
|
||||
### 5. Build Document Classifier
|
||||
|
||||
```csharp
|
||||
string classifierId = "my-classifier";
|
||||
Uri blobContainerUri = new Uri("<blob-container-sas-url>");
|
||||
|
||||
var sourceA = new BlobContentSource(blobContainerUri) { Prefix = "TypeA/train" };
|
||||
var sourceB = new BlobContentSource(blobContainerUri) { Prefix = "TypeB/train" };
|
||||
|
||||
var docTypes = new Dictionary<string, ClassifierDocumentTypeDetails>()
|
||||
{
|
||||
{ "TypeA", new ClassifierDocumentTypeDetails(sourceA) },
|
||||
{ "TypeB", new ClassifierDocumentTypeDetails(sourceB) }
|
||||
};
|
||||
|
||||
var options = new BuildClassifierOptions(classifierId, docTypes);
|
||||
|
||||
Operation<DocumentClassifierDetails> operation = await adminClient.BuildClassifierAsync(
|
||||
WaitUntil.Completed,
|
||||
options);
|
||||
|
||||
DocumentClassifierDetails classifier = operation.Value;
|
||||
Console.WriteLine($"Classifier ID: {classifier.ClassifierId}");
|
||||
```
|
||||
|
||||
### 6. Classify Document
|
||||
|
||||
```csharp
|
||||
string classifierId = "my-classifier";
|
||||
Uri documentUri = new Uri("https://example.com/document.pdf");
|
||||
|
||||
var options = new ClassifyDocumentOptions(classifierId, documentUri);
|
||||
|
||||
Operation<AnalyzeResult> operation = await client.ClassifyDocumentAsync(
|
||||
WaitUntil.Completed,
|
||||
options);
|
||||
|
||||
AnalyzeResult result = operation.Value;
|
||||
|
||||
foreach (AnalyzedDocument document in result.Documents)
|
||||
{
|
||||
Console.WriteLine($"Document type: {document.DocumentType}, confidence: {document.Confidence}");
|
||||
}
|
||||
```
|
||||
|
||||
### 7. Manage Models
|
||||
|
||||
```csharp
|
||||
// Get resource details
|
||||
DocumentIntelligenceResourceDetails resourceDetails = await adminClient.GetResourceDetailsAsync();
|
||||
Console.WriteLine($"Custom models: {resourceDetails.CustomDocumentModels.Count}/{resourceDetails.CustomDocumentModels.Limit}");
|
||||
|
||||
// Get specific model
|
||||
DocumentModelDetails model = await adminClient.GetModelAsync("my-model-id");
|
||||
Console.WriteLine($"Model: {model.ModelId}, Created: {model.CreatedOn}");
|
||||
|
||||
// List models
|
||||
await foreach (DocumentModelDetails modelItem in adminClient.GetModelsAsync())
|
||||
{
|
||||
Console.WriteLine($"Model: {modelItem.ModelId}");
|
||||
}
|
||||
|
||||
// Delete model
|
||||
await adminClient.DeleteModelAsync("my-model-id");
|
||||
```
|
||||
|
||||
## Key Types Reference

| Type | Description |
|------|-------------|
| `DocumentIntelligenceClient` | Main client for analysis |
| `DocumentIntelligenceAdministrationClient` | Model management |
| `AnalyzeResult` | Result of document analysis |
| `AnalyzedDocument` | Single document within result |
| `DocumentField` | Extracted field with value and confidence |
| `DocumentFieldType` | String, Date, Number, Currency, etc. |
| `DocumentPage` | Page info (lines, words, selection marks) |
| `DocumentTable` | Extracted table with cells |
| `DocumentModelDetails` | Custom model metadata |
| `BlobContentSource` | Training data source |

## Build Modes

| Mode | Use Case |
|------|----------|
| `DocumentBuildMode.Template` | Fixed layout documents (forms) |
| `DocumentBuildMode.Neural` | Variable layout documents |

## Best Practices

1. **Use DefaultAzureCredential** for production
2. **Reuse client instances** — clients are thread-safe
3. **Handle long-running operations** — Use `WaitUntil.Completed` for simplicity
4. **Check field confidence** — Always verify `Confidence` property
5. **Use appropriate model** — Prebuilt for common docs, custom for specialized
6. **Use custom subdomain** — Required for Entra ID authentication

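Practice 4 can be sketched as a small guard around the field lookups shown earlier. This is a hedged sketch: the `0.8` cutoff is an illustrative choice, not an SDK default, and should be tuned per document type.

```csharp
// Sketch: only trust an extracted field above a chosen confidence threshold.
// The 0.8 cutoff is illustrative, not an SDK default.
if (document.Fields.TryGetValue("Total", out DocumentField totalField)
    && totalField.Confidence >= 0.8f)
{
    Console.WriteLine($"Total: {totalField.ValueCurrency.Amount}");
}
else
{
    Console.WriteLine("Total missing or low confidence; route for manual review.");
}
```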
## Error Handling

```csharp
using Azure;

try
{
    var operation = await client.AnalyzeDocumentAsync(
        WaitUntil.Completed,
        "prebuilt-invoice",
        documentUri);
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Error: {ex.Status} - {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.DocumentIntelligence` | Document analysis (this SDK) | `dotnet add package Azure.AI.DocumentIntelligence` |
| `Azure.AI.FormRecognizer` | Legacy SDK (deprecated) | Use DocumentIntelligence instead |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.DocumentIntelligence |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.documentintelligence |
| GitHub Samples | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/documentintelligence/Azure.AI.DocumentIntelligence/samples |
| Document Intelligence Studio | https://documentintelligence.ai.azure.com/ |
| Prebuilt Models | https://aka.ms/azsdk/formrecognizer/models |

455
skills/official/microsoft/dotnet/foundry/openai/SKILL.md
Normal file
@@ -0,0 +1,455 @@
---
name: azure-ai-openai-dotnet
description: |
  Azure OpenAI SDK for .NET. Client library for Azure OpenAI and OpenAI services. Use for chat completions, embeddings, image generation, audio transcription, and assistants. Triggers: "Azure OpenAI", "AzureOpenAIClient", "ChatClient", "chat completions .NET", "GPT-4", "embeddings", "DALL-E", "Whisper", "OpenAI .NET".
package: Azure.AI.OpenAI
---

# Azure.AI.OpenAI (.NET)

Client library for Azure OpenAI Service providing access to OpenAI models including GPT-4, GPT-4o, embeddings, DALL-E, and Whisper.

## Installation

```bash
dotnet add package Azure.AI.OpenAI

# For OpenAI (non-Azure) compatibility
dotnet add package OpenAI
```

**Current Version**: 2.1.0 (stable)

## Environment Variables

```bash
AZURE_OPENAI_ENDPOINT=https://<resource-name>.openai.azure.com
AZURE_OPENAI_API_KEY=<api-key>  # For key-based auth
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o-mini  # Your deployment name
```

## Client Hierarchy

```
AzureOpenAIClient (top-level)
├── GetChatClient(deploymentName) → ChatClient
├── GetEmbeddingClient(deploymentName) → EmbeddingClient
├── GetImageClient(deploymentName) → ImageClient
├── GetAudioClient(deploymentName) → AudioClient
└── GetAssistantClient() → AssistantClient
```

## Authentication

### API Key Authentication

```csharp
using Azure;
using Azure.AI.OpenAI;

AzureOpenAIClient client = new(
    new Uri(Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!));
```

### Microsoft Entra ID (Recommended for Production)

```csharp
using Azure.Identity;
using Azure.AI.OpenAI;

AzureOpenAIClient client = new(
    new Uri(Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!),
    new DefaultAzureCredential());
```

### Using OpenAI SDK Directly with Azure

```csharp
using Azure.Identity;
using OpenAI;
using OpenAI.Chat;
using System.ClientModel.Primitives;

#pragma warning disable OPENAI001

BearerTokenPolicy tokenPolicy = new(
    new DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default");

ChatClient client = new(
    model: "gpt-4o-mini",
    authenticationPolicy: tokenPolicy,
    options: new OpenAIClientOptions()
    {
        Endpoint = new Uri("https://YOUR-RESOURCE.openai.azure.com/openai/v1")
    });
```

## Chat Completions

### Basic Chat

```csharp
using Azure.AI.OpenAI;
using OpenAI.Chat;

AzureOpenAIClient azureClient = new(
    new Uri(endpoint),
    new DefaultAzureCredential());

ChatClient chatClient = azureClient.GetChatClient("gpt-4o-mini");

ChatCompletion completion = chatClient.CompleteChat(
    [
        new SystemChatMessage("You are a helpful assistant."),
        new UserChatMessage("What is Azure OpenAI?")
    ]);

Console.WriteLine(completion.Content[0].Text);
```

### Async Chat

```csharp
ChatCompletion completion = await chatClient.CompleteChatAsync(
    [
        new SystemChatMessage("You are a helpful assistant."),
        new UserChatMessage("Explain cloud computing in simple terms.")
    ]);

Console.WriteLine($"Response: {completion.Content[0].Text}");
Console.WriteLine($"Tokens used: {completion.Usage.TotalTokenCount}");
```

### Streaming Chat

```csharp
await foreach (StreamingChatCompletionUpdate update
    in chatClient.CompleteChatStreamingAsync(messages))
{
    if (update.ContentUpdate.Count > 0)
    {
        Console.Write(update.ContentUpdate[0].Text);
    }
}
```

### Chat with Options

```csharp
ChatCompletionOptions options = new()
{
    MaxOutputTokenCount = 1000,
    Temperature = 0.7f,
    TopP = 0.95f,
    FrequencyPenalty = 0,
    PresencePenalty = 0
};

ChatCompletion completion = await chatClient.CompleteChatAsync(messages, options);
```

### Multi-turn Conversation

```csharp
List<ChatMessage> messages = new()
{
    new SystemChatMessage("You are a helpful assistant."),
    new UserChatMessage("Hi, can you help me?"),
    new AssistantChatMessage("Of course! What do you need help with?"),
    new UserChatMessage("What's the capital of France?")
};

ChatCompletion completion = await chatClient.CompleteChatAsync(messages);
messages.Add(new AssistantChatMessage(completion.Content[0].Text));
```

## Structured Outputs (JSON Schema)

```csharp
using System.Text.Json;

ChatCompletionOptions options = new()
{
    ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
        jsonSchemaFormatName: "math_reasoning",
        jsonSchema: BinaryData.FromBytes("""
            {
              "type": "object",
              "properties": {
                "steps": {
                  "type": "array",
                  "items": {
                    "type": "object",
                    "properties": {
                      "explanation": { "type": "string" },
                      "output": { "type": "string" }
                    },
                    "required": ["explanation", "output"],
                    "additionalProperties": false
                  }
                },
                "final_answer": { "type": "string" }
              },
              "required": ["steps", "final_answer"],
              "additionalProperties": false
            }
            """u8.ToArray()),
        jsonSchemaIsStrict: true)
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [new UserChatMessage("How can I solve 8x + 7 = -23?")],
    options);

using JsonDocument json = JsonDocument.Parse(completion.Content[0].Text);
Console.WriteLine($"Answer: {json.RootElement.GetProperty("final_answer")}");
```

## Reasoning Models (o1, o4-mini)

```csharp
ChatCompletionOptions options = new()
{
    ReasoningEffortLevel = ChatReasoningEffortLevel.Low,
    MaxOutputTokenCount = 100000
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [
        new DeveloperChatMessage("You are a helpful assistant"),
        new UserChatMessage("Explain the theory of relativity")
    ], options);
```

## Azure AI Search Integration (RAG)

```csharp
using Azure.AI.OpenAI.Chat;

#pragma warning disable AOAI001

ChatCompletionOptions options = new();
options.AddDataSource(new AzureSearchChatDataSource()
{
    Endpoint = new Uri(searchEndpoint),
    IndexName = searchIndex,
    Authentication = DataSourceAuthentication.FromApiKey(searchKey)
});

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [new UserChatMessage("What health plans are available?")],
    options);

ChatMessageContext context = completion.GetMessageContext();
if (context?.Intent is not null)
{
    Console.WriteLine($"Intent: {context.Intent}");
}
foreach (ChatCitation citation in context?.Citations ?? [])
{
    Console.WriteLine($"Citation: {citation.Content}");
}
```

## Embeddings

```csharp
using OpenAI.Embeddings;

EmbeddingClient embeddingClient = azureClient.GetEmbeddingClient("text-embedding-ada-002");

OpenAIEmbedding embedding = await embeddingClient.GenerateEmbeddingAsync("Hello, world!");
ReadOnlyMemory<float> vector = embedding.ToFloats();

Console.WriteLine($"Embedding dimensions: {vector.Length}");
```

### Batch Embeddings

```csharp
List<string> inputs = new()
{
    "First document text",
    "Second document text",
    "Third document text"
};

OpenAIEmbeddingCollection embeddings = await embeddingClient.GenerateEmbeddingsAsync(inputs);

foreach (OpenAIEmbedding emb in embeddings)
{
    Console.WriteLine($"Index {emb.Index}: {emb.ToFloats().Length} dimensions");
}
```

## Image Generation (DALL-E)

```csharp
using OpenAI.Images;

ImageClient imageClient = azureClient.GetImageClient("dall-e-3");

GeneratedImage image = await imageClient.GenerateImageAsync(
    "A futuristic city skyline at sunset",
    new ImageGenerationOptions
    {
        Size = GeneratedImageSize.W1024xH1024,
        Quality = GeneratedImageQuality.High,
        Style = GeneratedImageStyle.Vivid
    });

Console.WriteLine($"Image URL: {image.ImageUri}");
```

## Audio (Whisper)

### Transcription

```csharp
using OpenAI.Audio;

AudioClient audioClient = azureClient.GetAudioClient("whisper");

AudioTranscription transcription = await audioClient.TranscribeAudioAsync(
    "audio.mp3",
    new AudioTranscriptionOptions
    {
        ResponseFormat = AudioTranscriptionFormat.Verbose,
        Language = "en"
    });

Console.WriteLine(transcription.Text);
```

### Text-to-Speech

```csharp
BinaryData speech = await audioClient.GenerateSpeechAsync(
    "Hello, welcome to Azure OpenAI!",
    GeneratedSpeechVoice.Alloy,
    new SpeechGenerationOptions
    {
        SpeedRatio = 1.0f,
        ResponseFormat = GeneratedSpeechFormat.Mp3
    });

await File.WriteAllBytesAsync("output.mp3", speech.ToArray());
```

## Function Calling (Tools)

```csharp
ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool(
    functionName: "get_current_weather",
    functionDescription: "Get the current weather in a given location",
    functionParameters: BinaryData.FromString("""
        {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
        """));

ChatCompletionOptions options = new()
{
    Tools = { getCurrentWeatherTool }
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [new UserChatMessage("What's the weather in Seattle?")],
    options);

if (completion.FinishReason == ChatFinishReason.ToolCalls)
{
    foreach (ChatToolCall toolCall in completion.ToolCalls)
    {
        Console.WriteLine($"Function: {toolCall.FunctionName}");
        Console.WriteLine($"Arguments: {toolCall.FunctionArguments}");
    }
}
```

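The snippet above only prints the requested call. A hedged sketch of the follow-up round-trip (executing the function yourself and feeding the result back as a `ToolChatMessage`) might look like this; `GetWeather` is a hypothetical local function standing in for your own implementation:

```csharp
// Sketch: answer the model's tool calls, then ask it to continue.
List<ChatMessage> messages = [new UserChatMessage("What's the weather in Seattle?")];
messages.Add(new AssistantChatMessage(completion));

foreach (ChatToolCall toolCall in completion.ToolCalls)
{
    // GetWeather is a hypothetical function you implement locally.
    string toolResult = GetWeather(toolCall.FunctionArguments.ToString());
    messages.Add(new ToolChatMessage(toolCall.Id, toolResult));
}

ChatCompletion followUp = await chatClient.CompleteChatAsync(messages, options);
Console.WriteLine(followUp.Content[0].Text);
```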
## Key Types Reference

| Type | Purpose |
|------|---------|
| `AzureOpenAIClient` | Top-level client for Azure OpenAI |
| `ChatClient` | Chat completions |
| `EmbeddingClient` | Text embeddings |
| `ImageClient` | Image generation (DALL-E) |
| `AudioClient` | Audio transcription/TTS |
| `ChatCompletion` | Chat response |
| `ChatCompletionOptions` | Request configuration |
| `StreamingChatCompletionUpdate` | Streaming response chunk |
| `ChatMessage` | Base message type |
| `SystemChatMessage` | System prompt |
| `UserChatMessage` | User input |
| `AssistantChatMessage` | Assistant response |
| `DeveloperChatMessage` | Developer message (reasoning models) |
| `ChatTool` | Function/tool definition |
| `ChatToolCall` | Tool invocation request |

## Best Practices

1. **Use Entra ID in production** — Avoid API keys; use `DefaultAzureCredential`
2. **Reuse client instances** — Create once, share across requests
3. **Handle rate limits** — Implement exponential backoff for 429 errors
4. **Stream for long responses** — Use `CompleteChatStreamingAsync` for better UX
5. **Set appropriate timeouts** — Long completions may need extended timeouts
6. **Use structured outputs** — JSON schema ensures consistent response format
7. **Monitor token usage** — Track `completion.Usage` for cost management
8. **Validate tool calls** — Always validate function arguments before execution

## Error Handling

```csharp
using Azure;

try
{
    ChatCompletion completion = await chatClient.CompleteChatAsync(messages);
}
catch (RequestFailedException ex) when (ex.Status == 429)
{
    Console.WriteLine("Rate limited. Retry after delay.");
    await Task.Delay(TimeSpan.FromSeconds(10));
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Bad request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure OpenAI error: {ex.Status} - {ex.Message}");
}
```

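Best practice 3 (exponential backoff) can be layered on top of the 429 handler above. This is a minimal hand-rolled sketch, not the SDK's built-in behavior; note that the Azure client pipeline already retries transient failures by default, so use a pattern like this only when you need custom behavior on top:

```csharp
// Sketch: manual exponential backoff for 429s (1s, 2s, 4s, 8s).
for (int attempt = 0; attempt < 5; attempt++)
{
    try
    {
        ChatCompletion completion = await chatClient.CompleteChatAsync(messages);
        break;  // success
    }
    catch (RequestFailedException ex) when (ex.Status == 429 && attempt < 4)
    {
        await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
    }
}
```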
## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.OpenAI` | Azure OpenAI client (this SDK) | `dotnet add package Azure.AI.OpenAI` |
| `OpenAI` | OpenAI compatibility | `dotnet add package OpenAI` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |
| `Azure.Search.Documents` | AI Search for RAG | `dotnet add package Azure.Search.Documents` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.OpenAI |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.openai |
| Migration Guide (1.0→2.0) | https://learn.microsoft.com/azure/ai-services/openai/how-to/dotnet-migration |
| Quickstart | https://learn.microsoft.com/azure/ai-services/openai/quickstart |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/openai/Azure.AI.OpenAI |

348
skills/official/microsoft/dotnet/foundry/projects/SKILL.md
Normal file
@@ -0,0 +1,348 @@
---
name: azure-ai-projects-dotnet
description: |
  Azure AI Projects SDK for .NET. High-level client for Azure AI Foundry projects including agents, connections, datasets, deployments, evaluations, and indexes. Use for AI Foundry project management, versioned agents, and orchestration. Triggers: "AI Projects", "AIProjectClient", "Foundry project", "versioned agents", "evaluations", "datasets", "connections", "deployments .NET".
package: Azure.AI.Projects
---

# Azure.AI.Projects (.NET)

High-level SDK for Azure AI Foundry project operations including agents, connections, datasets, deployments, evaluations, and indexes.

## Installation

```bash
dotnet add package Azure.AI.Projects
dotnet add package Azure.Identity

# Optional: For versioned agents with OpenAI extensions
dotnet add package Azure.AI.Projects.OpenAI --prerelease

# Optional: For low-level agent operations
dotnet add package Azure.AI.Agents.Persistent --prerelease
```

**Current Versions**: GA v1.1.0, Preview v1.2.0-beta.5

## Environment Variables

```bash
PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
MODEL_DEPLOYMENT_NAME=gpt-4o-mini
CONNECTION_NAME=<your-connection-name>
AI_SEARCH_CONNECTION_NAME=<ai-search-connection>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.AI.Projects;

var endpoint = Environment.GetEnvironmentVariable("PROJECT_ENDPOINT");
AIProjectClient projectClient = new AIProjectClient(
    new Uri(endpoint),
    new DefaultAzureCredential());
```

## Client Hierarchy

```
AIProjectClient
├── Agents → AIProjectAgentsOperations (versioned agents)
├── Connections → ConnectionsClient
├── Datasets → DatasetsClient
├── Deployments → DeploymentsClient
├── Evaluations → EvaluationsClient
├── Evaluators → EvaluatorsClient
├── Indexes → IndexesClient
├── Telemetry → AIProjectTelemetry
├── OpenAI → ProjectOpenAIClient (preview)
└── GetPersistentAgentsClient() → PersistentAgentsClient
```

## Core Workflows

### 1. Get Persistent Agents Client

```csharp
// Get low-level agents client from project client
PersistentAgentsClient agentsClient = projectClient.GetPersistentAgentsClient();

// Create agent
PersistentAgent agent = await agentsClient.Administration.CreateAgentAsync(
    model: "gpt-4o-mini",
    name: "Math Tutor",
    instructions: "You are a personal math tutor.");

// Create thread and run
PersistentAgentThread thread = await agentsClient.Threads.CreateThreadAsync();
await agentsClient.Messages.CreateMessageAsync(thread.Id, MessageRole.User, "Solve 3x + 11 = 14");
ThreadRun run = await agentsClient.Runs.CreateRunAsync(thread.Id, agent.Id);

// Poll for completion
do
{
    await Task.Delay(500);
    run = await agentsClient.Runs.GetRunAsync(thread.Id, run.Id);
}
while (run.Status == RunStatus.Queued || run.Status == RunStatus.InProgress);

// Get messages
await foreach (var msg in agentsClient.Messages.GetMessagesAsync(thread.Id))
{
    foreach (var content in msg.ContentItems)
    {
        if (content is MessageTextContent textContent)
            Console.WriteLine(textContent.Text);
    }
}

// Cleanup
await agentsClient.Threads.DeleteThreadAsync(thread.Id);
await agentsClient.Administration.DeleteAgentAsync(agent.Id);
```

### 2. Versioned Agents with Tools (Preview)

```csharp
using Azure.AI.Projects.OpenAI;

// Create agent with web search tool
PromptAgentDefinition agentDefinition = new(model: "gpt-4o-mini")
{
    Instructions = "You are a helpful assistant that can search the web",
    Tools = {
        ResponseTool.CreateWebSearchTool(
            userLocation: WebSearchToolLocation.CreateApproximateLocation(
                country: "US",
                city: "Seattle",
                region: "Washington"
            )
        ),
    }
};

AgentVersion agentVersion = await projectClient.Agents.CreateAgentVersionAsync(
    agentName: "myAgent",
    options: new(agentDefinition));

// Get response client
ProjectResponsesClient responseClient = projectClient.OpenAI.GetProjectResponsesClientForAgent(agentVersion.Name);

// Create response
ResponseResult response = responseClient.CreateResponse("What's the weather in Seattle?");
Console.WriteLine(response.GetOutputText());

// Cleanup
projectClient.Agents.DeleteAgentVersion(agentName: agentVersion.Name, agentVersion: agentVersion.Version);
```

### 3. Connections

```csharp
// List all connections
foreach (AIProjectConnection connection in projectClient.Connections.GetConnections())
{
    Console.WriteLine($"{connection.Name}: {connection.ConnectionType}");
}

// Get specific connection
AIProjectConnection conn = projectClient.Connections.GetConnection(
    connectionName,
    includeCredentials: true);

// Get default connection
AIProjectConnection defaultConn = projectClient.Connections.GetDefaultConnection(
    includeCredentials: false);
```

### 4. Deployments

```csharp
// List all deployments
foreach (AIProjectDeployment deployment in projectClient.Deployments.GetDeployments())
{
    Console.WriteLine($"{deployment.Name}: {deployment.ModelName}");
}

// Filter by publisher
foreach (var deployment in projectClient.Deployments.GetDeployments(modelPublisher: "Microsoft"))
{
    Console.WriteLine(deployment.Name);
}

// Get specific deployment
ModelDeployment details = (ModelDeployment)projectClient.Deployments.GetDeployment("gpt-4o-mini");
```

### 5. Datasets

```csharp
// Upload single file
FileDataset fileDataset = projectClient.Datasets.UploadFile(
    name: "my-dataset",
    version: "1.0",
    filePath: "data/training.txt",
    connectionName: connectionName);

// Upload folder
FolderDataset folderDataset = projectClient.Datasets.UploadFolder(
    name: "my-dataset",
    version: "2.0",
    folderPath: "data/training",
    connectionName: connectionName,
    filePattern: new Regex(".*\\.txt"));

// Get dataset
AIProjectDataset dataset = projectClient.Datasets.GetDataset("my-dataset", "1.0");

// Delete dataset
projectClient.Datasets.Delete("my-dataset", "1.0");
```

### 6. Indexes

```csharp
// Create Azure AI Search index
AzureAISearchIndex searchIndex = new(aiSearchConnectionName, aiSearchIndexName)
{
    Description = "Sample Index"
};

searchIndex = (AzureAISearchIndex)projectClient.Indexes.CreateOrUpdate(
    name: "my-index",
    version: "1.0",
    index: searchIndex);

// List indexes
foreach (AIProjectIndex index in projectClient.Indexes.GetIndexes())
{
    Console.WriteLine(index.Name);
}

// Delete index
projectClient.Indexes.Delete(name: "my-index", version: "1.0");
```

### 7. Evaluations

```csharp
// Create evaluation configuration
var evaluatorConfig = new EvaluatorConfiguration(id: EvaluatorIDs.Relevance);
evaluatorConfig.InitParams.Add("deployment_name", BinaryData.FromObjectAsJson("gpt-4o"));

// Create evaluation
Evaluation evaluation = new Evaluation(
    data: new InputDataset("<dataset_id>"),
    evaluators: new Dictionary<string, EvaluatorConfiguration>
    {
        { "relevance", evaluatorConfig }
    }
)
{
    DisplayName = "Sample Evaluation"
};

// Run evaluation
Evaluation result = projectClient.Evaluations.Create(evaluation: evaluation);

// Get evaluation
Evaluation getResult = projectClient.Evaluations.Get(result.Name);

// List evaluations
foreach (var eval in projectClient.Evaluations.GetAll())
{
    Console.WriteLine($"{eval.DisplayName}: {eval.Status}");
}
```

### 8. Get Azure OpenAI Chat Client

```csharp
using Azure.AI.OpenAI;
using OpenAI.Chat;

ClientConnection connection = projectClient.GetConnection(typeof(AzureOpenAIClient).FullName!);

if (!connection.TryGetLocatorAsUri(out Uri uri) || uri is null)
    throw new InvalidOperationException("Invalid URI.");

uri = new Uri($"https://{uri.Host}");

AzureOpenAIClient azureOpenAIClient = new AzureOpenAIClient(uri, new DefaultAzureCredential());
ChatClient chatClient = azureOpenAIClient.GetChatClient("gpt-4o-mini");

ChatCompletion result = chatClient.CompleteChat("List all rainbow colors");
Console.WriteLine(result.Content[0].Text);
```

## Available Agent Tools

| Tool | Class | Purpose |
|------|-------|---------|
| Code Interpreter | `CodeInterpreterToolDefinition` | Execute Python code |
| File Search | `FileSearchToolDefinition` | Search uploaded files |
| Function Calling | `FunctionToolDefinition` | Call custom functions |
| Bing Grounding | `BingGroundingToolDefinition` | Web search via Bing |
| Azure AI Search | `AzureAISearchToolDefinition` | Search Azure AI indexes |
| OpenAPI | `OpenApiToolDefinition` | Call external APIs |
| Azure Functions | `AzureFunctionToolDefinition` | Invoke Azure Functions |
| MCP | `MCPToolDefinition` | Model Context Protocol tools |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `AIProjectClient` | Main entry point |
| `PersistentAgentsClient` | Low-level agent operations |
| `PromptAgentDefinition` | Versioned agent definition |
| `AgentVersion` | Versioned agent instance |
| `AIProjectConnection` | Connection to Azure resource |
| `AIProjectDeployment` | Model deployment info |
| `AIProjectDataset` | Dataset metadata |
| `AIProjectIndex` | Search index metadata |
| `Evaluation` | Evaluation configuration and results |

## Best Practices

1. **Use `DefaultAzureCredential`** for production authentication
2. **Use async methods** (`*Async`) for all I/O operations
3. **Poll with appropriate delays** (500ms recommended) when waiting for runs
4. **Clean up resources** — delete threads, agents, and files when done
5. **Use versioned agents** (via `Azure.AI.Projects.OpenAI`) for production scenarios
6. **Store connection IDs** rather than names for tool configurations
7. **Use `includeCredentials: true`** only when credentials are needed
8. **Handle pagination** — use `AsyncPageable<T>` for listing operations

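Practice 8 in concrete terms: listing operations return pageable collections, and the async variants can be consumed with `await foreach`, which transparently fetches additional pages. `GetConnectionsAsync` is assumed here as the async counterpart of the `GetConnections` call shown earlier:

```csharp
// Sketch: consume a pageable listing asynchronously; paging is handled for you.
await foreach (AIProjectConnection connection in projectClient.Connections.GetConnectionsAsync())
{
    Console.WriteLine($"{connection.Name}: {connection.ConnectionType}");
}
```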
## Error Handling

```csharp
using Azure;

try
{
    var result = await projectClient.Evaluations.CreateAsync(evaluation);
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.Projects` | High-level project client (this SDK) | `dotnet add package Azure.AI.Projects` |
| `Azure.AI.Agents.Persistent` | Low-level agent operations | `dotnet add package Azure.AI.Agents.Persistent` |
| `Azure.AI.Projects.OpenAI` | Versioned agents with OpenAI | `dotnet add package Azure.AI.Projects.OpenAI` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.Projects |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.projects |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Projects |
| Samples | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Projects/samples |

---
name: azure-search-documents-dotnet
description: |
  Azure AI Search SDK for .NET (Azure.Search.Documents). Use for building search applications with full-text, vector, semantic, and hybrid search. Covers SearchClient (queries, document CRUD), SearchIndexClient (index management), and SearchIndexerClient (indexers, skillsets). Triggers: "Azure Search .NET", "SearchClient", "SearchIndexClient", "vector search C#", "semantic search .NET", "hybrid search", "Azure.Search.Documents".
package: Azure.Search.Documents
---

# Azure.Search.Documents (.NET)

Build search applications with full-text, vector, semantic, and hybrid search capabilities.

## Installation

```bash
dotnet add package Azure.Search.Documents
dotnet add package Azure.Identity
```

**Current Versions**: Stable v11.7.0, Preview v11.8.0-beta.1

## Environment Variables

```bash
SEARCH_ENDPOINT=https://<search-service>.search.windows.net
SEARCH_INDEX_NAME=<index-name>
# For API key auth (not recommended for production)
SEARCH_API_KEY=<api-key>
```

## Authentication

**DefaultAzureCredential (preferred)**:

```csharp
using Azure.Identity;
using Azure.Search.Documents;

var credential = new DefaultAzureCredential();
var client = new SearchClient(
    new Uri(Environment.GetEnvironmentVariable("SEARCH_ENDPOINT")),
    Environment.GetEnvironmentVariable("SEARCH_INDEX_NAME"),
    credential);
```

**API Key**:

```csharp
using Azure;
using Azure.Search.Documents;

var credential = new AzureKeyCredential(
    Environment.GetEnvironmentVariable("SEARCH_API_KEY"));
var client = new SearchClient(
    new Uri(Environment.GetEnvironmentVariable("SEARCH_ENDPOINT")),
    Environment.GetEnvironmentVariable("SEARCH_INDEX_NAME"),
    credential);
```

## Client Selection

| Client | Purpose |
|--------|---------|
| `SearchClient` | Query indexes, upload/update/delete documents |
| `SearchIndexClient` | Create/manage indexes, synonym maps |
| `SearchIndexerClient` | Manage indexers, skillsets, data sources |

## Index Creation

### Using FieldBuilder (Recommended)

```csharp
using Azure.Search.Documents.Indexes;
using Azure.Search.Documents.Indexes.Models;

// Define model with attributes
public class Hotel
{
    [SimpleField(IsKey = true, IsFilterable = true)]
    public string HotelId { get; set; }

    [SearchableField(IsSortable = true)]
    public string HotelName { get; set; }

    [SearchableField(AnalyzerName = LexicalAnalyzerName.EnLucene)]
    public string Description { get; set; }

    [SimpleField(IsFilterable = true, IsSortable = true, IsFacetable = true)]
    public double? Rating { get; set; }

    [VectorSearchField(VectorSearchDimensions = 1536, VectorSearchProfileName = "vector-profile")]
    public ReadOnlyMemory<float>? DescriptionVector { get; set; }
}

// Create index
var indexClient = new SearchIndexClient(endpoint, credential);
var fieldBuilder = new FieldBuilder();
var fields = fieldBuilder.Build(typeof(Hotel));

var index = new SearchIndex("hotels")
{
    Fields = fields,
    VectorSearch = new VectorSearch
    {
        Profiles = { new VectorSearchProfile("vector-profile", "hnsw-algo") },
        Algorithms = { new HnswAlgorithmConfiguration("hnsw-algo") }
    }
};

await indexClient.CreateOrUpdateIndexAsync(index);
```

### Manual Field Definition

```csharp
var index = new SearchIndex("hotels")
{
    Fields =
    {
        new SimpleField("hotelId", SearchFieldDataType.String) { IsKey = true, IsFilterable = true },
        new SearchableField("hotelName") { IsSortable = true },
        new SearchableField("description") { AnalyzerName = LexicalAnalyzerName.EnLucene },
        new SimpleField("rating", SearchFieldDataType.Double) { IsFilterable = true, IsSortable = true },
        new SearchField("descriptionVector", SearchFieldDataType.Collection(SearchFieldDataType.Single))
        {
            VectorSearchDimensions = 1536,
            VectorSearchProfileName = "vector-profile"
        }
    }
};
```

## Document Operations

```csharp
var searchClient = new SearchClient(endpoint, indexName, credential);

// Upload (add new)
var hotels = new[] { new Hotel { HotelId = "1", HotelName = "Hotel A" } };
await searchClient.UploadDocumentsAsync(hotels);

// Merge (update existing)
await searchClient.MergeDocumentsAsync(hotels);

// Merge or upload (upsert)
await searchClient.MergeOrUploadDocumentsAsync(hotels);

// Delete
await searchClient.DeleteDocumentsAsync("hotelId", new[] { "1", "2" });

// Batch operations
var batch = IndexDocumentsBatch.Create(
    IndexDocumentsAction.Upload(hotel1),
    IndexDocumentsAction.Merge(hotel2),
    IndexDocumentsAction.Delete(hotel3));
await searchClient.IndexDocumentsAsync(batch);
```

## Search Patterns

### Basic Search

```csharp
var options = new SearchOptions
{
    Filter = "rating ge 4",
    OrderBy = { "rating desc" },
    Select = { "hotelId", "hotelName", "rating" },
    Size = 10,
    Skip = 0,
    IncludeTotalCount = true
};

SearchResults<Hotel> results = await searchClient.SearchAsync<Hotel>("luxury", options);

Console.WriteLine($"Total: {results.TotalCount}");
await foreach (SearchResult<Hotel> result in results.GetResultsAsync())
{
    Console.WriteLine($"{result.Document.HotelName} (Score: {result.Score})");
}
```

### Faceted Search

```csharp
var options = new SearchOptions
{
    Facets = { "rating,count:5", "category" }
};

var results = await searchClient.SearchAsync<Hotel>("*", options);

foreach (var facet in results.Value.Facets["rating"])
{
    Console.WriteLine($"Rating {facet.Value}: {facet.Count}");
}
```

### Autocomplete and Suggestions

```csharp
// Autocomplete
var autocompleteOptions = new AutocompleteOptions { Mode = AutocompleteMode.OneTermWithContext };
var autocomplete = await searchClient.AutocompleteAsync("lux", "suggester-name", autocompleteOptions);

// Suggestions
var suggestOptions = new SuggestOptions { UseFuzzyMatching = true };
var suggestions = await searchClient.SuggestAsync<Hotel>("lux", "suggester-name", suggestOptions);
```

## Vector Search

See [references/vector-search.md](references/vector-search.md) for detailed patterns.

```csharp
using Azure.Search.Documents.Models;

// Pure vector search
var vectorQuery = new VectorizedQuery(embedding)
{
    KNearestNeighborsCount = 5,
    Fields = { "descriptionVector" }
};

var options = new SearchOptions
{
    VectorSearch = new VectorSearchOptions
    {
        Queries = { vectorQuery }
    }
};

var results = await searchClient.SearchAsync<Hotel>(null, options);
```

## Semantic Search

See [references/semantic-search.md](references/semantic-search.md) for detailed patterns.

```csharp
var options = new SearchOptions
{
    QueryType = SearchQueryType.Semantic,
    SemanticSearch = new SemanticSearchOptions
    {
        SemanticConfigurationName = "my-semantic-config",
        QueryCaption = new QueryCaption(QueryCaptionType.Extractive),
        QueryAnswer = new QueryAnswer(QueryAnswerType.Extractive)
    }
};

var results = await searchClient.SearchAsync<Hotel>("best hotel for families", options);

// Access semantic answers
foreach (var answer in results.Value.SemanticSearch.Answers)
{
    Console.WriteLine($"Answer: {answer.Text} (Score: {answer.Score})");
}

// Access captions
await foreach (var result in results.Value.GetResultsAsync())
{
    var caption = result.SemanticSearch?.Captions?.FirstOrDefault();
    Console.WriteLine($"Caption: {caption?.Text}");
}
```

## Hybrid Search (Vector + Keyword + Semantic)

```csharp
var vectorQuery = new VectorizedQuery(embedding)
{
    KNearestNeighborsCount = 5,
    Fields = { "descriptionVector" }
};

var options = new SearchOptions
{
    QueryType = SearchQueryType.Semantic,
    SemanticSearch = new SemanticSearchOptions
    {
        SemanticConfigurationName = "my-semantic-config"
    },
    VectorSearch = new VectorSearchOptions
    {
        Queries = { vectorQuery }
    }
};

// Combines keyword search, vector search, and semantic ranking
var results = await searchClient.SearchAsync<Hotel>("luxury beachfront", options);
```

## Field Attributes Reference

| Attribute | Purpose |
|-----------|---------|
| `SimpleField` | Non-searchable field (filters, sorting, facets) |
| `SearchableField` | Full-text searchable field |
| `VectorSearchField` | Vector embedding field |
| `IsKey = true` | Document key (required, one per index) |
| `IsFilterable = true` | Enable `$filter` expressions |
| `IsSortable = true` | Enable `$orderby` |
| `IsFacetable = true` | Enable faceted navigation |
| `IsHidden = true` | Exclude from results |
| `AnalyzerName` | Specify text analyzer |

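When building `$filter` expressions against filterable fields from user input, single quotes inside an OData string literal must be escaped by doubling them. A minimal sketch (the `ODataFilter` helper is ours, not part of the SDK):

```csharp
using System;

// Hypothetical helper (not part of Azure.Search.Documents): builds an
// OData equality filter against a string field, doubling embedded
// single quotes per the OData string-literal escaping rule.
public static class ODataFilter
{
    public static string Eq(string field, string value) =>
        $"{field} eq '{value.Replace("'", "''")}'";
}

// ODataFilter.Eq("hotelName", "O'Reilly Inn")
//   → hotelName eq 'O''Reilly Inn'
```

The SDK also ships `SearchFilter.Create`, which performs this quoting for interpolated values; prefer it when available.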
## Error Handling

```csharp
using Azure;

try
{
    var results = await searchClient.SearchAsync<Hotel>("query");
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    Console.WriteLine("Index not found");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Search error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Best Practices

1. **Use `DefaultAzureCredential`** over API keys for production
2. **Use `FieldBuilder`** with model attributes for type-safe index definitions
3. **Use `CreateOrUpdateIndexAsync`** for idempotent index creation
4. **Batch document operations** for better throughput
5. **Use `Select`** to return only needed fields
6. **Configure semantic search** for natural language queries
7. **Combine vector + keyword + semantic** for best relevance

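For practice 4, large document sets should be split into fixed-size batches before indexing (the service limits how many documents one indexing request may carry; 1,000 is a commonly cited cap, so verify against current service limits). A generic chunking sketch, with `Batching.Chunk` being our own helper name:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper: splits a document stream into fixed-size batches,
// each of which can be sent via UploadDocumentsAsync / IndexDocumentsAsync.
public static class Batching
{
    public static IEnumerable<List<T>> Chunk<T>(IEnumerable<T> source, int size = 1000)
    {
        if (size <= 0) throw new ArgumentOutOfRangeException(nameof(size));
        var batch = new List<T>(size);
        foreach (T item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;           // emit a full batch
                batch = new List<T>(size);
            }
        }
        if (batch.Count > 0) yield return batch;  // emit the remainder
    }
}
```

Each yielded batch can then be passed to `searchClient.UploadDocumentsAsync(batch)` in a loop.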
## Reference Files

| File | Contents |
|------|----------|
| [references/vector-search.md](references/vector-search.md) | Vector search, hybrid search, vectorizers |
| [references/semantic-search.md](references/semantic-search.md) | Semantic ranking, captions, answers |

skills/official/microsoft/dotnet/foundry/voicelive/SKILL.md

---
name: azure-ai-voicelive-dotnet
description: |
  Azure AI Voice Live SDK for .NET. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants, conversational AI, real-time speech-to-speech, and voice-enabled chatbots. Triggers: "voice live", "real-time voice", "VoiceLiveClient", "VoiceLiveSession", "voice assistant .NET", "bidirectional audio", "speech-to-speech".
package: Azure.AI.VoiceLive
---

# Azure.AI.VoiceLive (.NET)

Real-time voice AI SDK for building bidirectional voice assistants with Azure AI.

## Installation

```bash
dotnet add package Azure.AI.VoiceLive
dotnet add package Azure.Identity
dotnet add package NAudio  # For audio capture/playback
```

**Current Versions**: Stable v1.0.0, Preview v1.1.0-beta.1

## Environment Variables

```bash
AZURE_VOICELIVE_ENDPOINT=https://<resource>.services.ai.azure.com/
AZURE_VOICELIVE_MODEL=gpt-4o-realtime-preview
AZURE_VOICELIVE_VOICE=en-US-AvaNeural
# Optional: API key if not using Entra ID
AZURE_VOICELIVE_API_KEY=<your-api-key>
```

## Authentication

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.AI.VoiceLive;

Uri endpoint = new Uri("https://your-resource.cognitiveservices.azure.com");
DefaultAzureCredential credential = new DefaultAzureCredential();
VoiceLiveClient client = new VoiceLiveClient(endpoint, credential);
```

**Required Role**: `Cognitive Services User` (assign in Azure Portal → Access control)

### API Key

```csharp
using Azure;
using Azure.AI.VoiceLive;

Uri endpoint = new Uri("https://your-resource.cognitiveservices.azure.com");
AzureKeyCredential credential = new AzureKeyCredential("your-api-key");
VoiceLiveClient client = new VoiceLiveClient(endpoint, credential);
```

## Client Hierarchy

```
VoiceLiveClient
└── VoiceLiveSession (WebSocket connection)
    ├── ConfigureSessionAsync()
    ├── GetUpdatesAsync() → SessionUpdate events
    ├── AddItemAsync() → UserMessageItem, FunctionCallOutputItem
    ├── SendAudioAsync()
    └── StartResponseAsync()
```

## Core Workflow

### 1. Start Session and Configure

```csharp
using Azure.Identity;
using Azure.AI.VoiceLive;

var endpoint = new Uri(Environment.GetEnvironmentVariable("AZURE_VOICELIVE_ENDPOINT"));
var client = new VoiceLiveClient(endpoint, new DefaultAzureCredential());

var model = "gpt-4o-mini-realtime-preview";

// Start session
using VoiceLiveSession session = await client.StartSessionAsync(model);

// Configure session
VoiceLiveSessionOptions sessionOptions = new()
{
    Model = model,
    Instructions = "You are a helpful AI assistant. Respond naturally.",
    Voice = new AzureStandardVoice("en-US-AvaNeural"),
    TurnDetection = new AzureSemanticVadTurnDetection()
    {
        Threshold = 0.5f,
        PrefixPadding = TimeSpan.FromMilliseconds(300),
        SilenceDuration = TimeSpan.FromMilliseconds(500)
    },
    InputAudioFormat = InputAudioFormat.Pcm16,
    OutputAudioFormat = OutputAudioFormat.Pcm16
};

// Set modalities (both text and audio for voice assistants)
sessionOptions.Modalities.Clear();
sessionOptions.Modalities.Add(InteractionModality.Text);
sessionOptions.Modalities.Add(InteractionModality.Audio);

await session.ConfigureSessionAsync(sessionOptions);
```

### 2. Process Events

```csharp
await foreach (SessionUpdate serverEvent in session.GetUpdatesAsync())
{
    switch (serverEvent)
    {
        case SessionUpdateResponseAudioDelta audioDelta:
            byte[] audioData = audioDelta.Delta.ToArray();
            // Play audio via NAudio or other audio library
            break;

        case SessionUpdateResponseTextDelta textDelta:
            Console.Write(textDelta.Delta);
            break;

        case SessionUpdateResponseFunctionCallArgumentsDone functionCall:
            // Handle function call (see Function Calling section)
            break;

        case SessionUpdateError error:
            Console.WriteLine($"Error: {error.Error.Message}");
            break;

        case SessionUpdateResponseDone:
            Console.WriteLine("\n--- Response complete ---");
            break;
    }
}
```

### 3. Send User Message

```csharp
await session.AddItemAsync(new UserMessageItem("Hello, can you help me?"));
await session.StartResponseAsync();
```

### 4. Function Calling

```csharp
// Define function
var weatherFunction = new VoiceLiveFunctionDefinition("get_current_weather")
{
    Description = "Get the current weather for a given location",
    Parameters = BinaryData.FromString("""
    {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state or country"
            }
        },
        "required": ["location"]
    }
    """)
};

// Add to session options
sessionOptions.Tools.Add(weatherFunction);

// Handle function call in event loop
if (serverEvent is SessionUpdateResponseFunctionCallArgumentsDone functionCall)
{
    if (functionCall.Name == "get_current_weather")
    {
        var parameters = JsonSerializer.Deserialize<Dictionary<string, string>>(functionCall.Arguments);
        string location = parameters?["location"] ?? "";

        // Call external service
        string weatherInfo = $"The weather in {location} is sunny, 75°F.";

        // Send response
        await session.AddItemAsync(new FunctionCallOutputItem(functionCall.CallId, weatherInfo));
        await session.StartResponseAsync();
    }
}
```

## Voice Options

| Voice Type | Class | Example |
|------------|-------|---------|
| Azure Standard | `AzureStandardVoice` | `"en-US-AvaNeural"` |
| Azure HD | `AzureStandardVoice` | `"en-US-Ava:DragonHDLatestNeural"` |
| Azure Custom | `AzureCustomVoice` | Custom voice with endpoint ID |

## Supported Models

| Model | Description |
|-------|-------------|
| `gpt-4o-realtime-preview` | GPT-4o with real-time audio |
| `gpt-4o-mini-realtime-preview` | Lightweight, fast interactions |
| `phi4-mm-realtime` | Cost-effective multimodal |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `VoiceLiveClient` | Main client for creating sessions |
| `VoiceLiveSession` | Active WebSocket session |
| `VoiceLiveSessionOptions` | Session configuration |
| `AzureStandardVoice` | Standard Azure voice provider |
| `AzureSemanticVadTurnDetection` | Voice activity detection |
| `VoiceLiveFunctionDefinition` | Function tool definition |
| `UserMessageItem` | User text message |
| `FunctionCallOutputItem` | Function call response |
| `SessionUpdateResponseAudioDelta` | Audio chunk event |
| `SessionUpdateResponseTextDelta` | Text chunk event |

## Best Practices

1. **Always set both modalities** — Include `Text` and `Audio` for voice assistants
2. **Use `AzureSemanticVadTurnDetection`** — Provides natural conversation flow
3. **Configure appropriate silence duration** — 500ms is typical to avoid premature cutoffs
4. **Use a `using` statement** — Ensures proper session disposal
5. **Handle all event types** — Check for errors, audio, text, and function calls
6. **Use `DefaultAzureCredential`** — Never hardcode API keys

## Error Handling

```csharp
if (serverEvent is SessionUpdateError error)
{
    if (error.Error.Message.Contains("Cancellation failed: no active response"))
    {
        // Benign error, can ignore
    }
    else
    {
        Console.WriteLine($"Error: {error.Error.Message}");
    }
}
```

## Audio Configuration

- **Input Format**: `InputAudioFormat.Pcm16` (16-bit PCM)
- **Output Format**: `OutputAudioFormat.Pcm16`
- **Sample Rate**: 24 kHz recommended
- **Channels**: Mono

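The format above fixes the byte budget for audio buffers: 24,000 samples/s × 2 bytes/sample × 1 channel. A small sanity-math sketch (`Pcm16` is our own illustrative type, not part of the SDK):

```csharp
// Hypothetical helper: buffer sizing for the recommended
// 24 kHz, mono, 16-bit PCM format.
public static class Pcm16
{
    public const int SampleRateHz = 24_000;
    public const int BytesPerSample = 2;  // 16-bit
    public const int Channels = 1;        // mono

    // Size in bytes of one audio frame of the given duration.
    public static int BytesPerFrame(int frameMs) =>
        SampleRateHz * BytesPerSample * Channels * frameMs / 1000;
}

// A 20 ms capture frame is 960 bytes; one second of audio is 48,000 bytes.
```

Useful when sizing NAudio capture buffers or chunking audio for `SendAudioAsync()`.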
## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.VoiceLive` | Real-time voice (this SDK) | `dotnet add package Azure.AI.VoiceLive` |
| `Microsoft.CognitiveServices.Speech` | Speech-to-text, text-to-speech | `dotnet add package Microsoft.CognitiveServices.Speech` |
| `NAudio` | Audio capture/playback | `dotnet add package NAudio` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.VoiceLive |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.voicelive |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.VoiceLive |
| Quickstart | https://learn.microsoft.com/azure/ai-services/speech-service/voice-live-quickstart |

---
name: azure-mgmt-weightsandbiases-dotnet
description: |
  Azure Weights & Biases SDK for .NET. ML experiment tracking and model management via Azure Marketplace. Use for creating W&B instances, managing SSO, marketplace integration, and ML observability. Triggers: "Weights and Biases", "W&B", "WeightsAndBiases", "ML experiment tracking", "model registry", "experiment management", "wandb".
package: Azure.ResourceManager.WeightsAndBiases
---

# Azure.ResourceManager.WeightsAndBiases (.NET)

Azure Resource Manager SDK for deploying and managing Weights & Biases ML experiment tracking instances via Azure Marketplace.

## Installation

```bash
dotnet add package Azure.ResourceManager.WeightsAndBiases --prerelease
dotnet add package Azure.Identity
```

**Current Version**: v1.0.0-beta.1 (preview)
**API Version**: 2024-09-18-preview

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_WANDB_INSTANCE_NAME=<your-wandb-instance>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.WeightsAndBiases;

ArmClient client = new ArmClient(new DefaultAzureCredential());
```

## Resource Hierarchy

```
Subscription
└── ResourceGroup
    └── WeightsAndBiasesInstance          # W&B deployment from Azure Marketplace
        ├── Properties
        │   ├── Marketplace               # Offer details, plan, publisher
        │   ├── User                      # Admin user info
        │   ├── PartnerProperties         # W&B-specific config (region, subdomain)
        │   └── SingleSignOnPropertiesV2  # Entra ID SSO configuration
        └── Identity                      # Managed identity (optional)
```

## Core Workflows

### 1. Create Weights & Biases Instance

```csharp
using Azure.ResourceManager.WeightsAndBiases;
using Azure.ResourceManager.WeightsAndBiases.Models;

SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

WeightsAndBiasesInstanceCollection instances = resourceGroup.GetWeightsAndBiasesInstances();

WeightsAndBiasesInstanceData data = new WeightsAndBiasesInstanceData(AzureLocation.EastUS)
{
    Properties = new WeightsAndBiasesInstanceProperties
    {
        // Marketplace configuration
        Marketplace = new WeightsAndBiasesMarketplaceDetails
        {
            SubscriptionId = "<marketplace-subscription-id>",
            OfferDetails = new WeightsAndBiasesOfferDetails
            {
                PublisherId = "wandb",
                OfferId = "wandb-pay-as-you-go",
                PlanId = "wandb-payg",
                PlanName = "Pay As You Go",
                TermId = "monthly",
                TermUnit = "P1M"
            }
        },
        // Admin user
        User = new WeightsAndBiasesUserDetails
        {
            FirstName = "Admin",
            LastName = "User",
            EmailAddress = "admin@example.com",
            Upn = "admin@example.com"
        },
        // W&B-specific configuration
        PartnerProperties = new WeightsAndBiasesPartnerProperties
        {
            Region = WeightsAndBiasesRegion.EastUS,
            Subdomain = "my-company-wandb"
        }
    },
    // Optional: enable managed identity
    Identity = new ManagedServiceIdentity(ManagedServiceIdentityType.SystemAssigned)
};

ArmOperation<WeightsAndBiasesInstanceResource> operation = await instances
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-wandb-instance", data);

WeightsAndBiasesInstanceResource instance = operation.Value;

Console.WriteLine($"W&B Instance created: {instance.Data.Name}");
Console.WriteLine($"Provisioning state: {instance.Data.Properties.ProvisioningState}");
```

### 2. Get Existing Instance

```csharp
WeightsAndBiasesInstanceResource instance = await resourceGroup
    .GetWeightsAndBiasesInstanceAsync("my-wandb-instance");

Console.WriteLine($"Instance: {instance.Data.Name}");
Console.WriteLine($"Location: {instance.Data.Location}");
Console.WriteLine($"State: {instance.Data.Properties.ProvisioningState}");

if (instance.Data.Properties.PartnerProperties != null)
{
    Console.WriteLine($"Region: {instance.Data.Properties.PartnerProperties.Region}");
    Console.WriteLine($"Subdomain: {instance.Data.Properties.PartnerProperties.Subdomain}");
}
```

### 3. List All Instances

```csharp
// List in resource group
await foreach (WeightsAndBiasesInstanceResource instance in
    resourceGroup.GetWeightsAndBiasesInstances())
{
    Console.WriteLine($"Instance: {instance.Data.Name}");
    Console.WriteLine($"  Location: {instance.Data.Location}");
    Console.WriteLine($"  State: {instance.Data.Properties.ProvisioningState}");
}

// List in subscription
SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
await foreach (WeightsAndBiasesInstanceResource instance in
    subscription.GetWeightsAndBiasesInstancesAsync())
{
    Console.WriteLine($"{instance.Data.Name} in {instance.Id.ResourceGroupName}");
}
```

### 4. Configure Single Sign-On (SSO)

```csharp
WeightsAndBiasesInstanceResource instance = await resourceGroup
    .GetWeightsAndBiasesInstanceAsync("my-wandb-instance");

// Update with SSO configuration
WeightsAndBiasesInstanceData updateData = instance.Data;

updateData.Properties.SingleSignOnPropertiesV2 = new WeightsAndBiasSingleSignOnPropertiesV2
{
    Type = WeightsAndBiasSingleSignOnType.Saml,
    State = WeightsAndBiasSingleSignOnState.Enable,
    EnterpriseAppId = "<entra-app-id>",
    AadDomains = { "example.com", "contoso.com" }
};

ArmOperation<WeightsAndBiasesInstanceResource> operation = await resourceGroup
    .GetWeightsAndBiasesInstances()
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-wandb-instance", updateData);
```

### 5. Update Instance

```csharp
WeightsAndBiasesInstanceResource instance = await resourceGroup
    .GetWeightsAndBiasesInstanceAsync("my-wandb-instance");

// Update tags
WeightsAndBiasesInstancePatch patch = new WeightsAndBiasesInstancePatch
{
    Tags =
    {
        { "environment", "production" },
        { "team", "ml-platform" },
        { "costCenter", "CC-ML-001" }
    }
};

instance = await instance.UpdateAsync(patch);
Console.WriteLine($"Updated instance: {instance.Data.Name}");
```

### 6. Delete Instance

```csharp
WeightsAndBiasesInstanceResource instance = await resourceGroup
    .GetWeightsAndBiasesInstanceAsync("my-wandb-instance");

await instance.DeleteAsync(WaitUntil.Completed);
Console.WriteLine("Instance deleted");
```

### 7. Check Resource Name Availability

```csharp
using Azure;

// Check whether the name is taken before creating
// (implement via a direct ARM call if the SDK doesn't expose this)
try
{
    await resourceGroup.GetWeightsAndBiasesInstanceAsync("desired-name");
    Console.WriteLine("Name is already taken");
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    Console.WriteLine("Name is available");
}
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `WeightsAndBiasesInstanceResource` | W&B instance resource |
| `WeightsAndBiasesInstanceData` | Instance configuration data |
| `WeightsAndBiasesInstanceCollection` | Collection of instances |
| `WeightsAndBiasesInstanceProperties` | Instance properties |
| `WeightsAndBiasesMarketplaceDetails` | Marketplace subscription info |
| `WeightsAndBiasesOfferDetails` | Marketplace offer details |
| `WeightsAndBiasesUserDetails` | Admin user information |
| `WeightsAndBiasesPartnerProperties` | W&B-specific configuration |
| `WeightsAndBiasSingleSignOnPropertiesV2` | SSO configuration |
| `WeightsAndBiasesInstancePatch` | Patch for updates |
| `WeightsAndBiasesRegion` | Supported regions enum |

## Available Regions

| Region Enum | Azure Region |
|-------------|--------------|
| `WeightsAndBiasesRegion.EastUS` | East US |
| `WeightsAndBiasesRegion.CentralUS` | Central US |
| `WeightsAndBiasesRegion.WestUS` | West US |
| `WeightsAndBiasesRegion.WestEurope` | West Europe |
| `WeightsAndBiasesRegion.JapanEast` | Japan East |
| `WeightsAndBiasesRegion.KoreaCentral` | Korea Central |

## Marketplace Offer Details

For Azure Marketplace integration:

| Property | Value |
|----------|-------|
| Publisher ID | `wandb` |
| Offer ID | `wandb-pay-as-you-go` |
| Plan ID | `wandb-payg` (Pay As You Go) |

## Best Practices
|
||||
|
||||
1. **Use DefaultAzureCredential** — Supports multiple auth methods automatically
|
||||
2. **Enable managed identity** — For secure access to other Azure resources
|
||||
3. **Configure SSO** — Enable Entra ID SSO for enterprise security
|
||||
4. **Tag resources** — Use tags for cost tracking and organization
|
||||
5. **Check provisioning state** — Wait for `Succeeded` before using instance
|
||||
6. **Use appropriate region** — Choose region closest to your compute
|
||||
7. **Monitor with Azure** — Use Azure Monitor for resource health
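
Practice 5 can be sketched as a simple polling loop. This is a minimal sketch, not SDK-verified code: the `Properties.ProvisioningState` path is assumed from the usual ARM resource-provider pattern and should be checked against the generated types.

```csharp
// Sketch: wait for the instance to reach a terminal provisioning state.
// Assumption: Data.Properties.ProvisioningState follows the standard ARM pattern.
WeightsAndBiasesInstanceResource instance = await resourceGroup
    .GetWeightsAndBiasesInstanceAsync("my-wandb");

while (instance.Data.Properties.ProvisioningState?.ToString() is "Provisioning" or "Accepted")
{
    await Task.Delay(TimeSpan.FromSeconds(15));
    instance = await instance.GetAsync(); // refresh resource state from ARM
}

Console.WriteLine($"Provisioning state: {instance.Data.Properties.ProvisioningState}");
```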

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<WeightsAndBiasesInstanceResource> operation = await instances
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-wandb", data);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Instance already exists or name conflict");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## Integration with W&B SDK

After creating the Azure resource, use the W&B Python SDK for experiment tracking:

```python
# Install: pip install wandb
import wandb

# Login with your W&B API key from the Azure-deployed instance
wandb.login(host="https://my-company-wandb.wandb.ai")

# Initialize a run
run = wandb.init(project="my-ml-project")

# Log metrics
wandb.log({"accuracy": 0.95, "loss": 0.05})

# Finish run
run.finish()
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.WeightsAndBiases` | W&B instance management (this SDK) | `dotnet add package Azure.ResourceManager.WeightsAndBiases --prerelease` |
| `Azure.ResourceManager.MachineLearning` | Azure ML workspaces | `dotnet add package Azure.ResourceManager.MachineLearning` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.WeightsAndBiases |
| W&B Documentation | https://docs.wandb.ai/ |
| Azure Marketplace | https://azuremarketplace.microsoft.com/marketplace/apps/wandb.wandb-pay-as-you-go |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/weightsandbiases |

494
skills/official/microsoft/dotnet/general/maps/SKILL.md
Normal file
@@ -0,0 +1,494 @@

---
name: azure-maps-search-dotnet
description: |
  Azure Maps SDK for .NET. Location-based services including geocoding, routing, rendering, geolocation, and weather. Use for address search, directions, map tiles, IP geolocation, and weather data. Triggers: "Azure Maps", "MapsSearchClient", "MapsRoutingClient", "MapsRenderingClient", "geocoding .NET", "route directions", "map tiles", "geolocation".
package: Azure.Maps.Search
---

# Azure Maps (.NET)

Azure Maps SDK for .NET providing location-based services: geocoding, routing, rendering, geolocation, and weather.

## Installation

```bash
# Search (geocoding, reverse geocoding)
dotnet add package Azure.Maps.Search --prerelease

# Routing (directions, route matrix)
dotnet add package Azure.Maps.Routing --prerelease

# Rendering (map tiles, static images)
dotnet add package Azure.Maps.Rendering --prerelease

# Geolocation (IP to location)
dotnet add package Azure.Maps.Geolocation --prerelease

# Weather
dotnet add package Azure.Maps.Weather --prerelease

# Resource Management (account management, SAS tokens)
dotnet add package Azure.ResourceManager.Maps --prerelease

# Required for authentication
dotnet add package Azure.Identity
```

**Current Versions**:
- `Azure.Maps.Search`: v2.0.0-beta.5
- `Azure.Maps.Routing`: v1.0.0-beta.4
- `Azure.Maps.Rendering`: v2.0.0-beta.1
- `Azure.Maps.Geolocation`: v1.0.0-beta.3
- `Azure.ResourceManager.Maps`: v1.1.0-beta.2

## Environment Variables

```bash
AZURE_MAPS_SUBSCRIPTION_KEY=<your-subscription-key>
AZURE_MAPS_CLIENT_ID=<your-client-id>  # For Entra ID auth
```

## Authentication

### Subscription Key (Shared Key)

```csharp
using Azure;
using Azure.Maps.Search;

var subscriptionKey = Environment.GetEnvironmentVariable("AZURE_MAPS_SUBSCRIPTION_KEY");
var credential = new AzureKeyCredential(subscriptionKey);

var client = new MapsSearchClient(credential);
```

### Microsoft Entra ID (Recommended for Production)

```csharp
using Azure.Identity;
using Azure.Maps.Search;

var credential = new DefaultAzureCredential();
var clientId = Environment.GetEnvironmentVariable("AZURE_MAPS_CLIENT_ID");

var client = new MapsSearchClient(credential, clientId);
```

### Shared Access Signature (SAS)

```csharp
using Azure;
using Azure.Core;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Maps;
using Azure.ResourceManager.Maps.Models;
using Azure.Maps.Search;

// Authenticate with Azure Resource Manager
ArmClient armClient = new ArmClient(new DefaultAzureCredential());

// Get Maps account resource
ResourceIdentifier mapsAccountResourceId = MapsAccountResource.CreateResourceIdentifier(
    subscriptionId, resourceGroupName, accountName);
MapsAccountResource mapsAccount = armClient.GetMapsAccountResource(mapsAccountResourceId);

// Generate SAS token
MapsAccountSasContent sasContent = new MapsAccountSasContent(
    MapsSigningKey.PrimaryKey,
    principalId,
    maxRatePerSecond: 500,
    start: DateTime.UtcNow.ToString("O"),
    expiry: DateTime.UtcNow.AddDays(1).ToString("O"));

Response<MapsAccountSasToken> sas = mapsAccount.GetSas(sasContent);

// Create client with SAS token
var sasCredential = new AzureSasCredential(sas.Value.AccountSasToken);
var client = new MapsSearchClient(sasCredential);
```

## Client Hierarchy

```
Azure.Maps.Search
└── MapsSearchClient
    ├── GetGeocoding()             → Geocode addresses
    ├── GetGeocodingBatch()        → Batch geocoding
    ├── GetReverseGeocoding()      → Coordinates to address
    ├── GetReverseGeocodingBatch() → Batch reverse geocoding
    └── GetPolygon()               → Get boundary polygons

Azure.Maps.Routing
└── MapsRoutingClient
    ├── GetDirections()            → Route directions
    ├── GetImmediateRouteMatrix()  → Route matrix (sync, ≤100)
    ├── GetRouteMatrix()           → Route matrix (async, ≤700)
    └── GetRouteRange()            → Isochrone/reachable range

Azure.Maps.Rendering
└── MapsRenderingClient
    ├── GetMapTile()               → Map tiles
    ├── GetMapStaticImage()        → Static map images
    └── GetCopyrightCaption()      → Copyright info

Azure.Maps.Geolocation
└── MapsGeolocationClient
    └── GetCountryCode()           → IP to country/region

Azure.Maps.Weather
└── MapsWeatherClient
    ├── GetCurrentWeatherConditions() → Current weather
    ├── GetDailyForecast()         → Daily forecast
    ├── GetHourlyForecast()        → Hourly forecast
    └── GetSevereWeatherAlerts()   → Weather alerts
```

## Core Workflows

### 1. Geocoding (Address to Coordinates)

```csharp
using Azure;
using Azure.Maps.Search;

var credential = new AzureKeyCredential(subscriptionKey);
var client = new MapsSearchClient(credential);

Response<GeocodingResponse> result = client.GetGeocoding("1 Microsoft Way, Redmond, WA 98052");

foreach (var feature in result.Value.Features)
{
    Console.WriteLine($"Coordinates: {string.Join(",", feature.Geometry.Coordinates)}");
    Console.WriteLine($"Address: {feature.Properties.Address.FormattedAddress}");
    Console.WriteLine($"Confidence: {feature.Properties.Confidence}");
}
```

### 2. Batch Geocoding

```csharp
using Azure.Maps.Search.Models.Queries;

List<GeocodingQuery> queries = new List<GeocodingQuery>
{
    new GeocodingQuery() { Query = "400 Broad St, Seattle, WA" },
    new GeocodingQuery() { Query = "1 Microsoft Way, Redmond, WA" },
    new GeocodingQuery() { AddressLine = "Space Needle", Top = 1 },
};

Response<GeocodingBatchResponse> results = client.GetGeocodingBatch(queries);

foreach (var batchItem in results.Value.BatchItems)
{
    foreach (var feature in batchItem.Features)
    {
        Console.WriteLine($"Coordinates: {string.Join(",", feature.Geometry.Coordinates)}");
    }
}
```

### 3. Reverse Geocoding (Coordinates to Address)

```csharp
using Azure.Core.GeoJson;

GeoPosition coordinates = new GeoPosition(-122.138685, 47.6305637);
Response<GeocodingResponse> result = client.GetReverseGeocoding(coordinates);

foreach (var feature in result.Value.Features)
{
    Console.WriteLine($"Address: {feature.Properties.Address.FormattedAddress}");
    Console.WriteLine($"Locality: {feature.Properties.Address.Locality}");
}
```

### 4. Get Boundary Polygon

```csharp
using Azure.Maps.Search.Models;

GetPolygonOptions options = new GetPolygonOptions()
{
    Coordinates = new GeoPosition(-122.204141, 47.61256),
    ResultType = BoundaryResultTypeEnum.Locality,
    Resolution = ResolutionEnum.Small,
};

Response<Boundary> result = client.GetPolygon(options);

Console.WriteLine($"Boundary copyright: {result.Value.Properties?.Copyright}");
Console.WriteLine($"Polygon count: {result.Value.Geometry.Count}");
```

### 5. Route Directions

```csharp
using Azure;
using Azure.Core.GeoJson;
using Azure.Maps.Routing;
using Azure.Maps.Routing.Models;

var client = new MapsRoutingClient(new AzureKeyCredential(subscriptionKey));

List<GeoPosition> routePoints = new List<GeoPosition>()
{
    new GeoPosition(-122.34, 47.61), // Seattle
    new GeoPosition(-122.13, 47.64)  // Redmond
};

RouteDirectionQuery query = new RouteDirectionQuery(routePoints);
Response<RouteDirections> result = client.GetDirections(query);

foreach (var route in result.Value.Routes)
{
    Console.WriteLine($"Distance: {route.Summary.LengthInMeters} meters");
    Console.WriteLine($"Duration: {route.Summary.TravelTimeDuration}");

    foreach (RouteLeg leg in route.Legs)
    {
        Console.WriteLine($"Leg points: {leg.Points.Count}");
    }
}
```

### 6. Route Directions with Options

```csharp
RouteDirectionOptions options = new RouteDirectionOptions()
{
    RouteType = RouteType.Fastest,
    UseTrafficData = true,
    TravelMode = TravelMode.Bicycle,
    Language = RoutingLanguage.EnglishUsa,
    InstructionsType = RouteInstructionsType.Text,
};

RouteDirectionQuery query = new RouteDirectionQuery(routePoints)
{
    RouteDirectionOptions = options
};

Response<RouteDirections> result = client.GetDirections(query);
```

### 7. Route Matrix

```csharp
RouteMatrixQuery routeMatrixQuery = new RouteMatrixQuery
{
    Origins = new List<GeoPosition>()
    {
        new GeoPosition(-122.34, 47.61),
        new GeoPosition(-122.13, 47.64)
    },
    Destinations = new List<GeoPosition>()
    {
        new GeoPosition(-122.20, 47.62),
        new GeoPosition(-122.40, 47.65)
    },
};

// Synchronous (up to 100 route combinations)
Response<RouteMatrixResult> result = client.GetImmediateRouteMatrix(routeMatrixQuery);

foreach (var cell in result.Value.Matrix.SelectMany(row => row))
{
    Console.WriteLine($"Distance: {cell.Response?.RouteSummary?.LengthInMeters}");
    Console.WriteLine($"Duration: {cell.Response?.RouteSummary?.TravelTimeDuration}");
}

// Asynchronous (up to 700 route combinations)
RouteMatrixOptions routeMatrixOptions = new RouteMatrixOptions(routeMatrixQuery)
{
    TravelTimeType = TravelTimeType.All,
};
GetRouteMatrixOperation asyncResult = client.GetRouteMatrix(WaitUntil.Completed, routeMatrixOptions);
```

### 8. Route Range (Isochrone)

```csharp
RouteRangeOptions options = new RouteRangeOptions(-122.34, 47.61)
{
    TimeBudget = new TimeSpan(0, 20, 0) // 20 minutes
};

Response<RouteRangeResult> result = client.GetRouteRange(options);

// result.Value.ReachableRange contains the polygon
Console.WriteLine($"Boundary points: {result.Value.ReachableRange.Boundary.Count}");
```

### 9. Get Map Tiles

```csharp
using Azure;
using Azure.Core.GeoJson;
using Azure.Maps.Rendering;

var client = new MapsRenderingClient(new AzureKeyCredential(subscriptionKey));

int zoom = 10;
int tileSize = 256;

// Convert coordinates to tile index
MapTileIndex tileIndex = MapsRenderingClient.PositionToTileXY(
    new GeoPosition(13.3854, 52.517), zoom, tileSize);

// Fetch map tile
GetMapTileOptions options = new GetMapTileOptions(
    MapTileSetId.MicrosoftImagery,
    new MapTileIndex(tileIndex.X, tileIndex.Y, zoom)
);

Response<Stream> mapTile = client.GetMapTile(options);

// Save to file
using (FileStream fileStream = File.Create("./MapTile.png"))
{
    mapTile.Value.CopyTo(fileStream);
}
```

### 10. IP Geolocation

```csharp
using System.Net;
using Azure;
using Azure.Maps.Geolocation;

var client = new MapsGeolocationClient(new AzureKeyCredential(subscriptionKey));

IPAddress ipAddress = IPAddress.Parse("2001:4898:80e8:b::189");
Response<CountryRegionResult> result = client.GetCountryCode(ipAddress);

Console.WriteLine($"Country ISO Code: {result.Value.IsoCode}");
```

### 11. Current Weather

```csharp
using Azure;
using Azure.Core.GeoJson;
using Azure.Maps.Weather;

var client = new MapsWeatherClient(new AzureKeyCredential(subscriptionKey));

var position = new GeoPosition(-122.13071, 47.64011);
var options = new GetCurrentWeatherConditionsOptions(position);

Response<CurrentConditionsResult> result = client.GetCurrentWeatherConditions(options);

foreach (var condition in result.Value.Results)
{
    Console.WriteLine($"Temperature: {condition.Temperature.Value} {condition.Temperature.Unit}");
    Console.WriteLine($"Weather: {condition.Phrase}");
    Console.WriteLine($"Humidity: {condition.RelativeHumidity}%");
}
```

## Key Types Reference

### Search Package

| Type | Purpose |
|------|---------|
| `MapsSearchClient` | Main client for search operations |
| `GeocodingResponse` | Geocoding result |
| `GeocodingBatchResponse` | Batch geocoding result |
| `GeocodingQuery` | Query for batch geocoding |
| `ReverseGeocodingQuery` | Query for batch reverse geocoding |
| `GetPolygonOptions` | Options for polygon retrieval |
| `Boundary` | Boundary polygon result |
| `BoundaryResultTypeEnum` | Boundary type (Locality, AdminDistrict, etc.) |
| `ResolutionEnum` | Polygon resolution (Small, Medium, Large) |

### Routing Package

| Type | Purpose |
|------|---------|
| `MapsRoutingClient` | Main client for routing operations |
| `RouteDirectionQuery` | Query for route directions |
| `RouteDirectionOptions` | Route calculation options |
| `RouteDirections` | Route directions result |
| `RouteLeg` | Segment of a route |
| `RouteMatrixQuery` | Query for route matrix |
| `RouteMatrixResult` | Route matrix result |
| `RouteRangeOptions` | Options for isochrone |
| `RouteRangeResult` | Isochrone result |
| `RouteType` | Route type (Fastest, Shortest, Eco, Thrilling) |
| `TravelMode` | Travel mode (Car, Truck, Bicycle, Pedestrian) |

### Rendering Package

| Type | Purpose |
|------|---------|
| `MapsRenderingClient` | Main client for rendering |
| `GetMapTileOptions` | Map tile options |
| `MapTileIndex` | Tile coordinates (X, Y, Zoom) |
| `MapTileSetId` | Tile set identifier |

### Common Types

| Type | Purpose |
|------|---------|
| `GeoPosition` | Geographic position (longitude, latitude) |
| `GeoBoundingBox` | Bounding box for geographic area |

## Best Practices

1. **Use Entra ID for production** — Prefer over subscription keys
2. **Batch operations** — Use batch geocoding for multiple addresses
3. **Cache results** — Geocoding results don't change frequently
4. **Use appropriate tile sizes** — 256 or 512 pixels based on display
5. **Handle rate limits** — Implement exponential backoff
6. **Use async route matrix** — For large matrix calculations (>100)
7. **Consider traffic data** — Set `UseTrafficData = true` for accurate ETAs
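
Practice 5 can be sketched as a small retry wrapper around any call. This is an illustrative pattern only, not an SDK helper; note that Azure SDK clients already retry transient failures (including 429s) through their built-in pipeline, so an app-level loop like this is mainly for explicit control. The `GetGeocodingAsync` call is assumed to be the async variant of `GetGeocoding`.

```csharp
// Sketch: exponential backoff for 429 (rate-limited) responses.
async Task<T> WithBackoffAsync<T>(Func<Task<T>> operation, int maxAttempts = 5)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return await operation();
        }
        catch (RequestFailedException ex) when (ex.Status == 429 && attempt < maxAttempts)
        {
            // Backoff: 1s, 2s, 4s, 8s, ...
            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
        }
    }
}

// Usage
Response<GeocodingResponse> result = await WithBackoffAsync(
    () => client.GetGeocodingAsync("1 Microsoft Way, Redmond, WA 98052"));
```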

## Error Handling

```csharp
try
{
    Response<GeocodingResponse> result = client.GetGeocoding(address);
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Status: {ex.Status}");
    Console.WriteLine($"Error: {ex.Message}");

    switch (ex.Status)
    {
        case 400:
            // Invalid request parameters
            break;
        case 401:
            // Authentication failed
            break;
        case 429:
            // Rate limited - implement backoff
            break;
    }
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Maps.Search` | Geocoding, search | `dotnet add package Azure.Maps.Search --prerelease` |
| `Azure.Maps.Routing` | Directions, matrix | `dotnet add package Azure.Maps.Routing --prerelease` |
| `Azure.Maps.Rendering` | Map tiles, images | `dotnet add package Azure.Maps.Rendering --prerelease` |
| `Azure.Maps.Geolocation` | IP geolocation | `dotnet add package Azure.Maps.Geolocation --prerelease` |
| `Azure.Maps.Weather` | Weather data | `dotnet add package Azure.Maps.Weather --prerelease` |
| `Azure.ResourceManager.Maps` | Account management | `dotnet add package Azure.ResourceManager.Maps --prerelease` |

## Reference Links

| Resource | URL |
|----------|-----|
| Azure Maps Documentation | https://learn.microsoft.com/azure/azure-maps/ |
| Search API Reference | https://learn.microsoft.com/dotnet/api/azure.maps.search |
| Routing API Reference | https://learn.microsoft.com/dotnet/api/azure.maps.routing |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/maps |
| Pricing | https://azure.microsoft.com/pricing/details/azure-maps/ |

411
skills/official/microsoft/dotnet/integration/apicenter/SKILL.md
Normal file
@@ -0,0 +1,411 @@
|
||||
---
|
||||
name: azure-mgmt-apicenter-dotnet
|
||||
description: |
|
||||
Azure API Center SDK for .NET. Centralized API inventory management with governance, versioning, and discovery. Use for creating API services, workspaces, APIs, versions, definitions, environments, deployments, and metadata schemas. Triggers: "API Center", "ApiCenterService", "ApiCenterWorkspace", "ApiCenterApi", "API inventory", "API governance", "API versioning", "API catalog", "API discovery".
|
||||
package: Azure.ResourceManager.ApiCenter
|
||||
---
|
||||
|
||||
# Azure.ResourceManager.ApiCenter (.NET)
|
||||
|
||||
Centralized API inventory and governance SDK for managing APIs across your organization.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
dotnet add package Azure.ResourceManager.ApiCenter
|
||||
dotnet add package Azure.Identity
|
||||
```
|
||||
|
||||
**Current Version**: v1.0.0 (GA)
|
||||
**API Version**: 2024-03-01
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
|
||||
AZURE_RESOURCE_GROUP=<your-resource-group>
|
||||
AZURE_APICENTER_SERVICE_NAME=<your-apicenter-service>
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
```csharp
|
||||
using Azure.Identity;
|
||||
using Azure.ResourceManager;
|
||||
using Azure.ResourceManager.ApiCenter;
|
||||
|
||||
ArmClient client = new ArmClient(new DefaultAzureCredential());
|
||||
```
|
||||
|
||||
## Resource Hierarchy
|
||||
|
||||
```
|
||||
Subscription
|
||||
└── ResourceGroup
|
||||
└── ApiCenterService # API inventory service
|
||||
├── Workspace # Logical grouping of APIs
|
||||
│ ├── Api # API definition
|
||||
│ │ └── ApiVersion # Version of the API
|
||||
│ │ └── ApiDefinition # OpenAPI/GraphQL/etc specification
|
||||
│ ├── Environment # Deployment target (dev/staging/prod)
|
||||
│ └── Deployment # API deployed to environment
|
||||
└── MetadataSchema # Custom metadata definitions
|
||||
```
|
||||
|
||||
## Core Workflows
|
||||
|
||||
### 1. Create API Center Service
|
||||
|
||||
```csharp
|
||||
using Azure.ResourceManager.ApiCenter;
|
||||
using Azure.ResourceManager.ApiCenter.Models;
|
||||
|
||||
ResourceGroupResource resourceGroup = await client
|
||||
.GetDefaultSubscriptionAsync()
|
||||
.Result
|
||||
.GetResourceGroupAsync("my-resource-group");
|
||||
|
||||
ApiCenterServiceCollection services = resourceGroup.GetApiCenterServices();
|
||||
|
||||
ApiCenterServiceData data = new ApiCenterServiceData(AzureLocation.EastUS)
|
||||
{
|
||||
Identity = new ManagedServiceIdentity(ManagedServiceIdentityType.SystemAssigned)
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterServiceResource> operation = await services
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "my-api-center", data);
|
||||
|
||||
ApiCenterServiceResource service = operation.Value;
|
||||
```
|
||||
|
||||
### 2. Create Workspace
|
||||
|
||||
```csharp
|
||||
ApiCenterWorkspaceCollection workspaces = service.GetApiCenterWorkspaces();
|
||||
|
||||
ApiCenterWorkspaceData workspaceData = new ApiCenterWorkspaceData
|
||||
{
|
||||
Title = "Engineering APIs",
|
||||
Description = "APIs owned by the engineering team"
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterWorkspaceResource> operation = await workspaces
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "engineering", workspaceData);
|
||||
|
||||
ApiCenterWorkspaceResource workspace = operation.Value;
|
||||
```
|
||||
|
||||
### 3. Create API
|
||||
|
||||
```csharp
|
||||
ApiCenterApiCollection apis = workspace.GetApiCenterApis();
|
||||
|
||||
ApiCenterApiData apiData = new ApiCenterApiData
|
||||
{
|
||||
Title = "Orders API",
|
||||
Description = "API for managing customer orders",
|
||||
Kind = ApiKind.Rest,
|
||||
LifecycleStage = ApiLifecycleStage.Production,
|
||||
TermsOfService = new ApiTermsOfService
|
||||
{
|
||||
Uri = new Uri("https://example.com/terms")
|
||||
},
|
||||
ExternalDocumentation =
|
||||
{
|
||||
new ApiExternalDocumentation
|
||||
{
|
||||
Title = "Documentation",
|
||||
Uri = new Uri("https://docs.example.com/orders")
|
||||
}
|
||||
},
|
||||
Contacts =
|
||||
{
|
||||
new ApiContact
|
||||
{
|
||||
Name = "API Support",
|
||||
Email = "api-support@example.com"
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
// Add custom metadata
|
||||
apiData.CustomProperties = BinaryData.FromObjectAsJson(new
|
||||
{
|
||||
team = "orders-team",
|
||||
costCenter = "CC-1234"
|
||||
});
|
||||
|
||||
ArmOperation<ApiCenterApiResource> operation = await apis
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "orders-api", apiData);
|
||||
|
||||
ApiCenterApiResource api = operation.Value;
|
||||
```
|
||||
|
||||
### 4. Create API Version
|
||||
|
||||
```csharp
|
||||
ApiCenterApiVersionCollection versions = api.GetApiCenterApiVersions();
|
||||
|
||||
ApiCenterApiVersionData versionData = new ApiCenterApiVersionData
|
||||
{
|
||||
Title = "v1.0.0",
|
||||
LifecycleStage = ApiLifecycleStage.Production
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterApiVersionResource> operation = await versions
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "v1-0-0", versionData);
|
||||
|
||||
ApiCenterApiVersionResource version = operation.Value;
|
||||
```
|
||||
|
||||
### 5. Create API Definition (Upload OpenAPI Spec)
|
||||
|
||||
```csharp
|
||||
ApiCenterApiDefinitionCollection definitions = version.GetApiCenterApiDefinitions();
|
||||
|
||||
ApiCenterApiDefinitionData definitionData = new ApiCenterApiDefinitionData
|
||||
{
|
||||
Title = "OpenAPI Specification",
|
||||
Description = "Orders API OpenAPI 3.0 definition"
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterApiDefinitionResource> operation = await definitions
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "openapi", definitionData);
|
||||
|
||||
ApiCenterApiDefinitionResource definition = operation.Value;
|
||||
|
||||
// Import specification
|
||||
string openApiSpec = await File.ReadAllTextAsync("orders-api.yaml");
|
||||
|
||||
ApiSpecImportContent importContent = new ApiSpecImportContent
|
||||
{
|
||||
Format = ApiSpecImportSourceFormat.Inline,
|
||||
Value = openApiSpec,
|
||||
Specification = new ApiSpecImportSpecification
|
||||
{
|
||||
Name = "openapi",
|
||||
Version = "3.0.1"
|
||||
}
|
||||
};
|
||||
|
||||
await definition.ImportSpecificationAsync(WaitUntil.Completed, importContent);
|
||||
```
|
||||
|
||||
### 6. Export API Specification
|
||||
|
||||
```csharp
|
||||
ApiCenterApiDefinitionResource definition = await client
|
||||
.GetApiCenterApiDefinitionResource(definitionResourceId)
|
||||
.GetAsync();
|
||||
|
||||
ArmOperation<ApiSpecExportResult> operation = await definition
|
||||
.ExportSpecificationAsync(WaitUntil.Completed);
|
||||
|
||||
ApiSpecExportResult result = operation.Value;
|
||||
|
||||
// result.Format - e.g., "inline"
|
||||
// result.Value - the specification content
|
||||
```
|
||||
|
||||
### 7. Create Environment
|
||||
|
||||
```csharp
|
||||
ApiCenterEnvironmentCollection environments = workspace.GetApiCenterEnvironments();
|
||||
|
||||
ApiCenterEnvironmentData envData = new ApiCenterEnvironmentData
|
||||
{
|
||||
Title = "Production",
|
||||
Description = "Production environment",
|
||||
Kind = ApiCenterEnvironmentKind.Production,
|
||||
Server = new ApiCenterEnvironmentServer
|
||||
{
|
||||
ManagementPortalUris = { new Uri("https://portal.azure.com") }
|
||||
},
|
||||
Onboarding = new EnvironmentOnboardingModel
|
||||
{
|
||||
Instructions = "Contact platform team for access",
|
||||
DeveloperPortalUris = { new Uri("https://developer.example.com") }
|
||||
}
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterEnvironmentResource> operation = await environments
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "production", envData);
|
||||
```
|
||||
|
||||
### 8. Create Deployment
|
||||
|
||||
```csharp
|
||||
ApiCenterDeploymentCollection deployments = workspace.GetApiCenterDeployments();
|
||||
|
||||
// Get environment resource ID
|
||||
ResourceIdentifier envResourceId = ApiCenterEnvironmentResource.CreateResourceIdentifier(
|
||||
subscriptionId, resourceGroupName, serviceName, workspaceName, "production");
|
||||
|
||||
// Get API definition resource ID
|
||||
ResourceIdentifier definitionResourceId = ApiCenterApiDefinitionResource.CreateResourceIdentifier(
|
||||
subscriptionId, resourceGroupName, serviceName, workspaceName,
|
||||
"orders-api", "v1-0-0", "openapi");
|
||||
|
||||
ApiCenterDeploymentData deploymentData = new ApiCenterDeploymentData
|
||||
{
|
||||
Title = "Orders API - Production",
|
||||
Description = "Production deployment of Orders API v1.0.0",
|
||||
EnvironmentId = envResourceId,
|
||||
DefinitionId = definitionResourceId,
|
||||
State = ApiCenterDeploymentState.Active,
|
||||
Server = new ApiCenterDeploymentServer
|
||||
{
|
||||
RuntimeUris = { new Uri("https://api.example.com/orders") }
|
||||
}
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterDeploymentResource> operation = await deployments
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "orders-api-prod", deploymentData);
|
||||
```
|
||||
|
||||
### 9. Create Metadata Schema
|
||||
|
||||
```csharp
|
||||
ApiCenterMetadataSchemaCollection schemas = service.GetApiCenterMetadataSchemas();
|
||||
|
||||
string jsonSchema = """
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"team": {
|
||||
"type": "string",
|
||||
"title": "Owning Team"
|
||||
},
|
||||
"costCenter": {
|
||||
"type": "string",
|
||||
"title": "Cost Center"
|
||||
},
|
||||
"dataClassification": {
|
||||
"type": "string",
|
||||
"enum": ["public", "internal", "confidential"],
|
||||
"title": "Data Classification"
|
||||
}
|
||||
},
|
||||
"required": ["team"]
|
||||
}
|
||||
""";
|
||||
|
||||
ApiCenterMetadataSchemaData schemaData = new ApiCenterMetadataSchemaData
|
||||
{
|
||||
Schema = jsonSchema,
|
||||
AssignedTo =
|
||||
{
|
||||
new MetadataAssignment
|
||||
{
|
||||
Entity = MetadataAssignmentEntity.Api,
|
||||
Required = true
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterMetadataSchemaResource> operation = await schemas
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "api-metadata", schemaData);
|
||||
```

### 10. List and Search APIs

```csharp
// List all APIs in a workspace
ApiCenterWorkspaceResource workspace = await client
    .GetApiCenterWorkspaceResource(workspaceResourceId)
    .GetAsync();

await foreach (ApiCenterApiResource api in workspace.GetApiCenterApis())
{
    Console.WriteLine($"API: {api.Data.Title}");
    Console.WriteLine($"  Kind: {api.Data.Kind}");
    Console.WriteLine($"  Stage: {api.Data.LifecycleStage}");

    // List versions
    await foreach (ApiCenterApiVersionResource version in api.GetApiCenterApiVersions())
    {
        Console.WriteLine($"  Version: {version.Data.Title}");
    }
}

// List environments
await foreach (ApiCenterEnvironmentResource env in workspace.GetApiCenterEnvironments())
{
    Console.WriteLine($"Environment: {env.Data.Title} ({env.Data.Kind})");
}

// List deployments
await foreach (ApiCenterDeploymentResource deployment in workspace.GetApiCenterDeployments())
{
    Console.WriteLine($"Deployment: {deployment.Data.Title}");
    Console.WriteLine($"  State: {deployment.Data.State}");
}
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ApiCenterServiceResource` | API Center service instance |
| `ApiCenterWorkspaceResource` | Logical grouping of APIs |
| `ApiCenterApiResource` | Individual API |
| `ApiCenterApiVersionResource` | Version of an API |
| `ApiCenterApiDefinitionResource` | API specification (OpenAPI, etc.) |
| `ApiCenterEnvironmentResource` | Deployment environment |
| `ApiCenterDeploymentResource` | API deployment to environment |
| `ApiCenterMetadataSchemaResource` | Custom metadata schema |
| `ApiKind` | rest, graphql, grpc, soap, webhook, websocket, mcp |
| `ApiLifecycleStage` | design, development, testing, preview, production, deprecated, retired |
| `ApiCenterEnvironmentKind` | development, testing, staging, production |
| `ApiCenterDeploymentState` | active, inactive |

## Best Practices

1. **Organize with workspaces** — Group APIs by team, domain, or product
2. **Use metadata schemas** — Define custom properties for governance
3. **Track lifecycle stages** — Keep API status current (design → production → deprecated)
4. **Document environments** — Include onboarding instructions and portal URIs
5. **Version consistently** — Use semantic versioning for API versions
6. **Import specifications** — Upload OpenAPI/GraphQL specs for discovery
7. **Link deployments** — Connect APIs to their runtime environments
8. **Use managed identity** — Enable SystemAssigned identity for secure integrations
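
Practice 8 can be sketched as follows. This is a hedged example using the `ManagedServiceIdentity` pattern common across `Azure.ResourceManager` SDKs; the `Identity` property name on `ApiCenterServiceData` and the `resourceGroup` variable are assumptions, so verify against your SDK version:

```csharp
using Azure.Core;
using Azure.ResourceManager.ApiCenter;
using Azure.ResourceManager.Models;

// Enable a system-assigned managed identity when creating the service.
// ManagedServiceIdentity is the shared ARM identity type; confirm that
// ApiCenterServiceData exposes an Identity property in your SDK version.
var serviceData = new ApiCenterServiceData(AzureLocation.EastUS)
{
    Identity = new ManagedServiceIdentity(ManagedServiceIdentityType.SystemAssigned)
};

// `resourceGroup` is assumed to be a Response<ResourceGroupResource> from earlier setup.
ArmOperation<ApiCenterServiceResource> op = await resourceGroup.Value
    .GetApiCenterServices()
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-api-center", serviceData);
```

The identity's principal can then be granted RBAC roles (for example, on a storage account) without storing any secrets.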

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<ApiCenterApiResource> operation = await apis
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-api", apiData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("API already exists with conflicting configuration");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.ApiCenter` | API Center management (this SDK) | `dotnet add package Azure.ResourceManager.ApiCenter` |
| `Azure.ResourceManager.ApiManagement` | API gateway and policies | `dotnet add package Azure.ResourceManager.ApiManagement` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.ApiCenter |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.apicenter |
| Product Documentation | https://learn.microsoft.com/azure/api-center/ |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/apicenter/Azure.ResourceManager.ApiCenter |
---
name: azure-mgmt-apimanagement-dotnet
description: |
  Azure Resource Manager SDK for API Management in .NET. Use for MANAGEMENT PLANE operations: creating/managing APIM services, APIs, products, subscriptions, policies, users, groups, gateways, and backends via Azure Resource Manager. Triggers: "API Management", "APIM service", "create APIM", "manage APIs", "ApiManagementServiceResource", "API policies", "APIM products", "APIM subscriptions".
package: Azure.ResourceManager.ApiManagement
---

# Azure.ResourceManager.ApiManagement (.NET)

Management plane SDK for provisioning and managing Azure API Management resources via Azure Resource Manager.

> **⚠️ Management vs Data Plane**
> - **This SDK (Azure.ResourceManager.ApiManagement)**: Create services, APIs, products, subscriptions, policies, users, groups
> - **Data Plane**: Direct API calls to your APIM gateway endpoints

## Installation

```bash
dotnet add package Azure.ResourceManager.ApiManagement
dotnet add package Azure.Identity
```

**Current Version**: v1.3.0

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Core; // ResourceIdentifier
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ApiManagement;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── ApiManagementServiceResource
            ├── ApiResource
            │   ├── ApiOperationResource
            │   │   └── ApiOperationPolicyResource
            │   ├── ApiPolicyResource
            │   ├── ApiSchemaResource
            │   └── ApiDiagnosticResource
            ├── ApiManagementProductResource
            │   ├── ProductApiResource
            │   ├── ProductGroupResource
            │   └── ProductPolicyResource
            ├── ApiManagementSubscriptionResource
            ├── ApiManagementPolicyResource
            ├── ApiManagementUserResource
            ├── ApiManagementGroupResource
            ├── ApiManagementBackendResource
            ├── ApiManagementGatewayResource
            ├── ApiManagementCertificateResource
            ├── ApiManagementNamedValueResource
            └── ApiManagementLoggerResource
```

## Core Workflow

### 1. Create API Management Service

```csharp
using Azure.Core; // AzureLocation
using Azure.ResourceManager.ApiManagement;
using Azure.ResourceManager.ApiManagement.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define service
var serviceData = new ApiManagementServiceData(
    location: AzureLocation.EastUS,
    sku: new ApiManagementServiceSkuProperties(
        ApiManagementServiceSkuType.Developer,
        capacity: 1),
    publisherEmail: "admin@contoso.com",
    publisherName: "Contoso");

// Create service (long-running operation - can take 30+ minutes)
var serviceCollection = resourceGroup.Value.GetApiManagementServices();
var operation = await serviceCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-apim-service",
    serviceData);

ApiManagementServiceResource service = operation.Value;
```

### 2. Create an API

```csharp
var apiData = new ApiCreateOrUpdateContent
{
    DisplayName = "My API",
    Path = "myapi",
    Protocols = { ApiOperationInvokableProtocol.Https },
    ServiceUri = new Uri("https://backend.contoso.com/api")
};

var apiCollection = service.GetApis();
var apiOperation = await apiCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-api",
    apiData);

ApiResource api = apiOperation.Value;
```

### 3. Create a Product

```csharp
var productData = new ApiManagementProductData
{
    DisplayName = "Starter",
    Description = "Starter tier with limited access",
    IsSubscriptionRequired = true,
    IsApprovalRequired = false,
    SubscriptionsLimit = 1,
    State = ApiManagementProductState.Published
};

var productCollection = service.GetApiManagementProducts();
var productOperation = await productCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "starter",
    productData);

ApiManagementProductResource product = productOperation.Value;

// Add API to product
await product.GetProductApis().CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-api");
```

### 4. Create a Subscription

```csharp
var subscriptionData = new ApiManagementSubscriptionCreateOrUpdateContent
{
    DisplayName = "My Subscription",
    Scope = $"/products/{product.Data.Name}",
    State = ApiManagementSubscriptionState.Active
};

var subscriptionCollection = service.GetApiManagementSubscriptions();
var subOperation = await subscriptionCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-subscription",
    subscriptionData);

ApiManagementSubscriptionResource subscription = subOperation.Value;

// Get subscription keys
var keys = await subscription.GetSecretsAsync();
Console.WriteLine($"Primary Key: {keys.Value.PrimaryKey}");
```

### 5. Set API Policy

```csharp
var policyXml = @"
<policies>
  <inbound>
    <rate-limit calls=""100"" renewal-period=""60"" />
    <set-header name=""X-Custom-Header"" exists-action=""override"">
      <value>CustomValue</value>
    </set-header>
    <base />
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>";

var policyData = new PolicyContractData
{
    Value = policyXml,
    Format = PolicyContentFormat.Xml
};

await api.GetApiPolicy().CreateOrUpdateAsync(
    WaitUntil.Completed,
    policyData);
```

### 6. Backup and Restore

```csharp
// Backup
var backupParams = new ApiManagementServiceBackupRestoreContent(
    storageAccount: "mystorageaccount",
    containerName: "apim-backups",
    backupName: "backup-2024-01-15")
{
    AccessType = StorageAccountAccessType.SystemAssignedManagedIdentity
};

await service.BackupAsync(WaitUntil.Completed, backupParams);

// Restore
await service.RestoreAsync(WaitUntil.Completed, backupParams);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `ApiManagementServiceResource` | Represents an APIM service instance |
| `ApiManagementServiceCollection` | Collection for service CRUD |
| `ApiResource` | Represents an API |
| `ApiManagementProductResource` | Represents a product |
| `ApiManagementSubscriptionResource` | Represents a subscription |
| `ApiManagementPolicyResource` | Service-level policy |
| `ApiPolicyResource` | API-level policy |
| `ApiManagementUserResource` | Represents a user |
| `ApiManagementGroupResource` | Represents a group |
| `ApiManagementBackendResource` | Represents a backend service |
| `ApiManagementGatewayResource` | Represents a self-hosted gateway |

## SKU Types

| SKU | Purpose | Capacity |
|-----|---------|----------|
| `Developer` | Development/testing (no SLA) | 1 |
| `Basic` | Entry-level production | 1-2 |
| `Standard` | Medium workloads | 1-4 |
| `Premium` | High availability, multi-region | 1-12 per region |
| `Consumption` | Serverless, pay-per-call | N/A |

## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** for long operations like service creation (30+ min)
3. **Always use `DefaultAzureCredential`** — never hardcode keys
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Navigate hierarchy** via `Get*` methods (e.g., `service.GetApis()`)
7. **Policy format** — Policies are XML; use the `rawxml` format when the content contains characters that would otherwise need escaping
8. **Service creation** — Developer SKU is fastest for testing (~15-30 min)
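
Practices 1 and 2 differ only in when control returns. A minimal sketch of the `WaitUntil.Started` pattern, assuming `serviceCollection` and `serviceData` from the earlier service-creation example:

```csharp
// Start the long-running creation without blocking (service creation can take 30+ min).
var operation = await serviceCollection.CreateOrUpdateAsync(
    WaitUntil.Started, "my-apim-service", serviceData);

// ...do other work, then poll for the final result when it is actually needed.
Azure.Response<ApiManagementServiceResource> response =
    await operation.WaitForCompletionAsync();
ApiManagementServiceResource service = response.Value;
```

`WaitForCompletionAsync` polls the operation until it reaches a terminal state, so the blocking cost is paid only at the point where the resource is required.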

## Error Handling

```csharp
using Azure;

try
{
    var operation = await serviceCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, serviceName, serviceData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Service already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Bad request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Reference Files

| File | When to Read |
|------|--------------|
| [references/service-management.md](references/service-management.md) | Service CRUD, SKUs, networking, backup/restore |
| [references/apis-operations.md](references/apis-operations.md) | APIs, operations, schemas, versioning |
| [references/products-subscriptions.md](references/products-subscriptions.md) | Products, subscriptions, access control |
| [references/policies.md](references/policies.md) | Policy XML patterns, scopes, common policies |

## Related Resources

| Resource | Purpose |
|----------|---------|
| [API Management Documentation](https://learn.microsoft.com/en-us/azure/api-management/) | Official Azure docs |
| [Policy Reference](https://learn.microsoft.com/en-us/azure/api-management/api-management-policies) | Complete policy reference |
| [SDK Reference](https://learn.microsoft.com/en-us/dotnet/api/azure.resourcemanager.apimanagement) | .NET API reference |
skills/official/microsoft/dotnet/m365/m365-agents/SKILL.md

---
name: m365-agents-dotnet
description: |
  Microsoft 365 Agents SDK for .NET. Build multichannel agents for Teams/M365/Copilot Studio with ASP.NET Core hosting, AgentApplication routing, and MSAL-based auth. Triggers: "Microsoft 365 Agents SDK", "Microsoft.Agents", "AddAgentApplicationOptions", "AgentApplication", "AddAgentAspNetAuthentication", "Copilot Studio client", "IAgentHttpAdapter".
package: Microsoft.Agents.Hosting.AspNetCore, Microsoft.Agents.Authentication.Msal, Microsoft.Agents.CopilotStudio.Client
---

# Microsoft 365 Agents SDK (.NET)

## Overview
Build enterprise agents for Microsoft 365, Teams, and Copilot Studio using the Microsoft.Agents SDK with ASP.NET Core hosting, agent routing, and MSAL-based authentication.

## Before implementation
- Use the microsoft-docs MCP to verify the latest APIs for AddAgent, AgentApplication, and authentication options.
- Confirm package versions in NuGet for the Microsoft.Agents.* packages you plan to use.

## Installation

```bash
dotnet add package Microsoft.Agents.Hosting.AspNetCore
dotnet add package Microsoft.Agents.Authentication.Msal
dotnet add package Microsoft.Agents.Storage
dotnet add package Microsoft.Agents.CopilotStudio.Client
dotnet add package Microsoft.Identity.Client.Extensions.Msal
```

## Configuration (appsettings.json)

```json
{
  "TokenValidation": {
    "Enabled": true,
    "Audiences": [
      "{{ClientId}}"
    ],
    "TenantId": "{{TenantId}}"
  },
  "AgentApplication": {
    "StartTypingTimer": false,
    "RemoveRecipientMention": false,
    "NormalizeMentions": false
  },
  "Connections": {
    "ServiceConnection": {
      "Settings": {
        "AuthType": "ClientSecret",
        "ClientId": "{{ClientId}}",
        "ClientSecret": "{{ClientSecret}}",
        "AuthorityEndpoint": "https://login.microsoftonline.com/{{TenantId}}",
        "Scopes": [
          "https://api.botframework.com/.default"
        ]
      }
    }
  },
  "ConnectionsMap": [
    {
      "ServiceUrl": "*",
      "Connection": "ServiceConnection"
    }
  ],
  "CopilotStudioClientSettings": {
    "DirectConnectUrl": "",
    "EnvironmentId": "",
    "SchemaName": "",
    "TenantId": "",
    "AppClientId": "",
    "AppClientSecret": ""
  }
}
```

## Core Workflow: ASP.NET Core agent host

```csharp
using Microsoft.Agents.Builder;
using Microsoft.Agents.Hosting.AspNetCore;
using Microsoft.Agents.Storage;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddHttpClient();
builder.AddAgentApplicationOptions();
builder.AddAgent<MyAgent>();
builder.Services.AddSingleton<IStorage, MemoryStorage>();

builder.Services.AddControllers();
builder.Services.AddAgentAspNetAuthentication(builder.Configuration);

WebApplication app = builder.Build();

app.UseAuthentication();
app.UseAuthorization();

app.MapGet("/", () => "Microsoft Agents SDK Sample");

var incomingRoute = app.MapPost("/api/messages",
    async (HttpRequest request, HttpResponse response, IAgentHttpAdapter adapter, IAgent agent, CancellationToken ct) =>
    {
        await adapter.ProcessAsync(request, response, agent, ct);
    });

if (!app.Environment.IsDevelopment())
{
    incomingRoute.RequireAuthorization();
}
else
{
    app.Urls.Add("http://localhost:3978");
}

app.Run();
```

## AgentApplication routing

```csharp
using Microsoft.Agents.Builder;
using Microsoft.Agents.Builder.App;
using Microsoft.Agents.Builder.State;
using Microsoft.Agents.Core.Models;
using System;
using System.Threading;
using System.Threading.Tasks;

public sealed class MyAgent : AgentApplication
{
    public MyAgent(AgentApplicationOptions options) : base(options)
    {
        OnConversationUpdate(ConversationUpdateEvents.MembersAdded, WelcomeAsync);
        OnActivity(ActivityTypes.Message, OnMessageAsync, rank: RouteRank.Last);
        OnTurnError(OnTurnErrorAsync);
    }

    private static async Task WelcomeAsync(ITurnContext turnContext, ITurnState turnState, CancellationToken ct)
    {
        foreach (ChannelAccount member in turnContext.Activity.MembersAdded)
        {
            if (member.Id != turnContext.Activity.Recipient.Id)
            {
                await turnContext.SendActivityAsync(
                    MessageFactory.Text("Welcome to the agent."),
                    ct);
            }
        }
    }

    private static async Task OnMessageAsync(ITurnContext turnContext, ITurnState turnState, CancellationToken ct)
    {
        await turnContext.SendActivityAsync(
            MessageFactory.Text($"You said: {turnContext.Activity.Text}"),
            ct);
    }

    private static async Task OnTurnErrorAsync(
        ITurnContext turnContext,
        ITurnState turnState,
        Exception exception,
        CancellationToken ct)
    {
        await turnState.Conversation.DeleteStateAsync(turnContext, ct);

        var endOfConversation = Activity.CreateEndOfConversationActivity();
        endOfConversation.Code = EndOfConversationCodes.Error;
        endOfConversation.Text = exception.Message;
        await turnContext.SendActivityAsync(endOfConversation, ct);
    }
}
```

## Copilot Studio direct-to-engine client

### DelegatingHandler for token acquisition (interactive flow)

```csharp
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Agents.CopilotStudio.Client;
using Microsoft.Identity.Client;

internal sealed class AddTokenHandler : DelegatingHandler
{
    private readonly SampleConnectionSettings _settings;

    public AddTokenHandler(SampleConnectionSettings settings) : base(new HttpClientHandler())
    {
        _settings = settings;
    }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        if (request.Headers.Authorization is null)
        {
            string[] scopes = [CopilotClient.ScopeFromSettings(_settings)];

            IPublicClientApplication app = PublicClientApplicationBuilder
                .Create(_settings.AppClientId)
                .WithAuthority(AadAuthorityAudience.AzureAdMyOrg)
                .WithTenantId(_settings.TenantId)
                .WithRedirectUri("http://localhost")
                .Build();

            AuthenticationResult authResponse;
            try
            {
                var account = (await app.GetAccountsAsync()).FirstOrDefault();
                authResponse = await app.AcquireTokenSilent(scopes, account).ExecuteAsync(cancellationToken);
            }
            catch (MsalUiRequiredException)
            {
                authResponse = await app.AcquireTokenInteractive(scopes).ExecuteAsync(cancellationToken);
            }

            request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", authResponse.AccessToken);
        }

        return await base.SendAsync(request, cancellationToken);
    }
}
```

### Console host with CopilotClient

```csharp
using Microsoft.Agents.CopilotStudio.Client;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

HostApplicationBuilder builder = Host.CreateApplicationBuilder(args);

var settings = new SampleConnectionSettings(
    builder.Configuration.GetSection("CopilotStudioClientSettings"));

builder.Services.AddHttpClient("mcs").ConfigurePrimaryHttpMessageHandler(() =>
{
    return new AddTokenHandler(settings);
});

builder.Services
    .AddSingleton(settings)
    .AddTransient<CopilotClient>(sp =>
    {
        var logger = sp.GetRequiredService<ILoggerFactory>().CreateLogger<CopilotClient>();
        return new CopilotClient(settings, sp.GetRequiredService<IHttpClientFactory>(), logger, "mcs");
    });

IHost host = builder.Build();
var client = host.Services.GetRequiredService<CopilotClient>();

await foreach (var activity in client.StartConversationAsync(emitStartConversationEvent: true))
{
    Console.WriteLine(activity.Type);
}

await foreach (var activity in client.AskQuestionAsync("Hello!", null))
{
    Console.WriteLine(activity.Type);
}
```

## Best Practices

1. Use AgentApplication subclasses to centralize routing and error handling.
2. Use MemoryStorage only for development; use persisted storage in production.
3. Enable TokenValidation in production and require authorization on /api/messages.
4. Keep auth secrets in configuration providers (Key Vault, managed identity, env vars).
5. Reuse HttpClient from IHttpClientFactory and cache MSAL tokens.
6. Prefer async handlers and pass CancellationToken to SDK calls.
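
The token-caching half of practice 5 can be sketched with `Microsoft.Identity.Client.Extensions.Msal` (already in the install list above). The cache file name and location below are illustrative, not prescribed by the SDK:

```csharp
using Microsoft.Identity.Client;
using Microsoft.Identity.Client.Extensions.Msal;

// Persist the MSAL token cache to disk so AcquireTokenSilent can succeed
// across process restarts instead of prompting interactively each time.
// The file name "msal_cache.dat" is an arbitrary illustrative choice.
var storageProperties = new StorageCreationPropertiesBuilder(
        "msal_cache.dat",
        MsalCacheHelper.UserRootDirectory)
    .Build();

MsalCacheHelper cacheHelper = await MsalCacheHelper.CreateAsync(storageProperties);

// `app` is the IPublicClientApplication built in the DelegatingHandler example.
cacheHelper.RegisterCache(app.UserTokenCache);
```

Once the cache is registered, the silent-then-interactive flow in `AddTokenHandler` only falls back to the browser prompt when the cached token cannot be refreshed.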

## Reference Files

| File | Contents |
| --- | --- |
| [references/acceptance-criteria.md](references/acceptance-criteria.md) | Import paths, hosting pipeline, Copilot Studio client patterns, anti-patterns |

## Reference Links

| Resource | URL |
| --- | --- |
| Microsoft 365 Agents SDK | https://learn.microsoft.com/en-us/microsoft-365/agents-sdk/ |
| AddAgent API | https://learn.microsoft.com/en-us/dotnet/api/microsoft.agents.hosting.aspnetcore.servicecollectionextensions.addagent?view=m365-agents-sdk |
| AgentApplication API | https://learn.microsoft.com/en-us/dotnet/api/microsoft.agents.builder.app.agentapplication?view=m365-agents-sdk |
| Auth configuration options | https://learn.microsoft.com/en-us/microsoft-365/agents-sdk/microsoft-authentication-library-configuration-options |
| Copilot Studio integration | https://learn.microsoft.com/en-us/microsoft-365/agents-sdk/integrate-with-mcs |
| GitHub samples | https://github.com/microsoft/agents |
skills/official/microsoft/dotnet/messaging/eventgrid/SKILL.md

---
name: azure-eventgrid-dotnet
description: |
  Azure Event Grid SDK for .NET. Client library for publishing and consuming events with Azure Event Grid. Use for event-driven architectures, pub/sub messaging, CloudEvents, and EventGridEvents. Triggers: "Event Grid", "EventGridPublisherClient", "CloudEvent", "EventGridEvent", "publish events .NET", "event-driven", "pub/sub".
package: Azure.Messaging.EventGrid
---

# Azure.Messaging.EventGrid (.NET)

Client library for publishing events to Azure Event Grid topics, domains, and namespaces.

## Installation

```bash
# For topics and domains (push delivery)
dotnet add package Azure.Messaging.EventGrid

# For namespaces (pull delivery)
dotnet add package Azure.Messaging.EventGrid.Namespaces

# For CloudNative CloudEvents interop
dotnet add package Microsoft.Azure.Messaging.EventGrid.CloudNativeCloudEvents
```

**Current Version**: 4.28.0 (stable)

## Environment Variables

```bash
# Topic/Domain endpoint
EVENT_GRID_TOPIC_ENDPOINT=https://<topic-name>.<region>.eventgrid.azure.net/api/events
EVENT_GRID_TOPIC_KEY=<access-key>

# Namespace endpoint (for pull delivery)
EVENT_GRID_NAMESPACE_ENDPOINT=https://<namespace>.<region>.eventgrid.azure.net
EVENT_GRID_TOPIC_NAME=<topic-name>
EVENT_GRID_SUBSCRIPTION_NAME=<subscription-name>
```

## Client Hierarchy

```
Push Delivery (Topics/Domains)
└── EventGridPublisherClient
    ├── SendEventAsync(EventGridEvent)
    ├── SendEventsAsync(IEnumerable<EventGridEvent>)
    ├── SendEventAsync(CloudEvent)
    └── SendEventsAsync(IEnumerable<CloudEvent>)

Pull Delivery (Namespaces)
├── EventGridSenderClient
│   └── SendAsync(CloudEvent)
└── EventGridReceiverClient
    ├── ReceiveAsync()
    ├── AcknowledgeAsync()
    ├── ReleaseAsync()
    └── RejectAsync()
```
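
Of the receiver operations listed, `RejectAsync` is the one not demonstrated in the receive example; a hedged sketch, assuming it follows the same lock-token pattern as acknowledge/release (verify the result type names against your SDK version):

```csharp
// Reject events that can never be processed (poison messages) so they are not
// redelivered; depending on topic configuration they may be dead-lettered.
// `receiverClient` and `lockToken` follow the receive example in this document.
var rejectResult = await receiverClient.RejectAsync(new[] { lockToken });

// Each operation reports per-token failures rather than throwing for them.
foreach (var failure in rejectResult.FailedLockTokens)
{
    Console.WriteLine($"Reject failed: {failure.LockToken} - {failure.Error}");
}
```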

## Authentication

### API Key Authentication

```csharp
using Azure;
using Azure.Messaging.EventGrid;

EventGridPublisherClient client = new(
    new Uri("https://mytopic.eastus-1.eventgrid.azure.net/api/events"),
    new AzureKeyCredential("<access-key>"));
```

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.Messaging.EventGrid;

EventGridPublisherClient client = new(
    new Uri("https://mytopic.eastus-1.eventgrid.azure.net/api/events"),
    new DefaultAzureCredential());
```

### SAS Token Authentication

```csharp
string sasToken = EventGridPublisherClient.BuildSharedAccessSignature(
    new Uri(topicEndpoint),
    DateTimeOffset.UtcNow.AddHours(1),
    new AzureKeyCredential(topicKey));

var sasCredential = new AzureSasCredential(sasToken);
EventGridPublisherClient client = new(
    new Uri(topicEndpoint),
    sasCredential);
```

## Publishing Events

### EventGridEvent Schema

```csharp
EventGridPublisherClient client = new(
    new Uri(topicEndpoint),
    new AzureKeyCredential(topicKey));

// Single event
EventGridEvent egEvent = new(
    subject: "orders/12345",
    eventType: "Order.Created",
    dataVersion: "1.0",
    data: new { OrderId = "12345", Amount = 99.99 });

await client.SendEventAsync(egEvent);

// Batch of events
List<EventGridEvent> events = new()
{
    new EventGridEvent(
        subject: "orders/12345",
        eventType: "Order.Created",
        dataVersion: "1.0",
        data: new OrderData { OrderId = "12345", Amount = 99.99 }),
    new EventGridEvent(
        subject: "orders/12346",
        eventType: "Order.Created",
        dataVersion: "1.0",
        data: new OrderData { OrderId = "12346", Amount = 149.99 })
};

await client.SendEventsAsync(events);
```

### CloudEvent Schema

```csharp
using Azure.Messaging; // CloudEvent lives in Azure.Messaging

CloudEvent cloudEvent = new(
    source: "/orders/system",
    type: "Order.Created",
    data: new { OrderId = "12345", Amount = 99.99 });

cloudEvent.Subject = "orders/12345";
cloudEvent.Id = Guid.NewGuid().ToString();
cloudEvent.Time = DateTimeOffset.UtcNow;

await client.SendEventAsync(cloudEvent);

// Batch of CloudEvents
List<CloudEvent> cloudEvents = new()
{
    new CloudEvent("/orders", "Order.Created", new { OrderId = "1" }),
    new CloudEvent("/orders", "Order.Updated", new { OrderId = "2" })
};

await client.SendEventsAsync(cloudEvents);
```

### Publishing to Event Grid Domain

```csharp
// Events must specify the Topic property for domain routing
List<EventGridEvent> events = new()
{
    new EventGridEvent(
        subject: "orders/12345",
        eventType: "Order.Created",
        dataVersion: "1.0",
        data: new { OrderId = "12345" })
    {
        Topic = "orders-topic" // Domain topic name
    },
    new EventGridEvent(
        subject: "inventory/item-1",
        eventType: "Inventory.Updated",
        dataVersion: "1.0",
        data: new { ItemId = "item-1" })
    {
        Topic = "inventory-topic"
    }
};

await client.SendEventsAsync(events);
```

### Custom Serialization

```csharp
using System.Text.Json;
using Azure.Core.Serialization; // JsonObjectSerializer

var serializerOptions = new JsonSerializerOptions
{
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase
};

var customSerializer = new JsonObjectSerializer(serializerOptions);

EventGridEvent egEvent = new(
    subject: "orders/12345",
    eventType: "Order.Created",
    dataVersion: "1.0",
    data: customSerializer.Serialize(new OrderData { OrderId = "12345" }));

await client.SendEventAsync(egEvent);
```
|
||||
|
||||
## Pull Delivery (Namespaces)

### Send Events to Namespace Topic

```csharp
using Azure;
using Azure.Messaging;
using Azure.Messaging.EventGrid.Namespaces;

var senderClient = new EventGridSenderClient(
    new Uri(namespaceEndpoint),
    topicName,
    new AzureKeyCredential(topicKey));

// Send single event
CloudEvent cloudEvent = new("employee_source", "Employee.Created",
    new { Name = "John", Age = 30 });
await senderClient.SendAsync(cloudEvent);

// Send batch
await senderClient.SendAsync(new[]
{
    new CloudEvent("source", "type", new { Name = "Alice" }),
    new CloudEvent("source", "type", new { Name = "Bob" })
});
```

### Receive and Process Events

```csharp
var receiverClient = new EventGridReceiverClient(
    new Uri(namespaceEndpoint),
    topicName,
    subscriptionName,
    new AzureKeyCredential(topicKey));

// Receive events
ReceiveResult result = await receiverClient.ReceiveAsync(maxEvents: 10);

List<string> lockTokensToAck = new();
List<string> lockTokensToRelease = new();

foreach (ReceiveDetails detail in result.Details)
{
    CloudEvent cloudEvent = detail.Event;
    string lockToken = detail.BrokerProperties.LockToken;

    try
    {
        // Process the event
        Console.WriteLine($"Event: {cloudEvent.Type}, Data: {cloudEvent.Data}");
        lockTokensToAck.Add(lockToken);
    }
    catch (Exception)
    {
        // Release for retry
        lockTokensToRelease.Add(lockToken);
    }
}

// Acknowledge successfully processed events
if (lockTokensToAck.Count > 0)
{
    await receiverClient.AcknowledgeAsync(lockTokensToAck);
}

// Release events for retry
if (lockTokensToRelease.Count > 0)
{
    await receiverClient.ReleaseAsync(lockTokensToRelease);
}
```

### Reject Events (Dead Letter)

```csharp
// Reject events that cannot be processed
await receiverClient.RejectAsync(new[] { lockToken });
```

## Consuming Events (Azure Functions)

### EventGridEvent Trigger

```csharp
using Azure.Messaging.EventGrid;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;

public static class EventGridFunction
{
    [FunctionName("ProcessEventGridEvent")]
    public static void Run(
        [EventGridTrigger] EventGridEvent eventGridEvent,
        ILogger log)
    {
        log.LogInformation($"Event Type: {eventGridEvent.EventType}");
        log.LogInformation($"Subject: {eventGridEvent.Subject}");
        log.LogInformation($"Data: {eventGridEvent.Data}");
    }
}
```

### CloudEvent Trigger

```csharp
using Azure.Messaging;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class CloudEventFunction
{
    [Function("ProcessCloudEvent")]
    public void Run(
        [EventGridTrigger] CloudEvent cloudEvent,
        FunctionContext context)
    {
        var logger = context.GetLogger("ProcessCloudEvent");
        logger.LogInformation($"Event Type: {cloudEvent.Type}");
        logger.LogInformation($"Source: {cloudEvent.Source}");
        logger.LogInformation($"Data: {cloudEvent.Data}");
    }
}
```

## Parsing Events

### Parse EventGridEvent

```csharp
using Azure.Messaging.EventGrid.SystemEvents;

// From JSON string
string json = "..."; // Event Grid webhook payload
EventGridEvent[] events = EventGridEvent.ParseMany(BinaryData.FromString(json));

foreach (EventGridEvent egEvent in events)
{
    if (egEvent.TryGetSystemEventData(out object systemEvent))
    {
        // Handle system event
        switch (systemEvent)
        {
            case StorageBlobCreatedEventData blobCreated:
                Console.WriteLine($"Blob created: {blobCreated.Url}");
                break;
        }
    }
    else
    {
        // Handle custom event
        var customData = egEvent.Data.ToObjectFromJson<MyCustomData>();
    }
}
```

### Parse CloudEvent

```csharp
CloudEvent[] cloudEvents = CloudEvent.ParseMany(BinaryData.FromString(json));

foreach (CloudEvent cloudEvent in cloudEvents)
{
    var data = cloudEvent.Data.ToObjectFromJson<MyEventData>();
    Console.WriteLine($"Type: {cloudEvent.Type}, Data: {data}");
}
```

## System Events

```csharp
// Common system event types
using Azure.Messaging.EventGrid.SystemEvents;

// Storage events
StorageBlobCreatedEventData blobCreated;
StorageBlobDeletedEventData blobDeleted;

// Resource events
ResourceWriteSuccessEventData resourceCreated;
ResourceDeleteSuccessEventData resourceDeleted;

// App Service events
WebAppUpdatedEventData webAppUpdated;

// Container Registry events
ContainerRegistryImagePushedEventData imagePushed;

// IoT Hub events
IotHubDeviceCreatedEventData deviceCreated;
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `EventGridPublisherClient` | Publish to topics/domains |
| `EventGridSenderClient` | Send to namespace topics |
| `EventGridReceiverClient` | Receive from namespace subscriptions |
| `EventGridEvent` | Event Grid native schema |
| `CloudEvent` | CloudEvents 1.0 schema |
| `ReceiveResult` | Pull delivery response |
| `ReceiveDetails` | Event with broker properties |
| `BrokerProperties` | Lock token, delivery count |

## Event Schemas Comparison

| Feature | EventGridEvent | CloudEvent |
|---------|----------------|------------|
| Standard | Azure-specific | CNCF standard |
| Required fields | subject, eventType, dataVersion, data | source, type |
| Extensibility | Limited | Extension attributes |
| Interoperability | Azure only | Cross-platform |

## Best Practices

1. **Use CloudEvents** — Prefer CloudEvents for new implementations (industry standard)
2. **Batch events** — Send multiple events in one call for efficiency
3. **Use Entra ID** — Prefer managed identity over access keys
4. **Idempotent handlers** — Events may be delivered more than once
5. **Set event TTL** — Configure time-to-live for namespace events
6. **Handle partial failures** — Acknowledge/release events individually
7. **Use dead-letter** — Configure dead-letter for failed events
8. **Validate schemas** — Validate event data before processing

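Because Event Grid delivers at-least-once, the "idempotent handlers" practice usually means tracking already-seen event IDs. A minimal sketch, assuming an in-memory `HashSet` is acceptable for a single-instance consumer (a shared store such as a cache would be needed across instances); `IdempotentHandler` is a hypothetical helper, not an SDK type:

```csharp
using System;
using System.Collections.Generic;

// Tracks event IDs already handled so redelivered events become no-ops.
// In-memory only; swap for a distributed store when scaling out (assumption).
public class IdempotentHandler
{
    private readonly HashSet<string> _seenEventIds = new();

    // Returns true if the event was processed, false if it was a duplicate.
    public bool Handle(string eventId, Action process)
    {
        if (!_seenEventIds.Add(eventId))
        {
            return false; // Already processed; Event Grid delivered it again
        }

        process();
        return true;
    }
}
```

In a handler this would wrap the processing call, e.g. `handler.Handle(egEvent.Id, () => ProcessOrderCreated(egEvent));` where `ProcessOrderCreated` stands in for your own logic.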
## Error Handling

```csharp
using Azure;

try
{
    await client.SendEventAsync(cloudEvent);
}
catch (RequestFailedException ex) when (ex.Status == 401)
{
    Console.WriteLine("Authentication failed - check credentials");
}
catch (RequestFailedException ex) when (ex.Status == 403)
{
    Console.WriteLine("Authorization failed - check RBAC permissions");
}
catch (RequestFailedException ex) when (ex.Status == 413)
{
    Console.WriteLine("Payload too large - max 1MB per event, 1MB total batch");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Event Grid error: {ex.Status} - {ex.Message}");
}
```

## Failover Pattern

```csharp
try
{
    var primaryClient = new EventGridPublisherClient(primaryUri, primaryKey);
    await primaryClient.SendEventsAsync(events);
}
catch (RequestFailedException)
{
    // Failover to secondary region
    var secondaryClient = new EventGridPublisherClient(secondaryUri, secondaryKey);
    await secondaryClient.SendEventsAsync(events);
}
```

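To stay under the 1 MB payload limits behind the 413 error above, a batch can be pre-split by serialized size before sending. A minimal sketch of such a splitter; this is a plain helper over serialized payload strings, not an SDK API, and it approximates event size by string length:

```csharp
using System;
using System.Collections.Generic;

public static class BatchSplitter
{
    // Splits payloads into batches whose combined size stays within maxBatchBytes.
    // Throws if a single payload alone exceeds the budget (it can never be sent).
    public static List<List<string>> Split(IEnumerable<string> payloads, int maxBatchBytes)
    {
        var batches = new List<List<string>>();
        var current = new List<string>();
        int currentSize = 0;

        foreach (var payload in payloads)
        {
            if (payload.Length > maxBatchBytes)
            {
                throw new ArgumentException("Single event exceeds the batch budget");
            }

            // Start a new batch when adding this payload would overflow the budget
            if (currentSize + payload.Length > maxBatchBytes && current.Count > 0)
            {
                batches.Add(current);
                current = new List<string>();
                currentSize = 0;
            }

            current.Add(payload);
            currentSize += payload.Length;
        }

        if (current.Count > 0)
        {
            batches.Add(current);
        }
        return batches;
    }
}
```

Each resulting batch can then go out through one `SendEventsAsync` call.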
## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Messaging.EventGrid` | Topics/Domains (this SDK) | `dotnet add package Azure.Messaging.EventGrid` |
| `Azure.Messaging.EventGrid.Namespaces` | Pull delivery | `dotnet add package Azure.Messaging.EventGrid.Namespaces` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |
| `Microsoft.Azure.WebJobs.Extensions.EventGrid` | Azure Functions trigger | `dotnet add package Microsoft.Azure.WebJobs.Extensions.EventGrid` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.Messaging.EventGrid |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.messaging.eventgrid |
| Quickstart | https://learn.microsoft.com/azure/event-grid/custom-event-quickstart |
| Pull Delivery | https://learn.microsoft.com/azure/event-grid/pull-delivery-overview |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/eventgrid/Azure.Messaging.EventGrid |

skills/official/microsoft/dotnet/messaging/eventhubs/SKILL.md (new file, 362 lines)
@@ -0,0 +1,362 @@

---
name: azure-eventhub-dotnet
description: |
  Azure Event Hubs SDK for .NET. Use for high-throughput event streaming: sending events (EventHubProducerClient, EventHubBufferedProducerClient), receiving events (EventProcessorClient with checkpointing), partition management, and real-time data ingestion. Triggers: "Event Hubs", "event streaming", "EventHubProducerClient", "EventProcessorClient", "send events", "receive events", "checkpointing", "partition".
package: Azure.Messaging.EventHubs
---

# Azure.Messaging.EventHubs (.NET)

High-throughput event streaming SDK for sending and receiving events via Azure Event Hubs.

## Installation

```bash
# Core package (sending and simple receiving)
dotnet add package Azure.Messaging.EventHubs

# Processor package (production receiving with checkpointing)
dotnet add package Azure.Messaging.EventHubs.Processor

# Authentication
dotnet add package Azure.Identity

# For checkpointing (required by EventProcessorClient)
dotnet add package Azure.Storage.Blobs
```

**Current Versions**: Azure.Messaging.EventHubs v5.12.2, Azure.Messaging.EventHubs.Processor v5.12.2

## Environment Variables

```bash
EVENTHUB_FULLY_QUALIFIED_NAMESPACE=<namespace>.servicebus.windows.net
EVENTHUB_NAME=<event-hub-name>

# For checkpointing (EventProcessorClient)
BLOB_STORAGE_CONNECTION_STRING=<storage-connection-string>
BLOB_CONTAINER_NAME=<checkpoint-container>

# Alternative: Connection string auth (not recommended for production)
EVENTHUB_CONNECTION_STRING=Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...
```

## Authentication

```csharp
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

// Always use DefaultAzureCredential for production
var credential = new DefaultAzureCredential();

var fullyQualifiedNamespace = Environment.GetEnvironmentVariable("EVENTHUB_FULLY_QUALIFIED_NAMESPACE");
var eventHubName = Environment.GetEnvironmentVariable("EVENTHUB_NAME");

var producer = new EventHubProducerClient(
    fullyQualifiedNamespace,
    eventHubName,
    credential);
```

**Required RBAC Roles**:
- **Sending**: `Azure Event Hubs Data Sender`
- **Receiving**: `Azure Event Hubs Data Receiver`
- **Both**: `Azure Event Hubs Data Owner`

## Client Types

| Client | Purpose | When to Use |
|--------|---------|-------------|
| `EventHubProducerClient` | Send events immediately in batches | Real-time sending, full control over batching |
| `EventHubBufferedProducerClient` | Automatic batching with background sending | High-volume, fire-and-forget scenarios |
| `EventHubConsumerClient` | Simple event reading | Prototyping only, NOT for production |
| `EventProcessorClient` | Production event processing | **Always use this for receiving in production** |

## Core Workflow

### 1. Send Events (Batch)

```csharp
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

await using var producer = new EventHubProducerClient(
    fullyQualifiedNamespace,
    eventHubName,
    new DefaultAzureCredential());

// Create a batch (respects size limits automatically).
// Not a `using` declaration: the variable is reassigned when the batch fills.
EventDataBatch batch = await producer.CreateBatchAsync();

// Add events to batch
var events = new[]
{
    new EventData(BinaryData.FromString("{\"id\": 1, \"message\": \"Hello\"}")),
    new EventData(BinaryData.FromString("{\"id\": 2, \"message\": \"World\"}"))
};

foreach (var eventData in events)
{
    if (!batch.TryAdd(eventData))
    {
        // Batch is full - send it and create a new one
        await producer.SendAsync(batch);
        batch.Dispose();
        batch = await producer.CreateBatchAsync();

        if (!batch.TryAdd(eventData))
        {
            throw new Exception("Event too large for empty batch");
        }
    }
}

// Send remaining events
if (batch.Count > 0)
{
    await producer.SendAsync(batch);
}
batch.Dispose();
```

### 2. Send Events (Buffered - High Volume)

```csharp
using Azure.Messaging.EventHubs.Producer;

var options = new EventHubBufferedProducerClientOptions
{
    MaximumWaitTime = TimeSpan.FromSeconds(1)
};

await using var producer = new EventHubBufferedProducerClient(
    fullyQualifiedNamespace,
    eventHubName,
    new DefaultAzureCredential(),
    options);

// Handle send success/failure
producer.SendEventBatchSucceededAsync += args =>
{
    Console.WriteLine($"Batch sent: {args.EventBatch.Count} events");
    return Task.CompletedTask;
};

producer.SendEventBatchFailedAsync += args =>
{
    Console.WriteLine($"Batch failed: {args.Exception.Message}");
    return Task.CompletedTask;
};

// Enqueue events (sent automatically in background)
for (int i = 0; i < 1000; i++)
{
    await producer.EnqueueEventAsync(new EventData($"Event {i}"));
}

// Flush remaining events before disposing
await producer.FlushAsync();
```

### 3. Receive Events (Production - EventProcessorClient)

```csharp
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Processor;
using Azure.Storage.Blobs;

// Blob container for checkpointing
var blobClient = new BlobContainerClient(
    Environment.GetEnvironmentVariable("BLOB_STORAGE_CONNECTION_STRING"),
    Environment.GetEnvironmentVariable("BLOB_CONTAINER_NAME"));

await blobClient.CreateIfNotExistsAsync();

// Create processor
var processor = new EventProcessorClient(
    blobClient,
    EventHubConsumerClient.DefaultConsumerGroup,
    fullyQualifiedNamespace,
    eventHubName,
    new DefaultAzureCredential());

// Handle events
processor.ProcessEventAsync += async args =>
{
    Console.WriteLine($"Partition: {args.Partition.PartitionId}");
    Console.WriteLine($"Data: {args.Data.EventBody}");

    // Checkpoint after processing (or batch checkpoints)
    await args.UpdateCheckpointAsync();
};

// Handle errors
processor.ProcessErrorAsync += args =>
{
    Console.WriteLine($"Error: {args.Exception.Message}");
    Console.WriteLine($"Partition: {args.PartitionId}");
    return Task.CompletedTask;
};

// Start processing
await processor.StartProcessingAsync();

// Run until cancelled
await Task.Delay(Timeout.Infinite, cancellationToken);

// Stop gracefully
await processor.StopProcessingAsync();
```

### 4. Partition Operations

```csharp
// Get partition IDs
string[] partitionIds = await producer.GetPartitionIdsAsync();

// Send to specific partition (use sparingly)
var options = new SendEventOptions
{
    PartitionId = "0"
};
await producer.SendAsync(events, options);

// Use partition key (recommended for ordering)
var batchOptions = new CreateBatchOptions
{
    PartitionKey = "customer-123" // Events with same key go to same partition
};
using var batch = await producer.CreateBatchAsync(batchOptions);
```

## EventPosition Options

Control where to start reading:

```csharp
// Start from beginning
EventPosition.Earliest

// Start from end (new events only)
EventPosition.Latest

// Start from specific offset
EventPosition.FromOffset(12345)

// Start from specific sequence number
EventPosition.FromSequenceNumber(100)

// Start from specific time
EventPosition.FromEnqueuedTime(DateTimeOffset.UtcNow.AddHours(-1))
```

## ASP.NET Core Integration

```csharp
// Program.cs
using Azure.Identity;
using Azure.Messaging.EventHubs.Producer;
using Microsoft.Extensions.Azure;

builder.Services.AddAzureClients(clientBuilder =>
{
    clientBuilder.AddEventHubProducerClient(
        builder.Configuration["EventHub:FullyQualifiedNamespace"],
        builder.Configuration["EventHub:Name"]);

    clientBuilder.UseCredential(new DefaultAzureCredential());
});

// Inject in controller/service
public class EventService
{
    private readonly EventHubProducerClient _producer;

    public EventService(EventHubProducerClient producer)
    {
        _producer = producer;
    }

    public async Task SendAsync(string message)
    {
        using var batch = await _producer.CreateBatchAsync();
        batch.TryAdd(new EventData(message));
        await _producer.SendAsync(batch);
    }
}
```

## Best Practices

1. **Use `EventProcessorClient` for receiving** — Never use `EventHubConsumerClient` in production
2. **Checkpoint strategically** — After N events or time interval, not every event
3. **Use partition keys** — For ordering guarantees within a partition
4. **Reuse clients** — Create once, use as singleton (thread-safe)
5. **Use `await using`** — Ensures proper disposal
6. **Handle `ProcessErrorAsync`** — Always register error handler
7. **Batch events** — Use `CreateBatchAsync()` to respect size limits
8. **Use buffered producer** — For high-volume scenarios with automatic batching

## Error Handling

```csharp
using Azure.Messaging.EventHubs;

try
{
    await producer.SendAsync(batch);
}
catch (EventHubsException ex) when (ex.Reason == EventHubsException.FailureReason.ServiceBusy)
{
    // Retry with backoff
    await Task.Delay(TimeSpan.FromSeconds(5));
}
catch (EventHubsException ex) when (ex.IsTransient)
{
    // Transient error - safe to retry
    Console.WriteLine($"Transient error: {ex.Message}");
}
catch (EventHubsException ex)
{
    // Non-transient error
    Console.WriteLine($"Error: {ex.Reason} - {ex.Message}");
}
```

## Checkpointing Strategies

| Strategy | When to Use |
|----------|-------------|
| Every event | Low volume, critical data |
| Every N events | Balanced throughput/reliability |
| Time-based | Consistent checkpoint intervals |
| Batch completion | After processing a logical batch |

```csharp
// Checkpoint every 100 events
private int _eventCount = 0;

processor.ProcessEventAsync += async args =>
{
    // Process event...

    _eventCount++;
    if (_eventCount >= 100)
    {
        await args.UpdateCheckpointAsync();
        _eventCount = 0;
    }
};
```

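The time-based row of the table can be handled the same way, with an interval gate instead of a counter. A minimal sketch; `CheckpointTimer` is a hypothetical helper, not an SDK type, and the 30-second interval is an assumption:

```csharp
using System;

// Decides whether enough time has elapsed since the last checkpoint.
public class CheckpointTimer
{
    private readonly TimeSpan _interval;
    private DateTimeOffset _lastCheckpoint;

    public CheckpointTimer(TimeSpan interval)
    {
        _interval = interval;
        _lastCheckpoint = DateTimeOffset.UtcNow;
    }

    // Returns true (and resets the clock) once per interval.
    public bool ShouldCheckpoint(DateTimeOffset now)
    {
        if (now - _lastCheckpoint < _interval)
        {
            return false;
        }
        _lastCheckpoint = now;
        return true;
    }
}
```

Inside `ProcessEventAsync` this would guard the checkpoint call: `if (timer.ShouldCheckpoint(DateTimeOffset.UtcNow)) await args.UpdateCheckpointAsync();`.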
## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Messaging.EventHubs` | Core sending/receiving | `dotnet add package Azure.Messaging.EventHubs` |
| `Azure.Messaging.EventHubs.Processor` | Production processing | `dotnet add package Azure.Messaging.EventHubs.Processor` |
| `Azure.ResourceManager.EventHubs` | Management plane (create hubs) | `dotnet add package Azure.ResourceManager.EventHubs` |
| `Microsoft.Azure.WebJobs.Extensions.EventHubs` | Azure Functions binding | `dotnet add package Microsoft.Azure.WebJobs.Extensions.EventHubs` |

skills/official/microsoft/dotnet/messaging/servicebus/SKILL.md (new file, 333 lines)
@@ -0,0 +1,333 @@

---
name: azure-servicebus-dotnet
description: |
  Azure Service Bus SDK for .NET. Enterprise messaging with queues, topics, subscriptions, and sessions. Use for reliable message delivery, pub/sub patterns, dead letter handling, and background processing. Triggers: "Service Bus", "ServiceBusClient", "ServiceBusSender", "ServiceBusReceiver", "ServiceBusProcessor", "message queue", "pub/sub .NET", "dead letter queue".
package: Azure.Messaging.ServiceBus
---

# Azure.Messaging.ServiceBus (.NET)

Enterprise messaging SDK for reliable message delivery with queues, topics, subscriptions, and sessions.

## Installation

```bash
dotnet add package Azure.Messaging.ServiceBus
dotnet add package Azure.Identity
```

**Current Version**: v7.20.1 (stable)

## Environment Variables

```bash
AZURE_SERVICEBUS_FULLY_QUALIFIED_NAMESPACE=<namespace>.servicebus.windows.net
# Or connection string (less secure)
AZURE_SERVICEBUS_CONNECTION_STRING=Endpoint=sb://...
```

## Authentication

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.Messaging.ServiceBus;

string fullyQualifiedNamespace = "<namespace>.servicebus.windows.net";
await using ServiceBusClient client = new(fullyQualifiedNamespace, new DefaultAzureCredential());
```

### Connection String

```csharp
string connectionString = "<connection_string>";
await using ServiceBusClient client = new(connectionString);
```

### ASP.NET Core Dependency Injection

```csharp
using Microsoft.Extensions.Azure;

services.AddAzureClients(builder =>
{
    builder.AddServiceBusClientWithNamespace("<namespace>.servicebus.windows.net");
    builder.UseCredential(new DefaultAzureCredential());
});
```

## Client Hierarchy

```
ServiceBusClient
├── CreateSender(queueOrTopicName) → ServiceBusSender
├── CreateReceiver(queueName) → ServiceBusReceiver
├── CreateReceiver(topicName, subName) → ServiceBusReceiver
├── AcceptNextSessionAsync(queueName) → ServiceBusSessionReceiver
├── CreateProcessor(queueName) → ServiceBusProcessor
└── CreateSessionProcessor(queueName) → ServiceBusSessionProcessor

ServiceBusAdministrationClient (separate client for CRUD)
```

## Core Workflows

### 1. Send Messages

```csharp
await using ServiceBusClient client = new(fullyQualifiedNamespace, new DefaultAzureCredential());
ServiceBusSender sender = client.CreateSender("my-queue");

// Single message
ServiceBusMessage message = new("Hello world!");
await sender.SendMessageAsync(message);

// Safe batching (recommended)
using ServiceBusMessageBatch batch = await sender.CreateMessageBatchAsync();
if (batch.TryAddMessage(new ServiceBusMessage("Message 1")))
{
    // Message added successfully
}
if (batch.TryAddMessage(new ServiceBusMessage("Message 2")))
{
    // Message added successfully
}
await sender.SendMessagesAsync(batch);
```

### 2. Receive Messages

```csharp
ServiceBusReceiver receiver = client.CreateReceiver("my-queue");

// Single message
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();
string body = message.Body.ToString();
Console.WriteLine(body);

// Complete the message (removes from queue)
await receiver.CompleteMessageAsync(message);

// Batch receive
IReadOnlyList<ServiceBusReceivedMessage> messages = await receiver.ReceiveMessagesAsync(maxMessages: 10);
foreach (var msg in messages)
{
    Console.WriteLine(msg.Body.ToString());
    await receiver.CompleteMessageAsync(msg);
}
```

### 3. Message Settlement

```csharp
// Complete - removes message from queue
await receiver.CompleteMessageAsync(message);

// Abandon - releases lock, message can be received again
await receiver.AbandonMessageAsync(message);

// Defer - prevents normal receive, use ReceiveDeferredMessageAsync
await receiver.DeferMessageAsync(message);

// Dead Letter - moves to dead letter subqueue
await receiver.DeadLetterMessageAsync(message, "InvalidFormat", "Message body was not valid JSON");
```

### 4. Background Processing with Processor

```csharp
ServiceBusProcessor processor = client.CreateProcessor("my-queue", new ServiceBusProcessorOptions
{
    AutoCompleteMessages = false,
    MaxConcurrentCalls = 2
});

processor.ProcessMessageAsync += async (args) =>
{
    try
    {
        string body = args.Message.Body.ToString();
        Console.WriteLine($"Received: {body}");
        await args.CompleteMessageAsync(args.Message);
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Error processing: {ex.Message}");
        await args.AbandonMessageAsync(args.Message);
    }
};

processor.ProcessErrorAsync += (args) =>
{
    Console.WriteLine($"Error source: {args.ErrorSource}");
    Console.WriteLine($"Entity: {args.EntityPath}");
    Console.WriteLine($"Exception: {args.Exception}");
    return Task.CompletedTask;
};

await processor.StartProcessingAsync();
// ... application runs
await processor.StopProcessingAsync();
```

### 5. Sessions (Ordered Processing)

```csharp
// Send session message
ServiceBusMessage message = new("Hello")
{
    SessionId = "order-123"
};
await sender.SendMessageAsync(message);

// Receive from next available session
ServiceBusSessionReceiver receiver = await client.AcceptNextSessionAsync("my-queue");

// Or receive from a specific session
ServiceBusSessionReceiver sessionReceiver = await client.AcceptSessionAsync("my-queue", "order-123");

// Session state management
await receiver.SetSessionStateAsync(new BinaryData("processing"));
BinaryData state = await receiver.GetSessionStateAsync();

// Renew session lock
await receiver.RenewSessionLockAsync();
```

### 6. Dead Letter Queue

```csharp
// Receive from dead letter queue
ServiceBusReceiver dlqReceiver = client.CreateReceiver("my-queue", new ServiceBusReceiverOptions
{
    SubQueue = SubQueue.DeadLetter
});

ServiceBusReceivedMessage dlqMessage = await dlqReceiver.ReceiveMessageAsync();

// Access dead letter metadata
string reason = dlqMessage.DeadLetterReason;
string description = dlqMessage.DeadLetterErrorDescription;
Console.WriteLine($"Dead letter reason: {reason} - {description}");
```

### 7. Topics and Subscriptions

```csharp
// Send to topic
ServiceBusSender topicSender = client.CreateSender("my-topic");
await topicSender.SendMessageAsync(new ServiceBusMessage("Broadcast message"));

// Receive from subscription
ServiceBusReceiver subReceiver = client.CreateReceiver("my-topic", "my-subscription");
var message = await subReceiver.ReceiveMessageAsync();
```

### 8. Administration (CRUD)

```csharp
using Azure.Messaging.ServiceBus.Administration;

var adminClient = new ServiceBusAdministrationClient(
    fullyQualifiedNamespace,
    new DefaultAzureCredential());

// Create queue
var options = new CreateQueueOptions("my-queue")
{
    MaxDeliveryCount = 10,
    LockDuration = TimeSpan.FromSeconds(30),
    RequiresSession = true,
    DeadLetteringOnMessageExpiration = true
};
QueueProperties queue = await adminClient.CreateQueueAsync(options);

// Update queue
queue.LockDuration = TimeSpan.FromSeconds(60);
await adminClient.UpdateQueueAsync(queue);

// Create topic and subscription
await adminClient.CreateTopicAsync(new CreateTopicOptions("my-topic"));
await adminClient.CreateSubscriptionAsync(new CreateSubscriptionOptions("my-topic", "my-subscription"));

// Delete
await adminClient.DeleteQueueAsync("my-queue");
```

### 9. Cross-Entity Transactions

```csharp
using System.Transactions;

var options = new ServiceBusClientOptions { EnableCrossEntityTransactions = true };
await using var client = new ServiceBusClient(connectionString, options);

ServiceBusReceiver receiverA = client.CreateReceiver("queueA");
ServiceBusSender senderB = client.CreateSender("queueB");

ServiceBusReceivedMessage receivedMessage = await receiverA.ReceiveMessageAsync();

using (var ts = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    await receiverA.CompleteMessageAsync(receivedMessage);
    await senderB.SendMessageAsync(new ServiceBusMessage("Forwarded"));
    ts.Complete();
}
```

## Key Types Reference
|
||||
|
||||
| Type | Purpose |
|
||||
|------|---------|
|
||||
| `ServiceBusClient` | Main entry point, manages connection |
|
||||
| `ServiceBusSender` | Sends messages to queues/topics |
|
||||
| `ServiceBusReceiver` | Receives messages from queues/subscriptions |
|
||||
| `ServiceBusSessionReceiver` | Receives session messages |
|
||||
| `ServiceBusProcessor` | Background message processing |
|
||||
| `ServiceBusSessionProcessor` | Background session processing |
|
||||
| `ServiceBusAdministrationClient` | CRUD for queues/topics/subscriptions |
|
||||
| `ServiceBusMessage` | Message to send |
|
||||
| `ServiceBusReceivedMessage` | Received message with metadata |
|
||||
| `ServiceBusMessageBatch` | Batch of messages |

## Best Practices

1. **Use singletons** — Clients, senders, receivers, and processors are thread-safe
2. **Always dispose** — Use `await using` or call `DisposeAsync()`
3. **Dispose order** — Close senders/receivers/processors first, then client
4. **Use DefaultAzureCredential** — Prefer over connection strings for production
5. **Use processors for background work** — Handles lock renewal automatically
6. **Use safe batching** — `CreateMessageBatchAsync()` and `TryAddMessage()`
7. **Handle transient errors** — Use `ServiceBusException.Reason`
8. **Configure transport** — Use `AmqpWebSockets` if ports 5671/5672 are blocked
9. **Set appropriate lock duration** — Default is 30 seconds
10. **Use sessions for ordering** — FIFO within a session
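
Practice 5 can be sketched concretely. This is a minimal `ServiceBusProcessor` setup; the queue name and namespace are placeholders:

```csharp
// Minimal background-processing sketch; "my-queue" and the namespace are placeholders.
await using var client = new ServiceBusClient(fullyQualifiedNamespace, new DefaultAzureCredential());

await using ServiceBusProcessor processor = client.CreateProcessor("my-queue", new ServiceBusProcessorOptions
{
    MaxConcurrentCalls = 4,
    AutoCompleteMessages = false // complete explicitly after successful handling
});

processor.ProcessMessageAsync += async args =>
{
    Console.WriteLine($"Received: {args.Message.Body}");
    await args.CompleteMessageAsync(args.Message);
};

processor.ProcessErrorAsync += args =>
{
    Console.WriteLine($"Error in {args.ErrorSource}: {args.Exception.Message}");
    return Task.CompletedTask;
};

await processor.StartProcessingAsync();
// ... run until shutdown ...
await processor.StopProcessingAsync();
```

The processor renews message locks for you while a handler runs, which is why it is preferred over a manual receive loop for long-running services.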

## Error Handling

```csharp
try
{
    await sender.SendMessageAsync(message);
}
catch (ServiceBusException ex) when (ex.Reason == ServiceBusFailureReason.ServiceBusy)
{
    // Retry with backoff
}
catch (ServiceBusException ex)
{
    Console.WriteLine($"Service Bus Error: {ex.Reason} - {ex.Message}");
}
```
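
The `ServiceBusy` branch above can be fleshed out into a simple exponential backoff loop; the attempt count and delays here are illustrative:

```csharp
// Illustrative exponential backoff for transient ServiceBusy errors.
const int maxAttempts = 5;
for (int attempt = 1; ; attempt++)
{
    try
    {
        await sender.SendMessageAsync(message);
        break; // success
    }
    catch (ServiceBusException ex)
        when (ex.Reason == ServiceBusFailureReason.ServiceBusy && attempt < maxAttempts)
    {
        // Wait 1s, 2s, 4s, ... before retrying
        await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
    }
}
```

Note that the client already applies its own retry policy (configurable via `ServiceBusClientOptions.RetryOptions`), so an explicit loop like this is only needed for retries beyond that built-in policy.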

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Messaging.ServiceBus` | Service Bus (this SDK) | `dotnet add package Azure.Messaging.ServiceBus` |
| `Azure.Messaging.EventHubs` | Event streaming | `dotnet add package Azure.Messaging.EventHubs` |
| `Azure.Messaging.EventGrid` | Event routing | `dotnet add package Azure.Messaging.EventGrid` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.Messaging.ServiceBus |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.messaging.servicebus |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/servicebus/Azure.Messaging.ServiceBus |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/servicebus/Azure.Messaging.ServiceBus/TROUBLESHOOTING.md |

---
name: azure-mgmt-applicationinsights-dotnet
description: |
  Azure Application Insights SDK for .NET. Application performance monitoring and observability resource management. Use for creating Application Insights components, web tests, workbooks, analytics items, and API keys. Triggers: "Application Insights", "ApplicationInsights", "App Insights", "APM", "application monitoring", "web tests", "availability tests", "workbooks".
package: Azure.ResourceManager.ApplicationInsights
---

# Azure.ResourceManager.ApplicationInsights (.NET)

Azure Resource Manager SDK for managing Application Insights resources for application performance monitoring.

## Installation

```bash
dotnet add package Azure.ResourceManager.ApplicationInsights
dotnet add package Azure.Identity
```

**Current Version**: v1.0.0 (GA)
**API Version**: 2022-06-15

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_APPINSIGHTS_NAME=<your-appinsights-component>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ApplicationInsights;

ArmClient client = new ArmClient(new DefaultAzureCredential());
```

## Resource Hierarchy

```
Subscription
└── ResourceGroup
    └── ApplicationInsightsComponent               # App Insights resource
        ├── ApplicationInsightsComponentApiKey     # API keys for programmatic access
        ├── ComponentLinkedStorageAccount          # Linked storage for data export
        └── (via component ID)
            ├── WebTest             # Availability tests
            ├── Workbook            # Workbooks for analysis
            ├── WorkbookTemplate    # Workbook templates
            └── MyWorkbook          # Private workbooks
```

## Core Workflows

### 1. Create Application Insights Component (Workspace-based)

```csharp
using Azure.ResourceManager.ApplicationInsights;
using Azure.ResourceManager.ApplicationInsights.Models;

SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

ApplicationInsightsComponentCollection components = resourceGroup.GetApplicationInsightsComponents();

// Workspace-based Application Insights (recommended)
ApplicationInsightsComponentData data = new ApplicationInsightsComponentData(
    AzureLocation.EastUS,
    ApplicationInsightsApplicationType.Web)
{
    Kind = "web",
    WorkspaceResourceId = new ResourceIdentifier(
        "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"),
    IngestionMode = IngestionMode.LogAnalytics,
    PublicNetworkAccessForIngestion = PublicNetworkAccessType.Enabled,
    PublicNetworkAccessForQuery = PublicNetworkAccessType.Enabled,
    RetentionInDays = 90,
    SamplingPercentage = 100,
    DisableIPMasking = false,
    ImmediatePurgeDataOn30Days = false,
    Tags =
    {
        { "environment", "production" },
        { "application", "mywebapp" }
    }
};

ArmOperation<ApplicationInsightsComponentResource> operation = await components
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-appinsights", data);

ApplicationInsightsComponentResource component = operation.Value;

Console.WriteLine($"Component created: {component.Data.Name}");
Console.WriteLine($"Instrumentation Key: {component.Data.InstrumentationKey}");
Console.WriteLine($"Connection String: {component.Data.ConnectionString}");
```

### 2. Get Connection String and Keys

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

// Get connection string for SDK configuration
string connectionString = component.Data.ConnectionString;
string instrumentationKey = component.Data.InstrumentationKey;
string appId = component.Data.AppId;

Console.WriteLine($"Connection String: {connectionString}");
Console.WriteLine($"Instrumentation Key: {instrumentationKey}");
Console.WriteLine($"App ID: {appId}");
```

### 3. Create API Key

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

ApplicationInsightsComponentApiKeyCollection apiKeys = component.GetApplicationInsightsComponentApiKeys();

// API key for reading telemetry
ApplicationInsightsApiKeyContent keyContent = new ApplicationInsightsApiKeyContent
{
    Name = "ReadTelemetryKey",
    LinkedReadProperties =
    {
        $"/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/{component.Data.Name}/api",
        $"/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/{component.Data.Name}/agentconfig"
    }
};

ApplicationInsightsComponentApiKeyResource apiKey = await apiKeys
    .CreateOrUpdateAsync(WaitUntil.Completed, keyContent);

Console.WriteLine($"API Key Name: {apiKey.Data.Name}");
Console.WriteLine($"API Key: {apiKey.Data.ApiKey}"); // Only shown once!
```

### 4. Create Web Test (Availability Test)

```csharp
WebTestCollection webTests = resourceGroup.GetWebTests();

// URL Ping Test
WebTestData urlPingTest = new WebTestData(AzureLocation.EastUS)
{
    Kind = WebTestKind.Ping,
    SyntheticMonitorId = "webtest-ping-myapp",
    WebTestName = "Homepage Availability",
    Description = "Checks if homepage is available",
    IsEnabled = true,
    Frequency = 300, // 5 minutes
    Timeout = 120,   // 2 minutes
    WebTestKind = WebTestKind.Ping,
    IsRetryEnabled = true,
    Locations =
    {
        new WebTestGeolocation { WebTestLocationId = "us-ca-sjc-azr" },   // West US
        new WebTestGeolocation { WebTestLocationId = "us-tx-sn1-azr" },   // South Central US
        new WebTestGeolocation { WebTestLocationId = "us-il-ch1-azr" },   // North Central US
        new WebTestGeolocation { WebTestLocationId = "emea-gb-db3-azr" }, // UK South
        new WebTestGeolocation { WebTestLocationId = "apac-sg-sin-azr" }  // Southeast Asia
    },
    Configuration = new WebTestConfiguration
    {
        WebTest = """
            <WebTest Name="Homepage" Enabled="True" Timeout="120"
                     xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
              <Items>
                <Request Method="GET" Version="1.1" Url="https://myapp.example.com"
                         ThinkTime="0" Timeout="120" ParseDependentRequests="False"
                         FollowRedirects="True" RecordResult="True" Cache="False"
                         ResponseTimeGoal="0" Encoding="utf-8" ExpectedHttpStatusCode="200" />
              </Items>
            </WebTest>
            """
    },
    Tags =
    {
        { $"hidden-link:/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/my-appinsights", "Resource" }
    }
};

ArmOperation<WebTestResource> operation = await webTests
    .CreateOrUpdateAsync(WaitUntil.Completed, "webtest-homepage", urlPingTest);

WebTestResource webTest = operation.Value;
Console.WriteLine($"Web test created: {webTest.Data.Name}");
```

### 5. Create Multi-Step Web Test

```csharp
WebTestData multiStepTest = new WebTestData(AzureLocation.EastUS)
{
    Kind = WebTestKind.MultiStep,
    SyntheticMonitorId = "webtest-multistep-login",
    WebTestName = "Login Flow Test",
    Description = "Tests login functionality",
    IsEnabled = true,
    Frequency = 900, // 15 minutes
    Timeout = 300,   // 5 minutes
    WebTestKind = WebTestKind.MultiStep,
    IsRetryEnabled = true,
    Locations =
    {
        new WebTestGeolocation { WebTestLocationId = "us-ca-sjc-azr" }
    },
    Configuration = new WebTestConfiguration
    {
        WebTest = """
            <WebTest Name="LoginFlow" Enabled="True" Timeout="300"
                     xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
              <Items>
                <Request Method="GET" Version="1.1" Url="https://myapp.example.com/login"
                         ThinkTime="0" Timeout="60" />
                <Request Method="POST" Version="1.1" Url="https://myapp.example.com/api/auth"
                         ThinkTime="0" Timeout="60">
                  <Headers>
                    <Header Name="Content-Type" Value="application/json" />
                  </Headers>
                  <Body>{"username":"testuser","password":"{{TestPassword}}"}</Body>
                </Request>
              </Items>
            </WebTest>
            """
    },
    Tags =
    {
        { $"hidden-link:/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/my-appinsights", "Resource" }
    }
};

await webTests.CreateOrUpdateAsync(WaitUntil.Completed, "webtest-login-flow", multiStepTest);
```

### 6. Create Workbook

```csharp
WorkbookCollection workbooks = resourceGroup.GetWorkbooks();

WorkbookData workbookData = new WorkbookData(AzureLocation.EastUS)
{
    DisplayName = "Application Performance Dashboard",
    Category = "workbook",
    Kind = WorkbookSharedTypeKind.Shared,
    SerializedData = """
        {
          "version": "Notebook/1.0",
          "items": [
            {
              "type": 1,
              "content": {
                "json": "# Application Performance\n\nThis workbook shows application performance metrics."
              },
              "name": "header"
            },
            {
              "type": 3,
              "content": {
                "version": "KqlItem/1.0",
                "query": "requests\n| summarize count() by bin(timestamp, 1h)\n| render timechart",
                "size": 0,
                "title": "Requests per Hour",
                "timeContext": {
                  "durationMs": 86400000
                },
                "queryType": 0,
                "resourceType": "microsoft.insights/components"
              },
              "name": "requestsChart"
            }
          ],
          "isLocked": false
        }
        """,
    SourceId = component.Id,
    Tags =
    {
        { "environment", "production" }
    }
};

// Note: Workbook ID should be a new GUID
string workbookId = Guid.NewGuid().ToString();

ArmOperation<WorkbookResource> operation = await workbooks
    .CreateOrUpdateAsync(WaitUntil.Completed, workbookId, workbookData);

WorkbookResource workbook = operation.Value;
Console.WriteLine($"Workbook created: {workbook.Data.DisplayName}");
```

### 7. Link Storage Account

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

ComponentLinkedStorageAccountCollection linkedStorage = component.GetComponentLinkedStorageAccounts();

ComponentLinkedStorageAccountData storageData = new ComponentLinkedStorageAccountData
{
    LinkedStorageAccount = new ResourceIdentifier(
        "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>")
};

ArmOperation<ComponentLinkedStorageAccountResource> operation = await linkedStorage
    .CreateOrUpdateAsync(WaitUntil.Completed, StorageType.ServiceProfiler, storageData);
```

### 8. List and Manage Components

```csharp
// List all Application Insights components in resource group
await foreach (ApplicationInsightsComponentResource component in
    resourceGroup.GetApplicationInsightsComponents())
{
    Console.WriteLine($"Component: {component.Data.Name}");
    Console.WriteLine($"  App ID: {component.Data.AppId}");
    Console.WriteLine($"  Type: {component.Data.ApplicationType}");
    Console.WriteLine($"  Ingestion Mode: {component.Data.IngestionMode}");
    Console.WriteLine($"  Retention: {component.Data.RetentionInDays} days");
}

// List web tests
await foreach (WebTestResource webTest in resourceGroup.GetWebTests())
{
    Console.WriteLine($"Web Test: {webTest.Data.WebTestName}");
    Console.WriteLine($"  Enabled: {webTest.Data.IsEnabled}");
    Console.WriteLine($"  Frequency: {webTest.Data.Frequency}s");
}

// List workbooks
await foreach (WorkbookResource workbook in resourceGroup.GetWorkbooks())
{
    Console.WriteLine($"Workbook: {workbook.Data.DisplayName}");
}
```

### 9. Update Component

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

// Update using full data (PUT operation)
ApplicationInsightsComponentData updateData = component.Data;
updateData.RetentionInDays = 180;
updateData.SamplingPercentage = 50;
updateData.Tags["updated"] = "true";

ArmOperation<ApplicationInsightsComponentResource> operation = await resourceGroup
    .GetApplicationInsightsComponents()
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-appinsights", updateData);
```

### 10. Delete Resources

```csharp
// Delete Application Insights component
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");
await component.DeleteAsync(WaitUntil.Completed);

// Delete web test
WebTestResource webTest = await resourceGroup.GetWebTestAsync("webtest-homepage");
await webTest.DeleteAsync(WaitUntil.Completed);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ApplicationInsightsComponentResource` | App Insights component |
| `ApplicationInsightsComponentData` | Component configuration |
| `ApplicationInsightsComponentCollection` | Collection of components |
| `ApplicationInsightsComponentApiKeyResource` | API key for programmatic access |
| `WebTestResource` | Availability/web test |
| `WebTestData` | Web test configuration |
| `WorkbookResource` | Analysis workbook |
| `WorkbookData` | Workbook configuration |
| `ComponentLinkedStorageAccountResource` | Linked storage for exports |

## Application Types

| Type | Enum Value |
|------|------------|
| Web Application | `Web` |
| iOS Application | `iOS` |
| Java Application | `Java` |
| Node.js Application | `NodeJS` |
| .NET Application | `MRT` |
| Other | `Other` |

## Web Test Locations

| Location ID | Region |
|-------------|--------|
| `us-ca-sjc-azr` | West US |
| `us-tx-sn1-azr` | South Central US |
| `us-il-ch1-azr` | North Central US |
| `us-va-ash-azr` | East US |
| `emea-gb-db3-azr` | UK South |
| `emea-nl-ams-azr` | West Europe |
| `emea-fr-pra-edge` | France Central |
| `apac-sg-sin-azr` | Southeast Asia |
| `apac-hk-hkn-azr` | East Asia |
| `apac-jp-kaw-edge` | Japan East |
| `latam-br-gru-edge` | Brazil South |
| `emea-au-syd-edge` | Australia East |

## Best Practices

1. **Use workspace-based** — Workspace-based App Insights is the current standard
2. **Link to Log Analytics** — Store data in Log Analytics for better querying
3. **Set appropriate retention** — Balance cost vs. data availability
4. **Use sampling** — Reduce costs for high-volume applications
5. **Store connection string securely** — Use Key Vault or managed identity
6. **Enable multiple test locations** — For accurate availability monitoring
7. **Use workbooks** — For custom dashboards and analysis
8. **Set up alerts** — Based on availability tests and metrics
9. **Tag resources** — For cost allocation and organization
10. **Use private endpoints** — For secure data ingestion
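
For practice 10, locking ingestion and query to Private Link is a small change to the component data shown in the create example. This sketch assumes a private endpoint and Azure Monitor Private Link Scope are already configured:

```csharp
// Restrict public network access; telemetry must then arrive via Private Link.
data.PublicNetworkAccessForIngestion = PublicNetworkAccessType.Disabled;
data.PublicNetworkAccessForQuery = PublicNetworkAccessType.Disabled;
```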

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<ApplicationInsightsComponentResource> operation = await components
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-appinsights", data);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Component already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## SDK Integration

Use the connection string with the Application Insights SDK:

```csharp
// Program.cs in ASP.NET Core
builder.Services.AddApplicationInsightsTelemetry(options =>
{
    options.ConnectionString = configuration["ApplicationInsights:ConnectionString"];
});

// Or set via environment variable
// APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=...;IngestionEndpoint=...
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.ApplicationInsights` | Resource management (this SDK) | `dotnet add package Azure.ResourceManager.ApplicationInsights` |
| `Microsoft.ApplicationInsights` | Telemetry SDK | `dotnet add package Microsoft.ApplicationInsights` |
| `Microsoft.ApplicationInsights.AspNetCore` | ASP.NET Core integration | `dotnet add package Microsoft.ApplicationInsights.AspNetCore` |
| `Azure.Monitor.OpenTelemetry.Exporter` | OpenTelemetry export | `dotnet add package Azure.Monitor.OpenTelemetry.Exporter` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.ApplicationInsights |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.applicationinsights |
| Product Documentation | https://learn.microsoft.com/azure/azure-monitor/app/app-insights-overview |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/applicationinsights/Azure.ResourceManager.ApplicationInsights |

---
name: azure-mgmt-arizeaiobservabilityeval-dotnet
description: |
  Azure Resource Manager SDK for Arize AI Observability and Evaluation (.NET). Use when managing Arize AI organizations
  on Azure via Azure Marketplace, creating/updating/deleting Arize resources, or integrating Arize ML observability
  into .NET applications. Triggers: "Arize AI", "ML observability", "ArizeAIObservabilityEval", "Arize organization".
package: Azure.ResourceManager.ArizeAIObservabilityEval
---

# Azure.ResourceManager.ArizeAIObservabilityEval

.NET SDK for managing Arize AI Observability and Evaluation resources on Azure.

## Installation

```bash
dotnet add package Azure.ResourceManager.ArizeAIObservabilityEval --version 1.0.0
```

## Package Info

| Property | Value |
|----------|-------|
| Package | `Azure.ResourceManager.ArizeAIObservabilityEval` |
| Version | `1.0.0` (GA) |
| API Version | `2024-10-01` |
| ARM Type | `ArizeAi.ObservabilityEval/organizations` |
| Dependencies | `Azure.Core` >= 1.46.2, `Azure.ResourceManager` >= 1.13.1 |

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_TENANT_ID=<your-tenant-id>
AZURE_CLIENT_ID=<your-client-id>
AZURE_CLIENT_SECRET=<your-client-secret>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ArizeAIObservabilityEval;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);
```

## Core Workflow

### Create an Arize AI Organization

```csharp
using Azure.Core;
using Azure.ResourceManager.Resources;
using Azure.ResourceManager.ArizeAIObservabilityEval;
using Azure.ResourceManager.ArizeAIObservabilityEval.Models;

// Get subscription and resource group
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = await armClient.GetSubscriptionResource(
    SubscriptionResource.CreateResourceIdentifier(subscriptionId)).GetAsync();
var resourceGroup = await subscription.Value.GetResourceGroupAsync("my-resource-group");

// Get the organization collection
var collection = resourceGroup.Value.GetArizeAIObservabilityEvalOrganizations();

// Create organization data
var data = new ArizeAIObservabilityEvalOrganizationData(AzureLocation.EastUS)
{
    Properties = new ArizeAIObservabilityEvalOrganizationProperties
    {
        Marketplace = new ArizeAIObservabilityEvalMarketplaceDetails
        {
            SubscriptionId = "marketplace-subscription-id",
            OfferDetails = new ArizeAIObservabilityEvalOfferDetails
            {
                PublisherId = "arikimlabs1649082416596",
                OfferId = "arize-liftr-1",
                PlanId = "arize-liftr-1-plan",
                PlanName = "Arize AI Plan",
                TermUnit = "P1M",
                TermId = "term-id"
            }
        },
        User = new ArizeAIObservabilityEvalUserDetails
        {
            FirstName = "John",
            LastName = "Doe",
            EmailAddress = "john.doe@example.com"
        }
    },
    Tags = { ["environment"] = "production" }
};

// Create (long-running operation)
var operation = await collection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-arize-org",
    data);

var organization = operation.Value;
Console.WriteLine($"Created: {organization.Data.Name}");
```

### Get an Organization

```csharp
// Option 1: From collection
var org = await collection.GetAsync("my-arize-org");

// Option 2: Check if exists first
var exists = await collection.ExistsAsync("my-arize-org");
if (exists.Value)
{
    var org = await collection.GetAsync("my-arize-org");
}

// Option 3: GetIfExists (returns null if not found)
var response = await collection.GetIfExistsAsync("my-arize-org");
if (response.HasValue)
{
    var org = response.Value;
}
```

### List Organizations

```csharp
// List in resource group
await foreach (var org in collection.GetAllAsync())
{
    Console.WriteLine($"Org: {org.Data.Name}, State: {org.Data.Properties?.ProvisioningState}");
}

// List in subscription
await foreach (var org in subscription.Value.GetArizeAIObservabilityEvalOrganizationsAsync())
{
    Console.WriteLine($"Org: {org.Data.Name}");
}
```

### Update an Organization

```csharp
// Update tags
var org = await collection.GetAsync("my-arize-org");
var updateData = new ArizeAIObservabilityEvalOrganizationPatch
{
    Tags = { ["environment"] = "staging", ["team"] = "ml-ops" }
};
var updated = await org.Value.UpdateAsync(updateData);
```

### Delete an Organization

```csharp
var org = await collection.GetAsync("my-arize-org");
await org.Value.DeleteAsync(WaitUntil.Completed);
```

## Key Types

| Type | Purpose |
|------|---------|
| `ArizeAIObservabilityEvalOrganizationResource` | Main ARM resource for Arize organizations |
| `ArizeAIObservabilityEvalOrganizationCollection` | Collection for CRUD operations |
| `ArizeAIObservabilityEvalOrganizationData` | Resource data model |
| `ArizeAIObservabilityEvalOrganizationProperties` | Organization properties |
| `ArizeAIObservabilityEvalMarketplaceDetails` | Azure Marketplace subscription info |
| `ArizeAIObservabilityEvalOfferDetails` | Marketplace offer configuration |
| `ArizeAIObservabilityEvalUserDetails` | User contact information |
| `ArizeAIObservabilityEvalOrganizationPatch` | Patch model for updates |
| `ArizeAIObservabilityEvalSingleSignOnPropertiesV2` | SSO configuration |

## Enums

| Enum | Values |
|------|--------|
| `ArizeAIObservabilityEvalOfferProvisioningState` | `Succeeded`, `Failed`, `Canceled`, `Provisioning`, `Updating`, `Deleting`, `Accepted` |
| `ArizeAIObservabilityEvalMarketplaceSubscriptionStatus` | `PendingFulfillmentStart`, `Subscribed`, `Suspended`, `Unsubscribed` |
| `ArizeAIObservabilityEvalSingleSignOnState` | `Initial`, `Enable`, `Disable` |
| `ArizeAIObservabilityEvalSingleSignOnType` | `Saml`, `OpenId` |
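
The provisioning-state enum can be used to confirm an organization is ready before use. A sketch built on the `GetAsync` call shown earlier (the property path follows the list example above):

```csharp
// Check whether the organization finished provisioning before using it.
var org = await collection.GetAsync("my-arize-org");
var state = org.Value.Data.Properties?.ProvisioningState;

if (state == ArizeAIObservabilityEvalOfferProvisioningState.Succeeded)
{
    Console.WriteLine("Organization is ready.");
}
else
{
    Console.WriteLine($"Not ready yet: {state}");
}
```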

## Best Practices

1. **Use async methods** — All operations support async/await
2. **Handle long-running operations** — Use `WaitUntil.Completed` or poll manually
3. **Use GetIfExistsAsync** — Avoid exceptions for conditional logic
4. **Implement retry policies** — Configure via `ArmClientOptions`
5. **Use resource identifiers** — For direct resource access without listing
6. **Close clients properly** — Use `using` statements or dispose explicitly
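
Practice 4 in concrete terms: retry behavior is configured through the standard `Azure.Core` retry options on `ArmClientOptions`. The values below are illustrative:

```csharp
using Azure.Core;

// Configure retries on the ARM client (illustrative values).
var options = new ArmClientOptions();
options.Retry.MaxRetries = 5;                  // default is 3
options.Retry.Mode = RetryMode.Exponential;    // exponential backoff between attempts
options.Retry.Delay = TimeSpan.FromSeconds(2); // initial delay

var armClient = new ArmClient(new DefaultAzureCredential(), defaultSubscriptionId: null, options);
```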

## Error Handling

```csharp
try
{
    var org = await collection.GetAsync("my-arize-org");
}
catch (Azure.RequestFailedException ex) when (ex.Status == 404)
{
    Console.WriteLine("Organization not found");
}
catch (Azure.RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Message}");
}
```

## Direct Resource Access

```csharp
// Access resource directly by ID (without listing)
var resourceId = ArizeAIObservabilityEvalOrganizationResource.CreateResourceIdentifier(
    subscriptionId,
    "my-resource-group",
    "my-arize-org");

var org = armClient.GetArizeAIObservabilityEvalOrganizationResource(resourceId);
var data = await org.GetAsync();
```

## Links

- [NuGet Package](https://www.nuget.org/packages/Azure.ResourceManager.ArizeAIObservabilityEval)
- [Azure SDK for .NET](https://github.com/Azure/azure-sdk-for-net)
- [Arize AI](https://arize.com/)

skills/official/microsoft/dotnet/partner/mongodbatlas/SKILL.md
|
||||
---
|
||||
name: azure-mgmt-mongodbatlas-dotnet
|
||||
description: Manage MongoDB Atlas Organizations as Azure ARM resources using Azure.ResourceManager.MongoDBAtlas SDK. Use when creating, updating, listing, or deleting MongoDB Atlas organizations through Azure Marketplace integration. This SDK manages the Azure-side organization resource, not Atlas clusters/databases directly.
|
||||
package: Azure.ResourceManager.MongoDBAtlas
|
||||
---
|
||||
|
||||
# Azure.ResourceManager.MongoDBAtlas SDK
|
||||
|
||||
Manage MongoDB Atlas Organizations as Azure ARM resources with unified billing through Azure Marketplace.
|
||||
|
||||
## Package Information
|
||||
|
||||
| Property | Value |
|
||||
|----------|-------|
|
||||
| Package | `Azure.ResourceManager.MongoDBAtlas` |
|
||||
| Version | 1.0.0 (GA) |
|
||||
| API Version | 2025-06-01 |
|
||||
| Resource Type | `MongoDB.Atlas/organizations` |
|
||||
| NuGet | [Azure.ResourceManager.MongoDBAtlas](https://www.nuget.org/packages/Azure.ResourceManager.MongoDBAtlas) |
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
dotnet add package Azure.ResourceManager.MongoDBAtlas
|
||||
dotnet add package Azure.Identity
|
||||
dotnet add package Azure.ResourceManager
|
||||
```

## Important Scope Limitation

This SDK manages **MongoDB Atlas Organizations as Azure ARM resources** for marketplace integration. It does NOT directly manage:

- Atlas clusters
- Databases
- Collections
- Users/roles

For cluster management, use the MongoDB Atlas API directly after creating the organization.

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.MongoDBAtlas;
using Azure.ResourceManager.MongoDBAtlas.Models;

// Create ARM client with DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);
```

## Core Types

| Type | Purpose |
|------|---------|
| `MongoDBAtlasOrganizationResource` | ARM resource representing an Atlas organization |
| `MongoDBAtlasOrganizationCollection` | Collection of organizations in a resource group |
| `MongoDBAtlasOrganizationData` | Data model for organization resource |
| `MongoDBAtlasOrganizationProperties` | Organization-specific properties |
| `MongoDBAtlasMarketplaceDetails` | Azure Marketplace subscription details |
| `MongoDBAtlasOfferDetails` | Marketplace offer configuration |
| `MongoDBAtlasUserDetails` | User information for the organization |
| `MongoDBAtlasPartnerProperties` | MongoDB-specific properties (org name, ID) |

## Workflows

### Get Organization Collection

```csharp
// Get resource group
var subscription = await armClient.GetDefaultSubscriptionAsync();
var resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

// Get organizations collection
MongoDBAtlasOrganizationCollection organizations =
    resourceGroup.Value.GetMongoDBAtlasOrganizations();
```

### Create Organization

```csharp
var organizationName = "my-atlas-org";
var location = AzureLocation.EastUS2;

// Build organization data
var organizationData = new MongoDBAtlasOrganizationData(location)
{
    Properties = new MongoDBAtlasOrganizationProperties(
        marketplace: new MongoDBAtlasMarketplaceDetails(
            subscriptionId: "your-azure-subscription-id",
            offerDetails: new MongoDBAtlasOfferDetails(
                publisherId: "mongodb",
                offerId: "mongodb_atlas_azure_native_prod",
                planId: "private_plan",
                planName: "Pay as You Go (Free) (Private)",
                termUnit: "P1M",
                termId: "gmz7xq9ge3py"
            )
        ),
        user: new MongoDBAtlasUserDetails(
            emailAddress: "admin@example.com",
            upn: "admin@example.com"
        )
        {
            FirstName = "Admin",
            LastName = "User"
        }
    )
    {
        PartnerProperties = new MongoDBAtlasPartnerProperties
        {
            OrganizationName = organizationName
        }
    },
    Tags = { ["Environment"] = "Production" }
};

// Create the organization (long-running operation)
var operation = await organizations.CreateOrUpdateAsync(
    WaitUntil.Completed,
    organizationName,
    organizationData
);

MongoDBAtlasOrganizationResource organization = operation.Value;
Console.WriteLine($"Created: {organization.Id}");
```

### Get Existing Organization

```csharp
// Option 1: From collection
MongoDBAtlasOrganizationResource org =
    await organizations.GetAsync("my-atlas-org");

// Option 2: From resource identifier
var resourceId = MongoDBAtlasOrganizationResource.CreateResourceIdentifier(
    subscriptionId: "subscription-id",
    resourceGroupName: "my-resource-group",
    organizationName: "my-atlas-org"
);
MongoDBAtlasOrganizationResource org2 =
    armClient.GetMongoDBAtlasOrganizationResource(resourceId);
await org2.GetAsync(); // Fetch data
```

### List Organizations

```csharp
// List in resource group
await foreach (var org in organizations.GetAllAsync())
{
    Console.WriteLine($"Org: {org.Data.Name}");
    Console.WriteLine($"  Location: {org.Data.Location}");
    Console.WriteLine($"  State: {org.Data.Properties?.ProvisioningState}");
}

// List across subscription
await foreach (var org in subscription.GetMongoDBAtlasOrganizationsAsync())
{
    Console.WriteLine($"Org: {org.Data.Name} in {org.Data.Id}");
}
```

### Update Tags

```csharp
// Add a single tag
await organization.AddTagAsync("CostCenter", "12345");

// Replace all tags
await organization.SetTagsAsync(new Dictionary<string, string>
{
    ["Environment"] = "Production",
    ["Team"] = "Platform"
});

// Remove a tag
await organization.RemoveTagAsync("OldTag");
```

### Update Organization Properties

```csharp
var patch = new MongoDBAtlasOrganizationPatch
{
    Tags = { ["UpdatedAt"] = DateTime.UtcNow.ToString("o") },
    Properties = new MongoDBAtlasOrganizationUpdateProperties
    {
        // Update user details if needed
        User = new MongoDBAtlasUserDetails(
            emailAddress: "newadmin@example.com",
            upn: "newadmin@example.com"
        )
    }
};

var updateOperation = await organization.UpdateAsync(
    WaitUntil.Completed,
    patch
);
```

### Delete Organization

```csharp
// Delete (long-running operation)
await organization.DeleteAsync(WaitUntil.Completed);
```

## Model Properties Reference

### MongoDBAtlasOrganizationProperties

| Property | Type | Description |
|----------|------|-------------|
| `Marketplace` | `MongoDBAtlasMarketplaceDetails` | Required. Marketplace subscription details |
| `User` | `MongoDBAtlasUserDetails` | Required. Organization admin user |
| `PartnerProperties` | `MongoDBAtlasPartnerProperties` | MongoDB-specific properties |
| `ProvisioningState` | `MongoDBAtlasResourceProvisioningState` | Read-only. Current provisioning state |

### MongoDBAtlasMarketplaceDetails

| Property | Type | Description |
|----------|------|-------------|
| `SubscriptionId` | `string` | Required. Azure subscription ID for billing |
| `OfferDetails` | `MongoDBAtlasOfferDetails` | Required. Marketplace offer configuration |
| `SubscriptionStatus` | `MarketplaceSubscriptionStatus` | Read-only. Subscription status |

### MongoDBAtlasOfferDetails

| Property | Type | Description |
|----------|------|-------------|
| `PublisherId` | `string` | Required. Publisher ID (typically "mongodb") |
| `OfferId` | `string` | Required. Offer ID |
| `PlanId` | `string` | Required. Plan ID |
| `PlanName` | `string` | Required. Display name of the plan |
| `TermUnit` | `string` | Required. Billing term unit (e.g., "P1M") |
| `TermId` | `string` | Required. Term identifier |

### MongoDBAtlasUserDetails

| Property | Type | Description |
|----------|------|-------------|
| `EmailAddress` | `string` | Required. User email address |
| `Upn` | `string` | Required. User principal name |
| `FirstName` | `string` | Optional. User first name |
| `LastName` | `string` | Optional. User last name |

### MongoDBAtlasPartnerProperties

| Property | Type | Description |
|----------|------|-------------|
| `OrganizationName` | `string` | Name of the MongoDB Atlas organization |
| `OrganizationId` | `string` | Read-only. MongoDB Atlas organization ID |

## Provisioning States

| State | Description |
|-------|-------------|
| `Succeeded` | Resource provisioned successfully |
| `Failed` | Provisioning failed |
| `Canceled` | Provisioning was canceled |
| `Provisioning` | Resource is being provisioned |
| `Updating` | Resource is being updated |
| `Deleting` | Resource is being deleted |
| `Accepted` | Request accepted, provisioning starting |

## Marketplace Subscription Status

| Status | Description |
|--------|-------------|
| `PendingFulfillmentStart` | Subscription pending activation |
| `Subscribed` | Active subscription |
| `Suspended` | Subscription suspended |
| `Unsubscribed` | Subscription canceled |

## Best Practices

### Use Async Methods

```csharp
// Prefer async for all operations
var org = await organizations.GetAsync("my-org");
await org.Value.AddTagAsync("key", "value");
```

### Handle Long-Running Operations

```csharp
// Wait for completion
var operation = await organizations.CreateOrUpdateAsync(
    WaitUntil.Completed, // Blocks until done
    name,
    data
);

// Or start and poll later
var startedOperation = await organizations.CreateOrUpdateAsync(
    WaitUntil.Started, // Returns immediately
    name,
    data
);

// Poll for completion
while (!startedOperation.HasCompleted)
{
    await Task.Delay(TimeSpan.FromSeconds(5));
    await startedOperation.UpdateStatusAsync();
}
```

### Check Provisioning State

```csharp
var org = await organizations.GetAsync("my-org");
if (org.Value.Data.Properties?.ProvisioningState ==
    MongoDBAtlasResourceProvisioningState.Succeeded)
{
    Console.WriteLine("Organization is ready");
}
```

### Use Resource Identifiers

```csharp
// Create identifier without API call
var resourceId = MongoDBAtlasOrganizationResource.CreateResourceIdentifier(
    subscriptionId,
    resourceGroupName,
    organizationName
);

// Get resource handle (no data yet)
var orgResource = armClient.GetMongoDBAtlasOrganizationResource(resourceId);

// Fetch data when needed
var response = await orgResource.GetAsync();
```
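
The identifier assembled above expands to the standard ARM resource ID path for this provider (the segments in braces are placeholders, following the `MongoDB.Atlas/organizations` resource type listed in the package table):

```
/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/MongoDB.Atlas/organizations/{organizationName}
```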

## Common Errors

| Error | Cause | Solution |
|-------|-------|----------|
| `ResourceNotFound` | Organization doesn't exist | Verify name and resource group |
| `AuthorizationFailed` | Insufficient permissions | Check RBAC roles on resource group |
| `InvalidParameter` | Missing required properties | Ensure all required fields are set |
| `MarketplaceError` | Marketplace subscription issue | Verify offer details and subscription |

## Related Resources

- [Microsoft Learn: MongoDB Atlas on Azure](https://learn.microsoft.com/en-us/azure/partner-solutions/mongodb-atlas/)
- [API Reference](https://learn.microsoft.com/en-us/dotnet/api/azure.resourcemanager.mongodbatlas)
- [Azure SDK for .NET](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/mongodbatlas)

(new file, 254 lines)

---
name: azure-communication-callautomation-java
description: Build call automation workflows with Azure Communication Services Call Automation Java SDK. Use when implementing IVR systems, call routing, call recording, DTMF recognition, text-to-speech, or AI-powered call flows.
package: com.azure:azure-communication-callautomation
---

# Azure Communication Call Automation (Java)

Build server-side call automation workflows including IVR systems, call routing, recording, and AI-powered interactions.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-callautomation</artifactId>
    <version>1.6.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.communication.callautomation.CallAutomationClient;
import com.azure.communication.callautomation.CallAutomationClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

// With DefaultAzureCredential
CallAutomationClient client = new CallAutomationClientBuilder()
    .endpoint("https://<resource>.communication.azure.com")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();

// With connection string
CallAutomationClient connectionStringClient = new CallAutomationClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();
```

## Key Concepts

| Class | Purpose |
|-------|---------|
| `CallAutomationClient` | Make calls, answer/reject incoming calls, redirect calls |
| `CallConnection` | Actions in established calls (add participants, terminate) |
| `CallMedia` | Media operations (play audio, recognize DTMF/speech) |
| `CallRecording` | Start/stop/pause recording |
| `CallAutomationEventParser` | Parse webhook events from ACS |

## Create Outbound Call

```java
import com.azure.communication.callautomation.models.*;
import com.azure.communication.common.CommunicationUserIdentifier;
import com.azure.communication.common.PhoneNumberIdentifier;

// Call to PSTN number
PhoneNumberIdentifier target = new PhoneNumberIdentifier("+14255551234");
PhoneNumberIdentifier caller = new PhoneNumberIdentifier("+14255550100");

CreateCallOptions options = new CreateCallOptions(
    new CommunicationUserIdentifier("<user-id>"), // Source
    List.of(target)) // Targets
    .setSourceCallerId(caller)
    .setCallbackUrl("https://your-app.com/api/callbacks");

CreateCallResult result = client.createCall(options);
String callConnectionId = result.getCallConnectionProperties().getCallConnectionId();
```

## Answer Incoming Call

```java
// From Event Grid webhook - IncomingCall event
String incomingCallContext = "<incoming-call-context-from-event>";

AnswerCallOptions options = new AnswerCallOptions(
    incomingCallContext,
    "https://your-app.com/api/callbacks");

AnswerCallResult result = client.answerCall(options);
CallConnection callConnection = result.getCallConnection();
```

## Play Audio (Text-to-Speech)

```java
CallConnection callConnection = client.getCallConnection(callConnectionId);
CallMedia callMedia = callConnection.getCallMedia();

// Play text-to-speech
TextSource textSource = new TextSource()
    .setText("Welcome to Contoso. Press 1 for sales, 2 for support.")
    .setVoiceName("en-US-JennyNeural");

PlayOptions playOptions = new PlayOptions(
    List.of(textSource),
    List.of(new CommunicationUserIdentifier("<target-user>")));

callMedia.play(playOptions);

// Play audio file
FileSource fileSource = new FileSource()
    .setUrl("https://storage.blob.core.windows.net/audio/greeting.wav");

callMedia.play(new PlayOptions(List.of(fileSource), List.of(target)));
```

## Recognize DTMF Input

```java
// Recognize DTMF tones
DtmfTone stopTones = DtmfTone.POUND;

CallMediaRecognizeDtmfOptions recognizeOptions = new CallMediaRecognizeDtmfOptions(
    new CommunicationUserIdentifier("<target-user>"),
    5) // Max tones to collect
    .setInterToneTimeout(Duration.ofSeconds(5))
    .setStopTones(List.of(stopTones))
    .setInitialSilenceTimeout(Duration.ofSeconds(15))
    .setPlayPrompt(new TextSource().setText("Enter your account number followed by pound."));

callMedia.startRecognizing(recognizeOptions);
```
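
Once the webhook delivers the collected tones (see Handle Events below), the raw input still needs app-side validation. A minimal sketch with a hypothetical helper, not part of the SDK; the five-digit rule and trailing pound sign mirror the prompt above:

```java
import java.util.regex.Pattern;

// Hypothetical validator for the IVR prompt above: expects exactly five
// digits, optionally terminated by the '#' stop tone.
class AccountNumberValidator {
    private static final Pattern FIVE_DIGITS = Pattern.compile("\\d{5}");

    static boolean isValid(String tones) {
        if (tones == null) return false;
        // Strip the stop tone if it was included in the recognized result.
        String digits = tones.endsWith("#")
            ? tones.substring(0, tones.length() - 1)
            : tones;
        return FIVE_DIGITS.matcher(digits).matches();
    }
}
```

On failure you would typically replay the prompt with another `startRecognizing` call rather than terminate the flow.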

## Recognize Speech

```java
// Speech recognition with AI
CallMediaRecognizeSpeechOptions speechOptions = new CallMediaRecognizeSpeechOptions(
    new CommunicationUserIdentifier("<target-user>"))
    .setEndSilenceTimeout(Duration.ofSeconds(2))
    .setSpeechLanguage("en-US")
    .setPlayPrompt(new TextSource().setText("How can I help you today?"));

callMedia.startRecognizing(speechOptions);
```

## Call Recording

```java
CallRecording callRecording = client.getCallRecording();

// Start recording
StartRecordingOptions recordingOptions = new StartRecordingOptions(
    new ServerCallLocator("<server-call-id>"))
    .setRecordingChannel(RecordingChannel.MIXED)
    .setRecordingContent(RecordingContent.AUDIO_VIDEO)
    .setRecordingFormat(RecordingFormat.MP4);

RecordingStateResult recordingResult = callRecording.start(recordingOptions);
String recordingId = recordingResult.getRecordingId();

// Pause/resume/stop
callRecording.pause(recordingId);
callRecording.resume(recordingId);
callRecording.stop(recordingId);

// Download recording (after RecordingFileStatusUpdated event)
callRecording.downloadTo(recordingUrl, Paths.get("recording.mp4"));
```

## Add Participant to Call

```java
CallConnection callConnection = client.getCallConnection(callConnectionId);

CommunicationUserIdentifier participant = new CommunicationUserIdentifier("<user-id>");
AddParticipantOptions addOptions = new AddParticipantOptions(participant)
    .setInvitationTimeout(Duration.ofSeconds(30));

AddParticipantResult result = callConnection.addParticipant(addOptions);
```

## Transfer Call

```java
// Blind transfer
PhoneNumberIdentifier transferTarget = new PhoneNumberIdentifier("+14255559999");
TransferCallToParticipantResult result = callConnection.transferCallToParticipant(transferTarget);
```

## Handle Events (Webhook)

```java
import com.azure.communication.callautomation.CallAutomationEventParser;
import com.azure.communication.callautomation.models.events.*;

// In your webhook endpoint
public void handleCallback(String requestBody) {
    List<CallAutomationEventBase> events = CallAutomationEventParser.parseEvents(requestBody);

    for (CallAutomationEventBase event : events) {
        if (event instanceof CallConnected) {
            CallConnected connected = (CallConnected) event;
            System.out.println("Call connected: " + connected.getCallConnectionId());
        } else if (event instanceof RecognizeCompleted) {
            RecognizeCompleted recognized = (RecognizeCompleted) event;
            // Handle DTMF or speech recognition result
            DtmfResult dtmfResult = (DtmfResult) recognized.getRecognizeResult();
            String tones = dtmfResult.getTones().stream()
                .map(DtmfTone::toString)
                .collect(Collectors.joining());
            System.out.println("DTMF received: " + tones);
        } else if (event instanceof PlayCompleted) {
            System.out.println("Audio playback completed");
        } else if (event instanceof CallDisconnected) {
            System.out.println("Call ended");
        }
    }
}
```

## Hang Up Call

```java
// Hang up for all participants
callConnection.hangUp(true);

// Hang up only this leg
callConnection.hangUp(false);
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.answerCall(options);
} catch (HttpResponseException e) {
    if (e.getResponse().getStatusCode() == 404) {
        System.out.println("Call not found or already ended");
    } else if (e.getResponse().getStatusCode() == 400) {
        System.out.println("Invalid request: " + e.getMessage());
    }
}
```

## Environment Variables

```bash
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
AZURE_COMMUNICATION_CONNECTION_STRING=endpoint=https://...;accesskey=...
CALLBACK_BASE_URL=https://your-app.com/api/callbacks
```
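
A small sketch of resolving the client endpoint from these variables. The helper name and fallback order are illustrative, not part of the SDK; the connection-string parsing follows the `endpoint=...;accesskey=...` shape shown above:

```java
import java.util.Map;

// Illustrative config helper (not an SDK type): prefer the explicit endpoint
// variable, otherwise extract the endpoint from the connection string.
class AcsConfig {
    static String endpointFromConnectionString(String conn) {
        for (String part : conn.split(";")) {
            if (part.startsWith("endpoint=")) {
                return part.substring("endpoint=".length());
            }
        }
        throw new IllegalArgumentException("connection string has no endpoint segment");
    }

    static String resolveEndpoint(Map<String, String> env) {
        if (env.containsKey("AZURE_COMMUNICATION_ENDPOINT")) {
            return env.get("AZURE_COMMUNICATION_ENDPOINT");
        }
        if (env.containsKey("AZURE_COMMUNICATION_CONNECTION_STRING")) {
            return endpointFromConnectionString(
                env.get("AZURE_COMMUNICATION_CONNECTION_STRING"));
        }
        throw new IllegalStateException("ACS endpoint not configured");
    }
}
```

In real code you would pass `System.getenv()` and feed the result to `CallAutomationClientBuilder.endpoint(...)`.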

## Trigger Phrases

- "call automation Java", "IVR Java", "interactive voice response"
- "call recording Java", "DTMF recognition Java"
- "text to speech call", "speech recognition call"
- "answer incoming call", "transfer call Java"
- "Azure Communication Services call automation"

(new file, 91 lines)
---
|
||||
name: azure-communication-callingserver-java
|
||||
description: Azure Communication Services CallingServer (legacy) Java SDK. Note - This SDK is deprecated. Use azure-communication-callautomation instead for new projects. Only use this skill when maintaining legacy code.
|
||||
package: com.azure:azure-communication-callingserver
|
||||
---
|
||||
|
||||
# Azure Communication CallingServer (Java) - DEPRECATED
|
||||
|
||||
> **⚠️ DEPRECATED**: This SDK has been renamed to **Call Automation**. For new projects, use `azure-communication-callautomation` instead. This skill is for maintaining legacy code only.
|
||||
|
||||
## Migration to Call Automation
|
||||
|
||||
```xml
|
||||
<!-- OLD (deprecated) -->
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-communication-callingserver</artifactId>
|
||||
<version>1.0.0-beta.5</version>
|
||||
</dependency>
|
||||
|
||||
<!-- NEW (use this instead) -->
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-communication-callautomation</artifactId>
|
||||
<version>1.6.0</version>
|
||||
</dependency>
|
||||
```

## Class Name Changes

| CallingServer (Old) | Call Automation (New) |
|---------------------|----------------------|
| `CallingServerClient` | `CallAutomationClient` |
| `CallingServerClientBuilder` | `CallAutomationClientBuilder` |
| `CallConnection` | `CallConnection` (same) |
| `ServerCall` | Removed - use `CallConnection` |

## Legacy Client Creation

```java
// OLD WAY (deprecated)
import com.azure.communication.callingserver.CallingServerClient;
import com.azure.communication.callingserver.CallingServerClientBuilder;

CallingServerClient client = new CallingServerClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();

// NEW WAY
import com.azure.communication.callautomation.CallAutomationClient;
import com.azure.communication.callautomation.CallAutomationClientBuilder;

CallAutomationClient client = new CallAutomationClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();
```

## Legacy Recording

```java
// OLD WAY
StartRecordingOptions options = new StartRecordingOptions(serverCallId)
    .setRecordingStateCallbackUri(callbackUri);

StartCallRecordingResult result = client.startRecording(options);
String recordingId = result.getRecordingId();

client.pauseRecording(recordingId);
client.resumeRecording(recordingId);
client.stopRecording(recordingId);

// NEW WAY - see azure-communication-callautomation skill
```

## For New Development

**Do not use this SDK for new projects.**

See the `azure-communication-callautomation-java` skill for:
- Making outbound calls
- Answering incoming calls
- Call recording
- DTMF recognition
- Text-to-speech / speech-to-text
- Adding/removing participants
- Call transfer

## Trigger Phrases

- "callingserver legacy", "deprecated calling SDK"
- "migrate callingserver to callautomation"

skills/official/microsoft/java/communication/chat/SKILL.md (new file, 310 lines)

---
name: azure-communication-chat-java
description: Build real-time chat applications with Azure Communication Services Chat Java SDK. Use when implementing chat threads, messaging, participants, read receipts, typing notifications, or real-time chat features.
package: com.azure:azure-communication-chat
---

# Azure Communication Chat (Java)

Build real-time chat applications with thread management, messaging, participants, and read receipts.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-chat</artifactId>
    <version>1.6.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.communication.chat.ChatClient;
import com.azure.communication.chat.ChatClientBuilder;
import com.azure.communication.chat.ChatThreadClient;
import com.azure.communication.common.CommunicationTokenCredential;

// ChatClient requires a CommunicationTokenCredential (user access token)
String endpoint = "https://<resource>.communication.azure.com";
String userAccessToken = "<user-access-token>";

CommunicationTokenCredential credential = new CommunicationTokenCredential(userAccessToken);

ChatClient chatClient = new ChatClientBuilder()
    .endpoint(endpoint)
    .credential(credential)
    .buildClient();

// Async client
ChatAsyncClient chatAsyncClient = new ChatClientBuilder()
    .endpoint(endpoint)
    .credential(credential)
    .buildAsyncClient();
```

## Key Concepts

| Class | Purpose |
|-------|---------|
| `ChatClient` | Create/delete chat threads, get thread clients |
| `ChatThreadClient` | Operations within a thread (messages, participants, receipts) |
| `ChatParticipant` | User in a chat thread with display name |
| `ChatMessage` | Message content, type, sender info, timestamps |
| `ChatMessageReadReceipt` | Read receipt tracking per participant |

## Create Chat Thread

```java
import com.azure.communication.chat.models.*;
import com.azure.communication.common.CommunicationUserIdentifier;
import java.util.ArrayList;
import java.util.List;

// Define participants
List<ChatParticipant> participants = new ArrayList<>();

ChatParticipant participant1 = new ChatParticipant()
    .setCommunicationIdentifier(new CommunicationUserIdentifier("<user-id-1>"))
    .setDisplayName("Alice");

ChatParticipant participant2 = new ChatParticipant()
    .setCommunicationIdentifier(new CommunicationUserIdentifier("<user-id-2>"))
    .setDisplayName("Bob");

participants.add(participant1);
participants.add(participant2);

// Create thread
CreateChatThreadOptions options = new CreateChatThreadOptions("Project Discussion")
    .setParticipants(participants);

CreateChatThreadResult result = chatClient.createChatThread(options);
String threadId = result.getChatThread().getId();

// Get thread client for operations
ChatThreadClient threadClient = chatClient.getChatThreadClient(threadId);
```

## Send Messages

```java
// Send text message
SendChatMessageOptions messageOptions = new SendChatMessageOptions()
    .setContent("Hello, team!")
    .setSenderDisplayName("Alice")
    .setType(ChatMessageType.TEXT);

SendChatMessageResult sendResult = threadClient.sendMessage(messageOptions);
String messageId = sendResult.getId();

// Send HTML message
SendChatMessageOptions htmlOptions = new SendChatMessageOptions()
    .setContent("<strong>Important:</strong> Meeting at 3pm")
    .setType(ChatMessageType.HTML);

threadClient.sendMessage(htmlOptions);
```

## Get Messages

```java
import com.azure.core.http.rest.PagedIterable;

// List all messages
PagedIterable<ChatMessage> messages = threadClient.listMessages();

for (ChatMessage message : messages) {
    System.out.println("ID: " + message.getId());
    System.out.println("Type: " + message.getType());
    System.out.println("Content: " + message.getContent().getMessage());
    System.out.println("Sender: " + message.getSenderDisplayName());
    System.out.println("Created: " + message.getCreatedOn());

    // Check if edited or deleted
    if (message.getEditedOn() != null) {
        System.out.println("Edited: " + message.getEditedOn());
    }
    if (message.getDeletedOn() != null) {
        System.out.println("Deleted: " + message.getDeletedOn());
    }
}

// Get specific message
ChatMessage message = threadClient.getMessage(messageId);
```

## Update and Delete Messages

```java
// Update message
UpdateChatMessageOptions updateOptions = new UpdateChatMessageOptions()
    .setContent("Updated message content");

threadClient.updateMessage(messageId, updateOptions);

// Delete message
threadClient.deleteMessage(messageId);
```

## Manage Participants

```java
// List participants
PagedIterable<ChatParticipant> participants = threadClient.listParticipants();

for (ChatParticipant participant : participants) {
    CommunicationUserIdentifier user =
        (CommunicationUserIdentifier) participant.getCommunicationIdentifier();
    System.out.println("User: " + user.getId());
    System.out.println("Display Name: " + participant.getDisplayName());
}

// Add participants
List<ChatParticipant> newParticipants = new ArrayList<>();
newParticipants.add(new ChatParticipant()
    .setCommunicationIdentifier(new CommunicationUserIdentifier("<new-user-id>"))
    .setDisplayName("Charlie")
    .setShareHistoryTime(OffsetDateTime.now().minusDays(7))); // Share last 7 days

threadClient.addParticipants(newParticipants);

// Remove participant
CommunicationUserIdentifier userToRemove = new CommunicationUserIdentifier("<user-id>");
threadClient.removeParticipant(userToRemove);
```

## Read Receipts

```java
// Send read receipt
threadClient.sendReadReceipt(messageId);

// Get read receipts
PagedIterable<ChatMessageReadReceipt> receipts = threadClient.listReadReceipts();

for (ChatMessageReadReceipt receipt : receipts) {
    System.out.println("Message ID: " + receipt.getChatMessageId());
    System.out.println("Read by: " + receipt.getSenderCommunicationIdentifier());
    System.out.println("Read at: " + receipt.getReadOn());
}
```
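
Receipts arrive one per participant, so an app typically folds them into a per-message read set. A minimal sketch using plain local types for illustration (the SDK's `ChatMessageReadReceipt` is reduced to an id pair here):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Illustrative aggregation (not an SDK type): group reader ids by message id.
class ReadReceiptIndex {
    // Each receipt reduced to (chatMessageId, readerId) for the sketch.
    record Receipt(String chatMessageId, String readerId) {}

    static Map<String, Set<String>> readersByMessage(List<Receipt> receipts) {
        Map<String, Set<String>> index = new HashMap<>();
        for (Receipt r : receipts) {
            index.computeIfAbsent(r.chatMessageId(), k -> new HashSet<>())
                 .add(r.readerId());
        }
        return index;
    }
}
```

A message is "read by all" when its set covers every participant id returned by `listParticipants()`.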

## Typing Notifications

```java
import com.azure.communication.chat.models.TypingNotificationOptions;

// Send typing notification
TypingNotificationOptions typingOptions = new TypingNotificationOptions()
    .setSenderDisplayName("Alice");

threadClient.sendTypingNotificationWithResponse(typingOptions, Context.NONE);

// Simple typing notification
threadClient.sendTypingNotification();
```
|
||||
|
||||
## Thread Operations
|
||||
|
||||
```java
|
||||
// Get thread properties
|
||||
ChatThreadProperties properties = threadClient.getProperties();
|
||||
System.out.println("Topic: " + properties.getTopic());
|
||||
System.out.println("Created: " + properties.getCreatedOn());
|
||||
|
||||
// Update topic
|
||||
threadClient.updateTopic("New Project Discussion Topic");
|
||||
|
||||
// Delete thread
|
||||
chatClient.deleteChatThread(threadId);
|
||||
```
|
||||
|
||||
## List Threads
|
||||
|
||||
```java
|
||||
// List all chat threads for the user
|
||||
PagedIterable<ChatThreadItem> threads = chatClient.listChatThreads();
|
||||
|
||||
for (ChatThreadItem thread : threads) {
|
||||
System.out.println("Thread ID: " + thread.getId());
|
||||
System.out.println("Topic: " + thread.getTopic());
|
||||
System.out.println("Last message: " + thread.getLastMessageReceivedOn());
|
||||
}
|
||||
```
|
||||
|
||||
## Pagination
|
||||
|
||||
```java
|
||||
import com.azure.core.http.rest.PagedResponse;
|
||||
|
||||
// Paginate through messages
|
||||
int maxPageSize = 10;
|
||||
ListChatMessagesOptions listOptions = new ListChatMessagesOptions()
|
||||
.setMaxPageSize(maxPageSize);
|
||||
|
||||
PagedIterable<ChatMessage> pagedMessages = threadClient.listMessages(listOptions);
|
||||
|
||||
pagedMessages.iterableByPage().forEach(page -> {
|
||||
System.out.println("Page status code: " + page.getStatusCode());
|
||||
page.getElements().forEach(msg ->
|
||||
System.out.println("Message: " + msg.getContent().getMessage()));
|
||||
});
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
```java
|
||||
import com.azure.core.exception.HttpResponseException;
|
||||
|
||||
try {
|
||||
threadClient.sendMessage(messageOptions);
|
||||
} catch (HttpResponseException e) {
|
||||
switch (e.getResponse().getStatusCode()) {
|
||||
case 401:
|
||||
System.out.println("Unauthorized - check token");
|
||||
break;
|
||||
case 403:
|
||||
System.out.println("Forbidden - user not in thread");
|
||||
break;
|
||||
case 404:
|
||||
System.out.println("Thread not found");
|
||||
break;
|
||||
default:
|
||||
System.out.println("Error: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Message Types
|
||||
|
||||
| Type | Description |
|
||||
|------|-------------|
|
||||
| `TEXT` | Regular chat message |
|
||||
| `HTML` | HTML-formatted message |
|
||||
| `TOPIC_UPDATED` | System message - topic changed |
|
||||
| `PARTICIPANT_ADDED` | System message - participant joined |
|
||||
| `PARTICIPANT_REMOVED` | System message - participant left |
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
|
||||
AZURE_COMMUNICATION_USER_TOKEN=<user-access-token>
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Token Management** - User tokens expire; implement refresh logic with `CommunicationTokenRefreshOptions`
|
||||
2. **Pagination** - Use `listMessages(options)` with `maxPageSize` for large threads
|
||||
3. **Share History** - Set `shareHistoryTime` when adding participants to control message visibility
|
||||
4. **Message Types** - Filter system messages (`PARTICIPANT_ADDED`, etc.) from user messages
|
||||
5. **Read Receipts** - Send receipts only when messages are actually viewed by user
|
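Practice 4 above (separating system messages from user content) can be sketched as a plain type check. The class and helper names here are illustrative, not SDK API; with the SDK you would compare `message.getType()` against the `ChatMessageType` values listed in the Message Types table:

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class ChatMessageFilter {

    // System event types from the Message Types table (not user content).
    private static final Set<String> SYSTEM_TYPES = Set.of(
        "TOPIC_UPDATED", "PARTICIPANT_ADDED", "PARTICIPANT_REMOVED");

    /** True for regular user content (TEXT or HTML). */
    public static boolean isUserMessage(String messageType) {
        return !SYSTEM_TYPES.contains(messageType);
    }

    /** Keep only user-authored messages from a list of type names. */
    public static List<String> userMessagesOnly(List<String> types) {
        return types.stream()
            .filter(ChatMessageFilter::isUserMessage)
            .collect(Collectors.toList());
    }
}
```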

## Trigger Phrases

- "chat application Java", "real-time messaging Java"
- "chat thread", "chat participants", "chat messages"
- "read receipts", "typing notifications"
- "Azure Communication Services chat"
304
skills/official/microsoft/java/communication/common/SKILL.md
Normal file
@@ -0,0 +1,304 @@
|
||||
---
|
||||
name: azure-communication-common-java
|
||||
description: Azure Communication Services common utilities for Java. Use when working with CommunicationTokenCredential, user identifiers, token refresh, or shared authentication across ACS services.
|
||||
package: com.azure:azure-communication-common
|
||||
---
|
||||
|
||||
# Azure Communication Common (Java)
|
||||
|
||||
Shared authentication utilities and data structures for Azure Communication Services.
|
||||
|
||||
## Installation
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-communication-common</artifactId>
|
||||
<version>1.4.0</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
## Key Concepts
|
||||
|
||||
| Class | Purpose |
|
||||
|-------|---------|
|
||||
| `CommunicationTokenCredential` | Authenticate users with ACS services |
|
||||
| `CommunicationTokenRefreshOptions` | Configure automatic token refresh |
|
||||
| `CommunicationUserIdentifier` | Identify ACS users |
|
||||
| `PhoneNumberIdentifier` | Identify PSTN phone numbers |
|
||||
| `MicrosoftTeamsUserIdentifier` | Identify Teams users |
|
||||
| `UnknownIdentifier` | Generic identifier for unknown types |
|
||||
|
||||
## CommunicationTokenCredential
|
||||
|
||||
### Static Token (Short-lived Clients)
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.CommunicationTokenCredential;
|
||||
|
||||
// Simple static token - no refresh
|
||||
String userToken = "<user-access-token>";
|
||||
CommunicationTokenCredential credential = new CommunicationTokenCredential(userToken);
|
||||
|
||||
// Use with Chat, Calling, etc.
|
||||
ChatClient chatClient = new ChatClientBuilder()
|
||||
.endpoint("https://<resource>.communication.azure.com")
|
||||
.credential(credential)
|
||||
.buildClient();
|
||||
```
|
||||
|
||||
### Proactive Token Refresh (Long-lived Clients)
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.CommunicationTokenRefreshOptions;
|
||||
import java.util.concurrent.Callable;
|
||||
|
||||
// Token refresher callback - called when token is about to expire
|
||||
Callable<String> tokenRefresher = () -> {
|
||||
// Call your server to get a fresh token
|
||||
return fetchNewTokenFromServer();
|
||||
};
|
||||
|
||||
// With proactive refresh
|
||||
CommunicationTokenRefreshOptions refreshOptions = new CommunicationTokenRefreshOptions(tokenRefresher)
|
||||
.setRefreshProactively(true) // Refresh before expiry
|
||||
.setInitialToken(currentToken); // Optional initial token
|
||||
|
||||
CommunicationTokenCredential credential = new CommunicationTokenCredential(refreshOptions);
|
||||
```
|
||||
|
||||
### Async Token Refresh
|
||||
|
||||
```java
|
||||
import java.util.concurrent.CompletableFuture;
|
||||
|
||||
// Async token fetcher
|
||||
Callable<String> asyncRefresher = () -> {
|
||||
CompletableFuture<String> future = fetchTokenAsync();
|
||||
return future.get(); // Block until token is available
|
||||
};
|
||||
|
||||
CommunicationTokenRefreshOptions options = new CommunicationTokenRefreshOptions(asyncRefresher)
|
||||
.setRefreshProactively(true);
|
||||
|
||||
CommunicationTokenCredential credential = new CommunicationTokenCredential(options);
|
||||
```
|
||||
|
||||
## Entra ID (Azure AD) Authentication
|
||||
|
||||
```java
|
||||
import com.azure.identity.InteractiveBrowserCredentialBuilder;
|
||||
import com.azure.communication.common.EntraCommunicationTokenCredentialOptions;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
// For Teams Phone Extensibility
|
||||
InteractiveBrowserCredential entraCredential = new InteractiveBrowserCredentialBuilder()
|
||||
.clientId("<your-client-id>")
|
||||
.tenantId("<your-tenant-id>")
|
||||
.redirectUrl("<your-redirect-uri>")
|
||||
.build();
|
||||
|
||||
String resourceEndpoint = "https://<resource>.communication.azure.com";
|
||||
List<String> scopes = Arrays.asList(
|
||||
"https://auth.msft.communication.azure.com/TeamsExtension.ManageCalls"
|
||||
);
|
||||
|
||||
EntraCommunicationTokenCredentialOptions entraOptions =
|
||||
new EntraCommunicationTokenCredentialOptions(entraCredential, resourceEndpoint)
|
||||
.setScopes(scopes);
|
||||
|
||||
CommunicationTokenCredential credential = new CommunicationTokenCredential(entraOptions);
|
||||
```
|
||||
|
||||
## Communication Identifiers
|
||||
|
||||
### CommunicationUserIdentifier
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.CommunicationUserIdentifier;
|
||||
|
||||
// Create identifier for ACS user
|
||||
CommunicationUserIdentifier user = new CommunicationUserIdentifier("8:acs:resource-id_user-id");
|
||||
|
||||
// Get raw ID
|
||||
String rawId = user.getId();
|
||||
```
|
||||
|
||||
### PhoneNumberIdentifier
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.PhoneNumberIdentifier;
|
||||
|
||||
// E.164 format phone number
|
||||
PhoneNumberIdentifier phone = new PhoneNumberIdentifier("+14255551234");
|
||||
|
||||
String phoneNumber = phone.getPhoneNumber(); // "+14255551234"
|
||||
String rawId = phone.getRawId(); // "4:+14255551234"
|
||||
```
|
||||
|
||||
### MicrosoftTeamsUserIdentifier
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.MicrosoftTeamsUserIdentifier;
|
||||
|
||||
// Teams user identifier
|
||||
MicrosoftTeamsUserIdentifier teamsUser = new MicrosoftTeamsUserIdentifier("<teams-user-id>")
|
||||
.setCloudEnvironment(CommunicationCloudEnvironment.PUBLIC);
|
||||
|
||||
// For anonymous Teams users
|
||||
MicrosoftTeamsUserIdentifier anonymousTeamsUser = new MicrosoftTeamsUserIdentifier("<teams-user-id>")
|
||||
.setAnonymous(true);
|
||||
```
|
||||
|
||||
### UnknownIdentifier
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.UnknownIdentifier;
|
||||
|
||||
// For identifiers of unknown type
|
||||
UnknownIdentifier unknown = new UnknownIdentifier("some-raw-id");
|
||||
```
|
||||
|
||||
## Identifier Parsing
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.CommunicationIdentifier;
|
||||
import com.azure.communication.common.CommunicationIdentifierModel;
|
||||
|
||||
// Parse raw ID to appropriate type
|
||||
public CommunicationIdentifier parseIdentifier(String rawId) {
|
||||
if (rawId.startsWith("8:acs:")) {
|
||||
return new CommunicationUserIdentifier(rawId);
|
||||
} else if (rawId.startsWith("4:")) {
|
||||
String phone = rawId.substring(2);
|
||||
return new PhoneNumberIdentifier(phone);
|
||||
} else if (rawId.startsWith("8:orgid:")) {
|
||||
String teamsId = rawId.substring(8);
|
||||
return new MicrosoftTeamsUserIdentifier(teamsId);
|
||||
} else {
|
||||
return new UnknownIdentifier(rawId);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Type Checking Identifiers
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.CommunicationIdentifier;
|
||||
|
||||
public void processIdentifier(CommunicationIdentifier identifier) {
|
||||
if (identifier instanceof CommunicationUserIdentifier) {
|
||||
CommunicationUserIdentifier user = (CommunicationUserIdentifier) identifier;
|
||||
System.out.println("ACS User: " + user.getId());
|
||||
|
||||
} else if (identifier instanceof PhoneNumberIdentifier) {
|
||||
PhoneNumberIdentifier phone = (PhoneNumberIdentifier) identifier;
|
||||
System.out.println("Phone: " + phone.getPhoneNumber());
|
||||
|
||||
} else if (identifier instanceof MicrosoftTeamsUserIdentifier) {
|
||||
MicrosoftTeamsUserIdentifier teams = (MicrosoftTeamsUserIdentifier) identifier;
|
||||
System.out.println("Teams User: " + teams.getUserId());
|
||||
System.out.println("Anonymous: " + teams.isAnonymous());
|
||||
|
||||
} else if (identifier instanceof UnknownIdentifier) {
|
||||
UnknownIdentifier unknown = (UnknownIdentifier) identifier;
|
||||
System.out.println("Unknown: " + unknown.getId());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Token Access
|
||||
|
||||
```java
|
||||
import com.azure.core.credential.AccessToken;
|
||||
|
||||
// Get current token (for debugging/logging - don't expose!)
|
||||
CommunicationTokenCredential credential = new CommunicationTokenCredential(token);
|
||||
|
||||
// Sync access
|
||||
AccessToken accessToken = credential.getToken();
|
||||
System.out.println("Token expires: " + accessToken.getExpiresAt());
|
||||
|
||||
// Async access
|
||||
credential.getTokenAsync()
|
||||
.subscribe(token -> {
|
||||
System.out.println("Token: " + token.getToken().substring(0, 20) + "...");
|
||||
System.out.println("Expires: " + token.getExpiresAt());
|
||||
});
|
||||
```
|
||||
|
||||
## Dispose Credential
|
||||
|
||||
```java
|
||||
// Clean up when done
|
||||
credential.close();
|
||||
|
||||
// Or use try-with-resources
|
||||
try (CommunicationTokenCredential cred = new CommunicationTokenCredential(options)) {
|
||||
// Use credential
|
||||
chatClient.doSomething();
|
||||
}
|
||||
```
|
||||
|
||||
## Cloud Environments
|
||||
|
||||
```java
|
||||
import com.azure.communication.common.CommunicationCloudEnvironment;
|
||||
|
||||
// Available environments
|
||||
CommunicationCloudEnvironment publicCloud = CommunicationCloudEnvironment.PUBLIC;
|
||||
CommunicationCloudEnvironment govCloud = CommunicationCloudEnvironment.GCCH;
|
||||
CommunicationCloudEnvironment dodCloud = CommunicationCloudEnvironment.DOD;
|
||||
|
||||
// Set on Teams identifier
|
||||
MicrosoftTeamsUserIdentifier teamsUser = new MicrosoftTeamsUserIdentifier("<user-id>")
|
||||
.setCloudEnvironment(CommunicationCloudEnvironment.GCCH);
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
|
||||
AZURE_COMMUNICATION_USER_TOKEN=<user-access-token>
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Proactive Refresh** - Always use `setRefreshProactively(true)` for long-lived clients
|
||||
2. **Token Security** - Never log or expose full tokens
|
||||
3. **Close Credentials** - Dispose of credentials when no longer needed
|
||||
4. **Error Handling** - Handle token refresh failures gracefully
|
||||
5. **Identifier Types** - Use specific identifier types, not raw strings
|
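Practice 2 (never log full tokens) can be made concrete with a small masking helper applied before any token reaches a log line, matching the truncated-token logging shown in the Token Access section. The class and method names are illustrative, not part of the SDK:

```java
public class TokenLogging {

    /**
     * Mask a user access token for safe logging: keep a short prefix,
     * redact the rest, and report only the length.
     */
    public static String mask(String token) {
        if (token == null || token.length() <= 12) {
            return "***";
        }
        return token.substring(0, 8) + "...(" + token.length() + " chars)";
    }
}
```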

## Common Usage Patterns

```java
// Pattern: Create credential for Chat/Calling client
public ChatClient createChatClient(String token, String endpoint) {
    CommunicationTokenRefreshOptions refreshOptions =
        new CommunicationTokenRefreshOptions(this::refreshToken)
            .setRefreshProactively(true)
            .setInitialToken(token);

    CommunicationTokenCredential credential =
        new CommunicationTokenCredential(refreshOptions);

    return new ChatClientBuilder()
        .endpoint(endpoint)
        .credential(credential)
        .buildClient();
}

private String refreshToken() {
    // Call your token endpoint
    return tokenService.getNewToken();
}
```

## Trigger Phrases

- "ACS authentication", "communication token credential"
- "user access token", "token refresh"
- "CommunicationUserIdentifier", "PhoneNumberIdentifier"
- "Azure Communication Services authentication"
274
skills/official/microsoft/java/communication/sms/SKILL.md
Normal file
@@ -0,0 +1,274 @@
|
||||
---
|
||||
name: azure-communication-sms-java
|
||||
description: Send SMS messages with Azure Communication Services SMS Java SDK. Use when implementing SMS notifications, alerts, OTP delivery, bulk messaging, or delivery reports.
|
||||
package: com.azure:azure-communication-sms
|
||||
---
|
||||
|
||||
# Azure Communication SMS (Java)
|
||||
|
||||
Send SMS messages to single or multiple recipients with delivery reporting.
|
||||
|
||||
## Installation
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-communication-sms</artifactId>
|
||||
<version>1.2.0</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
## Client Creation
|
||||
|
||||
```java
|
||||
import com.azure.communication.sms.SmsClient;
|
||||
import com.azure.communication.sms.SmsClientBuilder;
|
||||
import com.azure.identity.DefaultAzureCredentialBuilder;
|
||||
|
||||
// With DefaultAzureCredential (recommended)
|
||||
SmsClient smsClient = new SmsClientBuilder()
|
||||
.endpoint("https://<resource>.communication.azure.com")
|
||||
.credential(new DefaultAzureCredentialBuilder().build())
|
||||
.buildClient();
|
||||
|
||||
// With connection string
|
||||
SmsClient smsClient = new SmsClientBuilder()
|
||||
.connectionString("<connection-string>")
|
||||
.buildClient();
|
||||
|
||||
// With AzureKeyCredential
|
||||
import com.azure.core.credential.AzureKeyCredential;
|
||||
|
||||
SmsClient smsClient = new SmsClientBuilder()
|
||||
.endpoint("https://<resource>.communication.azure.com")
|
||||
.credential(new AzureKeyCredential("<access-key>"))
|
||||
.buildClient();
|
||||
|
||||
// Async client
|
||||
SmsAsyncClient smsAsyncClient = new SmsClientBuilder()
|
||||
.connectionString("<connection-string>")
|
||||
.buildAsyncClient();
|
||||
```
|
||||
|
||||
## Send SMS to Single Recipient
|
||||
|
||||
```java
|
||||
import com.azure.communication.sms.models.SmsSendResult;
|
||||
|
||||
// Simple send
|
||||
SmsSendResult result = smsClient.send(
|
||||
"+14255550100", // From (your ACS phone number)
|
||||
"+14255551234", // To
|
||||
"Your verification code is 123456");
|
||||
|
||||
System.out.println("Message ID: " + result.getMessageId());
|
||||
System.out.println("To: " + result.getTo());
|
||||
System.out.println("Success: " + result.isSuccessful());
|
||||
|
||||
if (!result.isSuccessful()) {
|
||||
System.out.println("Error: " + result.getErrorMessage());
|
||||
System.out.println("Status: " + result.getHttpStatusCode());
|
||||
}
|
||||
```
|
||||
|
||||
## Send SMS to Multiple Recipients
|
||||
|
||||
```java
|
||||
import com.azure.communication.sms.models.SmsSendOptions;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
List<String> recipients = Arrays.asList(
|
||||
"+14255551111",
|
||||
"+14255552222",
|
||||
"+14255553333"
|
||||
);
|
||||
|
||||
// With options
|
||||
SmsSendOptions options = new SmsSendOptions()
|
||||
.setDeliveryReportEnabled(true)
|
||||
.setTag("marketing-campaign-001");
|
||||
|
||||
Iterable<SmsSendResult> results = smsClient.sendWithResponse(
|
||||
"+14255550100", // From
|
||||
recipients, // To list
|
||||
"Flash sale! 50% off today only.",
|
||||
options,
|
||||
Context.NONE
|
||||
).getValue();
|
||||
|
||||
for (SmsSendResult result : results) {
|
||||
if (result.isSuccessful()) {
|
||||
System.out.println("Sent to " + result.getTo() + ": " + result.getMessageId());
|
||||
} else {
|
||||
System.out.println("Failed to " + result.getTo() + ": " + result.getErrorMessage());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Send Options
|
||||
|
||||
```java
|
||||
SmsSendOptions options = new SmsSendOptions();
|
||||
|
||||
// Enable delivery reports (sent via Event Grid)
|
||||
options.setDeliveryReportEnabled(true);
|
||||
|
||||
// Add custom tag for tracking
|
||||
options.setTag("order-confirmation-12345");
|
||||
```
|
||||
|
||||
## Response Handling
|
||||
|
||||
```java
|
||||
import com.azure.core.http.rest.Response;
|
||||
|
||||
Response<Iterable<SmsSendResult>> response = smsClient.sendWithResponse(
|
||||
"+14255550100",
|
||||
Arrays.asList("+14255551234"),
|
||||
"Hello!",
|
||||
new SmsSendOptions().setDeliveryReportEnabled(true),
|
||||
Context.NONE
|
||||
);
|
||||
|
||||
// Check HTTP response
|
||||
System.out.println("Status code: " + response.getStatusCode());
|
||||
System.out.println("Headers: " + response.getHeaders());
|
||||
|
||||
// Process results
|
||||
for (SmsSendResult result : response.getValue()) {
|
||||
System.out.println("Message ID: " + result.getMessageId());
|
||||
System.out.println("Successful: " + result.isSuccessful());
|
||||
|
||||
if (!result.isSuccessful()) {
|
||||
System.out.println("HTTP Status: " + result.getHttpStatusCode());
|
||||
System.out.println("Error: " + result.getErrorMessage());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Async Operations
|
||||
|
||||
```java
|
||||
import reactor.core.publisher.Mono;
|
||||
|
||||
SmsAsyncClient asyncClient = new SmsClientBuilder()
|
||||
.connectionString("<connection-string>")
|
||||
.buildAsyncClient();
|
||||
|
||||
// Send single message
|
||||
asyncClient.send("+14255550100", "+14255551234", "Async message!")
|
||||
.subscribe(
|
||||
result -> System.out.println("Sent: " + result.getMessageId()),
|
||||
error -> System.out.println("Error: " + error.getMessage())
|
||||
);
|
||||
|
||||
// Send to multiple with options
|
||||
SmsSendOptions options = new SmsSendOptions()
|
||||
.setDeliveryReportEnabled(true);
|
||||
|
||||
asyncClient.sendWithResponse(
|
||||
"+14255550100",
|
||||
Arrays.asList("+14255551111", "+14255552222"),
|
||||
"Bulk async message",
|
||||
options)
|
||||
.subscribe(response -> {
|
||||
for (SmsSendResult result : response.getValue()) {
|
||||
System.out.println("Result: " + result.getTo() + " - " + result.isSuccessful());
|
||||
}
|
||||
});
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
```java
|
||||
import com.azure.core.exception.HttpResponseException;
|
||||
|
||||
try {
|
||||
SmsSendResult result = smsClient.send(
|
||||
"+14255550100",
|
||||
"+14255551234",
|
||||
"Test message"
|
||||
);
|
||||
|
||||
// Individual message errors don't throw exceptions
|
||||
if (!result.isSuccessful()) {
|
||||
handleMessageError(result);
|
||||
}
|
||||
|
||||
} catch (HttpResponseException e) {
|
||||
// Request-level failures (auth, network, etc.)
|
||||
System.out.println("Request failed: " + e.getMessage());
|
||||
System.out.println("Status: " + e.getResponse().getStatusCode());
|
||||
} catch (RuntimeException e) {
|
||||
System.out.println("Unexpected error: " + e.getMessage());
|
||||
}
|
||||
|
||||
private void handleMessageError(SmsSendResult result) {
|
||||
int status = result.getHttpStatusCode();
|
||||
String error = result.getErrorMessage();
|
||||
|
||||
if (status == 400) {
|
||||
System.out.println("Invalid phone number: " + result.getTo());
|
||||
} else if (status == 429) {
|
||||
System.out.println("Rate limited - retry later");
|
||||
} else {
|
||||
System.out.println("Error " + status + ": " + error);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Delivery Reports
|
||||
|
||||
Delivery reports are sent via Azure Event Grid. Configure an Event Grid subscription for your ACS resource.
|
||||
|
||||
```java
|
||||
// Event Grid webhook handler (in your endpoint)
|
||||
public void handleDeliveryReport(String eventJson) {
|
||||
// Parse Event Grid event
|
||||
// Event type: Microsoft.Communication.SMSDeliveryReportReceived
|
||||
|
||||
// Event data contains:
|
||||
// - messageId: correlates to SmsSendResult.getMessageId()
|
||||
// - from: sender number
|
||||
// - to: recipient number
|
||||
// - deliveryStatus: "Delivered", "Failed", etc.
|
||||
// - deliveryStatusDetails: detailed status
|
||||
// - receivedTimestamp: when status was received
|
||||
// - tag: your custom tag from SmsSendOptions
|
||||
}
|
||||
```
|
||||
|
||||
## SmsSendResult Properties
|
||||
|
||||
| Property | Type | Description |
|
||||
|----------|------|-------------|
|
||||
| `getMessageId()` | String | Unique message identifier |
|
||||
| `getTo()` | String | Recipient phone number |
|
||||
| `isSuccessful()` | boolean | Whether send succeeded |
|
||||
| `getHttpStatusCode()` | int | HTTP status for this recipient |
|
||||
| `getErrorMessage()` | String | Error details if failed |
|
||||
| `getRepeatabilityResult()` | RepeatabilityResult | Idempotency result |
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
|
||||
AZURE_COMMUNICATION_CONNECTION_STRING=endpoint=https://...;accesskey=...
|
||||
SMS_FROM_NUMBER=+14255550100
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Phone Number Format** - Use E.164 format: `+[country code][number]`
|
||||
2. **Delivery Reports** - Enable for critical messages (OTP, alerts)
|
||||
3. **Tagging** - Use tags to correlate messages with business context
|
||||
4. **Error Handling** - Check `isSuccessful()` for each recipient individually
|
||||
5. **Rate Limiting** - Implement retry with backoff for 429 responses
|
||||
6. **Bulk Sending** - Use batch send for multiple recipients (more efficient)
|
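Practice 5 (retry with backoff on 429) can be sketched as a small generic policy. This is an illustrative helper, not SDK API; in real code the supplier would call `smsClient.send(...)` and the status extractor would read `SmsSendResult.getHttpStatusCode()`:

```java
import java.util.function.Supplier;
import java.util.function.ToIntFunction;

public class SmsRetryPolicy {

    /** Exponential backoff: baseMillis * 2^attempt, capped at 30 seconds. */
    public static long backoffMillis(int attempt, long baseMillis) {
        return Math.min(30_000L, baseMillis << Math.min(attempt, 14));
    }

    /**
     * Re-invoke a send while the extracted status code is 429 (rate limited),
     * sleeping with exponential backoff between attempts.
     */
    public static <T> T sendWithRetry(Supplier<T> send, ToIntFunction<T> statusOf,
                                      int maxAttempts, long baseMillis) throws InterruptedException {
        T result = send.get();
        for (int attempt = 0; attempt < maxAttempts - 1 && statusOf.applyAsInt(result) == 429; attempt++) {
            Thread.sleep(backoffMillis(attempt, baseMillis));
            result = send.get();
        }
        return result;
    }
}
```

A jittered delay (randomizing within the backoff window) further reduces the chance of synchronized retries across senders.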

## Trigger Phrases

- "send SMS Java", "text message Java"
- "SMS notification", "OTP SMS", "bulk SMS"
- "delivery report SMS", "Azure Communication Services SMS"
379
skills/official/microsoft/java/compute/batch/SKILL.md
Normal file
@@ -0,0 +1,379 @@
|
||||
---
|
||||
name: azure-compute-batch-java
|
||||
description: |
|
||||
Azure Batch SDK for Java. Run large-scale parallel and HPC batch jobs with pools, jobs, tasks, and compute nodes.
|
||||
Triggers: "BatchClient java", "azure batch java", "batch pool java", "batch job java", "HPC java", "parallel computing java".
|
||||
---
|
||||
|
||||
# Azure Batch SDK for Java
|
||||
|
||||
Client library for running large-scale parallel and high-performance computing (HPC) batch jobs in Azure.
|
||||
|
||||
## Installation
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-compute-batch</artifactId>
|
||||
<version>1.0.0-beta.5</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- Azure Batch account
|
||||
- Pool configured with compute nodes
|
||||
- Azure subscription
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_BATCH_ENDPOINT=https://<account>.<region>.batch.azure.com
|
||||
AZURE_BATCH_ACCOUNT=<account-name>
|
||||
AZURE_BATCH_ACCESS_KEY=<account-key>
|
||||
```
|
||||
|
||||
## Client Creation
|
||||
|
||||
### With Microsoft Entra ID (Recommended)
|
||||
|
||||
```java
|
||||
import com.azure.compute.batch.BatchClient;
|
||||
import com.azure.compute.batch.BatchClientBuilder;
|
||||
import com.azure.identity.DefaultAzureCredentialBuilder;
|
||||
|
||||
BatchClient batchClient = new BatchClientBuilder()
|
||||
.credential(new DefaultAzureCredentialBuilder().build())
|
||||
.endpoint(System.getenv("AZURE_BATCH_ENDPOINT"))
|
||||
.buildClient();
|
||||
```
|
||||
|
||||
### Async Client
|
||||
|
||||
```java
|
||||
import com.azure.compute.batch.BatchAsyncClient;
|
||||
|
||||
BatchAsyncClient batchAsyncClient = new BatchClientBuilder()
|
||||
.credential(new DefaultAzureCredentialBuilder().build())
|
||||
.endpoint(System.getenv("AZURE_BATCH_ENDPOINT"))
|
||||
.buildAsyncClient();
|
||||
```
|
||||
|
||||
### With Shared Key Credentials
|
||||
|
||||
```java
|
||||
import com.azure.core.credential.AzureNamedKeyCredential;
|
||||
|
||||
String accountName = System.getenv("AZURE_BATCH_ACCOUNT");
|
||||
String accountKey = System.getenv("AZURE_BATCH_ACCESS_KEY");
|
||||
AzureNamedKeyCredential sharedKeyCreds = new AzureNamedKeyCredential(accountName, accountKey);
|
||||
|
||||
BatchClient batchClient = new BatchClientBuilder()
|
||||
.credential(sharedKeyCreds)
|
||||
.endpoint(System.getenv("AZURE_BATCH_ENDPOINT"))
|
||||
.buildClient();
|
||||
```
|
||||
|
||||
## Key Concepts
|
||||
|
||||
| Concept | Description |
|
||||
|---------|-------------|
|
||||
| Pool | Collection of compute nodes that run tasks |
|
||||
| Job | Logical grouping of tasks |
|
||||
| Task | Unit of computation (command/script) |
|
||||
| Node | VM that executes tasks |
|
||||
| Job Schedule | Recurring job creation |
|
||||
|
||||
## Pool Operations
|
||||
|
||||
### Create Pool
|
||||
|
||||
```java
|
||||
import com.azure.compute.batch.models.*;
|
||||
|
||||
batchClient.createPool(new BatchPoolCreateParameters("myPoolId", "STANDARD_DC2s_V2")
|
||||
.setVirtualMachineConfiguration(
|
||||
new VirtualMachineConfiguration(
|
||||
new BatchVmImageReference()
|
||||
.setPublisher("Canonical")
|
||||
.setOffer("UbuntuServer")
|
||||
.setSku("22_04-lts")
|
||||
.setVersion("latest"),
|
||||
"batch.node.ubuntu 22.04"))
|
||||
.setTargetDedicatedNodes(2)
|
||||
.setTargetLowPriorityNodes(0), null);
|
||||
```
|
||||
|
||||
### Get Pool
|
||||
|
||||
```java
|
||||
BatchPool pool = batchClient.getPool("myPoolId");
|
||||
System.out.println("Pool state: " + pool.getState());
|
||||
System.out.println("Current dedicated nodes: " + pool.getCurrentDedicatedNodes());
|
||||
```
|
||||
|
||||
### List Pools
|
||||
|
||||
```java
|
||||
import com.azure.core.http.rest.PagedIterable;
|
||||
|
||||
PagedIterable<BatchPool> pools = batchClient.listPools();
|
||||
for (BatchPool pool : pools) {
|
||||
System.out.println("Pool: " + pool.getId() + ", State: " + pool.getState());
|
||||
}
|
||||
```
|
||||
|
||||
### Resize Pool
|
||||
|
||||
```java
|
||||
import com.azure.core.util.polling.SyncPoller;
|
||||
|
||||
BatchPoolResizeParameters resizeParams = new BatchPoolResizeParameters()
|
||||
.setTargetDedicatedNodes(4)
|
||||
.setTargetLowPriorityNodes(2);
|
||||
|
||||
SyncPoller<BatchPool, BatchPool> poller = batchClient.beginResizePool("myPoolId", resizeParams);
|
||||
poller.waitForCompletion();
|
||||
BatchPool resizedPool = poller.getFinalResult();
|
||||
```
|
||||
|
||||
### Enable AutoScale
|
||||
|
||||
```java
|
||||
BatchPoolEnableAutoScaleParameters autoScaleParams = new BatchPoolEnableAutoScaleParameters()
|
||||
.setAutoScaleEvaluationInterval(Duration.ofMinutes(5))
|
||||
.setAutoScaleFormula("$TargetDedicatedNodes = min(10, $PendingTasks.GetSample(TimeInterval_Minute * 5));");
|
||||
|
||||
batchClient.enablePoolAutoScale("myPoolId", autoScaleParams);
|
||||
```
|
||||
|
||||
### Delete Pool

```java
SyncPoller<BatchPool, Void> deletePoller = batchClient.beginDeletePool("myPoolId");
deletePoller.waitForCompletion();
```

## Job Operations

### Create Job

```java
batchClient.createJob(
    new BatchJobCreateParameters("myJobId", new BatchPoolInfo().setPoolId("myPoolId"))
        .setPriority(100)
        .setConstraints(new BatchJobConstraints()
            .setMaxWallClockTime(Duration.ofHours(24))
            .setMaxTaskRetryCount(3)),
    null);
```

### Get Job

```java
BatchJob job = batchClient.getJob("myJobId", null, null);
System.out.println("Job state: " + job.getState());
```

### List Jobs

```java
PagedIterable<BatchJob> jobs = batchClient.listJobs(new BatchJobsListOptions());
for (BatchJob job : jobs) {
    System.out.println("Job: " + job.getId() + ", State: " + job.getState());
}
```

### Get Task Counts

```java
BatchTaskCountsResult counts = batchClient.getJobTaskCounts("myJobId");
System.out.println("Active: " + counts.getTaskCounts().getActive());
System.out.println("Running: " + counts.getTaskCounts().getRunning());
System.out.println("Completed: " + counts.getTaskCounts().getCompleted());
```

### Terminate Job

```java
BatchJobTerminateParameters terminateParams = new BatchJobTerminateParameters()
    .setTerminationReason("Manual termination");
BatchJobTerminateOptions options = new BatchJobTerminateOptions().setParameters(terminateParams);

SyncPoller<BatchJob, BatchJob> poller = batchClient.beginTerminateJob("myJobId", options, null);
poller.waitForCompletion();
```

### Delete Job

```java
SyncPoller<BatchJob, Void> deletePoller = batchClient.beginDeleteJob("myJobId");
deletePoller.waitForCompletion();
```

## Task Operations

### Create Single Task

```java
BatchTaskCreateParameters task = new BatchTaskCreateParameters("task1", "echo 'Hello World'");
batchClient.createTask("myJobId", task);
```

### Create Task with Exit Conditions

```java
batchClient.createTask("myJobId", new BatchTaskCreateParameters("task2", "cmd /c exit 3")
    .setExitConditions(new ExitConditions()
        .setExitCodeRanges(Arrays.asList(
            new ExitCodeRangeMapping(2, 4,
                new ExitOptions().setJobAction(BatchJobActionKind.TERMINATE)))))
    .setUserIdentity(new UserIdentity()
        .setAutoUser(new AutoUserSpecification()
            .setScope(AutoUserScope.TASK)
            .setElevationLevel(ElevationLevel.NON_ADMIN))),
    null);
```

### Create Task Collection (up to 100)

```java
List<BatchTaskCreateParameters> taskList = Arrays.asList(
    new BatchTaskCreateParameters("task1", "echo Task 1"),
    new BatchTaskCreateParameters("task2", "echo Task 2"),
    new BatchTaskCreateParameters("task3", "echo Task 3")
);
BatchTaskGroup taskGroup = new BatchTaskGroup(taskList);
BatchCreateTaskCollectionResult result = batchClient.createTaskCollection("myJobId", taskGroup);
```

### Create Many Tasks (no limit)

```java
List<BatchTaskCreateParameters> tasks = new ArrayList<>();
for (int i = 0; i < 1000; i++) {
    tasks.add(new BatchTaskCreateParameters("task" + i, "echo Task " + i));
}
batchClient.createTasks("myJobId", tasks);
```
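`createTaskCollection` accepts at most 100 tasks per call, while `createTasks` performs the chunking for you. If you need to chunk yourself (for example, to inspect each `BatchCreateTaskCollectionResult` individually), the split is plain-Java list slicing. `TaskChunker` below is a hypothetical helper, not an SDK type:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper (not part of the SDK): splits a list into sublists of
// at most maxSize elements, matching the 100-task per-call limit of
// createTaskCollection. Each chunk can then be submitted in its own call.
public class TaskChunker {
    public static <T> List<List<T>> chunk(List<T> items, int maxSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int start = 0; start < items.size(); start += maxSize) {
            int end = Math.min(start + maxSize, items.size());
            // Copy the view so each chunk is independent of the source list.
            chunks.add(new ArrayList<>(items.subList(start, end)));
        }
        return chunks;
    }
}
```

With 250 tasks and `maxSize = 100`, this yields three chunks of 100, 100, and 50 tasks.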

### Get Task

```java
BatchTask task = batchClient.getTask("myJobId", "task1");
System.out.println("Task state: " + task.getState());
System.out.println("Exit code: " + task.getExecutionInfo().getExitCode());
```

### List Tasks

```java
PagedIterable<BatchTask> tasks = batchClient.listTasks("myJobId");
for (BatchTask task : tasks) {
    System.out.println("Task: " + task.getId() + ", State: " + task.getState());
}
```

### Get Task Output

```java
import com.azure.core.util.BinaryData;
import java.nio.charset.StandardCharsets;

BinaryData stdout = batchClient.getTaskFile("myJobId", "task1", "stdout.txt");
System.out.println(new String(stdout.toBytes(), StandardCharsets.UTF_8));
```

### Terminate Task

```java
batchClient.terminateTask("myJobId", "task1", null, null);
```

## Node Operations

### List Nodes

```java
PagedIterable<BatchNode> nodes = batchClient.listNodes("myPoolId", new BatchNodesListOptions());
for (BatchNode node : nodes) {
    System.out.println("Node: " + node.getId() + ", State: " + node.getState());
}
```

### Reboot Node

```java
SyncPoller<BatchNode, BatchNode> rebootPoller = batchClient.beginRebootNode("myPoolId", "nodeId");
rebootPoller.waitForCompletion();
```

### Get Remote Login Settings

```java
BatchNodeRemoteLoginSettings settings = batchClient.getNodeRemoteLoginSettings("myPoolId", "nodeId");
System.out.println("IP: " + settings.getRemoteLoginIpAddress());
System.out.println("Port: " + settings.getRemoteLoginPort());
```

## Job Schedule Operations

### Create Job Schedule

```java
batchClient.createJobSchedule(new BatchJobScheduleCreateParameters("myScheduleId",
    new BatchJobScheduleConfiguration()
        .setRecurrenceInterval(Duration.ofHours(6))
        .setDoNotRunUntil(OffsetDateTime.now().plusDays(1)),
    new BatchJobSpecification(new BatchPoolInfo().setPoolId("myPoolId"))
        .setPriority(50)),
    null);
```

### Get Job Schedule

```java
BatchJobSchedule schedule = batchClient.getJobSchedule("myScheduleId");
System.out.println("Schedule state: " + schedule.getState());
```

## Error Handling

```java
import com.azure.compute.batch.models.BatchErrorException;
import com.azure.compute.batch.models.BatchError;

try {
    batchClient.getPool("nonexistent-pool");
} catch (BatchErrorException e) {
    BatchError error = e.getValue();
    System.err.println("Error code: " + error.getCode());
    System.err.println("Message: " + error.getMessage().getValue());

    if ("PoolNotFound".equals(error.getCode())) {
        System.err.println("The specified pool does not exist.");
    }
}
```

## Best Practices

1. **Use Entra ID** — Preferred over shared key for authentication
2. **Use management SDK for pools** — `azure-resourcemanager-batch` supports managed identities
3. **Batch task creation** — Use `createTaskCollection` or `createTasks` for multiple tasks
4. **Handle LRO properly** — Pool resize and delete operations are long-running
5. **Monitor task counts** — Use `getJobTaskCounts` to track progress
6. **Set constraints** — Configure `maxWallClockTime` and `maxTaskRetryCount`
7. **Use low-priority nodes** — Cost savings for fault-tolerant workloads
8. **Enable autoscale** — Dynamically adjust pool size based on workload

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-compute-batch |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/batch/azure-compute-batch |
| API Documentation | https://learn.microsoft.com/java/api/com.azure.compute.batch |
| Product Docs | https://learn.microsoft.com/azure/batch/ |
| REST API | https://learn.microsoft.com/rest/api/batchservice/ |
| Samples | https://github.com/azure/azure-batch-samples |
**New file** (388 lines): `skills/official/microsoft/java/data/blob/SKILL.md`

---
name: azure-storage-blob-java
description: Build blob storage applications with Azure Storage Blob SDK for Java. Use when uploading, downloading, or managing files in Azure Blob Storage, working with containers, or implementing streaming data operations.
package: com.azure:azure-storage-blob
---

# Azure Storage Blob SDK for Java

Build blob storage applications using the Azure Storage Blob SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-storage-blob</artifactId>
    <version>12.33.0</version>
</dependency>
```

## Client Creation

### BlobServiceClient

```java
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

// With SAS token
BlobServiceClient serviceClient = new BlobServiceClientBuilder()
    .endpoint("<storage-account-url>")
    .sasToken("<sas-token>")
    .buildClient();

// With connection string
BlobServiceClient serviceClient = new BlobServiceClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

BlobServiceClient serviceClient = new BlobServiceClientBuilder()
    .endpoint("<storage-account-url>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

### BlobContainerClient

```java
import com.azure.storage.blob.BlobContainerClient;

// From service client
BlobContainerClient containerClient = serviceClient.getBlobContainerClient("mycontainer");

// Direct construction
BlobContainerClient containerClient = new BlobContainerClientBuilder()
    .connectionString("<connection-string>")
    .containerName("mycontainer")
    .buildClient();
```

### BlobClient

```java
import com.azure.storage.blob.BlobClient;

// From container client
BlobClient blobClient = containerClient.getBlobClient("myblob.txt");

// With directory structure
BlobClient blobClient = containerClient.getBlobClient("folder/subfolder/myblob.txt");

// Direct construction
BlobClient blobClient = new BlobClientBuilder()
    .connectionString("<connection-string>")
    .containerName("mycontainer")
    .blobName("myblob.txt")
    .buildClient();
```

## Core Patterns

### Create Container

```java
// Create container
serviceClient.createBlobContainer("mycontainer");

// Create if not exists
BlobContainerClient container = serviceClient.createBlobContainerIfNotExists("mycontainer");

// From container client
containerClient.create();
containerClient.createIfNotExists();
```

### Upload Data

```java
import com.azure.core.util.BinaryData;

// Upload string
String data = "Hello, Azure Blob Storage!";
blobClient.upload(BinaryData.fromString(data));

// Upload with overwrite
blobClient.upload(BinaryData.fromString(data), true);
```

### Upload from File

```java
blobClient.uploadFromFile("local-file.txt");

// With overwrite
blobClient.uploadFromFile("local-file.txt", true);
```

### Upload from Stream

```java
import com.azure.storage.blob.specialized.BlockBlobClient;

BlockBlobClient blockBlobClient = blobClient.getBlockBlobClient();

try (ByteArrayInputStream dataStream = new ByteArrayInputStream(data.getBytes())) {
    blockBlobClient.upload(dataStream, data.length());
}
```

### Upload with Options

```java
import com.azure.storage.blob.models.BlobHttpHeaders;
import com.azure.storage.blob.options.BlobParallelUploadOptions;

BlobHttpHeaders headers = new BlobHttpHeaders()
    .setContentType("text/plain")
    .setCacheControl("max-age=3600");

Map<String, String> metadata = Map.of("author", "john", "version", "1.0");

try (InputStream stream = new FileInputStream("large-file.bin")) {
    BlobParallelUploadOptions options = new BlobParallelUploadOptions(stream)
        .setHeaders(headers)
        .setMetadata(metadata);

    blobClient.uploadWithResponse(options, null, Context.NONE);
}
```

### Upload if Not Exists

```java
import com.azure.storage.blob.models.BlobRequestConditions;

BlobParallelUploadOptions options = new BlobParallelUploadOptions(inputStream, length)
    .setRequestConditions(new BlobRequestConditions().setIfNoneMatch("*"));

blobClient.uploadWithResponse(options, null, Context.NONE);
```

### Download Data

```java
// Download to BinaryData
BinaryData content = blobClient.downloadContent();
String text = content.toString();

// Download to file
blobClient.downloadToFile("downloaded-file.txt");
```

### Download to Stream

```java
try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
    blobClient.downloadStream(outputStream);
    byte[] data = outputStream.toByteArray();
}
```

### Download with InputStream

```java
import com.azure.storage.blob.specialized.BlobInputStream;

try (BlobInputStream blobIS = blobClient.openInputStream()) {
    byte[] buffer = new byte[1024];
    int bytesRead;
    while ((bytesRead = blobIS.read(buffer)) != -1) {
        // Process buffer
    }
}
```

### Upload via OutputStream

```java
import com.azure.storage.blob.specialized.BlobOutputStream;

try (BlobOutputStream blobOS = blobClient.getBlockBlobClient().getBlobOutputStream()) {
    blobOS.write("Data to upload".getBytes());
}
```

### List Blobs

```java
import com.azure.storage.blob.models.BlobItem;
import com.azure.storage.blob.models.ListBlobsOptions;

// List all blobs
for (BlobItem blobItem : containerClient.listBlobs()) {
    System.out.println("Blob: " + blobItem.getName());
}

// List with prefix (virtual directory)
ListBlobsOptions options = new ListBlobsOptions().setPrefix("folder/");
for (BlobItem blobItem : containerClient.listBlobs(options, null)) {
    System.out.println("Blob: " + blobItem.getName());
}
```

### List Blobs by Hierarchy

```java
import com.azure.storage.blob.models.BlobListDetails;

String delimiter = "/";
ListBlobsOptions options = new ListBlobsOptions()
    .setPrefix("data/")
    .setDetails(new BlobListDetails().setRetrieveMetadata(true));

for (BlobItem item : containerClient.listBlobsByHierarchy(delimiter, options, null)) {
    if (item.isPrefix()) {
        System.out.println("Directory: " + item.getName());
    } else {
        System.out.println("Blob: " + item.getName());
    }
}
```
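Blob storage has no real directories; `listBlobsByHierarchy` synthesizes them by cutting flat blob names at the delimiter. The grouping rule can be sketched locally in plain Java (`HierarchyModel` is a hypothetical illustration, not an SDK type): names that still contain the delimiter after the prefix collapse into a "directory" prefix, the rest are direct blobs.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Hypothetical local model of hierarchical listing: given flat blob names,
// a prefix, and a delimiter, compute what one level of the hierarchy holds:
// immediate "directory" prefixes plus blobs directly under the prefix.
public class HierarchyModel {
    public static Set<String> level(List<String> names, String prefix, String delimiter) {
        Set<String> out = new LinkedHashSet<>();
        for (String name : names) {
            if (!name.startsWith(prefix)) continue;
            String rest = name.substring(prefix.length());
            int i = rest.indexOf(delimiter);
            // Sub-directory prefix (keep up to and including the delimiter),
            // or a blob directly under this prefix.
            out.add(i >= 0 ? prefix + rest.substring(0, i + delimiter.length()) : name);
        }
        return out;
    }
}
```

For `["data/a.txt", "data/img/x.png", "data/img/y.png"]` with prefix `data/`, one level contains `data/a.txt` and the single prefix `data/img/`.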

### Delete Blob

```java
import com.azure.storage.blob.models.DeleteSnapshotsOptionType;

blobClient.delete();

// Delete if exists
blobClient.deleteIfExists();

// Delete with snapshots
blobClient.deleteWithResponse(DeleteSnapshotsOptionType.INCLUDE, null, null, Context.NONE);
```

### Copy Blob

```java
import com.azure.storage.blob.models.BlobCopyInfo;
import com.azure.core.util.polling.SyncPoller;

// Async copy (for large blobs or cross-account)
SyncPoller<BlobCopyInfo, Void> poller = blobClient.beginCopy("<source-blob-url>", Duration.ofSeconds(1));
poller.waitForCompletion();

// Sync copy from URL (for same account)
blobClient.copyFromUrl("<source-blob-url>");
```

### Generate SAS Token

```java
import com.azure.storage.blob.sas.*;
import java.time.OffsetDateTime;

// Blob-level SAS
BlobSasPermission permissions = new BlobSasPermission().setReadPermission(true);
OffsetDateTime expiry = OffsetDateTime.now().plusDays(1);

BlobServiceSasSignatureValues sasValues = new BlobServiceSasSignatureValues(expiry, permissions);
String sasToken = blobClient.generateSas(sasValues);

// Container-level SAS
BlobContainerSasPermission containerPermissions = new BlobContainerSasPermission()
    .setReadPermission(true)
    .setListPermission(true);

BlobServiceSasSignatureValues containerSasValues = new BlobServiceSasSignatureValues(expiry, containerPermissions);
String containerSas = containerClient.generateSas(containerSasValues);
```

### Blob Properties and Metadata

```java
import com.azure.storage.blob.models.BlobProperties;

// Get properties
BlobProperties properties = blobClient.getProperties();
System.out.println("Size: " + properties.getBlobSize());
System.out.println("Content-Type: " + properties.getContentType());
System.out.println("Last Modified: " + properties.getLastModified());

// Set metadata
Map<String, String> metadata = Map.of("key1", "value1", "key2", "value2");
blobClient.setMetadata(metadata);

// Set HTTP headers
BlobHttpHeaders headers = new BlobHttpHeaders()
    .setContentType("application/json")
    .setCacheControl("max-age=86400");
blobClient.setHttpHeaders(headers);
```

### Lease Blob

```java
import com.azure.storage.blob.specialized.BlobLeaseClient;
import com.azure.storage.blob.specialized.BlobLeaseClientBuilder;

BlobLeaseClient leaseClient = new BlobLeaseClientBuilder()
    .blobClient(blobClient)
    .buildClient();

// Acquire a 60-second lease (pass -1 for an infinite lease)
String leaseId = leaseClient.acquireLease(60);

// Renew lease
leaseClient.renewLease();

// Release lease
leaseClient.releaseLease();
```

## Error Handling

```java
import com.azure.storage.blob.models.BlobStorageException;

try {
    blobClient.downloadStream(outputStream);
} catch (BlobStorageException e) {
    System.out.println("Status: " + e.getStatusCode());
    System.out.println("Error code: " + e.getErrorCode());
    // 404 = Blob not found
    // 409 = Conflict (lease, etc.)
}
```
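Since `BlobStorageException` handling usually branches on the HTTP status code, the dispatch logic itself can live in a small pure-Java table that is testable without touching storage. This is a generic sketch with hypothetical action names, not an SDK type:

```java
import java.util.Map;

// Hypothetical helper: maps common Blob Storage HTTP status codes to a
// suggested recovery action. Unknown codes fall back to "fail".
public class BlobErrorPolicy {
    private static final Map<Integer, String> ACTIONS = Map.of(
        404, "create-or-skip",        // blob or container not found
        409, "break-lease-or-retry",  // conflict, often an active lease
        412, "refresh-etag",          // If-Match / If-None-Match precondition failed
        429, "backoff-and-retry",
        503, "backoff-and-retry");

    public static String actionFor(int statusCode) {
        return ACTIONS.getOrDefault(statusCode, "fail");
    }
}
```

The catch block above would then call `BlobErrorPolicy.actionFor(e.getStatusCode())` and switch on the result.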

## Proxy Configuration

```java
import com.azure.core.http.ProxyOptions;
import com.azure.core.http.netty.NettyAsyncHttpClientBuilder;
import java.net.InetSocketAddress;

ProxyOptions proxyOptions = new ProxyOptions(
    ProxyOptions.Type.HTTP,
    new InetSocketAddress("localhost", 8888));

BlobServiceClient client = new BlobServiceClientBuilder()
    .endpoint("<endpoint>")
    .sasToken("<sas-token>")
    .httpClient(new NettyAsyncHttpClientBuilder().proxy(proxyOptions).build())
    .buildClient();
```

## Environment Variables

```bash
AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
AZURE_STORAGE_ACCOUNT_URL=https://<account>.blob.core.windows.net
```

## Trigger Phrases

- "Azure Blob Storage Java"
- "upload download blob"
- "blob container SDK"
- "storage streaming"
- "SAS token generation"
- "blob metadata properties"
**New file** (258 lines): `skills/official/microsoft/java/data/cosmos/SKILL.md`

---
name: azure-cosmos-java
description: |
  Azure Cosmos DB SDK for Java. NoSQL database operations with global distribution, multi-model support, and reactive patterns.
  Triggers: "CosmosClient java", "CosmosAsyncClient", "cosmos database java", "cosmosdb java", "document database java".
package: azure-cosmos
---

# Azure Cosmos DB SDK for Java

Client library for the Azure Cosmos DB NoSQL API with global distribution and reactive patterns.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-cosmos</artifactId>
    <version>LATEST</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.azure</groupId>
            <artifactId>azure-sdk-bom</artifactId>
            <version>{bom_version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.azure</groupId>
        <artifactId>azure-cosmos</artifactId>
    </dependency>
</dependencies>
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_KEY=<your-primary-key>
```

## Authentication

### Key-based Authentication

```java
import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;

CosmosClient client = new CosmosClientBuilder()
    .endpoint(System.getenv("COSMOS_ENDPOINT"))
    .key(System.getenv("COSMOS_KEY"))
    .buildClient();
```

### Async Client

```java
import com.azure.cosmos.CosmosAsyncClient;

CosmosAsyncClient asyncClient = new CosmosClientBuilder()
    .endpoint(serviceEndpoint)
    .key(key)
    .buildAsyncClient();
```

### With Customizations

```java
import com.azure.cosmos.ConsistencyLevel;
import java.util.Arrays;

CosmosClient client = new CosmosClientBuilder()
    .endpoint(serviceEndpoint)
    .key(key)
    .directMode(directConnectionConfig, gatewayConnectionConfig)
    .consistencyLevel(ConsistencyLevel.SESSION)
    .connectionSharingAcrossClientsEnabled(true)
    .contentResponseOnWriteEnabled(true)
    .userAgentSuffix("my-application")
    .preferredRegions(Arrays.asList("West US", "East US"))
    .buildClient();
```

## Client Hierarchy

| Class | Purpose |
|-------|---------|
| `CosmosClient` / `CosmosAsyncClient` | Account-level operations |
| `CosmosDatabase` / `CosmosAsyncDatabase` | Database operations |
| `CosmosContainer` / `CosmosAsyncContainer` | Container/item operations |

## Core Workflow

### Create Database

```java
// Sync
CosmosDatabaseResponse dbResponse = client.createDatabaseIfNotExists("myDatabase");
CosmosDatabase database = client.getDatabase(dbResponse.getProperties().getId());

// Async with chaining
asyncClient.createDatabaseIfNotExists("myDatabase")
    .map(response -> asyncClient.getDatabase(response.getProperties().getId()))
    .subscribe(db -> System.out.println("Created: " + db.getId()));
```

### Create Container

```java
asyncClient.createDatabaseIfNotExists("myDatabase")
    .flatMap(dbResponse -> {
        String databaseId = dbResponse.getProperties().getId();
        return asyncClient.getDatabase(databaseId)
            .createContainerIfNotExists("myContainer", "/partitionKey")
            .map(containerResponse -> asyncClient.getDatabase(databaseId)
                .getContainer(containerResponse.getProperties().getId()));
    })
    .subscribe(container -> System.out.println("Container: " + container.getId()));
```

### CRUD Operations

```java
import com.azure.cosmos.models.PartitionKey;

CosmosAsyncContainer container = asyncClient
    .getDatabase("myDatabase")
    .getContainer("myContainer");

// Create
container.createItem(new User("1", "John Doe", "john@example.com"))
    .flatMap(response -> {
        System.out.println("Created: " + response.getItem());
        // Read
        return container.readItem(
            response.getItem().getId(),
            new PartitionKey(response.getItem().getId()),
            User.class);
    })
    .flatMap(response -> {
        System.out.println("Read: " + response.getItem());
        // Update
        User user = response.getItem();
        user.setEmail("john.doe@example.com");
        return container.replaceItem(
            user,
            user.getId(),
            new PartitionKey(user.getId()));
    })
    .flatMap(response -> {
        // Delete
        return container.deleteItem(
            response.getItem().getId(),
            new PartitionKey(response.getItem().getId()));
    })
    .block();
```

### Query Documents

```java
import com.azure.cosmos.models.CosmosQueryRequestOptions;
import com.azure.cosmos.models.SqlParameter;
import com.azure.cosmos.models.SqlQuerySpec;
import com.azure.cosmos.util.CosmosPagedIterable;

CosmosContainer container = client.getDatabase("myDatabase").getContainer("myContainer");

SqlQuerySpec query = new SqlQuerySpec(
    "SELECT * FROM c WHERE c.status = @status",
    new SqlParameter("@status", "active"));
CosmosQueryRequestOptions options = new CosmosQueryRequestOptions();

CosmosPagedIterable<User> results = container.queryItems(query, options, User.class);

results.forEach(user -> System.out.println("User: " + user.getName()));
```

## Key Concepts

### Partition Keys

Choose a partition key with:

- High cardinality (many distinct values)
- Even distribution of data and requests
- Frequent use in query filters
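When no single property has high enough cardinality, a common workaround is a synthetic partition key that concatenates properties (for example tenant plus month) and is stored in the item's partition key field. This is a pattern, not an SDK feature; the sketch below is hypothetical:

```java
// Hypothetical sketch: a synthetic partition key combining tenant and month
// to raise cardinality and spread write load across physical partitions.
// The combined value is simply stored in the item's partition key property.
public class SyntheticKey {
    public static String of(String tenantId, int year, int month) {
        return String.format("%s_%04d%02d", tenantId, year, month);
    }
}
```

For example, `SyntheticKey.of("contoso", 2024, 3)` yields `contoso_202403`, which would be passed to `new PartitionKey(...)` on reads and writes.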
### Consistency Levels

| Level | Guarantee |
|-------|-----------|
| Strong | Linearizability |
| Bounded Staleness | Consistent prefix with bounded lag |
| Session | Consistent prefix within session |
| Consistent Prefix | Reads never see out-of-order writes |
| Eventual | No ordering guarantee |

### Request Units (RUs)

All operations consume RUs. Check the charge on the response:

```java
CosmosItemResponse<User> response = container.createItem(user);
System.out.println("RU charge: " + response.getRequestCharge());
```

## Best Practices

1. **Reuse CosmosClient** — Create once, reuse throughout the application
2. **Use the async client** for high-throughput scenarios
3. **Choose the partition key carefully** — Affects performance and scalability
4. **Enable content response on write** for immediate access to created items
5. **Configure preferred regions** for geo-distributed applications
6. **Handle 429 errors** with retry policies (built in by default)
7. **Use direct mode** for the lowest latency in production

## Error Handling

```java
import com.azure.cosmos.CosmosException;

try {
    container.createItem(item);
} catch (CosmosException e) {
    System.err.println("Status: " + e.getStatusCode());
    System.err.println("Message: " + e.getMessage());
    System.err.println("Request charge: " + e.getRequestCharge());

    if (e.getStatusCode() == 409) {
        System.err.println("Item already exists");
    } else if (e.getStatusCode() == 429) {
        System.err.println("Rate limited, retry after: " + e.getRetryAfterDuration());
    }
}
```
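The SDK retries 429s internally, but if you add an outer retry loop (for example around `block()` calls), the usual shape is a capped exponential backoff that defers to a server-supplied retry-after hint such as `getRetryAfterDuration()`. This is a generic sketch, not an SDK API:

```java
import java.time.Duration;

// Hypothetical helper: picks the wait time before retry attempt `attempt`
// (0-based). A non-zero server retryAfter hint wins; otherwise use
// exponential backoff starting at `base`, capped at `max`.
public class Backoff {
    public static Duration delay(int attempt, Duration retryAfter, Duration base, Duration max) {
        if (retryAfter != null && !retryAfter.isZero()) {
            return retryAfter;
        }
        // Shift is clamped so the multiplier cannot overflow a long.
        long millis = base.toMillis() * (1L << Math.min(attempt, 20));
        return millis >= max.toMillis() ? max : Duration.ofMillis(millis);
    }
}
```

With a 100 ms base and 5 s cap, attempts 0, 1, 2, 3 wait 100, 200, 400, 800 ms, and later attempts plateau at 5 s unless the server dictates otherwise.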

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-cosmos |
| API Documentation | https://azuresdkdocs.z19.web.core.windows.net/java/azure-cosmos/latest/index.html |
| Product Docs | https://learn.microsoft.com/azure/cosmos-db/ |
| Samples | https://github.com/Azure-Samples/azure-cosmos-java-sql-api-samples |
| Performance Guide | https://learn.microsoft.com/azure/cosmos-db/performance-tips-java-sdk-v4-sql |
| Troubleshooting | https://learn.microsoft.com/azure/cosmos-db/troubleshoot-java-sdk-v4-sql |
**New file** (334 lines): `skills/official/microsoft/java/data/tables/SKILL.md`

---
name: azure-data-tables-java
description: Build table storage applications with Azure Tables SDK for Java. Use when working with Azure Table Storage or Cosmos DB Table API for NoSQL key-value data, schemaless storage, or structured data at scale.
package: com.azure:azure-data-tables
---

# Azure Tables SDK for Java

Build table storage applications using the Azure Tables SDK for Java. Works with both Azure Table Storage and the Cosmos DB Table API.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-data-tables</artifactId>
    <version>12.6.0-beta.1</version>
</dependency>
```

## Client Creation

### With Connection String

```java
import com.azure.data.tables.TableServiceClient;
import com.azure.data.tables.TableServiceClientBuilder;
import com.azure.data.tables.TableClient;

TableServiceClient serviceClient = new TableServiceClientBuilder()
    .connectionString("<your-connection-string>")
    .buildClient();
```

### With Shared Key

```java
import com.azure.core.credential.AzureNamedKeyCredential;

AzureNamedKeyCredential credential = new AzureNamedKeyCredential(
    "<account-name>",
    "<account-key>");

TableServiceClient serviceClient = new TableServiceClientBuilder()
    .endpoint("<your-table-account-url>")
    .credential(credential)
    .buildClient();
```

### With SAS Token

```java
TableServiceClient serviceClient = new TableServiceClientBuilder()
    .endpoint("<your-table-account-url>")
    .sasToken("<sas-token>")
    .buildClient();
```

### With DefaultAzureCredential (Storage only)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

TableServiceClient serviceClient = new TableServiceClientBuilder()
    .endpoint("<your-table-account-url>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Key Concepts

- **TableServiceClient**: Manage tables (create, list, delete)
- **TableClient**: Manage entities within a table (CRUD)
- **Partition Key**: Groups entities for efficient queries
- **Row Key**: Unique identifier within a partition
- **Entity**: A row with up to 252 properties (1 MB in Storage, 2 MB in Cosmos DB)

## Core Patterns

### Create Table

```java
// Create table (throws if it exists)
TableClient tableClient = serviceClient.createTable("mytable");

// Create if not exists (no exception)
TableClient tableClient = serviceClient.createTableIfNotExists("mytable");
```

### Get Table Client

```java
// From service client
TableClient tableClient = serviceClient.getTableClient("mytable");

// Direct construction
TableClient tableClient = new TableClientBuilder()
    .connectionString("<connection-string>")
    .tableName("mytable")
    .buildClient();
```

### Create Entity

```java
import com.azure.data.tables.models.TableEntity;

TableEntity entity = new TableEntity("partitionKey", "rowKey")
    .addProperty("Name", "Product A")
    .addProperty("Price", 29.99)
    .addProperty("Quantity", 100)
    .addProperty("IsAvailable", true);

tableClient.createEntity(entity);
```

### Get Entity

```java
TableEntity entity = tableClient.getEntity("partitionKey", "rowKey");

String name = (String) entity.getProperty("Name");
Double price = (Double) entity.getProperty("Price");
System.out.printf("Product: %s, Price: %.2f%n", name, price);
```

### Update Entity

```java
import com.azure.data.tables.models.TableEntityUpdateMode;

// Merge (update only the specified properties)
TableEntity updateEntity = new TableEntity("partitionKey", "rowKey")
    .addProperty("Price", 24.99);
tableClient.updateEntity(updateEntity, TableEntityUpdateMode.MERGE);

// Replace (replace the entire entity)
TableEntity replaceEntity = new TableEntity("partitionKey", "rowKey")
    .addProperty("Name", "Product A Updated")
    .addProperty("Price", 24.99)
    .addProperty("Quantity", 150);
tableClient.updateEntity(replaceEntity, TableEntityUpdateMode.REPLACE);
```

### Upsert Entity

```java
// Insert or update (merge mode)
tableClient.upsertEntity(entity, TableEntityUpdateMode.MERGE);

// Insert or replace
tableClient.upsertEntity(entity, TableEntityUpdateMode.REPLACE);
```

### Delete Entity

```java
tableClient.deleteEntity("partitionKey", "rowKey");
```

### List Entities

```java
import com.azure.data.tables.models.ListEntitiesOptions;
import java.util.Arrays;

// List all entities
for (TableEntity entity : tableClient.listEntities()) {
    System.out.printf("%s - %s%n",
        entity.getPartitionKey(),
        entity.getRowKey());
}

// With filtering and selection
ListEntitiesOptions options = new ListEntitiesOptions()
    .setFilter("PartitionKey eq 'sales'")
    .setSelect(Arrays.asList("Name", "Price"));

for (TableEntity entity : tableClient.listEntities(options, null, null)) {
    System.out.printf("%s: %.2f%n",
        entity.getProperty("Name"),
        entity.getProperty("Price"));
}
```
|
||||
|
||||
### Query with OData Filter
|
||||
|
||||
```java
|
||||
// Filter by partition key
|
||||
ListEntitiesOptions options = new ListEntitiesOptions()
|
||||
.setFilter("PartitionKey eq 'electronics'");
|
||||
|
||||
// Filter with multiple conditions
|
||||
options.setFilter("PartitionKey eq 'electronics' and Price gt 100");
|
||||
|
||||
// Filter with comparison operators
|
||||
options.setFilter("Quantity ge 10 and Quantity le 100");
|
||||
|
||||
// Top N results
|
||||
options.setTop(10);
|
||||
|
||||
for (TableEntity entity : tableClient.listEntities(options, null, null)) {
|
||||
System.out.println(entity.getRowKey());
|
||||
}
|
||||
```
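String values embedded in an OData filter must escape single quotes by doubling them, or queries over values like `O'Brien` fail. A minimal helper sketching that escaping (hypothetical, not part of the SDK):

```java
public final class ODataFilters {
    private ODataFilters() {}

    /** Escape a string literal for an OData filter: ' becomes ''. */
    public static String escape(String value) {
        return value.replace("'", "''");
    }

    /** Build a "Property eq 'value'" filter with proper quoting. */
    public static String eq(String property, String value) {
        return property + " eq '" + escape(value) + "'";
    }
}
```

For example, `ODataFilters.eq("Name", "O'Brien")` produces `Name eq 'O''Brien'`, which can be passed to `setFilter`.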
### Batch Operations (Transactions)

```java
import com.azure.data.tables.models.TableTransactionAction;
import com.azure.data.tables.models.TableTransactionActionType;
import java.util.Arrays;
import java.util.List;

// All entities must have the same partition key
List<TableTransactionAction> actions = Arrays.asList(
    new TableTransactionAction(
        TableTransactionActionType.CREATE,
        new TableEntity("batch", "row1").addProperty("Name", "Item 1")),
    new TableTransactionAction(
        TableTransactionActionType.CREATE,
        new TableEntity("batch", "row2").addProperty("Name", "Item 2")),
    new TableTransactionAction(
        TableTransactionActionType.UPSERT_MERGE,
        new TableEntity("batch", "row3").addProperty("Name", "Item 3"))
);

tableClient.submitTransaction(actions);
```

### List Tables

```java
import com.azure.data.tables.models.TableItem;
import com.azure.data.tables.models.ListTablesOptions;

// List all tables
for (TableItem table : serviceClient.listTables()) {
    System.out.println(table.getName());
}

// Filter tables
ListTablesOptions options = new ListTablesOptions()
    .setFilter("TableName eq 'mytable'");

for (TableItem table : serviceClient.listTables(options, null, null)) {
    System.out.println(table.getName());
}
```

### Delete Table

```java
serviceClient.deleteTable("mytable");
```

## Typed Entities

```java
public class Product implements TableEntity {
    private String partitionKey;
    private String rowKey;
    private OffsetDateTime timestamp;
    private String eTag;
    private String name;
    private double price;

    // Getters and setters for all fields
    @Override
    public String getPartitionKey() { return partitionKey; }
    @Override
    public void setPartitionKey(String partitionKey) { this.partitionKey = partitionKey; }
    @Override
    public String getRowKey() { return rowKey; }
    @Override
    public void setRowKey(String rowKey) { this.rowKey = rowKey; }
    // ... other getters/setters

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public double getPrice() { return price; }
    public void setPrice(double price) { this.price = price; }
}

// Usage
Product product = new Product();
product.setPartitionKey("electronics");
product.setRowKey("laptop-001");
product.setName("Laptop");
product.setPrice(999.99);

tableClient.createEntity(product);
```

## Error Handling

```java
import com.azure.data.tables.models.TableServiceException;

try {
    tableClient.createEntity(entity);
} catch (TableServiceException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
    // 409 = Conflict (entity exists)
    // 404 = Not Found
}
```

## Environment Variables

```bash
# Storage Account
AZURE_TABLES_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
AZURE_TABLES_ENDPOINT=https://<account>.table.core.windows.net

# Cosmos DB Table API
COSMOS_TABLE_ENDPOINT=https://<account>.table.cosmosdb.azure.com
```
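Rather than hardcoding the connection string, clients are typically built from the `AZURE_TABLES_CONNECTION_STRING` variable above. A small sketch of fail-fast lookup (the `TablesConfig` helper is hypothetical; passing the environment as a `Map` keeps it testable):

```java
import java.util.Map;

public final class TablesConfig {
    private TablesConfig() {}

    /** Look up the connection string, failing fast with a clear message. */
    public static String connectionString(Map<String, String> env) {
        String value = env.get("AZURE_TABLES_CONNECTION_STRING");
        if (value == null || value.isEmpty()) {
            throw new IllegalStateException(
                "AZURE_TABLES_CONNECTION_STRING is not set");
        }
        return value;
    }
}
```

Usage: call `TablesConfig.connectionString(System.getenv())` and pass the result to `TableClientBuilder.connectionString(...)`.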
## Best Practices

1. **Partition Key Design**: Choose keys that distribute load evenly
2. **Batch Operations**: Use transactions for atomic multi-entity updates
3. **Query Optimization**: Always filter by PartitionKey when possible
4. **Select Projection**: Only select needed properties for performance
5. **Entity Size**: Keep entities under 1MB (Storage) or 2MB (Cosmos)
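Because a transaction is scoped to a single partition key (and Azure Storage caps a transaction at 100 operations), bulk writes have to be grouped by partition and chunked first. A hypothetical generic helper sketching that grouping (the 100-operation cap is for Azure Storage; verify the limit for your target service):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public final class BatchChunker {
    private BatchChunker() {}

    /** Group items by partition key, then split each group into chunks of at most maxPerBatch. */
    public static <T> List<List<T>> chunkByPartition(
            List<T> items, Function<T, String> partitionKeyOf, int maxPerBatch) {
        // Preserve insertion order of partitions for predictable batching
        Map<String, List<T>> byPartition = new LinkedHashMap<>();
        for (T item : items) {
            byPartition.computeIfAbsent(partitionKeyOf.apply(item), k -> new ArrayList<>())
                .add(item);
        }
        List<List<T>> batches = new ArrayList<>();
        for (List<T> group : byPartition.values()) {
            for (int i = 0; i < group.size(); i += maxPerBatch) {
                batches.add(group.subList(i, Math.min(i + maxPerBatch, group.size())));
            }
        }
        return batches;
    }
}
```

Each resulting batch can then be mapped to `TableTransactionAction`s and submitted with `submitTransaction`, one call per batch.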
## Trigger Phrases

- "Azure Tables Java"
- "table storage SDK"
- "Cosmos DB Table API"
- "NoSQL key-value storage"
- "partition key row key"
- "table entity CRUD"
skills/official/microsoft/java/entra/azure-identity/SKILL.md (new file)
@@ -0,0 +1,366 @@
---
name: azure-identity-java
description: Azure Identity Java SDK for authentication with Azure services. Use when implementing DefaultAzureCredential, managed identity, service principal, or any Azure authentication pattern in Java applications.
package: com.azure:azure-identity
---

# Azure Identity (Java)

Authenticate Java applications with Azure services using Microsoft Entra ID (Azure AD).

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
    <version>1.15.0</version>
</dependency>
```

## Key Concepts

| Credential | Use Case |
|------------|----------|
| `DefaultAzureCredential` | **Recommended** - Works in dev and production |
| `ManagedIdentityCredential` | Azure-hosted apps (App Service, Functions, VMs) |
| `EnvironmentCredential` | CI/CD pipelines with env vars |
| `ClientSecretCredential` | Service principals with secret |
| `ClientCertificateCredential` | Service principals with certificate |
| `AzureCliCredential` | Local dev using `az login` |
| `InteractiveBrowserCredential` | Interactive login flow |
| `DeviceCodeCredential` | Headless device authentication |

## DefaultAzureCredential (Recommended)

`DefaultAzureCredential` tries multiple authentication methods in order:

1. Environment variables
2. Workload Identity
3. Managed Identity
4. Azure CLI
5. Azure PowerShell
6. Azure Developer CLI

```java
import com.azure.identity.DefaultAzureCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;

// Simple usage
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

// Use with any Azure client
BlobServiceClient blobClient = new BlobServiceClientBuilder()
    .endpoint("https://<storage-account>.blob.core.windows.net")
    .credential(credential)
    .buildClient();

KeyClient keyClient = new KeyClientBuilder()
    .vaultUrl("https://<vault-name>.vault.azure.net")
    .credential(credential)
    .buildClient();
```

### Configure DefaultAzureCredential

```java
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder()
    .managedIdentityClientId("<user-assigned-identity-client-id>") // For user-assigned MI
    .tenantId("<tenant-id>") // Limit to specific tenant
    .excludeEnvironmentCredential() // Skip env vars
    .excludeAzureCliCredential() // Skip Azure CLI
    .build();
```

## Managed Identity

For Azure-hosted applications (App Service, Functions, AKS, VMs).

```java
import com.azure.identity.ManagedIdentityCredential;
import com.azure.identity.ManagedIdentityCredentialBuilder;

// System-assigned managed identity
ManagedIdentityCredential systemAssigned = new ManagedIdentityCredentialBuilder()
    .build();

// User-assigned managed identity (by client ID)
ManagedIdentityCredential byClientId = new ManagedIdentityCredentialBuilder()
    .clientId("<user-assigned-client-id>")
    .build();

// User-assigned managed identity (by resource ID)
ManagedIdentityCredential byResourceId = new ManagedIdentityCredentialBuilder()
    .resourceId("/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<name>")
    .build();
```

## Service Principal with Secret

```java
import com.azure.identity.ClientSecretCredential;
import com.azure.identity.ClientSecretCredentialBuilder;

ClientSecretCredential credential = new ClientSecretCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .clientSecret("<client-secret>")
    .build();
```

## Service Principal with Certificate

```java
import com.azure.identity.ClientCertificateCredential;
import com.azure.identity.ClientCertificateCredentialBuilder;

// From PEM file
ClientCertificateCredential pemCredential = new ClientCertificateCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .pemCertificate("<path-to-cert.pem>")
    .build();

// From PFX file with password
ClientCertificateCredential pfxCredential = new ClientCertificateCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .pfxCertificate("<path-to-cert.pfx>", "<pfx-password>")
    .build();

// Send certificate chain for SNI
ClientCertificateCredential sniCredential = new ClientCertificateCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .pemCertificate("<path-to-cert.pem>")
    .sendCertificateChain(true)
    .build();
```

## Environment Credential

Reads credentials from environment variables.

```java
import com.azure.identity.EnvironmentCredential;
import com.azure.identity.EnvironmentCredentialBuilder;

EnvironmentCredential credential = new EnvironmentCredentialBuilder().build();
```

### Required Environment Variables

**For service principal with secret:**
```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

**For service principal with certificate:**
```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_CERTIFICATE_PATH=/path/to/cert.pem
AZURE_CLIENT_CERTIFICATE_PASSWORD=<optional-password>
```

**For username/password:**
```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_USERNAME=<username>
AZURE_PASSWORD=<password>
```

## Azure CLI Credential

For local development using `az login`.

```java
import com.azure.identity.AzureCliCredential;
import com.azure.identity.AzureCliCredentialBuilder;

AzureCliCredential credential = new AzureCliCredentialBuilder()
    .tenantId("<tenant-id>") // Optional: specific tenant
    .build();
```

## Interactive Browser

For desktop applications requiring user login.

```java
import com.azure.identity.InteractiveBrowserCredential;
import com.azure.identity.InteractiveBrowserCredentialBuilder;

InteractiveBrowserCredential credential = new InteractiveBrowserCredentialBuilder()
    .clientId("<client-id>")
    .redirectUrl("http://localhost:8080") // Must match app registration
    .build();
```

## Device Code

For headless devices (IoT, CLI tools).

```java
import com.azure.identity.DeviceCodeCredential;
import com.azure.identity.DeviceCodeCredentialBuilder;

DeviceCodeCredential credential = new DeviceCodeCredentialBuilder()
    .clientId("<client-id>")
    .challengeConsumer(challenge -> {
        // Display to user
        System.out.println(challenge.getMessage());
    })
    .build();
```

## Chained Credential

Create custom authentication chains.

```java
import com.azure.identity.ChainedTokenCredential;
import com.azure.identity.ChainedTokenCredentialBuilder;

ChainedTokenCredential credential = new ChainedTokenCredentialBuilder()
    .addFirst(new ManagedIdentityCredentialBuilder().build())
    .addLast(new AzureCliCredentialBuilder().build())
    .build();
```

## Workload Identity (AKS)

For Azure Kubernetes Service with workload identity.

```java
import com.azure.identity.WorkloadIdentityCredential;
import com.azure.identity.WorkloadIdentityCredentialBuilder;

// Reads from AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_FEDERATED_TOKEN_FILE
WorkloadIdentityCredential credential = new WorkloadIdentityCredentialBuilder().build();

// Or explicit configuration
WorkloadIdentityCredential explicitCredential = new WorkloadIdentityCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .tokenFilePath("/var/run/secrets/azure/tokens/azure-identity-token")
    .build();
```

## Token Caching

Token caching reduces authentication round-trips.

```java
import com.azure.identity.SharedTokenCacheCredential;
import com.azure.identity.SharedTokenCacheCredentialBuilder;

// In-memory token caching is enabled by default
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

// With shared token cache (for multi-credential scenarios)
SharedTokenCacheCredential sharedCacheCredential = new SharedTokenCacheCredentialBuilder()
    .clientId("<client-id>")
    .build();
```

## Sovereign Clouds

```java
import com.azure.identity.AzureAuthorityHosts;

// Azure Government
DefaultAzureCredential govCredential = new DefaultAzureCredentialBuilder()
    .authorityHost(AzureAuthorityHosts.AZURE_GOVERNMENT)
    .build();

// Azure China
DefaultAzureCredential chinaCredential = new DefaultAzureCredentialBuilder()
    .authorityHost(AzureAuthorityHosts.AZURE_CHINA)
    .build();
```

## Error Handling

```java
import com.azure.core.credential.AccessToken;
import com.azure.core.credential.TokenRequestContext;
import com.azure.identity.CredentialUnavailableException;
import com.azure.core.exception.ClientAuthenticationException;

try {
    DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();
    AccessToken token = credential.getToken(new TokenRequestContext()
        .addScopes("https://management.azure.com/.default"));
} catch (CredentialUnavailableException e) {
    // No credential could authenticate
    System.out.println("Authentication failed: " + e.getMessage());
} catch (ClientAuthenticationException e) {
    // Authentication error (wrong credentials, expired, etc.)
    System.out.println("Auth error: " + e.getMessage());
}
```

## Logging

Enable authentication logging for debugging.

```java
// Via environment variable
// AZURE_LOG_LEVEL=verbose

// Or programmatically
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder()
    .enableAccountIdentifierLogging() // Log account info
    .build();
```

## Environment Variables

```bash
# DefaultAzureCredential configuration
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>

# Managed Identity
AZURE_CLIENT_ID=<user-assigned-mi-client-id>

# Workload Identity (AKS)
AZURE_FEDERATED_TOKEN_FILE=/var/run/secrets/azure/tokens/azure-identity-token

# Logging
AZURE_LOG_LEVEL=verbose

# Authority host
AZURE_AUTHORITY_HOST=https://login.microsoftonline.com/
```

## Best Practices

1. **Use DefaultAzureCredential** - Works seamlessly from dev to production
2. **Managed Identity in Production** - No secrets to manage, automatic rotation
3. **Azure CLI for Local Dev** - Run `az login` before running your app
4. **Least Privilege** - Grant only required permissions to service principals
5. **Token Caching** - Enabled by default, reduces auth round-trips
6. **Environment Variables** - Use for CI/CD, not hardcoded secrets
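As a rough illustration of how these rules combine, here is a hypothetical selector that infers an authentication strategy from the process environment. The helper and its labels are illustrative only; `MSI_ENDPOINT`/`IDENTITY_ENDPOINT` are the variables Azure compute sets for managed identity, and the other names come from the sections above:

```java
import java.util.Map;

public final class CredentialStrategy {
    private CredentialStrategy() {}

    /** Pick an authentication strategy label from the process environment. */
    public static String choose(Map<String, String> env) {
        if (env.containsKey("AZURE_CLIENT_SECRET")
                || env.containsKey("AZURE_CLIENT_CERTIFICATE_PATH")) {
            return "environment";       // service principal via env vars (CI/CD)
        }
        if (env.containsKey("AZURE_FEDERATED_TOKEN_FILE")) {
            return "workload-identity"; // AKS workload identity
        }
        if (env.containsKey("MSI_ENDPOINT") || env.containsKey("IDENTITY_ENDPOINT")) {
            return "managed-identity";  // Azure-hosted compute
        }
        return "developer-tool";        // fall back to az login / IDE credentials
    }
}
```

This is essentially what `DefaultAzureCredential` does internally, which is why it is the recommended default.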
## Credential Selection Matrix

| Environment | Recommended Credential |
|-------------|------------------------|
| Local Development | `DefaultAzureCredential` (uses Azure CLI) |
| Azure App Service | `DefaultAzureCredential` (uses Managed Identity) |
| Azure Functions | `DefaultAzureCredential` (uses Managed Identity) |
| Azure Kubernetes Service | `WorkloadIdentityCredential` |
| Azure VMs | `DefaultAzureCredential` (uses Managed Identity) |
| CI/CD Pipeline | `EnvironmentCredential` |
| Desktop App | `InteractiveBrowserCredential` |
| CLI Tool | `DeviceCodeCredential` |

## Trigger Phrases

- "Azure authentication Java", "DefaultAzureCredential Java"
- "managed identity Java", "service principal Java"
- "Azure login Java", "Azure credentials Java"
- "AZURE_CLIENT_ID", "AZURE_TENANT_ID"
skills/official/microsoft/java/entra/keyvault-keys/SKILL.md (new file)
@@ -0,0 +1,362 @@
---
name: azure-security-keyvault-keys-java
description: Azure Key Vault Keys Java SDK for cryptographic key management. Use when creating, managing, or using RSA/EC keys, performing encrypt/decrypt/sign/verify operations, or working with HSM-backed keys.
package: com.azure:azure-security-keyvault-keys
---

# Azure Key Vault Keys (Java)

Manage cryptographic keys and perform cryptographic operations in Azure Key Vault and Managed HSM.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-security-keyvault-keys</artifactId>
    <version>4.9.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.security.keyvault.keys.KeyAsyncClient;
import com.azure.security.keyvault.keys.KeyClient;
import com.azure.security.keyvault.keys.KeyClientBuilder;
import com.azure.security.keyvault.keys.cryptography.CryptographyClient;
import com.azure.security.keyvault.keys.cryptography.CryptographyClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

// Key management client
KeyClient keyClient = new KeyClientBuilder()
    .vaultUrl("https://<vault-name>.vault.azure.net")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();

// Async client
KeyAsyncClient keyAsyncClient = new KeyClientBuilder()
    .vaultUrl("https://<vault-name>.vault.azure.net")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();

// Cryptography client (for encrypt/decrypt/sign/verify)
CryptographyClient cryptoClient = new CryptographyClientBuilder()
    .keyIdentifier("https://<vault-name>.vault.azure.net/keys/<key-name>/<key-version>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Key Types

| Type | Description |
|------|-------------|
| `RSA` | RSA key (2048, 3072, 4096 bits) |
| `RSA_HSM` | RSA key in HSM |
| `EC` | Elliptic Curve key |
| `EC_HSM` | Elliptic Curve key in HSM |
| `OCT` | Symmetric key (Managed HSM only) |
| `OCT_HSM` | Symmetric key in HSM |

## Create Keys

### Create RSA Key

```java
import com.azure.security.keyvault.keys.models.*;
import java.time.OffsetDateTime;
import java.util.Map;

// Simple RSA key
KeyVaultKey rsaKey = keyClient.createRsaKey(new CreateRsaKeyOptions("my-rsa-key")
    .setKeySize(2048));

System.out.println("Key name: " + rsaKey.getName());
System.out.println("Key ID: " + rsaKey.getId());
System.out.println("Key type: " + rsaKey.getKeyType());

// RSA key with options
KeyVaultKey rsaKeyWithOptions = keyClient.createRsaKey(new CreateRsaKeyOptions("my-rsa-key-2")
    .setKeySize(4096)
    .setExpiresOn(OffsetDateTime.now().plusYears(1))
    .setNotBefore(OffsetDateTime.now())
    .setEnabled(true)
    .setKeyOperations(KeyOperation.ENCRYPT, KeyOperation.DECRYPT,
        KeyOperation.WRAP_KEY, KeyOperation.UNWRAP_KEY)
    .setTags(Map.of("environment", "production")));

// HSM-backed RSA key
KeyVaultKey hsmKey = keyClient.createRsaKey(new CreateRsaKeyOptions("my-hsm-key")
    .setKeySize(2048)
    .setHardwareProtected(true));
```

### Create EC Key

```java
// EC key with P-256 curve
KeyVaultKey ecKey = keyClient.createEcKey(new CreateEcKeyOptions("my-ec-key")
    .setCurveName(KeyCurveName.P_256));

// EC key with other curves
KeyVaultKey ecKey384 = keyClient.createEcKey(new CreateEcKeyOptions("my-ec-key-384")
    .setCurveName(KeyCurveName.P_384));

KeyVaultKey ecKey521 = keyClient.createEcKey(new CreateEcKeyOptions("my-ec-key-521")
    .setCurveName(KeyCurveName.P_521));

// HSM-backed EC key
KeyVaultKey ecHsmKey = keyClient.createEcKey(new CreateEcKeyOptions("my-ec-hsm-key")
    .setCurveName(KeyCurveName.P_256)
    .setHardwareProtected(true));
```

### Create Symmetric Key (Managed HSM only)

```java
KeyVaultKey octKey = keyClient.createOctKey(new CreateOctKeyOptions("my-symmetric-key")
    .setKeySize(256)
    .setHardwareProtected(true));
```

## Get Key

```java
// Get latest version
KeyVaultKey key = keyClient.getKey("my-key");

// Get specific version
KeyVaultKey keyVersion = keyClient.getKey("my-key", "<version-id>");

// Read the key's properties
KeyProperties keyProps = keyClient.getKey("my-key").getProperties();
```

## Update Key Properties

```java
KeyVaultKey key = keyClient.getKey("my-key");

// Update properties
key.getProperties()
    .setEnabled(false)
    .setExpiresOn(OffsetDateTime.now().plusMonths(6))
    .setTags(Map.of("status", "archived"));

KeyVaultKey updatedKey = keyClient.updateKeyProperties(key.getProperties(),
    KeyOperation.ENCRYPT, KeyOperation.DECRYPT);
```

## List Keys

```java
// List all keys
for (KeyProperties keyProps : keyClient.listPropertiesOfKeys()) {
    System.out.println("Key: " + keyProps.getName());
    System.out.println("  Enabled: " + keyProps.isEnabled());
    System.out.println("  Created: " + keyProps.getCreatedOn());
}

// List key versions
for (KeyProperties version : keyClient.listPropertiesOfKeyVersions("my-key")) {
    System.out.println("Version: " + version.getVersion());
    System.out.println("Created: " + version.getCreatedOn());
}
```

## Delete Key

```java
import com.azure.core.util.polling.SyncPoller;

// Begin delete (soft-delete enabled vaults)
SyncPoller<DeletedKey, Void> deletePoller = keyClient.beginDeleteKey("my-key");

// Wait for deletion
DeletedKey deletedKey = deletePoller.poll().getValue();
System.out.println("Deleted: " + deletedKey.getDeletedOn());

deletePoller.waitForCompletion();

// Purge deleted key (permanent deletion)
keyClient.purgeDeletedKey("my-key");

// Recover deleted key
SyncPoller<KeyVaultKey, Void> recoverPoller = keyClient.beginRecoverDeletedKey("my-key");
recoverPoller.waitForCompletion();
```

## Cryptographic Operations

### Encrypt/Decrypt

```java
import com.azure.security.keyvault.keys.cryptography.models.*;
import java.nio.charset.StandardCharsets;

CryptographyClient cryptoClient = new CryptographyClientBuilder()
    .keyIdentifier("https://<vault>.vault.azure.net/keys/<key-name>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();

byte[] plaintext = "Hello, World!".getBytes(StandardCharsets.UTF_8);

// Encrypt
EncryptResult encryptResult = cryptoClient.encrypt(EncryptionAlgorithm.RSA_OAEP, plaintext);
byte[] ciphertext = encryptResult.getCipherText();
System.out.println("Ciphertext length: " + ciphertext.length);

// Decrypt
DecryptResult decryptResult = cryptoClient.decrypt(EncryptionAlgorithm.RSA_OAEP, ciphertext);
String decrypted = new String(decryptResult.getPlainText(), StandardCharsets.UTF_8);
System.out.println("Decrypted: " + decrypted);
```

### Sign/Verify

```java
import java.security.MessageDigest;

// Create digest of data
byte[] data = "Data to sign".getBytes(StandardCharsets.UTF_8);
MessageDigest md = MessageDigest.getInstance("SHA-256");
byte[] digest = md.digest(data);

// Sign
SignResult signResult = cryptoClient.sign(SignatureAlgorithm.RS256, digest);
byte[] signature = signResult.getSignature();

// Verify
VerifyResult verifyResult = cryptoClient.verify(SignatureAlgorithm.RS256, digest, signature);
System.out.println("Valid signature: " + verifyResult.isValid());
```

### Wrap/Unwrap Key

```java
import java.security.SecureRandom;

// Key to wrap (e.g., AES key)
byte[] keyToWrap = new byte[32]; // 256-bit key
new SecureRandom().nextBytes(keyToWrap);

// Wrap
WrapResult wrapResult = cryptoClient.wrapKey(KeyWrapAlgorithm.RSA_OAEP, keyToWrap);
byte[] wrappedKey = wrapResult.getEncryptedKey();

// Unwrap
UnwrapResult unwrapResult = cryptoClient.unwrapKey(KeyWrapAlgorithm.RSA_OAEP, wrappedKey);
byte[] unwrappedKey = unwrapResult.getKey();
```

## Backup and Restore

```java
import java.nio.file.Files;
import java.nio.file.Paths;

// Backup
byte[] backup = keyClient.backupKey("my-key");

// Save backup to file
Files.write(Paths.get("key-backup.blob"), backup);

// Restore
byte[] backupData = Files.readAllBytes(Paths.get("key-backup.blob"));
KeyVaultKey restoredKey = keyClient.restoreKeyBackup(backupData);
```

## Key Rotation

```java
import java.util.Arrays;

// Rotate to new version
KeyVaultKey rotatedKey = keyClient.rotateKey("my-key");
System.out.println("New version: " + rotatedKey.getProperties().getVersion());

// Set rotation policy
KeyRotationPolicy policy = new KeyRotationPolicy()
    .setExpiresIn("P90D") // Expire after 90 days
    .setLifetimeActions(Arrays.asList(
        new KeyRotationLifetimeAction(KeyRotationPolicyAction.ROTATE)
            .setTimeBeforeExpiry("P30D"))); // Rotate 30 days before expiry

keyClient.updateKeyRotationPolicy("my-key", policy);

// Get rotation policy
KeyRotationPolicy currentPolicy = keyClient.getKeyRotationPolicy("my-key");
```

## Import Key

```java
import com.azure.security.keyvault.keys.models.ImportKeyOptions;
import com.azure.security.keyvault.keys.models.JsonWebKey;

// Import existing key material
JsonWebKey jsonWebKey = new JsonWebKey()
    .setKeyType(KeyType.RSA)
    .setN(modulus)
    .setE(exponent)
    .setD(privateExponent)
    // ... other RSA components
    ;

ImportKeyOptions importOptions = new ImportKeyOptions("imported-key", jsonWebKey)
    .setHardwareProtected(false);

KeyVaultKey importedKey = keyClient.importKey(importOptions);
```

## Encryption Algorithms

| Algorithm | Key Type | Description |
|-----------|----------|-------------|
| `RSA1_5` | RSA | RSAES-PKCS1-v1_5 |
| `RSA_OAEP` | RSA | RSAES with OAEP (recommended) |
| `RSA_OAEP_256` | RSA | RSAES with OAEP using SHA-256 |
| `A128GCM` | OCT | AES-GCM 128-bit |
| `A256GCM` | OCT | AES-GCM 256-bit |
| `A128CBC` | OCT | AES-CBC 128-bit |
| `A256CBC` | OCT | AES-CBC 256-bit |
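RSA encryption only handles small payloads: with OAEP padding the maximum plaintext is the key size in bytes minus twice the hash length minus 2. A quick sketch of that arithmetic (plain Java, no SDK calls; the helper class is illustrative):

```java
public final class RsaOaepLimits {
    private RsaOaepLimits() {}

    /** Max plaintext bytes for RSA-OAEP: keyBytes - 2*hashBytes - 2. */
    public static int maxPlaintextBytes(int keyBits, int hashBytes) {
        return keyBits / 8 - 2 * hashBytes - 2;
    }
}
```

For a 2048-bit key this gives 214 bytes with `RSA_OAEP` (SHA-1, 20-byte hash) and 190 bytes with `RSA_OAEP_256` (SHA-256, 32-byte hash). For larger data, encrypt locally with a symmetric key and protect that key with the wrap/unwrap operations shown above.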
## Signature Algorithms

| Algorithm | Key Type | Hash |
|-----------|----------|------|
| `RS256` | RSA | SHA-256 |
| `RS384` | RSA | SHA-384 |
| `RS512` | RSA | SHA-512 |
| `PS256` | RSA | SHA-256 (PSS) |
| `ES256` | EC P-256 | SHA-256 |
| `ES384` | EC P-384 | SHA-384 |
| `ES512` | EC P-521 | SHA-512 |
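The sign/verify calls shown earlier expect a digest computed with the hash that matches the chosen algorithm. A hypothetical helper mapping the algorithm names in this table to Java `MessageDigest` names (the numeric suffix of each algorithm names its hash):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public final class Digests {
    private Digests() {}

    /** MessageDigest name for a Key Vault signature algorithm, per the table above. */
    public static String digestFor(String signatureAlgorithm) {
        switch (signatureAlgorithm) {
            case "RS256": case "PS256": case "ES256": return "SHA-256";
            case "RS384": case "ES384": return "SHA-384";
            case "RS512": case "ES512": return "SHA-512";
            default: throw new IllegalArgumentException(signatureAlgorithm);
        }
    }

    /** Compute the digest to pass to cryptoClient.sign(...) for the given algorithm. */
    public static byte[] digest(String signatureAlgorithm, byte[] data)
            throws NoSuchAlgorithmException {
        return MessageDigest.getInstance(digestFor(signatureAlgorithm)).digest(data);
    }
}
```

Signing the digest with a mismatched hash (e.g. a SHA-256 digest with `ES384`) produces signatures that fail verification.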
## Error Handling

```java
import com.azure.core.exception.HttpResponseException;
import com.azure.core.exception.ResourceNotFoundException;

try {
    KeyVaultKey key = keyClient.getKey("non-existent-key");
} catch (ResourceNotFoundException e) {
    System.out.println("Key not found: " + e.getMessage());
} catch (HttpResponseException e) {
    System.out.println("HTTP error " + e.getResponse().getStatusCode());
    System.out.println("Message: " + e.getMessage());
}
```

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net
```

## Best Practices

1. **Use HSM Keys for Production** - Set `setHardwareProtected(true)` for sensitive keys
2. **Enable Soft Delete** - Protects against accidental deletion
3. **Key Rotation** - Set up automatic rotation policies
4. **Least Privilege** - Use separate keys for different operations
5. **Local Crypto When Possible** - Use `CryptographyClient` with local key material to reduce round-trips

## Trigger Phrases

- "Key Vault keys Java", "cryptographic keys Java"
- "encrypt decrypt Java", "sign verify Java"
- "RSA key", "EC key", "HSM key"
- "key rotation", "wrap unwrap key"
356
skills/official/microsoft/java/entra/keyvault-secrets/SKILL.md
Normal file
356
skills/official/microsoft/java/entra/keyvault-secrets/SKILL.md
Normal file
@@ -0,0 +1,356 @@
---
name: azure-security-keyvault-secrets-java
description: Azure Key Vault Secrets Java SDK for secret management. Use when storing, retrieving, or managing passwords, API keys, connection strings, or other sensitive configuration data.
package: com.azure:azure-security-keyvault-secrets
---

# Azure Key Vault Secrets (Java)

Securely store and manage secrets like passwords, API keys, and connection strings.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-security-keyvault-secrets</artifactId>
    <version>4.9.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.security.keyvault.secrets.SecretAsyncClient;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

// Sync client
SecretClient secretClient = new SecretClientBuilder()
    .vaultUrl("https://<vault-name>.vault.azure.net")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();

// Async client
SecretAsyncClient secretAsyncClient = new SecretClientBuilder()
    .vaultUrl("https://<vault-name>.vault.azure.net")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

## Create/Set Secret

```java
import com.azure.security.keyvault.secrets.models.KeyVaultSecret;
import com.azure.security.keyvault.secrets.models.SecretProperties;
import java.time.OffsetDateTime;
import java.util.Map;

// Simple secret
KeyVaultSecret secret = secretClient.setSecret("database-password", "P@ssw0rd123!");
System.out.println("Secret name: " + secret.getName());
System.out.println("Secret ID: " + secret.getId());

// Secret with options
KeyVaultSecret secretWithOptions = secretClient.setSecret(
    new KeyVaultSecret("api-key", "sk_live_abc123xyz")
        .setProperties(new SecretProperties()
            .setContentType("application/json")
            .setExpiresOn(OffsetDateTime.now().plusYears(1))
            .setNotBefore(OffsetDateTime.now())
            .setEnabled(true)
            .setTags(Map.of(
                "environment", "production",
                "service", "payment-api"
            ))
        )
);
```

## Get Secret

```java
// Get latest version
KeyVaultSecret secret = secretClient.getSecret("database-password");
String value = secret.getValue();
System.out.println("Secret value: " + value);

// Get specific version
KeyVaultSecret specificVersion = secretClient.getSecret("database-password", "<version-id>");

// Read metadata from the retrieved secret (getSecret always fetches the value too)
SecretProperties props = secretClient.getSecret("database-password").getProperties();
System.out.println("Enabled: " + props.isEnabled());
System.out.println("Created: " + props.getCreatedOn());
```

## Update Secret Properties

```java
// Get secret
KeyVaultSecret secret = secretClient.getSecret("api-key");

// Update properties (cannot update value - create new version instead)
secret.getProperties()
    .setEnabled(false)
    .setExpiresOn(OffsetDateTime.now().plusMonths(6))
    .setTags(Map.of("status", "rotating"));

SecretProperties updated = secretClient.updateSecretProperties(secret.getProperties());
System.out.println("Updated: " + updated.getUpdatedOn());
```

## List Secrets

```java
import com.azure.core.http.rest.PagedIterable;
import com.azure.security.keyvault.secrets.models.SecretProperties;

// List all secrets (properties only, no values)
for (SecretProperties secretProps : secretClient.listPropertiesOfSecrets()) {
    System.out.println("Secret: " + secretProps.getName());
    System.out.println("  Enabled: " + secretProps.isEnabled());
    System.out.println("  Created: " + secretProps.getCreatedOn());
    System.out.println("  Content-Type: " + secretProps.getContentType());

    // Get value if needed
    if (secretProps.isEnabled()) {
        KeyVaultSecret fullSecret = secretClient.getSecret(secretProps.getName());
        String v = fullSecret.getValue();
        System.out.println("  Value: " + v.substring(0, Math.min(5, v.length())) + "...");
    }
}

// List versions of a secret
for (SecretProperties version : secretClient.listPropertiesOfSecretVersions("database-password")) {
    System.out.println("Version: " + version.getVersion());
    System.out.println("Created: " + version.getCreatedOn());
    System.out.println("Enabled: " + version.isEnabled());
}
```

## Delete Secret

```java
import com.azure.core.util.polling.SyncPoller;
import com.azure.security.keyvault.secrets.models.DeletedSecret;

// Begin delete (returns poller for soft-delete enabled vaults)
SyncPoller<DeletedSecret, Void> deletePoller = secretClient.beginDeleteSecret("old-secret");

// Wait for deletion
DeletedSecret deletedSecret = deletePoller.poll().getValue();
System.out.println("Deleted on: " + deletedSecret.getDeletedOn());
System.out.println("Scheduled purge: " + deletedSecret.getScheduledPurgeDate());

deletePoller.waitForCompletion();
```

## Recover Deleted Secret

```java
// List deleted secrets
for (DeletedSecret deleted : secretClient.listDeletedSecrets()) {
    System.out.println("Deleted: " + deleted.getName());
    System.out.println("Deletion date: " + deleted.getDeletedOn());
}

// Recover deleted secret
SyncPoller<KeyVaultSecret, Void> recoverPoller = secretClient.beginRecoverDeletedSecret("old-secret");
recoverPoller.waitForCompletion();

// The poller's final-result type is Void, so read the recovered secret from poll()
KeyVaultSecret recovered = recoverPoller.poll().getValue();
System.out.println("Recovered: " + recovered.getName());
```

## Purge Deleted Secret

```java
// Get deleted secret info first
DeletedSecret deleted = secretClient.getDeletedSecret("old-secret");
System.out.println("Will purge: " + deleted.getName());

// Permanently delete (cannot be recovered)
secretClient.purgeDeletedSecret("old-secret");
```

## Backup and Restore

```java
import java.nio.file.Files;
import java.nio.file.Paths;

// Backup secret (all versions)
byte[] backup = secretClient.backupSecret("important-secret");

// Save to file
Files.write(Paths.get("secret-backup.blob"), backup);

// Restore from backup
byte[] backupData = Files.readAllBytes(Paths.get("secret-backup.blob"));
KeyVaultSecret restored = secretClient.restoreSecretBackup(backupData);
System.out.println("Restored: " + restored.getName());
```

## Async Operations

```java
SecretAsyncClient asyncClient = new SecretClientBuilder()
    .vaultUrl("https://<vault>.vault.azure.net")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();

// Set secret async
asyncClient.setSecret("async-secret", "async-value")
    .subscribe(
        secret -> System.out.println("Created: " + secret.getName()),
        error -> System.out.println("Error: " + error.getMessage())
    );

// Get secret async
asyncClient.getSecret("async-secret")
    .subscribe(secret -> System.out.println("Value: " + secret.getValue()));

// List secrets async
asyncClient.listPropertiesOfSecrets()
    .doOnNext(props -> System.out.println("Found: " + props.getName()))
    .subscribe();
```

## Configuration Patterns

### Load Multiple Secrets

```java
public class ConfigLoader {
    private final SecretClient client;

    public ConfigLoader(String vaultUrl) {
        this.client = new SecretClientBuilder()
            .vaultUrl(vaultUrl)
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();
    }

    public Map<String, String> loadSecrets(List<String> secretNames) {
        Map<String, String> secrets = new HashMap<>();
        for (String name : secretNames) {
            try {
                KeyVaultSecret secret = client.getSecret(name);
                secrets.put(name, secret.getValue());
            } catch (ResourceNotFoundException e) {
                System.out.println("Secret not found: " + name);
            }
        }
        return secrets;
    }
}

// Usage
ConfigLoader loader = new ConfigLoader("https://my-vault.vault.azure.net");
Map<String, String> config = loader.loadSecrets(
    Arrays.asList("db-connection-string", "api-key", "jwt-secret")
);
```

### Secret Rotation Pattern

```java
public void rotateSecret(String secretName, String newValue) {
    // Get current secret
    KeyVaultSecret current = secretClient.getSecret(secretName);

    // Disable old version
    current.getProperties().setEnabled(false);
    secretClient.updateSecretProperties(current.getProperties());

    // Create new version with new value
    KeyVaultSecret newSecret = secretClient.setSecret(secretName, newValue);
    System.out.println("Rotated to version: " + newSecret.getProperties().getVersion());
}
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;
import com.azure.core.exception.ResourceNotFoundException;

try {
    KeyVaultSecret secret = secretClient.getSecret("my-secret");
    System.out.println("Value: " + secret.getValue());
} catch (ResourceNotFoundException e) {
    System.out.println("Secret not found");
} catch (HttpResponseException e) {
    int status = e.getResponse().getStatusCode();
    if (status == 403) {
        System.out.println("Access denied - check permissions");
    } else if (status == 429) {
        System.out.println("Rate limited - retry later");
    } else {
        System.out.println("HTTP error: " + status);
    }
}
```

## Secret Properties

| Property | Description |
|----------|-------------|
| `name` | Secret name |
| `value` | Secret value (string) |
| `id` | Full identifier URL |
| `contentType` | MIME type hint |
| `enabled` | Whether secret can be retrieved |
| `notBefore` | Activation time |
| `expiresOn` | Expiration time |
| `createdOn` | Creation timestamp |
| `updatedOn` | Last update timestamp |
| `recoveryLevel` | Soft-delete recovery level |
| `tags` | User-defined metadata |

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net
```

## Best Practices

1. **Enable Soft Delete** - Protects against accidental deletion
2. **Use Tags** - Tag secrets with environment, service, owner
3. **Set Expiration** - Use `setExpiresOn()` for credentials that should rotate
4. **Content Type** - Set `contentType` to indicate format (e.g., `application/json`)
5. **Version Management** - Don't delete old versions immediately during rotation
6. **Access Logging** - Enable diagnostic logging on Key Vault
7. **Least Privilege** - Use separate vaults for different environments
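
Practice 3 pairs naturally with a scheduled check. As a sketch of the decision logic only (class and method names here are hypothetical; the `expiresOn` value would come from `SecretProperties.getExpiresOn()`), this flags a secret whose expiry falls inside a warning window:

```java
import java.time.Duration;
import java.time.OffsetDateTime;

public class RotationCheck {
    // Flags a secret as due for rotation when its expiry falls within the
    // warning window. A null expiry means no expiration was set.
    public static boolean rotationDue(OffsetDateTime expiresOn,
                                      OffsetDateTime now,
                                      Duration window) {
        if (expiresOn == null) {
            return false; // no expiry set: nothing to warn about
        }
        return !expiresOn.isAfter(now.plus(window));
    }
}
```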

## Common Secret Types

```java
// Database connection string
secretClient.setSecret(new KeyVaultSecret("db-connection",
        "Server=myserver.database.windows.net;Database=mydb;...")
    .setProperties(new SecretProperties()
        .setContentType("text/plain")
        .setTags(Map.of("type", "connection-string"))));

// API key
secretClient.setSecret(new KeyVaultSecret("stripe-api-key", "sk_live_...")
    .setProperties(new SecretProperties()
        .setContentType("text/plain")
        .setExpiresOn(OffsetDateTime.now().plusYears(1))));

// JSON configuration
secretClient.setSecret(new KeyVaultSecret("app-config",
        "{\"endpoint\":\"https://...\",\"key\":\"...\"}")
    .setProperties(new SecretProperties()
        .setContentType("application/json")));

// Certificate password
secretClient.setSecret(new KeyVaultSecret("cert-password", "CertP@ss!")
    .setProperties(new SecretProperties()
        .setContentType("text/plain")
        .setTags(Map.of("certificate", "my-cert"))));
```

## Trigger Phrases

- "Key Vault secrets Java", "secret management Java"
- "store password", "store API key", "connection string"
- "retrieve secret", "rotate secret"
- "Azure secrets", "vault secrets"

256  skills/official/microsoft/java/foundry/anomalydetector/SKILL.md  Normal file
@@ -0,0 +1,256 @@
---
name: azure-ai-anomalydetector-java
description: Build anomaly detection applications with Azure AI Anomaly Detector SDK for Java. Use when implementing univariate/multivariate anomaly detection, time-series analysis, or AI-powered monitoring.
package: com.azure:azure-ai-anomalydetector
---

# Azure AI Anomaly Detector SDK for Java

Build anomaly detection applications using the Azure AI Anomaly Detector SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-anomalydetector</artifactId>
    <version>3.0.0-beta.6</version>
</dependency>
```

## Client Creation

### Sync and Async Clients

```java
import com.azure.ai.anomalydetector.AnomalyDetectorClientBuilder;
import com.azure.ai.anomalydetector.MultivariateClient;
import com.azure.ai.anomalydetector.UnivariateClient;
import com.azure.core.credential.AzureKeyCredential;

String endpoint = System.getenv("AZURE_ANOMALY_DETECTOR_ENDPOINT");
String key = System.getenv("AZURE_ANOMALY_DETECTOR_API_KEY");

// Multivariate client for multiple correlated signals
MultivariateClient multivariateClient = new AnomalyDetectorClientBuilder()
    .credential(new AzureKeyCredential(key))
    .endpoint(endpoint)
    .buildMultivariateClient();

// Univariate client for single variable analysis
UnivariateClient univariateClient = new AnomalyDetectorClientBuilder()
    .credential(new AzureKeyCredential(key))
    .endpoint(endpoint)
    .buildUnivariateClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

MultivariateClient client = new AnomalyDetectorClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(endpoint)
    .buildMultivariateClient();
```

## Key Concepts

### Univariate Anomaly Detection
- **Batch Detection**: Analyze entire time series at once
- **Streaming Detection**: Real-time detection on latest data point
- **Change Point Detection**: Detect trend changes in time series
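
For intuition only (the service uses trained statistical models, not this), batch detection conceptually asks which points sit far from the rest of the series. A naive z-score version in plain Java:

```java
import java.util.ArrayList;
import java.util.List;

public class ZScoreSketch {
    // Returns indices of values more than `threshold` standard deviations
    // from the series mean. Illustration of the idea only.
    public static List<Integer> anomalies(double[] series, double threshold) {
        double mean = 0;
        for (double v : series) mean += v;
        mean /= series.length;

        double variance = 0;
        for (double v : series) variance += (v - mean) * (v - mean);
        double std = Math.sqrt(variance / series.length);

        List<Integer> hits = new ArrayList<>();
        for (int i = 0; i < series.length; i++) {
            if (std > 0 && Math.abs(series[i] - mean) / std > threshold) {
                hits.add(i);
            }
        }
        return hits;
    }
}
```

A flat series with one spike, e.g. nine 1.0s followed by a 10.0, would flag only the final index.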

### Multivariate Anomaly Detection
- Detect anomalies across 300+ correlated signals
- Uses Graph Attention Network for inter-correlations
- Three-step process: Train → Inference → Results

## Core Patterns

### Univariate Batch Detection

```java
import com.azure.ai.anomalydetector.models.*;
import java.time.OffsetDateTime;
import java.util.List;

List<TimeSeriesPoint> series = List.of(
    new TimeSeriesPoint(OffsetDateTime.parse("2023-01-01T00:00:00Z"), 1.0),
    new TimeSeriesPoint(OffsetDateTime.parse("2023-01-02T00:00:00Z"), 2.5)
    // ... more data points (minimum 12 points required)
);

UnivariateDetectionOptions options = new UnivariateDetectionOptions(series)
    .setGranularity(TimeGranularity.DAILY)
    .setSensitivity(95);

UnivariateEntireDetectionResult result = univariateClient.detectUnivariateEntireSeries(options);

// Check for anomalies
for (int i = 0; i < result.getIsAnomaly().size(); i++) {
    if (result.getIsAnomaly().get(i)) {
        System.out.printf("Anomaly detected at index %d with value %.2f%n",
            i, series.get(i).getValue());
    }
}
```

### Univariate Last Point Detection (Streaming)

```java
UnivariateLastDetectionResult lastResult = univariateClient.detectUnivariateLastPoint(options);

if (lastResult.isAnomaly()) {
    System.out.println("Latest point is an anomaly!");
    System.out.printf("Expected: %.2f, Upper: %.2f, Lower: %.2f%n",
        lastResult.getExpectedValue(),
        lastResult.getUpperMargin(),
        lastResult.getLowerMargin());
}
```

### Change Point Detection

```java
UnivariateChangePointDetectionOptions changeOptions =
    new UnivariateChangePointDetectionOptions(series, TimeGranularity.DAILY);

UnivariateChangePointDetectionResult changeResult =
    univariateClient.detectUnivariateChangePoint(changeOptions);

for (int i = 0; i < changeResult.getIsChangePoint().size(); i++) {
    if (changeResult.getIsChangePoint().get(i)) {
        System.out.printf("Change point at index %d with confidence %.2f%n",
            i, changeResult.getConfidenceScores().get(i));
    }
}
```

### Multivariate Model Training

```java
import com.azure.ai.anomalydetector.models.*;
import com.azure.core.util.polling.SyncPoller;

// Prepare training request with blob storage data
ModelInfo modelInfo = new ModelInfo()
    .setDataSource("https://storage.blob.core.windows.net/container/data.zip?sasToken")
    .setStartTime(OffsetDateTime.parse("2023-01-01T00:00:00Z"))
    .setEndTime(OffsetDateTime.parse("2023-06-01T00:00:00Z"))
    .setSlidingWindow(200)
    .setDisplayName("MyMultivariateModel");

// Train model (long-running operation)
AnomalyDetectionModel trainedModel = multivariateClient.trainMultivariateModel(modelInfo);

String modelId = trainedModel.getModelId();
System.out.println("Model ID: " + modelId);

// Check training status
AnomalyDetectionModel model = multivariateClient.getMultivariateModel(modelId);
System.out.println("Status: " + model.getModelInfo().getStatus());
```

### Multivariate Batch Inference

```java
MultivariateBatchDetectionOptions detectionOptions = new MultivariateBatchDetectionOptions()
    .setDataSource("https://storage.blob.core.windows.net/container/inference-data.zip?sasToken")
    .setStartTime(OffsetDateTime.parse("2023-07-01T00:00:00Z"))
    .setEndTime(OffsetDateTime.parse("2023-07-31T00:00:00Z"))
    .setTopContributorCount(10);

MultivariateDetectionResult detectionResult =
    multivariateClient.detectMultivariateBatchAnomaly(modelId, detectionOptions);

String resultId = detectionResult.getResultId();

// Poll for results
MultivariateDetectionResult result = multivariateClient.getBatchDetectionResult(resultId);
for (AnomalyState state : result.getResults()) {
    if (state.getValue().isAnomaly()) {
        System.out.printf("Anomaly at %s, severity: %.2f%n",
            state.getTimestamp(),
            state.getValue().getSeverity());
    }
}
```

### Multivariate Last Point Detection

```java
MultivariateLastDetectionOptions lastOptions = new MultivariateLastDetectionOptions()
    .setVariables(List.of(
        new VariableValues("variable1", List.of("timestamp1"), List.of(1.0f)),
        new VariableValues("variable2", List.of("timestamp1"), List.of(2.5f))
    ))
    .setTopContributorCount(5);

MultivariateLastDetectionResult lastResult =
    multivariateClient.detectMultivariateLastAnomaly(modelId, lastOptions);

if (lastResult.getValue().isAnomaly()) {
    System.out.println("Anomaly detected!");
    // Check contributing variables
    for (AnomalyContributor contributor : lastResult.getValue().getInterpretation()) {
        System.out.printf("Variable: %s, Contribution: %.2f%n",
            contributor.getVariable(),
            contributor.getContributionScore());
    }
}
```

### Model Management

```java
// List all models
PagedIterable<AnomalyDetectionModel> models = multivariateClient.listMultivariateModels();
for (AnomalyDetectionModel m : models) {
    System.out.printf("Model: %s, Status: %s%n",
        m.getModelId(),
        m.getModelInfo().getStatus());
}

// Delete a model
multivariateClient.deleteMultivariateModel(modelId);
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    univariateClient.detectUnivariateEntireSeries(options);
} catch (HttpResponseException e) {
    System.out.println("Status code: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
AZURE_ANOMALY_DETECTOR_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
AZURE_ANOMALY_DETECTOR_API_KEY=<your-api-key>
```

## Best Practices

1. **Minimum Data Points**: Univariate requires at least 12 points; more data improves accuracy
2. **Granularity Alignment**: Match `TimeGranularity` to your actual data frequency
3. **Sensitivity Tuning**: Higher values (0-99) detect more anomalies
4. **Multivariate Training**: Use a sliding window of 200-1000 depending on pattern complexity
5. **Error Handling**: Always handle `HttpResponseException` for API errors

## Trigger Phrases

- "anomaly detection Java"
- "detect anomalies time series"
- "multivariate anomaly Java"
- "univariate anomaly detection"
- "streaming anomaly detection"
- "change point detection"
- "Azure AI Anomaly Detector"

282  skills/official/microsoft/java/foundry/contentsafety/SKILL.md  Normal file
@@ -0,0 +1,282 @@
---
name: azure-ai-contentsafety-java
description: Build content moderation applications with Azure AI Content Safety SDK for Java. Use when implementing text/image analysis, blocklist management, or harm detection for hate, violence, sexual content, and self-harm.
package: com.azure:azure-ai-contentsafety
---

# Azure AI Content Safety SDK for Java

Build content moderation applications using the Azure AI Content Safety SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-contentsafety</artifactId>
    <version>1.1.0-beta.1</version>
</dependency>
```

## Client Creation

### With API Key

```java
import com.azure.ai.contentsafety.ContentSafetyClient;
import com.azure.ai.contentsafety.ContentSafetyClientBuilder;
import com.azure.ai.contentsafety.BlocklistClient;
import com.azure.ai.contentsafety.BlocklistClientBuilder;
import com.azure.core.credential.KeyCredential;

String endpoint = System.getenv("CONTENT_SAFETY_ENDPOINT");
String key = System.getenv("CONTENT_SAFETY_KEY");

ContentSafetyClient contentSafetyClient = new ContentSafetyClientBuilder()
    .credential(new KeyCredential(key))
    .endpoint(endpoint)
    .buildClient();

BlocklistClient blocklistClient = new BlocklistClientBuilder()
    .credential(new KeyCredential(key))
    .endpoint(endpoint)
    .buildClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

ContentSafetyClient client = new ContentSafetyClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(endpoint)
    .buildClient();
```

## Key Concepts

### Harm Categories

| Category | Description |
|----------|-------------|
| Hate | Discriminatory language based on identity groups |
| Sexual | Sexual content, relationships, acts |
| Violence | Physical harm, weapons, injury |
| Self-harm | Self-injury, suicide-related content |

### Severity Levels
- Text: 0-7 scale (default outputs 0, 2, 4, 6)
- Image: 0, 2, 4, 6 (trimmed scale)
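
These severities typically feed a thresholding decision. A minimal hypothetical policy (the `>= 4` cutoff matches the strict-moderation suggestion under Best Practices below; the class name is illustrative):

```java
public class ModerationPolicy {
    // Blocks content whose severity meets the strict-moderation threshold
    // (>= 4 on the 0-7 text scale; image severities are already 0/2/4/6).
    public static boolean shouldBlock(int severity) {
        return severity >= 4;
    }
}
```

In practice you would apply this per category to each `getSeverity()` value returned by the analyze calls, possibly with different thresholds per category.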
|
||||
|
||||
## Core Patterns
|
||||
|
||||
### Analyze Text
|
||||
|
||||
```java
|
||||
import com.azure.ai.contentsafety.models.*;
|
||||
|
||||
AnalyzeTextResult result = contentSafetyClient.analyzeText(
|
||||
new AnalyzeTextOptions("This is text to analyze"));
|
||||
|
||||
for (TextCategoriesAnalysis category : result.getCategoriesAnalysis()) {
|
||||
System.out.printf("Category: %s, Severity: %d%n",
|
||||
category.getCategory(),
|
||||
category.getSeverity());
|
||||
}
|
||||
```
|
||||
|
||||
### Analyze Text with Options
|
||||
|
||||
```java
|
||||
AnalyzeTextOptions options = new AnalyzeTextOptions("Text to analyze")
|
||||
.setCategories(Arrays.asList(
|
||||
TextCategory.HATE,
|
||||
TextCategory.VIOLENCE))
|
||||
.setOutputType(AnalyzeTextOutputType.EIGHT_SEVERITY_LEVELS);
|
||||
|
||||
AnalyzeTextResult result = contentSafetyClient.analyzeText(options);
|
||||
```
|
||||
|
||||
### Analyze Text with Blocklist
|
||||
|
||||
```java
|
||||
AnalyzeTextOptions options = new AnalyzeTextOptions("I h*te you and want to k*ll you")
|
||||
.setBlocklistNames(Arrays.asList("my-blocklist"))
|
||||
.setHaltOnBlocklistHit(true);
|
||||
|
||||
AnalyzeTextResult result = contentSafetyClient.analyzeText(options);
|
||||
|
||||
if (result.getBlocklistsMatch() != null) {
|
||||
for (TextBlocklistMatch match : result.getBlocklistsMatch()) {
|
||||
System.out.printf("Blocklist: %s, Item: %s, Text: %s%n",
|
||||
match.getBlocklistName(),
|
||||
match.getBlocklistItemId(),
|
||||
match.getBlocklistItemText());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Analyze Image
|
||||
|
||||
```java
|
||||
import com.azure.ai.contentsafety.models.*;
|
||||
import com.azure.core.util.BinaryData;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Paths;
|
||||
|
||||
// From file
|
||||
byte[] imageBytes = Files.readAllBytes(Paths.get("image.png"));
|
||||
ContentSafetyImageData imageData = new ContentSafetyImageData()
|
||||
.setContent(BinaryData.fromBytes(imageBytes));
|
||||
|
||||
AnalyzeImageResult result = contentSafetyClient.analyzeImage(
|
||||
new AnalyzeImageOptions(imageData));
|
||||
|
||||
for (ImageCategoriesAnalysis category : result.getCategoriesAnalysis()) {
|
||||
System.out.printf("Category: %s, Severity: %d%n",
|
||||
category.getCategory(),
|
||||
category.getSeverity());
|
||||
}
|
||||
```
|
||||
|
||||
### Analyze Image from URL
|
||||
|
||||
```java
|
||||
ContentSafetyImageData imageData = new ContentSafetyImageData()
|
||||
.setBlobUrl("https://example.com/image.jpg");
|
||||
|
||||
AnalyzeImageResult result = contentSafetyClient.analyzeImage(
|
||||
new AnalyzeImageOptions(imageData));
|
||||
```
|
||||
|
||||
## Blocklist Management
|
||||
|
||||
### Create or Update Blocklist
|
||||
|
||||
```java
|
||||
import com.azure.core.http.rest.RequestOptions;
|
||||
import com.azure.core.http.rest.Response;
|
||||
import com.azure.core.util.BinaryData;
|
||||
import java.util.Map;
|
||||
|
||||
Map<String, String> description = Map.of("description", "Custom blocklist");
|
||||
BinaryData resource = BinaryData.fromObject(description);
|
||||
|
Response<BinaryData> response = blocklistClient.createOrUpdateTextBlocklistWithResponse(
    "my-blocklist", resource, new RequestOptions());

if (response.getStatusCode() == 201) {
    System.out.println("Blocklist created");
} else if (response.getStatusCode() == 200) {
    System.out.println("Blocklist updated");
}
```

### Add Block Items

```java
import com.azure.ai.contentsafety.models.*;
import java.util.Arrays;
import java.util.List;

List<TextBlocklistItem> items = Arrays.asList(
    new TextBlocklistItem("badword1").setDescription("Offensive term"),
    new TextBlocklistItem("badword2").setDescription("Another term")
);

AddOrUpdateTextBlocklistItemsResult result = blocklistClient.addOrUpdateBlocklistItems(
    "my-blocklist",
    new AddOrUpdateTextBlocklistItemsOptions(items));

for (TextBlocklistItem item : result.getBlocklistItems()) {
    System.out.printf("Added: %s (ID: %s)%n",
        item.getText(),
        item.getBlocklistItemId());
}
```

### List Blocklists

```java
PagedIterable<TextBlocklist> blocklists = blocklistClient.listTextBlocklists();

for (TextBlocklist blocklist : blocklists) {
    System.out.printf("Blocklist: %s, Description: %s%n",
        blocklist.getName(),
        blocklist.getDescription());
}
```

### Get Blocklist

```java
TextBlocklist blocklist = blocklistClient.getTextBlocklist("my-blocklist");
System.out.println("Name: " + blocklist.getName());
```

### List Block Items

```java
PagedIterable<TextBlocklistItem> items =
    blocklistClient.listTextBlocklistItems("my-blocklist");

for (TextBlocklistItem item : items) {
    System.out.printf("ID: %s, Text: %s%n",
        item.getBlocklistItemId(),
        item.getText());
}
```

### Remove Block Items

```java
List<String> itemIds = Arrays.asList("item-id-1", "item-id-2");

blocklistClient.removeBlocklistItems(
    "my-blocklist",
    new RemoveTextBlocklistItemsOptions(itemIds));
```

### Delete Blocklist

```java
blocklistClient.deleteTextBlocklist("my-blocklist");
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    contentSafetyClient.analyzeText(new AnalyzeTextOptions("test"));
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
    // Common codes: InvalidRequestBody, ResourceNotFound, TooManyRequests
}
```

## Environment Variables

```bash
CONTENT_SAFETY_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
CONTENT_SAFETY_KEY=<your-api-key>
```

## Best Practices

1. **Blocklist Delay**: Changes take ~5 minutes to take effect
2. **Category Selection**: Only request needed categories to reduce latency
3. **Severity Thresholds**: Typically block severity >= 4 for strict moderation
4. **Batch Processing**: Process multiple items in parallel for throughput
5. **Caching**: Cache blocklist results where appropriate
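
The severity-threshold practice above can be sketched as a small policy helper. This is an illustrative standalone class, not part of the SDK; the default threshold of 4 follows the guidance above, and the class and method names are hypothetical.

```java
// Hypothetical policy helper for best practice #3: block content when the
// highest category severity reported by analysis meets a threshold.
public class ModerationPolicy {
    // Threshold of 4 follows the "strict moderation" guidance above.
    public static final int DEFAULT_THRESHOLD = 4;

    // Block when the maximum severity across categories reaches the threshold.
    public static boolean shouldBlock(int maxSeverity, int threshold) {
        return maxSeverity >= threshold;
    }

    public static boolean shouldBlock(int maxSeverity) {
        return shouldBlock(maxSeverity, DEFAULT_THRESHOLD);
    }
}
```

In practice you would feed `maxSeverity` from the analysis result's category severities and tune the threshold per category.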

## Trigger Phrases

- "content safety Java"
- "content moderation Azure"
- "analyze text safety"
- "image moderation Java"
- "blocklist management"
- "hate speech detection"
- "harmful content filter"
341
skills/official/microsoft/java/foundry/formrecognizer/SKILL.md
Normal file
@@ -0,0 +1,341 @@
---
name: azure-ai-formrecognizer-java
description: Build document analysis applications with Azure Document Intelligence (Form Recognizer) SDK for Java. Use when extracting text, tables, key-value pairs from documents, receipts, invoices, or building custom document models.
package: com.azure:azure-ai-formrecognizer
---

# Azure Document Intelligence (Form Recognizer) SDK for Java

Build document analysis applications using the Azure AI Document Intelligence SDK for Java.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-ai-formrecognizer</artifactId>
  <version>4.2.0-beta.1</version>
</dependency>
```

## Client Creation

### DocumentAnalysisClient

```java
import com.azure.ai.formrecognizer.documentanalysis.DocumentAnalysisClient;
import com.azure.ai.formrecognizer.documentanalysis.DocumentAnalysisClientBuilder;
import com.azure.core.credential.AzureKeyCredential;

DocumentAnalysisClient client = new DocumentAnalysisClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    .buildClient();
```

### DocumentModelAdministrationClient

```java
import com.azure.ai.formrecognizer.documentanalysis.administration.DocumentModelAdministrationClient;
import com.azure.ai.formrecognizer.documentanalysis.administration.DocumentModelAdministrationClientBuilder;

DocumentModelAdministrationClient adminClient = new DocumentModelAdministrationClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    .buildClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

DocumentAnalysisClient client = new DocumentAnalysisClientBuilder()
    .endpoint("{endpoint}")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Prebuilt Models

| Model ID | Purpose |
|----------|---------|
| `prebuilt-layout` | Extract text, tables, selection marks |
| `prebuilt-document` | General document with key-value pairs |
| `prebuilt-receipt` | Receipt data extraction |
| `prebuilt-invoice` | Invoice field extraction |
| `prebuilt-businessCard` | Business card parsing |
| `prebuilt-idDocument` | ID document (passport, license) |
| `prebuilt-tax.us.w2` | US W2 tax forms |

## Core Patterns

### Extract Layout

```java
import com.azure.ai.formrecognizer.documentanalysis.models.*;
import com.azure.core.util.BinaryData;
import com.azure.core.util.polling.SyncPoller;
import java.io.File;

File document = new File("document.pdf");
BinaryData documentData = BinaryData.fromFile(document.toPath());

SyncPoller<OperationResult, AnalyzeResult> poller =
    client.beginAnalyzeDocument("prebuilt-layout", documentData);

AnalyzeResult result = poller.getFinalResult();

// Process pages
for (DocumentPage page : result.getPages()) {
    System.out.printf("Page %d: %.2f x %.2f %s%n",
        page.getPageNumber(),
        page.getWidth(),
        page.getHeight(),
        page.getUnit());

    // Lines
    for (DocumentLine line : page.getLines()) {
        System.out.println("Line: " + line.getContent());
    }

    // Selection marks (checkboxes)
    for (DocumentSelectionMark mark : page.getSelectionMarks()) {
        System.out.printf("Checkbox: %s (confidence: %.2f)%n",
            mark.getSelectionMarkState(),
            mark.getConfidence());
    }
}

// Tables
for (DocumentTable table : result.getTables()) {
    System.out.printf("Table: %d rows x %d columns%n",
        table.getRowCount(),
        table.getColumnCount());

    for (DocumentTableCell cell : table.getCells()) {
        System.out.printf("Cell[%d,%d]: %s%n",
            cell.getRowIndex(),
            cell.getColumnIndex(),
            cell.getContent());
    }
}
```

### Analyze from URL

```java
String documentUrl = "https://example.com/invoice.pdf";

SyncPoller<OperationResult, AnalyzeResult> poller =
    client.beginAnalyzeDocumentFromUrl("prebuilt-invoice", documentUrl);

AnalyzeResult result = poller.getFinalResult();
```

### Analyze Receipt

```java
import java.util.Map;

SyncPoller<OperationResult, AnalyzeResult> poller =
    client.beginAnalyzeDocumentFromUrl("prebuilt-receipt", receiptUrl);

AnalyzeResult result = poller.getFinalResult();

for (AnalyzedDocument doc : result.getDocuments()) {
    Map<String, DocumentField> fields = doc.getFields();

    DocumentField merchantName = fields.get("MerchantName");
    if (merchantName != null && merchantName.getType() == DocumentFieldType.STRING) {
        System.out.printf("Merchant: %s (confidence: %.2f)%n",
            merchantName.getValueAsString(),
            merchantName.getConfidence());
    }

    DocumentField transactionDate = fields.get("TransactionDate");
    if (transactionDate != null && transactionDate.getType() == DocumentFieldType.DATE) {
        System.out.printf("Date: %s%n", transactionDate.getValueAsDate());
    }

    DocumentField items = fields.get("Items");
    if (items != null && items.getType() == DocumentFieldType.LIST) {
        for (DocumentField item : items.getValueAsList()) {
            Map<String, DocumentField> itemFields = item.getValueAsMap();
            System.out.printf("Item: %s, Price: %.2f%n",
                itemFields.get("Name").getValueAsString(),
                itemFields.get("Price").getValueAsDouble());
        }
    }
}
```

### General Document Analysis

```java
SyncPoller<OperationResult, AnalyzeResult> poller =
    client.beginAnalyzeDocumentFromUrl("prebuilt-document", documentUrl);

AnalyzeResult result = poller.getFinalResult();

// Key-value pairs
for (DocumentKeyValuePair kvp : result.getKeyValuePairs()) {
    System.out.printf("Key: %s => Value: %s%n",
        kvp.getKey().getContent(),
        kvp.getValue() != null ? kvp.getValue().getContent() : "null");
}
```

## Custom Models

### Build Custom Model

```java
import com.azure.ai.formrecognizer.documentanalysis.administration.models.*;
import com.azure.core.util.Context;

String blobContainerUrl = "{SAS_URL_of_training_data}";
String prefix = "training-docs/";

SyncPoller<OperationResult, DocumentModelDetails> poller = adminClient.beginBuildDocumentModel(
    blobContainerUrl,
    DocumentModelBuildMode.TEMPLATE,
    prefix,
    new BuildDocumentModelOptions()
        .setModelId("my-custom-model")
        .setDescription("Custom invoice model"),
    Context.NONE);

DocumentModelDetails model = poller.getFinalResult();

System.out.println("Model ID: " + model.getModelId());
System.out.println("Created: " + model.getCreatedOn());

model.getDocumentTypes().forEach((docType, details) -> {
    System.out.println("Document type: " + docType);
    details.getFieldSchema().forEach((field, schema) -> {
        System.out.printf("  Field: %s (%s)%n", field, schema.getType());
    });
});
```

### Analyze with Custom Model

```java
SyncPoller<OperationResult, AnalyzeResult> poller =
    client.beginAnalyzeDocumentFromUrl("my-custom-model", documentUrl);

AnalyzeResult result = poller.getFinalResult();

for (AnalyzedDocument doc : result.getDocuments()) {
    System.out.printf("Document type: %s (confidence: %.2f)%n",
        doc.getDocType(),
        doc.getConfidence());

    doc.getFields().forEach((name, field) -> {
        System.out.printf("Field '%s': %s (confidence: %.2f)%n",
            name,
            field.getContent(),
            field.getConfidence());
    });
}
```

### Compose Models

```java
import java.util.Arrays;
import java.util.List;

List<String> modelIds = Arrays.asList("model-1", "model-2", "model-3");

SyncPoller<OperationResult, DocumentModelDetails> poller =
    adminClient.beginComposeDocumentModel(
        modelIds,
        new ComposeDocumentModelOptions()
            .setModelId("composed-model")
            .setDescription("Composed from multiple models"));

DocumentModelDetails composedModel = poller.getFinalResult();
```

### Manage Models

```java
// List models
PagedIterable<DocumentModelSummary> models = adminClient.listDocumentModels();
for (DocumentModelSummary summary : models) {
    System.out.printf("Model: %s, Created: %s%n",
        summary.getModelId(),
        summary.getCreatedOn());
}

// Get model details
DocumentModelDetails model = adminClient.getDocumentModel("model-id");

// Delete model
adminClient.deleteDocumentModel("model-id");

// Check resource limits
ResourceDetails resources = adminClient.getResourceDetails();
System.out.printf("Models: %d / %d%n",
    resources.getCustomDocumentModelCount(),
    resources.getCustomDocumentModelLimit());
```

## Document Classification

### Build Classifier

```java
import java.util.HashMap;
import java.util.Map;

Map<String, ClassifierDocumentTypeDetails> docTypes = new HashMap<>();
docTypes.put("invoice", new ClassifierDocumentTypeDetails()
    .setAzureBlobSource(new AzureBlobContentSource(containerUrl).setPrefix("invoices/")));
docTypes.put("receipt", new ClassifierDocumentTypeDetails()
    .setAzureBlobSource(new AzureBlobContentSource(containerUrl).setPrefix("receipts/")));

SyncPoller<OperationResult, DocumentClassifierDetails> poller =
    adminClient.beginBuildDocumentClassifier(docTypes,
        new BuildDocumentClassifierOptions().setClassifierId("my-classifier"));

DocumentClassifierDetails classifier = poller.getFinalResult();
```

### Classify Document

```java
SyncPoller<OperationResult, AnalyzeResult> poller =
    client.beginClassifyDocumentFromUrl("my-classifier", documentUrl, Context.NONE);

AnalyzeResult result = poller.getFinalResult();

for (AnalyzedDocument doc : result.getDocuments()) {
    System.out.printf("Classified as: %s (confidence: %.2f)%n",
        doc.getDocType(),
        doc.getConfidence());
}
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.beginAnalyzeDocumentFromUrl("prebuilt-receipt", "invalid-url");
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
FORM_RECOGNIZER_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
FORM_RECOGNIZER_KEY=<your-api-key>
```

## Trigger Phrases

- "document intelligence Java"
- "form recognizer SDK"
- "extract text from PDF"
- "OCR document Java"
- "analyze invoice receipt"
- "custom document model"
- "document classification"
152
skills/official/microsoft/java/foundry/projects/SKILL.md
Normal file
@@ -0,0 +1,152 @@
---
name: azure-ai-projects-java
description: |
  Azure AI Projects SDK for Java. High-level SDK for Azure AI Foundry project management including connections, datasets, indexes, and evaluations.
  Triggers: "AIProjectClient java", "azure ai projects java", "Foundry project java", "ConnectionsClient", "DatasetsClient", "IndexesClient".
package: com.azure:azure-ai-projects
---

# Azure AI Projects SDK for Java

High-level SDK for Azure AI Foundry project management with access to connections, datasets, indexes, and evaluations.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-ai-projects</artifactId>
  <version>1.0.0-beta.1</version>
</dependency>
```

## Environment Variables

```bash
PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
```

## Authentication

```java
import com.azure.ai.projects.AIProjectClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

AIProjectClientBuilder builder = new AIProjectClientBuilder()
    .endpoint(System.getenv("PROJECT_ENDPOINT"))
    .credential(new DefaultAzureCredentialBuilder().build());
```

## Client Hierarchy

The SDK provides multiple sub-clients for different operations:

| Client | Purpose |
|--------|---------|
| `ConnectionsClient` | Enumerate connected Azure resources |
| `DatasetsClient` | Upload documents and manage datasets |
| `DeploymentsClient` | Enumerate AI model deployments |
| `IndexesClient` | Create and manage search indexes |
| `EvaluationsClient` | Run AI model evaluations |
| `EvaluatorsClient` | Manage evaluator configurations |
| `SchedulesClient` | Manage scheduled operations |

```java
// Build sub-clients from builder
ConnectionsClient connectionsClient = builder.buildConnectionsClient();
DatasetsClient datasetsClient = builder.buildDatasetsClient();
DeploymentsClient deploymentsClient = builder.buildDeploymentsClient();
IndexesClient indexesClient = builder.buildIndexesClient();
EvaluationsClient evaluationsClient = builder.buildEvaluationsClient();
```

## Core Operations

### List Connections

```java
import com.azure.ai.projects.models.Connection;
import com.azure.core.http.rest.PagedIterable;

PagedIterable<Connection> connections = connectionsClient.listConnections();
for (Connection connection : connections) {
    System.out.println("Name: " + connection.getName());
    System.out.println("Type: " + connection.getType());
    System.out.println("Credential Type: " + connection.getCredentials().getType());
}
```

### List Indexes

```java
indexesClient.listLatest().forEach(index -> {
    System.out.println("Index name: " + index.getName());
    System.out.println("Version: " + index.getVersion());
    System.out.println("Description: " + index.getDescription());
});
```

### Create or Update Index

```java
import com.azure.ai.projects.models.AzureAISearchIndex;
import com.azure.ai.projects.models.Index;

String indexName = "my-index";
String indexVersion = "1.0";
String searchConnectionName = System.getenv("AI_SEARCH_CONNECTION_NAME");
String searchIndexName = System.getenv("AI_SEARCH_INDEX_NAME");

Index index = indexesClient.createOrUpdate(
    indexName,
    indexVersion,
    new AzureAISearchIndex()
        .setConnectionName(searchConnectionName)
        .setIndexName(searchIndexName)
);

System.out.println("Created index: " + index.getName());
```

### Access OpenAI Evaluations

The SDK exposes OpenAI's official SDK for evaluations:

```java
import com.openai.services.EvalService;

EvalService evalService = evaluationsClient.getOpenAIClient();
// Use OpenAI evaluation APIs directly
```

## Best Practices

1. **Use DefaultAzureCredential** for production authentication
2. **Reuse the client builder** to create multiple sub-clients efficiently
3. **Handle pagination** when listing resources with `PagedIterable`
4. **Use environment variables** for connection names and configuration
5. **Check connection types** before accessing credentials

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;
import com.azure.core.exception.ResourceNotFoundException;

try {
    Index index = indexesClient.get(indexName, version);
} catch (ResourceNotFoundException e) {
    System.err.println("Index not found: " + indexName);
} catch (HttpResponseException e) {
    System.err.println("Error: " + e.getResponse().getStatusCode());
}
```

## Reference Links

| Resource | URL |
|----------|-----|
| Product Docs | https://learn.microsoft.com/azure/ai-studio/ |
| API Reference | https://learn.microsoft.com/rest/api/aifoundry/aiprojects/ |
| GitHub Source | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-projects |
| Samples | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-projects/src/samples |
@@ -0,0 +1,289 @@
---
name: azure-ai-vision-imageanalysis-java
description: Build image analysis applications with Azure AI Vision SDK for Java. Use when implementing image captioning, OCR text extraction, object detection, tagging, or smart cropping.
package: com.azure:azure-ai-vision-imageanalysis
---

# Azure AI Vision Image Analysis SDK for Java

Build image analysis applications using the Azure AI Vision Image Analysis SDK for Java.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-ai-vision-imageanalysis</artifactId>
  <version>1.1.0-beta.1</version>
</dependency>
```

## Client Creation

### With API Key

```java
import com.azure.ai.vision.imageanalysis.ImageAnalysisClient;
import com.azure.ai.vision.imageanalysis.ImageAnalysisClientBuilder;
import com.azure.core.credential.KeyCredential;

String endpoint = System.getenv("VISION_ENDPOINT");
String key = System.getenv("VISION_KEY");

ImageAnalysisClient client = new ImageAnalysisClientBuilder()
    .endpoint(endpoint)
    .credential(new KeyCredential(key))
    .buildClient();
```

### Async Client

```java
import com.azure.ai.vision.imageanalysis.ImageAnalysisAsyncClient;

ImageAnalysisAsyncClient asyncClient = new ImageAnalysisClientBuilder()
    .endpoint(endpoint)
    .credential(new KeyCredential(key))
    .buildAsyncClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

ImageAnalysisClient client = new ImageAnalysisClientBuilder()
    .endpoint(endpoint)
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Visual Features

| Feature | Description |
|---------|-------------|
| `CAPTION` | Generate human-readable image description |
| `DENSE_CAPTIONS` | Captions for up to 10 regions |
| `READ` | OCR - Extract text from images |
| `TAGS` | Content tags for objects, scenes, actions |
| `OBJECTS` | Detect objects with bounding boxes |
| `SMART_CROPS` | Smart thumbnail regions |
| `PEOPLE` | Detect people with locations |

## Core Patterns

### Generate Caption

```java
import com.azure.ai.vision.imageanalysis.models.*;
import com.azure.core.util.BinaryData;
import java.io.File;
import java.util.Arrays;

// From file
BinaryData imageData = BinaryData.fromFile(new File("image.jpg").toPath());

ImageAnalysisResult result = client.analyze(
    imageData,
    Arrays.asList(VisualFeatures.CAPTION),
    new ImageAnalysisOptions().setGenderNeutralCaption(true));

System.out.printf("Caption: \"%s\" (confidence: %.4f)%n",
    result.getCaption().getText(),
    result.getCaption().getConfidence());
```

### Generate Caption from URL

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    "https://example.com/image.jpg",
    Arrays.asList(VisualFeatures.CAPTION),
    new ImageAnalysisOptions().setGenderNeutralCaption(true));

System.out.printf("Caption: \"%s\"%n", result.getCaption().getText());
```

### Extract Text (OCR)

```java
ImageAnalysisResult result = client.analyze(
    BinaryData.fromFile(new File("document.jpg").toPath()),
    Arrays.asList(VisualFeatures.READ),
    null);

for (DetectedTextBlock block : result.getRead().getBlocks()) {
    for (DetectedTextLine line : block.getLines()) {
        System.out.printf("Line: '%s'%n", line.getText());
        System.out.printf("  Bounding polygon: %s%n", line.getBoundingPolygon());

        for (DetectedTextWord word : line.getWords()) {
            System.out.printf("  Word: '%s' (confidence: %.4f)%n",
                word.getText(),
                word.getConfidence());
        }
    }
}
```

### Detect Objects

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.OBJECTS),
    null);

for (DetectedObject obj : result.getObjects()) {
    System.out.printf("Object: %s (confidence: %.4f)%n",
        obj.getTags().get(0).getName(),
        obj.getTags().get(0).getConfidence());

    ImageBoundingBox box = obj.getBoundingBox();
    System.out.printf("  Location: x=%d, y=%d, w=%d, h=%d%n",
        box.getX(), box.getY(), box.getWidth(), box.getHeight());
}
```

### Get Tags

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.TAGS),
    null);

for (DetectedTag tag : result.getTags()) {
    System.out.printf("Tag: %s (confidence: %.4f)%n",
        tag.getName(),
        tag.getConfidence());
}
```

### Detect People

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.PEOPLE),
    null);

for (DetectedPerson person : result.getPeople()) {
    ImageBoundingBox box = person.getBoundingBox();
    System.out.printf("Person at x=%d, y=%d (confidence: %.4f)%n",
        box.getX(), box.getY(), person.getConfidence());
}
```

### Smart Cropping

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.SMART_CROPS),
    new ImageAnalysisOptions().setSmartCropsAspectRatios(Arrays.asList(1.0, 1.5)));

for (CropRegion crop : result.getSmartCrops()) {
    System.out.printf("Crop region: aspect=%.2f, x=%d, y=%d, w=%d, h=%d%n",
        crop.getAspectRatio(),
        crop.getBoundingBox().getX(),
        crop.getBoundingBox().getY(),
        crop.getBoundingBox().getWidth(),
        crop.getBoundingBox().getHeight());
}
```

### Dense Captions

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.DENSE_CAPTIONS),
    new ImageAnalysisOptions().setGenderNeutralCaption(true));

for (DenseCaption caption : result.getDenseCaptions()) {
    System.out.printf("Caption: \"%s\" (confidence: %.4f)%n",
        caption.getText(),
        caption.getConfidence());
    System.out.printf("  Region: x=%d, y=%d, w=%d, h=%d%n",
        caption.getBoundingBox().getX(),
        caption.getBoundingBox().getY(),
        caption.getBoundingBox().getWidth(),
        caption.getBoundingBox().getHeight());
}
```

### Multiple Features

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(
        VisualFeatures.CAPTION,
        VisualFeatures.TAGS,
        VisualFeatures.OBJECTS,
        VisualFeatures.READ),
    new ImageAnalysisOptions()
        .setGenderNeutralCaption(true)
        .setLanguage("en"));

// Access all results
System.out.println("Caption: " + result.getCaption().getText());
System.out.println("Tags: " + result.getTags().size());
System.out.println("Objects: " + result.getObjects().size());
System.out.println("Text blocks: " + result.getRead().getBlocks().size());
```

### Async Analysis

```java
asyncClient.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.CAPTION),
    null)
    .subscribe(
        result -> System.out.println("Caption: " + result.getCaption().getText()),
        error -> System.err.println("Error: " + error.getMessage()),
        () -> System.out.println("Complete")
    );
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.analyzeFromUrl(imageUrl, Arrays.asList(VisualFeatures.CAPTION), null);
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
VISION_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
VISION_KEY=<your-api-key>
```

## Image Requirements

- Formats: JPEG, PNG, GIF, BMP, WEBP, ICO, TIFF, MPO
- Size: < 20 MB
- Dimensions: 50x50 to 16000x16000 pixels
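
The size and dimension limits above can be checked before calling the service. The numbers below mirror the documented requirements; the class and method names are illustrative helpers, not part of the SDK (the format whitelist is omitted for brevity).

```java
// Hypothetical pre-flight validator for the documented image requirements:
// payload under 20 MB, each dimension between 50 and 16000 pixels.
public class ImageRequirements {
    static final long MAX_BYTES = 20L * 1024 * 1024; // < 20 MB
    static final int MIN_DIM = 50;                   // pixels
    static final int MAX_DIM = 16000;                // pixels

    public static boolean isAnalyzable(long sizeBytes, int width, int height) {
        return sizeBytes < MAX_BYTES
            && width >= MIN_DIM && width <= MAX_DIM
            && height >= MIN_DIM && height <= MAX_DIM;
    }
}
```

Checking locally avoids a round-trip that would fail with an `InvalidRequest`-style error for oversized or undersized images.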

## Regional Availability

Caption and Dense Captions require GPU-supported regions. Check [supported regions](https://learn.microsoft.com/azure/ai-services/computer-vision/concept-describe-images-40) before deployment.

## Trigger Phrases

- "image analysis Java"
- "Azure Vision SDK"
- "image captioning"
- "OCR image text extraction"
- "object detection image"
- "smart crop thumbnail"
- "detect people image"
225
skills/official/microsoft/java/foundry/voicelive/SKILL.md
Normal file
@@ -0,0 +1,225 @@
---
name: azure-ai-voicelive-java
description: |
  Azure AI VoiceLive SDK for Java. Real-time bidirectional voice conversations with AI assistants using WebSocket.
  Triggers: "VoiceLiveClient java", "voice assistant java", "real-time voice java", "audio streaming java", "voice activity detection java".
package: com.azure:azure-ai-voicelive
---

# Azure AI VoiceLive SDK for Java

Real-time, bidirectional voice conversations with AI assistants using WebSocket technology.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-ai-voicelive</artifactId>
  <version>1.0.0-beta.2</version>
</dependency>
```

## Environment Variables

```bash
AZURE_VOICELIVE_ENDPOINT=https://<resource>.openai.azure.com/
AZURE_VOICELIVE_API_KEY=<your-api-key>
```

## Authentication

### API Key

```java
import com.azure.ai.voicelive.VoiceLiveAsyncClient;
import com.azure.ai.voicelive.VoiceLiveClientBuilder;
import com.azure.core.credential.AzureKeyCredential;

VoiceLiveAsyncClient client = new VoiceLiveClientBuilder()
    .endpoint(System.getenv("AZURE_VOICELIVE_ENDPOINT"))
    .credential(new AzureKeyCredential(System.getenv("AZURE_VOICELIVE_API_KEY")))
    .buildAsyncClient();
```

### DefaultAzureCredential (Recommended)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

VoiceLiveAsyncClient client = new VoiceLiveClientBuilder()
    .endpoint(System.getenv("AZURE_VOICELIVE_ENDPOINT"))
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| `VoiceLiveAsyncClient` | Main entry point for voice sessions |
| `VoiceLiveSessionAsyncClient` | Active WebSocket connection for streaming |
| `VoiceLiveSessionOptions` | Configuration for session behavior |

### Audio Requirements

- **Sample Rate**: 24kHz (24000 Hz)
- **Bit Depth**: 16-bit PCM
- **Channels**: Mono (1 channel)
- **Format**: Signed PCM, little-endian
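
For capture with `javax.sound.sampled`, the requirements above map onto an `AudioFormat` as in this sketch. Only the format parameters (24 kHz, 16-bit signed PCM, mono, little-endian) come from the requirements; the class name is illustrative and how you capture and chunk audio is up to your application.

```java
import javax.sound.sampled.AudioFormat;

// Illustrative helper: an AudioFormat matching the documented requirements.
public class VoiceLiveAudio {
    public static AudioFormat requiredFormat() {
        return new AudioFormat(
            24000f, // sample rate in Hz
            16,     // bits per sample
            1,      // channels (mono)
            true,   // signed PCM
            false   // little-endian
        );
    }

    // 24000 samples/s * 2 bytes/sample * 1 channel = 48000 bytes per second.
    public static int bytesPerSecond(AudioFormat f) {
        return (int) f.getSampleRate() * (f.getSampleSizeInBits() / 8) * f.getChannels();
    }
}
```

Knowing the byte rate (48000 bytes/s) helps size the chunks you stream to the session.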

## Core Workflow

### 1. Start Session

```java
import reactor.core.publisher.Mono;

client.startSession("gpt-4o-realtime-preview")
    .flatMap(session -> {
        System.out.println("Session started");

        // Subscribe to events
        session.receiveEvents()
            .subscribe(
                event -> System.out.println("Event: " + event.getType()),
                error -> System.err.println("Error: " + error.getMessage())
            );

        return Mono.just(session);
    })
    .block();
```

### 2. Configure Session Options

```java
import com.azure.ai.voicelive.models.*;
import com.azure.core.util.BinaryData;
import java.util.Arrays;

ServerVadTurnDetection turnDetection = new ServerVadTurnDetection()
    .setThreshold(0.5)           // Sensitivity (0.0-1.0)
    .setPrefixPaddingMs(300)     // Audio before speech
    .setSilenceDurationMs(500)   // Silence to end turn
    .setInterruptResponse(true)  // Allow interruptions
    .setAutoTruncate(true)
    .setCreateResponse(true);

AudioInputTranscriptionOptions transcription = new AudioInputTranscriptionOptions(
    AudioInputTranscriptionOptionsModel.WHISPER_1);

VoiceLiveSessionOptions options = new VoiceLiveSessionOptions()
    .setInstructions("You are a helpful AI voice assistant.")
    .setVoice(BinaryData.fromObject(new OpenAIVoice(OpenAIVoiceName.ALLOY)))
    .setModalities(Arrays.asList(InteractionModality.TEXT, InteractionModality.AUDIO))
    .setInputAudioFormat(InputAudioFormat.PCM16)
    .setOutputAudioFormat(OutputAudioFormat.PCM16)
    .setInputAudioSamplingRate(24000)
    .setInputAudioNoiseReduction(new AudioNoiseReduction(AudioNoiseReductionType.NEAR_FIELD))
    .setInputAudioEchoCancellation(new AudioEchoCancellation())
    .setInputAudioTranscription(transcription)
    .setTurnDetection(turnDetection);

// Send configuration
ClientEventSessionUpdate updateEvent = new ClientEventSessionUpdate(options);
session.sendEvent(updateEvent).subscribe();
```

### 3. Send Audio Input

```java
byte[] audioData = readAudioChunk(); // Your PCM16 audio data
session.sendInputAudio(BinaryData.fromBytes(audioData)).subscribe();
```

### 4. Handle Events

```java
session.receiveEvents().subscribe(event -> {
    ServerEventType eventType = event.getType();

    if (ServerEventType.SESSION_CREATED.equals(eventType)) {
        System.out.println("Session created");
    } else if (ServerEventType.INPUT_AUDIO_BUFFER_SPEECH_STARTED.equals(eventType)) {
        System.out.println("User started speaking");
    } else if (ServerEventType.INPUT_AUDIO_BUFFER_SPEECH_STOPPED.equals(eventType)) {
        System.out.println("User stopped speaking");
    } else if (ServerEventType.RESPONSE_AUDIO_DELTA.equals(eventType)) {
        if (event instanceof SessionUpdateResponseAudioDelta) {
            SessionUpdateResponseAudioDelta audioEvent = (SessionUpdateResponseAudioDelta) event;
            playAudioChunk(audioEvent.getDelta());
        }
    } else if (ServerEventType.RESPONSE_DONE.equals(eventType)) {
        System.out.println("Response complete");
    } else if (ServerEventType.ERROR.equals(eventType)) {
        if (event instanceof SessionUpdateError) {
            SessionUpdateError errorEvent = (SessionUpdateError) event;
            System.err.println("Error: " + errorEvent.getError().getMessage());
        }
    }
});
```

## Voice Configuration

### OpenAI Voices

```java
// Available: ALLOY, ASH, BALLAD, CORAL, ECHO, SAGE, SHIMMER, VERSE
VoiceLiveSessionOptions options = new VoiceLiveSessionOptions()
    .setVoice(BinaryData.fromObject(new OpenAIVoice(OpenAIVoiceName.ALLOY)));
```

### Azure Voices

```java
// Azure Standard Voice
options.setVoice(BinaryData.fromObject(new AzureStandardVoice("en-US-JennyNeural")));

// Azure Custom Voice
options.setVoice(BinaryData.fromObject(new AzureCustomVoice("myVoice", "endpointId")));

// Azure Personal Voice
options.setVoice(BinaryData.fromObject(
    new AzurePersonalVoice("speakerProfileId", PersonalVoiceModels.PHOENIX_LATEST_NEURAL)));
```

## Function Calling

```java
// parametersSchema: a JSON Schema object describing the function's parameters
VoiceLiveFunctionDefinition weatherFunction = new VoiceLiveFunctionDefinition("get_weather")
    .setDescription("Get current weather for a location")
    .setParameters(BinaryData.fromObject(parametersSchema));

VoiceLiveSessionOptions options = new VoiceLiveSessionOptions()
    .setTools(Arrays.asList(weatherFunction))
    .setInstructions("You have access to weather information.");
```
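
The `parametersSchema` object is left undefined in the snippet above. One way to build it is a JSON-Schema-shaped map from plain `java.util` types, which `BinaryData.fromObject(...)` can serialize; this is a hedged sketch, not an SDK type:

```java
import java.util.List;
import java.util.Map;

// Hypothetical JSON Schema for get_weather's parameters, built from stdlib maps.
Map<String, Object> parametersSchema = Map.of(
    "type", "object",
    "properties", Map.of(
        "location", Map.of(
            "type", "string",
            "description", "City name, e.g. Seattle")),
    "required", List.of("location"));
```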

## Best Practices

1. **Use async client** — VoiceLive requires reactive patterns
2. **Configure turn detection** for natural conversation flow
3. **Enable noise reduction** for better speech recognition
4. **Handle interruptions** gracefully with `setInterruptResponse(true)`
5. **Use Whisper transcription** for input audio transcription
6. **Close sessions** properly when conversation ends

## Error Handling

```java
import reactor.core.publisher.Flux;

session.receiveEvents()
    .doOnError(error -> System.err.println("Connection error: " + error.getMessage()))
    .onErrorResume(error -> {
        // Attempt reconnection or cleanup
        return Flux.empty();
    })
    .subscribe();
```

## Reference Links

| Resource | URL |
|----------|-----|
| GitHub Source | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-voicelive |
| Samples | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-voicelive/src/samples |
---
name: azure-appconfiguration-java
description: |
  Azure App Configuration SDK for Java. Centralized application configuration management with key-value settings, feature flags, and snapshots.
  Triggers: "ConfigurationClient java", "app configuration java", "feature flag java", "configuration setting java", "azure config java".
package: com.azure:azure-data-appconfiguration
---

# Azure App Configuration SDK for Java

Client library for Azure App Configuration, a managed service for centralizing application configurations.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-data-appconfiguration</artifactId>
    <version>1.8.0</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.azure</groupId>
            <artifactId>azure-sdk-bom</artifactId>
            <version>{bom_version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.azure</groupId>
        <artifactId>azure-data-appconfiguration</artifactId>
    </dependency>
</dependencies>
```

## Prerequisites

- Azure App Configuration store
- Connection string or Entra ID credentials

## Environment Variables

```bash
AZURE_APPCONFIG_CONNECTION_STRING=Endpoint=https://<store>.azconfig.io;Id=<id>;Secret=<secret>
AZURE_APPCONFIG_ENDPOINT=https://<store>.azconfig.io
```

## Client Creation

### With Connection String

```java
import com.azure.data.appconfiguration.ConfigurationClient;
import com.azure.data.appconfiguration.ConfigurationClientBuilder;

ConfigurationClient configClient = new ConfigurationClientBuilder()
    .connectionString(System.getenv("AZURE_APPCONFIG_CONNECTION_STRING"))
    .buildClient();
```

### Async Client

```java
import com.azure.data.appconfiguration.ConfigurationAsyncClient;

ConfigurationAsyncClient asyncClient = new ConfigurationClientBuilder()
    .connectionString(connectionString)
    .buildAsyncClient();
```

### With Entra ID (Recommended)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

ConfigurationClient configClient = new ConfigurationClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(System.getenv("AZURE_APPCONFIG_ENDPOINT"))
    .buildClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Configuration Setting | Key-value pair with optional label |
| Label | Dimension for separating settings (e.g., environments) |
| Feature Flag | Special setting for feature management |
| Secret Reference | Setting pointing to Key Vault secret |
| Snapshot | Point-in-time immutable view of settings |
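
A setting's identity is the (key, label) pair, so the same key can hold a different value per label. A minimal stdlib sketch of that model (hypothetical in-memory map, not the SDK's storage):

```java
import java.util.HashMap;
import java.util.Map;

// Outer key: setting key; inner key: label. Same key, different labels ->
// two independent settings, as in Dev vs. Production configuration.
Map<String, Map<String, String>> byKey = new HashMap<>();
byKey.computeIfAbsent("app/cache/enabled", k -> new HashMap<>()).put("Development", "true");
byKey.computeIfAbsent("app/cache/enabled", k -> new HashMap<>()).put("Production", "false");
```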

## Configuration Setting Operations

### Create Setting (Add)

Creates only if the setting doesn't exist:

```java
import com.azure.data.appconfiguration.models.ConfigurationSetting;

ConfigurationSetting setting = configClient.addConfigurationSetting(
    "app/database/connection",
    "Production",
    "Server=prod.db.com;Database=myapp"
);
```

### Create or Update Setting (Set)

Creates or overwrites:

```java
ConfigurationSetting setting = configClient.setConfigurationSetting(
    "app/cache/enabled",
    "Production",
    "true"
);
```

### Get Setting

```java
ConfigurationSetting setting = configClient.getConfigurationSetting(
    "app/database/connection",
    "Production"
);
System.out.println("Value: " + setting.getValue());
System.out.println("Content-Type: " + setting.getContentType());
System.out.println("Last Modified: " + setting.getLastModified());
```

### Conditional Get (If Changed)

```java
import com.azure.core.http.rest.Response;
import com.azure.core.util.Context;

Response<ConfigurationSetting> response = configClient.getConfigurationSettingWithResponse(
    setting,      // Setting with ETag
    null,         // Accept datetime
    true,         // ifChanged - only fetch if modified
    Context.NONE
);

if (response.getStatusCode() == 304) {
    System.out.println("Setting not modified");
} else {
    ConfigurationSetting updated = response.getValue();
}
```

### Update Setting

```java
ConfigurationSetting updated = configClient.setConfigurationSetting(
    "app/cache/enabled",
    "Production",
    "false"
);
```

### Conditional Update (If Unchanged)

```java
// Only update if ETag matches (no concurrent modifications)
Response<ConfigurationSetting> response = configClient.setConfigurationSettingWithResponse(
    setting,      // Setting with current ETag
    true,         // ifUnchanged
    Context.NONE
);
```

### Delete Setting

```java
ConfigurationSetting deleted = configClient.deleteConfigurationSetting(
    "app/cache/enabled",
    "Production"
);
```

### Conditional Delete

```java
Response<ConfigurationSetting> response = configClient.deleteConfigurationSettingWithResponse(
    setting,      // Setting with ETag
    true,         // ifUnchanged
    Context.NONE
);
```

## List and Filter Settings

### List by Key Pattern

```java
import com.azure.data.appconfiguration.models.SettingSelector;
import com.azure.core.http.rest.PagedIterable;

SettingSelector selector = new SettingSelector()
    .setKeyFilter("app/*");

PagedIterable<ConfigurationSetting> settings = configClient.listConfigurationSettings(selector);
for (ConfigurationSetting s : settings) {
    System.out.println(s.getKey() + " = " + s.getValue());
}
```

### List by Label

```java
SettingSelector selector = new SettingSelector()
    .setKeyFilter("*")
    .setLabelFilter("Production");

PagedIterable<ConfigurationSetting> settings = configClient.listConfigurationSettings(selector);
```

### List by Multiple Keys

```java
SettingSelector selector = new SettingSelector()
    .setKeyFilter("app/database/*,app/cache/*");

PagedIterable<ConfigurationSetting> settings = configClient.listConfigurationSettings(selector);
```

### List Revisions

```java
SettingSelector selector = new SettingSelector()
    .setKeyFilter("app/database/connection");

PagedIterable<ConfigurationSetting> revisions = configClient.listRevisions(selector);
for (ConfigurationSetting revision : revisions) {
    System.out.println("Value: " + revision.getValue() + ", Modified: " + revision.getLastModified());
}
```

## Feature Flags

### Create Feature Flag

```java
import com.azure.data.appconfiguration.models.FeatureFlagConfigurationSetting;
import com.azure.data.appconfiguration.models.FeatureFlagFilter;
import java.util.Arrays;

FeatureFlagFilter percentageFilter = new FeatureFlagFilter("Microsoft.Percentage")
    .addParameter("Value", 50);

FeatureFlagConfigurationSetting featureFlag = new FeatureFlagConfigurationSetting("beta-feature", true)
    .setDescription("Beta feature rollout")
    .setClientFilters(Arrays.asList(percentageFilter));

FeatureFlagConfigurationSetting created = (FeatureFlagConfigurationSetting)
    configClient.addConfigurationSetting(featureFlag);
```

### Get Feature Flag

```java
FeatureFlagConfigurationSetting flag = (FeatureFlagConfigurationSetting)
    configClient.getConfigurationSetting(featureFlag);

System.out.println("Feature: " + flag.getFeatureId());
System.out.println("Enabled: " + flag.isEnabled());
System.out.println("Filters: " + flag.getClientFilters());
```

### Update Feature Flag

```java
featureFlag.setEnabled(false);
FeatureFlagConfigurationSetting updated = (FeatureFlagConfigurationSetting)
    configClient.setConfigurationSetting(featureFlag);
```

## Secret References

### Create Secret Reference

```java
import com.azure.data.appconfiguration.models.SecretReferenceConfigurationSetting;

SecretReferenceConfigurationSetting secretRef = new SecretReferenceConfigurationSetting(
    "app/secrets/api-key",
    "https://myvault.vault.azure.net/secrets/api-key"
);

SecretReferenceConfigurationSetting created = (SecretReferenceConfigurationSetting)
    configClient.addConfigurationSetting(secretRef);
```

### Get Secret Reference

```java
SecretReferenceConfigurationSetting ref = (SecretReferenceConfigurationSetting)
    configClient.getConfigurationSetting(secretRef);

System.out.println("Secret URI: " + ref.getSecretId());
```

## Read-Only Settings

### Set Read-Only

```java
ConfigurationSetting readOnly = configClient.setReadOnly(
    "app/critical/setting",
    "Production",
    true
);
```

### Clear Read-Only

```java
ConfigurationSetting writable = configClient.setReadOnly(
    "app/critical/setting",
    "Production",
    false
);
```

## Snapshots

### Create Snapshot

```java
import com.azure.data.appconfiguration.models.ConfigurationSnapshot;
import com.azure.data.appconfiguration.models.ConfigurationSettingsFilter;
import com.azure.core.util.polling.SyncPoller;
import com.azure.core.util.polling.PollOperationDetails;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;

List<ConfigurationSettingsFilter> filters = new ArrayList<>();
filters.add(new ConfigurationSettingsFilter("app/*"));

SyncPoller<PollOperationDetails, ConfigurationSnapshot> poller = configClient.beginCreateSnapshot(
    "release-v1.0",
    new ConfigurationSnapshot(filters),
    Context.NONE
);
poller.setPollInterval(Duration.ofSeconds(10));
poller.waitForCompletion();

ConfigurationSnapshot snapshot = poller.getFinalResult();
System.out.println("Snapshot: " + snapshot.getName() + ", Status: " + snapshot.getStatus());
```

### Get Snapshot

```java
ConfigurationSnapshot snapshot = configClient.getSnapshot("release-v1.0");
System.out.println("Created: " + snapshot.getCreatedAt());
System.out.println("Items: " + snapshot.getItemCount());
```

### List Settings in Snapshot

```java
PagedIterable<ConfigurationSetting> settings =
    configClient.listConfigurationSettingsForSnapshot("release-v1.0");

for (ConfigurationSetting setting : settings) {
    System.out.println(setting.getKey() + " = " + setting.getValue());
}
```

### Archive Snapshot

```java
ConfigurationSnapshot archived = configClient.archiveSnapshot("release-v1.0");
System.out.println("Status: " + archived.getStatus()); // archived
```

### Recover Snapshot

```java
ConfigurationSnapshot recovered = configClient.recoverSnapshot("release-v1.0");
System.out.println("Status: " + recovered.getStatus()); // ready
```

### List All Snapshots

```java
import com.azure.data.appconfiguration.models.SnapshotSelector;

SnapshotSelector selector = new SnapshotSelector().setNameFilter("release-*");
PagedIterable<ConfigurationSnapshot> snapshots = configClient.listSnapshots(selector);

for (ConfigurationSnapshot snap : snapshots) {
    System.out.println(snap.getName() + " - " + snap.getStatus());
}
```

## Labels

### List Labels

```java
import com.azure.data.appconfiguration.models.SettingLabelSelector;

configClient.listLabels(new SettingLabelSelector().setNameFilter("*"))
    .forEach(label -> System.out.println("Label: " + label.getName()));
```

## Async Operations

```java
ConfigurationAsyncClient asyncClient = new ConfigurationClientBuilder()
    .connectionString(connectionString)
    .buildAsyncClient();

// Async list with reactive streams
asyncClient.listConfigurationSettings(new SettingSelector().setLabelFilter("Production"))
    .subscribe(
        setting -> System.out.println(setting.getKey() + " = " + setting.getValue()),
        error -> System.err.println("Error: " + error.getMessage()),
        () -> System.out.println("Completed")
    );
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    configClient.getConfigurationSetting("nonexistent", null);
} catch (HttpResponseException e) {
    if (e.getResponse().getStatusCode() == 404) {
        System.err.println("Setting not found");
    } else {
        System.err.println("Error: " + e.getMessage());
    }
}
```

## Best Practices

1. **Use labels** — Separate configurations by environment (Dev, Staging, Production)
2. **Use snapshots** — Create immutable snapshots for releases
3. **Feature flags** — Use for gradual rollouts and A/B testing
4. **Secret references** — Store sensitive values in Key Vault
5. **Conditional requests** — Use ETags for optimistic concurrency
6. **Read-only protection** — Lock critical production settings
7. **Use Entra ID** — Preferred over connection strings
8. **Async client** — Use for high-throughput scenarios

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-data-appconfiguration |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/appconfiguration/azure-data-appconfiguration |
| API Documentation | https://aka.ms/java-docs |
| Product Docs | https://learn.microsoft.com/azure/azure-app-configuration |
| Samples | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/appconfiguration/azure-data-appconfiguration/src/samples |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/appconfiguration/azure-data-appconfiguration/TROUBLESHOOTING.md |
<!-- skills/official/microsoft/java/messaging/eventgrid/SKILL.md -->

---
name: azure-eventgrid-java
description: Build event-driven applications with Azure Event Grid SDK for Java. Use when publishing events, implementing pub/sub patterns, or integrating with Azure services via events.
package: com.azure:azure-messaging-eventgrid
---

# Azure Event Grid SDK for Java

Build event-driven applications using the Azure Event Grid SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-messaging-eventgrid</artifactId>
    <version>4.27.0</version>
</dependency>
```

## Client Creation

### EventGridPublisherClient

```java
import com.azure.core.credential.AzureKeyCredential;
import com.azure.core.models.CloudEvent;
import com.azure.messaging.eventgrid.EventGridEvent;
import com.azure.messaging.eventgrid.EventGridPublisherClient;
import com.azure.messaging.eventgrid.EventGridPublisherClientBuilder;

// With API Key
EventGridPublisherClient<EventGridEvent> client = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildEventGridEventPublisherClient();

// For CloudEvents
EventGridPublisherClient<CloudEvent> cloudClient = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildCloudEventPublisherClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

EventGridPublisherClient<EventGridEvent> client = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildEventGridEventPublisherClient();
```

### Async Client

```java
import com.azure.messaging.eventgrid.EventGridPublisherAsyncClient;

EventGridPublisherAsyncClient<EventGridEvent> asyncClient = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildEventGridEventPublisherAsyncClient();
```

## Event Types

| Type | Description |
|------|-------------|
| `EventGridEvent` | Azure Event Grid native schema |
| `CloudEvent` | CNCF CloudEvents 1.0 specification |
| `BinaryData` | Custom schema events |

## Core Patterns

### Publish EventGridEvent

```java
import com.azure.messaging.eventgrid.EventGridEvent;
import com.azure.core.util.BinaryData;

EventGridEvent event = new EventGridEvent(
    "resource/path",                                           // subject
    "MyApp.Events.OrderCreated",                               // eventType
    BinaryData.fromObject(new OrderData("order-123", 99.99)),  // data
    "1.0"                                                      // dataVersion
);

client.sendEvent(event);
```

### Publish Multiple Events

```java
import java.util.Arrays;
import java.util.List;

List<EventGridEvent> events = Arrays.asList(
    new EventGridEvent("orders/1", "Order.Created",
        BinaryData.fromObject(order1), "1.0"),
    new EventGridEvent("orders/2", "Order.Created",
        BinaryData.fromObject(order2), "1.0")
);

client.sendEvents(events);
```

### Publish CloudEvent

```java
import com.azure.core.models.CloudEvent;
import com.azure.core.models.CloudEventDataFormat;
import java.util.UUID;

CloudEvent cloudEvent = new CloudEvent(
    "/myapp/orders",                   // source
    "order.created",                   // type
    BinaryData.fromObject(orderData),  // data
    CloudEventDataFormat.JSON          // dataFormat
);
cloudEvent.setSubject("orders/12345");
cloudEvent.setId(UUID.randomUUID().toString());

cloudClient.sendEvent(cloudEvent);
```

### Publish CloudEvents Batch

```java
List<CloudEvent> cloudEvents = Arrays.asList(
    new CloudEvent("/app", "event.type1", BinaryData.fromString("data1"), CloudEventDataFormat.JSON),
    new CloudEvent("/app", "event.type2", BinaryData.fromString("data2"), CloudEventDataFormat.JSON)
);

cloudClient.sendEvents(cloudEvents);
```

### Async Publishing

```java
asyncClient.sendEvent(event)
    .subscribe(
        unused -> System.out.println("Event sent successfully"),
        error -> System.err.println("Error: " + error.getMessage())
    );

// With multiple events
asyncClient.sendEvents(events)
    .doOnSuccess(unused -> System.out.println("All events sent"))
    .doOnError(error -> System.err.println("Failed: " + error))
    .block(); // Block if needed
```

### Custom Event Data Class

```java
public class OrderData {
    private String orderId;
    private double amount;
    private String customerId;

    public OrderData(String orderId, double amount) {
        this.orderId = orderId;
        this.amount = amount;
    }

    // Getters and setters
}

// Usage
OrderData order = new OrderData("ORD-123", 150.00);
EventGridEvent event = new EventGridEvent(
    "orders/" + order.getOrderId(),
    "MyApp.Order.Created",
    BinaryData.fromObject(order),
    "1.0"
);
```

## Receiving Events

### Parse EventGridEvent

```java
import com.azure.messaging.eventgrid.EventGridEvent;

// From JSON string (e.g., webhook payload)
String jsonPayload = "[{\"id\": \"...\", ...}]";
List<EventGridEvent> events = EventGridEvent.fromString(jsonPayload);

for (EventGridEvent event : events) {
    System.out.println("Event Type: " + event.getEventType());
    System.out.println("Subject: " + event.getSubject());
    System.out.println("Event Time: " + event.getEventTime());

    // Get data
    BinaryData data = event.getData();
    OrderData orderData = data.toObject(OrderData.class);
}
```

### Parse CloudEvent

```java
import com.azure.core.models.CloudEvent;

String cloudEventJson = "[{\"specversion\": \"1.0\", ...}]";
List<CloudEvent> cloudEvents = CloudEvent.fromString(cloudEventJson);

for (CloudEvent event : cloudEvents) {
    System.out.println("Type: " + event.getType());
    System.out.println("Source: " + event.getSource());
    System.out.println("ID: " + event.getId());

    MyEventData data = event.getData().toObject(MyEventData.class);
}
```

### Handle System Events

```java
import com.azure.messaging.eventgrid.systemevents.*;

for (EventGridEvent event : events) {
    if (event.getEventType().equals("Microsoft.Storage.BlobCreated")) {
        StorageBlobCreatedEventData blobData =
            event.getData().toObject(StorageBlobCreatedEventData.class);
        System.out.println("Blob URL: " + blobData.getUrl());
    }
}
```

## Event Grid Namespaces (MQTT/Pull)

### Receive from Namespace Topic

```java
import com.azure.messaging.eventgrid.namespaces.EventGridReceiverClient;
import com.azure.messaging.eventgrid.namespaces.EventGridReceiverClientBuilder;
import com.azure.messaging.eventgrid.namespaces.models.*;
import java.time.Duration;
import java.util.Arrays;

EventGridReceiverClient receiverClient = new EventGridReceiverClientBuilder()
    .endpoint("<namespace-endpoint>")
    .credential(new AzureKeyCredential("<key>"))
    .topicName("my-topic")
    .subscriptionName("my-subscription")
    .buildClient();

// Receive events
ReceiveResult result = receiverClient.receive(10, Duration.ofSeconds(30));

for (ReceiveDetails detail : result.getValue()) {
    CloudEvent event = detail.getEvent();
    System.out.println("Event: " + event.getType());

    // Acknowledge the event
    receiverClient.acknowledge(Arrays.asList(detail.getBrokerProperties().getLockToken()));
}
```

### Reject or Release Events

```java
// Reject (don't retry)
receiverClient.reject(Arrays.asList(lockToken));

// Release (retry later)
receiverClient.release(Arrays.asList(lockToken));

// Release with delay
receiverClient.release(Arrays.asList(lockToken),
    new ReleaseOptions().setDelay(ReleaseDelay.BY_60_SECONDS));
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.sendEvent(event);
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
EVENT_GRID_TOPIC_ENDPOINT=https://<topic-name>.<region>.eventgrid.azure.net/api/events
EVENT_GRID_ACCESS_KEY=<your-access-key>
```

## Best Practices

1. **Batch Events**: Send multiple events in one call when possible
2. **Idempotency**: Include unique event IDs for deduplication
3. **Schema Validation**: Use strongly-typed event data classes
4. **Retry Logic**: Built-in, but consider dead-letter for failures
5. **Event Size**: Keep events under 1MB (64KB for basic tier)
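
On the idempotency point: consumers can deduplicate only if a retried publish carries the same event ID. One hedged way to get stable IDs with the stdlib alone (`orderId` is a hypothetical business key) is a name-based UUID derived from the event's logical identity:

```java
import java.nio.charset.StandardCharsets;
import java.util.UUID;

// Derive the event ID deterministically from the business key, so a retried
// publish of the same logical event carries the same ID.
String orderId = "ORD-123";
String eventKey = "MyApp.Order.Created:" + orderId;
String eventId = UUID.nameUUIDFromBytes(eventKey.getBytes(StandardCharsets.UTF_8)).toString();

// A retry recomputes the identical ID, letting consumers drop the duplicate.
String retryId = UUID.nameUUIDFromBytes(eventKey.getBytes(StandardCharsets.UTF_8)).toString();
```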

## Trigger Phrases

- "Event Grid Java"
- "publish events Azure"
- "CloudEvent SDK"
- "event-driven messaging"
- "pub/sub Azure"
- "webhook events"

<!-- skills/official/microsoft/java/messaging/eventhubs/SKILL.md -->

---
name: azure-eventhub-java
description: Build real-time streaming applications with Azure Event Hubs SDK for Java. Use when implementing event streaming, high-throughput data ingestion, or building event-driven architectures.
package: com.azure:azure-messaging-eventhubs
---

# Azure Event Hubs SDK for Java

Build real-time streaming applications using the Azure Event Hubs SDK for Java.
|
||||
|
||||
## Installation
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-messaging-eventhubs</artifactId>
|
||||
<version>5.19.0</version>
|
||||
</dependency>
|
||||
|
||||
<!-- For checkpoint store (production) -->
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-messaging-eventhubs-checkpointstore-blob</artifactId>
|
||||
<version>1.20.0</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
## Client Creation
|
||||
|
||||
### EventHubProducerClient
|
||||
|
||||
```java
|
||||
import com.azure.messaging.eventhubs.EventHubProducerClient;
|
||||
import com.azure.messaging.eventhubs.EventHubClientBuilder;
|
||||
|
||||
// With connection string
|
||||
EventHubProducerClient producer = new EventHubClientBuilder()
|
||||
.connectionString("<connection-string>", "<event-hub-name>")
|
||||
.buildProducerClient();
|
||||
|
||||
// Full connection string with EntityPath
|
||||
EventHubProducerClient producer = new EventHubClientBuilder()
|
||||
.connectionString("<connection-string-with-entity-path>")
|
||||
.buildProducerClient();
|
||||
```
|
||||
|
||||
### With DefaultAzureCredential
|
||||
|
||||
```java
|
||||
import com.azure.identity.DefaultAzureCredentialBuilder;
|
||||
|
||||
EventHubProducerClient producer = new EventHubClientBuilder()
|
||||
.fullyQualifiedNamespace("<namespace>.servicebus.windows.net")
|
||||
.eventHubName("<event-hub-name>")
|
||||
.credential(new DefaultAzureCredentialBuilder().build())
|
||||
.buildProducerClient();
|
||||
```
|
||||
|
||||
### EventHubConsumerClient
|
||||
|
||||
```java
|
||||
import com.azure.messaging.eventhubs.EventHubConsumerClient;
|
||||
|
||||
EventHubConsumerClient consumer = new EventHubClientBuilder()
|
||||
.connectionString("<connection-string>", "<event-hub-name>")
|
||||
.consumerGroup(EventHubClientBuilder.DEFAULT_CONSUMER_GROUP_NAME)
|
||||
.buildConsumerClient();
|
||||
```
|
||||
|
||||
### Async Clients
|
||||
|
||||
```java
|
||||
import com.azure.messaging.eventhubs.EventHubProducerAsyncClient;
|
||||
import com.azure.messaging.eventhubs.EventHubConsumerAsyncClient;
|
||||
|
||||
EventHubProducerAsyncClient asyncProducer = new EventHubClientBuilder()
|
||||
.connectionString("<connection-string>", "<event-hub-name>")
|
||||
.buildAsyncProducerClient();
|
||||
|
||||
EventHubConsumerAsyncClient asyncConsumer = new EventHubClientBuilder()
|
||||
.connectionString("<connection-string>", "<event-hub-name>")
|
||||
.consumerGroup("$Default")
|
||||
.buildAsyncConsumerClient();
|
||||
```
|
||||
|
||||
## Core Patterns
|
||||
|
||||
### Send Single Event
|
||||
|
||||
```java
|
||||
import com.azure.messaging.eventhubs.EventData;
|
||||
|
||||
EventData eventData = new EventData("Hello, Event Hubs!");
|
||||
producer.send(Collections.singletonList(eventData));
|
||||
```
|
||||
|
||||
### Send Event Batch
|
||||
|
||||
```java
|
||||
import com.azure.messaging.eventhubs.EventDataBatch;
|
||||
import com.azure.messaging.eventhubs.models.CreateBatchOptions;
|
||||
|
||||
// Create batch
|
||||
EventDataBatch batch = producer.createBatch();
|
||||
|
||||
// Add events (returns false if batch is full)
|
||||
for (int i = 0; i < 100; i++) {
|
||||
EventData event = new EventData("Event " + i);
|
||||
if (!batch.tryAdd(event)) {
|
||||
// Batch is full, send and create new batch
|
||||
producer.send(batch);
|
||||
batch = producer.createBatch();
|
||||
batch.tryAdd(event);
|
||||
}
|
||||
}
|
||||
|
||||
// Send remaining events
|
||||
if (batch.getCount() > 0) {
|
||||
producer.send(batch);
|
||||
}
|
||||
```
|
||||
|
||||
### Send to Specific Partition
|
||||
|
||||
```java
|
||||
CreateBatchOptions options = new CreateBatchOptions()
|
||||
.setPartitionId("0");
|
||||
|
||||
EventDataBatch batch = producer.createBatch(options);
|
||||
batch.tryAdd(new EventData("Partition 0 event"));
|
||||
producer.send(batch);
|
||||
```
|
||||
|
||||
### Send with Partition Key
|
||||
|
||||
```java
|
||||
CreateBatchOptions options = new CreateBatchOptions()
|
||||
.setPartitionKey("customer-123");
|
||||
|
||||
EventDataBatch batch = producer.createBatch(options);
|
||||
batch.tryAdd(new EventData("Customer event"));
|
||||
producer.send(batch);
|
||||
```
|
||||
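Why a partition key guarantees per-key ordering: the service computes a stable hash of the key and always routes it to the same partition. The sketch below illustrates the idea with CRC32; the actual Event Hubs gateway uses its own hash function, so this is purely conceptual.

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

// Conceptual sketch: a deterministic hash of the partition key picks the
// same partition on every send, which is what yields per-key ordering.
public class PartitionKeyDemo {
    static int partitionFor(String partitionKey, int partitionCount) {
        CRC32 crc = new CRC32();
        crc.update(partitionKey.getBytes(StandardCharsets.UTF_8));
        return (int) (crc.getValue() % partitionCount); // stable for a given key
    }
}
```

Every event sent with `setPartitionKey("customer-123")` therefore lands on one partition, preserving order for that customer while other keys spread across the rest.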
### Event with Properties

```java
EventData event = new EventData("Order created");
event.getProperties().put("orderId", "ORD-123");
event.getProperties().put("customerId", "CUST-456");
event.getProperties().put("priority", 1);

producer.send(Collections.singletonList(event));
```

### Receive Events (Simple)

```java
import com.azure.messaging.eventhubs.models.EventPosition;
import com.azure.messaging.eventhubs.models.PartitionEvent;

// Receive from a specific partition
Iterable<PartitionEvent> events = consumer.receiveFromPartition(
    "0",                      // partitionId
    10,                       // maxEvents
    EventPosition.earliest(), // startingPosition
    Duration.ofSeconds(30)    // timeout
);

for (PartitionEvent partitionEvent : events) {
    EventData event = partitionEvent.getData();
    System.out.println("Body: " + event.getBodyAsString());
    System.out.println("Sequence: " + event.getSequenceNumber());
    System.out.println("Offset: " + event.getOffset());
}
```

### EventProcessorClient (Production)

```java
import com.azure.messaging.eventhubs.EventProcessorClient;
import com.azure.messaging.eventhubs.EventProcessorClientBuilder;
import com.azure.messaging.eventhubs.checkpointstore.blob.BlobCheckpointStore;
import com.azure.storage.blob.BlobContainerAsyncClient;
import com.azure.storage.blob.BlobContainerClientBuilder;

// Create checkpoint store
BlobContainerAsyncClient blobClient = new BlobContainerClientBuilder()
    .connectionString("<storage-connection-string>")
    .containerName("checkpoints")
    .buildAsyncClient();

// Create processor
EventProcessorClient processor = new EventProcessorClientBuilder()
    .connectionString("<eventhub-connection-string>", "<event-hub-name>")
    .consumerGroup("$Default")
    .checkpointStore(new BlobCheckpointStore(blobClient))
    .processEvent(eventContext -> {
        EventData event = eventContext.getEventData();
        System.out.println("Processing: " + event.getBodyAsString());

        // Checkpoint after processing
        eventContext.updateCheckpoint();
    })
    .processError(errorContext -> {
        System.err.println("Error: " + errorContext.getThrowable().getMessage());
        System.err.println("Partition: " + errorContext.getPartitionContext().getPartitionId());
    })
    .buildEventProcessorClient();

// Start processing
processor.start();

// Keep running...
Thread.sleep(Duration.ofMinutes(5).toMillis());

// Stop gracefully
processor.stop();
```

### Batch Processing

```java
EventProcessorClient processor = new EventProcessorClientBuilder()
    .connectionString("<connection-string>", "<event-hub-name>")
    .consumerGroup("$Default")
    .checkpointStore(new BlobCheckpointStore(blobClient))
    .processEventBatch(eventBatchContext -> {
        List<EventData> events = eventBatchContext.getEvents();
        System.out.printf("Received %d events%n", events.size());

        for (EventData event : events) {
            // Process each event
            System.out.println(event.getBodyAsString());
        }

        // Checkpoint after the batch
        eventBatchContext.updateCheckpoint();
    }, 50) // maxBatchSize
    .processError(errorContext -> {
        System.err.println("Error: " + errorContext.getThrowable());
    })
    .buildEventProcessorClient();
```

### Async Receiving

```java
asyncConsumer.receiveFromPartition("0", EventPosition.latest())
    .subscribe(
        partitionEvent -> {
            EventData event = partitionEvent.getData();
            System.out.println("Received: " + event.getBodyAsString());
        },
        error -> System.err.println("Error: " + error),
        () -> System.out.println("Complete")
    );
```

### Get Event Hub Properties

```java
// Get hub info
EventHubProperties hubProps = producer.getEventHubProperties();
System.out.println("Hub: " + hubProps.getName());
System.out.println("Partitions: " + hubProps.getPartitionIds());

// Get partition info
PartitionProperties partitionProps = producer.getPartitionProperties("0");
System.out.println("Begin sequence: " + partitionProps.getBeginningSequenceNumber());
System.out.println("Last sequence: " + partitionProps.getLastEnqueuedSequenceNumber());
System.out.println("Last offset: " + partitionProps.getLastEnqueuedOffset());
```

## Event Positions

```java
// Start from the beginning of the partition
EventPosition.earliest()

// Start from the end (new events only)
EventPosition.latest()

// From a specific offset
EventPosition.fromOffset(12345L)

// From a specific sequence number
EventPosition.fromSequenceNumber(100L)

// From a specific enqueued time
EventPosition.fromEnqueuedTime(Instant.now().minus(Duration.ofHours(1)))
```

## Error Handling

```java
import com.azure.core.amqp.exception.AmqpException;
import com.azure.messaging.eventhubs.models.ErrorContext;

.processError(errorContext -> {
    Throwable error = errorContext.getThrowable();
    String partitionId = errorContext.getPartitionContext().getPartitionId();

    if (error instanceof AmqpException) {
        AmqpException amqpError = (AmqpException) error;
        if (amqpError.isTransient()) {
            System.out.println("Transient error, will retry");
        }
    }

    System.err.printf("Error on partition %s: %s%n", partitionId, error.getMessage());
})
```

## Resource Cleanup

```java
// Always close clients
try {
    producer.send(batch);
} finally {
    producer.close();
}

// Or use try-with-resources
try (EventHubProducerClient producer = new EventHubClientBuilder()
        .connectionString(connectionString, eventHubName)
        .buildProducerClient()) {
    producer.send(events);
}
```

## Environment Variables

```bash
EVENT_HUBS_CONNECTION_STRING=Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...
EVENT_HUBS_NAME=<event-hub-name>
STORAGE_CONNECTION_STRING=<for-checkpointing>
```

## Best Practices

1. **Use EventProcessorClient**: In production it provides load balancing and checkpointing
2. **Batch Events**: Use `EventDataBatch` for efficient sending
3. **Partition Keys**: Use them when you need ordering guarantees within a partition
4. **Checkpointing**: Checkpoint after processing to avoid reprocessing
5. **Error Handling**: Handle transient errors with retries
6. **Close Clients**: Always close producers and consumers when done
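The retry advice in best practice 5 can be sketched in plain Java. `withRetries` is a hypothetical helper, not an SDK API; in real code you would retry only when the SDK reports a transient failure (for example `AmqpException.isTransient()`).

```java
import java.util.concurrent.Callable;

// Exponential-backoff retry sketch for transient send failures.
public class Backoff {
    static <T> T withRetries(Callable<T> op, int maxAttempts, long baseDelayMs)
            throws Exception {
        Exception last = null;
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            try {
                return op.call();
            } catch (Exception e) {
                last = e;
                // Delay doubles each attempt: base, 2x, 4x, ...
                Thread.sleep(baseDelayMs << attempt);
            }
        }
        throw last; // give up after maxAttempts
    }
}
```

A send would then be wrapped as `withRetries(() -> { producer.send(batch); return null; }, 5, 100)`, with non-transient errors rethrown immediately instead of retried.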

## Trigger Phrases

- "Event Hubs Java"
- "event streaming Azure"
- "real-time data ingestion"
- "EventProcessorClient"
- "event hub producer consumer"
- "partition processing"

302  skills/official/microsoft/java/messaging/webpubsub/SKILL.md  Normal file
@@ -0,0 +1,302 @@
---
name: azure-messaging-webpubsub-java
description: Build real-time web applications with Azure Web PubSub SDK for Java. Use when implementing WebSocket-based messaging, live updates, chat applications, or server-to-client push notifications.
package: com.azure:azure-messaging-webpubsub
---

# Azure Web PubSub SDK for Java

Build real-time web applications using the Azure Web PubSub SDK for Java.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-messaging-webpubsub</artifactId>
  <version>1.5.0</version>
</dependency>
```

## Client Creation

### With Connection String

```java
import com.azure.messaging.webpubsub.WebPubSubServiceClient;
import com.azure.messaging.webpubsub.WebPubSubServiceClientBuilder;

WebPubSubServiceClient client = new WebPubSubServiceClientBuilder()
    .connectionString("<connection-string>")
    .hub("chat")
    .buildClient();
```

### With Access Key

```java
import com.azure.core.credential.AzureKeyCredential;

WebPubSubServiceClient client = new WebPubSubServiceClientBuilder()
    .credential(new AzureKeyCredential("<access-key>"))
    .endpoint("<endpoint>")
    .hub("chat")
    .buildClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

WebPubSubServiceClient client = new WebPubSubServiceClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint("<endpoint>")
    .hub("chat")
    .buildClient();
```

### Async Client

```java
import com.azure.messaging.webpubsub.WebPubSubServiceAsyncClient;

WebPubSubServiceAsyncClient asyncClient = new WebPubSubServiceClientBuilder()
    .connectionString("<connection-string>")
    .hub("chat")
    .buildAsyncClient();
```

## Key Concepts

- **Hub**: Logical isolation unit for connections
- **Group**: Subset of connections within a hub
- **Connection**: Individual WebSocket client connection
- **User**: Entity that can have multiple connections
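As a mental model, the relationships between these concepts can be sketched with plain maps. This toy `HubModel` is purely illustrative (it is not how the service stores state): a user can own several connections, and a group is just a set of connection IDs inside a hub.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Toy in-memory model of hub/group/connection/user relationships.
public class HubModel {
    final Map<String, Set<String>> groups = new HashMap<>(); // group -> connection IDs
    final Map<String, Set<String>> users  = new HashMap<>(); // user  -> connection IDs

    void connect(String userId, String connectionId) {
        users.computeIfAbsent(userId, u -> new HashSet<>()).add(connectionId);
    }

    void joinGroup(String group, String connectionId) {
        groups.computeIfAbsent(group, g -> new HashSet<>()).add(connectionId);
    }

    // sendToGroup targets the group's connections...
    Set<String> targetsForGroup(String group) {
        return groups.getOrDefault(group, Set.of());
    }

    // ...while sendToUser targets every connection the user owns.
    Set<String> targetsForUser(String userId) {
        return users.getOrDefault(userId, Set.of());
    }
}
```

For example, a user with two browser tabs open has two connections, so `sendToUser` reaches both, while `sendToGroup` reaches only the connections that joined that group.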
## Core Patterns

### Send to All Connections

```java
import com.azure.messaging.webpubsub.models.WebPubSubContentType;

// Send text message
client.sendToAll("Hello everyone!", WebPubSubContentType.TEXT_PLAIN);

// Send JSON
String jsonMessage = "{\"type\": \"notification\", \"message\": \"New update!\"}";
client.sendToAll(jsonMessage, WebPubSubContentType.APPLICATION_JSON);
```

### Send to All with Filter

```java
import com.azure.core.http.rest.RequestOptions;
import com.azure.core.util.BinaryData;

BinaryData message = BinaryData.fromString("Hello filtered users!");

// Filter by userId
client.sendToAllWithResponse(
    message,
    WebPubSubContentType.TEXT_PLAIN,
    message.getLength(),
    new RequestOptions().addQueryParam("filter", "userId ne 'user1'"));

// Filter by groups
client.sendToAllWithResponse(
    message,
    WebPubSubContentType.TEXT_PLAIN,
    message.getLength(),
    new RequestOptions().addQueryParam("filter", "'GroupA' in groups and not('GroupB' in groups)"));
```

### Send to Group

```java
// Send to all connections in a group
client.sendToGroup("java-developers", "Hello Java devs!", WebPubSubContentType.TEXT_PLAIN);

// Send JSON to group
String json = "{\"event\": \"update\", \"data\": {\"version\": \"2.0\"}}";
client.sendToGroup("subscribers", json, WebPubSubContentType.APPLICATION_JSON);
```

### Send to Specific Connection

```java
// Send to a specific connection by ID
client.sendToConnection("connectionId123", "Private message", WebPubSubContentType.TEXT_PLAIN);
```

### Send to User

```java
// Send to all connections for a specific user
client.sendToUser("andy", "Hello Andy!", WebPubSubContentType.TEXT_PLAIN);
```

### Manage Groups

```java
// Add connection to group
client.addConnectionToGroup("premium-users", "connectionId123");

// Remove connection from group
client.removeConnectionFromGroup("premium-users", "connectionId123");

// Add user to group (all their connections)
client.addUserToGroup("admin-group", "userId456");

// Remove user from group
client.removeUserFromGroup("admin-group", "userId456");

// Check if user is in group
boolean exists = client.userExistsInGroup("admin-group", "userId456");
```

### Manage Connections

```java
// Check if connection exists
boolean connected = client.connectionExists("connectionId123");

// Close a connection
client.closeConnection("connectionId123");

// Close with reason
client.closeConnection("connectionId123", "Session expired");

// Check if user exists (has any connections)
boolean userOnline = client.userExists("userId456");

// Close all connections for a user
client.closeUserConnections("userId456");

// Close all connections in a group
client.closeGroupConnections("inactive-group");
```

### Generate Client Access Token

```java
import com.azure.messaging.webpubsub.models.GetClientAccessTokenOptions;
import com.azure.messaging.webpubsub.models.WebPubSubClientAccessToken;

// Basic token
WebPubSubClientAccessToken token = client.getClientAccessToken(
    new GetClientAccessTokenOptions());
System.out.println("URL: " + token.getUrl());

// With user ID
WebPubSubClientAccessToken userToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions().setUserId("user123"));

// With roles (permissions)
WebPubSubClientAccessToken roleToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions()
        .setUserId("user123")
        .addRole("webpubsub.joinLeaveGroup")
        .addRole("webpubsub.sendToGroup"));

// With groups to join on connect
WebPubSubClientAccessToken groupToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions()
        .setUserId("user123")
        .addGroup("announcements")
        .addGroup("updates"));

// With custom expiration
WebPubSubClientAccessToken expToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions()
        .setUserId("user123")
        .setExpiresAfter(Duration.ofHours(2)));
```

### Grant/Revoke Permissions

```java
import com.azure.core.http.rest.RequestOptions;
import com.azure.messaging.webpubsub.models.WebPubSubPermission;

// Grant permission to send to a group
client.grantPermission(
    WebPubSubPermission.SEND_TO_GROUP,
    "connectionId123",
    new RequestOptions().addQueryParam("targetName", "chat-room"));

// Revoke permission
client.revokePermission(
    WebPubSubPermission.SEND_TO_GROUP,
    "connectionId123",
    new RequestOptions().addQueryParam("targetName", "chat-room"));

// Check permission
boolean hasPermission = client.checkPermission(
    WebPubSubPermission.SEND_TO_GROUP,
    "connectionId123",
    new RequestOptions().addQueryParam("targetName", "chat-room"));
```

### Async Operations

```java
asyncClient.sendToAll("Async message!", WebPubSubContentType.TEXT_PLAIN)
    .subscribe(
        unused -> System.out.println("Message sent"),
        error -> System.err.println("Error: " + error.getMessage())
    );

asyncClient.sendToGroup("developers", "Group message", WebPubSubContentType.TEXT_PLAIN)
    .doOnSuccess(v -> System.out.println("Sent to group"))
    .doOnError(e -> System.err.println("Failed: " + e))
    .subscribe();
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.sendToConnection("invalid-id", "test", WebPubSubContentType.TEXT_PLAIN);
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
WEB_PUBSUB_CONNECTION_STRING=Endpoint=https://<resource>.webpubsub.azure.com;AccessKey=...
WEB_PUBSUB_ENDPOINT=https://<resource>.webpubsub.azure.com
WEB_PUBSUB_ACCESS_KEY=<your-access-key>
```

## Client Roles

| Role | Permission |
|------|------------|
| `webpubsub.joinLeaveGroup` | Join/leave any group |
| `webpubsub.sendToGroup` | Send to any group |
| `webpubsub.joinLeaveGroup.<group>` | Join/leave specific group |
| `webpubsub.sendToGroup.<group>` | Send to specific group |
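The scoped and unscoped roles in the table compose by simple string matching: the bare role grants the action for every group, while the `.<group>` suffix restricts it to one group. The service evaluates this server-side; the sketch below only mirrors the rule for illustration.

```java
// Mirrors the role rule from the table above: "webpubsub.sendToGroup"
// covers any group, "webpubsub.sendToGroup.<group>" covers exactly one.
public class RoleCheck {
    static boolean canSendToGroup(Iterable<String> roles, String group) {
        for (String role : roles) {
            if (role.equals("webpubsub.sendToGroup")
                    || role.equals("webpubsub.sendToGroup." + group)) {
                return true;
            }
        }
        return false;
    }
}
```

The same pattern applies to `webpubsub.joinLeaveGroup` and its group-scoped variant.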

## Best Practices

1. **Use Groups**: Organize connections into groups for targeted messaging
2. **User IDs**: Associate connections with user IDs for user-level messaging
3. **Token Expiration**: Set an appropriate token expiration for security
4. **Roles**: Grant the minimal required permissions via roles
5. **Hub Isolation**: Use separate hubs for different application features
6. **Connection Management**: Clean up inactive connections

## Trigger Phrases

- "Web PubSub Java"
- "WebSocket messaging Azure"
- "real-time push notifications"
- "server-sent events"
- "chat application backend"
- "live updates broadcasting"

230  skills/official/microsoft/java/monitoring/ingestion/SKILL.md  Normal file
@@ -0,0 +1,230 @@
---
name: azure-monitor-ingestion-java
description: |
  Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE).
  Triggers: "LogsIngestionClient java", "azure monitor ingestion java", "custom logs java", "DCR java", "data collection rule java".
package: com.azure:azure-monitor-ingestion
---

# Azure Monitor Ingestion SDK for Java

Client library for sending custom logs to Azure Monitor using the Logs Ingestion API via Data Collection Rules.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-monitor-ingestion</artifactId>
  <version>1.2.11</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.azure</groupId>
      <artifactId>azure-sdk-bom</artifactId>
      <version>{bom_version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-monitor-ingestion</artifactId>
  </dependency>
</dependencies>
```

## Prerequisites

- Data Collection Endpoint (DCE)
- Data Collection Rule (DCR)
- Log Analytics workspace
- Target table (custom or built-in: CommonSecurityLog, SecurityEvents, Syslog, WindowsEvents)

## Environment Variables

```bash
DATA_COLLECTION_ENDPOINT=https://<dce-name>.<region>.ingest.monitor.azure.com
DATA_COLLECTION_RULE_ID=dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
STREAM_NAME=Custom-MyTable_CL
```

## Client Creation

### Synchronous Client

```java
import com.azure.identity.DefaultAzureCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.ingestion.LogsIngestionClient;
import com.azure.monitor.ingestion.LogsIngestionClientBuilder;

DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

LogsIngestionClient client = new LogsIngestionClientBuilder()
    .endpoint("<data-collection-endpoint>")
    .credential(credential)
    .buildClient();
```

### Asynchronous Client

```java
import com.azure.monitor.ingestion.LogsIngestionAsyncClient;

LogsIngestionAsyncClient asyncClient = new LogsIngestionClientBuilder()
    .endpoint("<data-collection-endpoint>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Data Collection Endpoint (DCE) | Ingestion endpoint URL for your region |
| Data Collection Rule (DCR) | Defines data transformation and routing to tables |
| Stream Name | Target stream in the DCR (e.g., `Custom-MyTable_CL`) |
| Log Analytics Workspace | Destination for ingested logs |

## Core Operations

### Upload Custom Logs

```java
import java.util.List;
import java.util.ArrayList;

List<Object> logs = new ArrayList<>();
logs.add(new MyLogEntry("2024-01-15T10:30:00Z", "INFO", "Application started"));
logs.add(new MyLogEntry("2024-01-15T10:30:05Z", "DEBUG", "Processing request"));

client.upload("<data-collection-rule-id>", "<stream-name>", logs);
System.out.println("Logs uploaded successfully");
```

### Upload with Concurrency

For large log collections, enable concurrent uploads:

```java
import com.azure.monitor.ingestion.models.LogsUploadOptions;
import com.azure.core.util.Context;

List<Object> logs = getLargeLogs(); // Large collection

LogsUploadOptions options = new LogsUploadOptions()
    .setMaxConcurrency(3);

client.upload("<data-collection-rule-id>", "<stream-name>", logs, options, Context.NONE);
```

### Upload with Error Handling

Handle partial upload failures gracefully:

```java
LogsUploadOptions options = new LogsUploadOptions()
    .setLogsUploadErrorConsumer(uploadError -> {
        System.err.println("Upload error: " + uploadError.getResponseException().getMessage());
        System.err.println("Failed logs count: " + uploadError.getFailedLogs().size());

        // Option 1: Log and continue
        // Option 2: Throw to abort remaining uploads
        // throw uploadError.getResponseException();
    });

client.upload("<data-collection-rule-id>", "<stream-name>", logs, options, Context.NONE);
```

### Async Upload with Reactor

```java
import reactor.core.publisher.Mono;

List<Object> logs = getLogs();

asyncClient.upload("<data-collection-rule-id>", "<stream-name>", logs)
    .doOnSuccess(v -> System.out.println("Upload completed"))
    .doOnError(e -> System.err.println("Upload failed: " + e.getMessage()))
    .subscribe();
```

## Log Entry Model Example

```java
public class MyLogEntry {
    private String timeGenerated;
    private String level;
    private String message;

    public MyLogEntry(String timeGenerated, String level, String message) {
        this.timeGenerated = timeGenerated;
        this.level = level;
        this.message = message;
    }

    // Getters required for JSON serialization
    public String getTimeGenerated() { return timeGenerated; }
    public String getLevel() { return level; }
    public String getMessage() { return message; }
}
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.upload(ruleId, streamName, logs);
} catch (HttpResponseException e) {
    System.err.println("HTTP Status: " + e.getResponse().getStatusCode());
    System.err.println("Error: " + e.getMessage());

    if (e.getResponse().getStatusCode() == 403) {
        System.err.println("Check DCR permissions and managed identity");
    } else if (e.getResponse().getStatusCode() == 404) {
        System.err.println("Verify DCE endpoint and DCR ID");
    }
}
```

## Best Practices

1. **Batch logs** — Upload in batches rather than one at a time
2. **Use concurrency** — Set `maxConcurrency` for large uploads
3. **Handle partial failures** — Use the error consumer to log failed entries
4. **Match the DCR schema** — Log entry fields must match the DCR's transformation expectations
5. **Include TimeGenerated** — Most tables require a timestamp field
6. **Reuse the client** — Create it once and reuse it throughout the application
7. **Use async for high throughput** — `LogsIngestionAsyncClient` for reactive patterns
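Best practice 1 (batching) can be sketched with a plain chunking helper; `LogBatcher` is a hypothetical illustration, and note the SDK itself also splits uploads into size-limited requests internally, so this only shows the caller-side idea.

```java
import java.util.ArrayList;
import java.util.List;

// Split a large list of log entries into fixed-size batches so each
// upload call carries a bounded number of entries.
public class LogBatcher {
    static <T> List<List<T>> chunks(List<T> logs, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < logs.size(); i += batchSize) {
            batches.add(logs.subList(i, Math.min(i + batchSize, logs.size())));
        }
        return batches;
    }
}
```

Each batch would then be passed to `client.upload(ruleId, streamName, batch)` in turn (or concurrently with `setMaxConcurrency`).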

## Querying Uploaded Logs

Use [azure-monitor-query](../query/SKILL.md) to query ingested logs:

```java
// See the azure-monitor-query skill for LogsQueryClient usage
String query = "MyTable_CL | where TimeGenerated > ago(1h) | limit 10";
```

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-monitor-ingestion |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/monitor/azure-monitor-ingestion |
| Product Docs | https://learn.microsoft.com/azure/azure-monitor/logs/logs-ingestion-api-overview |
| DCE Overview | https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview |
| DCR Overview | https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-ingestion/TROUBLESHOOTING.md |
@@ -0,0 +1,284 @@
|
||||
---
name: azure-monitor-opentelemetry-exporter-java
description: |
  Azure Monitor OpenTelemetry Exporter for Java. Export OpenTelemetry traces, metrics, and logs to Azure Monitor/Application Insights.
  Triggers: "AzureMonitorExporter java", "opentelemetry azure java", "application insights java otel", "azure monitor tracing java".
  Note: This package is DEPRECATED. Migrate to azure-monitor-opentelemetry-autoconfigure.
package: com.azure:azure-monitor-opentelemetry-exporter
---

# Azure Monitor OpenTelemetry Exporter for Java

> **⚠️ DEPRECATION NOTICE**: This package is deprecated. Migrate to `azure-monitor-opentelemetry-autoconfigure`.
>
> See [Migration Guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-opentelemetry-exporter/MIGRATION.md) for detailed instructions.

Export OpenTelemetry telemetry data to Azure Monitor / Application Insights.

## Installation (Deprecated)

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-monitor-opentelemetry-exporter</artifactId>
  <version>1.0.0-beta.x</version>
</dependency>
```

## Recommended: Use Autoconfigure Instead

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-monitor-opentelemetry-autoconfigure</artifactId>
  <version>LATEST</version>
</dependency>
```

## Environment Variables

```bash
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/
```

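The connection string above is a semicolon-separated list of `key=value` pairs. As a rough illustration of that format (not SDK code — `ConnectionStringDemo` is a hypothetical helper; the SDK parses the string internally):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ConnectionStringDemo {
    // Split "Key=Value;Key2=Value2" into an ordered map (illustrative sketch).
    static Map<String, String> parse(String connectionString) {
        Map<String, String> parts = new LinkedHashMap<>();
        for (String pair : connectionString.split(";")) {
            int eq = pair.indexOf('=');   // split on the FIRST '=' only;
            if (eq < 0) continue;         // values (e.g. URLs) may contain '='
            parts.put(pair.substring(0, eq), pair.substring(eq + 1));
        }
        return parts;
    }

    public static void main(String[] args) {
        Map<String, String> p = parse(
            "InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/");
        System.out.println(p.get("IngestionEndpoint"));
    }
}
```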
## Basic Setup with Autoconfigure (Recommended)

### Using Environment Variable

```java
import io.opentelemetry.sdk.autoconfigure.AutoConfiguredOpenTelemetrySdk;
import io.opentelemetry.sdk.autoconfigure.AutoConfiguredOpenTelemetrySdkBuilder;
import io.opentelemetry.api.OpenTelemetry;
import com.azure.monitor.opentelemetry.exporter.AzureMonitorExporter;

// Connection string from APPLICATIONINSIGHTS_CONNECTION_STRING env var
AutoConfiguredOpenTelemetrySdkBuilder sdkBuilder = AutoConfiguredOpenTelemetrySdk.builder();
AzureMonitorExporter.customize(sdkBuilder);
OpenTelemetry openTelemetry = sdkBuilder.build().getOpenTelemetrySdk();
```

### With Explicit Connection String

```java
AutoConfiguredOpenTelemetrySdkBuilder sdkBuilder = AutoConfiguredOpenTelemetrySdk.builder();
AzureMonitorExporter.customize(sdkBuilder, "{connection-string}");
OpenTelemetry openTelemetry = sdkBuilder.build().getOpenTelemetrySdk();
```

## Creating Spans

```java
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.context.Scope;

// Get tracer
Tracer tracer = openTelemetry.getTracer("com.example.myapp");

// Create span
Span span = tracer.spanBuilder("myOperation").startSpan();

try (Scope scope = span.makeCurrent()) {
    // Your application logic
    doWork();
} catch (Throwable t) {
    span.recordException(t);
    throw t;
} finally {
    span.end();
}
```

## Adding Span Attributes

```java
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;

Span span = tracer.spanBuilder("processOrder")
    .setAttribute("order.id", "12345")
    .setAttribute("customer.tier", "premium")
    .startSpan();

try (Scope scope = span.makeCurrent()) {
    // Add attributes during execution
    span.setAttribute("items.count", 3);
    span.setAttribute("total.amount", 99.99);

    processOrder();
} finally {
    span.end();
}
```

## Custom Span Processor

```java
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.sdk.trace.SpanProcessor;
import io.opentelemetry.sdk.trace.ReadWriteSpan;
import io.opentelemetry.sdk.trace.ReadableSpan;
import io.opentelemetry.context.Context;

private static final AttributeKey<String> CUSTOM_ATTR = AttributeKey.stringKey("custom.attribute");

SpanProcessor customProcessor = new SpanProcessor() {
    @Override
    public void onStart(Context context, ReadWriteSpan span) {
        // Add custom attribute to every span
        span.setAttribute(CUSTOM_ATTR, "customValue");
    }

    @Override
    public boolean isStartRequired() {
        return true;
    }

    @Override
    public void onEnd(ReadableSpan span) {
        // Post-processing if needed
    }

    @Override
    public boolean isEndRequired() {
        return false;
    }
};

// Register processor
AutoConfiguredOpenTelemetrySdkBuilder sdkBuilder = AutoConfiguredOpenTelemetrySdk.builder();
AzureMonitorExporter.customize(sdkBuilder);

sdkBuilder.addTracerProviderCustomizer(
    (sdkTracerProviderBuilder, configProperties) ->
        sdkTracerProviderBuilder.addSpanProcessor(customProcessor)
);

OpenTelemetry openTelemetry = sdkBuilder.build().getOpenTelemetrySdk();
```

## Nested Spans

```java
public void parentOperation() {
    Span parentSpan = tracer.spanBuilder("parentOperation").startSpan();
    try (Scope scope = parentSpan.makeCurrent()) {
        childOperation();
    } finally {
        parentSpan.end();
    }
}

public void childOperation() {
    // Automatically links to parent via Context
    Span childSpan = tracer.spanBuilder("childOperation").startSpan();
    try (Scope scope = childSpan.makeCurrent()) {
        // Child work
    } finally {
        childSpan.end();
    }
}
```

## Recording Exceptions

```java
import io.opentelemetry.api.trace.StatusCode;

Span span = tracer.spanBuilder("riskyOperation").startSpan();
try (Scope scope = span.makeCurrent()) {
    performRiskyWork();
} catch (Exception e) {
    span.recordException(e);
    span.setStatus(StatusCode.ERROR, e.getMessage());
    throw e;
} finally {
    span.end();
}
```

## Metrics (via OpenTelemetry)

```java
import io.opentelemetry.api.metrics.Meter;
import io.opentelemetry.api.metrics.LongCounter;
import io.opentelemetry.api.metrics.LongHistogram;

Meter meter = openTelemetry.getMeter("com.example.myapp");

// Counter
LongCounter requestCounter = meter.counterBuilder("http.requests")
    .setDescription("Total HTTP requests")
    .setUnit("requests")
    .build();

requestCounter.add(1, Attributes.of(
    AttributeKey.stringKey("http.method"), "GET",
    AttributeKey.longKey("http.status_code"), 200L
));

// Histogram
LongHistogram latencyHistogram = meter.histogramBuilder("http.latency")
    .setDescription("Request latency")
    .setUnit("ms")
    .ofLongs()
    .build();

latencyHistogram.record(150, Attributes.of(
    AttributeKey.stringKey("http.route"), "/api/users"
));
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Connection String | Application Insights connection string with instrumentation key |
| Tracer | Creates spans for distributed tracing |
| Span | Represents a unit of work with timing and attributes |
| SpanProcessor | Intercepts span lifecycle for customization |
| Exporter | Sends telemetry to Azure Monitor |

## Migration to Autoconfigure

The `azure-monitor-opentelemetry-autoconfigure` package provides:

- Automatic instrumentation of common libraries
- Simplified configuration
- Better integration with OpenTelemetry SDK

### Migration Steps

1. Replace dependency:

```xml
<!-- Remove -->
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-monitor-opentelemetry-exporter</artifactId>
</dependency>

<!-- Add -->
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-monitor-opentelemetry-autoconfigure</artifactId>
</dependency>
```

2. Update initialization code per the [Migration Guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-opentelemetry-exporter/MIGRATION.md)

## Best Practices

1. **Use autoconfigure** — Migrate to `azure-monitor-opentelemetry-autoconfigure`
2. **Set meaningful span names** — Use descriptive operation names
3. **Add relevant attributes** — Include contextual data for debugging
4. **Handle exceptions** — Always record exceptions on spans
5. **Use semantic conventions** — Follow OpenTelemetry semantic conventions
6. **End spans in finally** — Ensure spans are always ended
7. **Use try-with-resources** — Manage scopes with the try-with-resources pattern

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-monitor-opentelemetry-exporter |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/monitor/azure-monitor-opentelemetry-exporter |
| Migration Guide | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-opentelemetry-exporter/MIGRATION.md |
| Autoconfigure Package | https://central.sonatype.com/artifact/com.azure/azure-monitor-opentelemetry-autoconfigure |
| OpenTelemetry Java | https://opentelemetry.io/docs/languages/java/ |
| Application Insights | https://learn.microsoft.com/azure/azure-monitor/app/app-insights-overview |

skills/official/microsoft/java/monitoring/query/SKILL.md (new file, 417 lines)

---
name: azure-monitor-query-java
description: |
  Azure Monitor Query SDK for Java. Execute Kusto queries against Log Analytics workspaces and query metrics from Azure resources.
  Triggers: "LogsQueryClient java", "MetricsQueryClient java", "kusto query java", "log analytics java", "azure monitor query java".
  Note: This package is deprecated. Migrate to azure-monitor-query-logs and azure-monitor-query-metrics.
package: com.azure:azure-monitor-query
---

# Azure Monitor Query SDK for Java

> **DEPRECATION NOTICE**: This package is deprecated in favor of:
> - `azure-monitor-query-logs` — For Log Analytics queries
> - `azure-monitor-query-metrics` — For metrics queries
>
> See migration guides: [Logs Migration](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-query-logs/migration-guide.md) | [Metrics Migration](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-query-metrics/migration-guide.md)

Client library for querying Azure Monitor Logs and Metrics.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-monitor-query</artifactId>
  <version>1.5.9</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.azure</groupId>
      <artifactId>azure-sdk-bom</artifactId>
      <version>{bom_version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-monitor-query</artifactId>
  </dependency>
</dependencies>
```

## Prerequisites

- Log Analytics workspace (for logs queries)
- Azure resource (for metrics queries)
- TokenCredential with appropriate permissions

## Environment Variables

```bash
LOG_ANALYTICS_WORKSPACE_ID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
AZURE_RESOURCE_ID=/subscriptions/{sub}/resourceGroups/{rg}/providers/{provider}/{resource}
```

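An Azure resource ID like the one above is a slash-delimited path of alternating segment names and values. A small sketch of pulling out its segments in plain Java (illustrative only — `ResourceIdDemo` is a hypothetical helper, not part of the SDK, and the name/value pairing is a simplification):

```java
import java.util.HashMap;
import java.util.Map;

public class ResourceIdDemo {
    // Extract "/subscriptions/{sub}/resourceGroups/{rg}/..." name/value segments.
    static Map<String, String> segments(String resourceId) {
        String[] parts = resourceId.split("/");
        Map<String, String> out = new HashMap<>();
        // Segments alternate name/value after the leading empty element.
        for (int i = 1; i + 1 < parts.length; i += 2) {
            out.put(parts[i], parts[i + 1]);
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> s = segments(
            "/subscriptions/0000/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/acct");
        System.out.println(s.get("resourceGroups")); // my-rg
    }
}
```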
## Client Creation

### LogsQueryClient (Sync)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.query.LogsQueryClient;
import com.azure.monitor.query.LogsQueryClientBuilder;

LogsQueryClient logsClient = new LogsQueryClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

### LogsQueryAsyncClient

```java
import com.azure.monitor.query.LogsQueryAsyncClient;

LogsQueryAsyncClient logsAsyncClient = new LogsQueryClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

### MetricsQueryClient (Sync)

```java
import com.azure.monitor.query.MetricsQueryClient;
import com.azure.monitor.query.MetricsQueryClientBuilder;

MetricsQueryClient metricsClient = new MetricsQueryClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

### MetricsQueryAsyncClient

```java
import com.azure.monitor.query.MetricsQueryAsyncClient;

MetricsQueryAsyncClient metricsAsyncClient = new MetricsQueryClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

### Sovereign Cloud Configuration

```java
// Azure China Cloud - Logs
LogsQueryClient logsClient = new LogsQueryClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint("https://api.loganalytics.azure.cn/v1")
    .buildClient();

// Azure China Cloud - Metrics
MetricsQueryClient metricsClient = new MetricsQueryClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint("https://management.chinacloudapi.cn")
    .buildClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Logs | Log and performance data from Azure resources via Kusto Query Language |
| Metrics | Numeric time-series data collected at regular intervals |
| Workspace ID | Log Analytics workspace identifier |
| Resource ID | Azure resource URI for metrics queries |
| QueryTimeInterval | Time range for the query |

## Logs Query Operations

### Basic Query

```java
import com.azure.monitor.query.models.LogsQueryResult;
import com.azure.monitor.query.models.LogsTableRow;
import com.azure.monitor.query.models.QueryTimeInterval;
import java.time.Duration;

LogsQueryResult result = logsClient.queryWorkspace(
    "{workspace-id}",
    "AzureActivity | summarize count() by ResourceGroup | top 10 by count_",
    new QueryTimeInterval(Duration.ofDays(7))
);

for (LogsTableRow row : result.getTable().getRows()) {
    System.out.println(row.getColumnValue("ResourceGroup") + ": " + row.getColumnValue("count_"));
}
```

### Query by Resource ID

```java
LogsQueryResult result = logsClient.queryResource(
    "{resource-id}",
    "AzureMetrics | where TimeGenerated > ago(1h)",
    new QueryTimeInterval(Duration.ofDays(1))
);

for (LogsTableRow row : result.getTable().getRows()) {
    System.out.println(row.getColumnValue("MetricName") + " " + row.getColumnValue("Average"));
}
```

### Map Results to Custom Model

```java
import java.util.List;

// Define model class
public class ActivityLog {
    private String resourceGroup;
    private String operationName;

    public String getResourceGroup() { return resourceGroup; }
    public String getOperationName() { return operationName; }
}

// Query with model mapping
List<ActivityLog> logs = logsClient.queryWorkspace(
    "{workspace-id}",
    "AzureActivity | project ResourceGroup, OperationName | take 100",
    new QueryTimeInterval(Duration.ofDays(2)),
    ActivityLog.class
);

for (ActivityLog log : logs) {
    System.out.println(log.getOperationName() + " - " + log.getResourceGroup());
}
```

### Batch Query

```java
import com.azure.monitor.query.models.LogsBatchQuery;
import com.azure.monitor.query.models.LogsBatchQueryResult;
import com.azure.monitor.query.models.LogsBatchQueryResultCollection;
import com.azure.monitor.query.models.LogsQueryResultStatus;
import com.azure.core.util.Context;

LogsBatchQuery batchQuery = new LogsBatchQuery();
String q1 = batchQuery.addWorkspaceQuery("{workspace-id}", "AzureActivity | count", new QueryTimeInterval(Duration.ofDays(1)));
String q2 = batchQuery.addWorkspaceQuery("{workspace-id}", "Heartbeat | count", new QueryTimeInterval(Duration.ofDays(1)));
String q3 = batchQuery.addWorkspaceQuery("{workspace-id}", "Perf | count", new QueryTimeInterval(Duration.ofDays(1)));

LogsBatchQueryResultCollection results = logsClient
    .queryBatchWithResponse(batchQuery, Context.NONE)
    .getValue();

LogsBatchQueryResult result1 = results.getResult(q1);
LogsBatchQueryResult result2 = results.getResult(q2);
LogsBatchQueryResult result3 = results.getResult(q3);

// Check for failures
if (result3.getQueryResultStatus() == LogsQueryResultStatus.FAILURE) {
    System.err.println("Query failed: " + result3.getError().getMessage());
}
```

### Query with Options

```java
import com.azure.monitor.query.models.LogsQueryOptions;
import com.azure.core.http.rest.Response;
import com.azure.core.util.BinaryData;

LogsQueryOptions options = new LogsQueryOptions()
    .setServerTimeout(Duration.ofMinutes(10))
    .setIncludeStatistics(true)
    .setIncludeVisualization(true);

Response<LogsQueryResult> response = logsClient.queryWorkspaceWithResponse(
    "{workspace-id}",
    "AzureActivity | summarize count() by bin(TimeGenerated, 1h)",
    new QueryTimeInterval(Duration.ofDays(7)),
    options,
    Context.NONE
);

LogsQueryResult result = response.getValue();

// Access statistics
BinaryData statistics = result.getStatistics();
// Access visualization data
BinaryData visualization = result.getVisualization();
```

### Query Multiple Workspaces

```java
import java.util.Arrays;

LogsQueryOptions options = new LogsQueryOptions()
    .setAdditionalWorkspaces(Arrays.asList("{workspace-id-2}", "{workspace-id-3}"));

Response<LogsQueryResult> response = logsClient.queryWorkspaceWithResponse(
    "{workspace-id-1}",
    "AzureActivity | summarize count() by TenantId",
    new QueryTimeInterval(Duration.ofDays(1)),
    options,
    Context.NONE
);
```

## Metrics Query Operations

### Basic Metrics Query

```java
import com.azure.monitor.query.models.MetricsQueryResult;
import com.azure.monitor.query.models.MetricResult;
import com.azure.monitor.query.models.TimeSeriesElement;
import com.azure.monitor.query.models.MetricValue;
import java.util.Arrays;

MetricsQueryResult result = metricsClient.queryResource(
    "{resource-uri}",
    Arrays.asList("SuccessfulCalls", "TotalCalls")
);

for (MetricResult metric : result.getMetrics()) {
    System.out.println("Metric: " + metric.getMetricName());
    for (TimeSeriesElement ts : metric.getTimeSeries()) {
        System.out.println("  Dimensions: " + ts.getMetadata());
        for (MetricValue value : ts.getValues()) {
            System.out.println("    " + value.getTimeStamp() + ": " + value.getTotal());
        }
    }
}
```

### Metrics with Aggregations

```java
import com.azure.monitor.query.models.MetricsQueryOptions;
import com.azure.monitor.query.models.AggregationType;

Response<MetricsQueryResult> response = metricsClient.queryResourceWithResponse(
    "{resource-id}",
    Arrays.asList("SuccessfulCalls", "TotalCalls"),
    new MetricsQueryOptions()
        .setGranularity(Duration.ofHours(1))
        .setAggregations(Arrays.asList(AggregationType.AVERAGE, AggregationType.COUNT)),
    Context.NONE
);

MetricsQueryResult result = response.getValue();
```

### Query Multiple Resources (MetricsClient)

```java
import com.azure.monitor.query.MetricsClient;
import com.azure.monitor.query.MetricsClientBuilder;
import com.azure.monitor.query.models.MetricsQueryResourcesResult;

MetricsClient metricsClient = new MetricsClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint("{endpoint}")
    .buildClient();

MetricsQueryResourcesResult result = metricsClient.queryResources(
    Arrays.asList("{resourceId1}", "{resourceId2}"),
    Arrays.asList("{metric1}", "{metric2}"),
    "{metricNamespace}"
);

for (MetricsQueryResult queryResult : result.getMetricsQueryResults()) {
    for (MetricResult metric : queryResult.getMetrics()) {
        System.out.println(metric.getMetricName());
        metric.getTimeSeries().stream()
            .flatMap(ts -> ts.getValues().stream())
            .forEach(mv -> System.out.println(
                mv.getTimeStamp() + " Count=" + mv.getCount() + " Avg=" + mv.getAverage()));
    }
}
```

## Response Structure

### Logs Response Hierarchy

```
LogsQueryResult
├── statistics (BinaryData)
├── visualization (BinaryData)
├── error
└── tables (List<LogsTable>)
    ├── name
    ├── columns (List<LogsTableColumn>)
    │   ├── name
    │   └── type
    └── rows (List<LogsTableRow>)
        ├── rowIndex
        └── rowCells (List<LogsTableCell>)
```

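The tables/columns/rows shape above flattens naturally into one map per row. A dependency-free sketch of that mapping (hypothetical `TableDemo`, not SDK code — it only mirrors the shape of `LogsTable`):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TableDemo {
    // Zip column names with each row's cells, mirroring LogsTable's shape.
    static List<Map<String, Object>> toMaps(List<String> columns, List<List<Object>> rows) {
        List<Map<String, Object>> out = new ArrayList<>();
        for (List<Object> row : rows) {
            Map<String, Object> m = new LinkedHashMap<>();
            for (int i = 0; i < columns.size(); i++) {
                m.put(columns.get(i), row.get(i));
            }
            out.add(m);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> maps = toMaps(
            List.of("ResourceGroup", "count_"),
            List.of(List.of("rg-prod", 42), List.of("rg-dev", 7)));
        System.out.println(maps.get(0).get("count_")); // 42
    }
}
```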
### Metrics Response Hierarchy

```
MetricsQueryResult
├── granularity
├── timeInterval
├── namespace
├── resourceRegion
└── metrics (List<MetricResult>)
    ├── id, name, type, unit
    └── timeSeries (List<TimeSeriesElement>)
        ├── metadata (dimensions)
        └── values (List<MetricValue>)
            ├── timeStamp
            ├── count, average, total
            └── maximum, minimum
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;
import com.azure.monitor.query.models.LogsQueryResultStatus;

try {
    LogsQueryResult result = logsClient.queryWorkspace(workspaceId, query, timeInterval);

    // Check partial failure
    if (result.getStatus() == LogsQueryResultStatus.PARTIAL_FAILURE) {
        System.err.println("Partial failure: " + result.getError().getMessage());
    }
} catch (HttpResponseException e) {
    System.err.println("Query failed: " + e.getMessage());
    System.err.println("Status: " + e.getResponse().getStatusCode());
}
```

## Best Practices

1. **Use batch queries** — Combine multiple queries into a single request
2. **Set appropriate timeouts** — Long queries may need extended server timeout
3. **Limit result size** — Use `top` or `take` in Kusto queries
4. **Use projections** — Select only needed columns with `project`
5. **Check query status** — Handle PARTIAL_FAILURE results gracefully
6. **Cache results** — Metrics don't change frequently; cache when appropriate
7. **Migrate to new packages** — Plan migration to `azure-monitor-query-logs` and `azure-monitor-query-metrics`

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-monitor-query |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/monitor/azure-monitor-query |
| API Reference | https://learn.microsoft.com/java/api/com.azure.monitor.query |
| Kusto Query Language | https://learn.microsoft.com/azure/data-explorer/kusto/query/ |
| Log Analytics Limits | https://learn.microsoft.com/azure/azure-monitor/service-limits#la-query-api |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-query/TROUBLESHOOTING.md |

skills/official/microsoft/plugins/wiki-architect/SKILL.md (new file, 60 lines)

---
name: wiki-architect
description: Analyzes code repositories and generates hierarchical documentation structures with onboarding guides. Use when the user wants to create a wiki, generate documentation, map a codebase structure, or understand a project's architecture at a high level.
---

# Wiki Architect

You are a documentation architect that produces structured wiki catalogues and onboarding guides from codebases.

## When to Activate

- User asks to "create a wiki", "document this repo", "generate docs"
- User wants to understand project structure or architecture
- User asks for a table of contents or documentation plan
- User asks for an onboarding guide or "zero to hero" path

## Procedure

1. **Scan** the repository file tree and README
2. **Detect** project type, languages, frameworks, architectural patterns, key technologies
3. **Identify** layers: presentation, business logic, data access, infrastructure
4. **Generate** a hierarchical JSON catalogue with:
   - **Onboarding**: Principal-Level Guide, Zero to Hero Guide
   - **Getting Started**: overview, setup, usage, quick reference
   - **Deep Dive**: architecture → subsystems → components → methods
5. **Cite** real files in every section prompt using `file_path:line_number`

## Onboarding Guide Architecture

The catalogue MUST include an Onboarding section (always first, uncollapsed) containing:

1. **Principal-Level Guide** — For senior/principal ICs. Dense, opinionated. Includes:
   - The ONE core architectural insight with pseudocode in a different language
   - System architecture Mermaid diagram, domain model ER diagram
   - Design tradeoffs, strategic direction, "where to go deep" reading order

2. **Zero-to-Hero Learning Path** — For newcomers. Progressive depth:
   - Part I: Language/framework/technology foundations with cross-language comparisons
   - Part II: This codebase's architecture and domain model
   - Part III: Dev setup, testing, codebase navigation, contributing
   - Appendices: 40+ term glossary, key file reference

## Language Detection

Detect primary language from file extensions and build files, then select a comparison language:

- C#/Java/Go/TypeScript → Python as comparison
- Python → JavaScript as comparison
- Rust → C++ or Go as comparison

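The comparison-language rules above amount to a lookup table with a fallback. A small sketch (the Python default for unlisted languages is an assumption, not stated by the skill):

```java
import java.util.Map;

public class ComparisonLanguage {
    static final Map<String, String> RULES = Map.of(
        "C#", "Python",
        "Java", "Python",
        "Go", "Python",
        "TypeScript", "Python",
        "Python", "JavaScript",
        "Rust", "C++"
    );

    // Fall back to Python for languages the rules don't list (assumption).
    static String comparisonFor(String primary) {
        return RULES.getOrDefault(primary, "Python");
    }

    public static void main(String[] args) {
        System.out.println(comparisonFor("Rust")); // C++
    }
}
```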
## Constraints

- Max nesting depth: 4 levels
- Max 8 children per section
- Small repos (≤10 files): Getting Started only (skip Deep Dive, still include onboarding)
- Every prompt must reference specific files
- Derive all titles from actual repository content — never use generic placeholders

## Output

JSON code block following the catalogue schema with `items[].children[]` structure, where each node has `title`, `name`, `prompt`, and `children` fields.

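The depth and fan-out constraints on the catalogue tree can be checked mechanically. A sketch of such a validator (the `Node` type is hypothetical; the limits come from the constraints above):

```java
import java.util.ArrayList;
import java.util.List;

public class CatalogueValidator {
    static final int MAX_DEPTH = 4;    // max nesting depth
    static final int MAX_CHILDREN = 8; // max children per section

    static class Node {
        final String title;
        final List<Node> children = new ArrayList<>();
        Node(String title) { this.title = title; }
    }

    // Returns true if every node respects the depth and fan-out limits.
    static boolean valid(Node node, int depth) {
        if (depth > MAX_DEPTH || node.children.size() > MAX_CHILDREN) {
            return false;
        }
        for (Node child : node.children) {
            if (!valid(child, depth + 1)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        Node root = new Node("Getting Started");
        root.children.add(new Node("Overview"));
        System.out.println(valid(root, 1)); // true
    }
}
```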
skills/official/microsoft/plugins/wiki-changelog/SKILL.md (new file, 27 lines)

---
name: wiki-changelog
description: Analyzes git commit history and generates structured changelogs categorized by change type. Use when the user asks about recent changes, wants a changelog, or needs to understand what changed in the repository.
---

# Wiki Changelog

Generate structured changelogs from git history.

## When to Activate

- User asks "what changed recently", "generate a changelog", "summarize commits"
- User wants to understand recent development activity

## Procedure

1. Examine git log (commits, dates, authors, messages)
2. Group by time period: daily (last 7 days), weekly (older)
3. Classify each commit: Features (🆕), Fixes (🐛), Refactoring (🔄), Docs (📝), Config (🔧), Dependencies (📦), Breaking (⚠️)
4. Generate concise user-facing descriptions using project terminology

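Step 3's classification can be approximated with keyword matching on conventional-commit prefixes. A rough sketch (the keyword rules are assumptions for illustration, not part of the skill):

```java
public class CommitClassifier {
    // Map a commit message to a changelog category via common prefixes.
    static String classify(String message) {
        String m = message.toLowerCase();
        if (m.contains("breaking") || m.contains("!:")) return "Breaking";
        if (m.startsWith("feat")) return "Features";
        if (m.startsWith("fix")) return "Fixes";
        if (m.startsWith("refactor")) return "Refactoring";
        if (m.startsWith("docs")) return "Docs";
        if (m.startsWith("chore(deps") || m.startsWith("build(deps")) return "Dependencies";
        if (m.startsWith("chore") || m.startsWith("ci")) return "Config";
        return "Other";
    }

    public static void main(String[] args) {
        System.out.println(classify("feat: add sync script"));           // Features
        System.out.println(classify("fix: handle @ in frontmatter"));    // Fixes
    }
}
```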
## Constraints

- Focus on user-facing changes
- Merge related commits into coherent descriptions
- Use project terminology from README
- Highlight breaking changes prominently with migration notes

skills/official/microsoft/plugins/wiki-onboarding/SKILL.md (new file, 77 lines)

---
name: wiki-onboarding
description: Generates two complementary onboarding guides — a Principal-Level architectural deep-dive and a Zero-to-Hero contributor walkthrough. Use when the user wants onboarding documentation for a codebase.
---

# Wiki Onboarding Guide Generator

Generate two complementary onboarding documents that together give any engineer — from newcomer to principal — a complete understanding of a codebase.

## When to Activate

- User asks for onboarding docs or getting-started guides
- User runs `/deep-wiki:onboard` command
- User wants to help new team members understand a codebase

## Language Detection

Scan the repository for build files to determine the primary language for code examples:

- `package.json` / `tsconfig.json` → TypeScript/JavaScript
- `*.csproj` / `*.sln` → C# / .NET
- `Cargo.toml` → Rust
- `pyproject.toml` / `setup.py` / `requirements.txt` → Python
- `go.mod` → Go
- `pom.xml` / `build.gradle` → Java

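The build-file table above maps directly to a first-match lookup over the repository's file names. A sketch (the precedence ordering is an assumption — the skill does not specify one):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class BuildFileDetector {
    // First match wins; insertion order defines the (assumed) precedence.
    static final Map<String, String> MARKERS = new LinkedHashMap<>();
    static {
        MARKERS.put("package.json", "TypeScript/JavaScript");
        MARKERS.put("tsconfig.json", "TypeScript/JavaScript");
        MARKERS.put("Cargo.toml", "Rust");
        MARKERS.put("pyproject.toml", "Python");
        MARKERS.put("setup.py", "Python");
        MARKERS.put("requirements.txt", "Python");
        MARKERS.put("go.mod", "Go");
        MARKERS.put("pom.xml", "Java");
        MARKERS.put("build.gradle", "Java");
    }

    static String detect(List<String> fileNames) {
        for (Map.Entry<String, String> e : MARKERS.entrySet()) {
            if (fileNames.contains(e.getKey())) return e.getValue();
        }
        // Wildcard markers like *.csproj / *.sln need a suffix check.
        for (String f : fileNames) {
            if (f.endsWith(".csproj") || f.endsWith(".sln")) return "C# / .NET";
        }
        return "Unknown";
    }

    public static void main(String[] args) {
        System.out.println(detect(List.of("go.mod", "README.md"))); // Go
    }
}
```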
## Guide 1: Principal-Level Onboarding

**Audience**: Senior/staff+ engineers who need the "why" behind decisions.

### Required Sections

1. **System Philosophy & Design Principles** — What invariants does the system maintain? What were the key design choices and why?
2. **Architecture Overview** — Component map with Mermaid diagram. What owns what, communication patterns.
3. **Key Abstractions & Interfaces** — The load-bearing abstractions everything depends on
4. **Decision Log** — Major architectural decisions with context, alternatives considered, trade-offs
5. **Dependency Rationale** — Why each major dependency was chosen, what it replaced
6. **Data Flow & State** — How data moves through the system (traced from actual code, not guessed)
7. **Failure Modes & Error Handling** — What breaks, how errors propagate, recovery patterns
8. **Performance Characteristics** — Bottlenecks, scaling limits, hot paths
9. **Security Model** — Auth, authorization, trust boundaries, data sensitivity
10. **Testing Strategy** — What's tested, what isn't, testing philosophy
11. **Operational Concerns** — Deployment, monitoring, feature flags, configuration
12. **Known Technical Debt** — Honest assessment of shortcuts and their risks

### Rules

- Every claim backed by `(file_path:line_number)` citation
- Minimum 3 Mermaid diagrams (architecture, data flow, dependency graph)
- All Mermaid diagrams use dark-mode colors (see wiki-vitepress skill)
- Focus on WHY decisions were made, not just WHAT exists

## Guide 2: Zero-to-Hero Contributor Guide
|
||||
|
||||
**Audience**: New contributors who need step-by-step practical guidance.
|
||||
|
||||
### Required Sections
|
||||
|
||||
1. **What This Project Does** — 2-3 sentence elevator pitch
|
||||
2. **Prerequisites** — Tools, versions, accounts needed
|
||||
3. **Environment Setup** — Step-by-step with exact commands, expected output at each step
|
||||
4. **Project Structure** — Annotated directory tree (what lives where and why)
|
||||
5. **Your First Task** — End-to-end walkthrough of adding a simple feature
|
||||
6. **Development Workflow** — Branch strategy, commit conventions, PR process
|
||||
7. **Running Tests** — How to run tests, what to test, how to add a test
|
||||
8. **Debugging Guide** — Common issues and how to diagnose them
|
||||
9. **Key Concepts** — Domain-specific terminology explained with code examples
|
||||
10. **Code Patterns** — "If you want to add X, follow this pattern" templates
|
||||
11. **Common Pitfalls** — Mistakes every new contributor makes and how to avoid them
|
||||
12. **Where to Get Help** — Communication channels, documentation, key contacts
|
||||
13. **Glossary** — Terms used in the codebase that aren't obvious
|
||||
14. **Quick Reference Card** — Cheat sheet of most-used commands and patterns
|
||||
|
||||
### Rules
|
||||
- All code examples in the detected primary language
|
||||
- Every command must be copy-pasteable
|
||||
- Include expected output for verification steps
|
||||
- Use Mermaid for workflow diagrams (dark-mode colors)
|
||||
- Ground all claims in actual code — cite `(file_path:line_number)`
|
||||
skills/official/microsoft/plugins/wiki-page-writer/SKILL.md (new file, 65 lines)

---
name: wiki-page-writer
description: Generates rich technical documentation pages with dark-mode Mermaid diagrams, source code citations, and first-principles depth. Use when writing documentation, generating wiki pages, creating technical deep-dives, or documenting specific components or systems.
---

# Wiki Page Writer

You are a senior documentation engineer who generates comprehensive technical documentation pages with evidence-based depth.

## When to Activate

- User asks to document a specific component, system, or feature
- User wants a technical deep-dive with diagrams
- A wiki catalogue section needs its content generated

## Depth Requirements (NON-NEGOTIABLE)

1. **TRACE ACTUAL CODE PATHS** — Do not guess from file names. Read the implementation.
2. **EVERY CLAIM NEEDS A SOURCE** — File path + function/class name.
3. **DISTINGUISH FACT FROM INFERENCE** — If you read the code, say so. If inferring, mark it.
4. **FIRST PRINCIPLES** — Explain WHY something exists before WHAT it does.
5. **NO HAND-WAVING** — Don't say "this likely handles..." — read the code.

## Procedure

1. **Plan**: Determine scope, audience, and documentation budget based on file count
2. **Analyze**: Read all relevant files; identify patterns, algorithms, dependencies, data flow
3. **Write**: Generate structured Markdown with diagrams and citations
4. **Validate**: Verify that file paths exist, class names are accurate, and Mermaid renders correctly

## Mandatory Requirements

### VitePress Frontmatter

Every page must have:

```yaml
---
title: "Page Title"
description: "One-line description"
---
```

### Mermaid Diagrams

- **Minimum 2 per page**
- Use `autonumber` in all `sequenceDiagram` blocks
- Choose appropriate types: `graph`, `sequenceDiagram`, `classDiagram`, `stateDiagram-v2`, `erDiagram`, `flowchart`
- **Dark-mode colors (MANDATORY)**: node fills `#2d333b`, borders `#6d5dfc`, text `#e6edf3`
- Subgraph backgrounds: `#161b22`, borders `#30363d`, lines `#8b949e`
- If using inline `style`, use dark fills with `,color:#e6edf3`
- Do NOT use `<br/>` (use `<br>` or line breaks)

### Citations

- Every non-trivial claim needs `(file_path:line_number)`
- Minimum 5 different source files cited per page
- If evidence is missing: `(Unknown – verify in path/to/check)`
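The two measurable rules above (every claim cited, at least 5 distinct files) can be spot-checked mechanically. A minimal sketch, assuming citations follow the `(file_path:line_number)` shape exactly:

```python
import re

# Matches citations of the form (path/to/file.ext:123), per the rule above.
CITATION = re.compile(r"\(([^\s():]+):(\d+)\)")

def distinct_cited_files(markdown: str) -> set:
    """Return the set of distinct file paths cited in a wiki page."""
    return {path for path, _line in CITATION.findall(markdown)}

def meets_citation_minimum(markdown: str, minimum: int = 5) -> bool:
    """Check the 'minimum N different source files cited' rule."""
    return len(distinct_cited_files(markdown)) >= minimum
```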
### Structure

- Overview (explain WHY) → Architecture → Components → Data Flow → Implementation → References
- Use Markdown tables for APIs, configs, and component summaries
- Use comparison tables when introducing technologies
- Include pseudocode in a familiar language when explaining complex code paths

### VitePress Compatibility

- Escape bare generics outside code fences: `` `List<T>` ``, not bare `List<T>`
- No `<br/>` in Mermaid blocks
- All hex colors must be 3 or 6 digits
skills/official/microsoft/plugins/wiki-qa/SKILL.md (new file, 34 lines)

---
name: wiki-qa
description: Answers questions about a code repository using source file analysis. Use when the user asks a question about how something works, wants to understand a component, or needs help navigating the codebase.
---

# Wiki Q&A

Answer repository questions grounded entirely in source code evidence.

## When to Activate

- User asks a question about the codebase
- User wants to understand a specific file, function, or component
- User asks "how does X work" or "where is Y defined"

## Procedure

1. Detect the language of the question; respond in the same language
2. Search the codebase for relevant files
3. Read those files to gather evidence
4. Synthesize an answer with inline citations

## Response Format

- Use `##` headings, code blocks with language tags, tables, and bullet lists
- Cite sources inline: `(src/path/file.ts:42)`
- Include a "Key Files" table mapping files to their roles
- If information is insufficient, say so and suggest files to examine

## Rules

- ONLY use information from actual source files
- NEVER invent, guess, or use external knowledge
- Think step by step before answering
skills/official/microsoft/plugins/wiki-researcher/SKILL.md (new file, 65 lines)

---
name: wiki-researcher
description: Conducts multi-turn iterative deep research on specific topics within a codebase with zero tolerance for shallow analysis. Use when the user wants an in-depth investigation, needs to understand how something works across multiple files, or asks for comprehensive analysis of a specific system or pattern.
---

# Wiki Researcher

You are an expert software engineer and systems analyst. Your job is to deeply understand codebases, tracing actual code paths and grounding every claim in evidence.

## When to Activate

- User asks "how does X work" with an expectation of depth
- User wants to understand a complex system spanning many files
- User asks for architectural analysis or pattern investigation

## Core Invariants (NON-NEGOTIABLE)

### Depth Before Breadth

- **TRACE ACTUAL CODE PATHS** — don't guess from file names or conventions
- **READ THE REAL IMPLEMENTATION** — don't summarize what you think it probably does
- **FOLLOW THE CHAIN** — if A calls B calls C, trace it all the way down
- **DISTINGUISH FACT FROM INFERENCE** — "I read this" vs. "I'm inferring because..."

### Zero Tolerance for Shallow Research

- **NO vibes-based diagrams** — every box and arrow corresponds to real code you've read
- **NO assumed patterns** — don't say "this follows MVC" unless you've verified where the M, V, and C live
- **NO skipped layers** — if asked how data flows from A to Z, trace every hop
- **NO confident unknowns** — if you haven't read it, say "I haven't traced this yet"

### Evidence Standard

| Claim Type | Required Evidence |
|---|---|
| "X calls Y" | File path + function name |
| "Data flows through Z" | Trace: entry point → transformations → destination |
| "This is the main entry point" | Where it's invoked (config, main, route registration) |
| "These modules are coupled" | Import/dependency chain |
| "This is dead code" | Show that no call sites exist |

## Process: 5 Iterations

Each iteration takes a different lens and builds on all prior findings:

1. **Structural/architectural view** — map the landscape, identify components and entry points
2. **Data flow / state management view** — trace data through the system
3. **Integration / dependency view** — external connections, API contracts
4. **Pattern / anti-pattern view** — design patterns, trade-offs, technical debt, risks
5. **Synthesis / recommendations** — combine all findings, provide actionable insights

### For Every Significant Finding

1. **State the finding** — one clear sentence
2. **Show the evidence** — file paths, code references, call chains
3. **Explain the implication** — why does this matter?
4. **Rate confidence** — HIGH (read the code), MEDIUM (read some, inferred the rest), LOW (inferred from structure)
5. **Flag open questions** — what would you need to trace next?

## Rules

- NEVER repeat findings from prior iterations
- ALWAYS cite files: `(file_path:line_number)`
- ALWAYS provide substantive analysis — never just "continuing..."
- Include Mermaid diagrams (dark-mode colors) when they clarify architecture or flow
- Stay focused on the specific topic
- Flag what you HAVEN'T explored — state the boundaries of your knowledge at all times
skills/official/microsoft/plugins/wiki-vitepress/SKILL.md (new file, 148 lines)

---
name: wiki-vitepress
description: Packages generated wiki Markdown into a VitePress static site with a dark theme, dark-mode Mermaid diagrams with click-to-zoom, and production build output. Use when the user wants to create a browsable website from generated wiki pages.
---

# Wiki VitePress Packager

Transform generated wiki Markdown files into a polished VitePress static site with a dark theme and interactive Mermaid diagrams.

## When to Activate

- User asks to "build a site" or "package as VitePress"
- User runs the `/deep-wiki:build` command
- User wants browsable HTML output from generated wiki pages

## VitePress Scaffolding

Generate the following structure in a `wiki-site/` directory:

```
wiki-site/
├── .vitepress/
│   ├── config.mts
│   └── theme/
│       ├── index.ts
│       └── custom.css
├── public/
├── [generated .md pages]
├── package.json
└── index.md
```
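A minimal sketch of creating this skeleton from Python, for illustration only — the file names come from the tree above, and the files are created empty (real contents come from the config and theme sections that follow):

```python
from pathlib import Path

# Files from the scaffold tree above; created empty as placeholders.
SCAFFOLD = [
    ".vitepress/config.mts",
    ".vitepress/theme/index.ts",
    ".vitepress/theme/custom.css",
    "package.json",
    "index.md",
]

def scaffold_site(root: str = "wiki-site") -> list:
    """Create the wiki-site/ skeleton, returning the file paths created."""
    base = Path(root)
    (base / "public").mkdir(parents=True, exist_ok=True)
    created = []
    for rel in SCAFFOLD:
        path = base / rel
        path.parent.mkdir(parents=True, exist_ok=True)
        path.touch()
        created.append(path)
    return created
```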
## Config Requirements (`config.mts`)

- Use the `withMermaid` wrapper from `vitepress-plugin-mermaid`
- Set `appearance: 'dark'` for a dark-only theme
- Configure `themeConfig.nav` and `themeConfig.sidebar` from the catalogue structure
- The Mermaid config must set dark theme variables:

```typescript
mermaid: {
  theme: 'dark',
  themeVariables: {
    primaryColor: '#1e3a5f',
    primaryTextColor: '#e0e0e0',
    primaryBorderColor: '#4a9eed',
    lineColor: '#4a9eed',
    secondaryColor: '#2d4a3e',
    tertiaryColor: '#2d2d3d',
    background: '#1a1a2e',
    mainBkg: '#1e3a5f',
    nodeBorder: '#4a9eed',
    clusterBkg: '#16213e',
    titleColor: '#e0e0e0',
    edgeLabelBackground: '#1a1a2e'
  }
}
```

## Dark-Mode Mermaid: Three-Layer Fix

### Layer 1: Theme Variables (in `config.mts`)

Set via `mermaid.themeVariables` as shown above.

### Layer 2: CSS Overrides (`custom.css`)

Target Mermaid SVG elements with `!important`:

```css
.mermaid .node rect,
.mermaid .node circle,
.mermaid .node polygon { fill: #1e3a5f !important; stroke: #4a9eed !important; }
.mermaid .edgeLabel { background-color: #1a1a2e !important; color: #e0e0e0 !important; }
.mermaid text { fill: #e0e0e0 !important; }
.mermaid .label { color: #e0e0e0 !important; }
```

### Layer 3: Inline Style Replacement (`theme/index.ts`)

Mermaid inline `style` attributes override everything. Use `onMounted` plus polling to replace them:

```typescript
import { onMounted } from 'vue'

// In setup()
onMounted(() => {
  let attempts = 0
  const fix = setInterval(() => {
    document.querySelectorAll('.mermaid svg [style]').forEach(el => {
      const s = (el as HTMLElement).style
      if (s.fill && !s.fill.includes('#1e3a5f')) s.fill = '#1e3a5f'
      if (s.stroke && !s.stroke.includes('#4a9eed')) s.stroke = '#4a9eed'
      if (s.color) s.color = '#e0e0e0'
    })
    if (++attempts >= 20) clearInterval(fix)
  }, 500)
})
```

Use `setup()` with `onMounted`, NOT `enhanceApp()` — the DOM doesn't exist during SSR.

## Click-to-Zoom for Mermaid Diagrams

Wrap each `.mermaid` container in a clickable wrapper that opens a fullscreen modal:

```typescript
document.querySelectorAll<HTMLElement>('.mermaid').forEach(el => {
  el.style.cursor = 'zoom-in'
  el.addEventListener('click', () => {
    const modal = document.createElement('div')
    modal.className = 'mermaid-zoom-modal'
    modal.innerHTML = el.outerHTML
    modal.addEventListener('click', () => modal.remove())
    document.body.appendChild(modal)
  })
})
```

Modal CSS:

```css
.mermaid-zoom-modal {
  position: fixed; inset: 0;
  background: rgba(0,0,0,0.9);
  display: flex; align-items: center; justify-content: center;
  z-index: 9999; cursor: zoom-out;
}
.mermaid-zoom-modal .mermaid { transform: scale(1.5); }
```

## Post-Processing Rules

Before the VitePress build, scan all `.md` files and fix:

- Replace `<br/>` with `<br>` (Vue template compiler compatibility)
- Wrap bare `<T>` generic parameters in backticks outside code fences
- Ensure every page has YAML frontmatter with `title` and `description`
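The three rules above can be sketched as a single post-processing pass. This is a minimal illustration, not the repository's actual implementation: the generic-wrapping regex is a simplification that only handles single-letter type parameters outside fences:

```python
import re

def postprocess_page(text: str, title: str = "Untitled") -> str:
    """Apply the three pre-build fixes above to one Markdown page."""
    out = []
    in_fence = False
    for line in text.splitlines():
        if line.lstrip().startswith("```"):
            in_fence = not in_fence  # track fenced code blocks
            out.append(line)
            continue
        if not in_fence:
            line = line.replace("<br/>", "<br>")  # Vue compatibility
            # Wrap bare single-letter generics like <T> in backticks
            # (simplified; real pages may need a broader pattern).
            line = re.sub(r"(?<!`)<([A-Z])>(?!`)", r"`<\1>`", line)
        out.append(line)
    result = "\n".join(out)
    # Ensure YAML frontmatter with title and description exists.
    if not result.startswith("---"):
        result = f'---\ntitle: "{title}"\ndescription: ""\n---\n\n' + result
    return result
```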
## Build

```bash
cd wiki-site && npm install && npm run docs:build
```

Output goes to `wiki-site/.vitepress/dist/`.

## Known Gotchas

- Mermaid renders asynchronously — SVGs don't exist yet when `onMounted` fires. You must poll.
- The `isCustomElement` compiler option for bare `<T>` causes worse crashes — do NOT use it
- Node text in Mermaid uses inline `style` with the highest specificity — CSS alone won't fix it
- `enhanceApp()` runs during SSR, where `document` doesn't exist — use `setup()` only
skills/official/microsoft/python/compute/botservice/SKILL.md (new file, 320 lines)

---
name: azure-mgmt-botservice-py
description: |
  Azure Bot Service Management SDK for Python. Use for creating, managing, and configuring Azure Bot Service resources.
  Triggers: "azure-mgmt-botservice", "AzureBotService", "bot management", "conversational AI", "bot channels".
---

# Azure Bot Service Management SDK for Python

Manage Azure Bot Service resources, including bots, channels, and connections.

## Installation

```bash
pip install azure-mgmt-botservice
pip install azure-identity
```

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
```

## Authentication

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.botservice import AzureBotService

credential = DefaultAzureCredential()
client = AzureBotService(
    credential=credential,
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"]
)
```

## Create a Bot

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.botservice import AzureBotService
from azure.mgmt.botservice.models import Bot, BotProperties, Sku

credential = DefaultAzureCredential()
client = AzureBotService(
    credential=credential,
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"]
)

resource_group = os.environ["AZURE_RESOURCE_GROUP"]
bot_name = "my-chat-bot"

bot = client.bots.create(
    resource_group_name=resource_group,
    resource_name=bot_name,
    parameters=Bot(
        location="global",
        sku=Sku(name="F0"),  # Free tier
        kind="azurebot",
        properties=BotProperties(
            display_name="My Chat Bot",
            description="A conversational AI bot",
            endpoint="https://my-bot-app.azurewebsites.net/api/messages",
            msa_app_id="<your-app-id>",
            msa_app_type="MultiTenant"
        )
    )
)

print(f"Bot created: {bot.name}")
```

## Get Bot Details

```python
bot = client.bots.get(
    resource_group_name=resource_group,
    resource_name=bot_name
)

print(f"Bot: {bot.properties.display_name}")
print(f"Endpoint: {bot.properties.endpoint}")
print(f"SKU: {bot.sku.name}")
```

## List Bots in a Resource Group

```python
bots = client.bots.list_by_resource_group(resource_group_name=resource_group)

for bot in bots:
    print(f"Bot: {bot.name} - {bot.properties.display_name}")
```

## List All Bots in a Subscription

```python
all_bots = client.bots.list()

for bot in all_bots:
    # Resource IDs look like /subscriptions/{sub}/resourceGroups/{rg}/...,
    # so index 4 of the split is the resource group name.
    print(f"Bot: {bot.name} in {bot.id.split('/')[4]}")
```
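The `bot.id.split('/')[4]` trick above works because ARM resource IDs have a fixed shape (`/subscriptions/{sub}/resourceGroups/{rg}/providers/...`). A small, pure helper — hypothetical, not part of the SDK — makes the intent explicit and tolerates trailing segments:

```python
def resource_group_of(resource_id: str) -> str:
    """Extract the resource group name from an ARM resource ID.

    IDs look like:
    /subscriptions/{sub}/resourceGroups/{rg}/providers/{ns}/{type}/{name}
    """
    parts = resource_id.strip("/").split("/")
    # parts alternate key/value: ['subscriptions', sub, 'resourceGroups', rg, ...]
    for key, value in zip(parts[::2], parts[1::2]):
        if key.lower() == "resourcegroups":
            return value
    raise ValueError(f"No resource group segment in: {resource_id}")
```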
## Update a Bot

```python
bot = client.bots.update(
    resource_group_name=resource_group,
    resource_name=bot_name,
    properties=BotProperties(
        display_name="Updated Bot Name",
        description="Updated description"
    )
)
```

## Delete a Bot

```python
client.bots.delete(
    resource_group_name=resource_group,
    resource_name=bot_name
)
```

## Configure Channels

### Add a Teams Channel

```python
from azure.mgmt.botservice.models import (
    BotChannel,
    MsTeamsChannel,
    MsTeamsChannelProperties
)

channel = client.channels.create(
    resource_group_name=resource_group,
    resource_name=bot_name,
    channel_name="MsTeamsChannel",
    parameters=BotChannel(
        location="global",
        properties=MsTeamsChannel(
            properties=MsTeamsChannelProperties(
                is_enabled=True
            )
        )
    )
)
```

### Add a Direct Line Channel

```python
from azure.mgmt.botservice.models import (
    BotChannel,
    DirectLineChannel,
    DirectLineChannelProperties,
    DirectLineSite
)

channel = client.channels.create(
    resource_group_name=resource_group,
    resource_name=bot_name,
    channel_name="DirectLineChannel",
    parameters=BotChannel(
        location="global",
        properties=DirectLineChannel(
            properties=DirectLineChannelProperties(
                sites=[
                    DirectLineSite(
                        site_name="Default Site",
                        is_enabled=True,
                        is_v1_enabled=False,
                        is_v3_enabled=True
                    )
                ]
            )
        )
    )
)
```

### Add a Web Chat Channel

```python
from azure.mgmt.botservice.models import (
    BotChannel,
    WebChatChannel,
    WebChatChannelProperties,
    WebChatSite
)

channel = client.channels.create(
    resource_group_name=resource_group,
    resource_name=bot_name,
    channel_name="WebChatChannel",
    parameters=BotChannel(
        location="global",
        properties=WebChatChannel(
            properties=WebChatChannelProperties(
                sites=[
                    WebChatSite(
                        site_name="Default Site",
                        is_enabled=True
                    )
                ]
            )
        )
    )
)
```

## Get Channel Details

```python
channel = client.channels.get(
    resource_group_name=resource_group,
    resource_name=bot_name,
    channel_name="DirectLineChannel"
)
```

## List Channel Keys

```python
keys = client.channels.list_with_keys(
    resource_group_name=resource_group,
    resource_name=bot_name,
    channel_name="DirectLineChannel"
)

# Access the Direct Line keys
if hasattr(keys.properties, 'properties'):
    for site in keys.properties.properties.sites:
        print(f"Site: {site.site_name}")
        print(f"Key: {site.key}")
```

## Bot Connections (OAuth)

### Create a Connection Setting

```python
from azure.mgmt.botservice.models import (
    ConnectionSetting,
    ConnectionSettingProperties
)

connection = client.bot_connection.create(
    resource_group_name=resource_group,
    resource_name=bot_name,
    connection_name="graph-connection",
    parameters=ConnectionSetting(
        location="global",
        properties=ConnectionSettingProperties(
            client_id="<oauth-client-id>",
            client_secret="<oauth-client-secret>",
            scopes="User.Read",
            service_provider_id="<service-provider-id>"
        )
    )
)
```

### List Connections

```python
connections = client.bot_connection.list_by_bot_service(
    resource_group_name=resource_group,
    resource_name=bot_name
)

for conn in connections:
    print(f"Connection: {conn.name}")
```

## Client Operations

| Operations Group | Purpose |
|-----------|--------|
| `client.bots` | Bot CRUD operations |
| `client.channels` | Channel configuration |
| `client.bot_connection` | OAuth connection settings |
| `client.direct_line` | Direct Line channel operations |
| `client.email` | Email channel operations |
| `client.operations` | Available operations |
| `client.host_settings` | Host settings operations |

## SKU Options

| SKU | Description |
|-----|-------------|
| `F0` | Free tier (limited messages) |
| `S1` | Standard tier (unlimited messages) |

## Channel Types

| Class | Channel | Purpose |
|---------|-------|---------|
| `MsTeamsChannel` | Microsoft Teams | Teams integration |
| `DirectLineChannel` | Direct Line | Custom client integration |
| `WebChatChannel` | Web Chat | Embeddable web widget |
| `SlackChannel` | Slack | Slack workspace integration |
| `FacebookChannel` | Facebook | Messenger integration |
| `EmailChannel` | Email | Email communication |

## Best Practices

1. **Use DefaultAzureCredential** for authentication
2. **Start with the F0 SKU** for development; upgrade to S1 for production
3. **Store the MSA App ID/secret securely** — use Key Vault
4. **Enable only the channels you need** — this reduces the attack surface
5. **Rotate Direct Line keys** periodically
6. **Use managed identity** where possible for bot connections
7. **Configure proper CORS** for the Web Chat channel
(new file, 252 lines)
---
|
||||
name: azure-containerregistry-py
|
||||
description: |
|
||||
Azure Container Registry SDK for Python. Use for managing container images, artifacts, and repositories.
|
||||
Triggers: "azure-containerregistry", "ContainerRegistryClient", "container images", "docker registry", "ACR".
|
||||
package: azure-containerregistry
|
||||
---
|
||||
|
||||
# Azure Container Registry SDK for Python
|
||||
|
||||
Manage container images, artifacts, and repositories in Azure Container Registry.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
pip install azure-containerregistry
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_CONTAINERREGISTRY_ENDPOINT=https://<registry-name>.azurecr.io
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### Entra ID (Recommended)
|
||||
|
||||
```python
|
||||
from azure.containerregistry import ContainerRegistryClient
|
||||
from azure.identity import DefaultAzureCredential
|
||||
|
||||
client = ContainerRegistryClient(
|
||||
endpoint=os.environ["AZURE_CONTAINERREGISTRY_ENDPOINT"],
|
||||
credential=DefaultAzureCredential()
|
||||
)
|
||||
```
|
||||
|
||||
### Anonymous Access (Public Registry)
|
||||
|
||||
```python
|
||||
from azure.containerregistry import ContainerRegistryClient
|
||||
|
||||
client = ContainerRegistryClient(
|
||||
endpoint="https://mcr.microsoft.com",
|
||||
credential=None,
|
||||
audience="https://mcr.microsoft.com"
|
||||
)
|
||||
```
|
||||
|
||||
## List Repositories
|
||||
|
||||
```python
|
||||
client = ContainerRegistryClient(endpoint, DefaultAzureCredential())
|
||||
|
||||
for repository in client.list_repository_names():
|
||||
print(repository)
|
||||
```
|
||||
|
||||
## Repository Operations
|
||||
|
||||
### Get Repository Properties
|
||||
|
||||
```python
|
||||
properties = client.get_repository_properties("my-image")
|
||||
print(f"Created: {properties.created_on}")
|
||||
print(f"Modified: {properties.last_updated_on}")
|
||||
print(f"Manifests: {properties.manifest_count}")
|
||||
print(f"Tags: {properties.tag_count}")
|
||||
```
|
||||
|
||||
### Update Repository Properties
|
||||
|
||||
```python
|
||||
from azure.containerregistry import RepositoryProperties
|
||||
|
||||
client.update_repository_properties(
|
||||
"my-image",
|
||||
properties=RepositoryProperties(
|
||||
can_delete=False,
|
||||
can_write=False
|
||||
)
|
||||
)
|
||||
```
|
||||
|
||||
### Delete Repository
|
||||
|
||||
```python
|
||||
client.delete_repository("my-image")
|
||||
```
|
||||
|
||||
## List Tags
|
||||
|
||||
```python
|
||||
for tag in client.list_tag_properties("my-image"):
|
||||
print(f"{tag.name}: {tag.created_on}")
|
||||
```
|
||||
|
||||
### Filter by Order
|
||||
|
||||
```python
|
||||
from azure.containerregistry import ArtifactTagOrder
|
||||
|
||||
# Most recent first
|
||||
for tag in client.list_tag_properties(
|
||||
"my-image",
|
||||
order_by=ArtifactTagOrder.LAST_UPDATED_ON_DESCENDING
|
||||
):
|
||||
print(f"{tag.name}: {tag.last_updated_on}")
|
||||
```
|
||||
|
||||
## Manifest Operations
|
||||
|
||||
### List Manifests
|
||||
|
||||
```python
|
||||
from azure.containerregistry import ArtifactManifestOrder
|
||||
|
||||
for manifest in client.list_manifest_properties(
|
||||
"my-image",
|
||||
order_by=ArtifactManifestOrder.LAST_UPDATED_ON_DESCENDING
|
||||
):
|
||||
print(f"Digest: {manifest.digest}")
|
||||
print(f"Tags: {manifest.tags}")
|
||||
print(f"Size: {manifest.size_in_bytes}")
|
||||
```
|
||||
|
||||
### Get Manifest Properties
|
||||
|
||||
```python
|
||||
manifest = client.get_manifest_properties("my-image", "latest")
|
||||
print(f"Digest: {manifest.digest}")
|
||||
print(f"Architecture: {manifest.architecture}")
|
||||
print(f"OS: {manifest.operating_system}")
|
||||
```
|
||||
|
||||
### Update Manifest Properties
|
||||
|
||||
```python
|
||||
from azure.containerregistry import ArtifactManifestProperties
|
||||
|
||||
client.update_manifest_properties(
|
||||
"my-image",
|
||||
"latest",
|
||||
properties=ArtifactManifestProperties(
|
||||
can_delete=False,
|
||||
can_write=False
|
||||
)
|
||||
)
|
||||
```
|
||||
|
||||
### Delete Manifest
|
||||
|
||||
```python
|
||||
# Delete by digest
|
||||
client.delete_manifest("my-image", "sha256:abc123...")
|
||||
|
||||
# Delete by tag
|
||||
manifest = client.get_manifest_properties("my-image", "old-tag")
|
||||
client.delete_manifest("my-image", manifest.digest)
|
||||
```
|
||||
|
||||
## Tag Operations
|
||||
|
||||
### Get Tag Properties
|
||||
|
||||
```python
|
||||
tag = client.get_tag_properties("my-image", "latest")
|
||||
print(f"Digest: {tag.digest}")
|
||||
print(f"Created: {tag.created_on}")
|
||||
```
|
||||
|
||||
### Delete Tag
|
||||
|
||||
```python
|
||||
client.delete_tag("my-image", "old-tag")
|
||||
```
|
||||
|
||||
## Upload and Download Artifacts
|
||||
|
||||
```python
|
||||
from azure.containerregistry import ContainerRegistryClient
|
||||
|
||||
client = ContainerRegistryClient(endpoint, DefaultAzureCredential())
|
||||
|
||||
# Download manifest
|
||||
manifest = client.download_manifest("my-image", "latest")
|
||||
print(f"Media type: {manifest.media_type}")
|
||||
print(f"Digest: {manifest.digest}")
|
||||
|
||||
# Download blob
|
||||
blob = client.download_blob("my-image", "sha256:abc123...")
|
||||
with open("layer.tar.gz", "wb") as f:
|
||||
for chunk in blob:
|
||||
f.write(chunk)
|
||||
```
|
||||
|
||||
## Async Client
|
||||
|
||||
```python
|
||||
from azure.containerregistry.aio import ContainerRegistryClient
|
||||
from azure.identity.aio import DefaultAzureCredential
|
||||
|
||||
async def list_repos():
|
||||
credential = DefaultAzureCredential()
|
||||
client = ContainerRegistryClient(endpoint, credential)
|
||||
|
||||
async for repo in client.list_repository_names():
|
||||
print(repo)
|
||||
|
||||
await client.close()
|
||||
await credential.close()
|
||||
```
|
||||
|
||||
## Clean Up Old Images
|
||||
|
||||
```python
|
||||
from datetime import datetime, timedelta, timezone
|
||||
|
||||
cutoff = datetime.now(timezone.utc) - timedelta(days=30)
|
||||
|
||||
for manifest in client.list_manifest_properties("my-image"):
|
||||
if manifest.last_updated_on < cutoff and not manifest.tags:
|
||||
print(f"Deleting {manifest.digest}")
|
||||
client.delete_manifest("my-image", manifest.digest)
|
||||
```
|
||||
|
||||
## Client Operations

| Operation | Description |
|-----------|-------------|
| `list_repository_names` | List all repositories |
| `get_repository_properties` | Get repository metadata |
| `delete_repository` | Delete repository and all images |
| `list_tag_properties` | List tags in repository |
| `get_tag_properties` | Get tag metadata |
| `delete_tag` | Delete specific tag |
| `list_manifest_properties` | List manifests in repository |
| `get_manifest_properties` | Get manifest metadata |
| `delete_manifest` | Delete manifest by digest |
| `download_manifest` | Download manifest content |
| `download_blob` | Download layer blob |

## Best Practices

1. **Use Entra ID** for authentication in production
2. **Delete by digest**, not by tag, to avoid orphaned images
3. **Lock production images** with `can_delete=False`
4. **Clean up untagged manifests** regularly
5. **Use the async client** for high-throughput operations
6. **Order by `last_updated`** to find recent or old images
7. **Check `manifest.tags`** before deleting to avoid removing tagged images
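
Best practice 3 can be sketched as a small helper. This is an illustrative sketch, not part of the skill: `update_manifest_properties` accepts keyword flags such as `can_delete` and `can_write` in `azure-containerregistry`, but the helper name and the repository/tag values here are placeholders.

```python
def lock_image(client, repository: str, tag_or_digest: str):
    """Lock a production image so it cannot be deleted or overwritten.

    `client` is an azure.containerregistry.ContainerRegistryClient.
    """
    return client.update_manifest_properties(
        repository,
        tag_or_digest,
        can_delete=False,  # blocks delete_manifest / delete_tag
        can_write=False,   # blocks pushes that would overwrite the tag
    )

# Usage against a real registry:
# lock_image(client, "my-image", "v1.0.0")
```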

`skills/official/microsoft/python/compute/fabric/SKILL.md` (new file, 258 lines)

---
name: azure-mgmt-fabric-py
description: |
  Azure Fabric Management SDK for Python. Use for managing Microsoft Fabric capacities and resources.
  Triggers: "azure-mgmt-fabric", "FabricMgmtClient", "Fabric capacity", "Microsoft Fabric", "Power BI capacity".
---

# Azure Fabric Management SDK for Python

Manage Microsoft Fabric capacities and resources programmatically.

## Installation

```bash
pip install azure-mgmt-fabric
pip install azure-identity
```

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.fabric import FabricMgmtClient
import os

credential = DefaultAzureCredential()
client = FabricMgmtClient(
    credential=credential,
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"]
)
```

## Create Fabric Capacity

```python
from azure.mgmt.fabric import FabricMgmtClient
from azure.mgmt.fabric.models import (
    FabricCapacity,
    FabricCapacityProperties,
    FabricCapacityAdministration,
    CapacitySku,
)
from azure.identity import DefaultAzureCredential
import os

credential = DefaultAzureCredential()
client = FabricMgmtClient(
    credential=credential,
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"]
)

resource_group = os.environ["AZURE_RESOURCE_GROUP"]
capacity_name = "myfabriccapacity"

capacity = client.fabric_capacities.begin_create_or_update(
    resource_group_name=resource_group,
    capacity_name=capacity_name,
    resource=FabricCapacity(
        location="eastus",
        sku=CapacitySku(
            name="F2",  # Fabric SKU
            tier="Fabric"
        ),
        properties=FabricCapacityProperties(
            administration=FabricCapacityAdministration(
                members=["user@contoso.com"]
            )
        )
    )
).result()

print(f"Capacity created: {capacity.name}")
```

## Get Capacity Details

```python
capacity = client.fabric_capacities.get(
    resource_group_name=resource_group,
    capacity_name=capacity_name
)

print(f"Capacity: {capacity.name}")
print(f"SKU: {capacity.sku.name}")
print(f"State: {capacity.properties.state}")
print(f"Location: {capacity.location}")
```

## List Capacities in Resource Group

```python
capacities = client.fabric_capacities.list_by_resource_group(
    resource_group_name=resource_group
)

for capacity in capacities:
    print(f"Capacity: {capacity.name} - SKU: {capacity.sku.name}")
```

## List All Capacities in Subscription

```python
all_capacities = client.fabric_capacities.list_by_subscription()

for capacity in all_capacities:
    print(f"Capacity: {capacity.name} in {capacity.location}")
```

## Update Capacity

```python
from azure.mgmt.fabric.models import FabricCapacityUpdate, CapacitySku

updated = client.fabric_capacities.begin_update(
    resource_group_name=resource_group,
    capacity_name=capacity_name,
    properties=FabricCapacityUpdate(
        sku=CapacitySku(
            name="F4",  # Scale up
            tier="Fabric"
        ),
        tags={"environment": "production"}
    )
).result()

print(f"Updated SKU: {updated.sku.name}")
```

## Suspend Capacity

Pause a capacity to stop billing:

```python
client.fabric_capacities.begin_suspend(
    resource_group_name=resource_group,
    capacity_name=capacity_name
).result()

print("Capacity suspended")
```

## Resume Capacity

Resume a paused capacity:

```python
client.fabric_capacities.begin_resume(
    resource_group_name=resource_group,
    capacity_name=capacity_name
).result()

print("Capacity resumed")
```

## Delete Capacity

```python
client.fabric_capacities.begin_delete(
    resource_group_name=resource_group,
    capacity_name=capacity_name
).result()

print("Capacity deleted")
```

## Check Name Availability

```python
from azure.mgmt.fabric.models import CheckNameAvailabilityRequest

result = client.fabric_capacities.check_name_availability(
    location="eastus",
    body=CheckNameAvailabilityRequest(
        name="my-new-capacity",
        type="Microsoft.Fabric/capacities"
    )
)

if result.name_available:
    print("Name is available")
else:
    print(f"Name not available: {result.reason}")
```

## List Available SKUs

```python
skus = client.fabric_capacities.list_skus(
    resource_group_name=resource_group,
    capacity_name=capacity_name
)

for sku in skus:
    print(f"SKU: {sku.name} - Tier: {sku.tier}")
```

## Client Operations

| Property | Purpose |
|----------|---------|
| `client.fabric_capacities` | Capacity CRUD operations |
| `client.operations` | List available operations |

## Fabric SKUs

| SKU | Description | CUs |
|-----|-------------|-----|
| `F2` | Entry level | 2 Capacity Units |
| `F4` | Small | 4 Capacity Units |
| `F8` | Medium | 8 Capacity Units |
| `F16` | Large | 16 Capacity Units |
| `F32` | X-Large | 32 Capacity Units |
| `F64` | 2X-Large | 64 Capacity Units |
| `F128` | 4X-Large | 128 Capacity Units |
| `F256` | 8X-Large | 256 Capacity Units |
| `F512` | 16X-Large | 512 Capacity Units |
| `F1024` | 32X-Large | 1024 Capacity Units |
| `F2048` | 64X-Large | 2048 Capacity Units |
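
Since SKU names map directly to Capacity Units, right-sizing ("start with smaller SKUs and scale up") can be sketched as a simple lookup. This helper is purely illustrative, not an SDK API:

```python
# Capacity Units per Fabric SKU, from the table above (F2 .. F2048).
FABRIC_SKU_CUS = [2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048]

def smallest_sku(required_cus: int) -> str:
    """Pick the smallest Fabric SKU offering at least `required_cus` CUs."""
    for cus in FABRIC_SKU_CUS:
        if cus >= required_cus:
            return f"F{cus}"
    raise ValueError(f"No Fabric SKU offers {required_cus} CUs")
```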

## Capacity States

| State | Description |
|-------|-------------|
| `Active` | Capacity is running |
| `Paused` | Capacity is suspended (no billing) |
| `Provisioning` | Being created |
| `Updating` | Being modified |
| `Deleting` | Being removed |
| `Failed` | Operation failed |

## Long-Running Operations

All mutating operations are long-running (LRO). Use `.result()` to wait:

```python
import time

# Synchronous wait
capacity = client.fabric_capacities.begin_create_or_update(...).result()

# Or poll manually
poller = client.fabric_capacities.begin_create_or_update(...)
while not poller.done():
    print(f"Status: {poller.status()}")
    time.sleep(5)
capacity = poller.result()
```

## Best Practices

1. **Use DefaultAzureCredential** for authentication
2. **Suspend unused capacities** to reduce costs
3. **Start with smaller SKUs** and scale up as needed
4. **Use tags** for cost tracking and organization
5. **Check name availability** before creating capacities
6. **Handle LROs properly** — don't assume immediate completion
7. **Set up capacity admins** — specify users who can manage workspaces
8. **Monitor capacity usage** via Azure Monitor metrics

`skills/official/microsoft/python/data/blob/SKILL.md` (new file, 219 lines)

---
name: azure-storage-blob-py
description: |
  Azure Blob Storage SDK for Python. Use for uploading, downloading, listing blobs, managing containers, and blob lifecycle.
  Triggers: "blob storage", "BlobServiceClient", "ContainerClient", "BlobClient", "upload blob", "download blob".
package: azure-storage-blob
---

# Azure Blob Storage SDK for Python

Client library for Azure Blob Storage — object storage for unstructured data.

## Installation

```bash
pip install azure-storage-blob azure-identity
```

## Environment Variables

```bash
AZURE_STORAGE_ACCOUNT_NAME=<your-storage-account>
# Or use the full URL
AZURE_STORAGE_ACCOUNT_URL=https://<account>.blob.core.windows.net
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

credential = DefaultAzureCredential()
account_url = "https://<account>.blob.core.windows.net"

blob_service_client = BlobServiceClient(account_url, credential=credential)
```

## Client Hierarchy

| Client | Purpose | Get From |
|--------|---------|----------|
| `BlobServiceClient` | Account-level operations | Direct instantiation |
| `ContainerClient` | Container operations | `blob_service_client.get_container_client()` |
| `BlobClient` | Single blob operations | `container_client.get_blob_client()` |

## Core Workflow

### Create Container

```python
container_client = blob_service_client.get_container_client("mycontainer")
container_client.create_container()
```

### Upload Blob

```python
# From file path
blob_client = blob_service_client.get_blob_client(
    container="mycontainer",
    blob="sample.txt"
)

with open("./local-file.txt", "rb") as data:
    blob_client.upload_blob(data, overwrite=True)

# From bytes/string
blob_client.upload_blob(b"Hello, World!", overwrite=True)

# From stream
import io
stream = io.BytesIO(b"Stream content")
blob_client.upload_blob(stream, overwrite=True)
```

### Download Blob

```python
import io

blob_client = blob_service_client.get_blob_client(
    container="mycontainer",
    blob="sample.txt"
)

# To file
with open("./downloaded.txt", "wb") as file:
    download_stream = blob_client.download_blob()
    file.write(download_stream.readall())

# To memory
download_stream = blob_client.download_blob()
content = download_stream.readall()  # bytes

# Read into an existing buffer
stream = io.BytesIO()
num_bytes = blob_client.download_blob().readinto(stream)
```

### List Blobs

```python
from azure.storage.blob import BlobPrefix

container_client = blob_service_client.get_container_client("mycontainer")

# List all blobs
for blob in container_client.list_blobs():
    print(f"{blob.name} - {blob.size} bytes")

# List with prefix (folder-like)
for blob in container_client.list_blobs(name_starts_with="logs/"):
    print(blob.name)

# Walk blob hierarchy (virtual directories)
for item in container_client.walk_blobs(delimiter="/"):
    if isinstance(item, BlobPrefix):
        print(f"Directory: {item.name}")
    else:
        print(f"Blob: {item.name}")
```

### Delete Blob

```python
blob_client.delete_blob()

# Delete with snapshots
blob_client.delete_blob(delete_snapshots="include")
```

## Performance Tuning

```python
from azure.storage.blob import BlobClient

# Configure chunk sizes for large uploads/downloads
blob_client = BlobClient(
    account_url=account_url,
    container_name="mycontainer",
    blob_name="large-file.zip",
    credential=credential,
    max_block_size=4 * 1024 * 1024,  # 4 MiB blocks
    max_single_put_size=64 * 1024 * 1024  # 64 MiB single-upload limit
)

# Parallel upload
blob_client.upload_blob(data, max_concurrency=4)

# Parallel download
download_stream = blob_client.download_blob(max_concurrency=4)
```

## SAS Tokens

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

sas_token = generate_blob_sas(
    account_name="<account>",
    container_name="mycontainer",
    blob_name="sample.txt",
    account_key="<account-key>",  # Or use a user delegation key
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1)
)

# Use the SAS token
blob_url = f"https://<account>.blob.core.windows.net/mycontainer/sample.txt?{sas_token}"
```

## Blob Properties and Metadata

```python
from azure.storage.blob import ContentSettings

# Get properties
properties = blob_client.get_blob_properties()
print(f"Size: {properties.size}")
print(f"Content-Type: {properties.content_settings.content_type}")
print(f"Last modified: {properties.last_modified}")

# Set metadata
blob_client.set_blob_metadata(metadata={"category": "logs", "year": "2024"})

# Set content type
blob_client.set_http_headers(
    content_settings=ContentSettings(content_type="application/json")
)
```

## Async Client

```python
from azure.identity.aio import DefaultAzureCredential
from azure.storage.blob.aio import BlobServiceClient

async def upload_async():
    credential = DefaultAzureCredential()

    async with BlobServiceClient(account_url, credential=credential) as client:
        blob_client = client.get_blob_client("mycontainer", "sample.txt")

        with open("./file.txt", "rb") as data:
            await blob_client.upload_blob(data, overwrite=True)

# Download async
async def download_async():
    credential = DefaultAzureCredential()

    async with BlobServiceClient(account_url, credential=credential) as client:
        blob_client = client.get_blob_client("mycontainer", "sample.txt")

        stream = await blob_client.download_blob()
        data = await stream.readall()
```

## Best Practices

1. **Use DefaultAzureCredential** instead of connection strings
2. **Use context managers** for async clients
3. **Set `overwrite=True`** explicitly when re-uploading
4. **Use `max_concurrency`** for large file transfers
5. **Prefer `readinto()`** over `readall()` for memory efficiency
6. **Use `walk_blobs()`** for hierarchical listing
7. **Set appropriate content types** for web-served blobs

`skills/official/microsoft/python/data/cosmos-db/SKILL.md` (new file, 239 lines)

---
name: azure-cosmos-db-py
description: Build Azure Cosmos DB NoSQL services with Python/FastAPI following production-grade patterns. Use when implementing database client setup with dual auth (DefaultAzureCredential + emulator), service layer classes with CRUD operations, partition key strategies, parameterized queries, or TDD patterns for Cosmos. Triggers on phrases like "Cosmos DB", "NoSQL database", "document store", "add persistence", "database service layer", or "Python Cosmos SDK".
package: azure-cosmos
---

# Cosmos DB Service Implementation

Build production-grade Azure Cosmos DB NoSQL services following clean code, security best practices, and TDD principles.

## Installation

```bash
pip install azure-cosmos azure-identity
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_DATABASE_NAME=<database-name>
COSMOS_CONTAINER_ID=<container-id>
# For the emulator only (not production)
COSMOS_KEY=<emulator-key>
```

## Authentication

**DefaultAzureCredential (preferred)**:
```python
import os

from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential

client = CosmosClient(
    url=os.environ["COSMOS_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

**Emulator (local development)**:
```python
import os

from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://localhost:8081",
    credential=os.environ["COSMOS_KEY"],
    connection_verify=False
)
```

## Architecture Overview

```
┌─────────────────────────────────────────────────────────────────┐
│                         FastAPI Router                          │
│  - Auth dependencies (get_current_user, get_current_user_required) │
│  - HTTP error responses (HTTPException)                         │
└──────────────────────────────┬──────────────────────────────────┘
                               │
┌──────────────────────────────▼──────────────────────────────────┐
│                         Service Layer                           │
│  - Business logic and validation                                │
│  - Document ↔ Model conversion                                  │
│  - Graceful degradation when Cosmos unavailable                 │
└──────────────────────────────┬──────────────────────────────────┘
                               │
┌──────────────────────────────▼──────────────────────────────────┐
│                     Cosmos DB Client Module                     │
│  - Singleton container initialization                           │
│  - Dual auth: DefaultAzureCredential (Azure) / Key (emulator)   │
│  - Async wrapper via run_in_threadpool                          │
└─────────────────────────────────────────────────────────────────┘
```

## Quick Start

### 1. Client Module Setup

Create a singleton Cosmos client with dual authentication:

```python
# db/cosmos.py
from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential
from starlette.concurrency import run_in_threadpool  # used by the async wrappers

_cosmos_container = None

def _is_emulator_endpoint(endpoint: str) -> bool:
    return "localhost" in endpoint or "127.0.0.1" in endpoint

async def get_container():
    global _cosmos_container
    if _cosmos_container is None:
        # `settings` comes from the application's config module
        if _is_emulator_endpoint(settings.cosmos_endpoint):
            client = CosmosClient(
                url=settings.cosmos_endpoint,
                credential=settings.cosmos_key,
                connection_verify=False
            )
        else:
            client = CosmosClient(
                url=settings.cosmos_endpoint,
                credential=DefaultAzureCredential()
            )
        db = client.get_database_client(settings.cosmos_database_name)
        _cosmos_container = db.get_container_client(settings.cosmos_container_id)
    return _cosmos_container
```

**Full implementation**: See [references/client-setup.md](references/client-setup.md)
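
The architecture above mentions an async wrapper via `run_in_threadpool`; a minimal sketch of a `get_document` helper for the service layer might look like this. To keep the sketch self-contained it uses the stdlib `asyncio.to_thread` equivalent, and it catches a bare `Exception` as a placeholder; the real module would use starlette's `run_in_threadpool` and catch `CosmosResourceNotFoundError` specifically.

```python
import asyncio

async def get_document(container, item_id: str, partition_key: str):
    """Run the sync SDK point-read off the event loop; None if missing."""
    def _read():
        try:
            return container.read_item(item=item_id, partition_key=partition_key)
        except Exception:  # placeholder for CosmosResourceNotFoundError
            return None
    return await asyncio.to_thread(_read)
```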

### 2. Pydantic Model Hierarchy

Use a five-tier model pattern for clean separation:

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, Field

class ProjectBase(BaseModel):        # Shared fields
    name: str = Field(..., min_length=1, max_length=200)

class ProjectCreate(ProjectBase):    # Creation request
    workspace_id: str = Field(..., alias="workspaceId")

class ProjectUpdate(BaseModel):      # Partial updates (all optional)
    name: Optional[str] = Field(None, min_length=1)

class Project(ProjectBase):          # API response
    id: str
    created_at: datetime = Field(..., alias="createdAt")

class ProjectInDB(Project):          # Internal, with docType
    doc_type: str = "project"
```

### 3. Service Layer Pattern

```python
class ProjectService:
    def _use_cosmos(self) -> bool:
        return get_container() is not None

    async def get_by_id(self, project_id: str, workspace_id: str) -> Project | None:
        if not self._use_cosmos():
            return None
        doc = await get_document(project_id, partition_key=workspace_id)
        if doc is None:
            return None
        return self._doc_to_model(doc)
```

**Full patterns**: See [references/service-layer.md](references/service-layer.md)

## Core Principles

### Security Requirements

1. **RBAC Authentication**: Use `DefaultAzureCredential` in Azure — never store keys in code
2. **Emulator-Only Keys**: Hardcode the well-known emulator key only for local development
3. **Parameterized Queries**: Always use `@parameter` syntax — never string concatenation
4. **Partition Key Validation**: Validate that partition key access matches user authorization

### Clean Code Conventions

1. **Single Responsibility**: Client module handles the connection; services handle business logic
2. **Graceful Degradation**: Services return `None`/`[]` when Cosmos is unavailable
3. **Consistent Naming**: `_doc_to_model()`, `_model_to_doc()`, `_use_cosmos()`
4. **Type Hints**: Full typing on all public methods
5. **CamelCase Aliases**: Use `Field(alias="camelCase")` for JSON serialization

### TDD Requirements

Write tests BEFORE implementation, using these patterns:

```python
import pytest

@pytest.fixture
def mock_cosmos_container(mocker):
    container = mocker.MagicMock()
    mocker.patch("app.db.cosmos.get_container", return_value=container)
    return container

@pytest.mark.asyncio
async def test_get_project_by_id_returns_project(mock_cosmos_container):
    # Arrange
    mock_cosmos_container.read_item.return_value = {"id": "123", "name": "Test"}

    # Act
    result = await project_service.get_by_id("123", "workspace-1")

    # Assert
    assert result.id == "123"
    assert result.name == "Test"
```

**Full testing guide**: See [references/testing.md](references/testing.md)

## Reference Files

| File | When to Read |
|------|--------------|
| [references/client-setup.md](references/client-setup.md) | Setting up the Cosmos client with dual auth, SSL config, singleton pattern |
| [references/service-layer.md](references/service-layer.md) | Implementing a full service class with CRUD, conversions, graceful degradation |
| [references/testing.md](references/testing.md) | Writing pytest tests, mocking Cosmos, integration test setup |
| [references/partitioning.md](references/partitioning.md) | Choosing partition keys, cross-partition queries, move operations |
| [references/error-handling.md](references/error-handling.md) | Handling CosmosResourceNotFoundError, logging, HTTP error mapping |

## Template Files

| File | Purpose |
|------|---------|
| [assets/cosmos_client_template.py](assets/cosmos_client_template.py) | Ready-to-use client module |
| [assets/service_template.py](assets/service_template.py) | Service class skeleton |
| [assets/conftest_template.py](assets/conftest_template.py) | pytest fixtures for Cosmos mocking |

## Quality Attributes (NFRs)

### Reliability
- Graceful degradation when Cosmos is unavailable
- Retry logic with exponential backoff for transient failures
- Connection pooling via the singleton pattern

### Security
- Zero secrets in code (RBAC via DefaultAzureCredential)
- Parameterized queries prevent injection
- Partition key isolation enforces data boundaries

### Maintainability
- Five-tier model pattern enables schema evolution
- Service layer decouples business logic from storage
- Consistent patterns across all entity services

### Testability
- Dependency injection via `get_container()`
- Easy mocking with module-level globals
- Clear separation enables unit testing without Cosmos

### Performance
- Partition key queries avoid cross-partition scans
- Async wrapping prevents blocking the FastAPI event loop
- Minimal document conversion overhead

`skills/official/microsoft/python/data/cosmos/SKILL.md` (new file, 280 lines)

---
name: azure-cosmos-py
description: |
  Azure Cosmos DB SDK for Python (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
  Triggers: "cosmos db", "CosmosClient", "container", "document", "NoSQL", "partition key".
package: azure-cosmos
---

# Azure Cosmos DB SDK for Python

Client library for the Azure Cosmos DB NoSQL API — a globally distributed, multi-model database.

## Installation

```bash
pip install azure-cosmos azure-identity
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_DATABASE=mydb
COSMOS_CONTAINER=mycontainer
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.cosmos import CosmosClient

credential = DefaultAzureCredential()
endpoint = "https://<account>.documents.azure.com:443/"

client = CosmosClient(url=endpoint, credential=credential)
```

## Client Hierarchy

| Client | Purpose | Get From |
|--------|---------|----------|
| `CosmosClient` | Account-level operations | Direct instantiation |
| `DatabaseProxy` | Database operations | `client.get_database_client()` |
| `ContainerProxy` | Container/item operations | `database.get_container_client()` |

## Core Workflow

### Setup Database and Container

```python
from azure.cosmos import PartitionKey

# Get or create database
database = client.create_database_if_not_exists(id="mydb")

# Get or create container with partition key
container = database.create_container_if_not_exists(
    id="mycontainer",
    partition_key=PartitionKey(path="/category")
)

# Get existing
database = client.get_database_client("mydb")
container = database.get_container_client("mycontainer")
```

### Create Item

```python
item = {
    "id": "item-001",            # Required: unique within partition
    "category": "electronics",   # Partition key value
    "name": "Laptop",
    "price": 999.99,
    "tags": ["computer", "portable"]
}

created = container.create_item(body=item)
print(f"Created: {created['id']}")
```

### Read Item

```python
# Read requires id AND partition key
item = container.read_item(
    item="item-001",
    partition_key="electronics"
)
print(f"Name: {item['name']}")
```

### Update Item (Replace)

```python
item = container.read_item(item="item-001", partition_key="electronics")
item["price"] = 899.99
item["on_sale"] = True

updated = container.replace_item(item=item["id"], body=item)
```

### Upsert Item

```python
# Create if not exists, replace if exists
item = {
    "id": "item-002",
    "category": "electronics",
    "name": "Tablet",
    "price": 499.99
}

result = container.upsert_item(body=item)
```

### Delete Item

```python
container.delete_item(
    item="item-001",
    partition_key="electronics"
)
```

## Queries

### Basic Query

```python
# Query within a partition (efficient)
query = "SELECT * FROM c WHERE c.price < @max_price"
items = container.query_items(
    query=query,
    parameters=[{"name": "@max_price", "value": 500}],
    partition_key="electronics"
)

for item in items:
    print(f"{item['name']}: ${item['price']}")
```

### Cross-Partition Query

```python
# Cross-partition (more expensive, use sparingly)
query = "SELECT * FROM c WHERE c.price < @max_price"
items = container.query_items(
    query=query,
    parameters=[{"name": "@max_price", "value": 500}],
    enable_cross_partition_query=True
)

for item in items:
    print(item)
```

### Query with Projection

```python
query = "SELECT c.id, c.name, c.price FROM c WHERE c.category = @category"
items = container.query_items(
    query=query,
    parameters=[{"name": "@category", "value": "electronics"}],
    partition_key="electronics"
)
```

### Read All Items

```python
# Read every item in the container (cross-partition)
items = container.read_all_items()

# Or restrict to a single partition
items = container.query_items(
    query="SELECT * FROM c",
    partition_key="electronics"
)
```

## Partition Keys

**Critical**: Always include the partition key for efficient operations.

```python
from azure.cosmos import PartitionKey

# Single partition key
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customer_id")
)

# Hierarchical partition key (preview)
container = database.create_container_if_not_exists(
    id="events",
    partition_key=PartitionKey(path=["/tenant_id", "/user_id"])
)
```
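
Even data distribution (best practice 6 below) is often achieved by suffixing a hot logical key across N synthetic partitions at write time, a well-known Cosmos pattern. This sketch uses hypothetical names and a fixed bucket count; note that reads must then fan out across all suffixes:

```python
import hashlib

N_BUCKETS = 8  # synthetic partitions per hot logical key

def suffixed_partition_key(base_key: str, salt: str) -> str:
    """Derive a stable synthetic partition key, e.g. 'customer-42-3'.
    `salt` (e.g. the item id) decides which bucket the item lands in."""
    bucket = int(hashlib.sha256(salt.encode()).hexdigest(), 16) % N_BUCKETS
    return f"{base_key}-{bucket}"

def all_suffixes(base_key: str) -> list:
    """Every synthetic key for a base key, needed for read fan-out."""
    return [f"{base_key}-{i}" for i in range(N_BUCKETS)]
```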

## Throughput

```python
# Create container with provisioned throughput
container = database.create_container_if_not_exists(
    id="mycontainer",
    partition_key=PartitionKey(path="/pk"),
    offer_throughput=400  # RU/s
)

# Read current throughput
offer = container.read_offer()
print(f"Throughput: {offer.offer_throughput} RU/s")

# Update throughput
container.replace_throughput(throughput=1000)
```

## Async Client

```python
import asyncio

from azure.cosmos.aio import CosmosClient
from azure.identity.aio import DefaultAzureCredential

async def cosmos_operations():
    credential = DefaultAzureCredential()

    async with CosmosClient(endpoint, credential=credential) as client:
        database = client.get_database_client("mydb")
        container = database.get_container_client("mycontainer")

        # Create
        await container.create_item(body={"id": "1", "pk": "test"})

        # Read
        item = await container.read_item(item="1", partition_key="test")

        # Query
        async for item in container.query_items(
            query="SELECT * FROM c",
            partition_key="test"
        ):
            print(item)

asyncio.run(cosmos_operations())
```

## Error Handling
|
||||
|
||||
```python
|
||||
from azure.cosmos.exceptions import CosmosHttpResponseError
|
||||
|
||||
try:
|
||||
item = container.read_item(item="nonexistent", partition_key="pk")
|
||||
except CosmosHttpResponseError as e:
|
||||
if e.status_code == 404:
|
||||
print("Item not found")
|
||||
elif e.status_code == 429:
|
||||
print(f"Rate limited. Retry after: {e.headers.get('x-ms-retry-after-ms')}ms")
|
||||
else:
|
||||
raise
|
||||
```
|
||||
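The 429 branch above returns a `x-ms-retry-after-ms` hint that can drive a bounded retry loop. A minimal sketch of that backoff policy, with no SDK involved — the `(retry_after_ms, result)` contract, the `do_read` callable, and the 3-attempt cap are illustrative assumptions, not part of azure-cosmos:

```python
import time

def with_retry(do_read, max_attempts=3):
    """Call do_read(), honoring a (retry_after_ms, result) contract.

    do_read returns (None, item) on success, or (retry_after_ms, None)
    when the request was rate limited. Gives up after max_attempts.
    """
    for _attempt in range(max_attempts):
        retry_after_ms, item = do_read()
        if retry_after_ms is None:
            return item
        time.sleep(retry_after_ms / 1000.0)  # honor the server's hint
    raise RuntimeError(f"still rate limited after {max_attempts} attempts")

# Simulated responses: two throttles, then success.
responses = iter([(5, None), (5, None), (None, {"id": "1"})])
item = with_retry(lambda: next(responses))  # item == {"id": "1"}
```

In real code the azure-cosmos SDK already retries 429s internally; a wrapper like this is only needed when you disable or exhaust the built-in policy.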

## Best Practices

1. **Always specify the partition key** for point reads and queries
2. **Use parameterized queries** to prevent injection and improve caching
3. **Avoid cross-partition queries** when possible
4. **Use `upsert_item`** for idempotent writes
5. **Use the async client** for high-throughput scenarios
6. **Design the partition key** for even data distribution
7. **Use `read_item`** instead of a query for single-document retrieval

## Reference Files

| File | Contents |
|------|----------|
| [references/partitioning.md](references/partitioning.md) | Partition key strategies, hierarchical keys, hot partition detection and mitigation |
| [references/query-patterns.md](references/query-patterns.md) | Query optimization, aggregations, pagination, transactions, change feed |
| [scripts/setup_cosmos_container.py](scripts/setup_cosmos_container.py) | CLI tool for creating containers with partitioning, throughput, and indexing |
skills/official/microsoft/python/data/datalake/SKILL.md (new file, 211 lines)
@@ -0,0 +1,211 @@
---
name: azure-storage-file-datalake-py
description: |
  Azure Data Lake Storage Gen2 SDK for Python. Use for hierarchical file systems, big data analytics, and file/directory operations.
  Triggers: "data lake", "DataLakeServiceClient", "FileSystemClient", "ADLS Gen2", "hierarchical namespace".
package: azure-storage-file-datalake
---

# Azure Data Lake Storage Gen2 SDK for Python

Hierarchical file system for big data analytics workloads.

## Installation

```bash
pip install azure-storage-file-datalake azure-identity
```

## Environment Variables

```bash
AZURE_STORAGE_ACCOUNT_URL=https://<account>.dfs.core.windows.net
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = DefaultAzureCredential()
account_url = "https://<account>.dfs.core.windows.net"

service_client = DataLakeServiceClient(account_url=account_url, credential=credential)
```

## Client Hierarchy

| Client | Purpose |
|--------|---------|
| `DataLakeServiceClient` | Account-level operations |
| `FileSystemClient` | Container (file system) operations |
| `DataLakeDirectoryClient` | Directory operations |
| `DataLakeFileClient` | File operations |

## File System Operations

```python
# Create file system (container)
file_system_client = service_client.create_file_system("myfilesystem")

# Get existing
file_system_client = service_client.get_file_system_client("myfilesystem")

# Delete
service_client.delete_file_system("myfilesystem")

# List file systems
for fs in service_client.list_file_systems():
    print(fs.name)
```

## Directory Operations

```python
file_system_client = service_client.get_file_system_client("myfilesystem")

# Create directory
directory_client = file_system_client.create_directory("mydir")

# Create nested directories
directory_client = file_system_client.create_directory("path/to/nested/dir")

# Get directory client
directory_client = file_system_client.get_directory_client("mydir")

# Delete directory
directory_client.delete_directory()

# Rename/move directory (the new name includes the file system)
directory_client.rename_directory(new_name="myfilesystem/newname")
```

## File Operations

### Upload File

```python
# Get file client
file_client = file_system_client.get_file_client("path/to/file.txt")

# Upload from a local file
with open("local-file.txt", "rb") as data:
    file_client.upload_data(data, overwrite=True)

# Upload bytes
file_client.upload_data(b"Hello, Data Lake!", overwrite=True)

# Append data (for large files)
file_client.append_data(data=b"chunk1", offset=0, length=6)
file_client.append_data(data=b"chunk2", offset=6, length=6)
file_client.flush_data(12)  # Commit the data
```

### Download File

```python
file_client = file_system_client.get_file_client("path/to/file.txt")

# Download all content
download = file_client.download_file()
content = download.readall()

# Download to a file
with open("downloaded.txt", "wb") as f:
    download = file_client.download_file()
    download.readinto(f)

# Download a range
download = file_client.download_file(offset=0, length=100)
```

### Delete File

```python
file_client.delete_file()
```

## List Contents

```python
# List paths (files and directories)
for path in file_system_client.get_paths():
    print(f"{'DIR' if path.is_directory else 'FILE'}: {path.name}")

# List paths in a directory
for path in file_system_client.get_paths(path="mydir"):
    print(path.name)

# Recursive listing
for path in file_system_client.get_paths(path="mydir", recursive=True):
    print(path.name)
```

## File/Directory Properties

```python
# Get properties
properties = file_client.get_file_properties()
print(f"Size: {properties.size}")
print(f"Last modified: {properties.last_modified}")

# Set metadata
file_client.set_metadata(metadata={"processed": "true"})
```

## Access Control (ACL)

```python
# Get ACL
acl = directory_client.get_access_control()
print(f"Owner: {acl['owner']}")
print(f"Permissions: {acl['permissions']}")

# Set ACL
directory_client.set_access_control(
    owner="user-id",
    permissions="rwxr-x---"
)

# Update ACL entries recursively
directory_client.update_access_control_recursive(
    acl="user:user-id:rwx"
)
```

## Async Client

```python
import asyncio

from azure.storage.filedatalake.aio import DataLakeServiceClient
from azure.identity.aio import DefaultAzureCredential

async def datalake_operations():
    credential = DefaultAzureCredential()

    async with DataLakeServiceClient(
        account_url="https://<account>.dfs.core.windows.net",
        credential=credential
    ) as service_client:
        file_system_client = service_client.get_file_system_client("myfilesystem")
        file_client = file_system_client.get_file_client("test.txt")

        await file_client.upload_data(b"async content", overwrite=True)

        download = await file_client.download_file()
        content = await download.readall()

asyncio.run(datalake_operations())
```

## Best Practices

1. **Use a hierarchical namespace** for file system semantics
2. **Use `append_data` + `flush_data`** for large file uploads
3. **Set ACLs at the directory level** and inherit to children
4. **Use the async client** for high-throughput scenarios
5. **Use `get_paths` with `recursive=True`** for full directory listings
6. **Set metadata** for custom file attributes
7. **Consider the Blob API** for simple object storage use cases
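The `append_data` + `flush_data` pattern shown under Upload File requires the caller to track byte offsets and pass the final length to `flush_data`. A small planning helper makes that bookkeeping explicit — pure Python, no SDK involved; the function name and the chunk size are illustrative assumptions:

```python
def plan_append(data: bytes, chunk_size: int):
    """Split data into (chunk, offset, length) tuples for append_data
    calls, and return the total length to pass to flush_data."""
    plan = []
    offset = 0
    for start in range(0, len(data), chunk_size):
        chunk = data[start:start + chunk_size]
        plan.append((chunk, offset, len(chunk)))
        offset += len(chunk)
    return plan, offset

plan, total = plan_append(b"chunk1chunk2", 6)
# plan == [(b"chunk1", 0, 6), (b"chunk2", 6, 6)], total == 12
```

Each tuple maps directly onto one `append_data(data=chunk, offset=offset, length=length)` call, and `total` is the argument for the final `flush_data`.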
skills/official/microsoft/python/data/fileshare/SKILL.md (new file, 238 lines)
@@ -0,0 +1,238 @@
---
name: azure-storage-file-share-py
description: |
  Azure Storage File Share SDK for Python. Use for SMB file shares, directories, and file operations in the cloud.
  Triggers: "azure-storage-file-share", "ShareServiceClient", "ShareClient", "file share", "SMB".
---

# Azure Storage File Share SDK for Python

Manage SMB file shares for cloud-native and lift-and-shift scenarios.

## Installation

```bash
pip install azure-storage-file-share
```

## Environment Variables

```bash
AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...
# Or
AZURE_STORAGE_ACCOUNT_URL=https://<account>.file.core.windows.net
```

## Authentication

### Connection String

```python
import os

from azure.storage.fileshare import ShareServiceClient

service = ShareServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
```

### Entra ID

```python
import os

from azure.storage.fileshare import ShareServiceClient
from azure.identity import DefaultAzureCredential

service = ShareServiceClient(
    account_url=os.environ["AZURE_STORAGE_ACCOUNT_URL"],
    credential=DefaultAzureCredential()
)
```

## Share Operations

### Create Share

```python
share = service.create_share("my-share")
```

### List Shares

```python
for share in service.list_shares():
    print(f"{share.name}: {share.quota} GB")
```

### Get Share Client

```python
share_client = service.get_share_client("my-share")
```

### Delete Share

```python
service.delete_share("my-share")
```

## Directory Operations

### Create Directory

```python
share_client = service.get_share_client("my-share")
share_client.create_directory("my-directory")

# Nested directory
share_client.create_directory("my-directory/sub-directory")
```

### List Directories and Files

```python
directory_client = share_client.get_directory_client("my-directory")

for item in directory_client.list_directories_and_files():
    if item["is_directory"]:
        print(f"[DIR] {item['name']}")
    else:
        print(f"[FILE] {item['name']} ({item['size']} bytes)")
```

### Delete Directory

```python
share_client.delete_directory("my-directory")
```

## File Operations

### Upload File

```python
file_client = share_client.get_file_client("my-directory/file.txt")

# From a string
file_client.upload_file("Hello, World!")

# From a file
with open("local-file.txt", "rb") as f:
    file_client.upload_file(f)

# From bytes
file_client.upload_file(b"Binary content")
```

### Download File

```python
file_client = share_client.get_file_client("my-directory/file.txt")

# To bytes
data = file_client.download_file().readall()

# To a file
with open("downloaded.txt", "wb") as f:
    data = file_client.download_file()
    data.readinto(f)

# Stream chunks
download = file_client.download_file()
for chunk in download.chunks():
    process(chunk)
```

### Get File Properties

```python
properties = file_client.get_file_properties()
print(f"Size: {properties.size}")
print(f"Content type: {properties.content_settings.content_type}")
print(f"Last modified: {properties.last_modified}")
```

### Delete File

```python
file_client.delete_file()
```

### Copy File

```python
source_url = "https://account.file.core.windows.net/share/source.txt"
dest_client = share_client.get_file_client("destination.txt")
dest_client.start_copy_from_url(source_url)
```

## Range Operations

### Upload Range

```python
# Upload to a specific range
file_client.upload_range(data=b"content", offset=0, length=7)
```

### Download Range

```python
# Download a specific range
download = file_client.download_file(offset=0, length=100)
data = download.readall()
```

## Snapshot Operations

### Create Snapshot

```python
snapshot = share_client.create_snapshot()
print(f"Snapshot: {snapshot['snapshot']}")
```

### Access Snapshot

```python
snapshot_client = service.get_share_client(
    "my-share",
    snapshot=snapshot["snapshot"]
)
```

## Async Client

```python
from azure.storage.fileshare.aio import ShareServiceClient
from azure.identity.aio import DefaultAzureCredential

async def upload_file():
    credential = DefaultAzureCredential()
    service = ShareServiceClient(account_url, credential=credential)

    share = service.get_share_client("my-share")
    file_client = share.get_file_client("test.txt")

    await file_client.upload_file("Hello!")

    await service.close()
    await credential.close()
```

## Client Types

| Client | Purpose |
|--------|---------|
| `ShareServiceClient` | Account-level operations |
| `ShareClient` | Share operations |
| `ShareDirectoryClient` | Directory operations |
| `ShareFileClient` | File operations |

## Best Practices

1. **Use a connection string** for the simplest setup
2. **Use Entra ID** for production with RBAC
3. **Stream large files** using `chunks()` to avoid memory issues
4. **Create snapshots** before major changes
5. **Set quotas** to prevent unexpected storage costs
6. **Use ranges** for partial file updates
7. **Close async clients** explicitly
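Range uploads and downloads need consistent `(offset, length)` pairs covering the file. A sketch of computing those pairs for a file of known size — pure arithmetic, no SDK involved; the helper name and the default range size are illustrative assumptions:

```python
def ranges(size: int, range_size: int = 4 * 1024 * 1024):
    """Return (offset, length) pairs that exactly cover [0, size)."""
    return [(off, min(range_size, size - off))
            for off in range(0, size, range_size)]

# A 10-byte file split into 4-byte ranges:
print(ranges(10, 4))  # [(0, 4), (4, 4), (8, 2)]
```

Each pair feeds directly into `upload_range(data=..., offset=off, length=length)` or `download_file(offset=off, length=length)`; the final range is shorter when the size is not a multiple of the range size.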
skills/official/microsoft/python/data/queue/SKILL.md (new file, 213 lines)
@@ -0,0 +1,213 @@
---
name: azure-storage-queue-py
description: |
  Azure Queue Storage SDK for Python. Use for reliable message queuing, task distribution, and asynchronous processing.
  Triggers: "queue storage", "QueueServiceClient", "QueueClient", "message queue", "dequeue".
package: azure-storage-queue
---

# Azure Queue Storage SDK for Python

Simple, cost-effective message queuing for asynchronous communication.

## Installation

```bash
pip install azure-storage-queue azure-identity
```

## Environment Variables

```bash
AZURE_STORAGE_ACCOUNT_URL=https://<account>.queue.core.windows.net
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.storage.queue import QueueServiceClient, QueueClient

credential = DefaultAzureCredential()
account_url = "https://<account>.queue.core.windows.net"

# Service client
service_client = QueueServiceClient(account_url=account_url, credential=credential)

# Queue client
queue_client = QueueClient(account_url=account_url, queue_name="myqueue", credential=credential)
```

## Queue Operations

```python
# Create queue
service_client.create_queue("myqueue")

# Get queue client
queue_client = service_client.get_queue_client("myqueue")

# Delete queue
service_client.delete_queue("myqueue")

# List queues
for queue in service_client.list_queues():
    print(queue.name)
```

## Send Messages

```python
# Send message (string)
queue_client.send_message("Hello, Queue!")

# Send with options
queue_client.send_message(
    content="Delayed message",
    visibility_timeout=60,  # Hidden for 60 seconds
    time_to_live=3600  # Expires in 1 hour
)

# Send JSON
import json

data = {"task": "process", "id": 123}
queue_client.send_message(json.dumps(data))
```

## Receive Messages

```python
# Receiving messages makes them temporarily invisible
messages = queue_client.receive_messages(
    messages_per_page=10,
    visibility_timeout=30  # 30 seconds to process
)

for message in messages:
    print(f"ID: {message.id}")
    print(f"Content: {message.content}")
    print(f"Dequeue count: {message.dequeue_count}")

    # Process message...

    # Delete after processing
    queue_client.delete_message(message)
```

## Peek Messages

```python
# Peek without hiding (doesn't affect visibility)
messages = queue_client.peek_messages(max_messages=5)

for message in messages:
    print(message.content)
```

## Update Message

```python
# Extend visibility or update content
messages = queue_client.receive_messages()
for message in messages:
    # Extend the timeout (need more time)
    queue_client.update_message(
        message,
        visibility_timeout=60
    )

    # Update content and timeout
    queue_client.update_message(
        message,
        content="Updated content",
        visibility_timeout=60
    )
```

## Delete Message

```python
# Delete after successful processing
messages = queue_client.receive_messages()
for message in messages:
    try:
        # Process...
        queue_client.delete_message(message)
    except Exception:
        # Message becomes visible again after the timeout
        pass
```

## Clear Queue

```python
# Delete all messages
queue_client.clear_messages()
```

## Queue Properties

```python
# Get queue properties
properties = queue_client.get_queue_properties()
print(f"Approximate message count: {properties.approximate_message_count}")

# Set/get metadata
queue_client.set_queue_metadata(metadata={"environment": "production"})
properties = queue_client.get_queue_properties()
print(properties.metadata)
```

## Async Client

```python
import asyncio

from azure.storage.queue.aio import QueueServiceClient, QueueClient
from azure.identity.aio import DefaultAzureCredential

async def queue_operations():
    credential = DefaultAzureCredential()

    async with QueueClient(
        account_url="https://<account>.queue.core.windows.net",
        queue_name="myqueue",
        credential=credential
    ) as client:
        # Send
        await client.send_message("Async message")

        # Receive
        async for message in client.receive_messages():
            print(message.content)
            await client.delete_message(message)

asyncio.run(queue_operations())
```

## Base64 Encoding

```python
from azure.storage.queue import QueueClient, BinaryBase64EncodePolicy, BinaryBase64DecodePolicy

# For binary data
queue_client = QueueClient(
    account_url=account_url,
    queue_name="myqueue",
    credential=credential,
    message_encode_policy=BinaryBase64EncodePolicy(),
    message_decode_policy=BinaryBase64DecodePolicy()
)

# Send bytes
queue_client.send_message(b"Binary content")
```

## Best Practices

1. **Delete messages after processing** to prevent reprocessing
2. **Set an appropriate visibility timeout** based on processing time
3. **Check `dequeue_count`** for poison message detection
4. **Use the async client** for high-throughput scenarios
5. **Use `peek_messages`** for monitoring without affecting the queue
6. **Set `time_to_live`** to prevent stale messages
7. **Consider Service Bus** for advanced features (sessions, topics)
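Poison message detection (best practice 3) boils down to a threshold on `dequeue_count`: a message that keeps reappearing has presumably failed processing repeatedly and should be routed aside instead of retried forever. A minimal routing sketch — the threshold of 5 and the "dead-letter" label are illustrative choices, not SDK features (Queue Storage has no built-in dead-letter queue):

```python
MAX_DEQUEUE = 5  # assumed threshold; tune to your retry budget

def route(dequeue_count: int) -> str:
    """Decide what to do with a received message based on its dequeue count."""
    return "dead-letter" if dequeue_count >= MAX_DEQUEUE else "process"

print(route(1), route(5))  # process dead-letter
```

In practice, "dead-letter" would mean copying `message.content` to a separate queue or table for inspection, then calling `delete_message` so the poison message stops cycling.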
skills/official/microsoft/python/data/tables/SKILL.md (new file, 243 lines)
@@ -0,0 +1,243 @@
---
name: azure-data-tables-py
description: |
  Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations.
  Triggers: "table storage", "TableServiceClient", "TableClient", "entities", "PartitionKey", "RowKey".
package: azure-data-tables
---

# Azure Tables SDK for Python

NoSQL key-value store for structured data (Azure Storage Tables or the Cosmos DB Table API).

## Installation

```bash
pip install azure-data-tables azure-identity
```

## Environment Variables

```bash
# Azure Storage Tables
AZURE_STORAGE_ACCOUNT_URL=https://<account>.table.core.windows.net

# Cosmos DB Table API
COSMOS_TABLE_ENDPOINT=https://<account>.table.cosmos.azure.com
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.data.tables import TableServiceClient, TableClient

credential = DefaultAzureCredential()
endpoint = "https://<account>.table.core.windows.net"

# Service client (manage tables)
service_client = TableServiceClient(endpoint=endpoint, credential=credential)

# Table client (work with entities)
table_client = TableClient(endpoint=endpoint, table_name="mytable", credential=credential)
```

## Client Types

| Client | Purpose |
|--------|---------|
| `TableServiceClient` | Create/delete tables, list tables |
| `TableClient` | Entity CRUD, queries |

## Table Operations

```python
# Create table
service_client.create_table("mytable")

# Create if not exists
service_client.create_table_if_not_exists("mytable")

# Delete table
service_client.delete_table("mytable")

# List tables
for table in service_client.list_tables():
    print(table.name)

# Get table client
table_client = service_client.get_table_client("mytable")
```

## Entity Operations

**Important**: Every entity requires a `PartitionKey` and a `RowKey`; together they form the unique ID.

### Create Entity

```python
entity = {
    "PartitionKey": "sales",
    "RowKey": "order-001",
    "product": "Widget",
    "quantity": 5,
    "price": 9.99,
    "shipped": False
}

# Create (fails if exists)
table_client.create_entity(entity=entity)

# Upsert (create or replace)
table_client.upsert_entity(entity=entity)
```

### Get Entity

```python
# Get by key (fastest)
entity = table_client.get_entity(
    partition_key="sales",
    row_key="order-001"
)
print(f"Product: {entity['product']}")
```

### Update Entity

```python
# Replace the entire entity
entity["quantity"] = 10
table_client.update_entity(entity=entity, mode="replace")

# Merge (update specific fields only)
update = {
    "PartitionKey": "sales",
    "RowKey": "order-001",
    "shipped": True
}
table_client.update_entity(entity=update, mode="merge")
```

### Delete Entity

```python
table_client.delete_entity(
    partition_key="sales",
    row_key="order-001"
)
```

## Query Entities

### Query Within Partition

```python
# Query by partition (efficient)
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales'"
)
for entity in entities:
    print(entity)
```

### Query with Filters

```python
# Filter by properties
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales' and quantity gt 3"
)

# With parameters (safer)
entities = table_client.query_entities(
    query_filter="PartitionKey eq @pk and price lt @max_price",
    parameters={"pk": "sales", "max_price": 50.0}
)
```

### Select Specific Properties

```python
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales'",
    select=["RowKey", "product", "price"]
)
```

### List All Entities

```python
# List all (cross-partition - use sparingly)
for entity in table_client.list_entities():
    print(entity)
```

## Batch Operations

```python
from azure.data.tables import TableTransactionError

# Batch operations (same partition only!)
operations = [
    ("create", {"PartitionKey": "batch", "RowKey": "1", "data": "first"}),
    ("create", {"PartitionKey": "batch", "RowKey": "2", "data": "second"}),
    ("upsert", {"PartitionKey": "batch", "RowKey": "3", "data": "third"}),
]

try:
    table_client.submit_transaction(operations)
except TableTransactionError as e:
    print(f"Transaction failed: {e}")
```

## Async Client

```python
import asyncio

from azure.data.tables.aio import TableServiceClient, TableClient
from azure.identity.aio import DefaultAzureCredential

async def table_operations():
    credential = DefaultAzureCredential()

    async with TableClient(
        endpoint="https://<account>.table.core.windows.net",
        table_name="mytable",
        credential=credential
    ) as client:
        # Create
        await client.create_entity(entity={
            "PartitionKey": "async",
            "RowKey": "1",
            "data": "test"
        })

        # Query
        async for entity in client.query_entities("PartitionKey eq 'async'"):
            print(entity)

asyncio.run(table_operations())
```

## Data Types

| Python Type | Table Storage Type |
|-------------|--------------------|
| `str` | String |
| `int` | Int64 |
| `float` | Double |
| `bool` | Boolean |
| `datetime` | DateTime |
| `bytes` | Binary |
| `UUID` | Guid |

## Best Practices

1. **Design partition keys** for query patterns and even distribution
2. **Query within partitions** whenever possible (cross-partition queries are expensive)
3. **Use batch operations** for multiple entities in the same partition
4. **Use `upsert_entity`** for idempotent writes
5. **Use parameterized queries** to prevent injection
6. **Keep entities small** (max 1 MB per entity)
7. **Use the async client** for high-throughput scenarios
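Because a table transaction is limited to a single partition, it can be worth validating a batch before calling `submit_transaction` rather than waiting for a server-side `TableTransactionError`. A hedged pre-check sketch — pure Python over the same `(operation, entity)` tuples shown above; the helper name is made up:

```python
def check_batch(operations):
    """Raise ValueError if the batch mixes partitions or omits keys."""
    partitions = set()
    for _op, entity in operations:
        if "PartitionKey" not in entity or "RowKey" not in entity:
            raise ValueError("every entity needs PartitionKey and RowKey")
        partitions.add(entity["PartitionKey"])
    if len(partitions) > 1:
        raise ValueError(f"batch spans multiple partitions: {sorted(partitions)}")

ops = [
    ("create", {"PartitionKey": "batch", "RowKey": "1"}),
    ("upsert", {"PartitionKey": "batch", "RowKey": "2"}),
]
check_batch(ops)  # passes silently
```

A mixed-partition batch fails fast locally instead of producing a round-trip error; the service also caps transactions at 100 operations, which a fuller version of this check could enforce as well.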
skills/official/microsoft/python/entra/azure-identity/SKILL.md (new file, 192 lines)
@@ -0,0 +1,192 @@
|
||||
---
|
||||
name: azure-identity-py
|
||||
description: |
|
||||
Azure Identity SDK for Python authentication. Use for DefaultAzureCredential, managed identity, service principals, and token caching.
|
||||
Triggers: "azure-identity", "DefaultAzureCredential", "authentication", "managed identity", "service principal", "credential".
|
||||
package: azure-identity
|
||||
---
|
||||
|
||||
# Azure Identity SDK for Python
|
||||
|
||||
Authentication library for Azure SDK clients using Microsoft Entra ID (formerly Azure AD).

## Installation

```bash
pip install azure-identity
```

## Environment Variables

```bash
# Service Principal (for production/CI)
AZURE_TENANT_ID=<your-tenant-id>
AZURE_CLIENT_ID=<your-client-id>
AZURE_CLIENT_SECRET=<your-client-secret>

# User-assigned Managed Identity (optional)
AZURE_CLIENT_ID=<managed-identity-client-id>
```

## DefaultAzureCredential

The recommended credential for most scenarios. It tries multiple authentication methods in order:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Works in local dev AND production without code changes
credential = DefaultAzureCredential()

client = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=credential
)
```

### Credential Chain Order

| Order | Credential | Environment |
|-------|-----------|-------------|
| 1 | EnvironmentCredential | CI/CD, containers |
| 2 | WorkloadIdentityCredential | Kubernetes |
| 3 | ManagedIdentityCredential | Azure VMs, App Service, Functions |
| 4 | SharedTokenCacheCredential | Windows only |
| 5 | VisualStudioCodeCredential | VS Code with Azure extension |
| 6 | AzureCliCredential | `az login` |
| 7 | AzurePowerShellCredential | `Connect-AzAccount` |
| 8 | AzureDeveloperCliCredential | `azd auth login` |

### Customizing DefaultAzureCredential

```python
# Exclude credentials you don't need
credential = DefaultAzureCredential(
    exclude_environment_credential=True,
    exclude_shared_token_cache_credential=True,
    managed_identity_client_id="<user-assigned-mi-client-id>"  # For user-assigned MI
)

# Enable interactive browser (disabled by default)
credential = DefaultAzureCredential(
    exclude_interactive_browser_credential=False
)
```

## Specific Credential Types

### ManagedIdentityCredential

For Azure-hosted resources (VMs, App Service, Functions, AKS):

```python
from azure.identity import ManagedIdentityCredential

# System-assigned managed identity
credential = ManagedIdentityCredential()

# User-assigned managed identity
credential = ManagedIdentityCredential(
    client_id="<user-assigned-mi-client-id>"
)
```

### ClientSecretCredential

For a service principal with a client secret:

```python
import os

from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"]
)
```

### AzureCliCredential

Uses the account from `az login`:

```python
from azure.identity import AzureCliCredential

credential = AzureCliCredential()
```

### ChainedTokenCredential

Custom credential chain:

```python
from azure.identity import (
    ChainedTokenCredential,
    ManagedIdentityCredential,
    AzureCliCredential
)

# Try managed identity first, fall back to CLI
credential = ChainedTokenCredential(
    ManagedIdentityCredential(client_id="<user-assigned-mi-client-id>"),
    AzureCliCredential()
)
```
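The fallback behavior of a chained credential can be sketched in plain Python. The classes below are hypothetical stand-ins for illustration only, not the real SDK types: each credential either returns a token or raises, and the chain moves on to the next one.

```python
class CredentialUnavailable(Exception):
    """Raised by a credential that cannot authenticate in this environment."""

class ChainedCredential:
    """Minimal stand-in for ChainedTokenCredential's fallback semantics."""
    def __init__(self, *credentials):
        self.credentials = credentials

    def get_token(self, scope):
        errors = []
        for cred in self.credentials:
            try:
                return cred.get_token(scope)
            except CredentialUnavailable as e:
                errors.append(str(e))  # try the next credential in the chain
        raise CredentialUnavailable("; ".join(errors))

# Hypothetical credentials for demonstration
class AlwaysUnavailable:
    def get_token(self, scope):
        raise CredentialUnavailable("managed identity endpoint not reachable")

class StaticToken:
    def get_token(self, scope):
        return f"token-for-{scope}"

chain = ChainedCredential(AlwaysUnavailable(), StaticToken())
print(chain.get_token("https://management.azure.com/.default"))
# → token-for-https://management.azure.com/.default
```

The real `ChainedTokenCredential` behaves the same way: it only raises once every credential in the chain has failed.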
## Credential Types Table

| Credential | Use Case | Auth Method |
|------------|----------|-------------|
| `DefaultAzureCredential` | Most scenarios | Auto-detect |
| `ManagedIdentityCredential` | Azure-hosted apps | Managed Identity |
| `ClientSecretCredential` | Service principal | Client secret |
| `ClientCertificateCredential` | Service principal | Certificate |
| `AzureCliCredential` | Local development | Azure CLI |
| `AzureDeveloperCliCredential` | Local development | Azure Developer CLI |
| `InteractiveBrowserCredential` | User sign-in | Browser OAuth |
| `DeviceCodeCredential` | Headless/SSH | Device code flow |

## Getting Tokens Directly

```python
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()

# Get token for a specific scope
token = credential.get_token("https://management.azure.com/.default")
print(f"Token expires: {token.expires_on}")

# For Azure Database for PostgreSQL
token = credential.get_token("https://ossrdbms-aad.database.windows.net/.default")
```
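`expires_on` is a POSIX timestamp in seconds. SDK clients cache and refresh tokens for you, but when you call `get_token` directly it is common to refresh ahead of expiry with a safety margin. A small stdlib-only helper (`needs_refresh` is a hypothetical name, not part of azure-identity):

```python
import time

def needs_refresh(expires_on, margin_seconds=300, now=None):
    """Return True if the token expires within margin_seconds from now."""
    current = time.time() if now is None else now
    return expires_on - current <= margin_seconds

# An already-expired token always needs refresh
assert needs_refresh(expires_on=1_000, now=2_000)
# A token with 10 minutes left and a 5-minute margin does not
assert not needs_refresh(expires_on=2_600, now=2_000)
```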
## Async Client

```python
from azure.identity.aio import DefaultAzureCredential
from azure.storage.blob.aio import BlobServiceClient

async def main():
    credential = DefaultAzureCredential()

    async with BlobServiceClient(
        account_url="https://<account>.blob.core.windows.net",
        credential=credential
    ) as client:
        # ... async operations
        pass

    await credential.close()
```

## Best Practices

1. **Use DefaultAzureCredential** for code that runs locally and in Azure
2. **Never hardcode credentials** — use environment variables or managed identity
3. **Prefer managed identity** in production Azure deployments
4. **Use ChainedTokenCredential** when you need a custom credential order
5. **Close async credentials** explicitly or use context managers
6. **Set AZURE_CLIENT_ID** for user-assigned managed identities
7. **Exclude unused credentials** to speed up authentication
skills/official/microsoft/python/entra/keyvault/SKILL.md
---
name: azure-keyvault-py
description: |
  Azure Key Vault SDK for Python. Use for secrets, keys, and certificates management with secure storage.
  Triggers: "key vault", "SecretClient", "KeyClient", "CertificateClient", "secrets", "encryption keys".
package: azure-keyvault-secrets, azure-keyvault-keys, azure-keyvault-certificates
---

# Azure Key Vault SDK for Python

Secure storage and management for secrets, cryptographic keys, and certificates.

## Installation

```bash
# Secrets
pip install azure-keyvault-secrets azure-identity

# Keys (cryptographic operations)
pip install azure-keyvault-keys azure-identity

# Certificates
pip install azure-keyvault-certificates azure-identity

# All
pip install azure-keyvault-secrets azure-keyvault-keys azure-keyvault-certificates azure-identity
```

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net/
```

## Secrets

### SecretClient Setup

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
vault_url = "https://<vault-name>.vault.azure.net/"

client = SecretClient(vault_url=vault_url, credential=credential)
```

### Secret Operations

```python
# Set secret
secret = client.set_secret("database-password", "super-secret-value")
print(f"Created: {secret.name}, version: {secret.properties.version}")

# Get secret
secret = client.get_secret("database-password")
print(f"Value: {secret.value}")

# Get specific version
secret = client.get_secret("database-password", version="abc123")

# List secrets (names only, not values)
for secret_properties in client.list_properties_of_secrets():
    print(f"Secret: {secret_properties.name}")

# List versions
for version in client.list_properties_of_secret_versions("database-password"):
    print(f"Version: {version.version}, Created: {version.created_on}")

# Delete secret (soft delete)
poller = client.begin_delete_secret("database-password")
deleted_secret = poller.result()

# Purge (permanent delete, if soft-delete enabled)
client.purge_deleted_secret("database-password")

# Recover deleted secret
client.begin_recover_deleted_secret("database-password").result()
```
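The soft-delete lifecycle above (delete, then either recover or purge) can be sketched as a small state machine. This toy class only illustrates the semantics; it is not the SDK:

```python
class VaultSecret:
    """Toy model of Key Vault soft-delete semantics (illustrative only)."""
    def __init__(self, value):
        self.value = value
        self.state = "active"

    def delete(self):
        if self.state != "active":
            raise ValueError("only an active secret can be deleted")
        self.state = "deleted"  # soft-deleted: hidden, but recoverable

    def recover(self):
        if self.state != "deleted":
            raise ValueError("only a soft-deleted secret can be recovered")
        self.state = "active"

    def purge(self):
        if self.state != "deleted":
            raise ValueError("only a soft-deleted secret can be purged")
        self.state = "purged"  # permanently gone

secret = VaultSecret("super-secret-value")
secret.delete()
secret.recover()  # back to "active"
secret.delete()
secret.purge()    # now unrecoverable
```

Purge is only possible while a secret sits in the soft-deleted state, which is why accidental deletions are recoverable by default.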
## Keys

### KeyClient Setup

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

credential = DefaultAzureCredential()
vault_url = "https://<vault-name>.vault.azure.net/"

client = KeyClient(vault_url=vault_url, credential=credential)
```

### Key Operations

```python
# Create RSA key
rsa_key = client.create_rsa_key("rsa-key", size=2048)

# Create EC key
ec_key = client.create_ec_key("ec-key", curve="P-256")

# Get key
key = client.get_key("rsa-key")
print(f"Key type: {key.key_type}")

# List keys
for key_properties in client.list_properties_of_keys():
    print(f"Key: {key_properties.name}")

# Delete key
poller = client.begin_delete_key("rsa-key")
deleted_key = poller.result()
```

### Cryptographic Operations

```python
import hashlib

from azure.keyvault.keys.crypto import (
    CryptographyClient,
    EncryptionAlgorithm,
    SignatureAlgorithm,
)

# Get crypto client for a specific key
crypto_client = CryptographyClient(key, credential=credential)
# Or from key ID
crypto_client = CryptographyClient(
    "https://<vault>.vault.azure.net/keys/<key-name>/<version>",
    credential=credential
)

# Encrypt
plaintext = b"Hello, Key Vault!"
result = crypto_client.encrypt(EncryptionAlgorithm.rsa_oaep, plaintext)
ciphertext = result.ciphertext

# Decrypt
result = crypto_client.decrypt(EncryptionAlgorithm.rsa_oaep, ciphertext)
decrypted = result.plaintext

# Sign
digest = hashlib.sha256(b"data to sign").digest()
result = crypto_client.sign(SignatureAlgorithm.rs256, digest)
signature = result.signature

# Verify
result = crypto_client.verify(SignatureAlgorithm.rs256, digest, signature)
print(f"Valid: {result.is_valid}")
```

## Certificates

### CertificateClient Setup

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.certificates import CertificateClient, CertificatePolicy

credential = DefaultAzureCredential()
vault_url = "https://<vault-name>.vault.azure.net/"

client = CertificateClient(vault_url=vault_url, credential=credential)
```

### Certificate Operations

```python
# Create self-signed certificate
policy = CertificatePolicy.get_default()
poller = client.begin_create_certificate("my-cert", policy=policy)
certificate = poller.result()

# Get certificate
certificate = client.get_certificate("my-cert")
print(f"Thumbprint: {certificate.properties.x509_thumbprint.hex()}")

# Get certificate with private key (as secret)
from azure.keyvault.secrets import SecretClient

secret_client = SecretClient(vault_url=vault_url, credential=credential)
cert_secret = secret_client.get_secret("my-cert")
# cert_secret.value contains PEM or PKCS12

# List certificates
for cert in client.list_properties_of_certificates():
    print(f"Certificate: {cert.name}")

# Delete certificate
poller = client.begin_delete_certificate("my-cert")
deleted = poller.result()
```

## Client Types Table

| Client | Package | Purpose |
|--------|---------|---------|
| `SecretClient` | `azure-keyvault-secrets` | Store/retrieve secrets |
| `KeyClient` | `azure-keyvault-keys` | Manage cryptographic keys |
| `CryptographyClient` | `azure-keyvault-keys` | Encrypt/decrypt/sign/verify |
| `CertificateClient` | `azure-keyvault-certificates` | Manage certificates |

## Async Clients

```python
from azure.identity.aio import DefaultAzureCredential
from azure.keyvault.secrets.aio import SecretClient

vault_url = "https://<vault-name>.vault.azure.net/"

async def get_secret():
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=vault_url, credential=credential)

    async with client:
        secret = await client.get_secret("my-secret")
        print(secret.value)

    await credential.close()

import asyncio
asyncio.run(get_secret())
```

## Error Handling

```python
from azure.core.exceptions import ResourceNotFoundError, HttpResponseError

try:
    secret = client.get_secret("nonexistent")
except ResourceNotFoundError:
    print("Secret not found")
except HttpResponseError as e:
    if e.status_code == 403:
        print("Access denied - check RBAC permissions")
    raise
```

## Best Practices

1. **Use DefaultAzureCredential** for authentication
2. **Use managed identity** in Azure-hosted applications
3. **Enable soft-delete** for recovery (enabled by default)
4. **Use RBAC** over access policies for fine-grained control
5. **Rotate secrets** regularly using versioning
6. **Use Key Vault references** in App Service/Functions config
7. **Cache secrets** appropriately to reduce API calls
8. **Use async clients** for high-throughput scenarios
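Caching secrets (practice 7 above) can be as simple as a TTL wrapper around the fetch call. A hypothetical stdlib-only sketch, not part of the SDK; in real code `fetch` would be something like `lambda name: client.get_secret(name).value`:

```python
import time

class SecretCache:
    """Cache fetched secret values for ttl_seconds to cut Key Vault round trips."""
    def __init__(self, fetch, ttl_seconds=300):
        self.fetch = fetch
        self.ttl = ttl_seconds
        self._store = {}  # name -> (value, fetched_at)

    def get(self, name, now=None):
        current = time.time() if now is None else now
        hit = self._store.get(name)
        if hit is not None and current - hit[1] < self.ttl:
            return hit[0]  # fresh enough: serve from cache
        value = self.fetch(name)
        self._store[name] = (value, current)
        return value

# Demo with a fake fetcher that records each real fetch
calls = []
cache = SecretCache(fetch=lambda name: calls.append(name) or f"value-of-{name}",
                    ttl_seconds=300)
cache.get("db-password", now=0)
cache.get("db-password", now=100)  # within TTL: served from cache
cache.get("db-password", now=400)  # TTL expired: fetched again
print(len(calls))  # → 2
```

Keep the TTL short enough that secret rotation propagates within an acceptable window.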
---
name: agent-framework-azure-ai-py
description: Build Azure AI Foundry agents using the Microsoft Agent Framework Python SDK (agent-framework-azure-ai). Use when creating persistent agents with AzureAIAgentsProvider, using hosted tools (code interpreter, file search, web search), integrating MCP servers, managing conversation threads, or implementing streaming responses. Covers function tools, structured outputs, and multi-tool agents.
package: agent-framework-azure-ai
---

# Agent Framework Azure Hosted Agents

Build persistent agents on Azure AI Foundry using the Microsoft Agent Framework Python SDK.

## Architecture

```
User Query → AzureAIAgentsProvider → Azure AI Agent Service (Persistent)
                    ↓
      Agent.run() / Agent.run_stream()
                    ↓
 Tools: Functions | Hosted (Code/Search/Web) | MCP
                    ↓
     AgentThread (conversation persistence)
```

## Installation

```bash
# Full framework (recommended)
pip install agent-framework --pre

# Or Azure-specific package only
pip install agent-framework-azure-ai --pre
```

## Environment Variables

```bash
export AZURE_AI_PROJECT_ENDPOINT="https://<project>.services.ai.azure.com/api/projects/<project-id>"
export AZURE_AI_MODEL_DEPLOYMENT_NAME="gpt-4o-mini"
export BING_CONNECTION_ID="your-bing-connection-id"  # For web search
```

## Authentication

```python
from azure.identity.aio import AzureCliCredential, DefaultAzureCredential

# Development
credential = AzureCliCredential()

# Production
credential = DefaultAzureCredential()
```

## Core Workflow

### Basic Agent

```python
import asyncio
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="MyAgent",
            instructions="You are a helpful assistant.",
        )

        result = await agent.run("Hello!")
        print(result.text)

asyncio.run(main())
```

### Agent with Function Tools

```python
from typing import Annotated

from pydantic import Field
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

def get_weather(
    location: Annotated[str, Field(description="City name to get weather for")],
) -> str:
    """Get the current weather for a location."""
    return f"Weather in {location}: 72°F, sunny"

def get_current_time() -> str:
    """Get the current UTC time."""
    from datetime import datetime, timezone
    return datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S UTC")

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="WeatherAgent",
            instructions="You help with weather and time queries.",
            tools=[get_weather, get_current_time],  # Pass functions directly
        )

        result = await agent.run("What's the weather in Seattle?")
        print(result.text)
```

### Agent with Hosted Tools

```python
from agent_framework import (
    HostedCodeInterpreterTool,
    HostedFileSearchTool,
    HostedWebSearchTool,
)
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="MultiToolAgent",
            instructions="You can execute code, search files, and search the web.",
            tools=[
                HostedCodeInterpreterTool(),
                HostedWebSearchTool(name="Bing"),
            ],
        )

        result = await agent.run("Calculate the factorial of 20 in Python")
        print(result.text)
```

### Streaming Responses

```python
async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="StreamingAgent",
            instructions="You are a helpful assistant.",
        )

        print("Agent: ", end="", flush=True)
        async for chunk in agent.run_stream("Tell me a short story"):
            if chunk.text:
                print(chunk.text, end="", flush=True)
        print()
```

### Conversation Threads

```python
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="ChatAgent",
            instructions="You are a helpful assistant.",
            tools=[get_weather],
        )

        # Create thread for conversation persistence
        thread = agent.get_new_thread()

        # First turn
        result1 = await agent.run("What's the weather in Seattle?", thread=thread)
        print(f"Agent: {result1.text}")

        # Second turn - context is maintained
        result2 = await agent.run("What about Portland?", thread=thread)
        print(f"Agent: {result2.text}")

        # Save thread ID for later resumption
        print(f"Conversation ID: {thread.conversation_id}")
```

### Structured Outputs

```python
from pydantic import BaseModel, ConfigDict
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

class WeatherResponse(BaseModel):
    model_config = ConfigDict(extra="forbid")

    location: str
    temperature: float
    unit: str
    conditions: str

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="StructuredAgent",
            instructions="Provide weather information in structured format.",
            response_format=WeatherResponse,
        )

        result = await agent.run("Weather in Seattle?")
        weather = WeatherResponse.model_validate_json(result.text)
        print(f"{weather.location}: {weather.temperature}°{weather.unit}")
```
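With `extra="forbid"`, `model_validate_json` rejects payloads that carry unknown fields as well as payloads missing required ones. The same check can be mimicked with the stdlib alone; `validate_weather_json` below is a hypothetical helper for illustration, not pydantic:

```python
import json

EXPECTED_FIELDS = {"location": str, "temperature": float, "unit": str, "conditions": str}

def validate_weather_json(payload):
    """Parse JSON and enforce exactly the WeatherResponse fields (extra='forbid')."""
    data = json.loads(payload)
    extra = set(data) - set(EXPECTED_FIELDS)
    if extra:
        raise ValueError(f"unexpected fields: {sorted(extra)}")
    missing = set(EXPECTED_FIELDS) - set(data)
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    # Coerce each value to its declared type, as pydantic would
    return {name: typ(data[name]) for name, typ in EXPECTED_FIELDS.items()}

ok = validate_weather_json(
    '{"location": "Seattle", "temperature": 72, "unit": "F", "conditions": "sunny"}'
)
print(ok["temperature"])  # coerced to float: 72.0
```

Strict schemas like this are what make `response_format` safe to feed directly into downstream code.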
## Provider Methods

| Method | Description |
|--------|-------------|
| `create_agent()` | Create new agent on Azure AI service |
| `get_agent(agent_id)` | Retrieve existing agent by ID |
| `as_agent(sdk_agent)` | Wrap SDK Agent object (no HTTP call) |

## Hosted Tools Quick Reference

| Tool | Import | Purpose |
|------|--------|---------|
| `HostedCodeInterpreterTool` | `from agent_framework import HostedCodeInterpreterTool` | Execute Python code |
| `HostedFileSearchTool` | `from agent_framework import HostedFileSearchTool` | Search vector stores |
| `HostedWebSearchTool` | `from agent_framework import HostedWebSearchTool` | Bing web search |
| `HostedMCPTool` | `from agent_framework import HostedMCPTool` | Service-managed MCP |
| `MCPStreamableHTTPTool` | `from agent_framework import MCPStreamableHTTPTool` | Client-managed MCP |

## Complete Example

```python
import asyncio
from typing import Annotated

from pydantic import BaseModel, Field
from agent_framework import (
    HostedCodeInterpreterTool,
    HostedWebSearchTool,
    MCPStreamableHTTPTool,
)
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential


def get_weather(
    location: Annotated[str, Field(description="City name")],
) -> str:
    """Get weather for a location."""
    return f"Weather in {location}: 72°F, sunny"


class AnalysisResult(BaseModel):
    summary: str
    key_findings: list[str]
    confidence: float


async def main():
    async with (
        AzureCliCredential() as credential,
        MCPStreamableHTTPTool(
            name="Docs MCP",
            url="https://learn.microsoft.com/api/mcp",
        ) as mcp_tool,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="ResearchAssistant",
            instructions="You are a research assistant with multiple capabilities.",
            tools=[
                get_weather,
                HostedCodeInterpreterTool(),
                HostedWebSearchTool(name="Bing"),
                mcp_tool,
            ],
        )

        thread = agent.get_new_thread()

        # Non-streaming
        result = await agent.run(
            "Search for Python best practices and summarize",
            thread=thread,
        )
        print(f"Response: {result.text}")

        # Streaming
        print("\nStreaming: ", end="")
        async for chunk in agent.run_stream("Continue with examples", thread=thread):
            if chunk.text:
                print(chunk.text, end="", flush=True)
        print()

        # Structured output
        result = await agent.run(
            "Analyze findings",
            thread=thread,
            response_format=AnalysisResult,
        )
        analysis = AnalysisResult.model_validate_json(result.text)
        print(f"\nConfidence: {analysis.confidence}")


if __name__ == "__main__":
    asyncio.run(main())
```

## Conventions

- Always use async context managers: `async with provider:`
- Pass functions directly to the `tools=` parameter (auto-converted to AIFunction)
- Use `Annotated[type, Field(description=...)]` for function parameters
- Use `get_new_thread()` for multi-turn conversations
- Prefer `HostedMCPTool` for service-managed MCP, `MCPStreamableHTTPTool` for client-managed

## Reference Files

- [references/tools.md](references/tools.md): Detailed hosted tool patterns
- [references/mcp.md](references/mcp.md): MCP integration (hosted + local)
- [references/threads.md](references/threads.md): Thread and conversation management
- [references/advanced.md](references/advanced.md): OpenAPI, citations, structured outputs
skills/official/microsoft/python/foundry/agents-v2/SKILL.md
---
name: agents-v2-py
description: |
  Build container-based Foundry Agents using the Azure AI Projects SDK with ImageBasedHostedAgentDefinition.
  Use when creating hosted agents that run custom code in Azure AI Foundry with your own container images.
  Triggers: "ImageBasedHostedAgentDefinition", "hosted agent", "container agent", "Foundry Agent",
  "create_version", "ProtocolVersionRecord", "AgentProtocol.RESPONSES", "custom agent image".
package: azure-ai-projects
---

# Azure AI Hosted Agents (Python)

Build container-based hosted agents using `ImageBasedHostedAgentDefinition` from the Azure AI Projects SDK.

## Installation

```bash
pip install "azure-ai-projects>=2.0.0b3" azure-identity
```

**Minimum SDK Version:** `2.0.0b3` or later is required for hosted agent support.

## Environment Variables

```bash
AZURE_AI_PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
```

## Prerequisites

Before creating hosted agents:

1. **Container Image** - Build and push to Azure Container Registry (ACR)
2. **ACR Pull Permissions** - Grant your project's managed identity the `AcrPull` role on the ACR
3. **Capability Host** - Account-level capability host with `enablePublicHostingEnvironment=true`
4. **SDK Version** - Ensure `azure-ai-projects>=2.0.0b3`

## Authentication

Always use `DefaultAzureCredential`:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

credential = DefaultAzureCredential()
client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=credential
)
```

## Core Workflow

### 1. Imports

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import (
    ImageBasedHostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)
```

### 2. Create Hosted Agent

```python
client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential()
)

agent = client.agents.create_version(
    agent_name="my-hosted-agent",
    definition=ImageBasedHostedAgentDefinition(
        container_protocol_versions=[
            ProtocolVersionRecord(protocol=AgentProtocol.RESPONSES, version="v1")
        ],
        cpu="1",
        memory="2Gi",
        image="myregistry.azurecr.io/my-agent:latest",
        tools=[{"type": "code_interpreter"}],
        environment_variables={
            "AZURE_AI_PROJECT_ENDPOINT": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
            "MODEL_NAME": "gpt-4o-mini"
        }
    )
)

print(f"Created agent: {agent.name} (version: {agent.version})")
```

### 3. List Agent Versions

```python
versions = client.agents.list_versions(agent_name="my-hosted-agent")
for version in versions:
    print(f"Version: {version.version}, State: {version.state}")
```

### 4. Delete Agent Version

```python
client.agents.delete_version(
    agent_name="my-hosted-agent",
    version=agent.version
)
```

## ImageBasedHostedAgentDefinition Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `container_protocol_versions` | `list[ProtocolVersionRecord]` | Yes | Protocol versions the agent supports |
| `image` | `str` | Yes | Full container image path (registry/image:tag) |
| `cpu` | `str` | No | CPU allocation (e.g., "1", "2") |
| `memory` | `str` | No | Memory allocation (e.g., "2Gi", "4Gi") |
| `tools` | `list[dict]` | No | Tools available to the agent |
| `environment_variables` | `dict[str, str]` | No | Environment variables for the container |

## Protocol Versions

The `container_protocol_versions` parameter specifies which protocols your agent supports:

```python
from azure.ai.projects.models import ProtocolVersionRecord, AgentProtocol

# RESPONSES protocol - standard agent responses
container_protocol_versions=[
    ProtocolVersionRecord(protocol=AgentProtocol.RESPONSES, version="v1")
]
```

**Available Protocols:**

| Protocol | Description |
|----------|-------------|
| `AgentProtocol.RESPONSES` | Standard response protocol for agent interactions |

## Resource Allocation

Specify CPU and memory for your container:

```python
definition=ImageBasedHostedAgentDefinition(
    container_protocol_versions=[...],
    image="myregistry.azurecr.io/my-agent:latest",
    cpu="2",      # 2 CPU cores
    memory="4Gi"  # 4 GiB memory
)
```

**Resource Limits:**

| Resource | Min | Max | Default |
|----------|-----|-----|---------|
| CPU | 0.5 | 4 | 1 |
| Memory | 1Gi | 8Gi | 2Gi |
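The limits above can be checked client-side before calling `create_version` to fail fast on bad input. `validate_resources` is a hypothetical helper (the service enforces the real limits):

```python
def validate_resources(cpu, memory):
    """Check cpu ("0.5".."4" cores) and memory ("1Gi".."8Gi") against the documented limits."""
    cpu_cores = float(cpu)
    if not 0.5 <= cpu_cores <= 4:
        raise ValueError(f"cpu {cpu!r} outside 0.5-4 cores")
    if not memory.endswith("Gi"):
        raise ValueError(f"memory {memory!r} must be in Gi units")
    gib = float(memory[:-2])
    if not 1 <= gib <= 8:
        raise ValueError(f"memory {memory!r} outside 1Gi-8Gi")
    return cpu_cores, gib

print(validate_resources("2", "4Gi"))  # → (2.0, 4.0)
```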
|
||||
## Tools Configuration
|
||||
|
||||
Add tools to your hosted agent:
|
||||
|
||||
### Code Interpreter
|
||||
|
||||
```python
|
||||
tools=[{"type": "code_interpreter"}]
|
||||
```
|
||||
|
||||
### MCP Tools
|
||||
|
||||
```python
|
||||
tools=[
|
||||
{"type": "code_interpreter"},
|
||||
{
|
||||
"type": "mcp",
|
||||
"server_label": "my-mcp-server",
|
||||
"server_url": "https://my-mcp-server.example.com"
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
### Multiple Tools
|
||||
|
||||
```python
|
||||
tools=[
|
||||
{"type": "code_interpreter"},
|
||||
{"type": "file_search"},
|
||||
{
|
||||
"type": "mcp",
|
||||
"server_label": "custom-tool",
|
||||
"server_url": "https://custom-tool.example.com"
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
Pass configuration to your container:
|
||||
|
||||
```python
|
||||
environment_variables={
|
||||
"AZURE_AI_PROJECT_ENDPOINT": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
|
||||
"MODEL_NAME": "gpt-4o-mini",
|
||||
"LOG_LEVEL": "INFO",
|
||||
"CUSTOM_CONFIG": "value"
|
||||
}
|
||||
```
|
||||
|
||||
**Best Practice:** Never hardcode secrets. Use environment variables or Azure Key Vault.
## Complete Example

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import (
    ImageBasedHostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)


def create_hosted_agent():
    """Create a hosted agent with custom container image."""
    client = AIProjectClient(
        endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
        credential=DefaultAzureCredential()
    )

    agent = client.agents.create_version(
        agent_name="data-processor-agent",
        definition=ImageBasedHostedAgentDefinition(
            container_protocol_versions=[
                ProtocolVersionRecord(
                    protocol=AgentProtocol.RESPONSES,
                    version="v1"
                )
            ],
            image="myregistry.azurecr.io/data-processor:v1.0",
            cpu="2",
            memory="4Gi",
            tools=[
                {"type": "code_interpreter"},
                {"type": "file_search"}
            ],
            environment_variables={
                "AZURE_AI_PROJECT_ENDPOINT": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
                "MODEL_NAME": "gpt-4o-mini",
                "MAX_RETRIES": "3"
            }
        )
    )

    print(f"Created hosted agent: {agent.name}")
    print(f"Version: {agent.version}")
    print(f"State: {agent.state}")

    return agent


if __name__ == "__main__":
    create_hosted_agent()
```

## Async Pattern

```python
import os

from azure.identity.aio import DefaultAzureCredential
from azure.ai.projects.aio import AIProjectClient
from azure.ai.projects.models import (
    ImageBasedHostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)


async def create_hosted_agent_async():
    """Create a hosted agent asynchronously."""
    async with DefaultAzureCredential() as credential:
        async with AIProjectClient(
            endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
            credential=credential
        ) as client:
            agent = await client.agents.create_version(
                agent_name="async-agent",
                definition=ImageBasedHostedAgentDefinition(
                    container_protocol_versions=[
                        ProtocolVersionRecord(
                            protocol=AgentProtocol.RESPONSES,
                            version="v1"
                        )
                    ],
                    image="myregistry.azurecr.io/async-agent:latest",
                    cpu="1",
                    memory="2Gi"
                )
            )
            return agent
```

## Common Errors

| Error | Cause | Solution |
|-------|-------|----------|
| `ImagePullBackOff` | ACR pull permission denied | Grant `AcrPull` role to project's managed identity |
| `InvalidContainerImage` | Image not found | Verify image path and tag exist in ACR |
| `CapabilityHostNotFound` | No capability host configured | Create account-level capability host |
| `ProtocolVersionNotSupported` | Invalid protocol version | Use `AgentProtocol.RESPONSES` with version `"v1"` |

## Best Practices

1. **Version Your Images** - Use specific tags, not `latest`, in production
2. **Minimal Resources** - Start with minimum CPU/memory, scale up as needed
3. **Environment Variables** - Use for all configuration, never hardcode
4. **Error Handling** - Wrap agent creation in try/except blocks
5. **Cleanup** - Delete unused agent versions to free resources
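Best practice 4 can be sketched as a small wrapper. `create_with_retry` is a hypothetical helper, not part of the SDK; real code would catch `azure.core.exceptions.HttpResponseError` rather than bare `Exception`:

```python
def create_with_retry(create_fn, max_attempts=3):
    """Call create_fn(), retrying on failure, and re-raise with context."""
    last_error = None
    for _ in range(max_attempts):
        try:
            return create_fn()
        except Exception as exc:  # narrow this to service errors in real code
            last_error = exc
    raise RuntimeError(
        f"agent creation failed after {max_attempts} attempts"
    ) from last_error
```

Used as `create_with_retry(lambda: client.agents.create_version(...))`.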
## Reference Links

- [Azure AI Projects SDK](https://pypi.org/project/azure-ai-projects/)
- [Hosted Agents Documentation](https://learn.microsoft.com/azure/ai-services/agents/how-to/hosted-agents)
- [Azure Container Registry](https://learn.microsoft.com/azure/container-registry/)

214 skills/official/microsoft/python/foundry/contentsafety/SKILL.md Normal file
@@ -0,0 +1,214 @@
---
name: azure-ai-contentsafety-py
description: |
  Azure AI Content Safety SDK for Python. Use for detecting harmful content in text and images with multi-severity classification.
  Triggers: "azure-ai-contentsafety", "ContentSafetyClient", "content moderation", "harmful content", "text analysis", "image analysis".
package: azure-ai-contentsafety
---

# Azure AI Content Safety SDK for Python

Detect harmful user-generated and AI-generated content in applications.

## Installation

```bash
pip install azure-ai-contentsafety
```

## Environment Variables

```bash
CONTENT_SAFETY_ENDPOINT=https://<resource>.cognitiveservices.azure.com
CONTENT_SAFETY_KEY=<your-api-key>
```

## Authentication

### API Key

```python
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"])
)
```

### Entra ID

```python
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.identity import DefaultAzureCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Analyze Text

```python
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions, TextCategory
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(endpoint, AzureKeyCredential(key))

request = AnalyzeTextOptions(text="Your text content to analyze")
response = client.analyze_text(request)

# Check each category
for category in [TextCategory.HATE, TextCategory.SELF_HARM,
                 TextCategory.SEXUAL, TextCategory.VIOLENCE]:
    result = next((r for r in response.categories_analysis
                   if r.category == category), None)
    if result:
        print(f"{category}: severity {result.severity}")
```

## Analyze Image

```python
import base64

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(endpoint, AzureKeyCredential(key))

# From file
with open("image.jpg", "rb") as f:
    image_data = base64.b64encode(f.read()).decode("utf-8")

request = AnalyzeImageOptions(
    image=ImageData(content=image_data)
)

response = client.analyze_image(request)

for result in response.categories_analysis:
    print(f"{result.category}: severity {result.severity}")
```

### Image from URL

```python
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData

request = AnalyzeImageOptions(
    image=ImageData(blob_url="https://example.com/image.jpg")
)

response = client.analyze_image(request)
```

## Text Blocklist Management

### Create Blocklist

```python
from azure.ai.contentsafety import BlocklistClient
from azure.ai.contentsafety.models import TextBlocklist
from azure.core.credentials import AzureKeyCredential

blocklist_client = BlocklistClient(endpoint, AzureKeyCredential(key))

blocklist = TextBlocklist(
    blocklist_name="my-blocklist",
    description="Custom terms to block"
)

result = blocklist_client.create_or_update_text_blocklist(
    blocklist_name="my-blocklist",
    options=blocklist
)
```

### Add Block Items

```python
from azure.ai.contentsafety.models import AddOrUpdateTextBlocklistItemsOptions, TextBlocklistItem

items = AddOrUpdateTextBlocklistItemsOptions(
    blocklist_items=[
        TextBlocklistItem(text="blocked-term-1"),
        TextBlocklistItem(text="blocked-term-2")
    ]
)

result = blocklist_client.add_or_update_blocklist_items(
    blocklist_name="my-blocklist",
    options=items
)
```

### Analyze with Blocklist

```python
from azure.ai.contentsafety.models import AnalyzeTextOptions

request = AnalyzeTextOptions(
    text="Text containing blocked-term-1",
    blocklist_names=["my-blocklist"],
    halt_on_blocklist_hit=True
)

response = client.analyze_text(request)

if response.blocklists_match:
    for match in response.blocklists_match:
        print(f"Blocked: {match.blocklist_item_text}")
```

## Severity Levels

Text analysis returns 4 severity levels (0, 2, 4, 6) by default. For 8 levels (0-7):

```python
from azure.ai.contentsafety.models import AnalyzeTextOptions, AnalyzeTextOutputType

request = AnalyzeTextOptions(
    text="Your text",
    output_type=AnalyzeTextOutputType.EIGHT_SEVERITY_LEVELS
)
```

## Harm Categories

| Category | Description |
|----------|-------------|
| `Hate` | Attacks based on identity (race, religion, gender, etc.) |
| `Sexual` | Sexual content, relationships, anatomy |
| `Violence` | Physical harm, weapons, injury |
| `SelfHarm` | Self-injury, suicide, eating disorders |

## Severity Scale

| Level | Text Range | Image Range | Meaning |
|-------|------------|-------------|---------|
| 0 | Safe | Safe | No harmful content |
| 2 | Low | Low | Mild references |
| 4 | Medium | Medium | Moderate content |
| 6 | High | High | Severe content |

## Client Types

| Client | Purpose |
|--------|---------|
| `ContentSafetyClient` | Analyze text and images |
| `BlocklistClient` | Manage custom blocklists |

## Best Practices

1. **Use blocklists** for domain-specific terms
2. **Set severity thresholds** appropriate for your use case
3. **Handle multiple categories** — content can be harmful in multiple ways
4. **Use halt_on_blocklist_hit** for immediate rejection
5. **Log analysis results** for audit and improvement
6. **Consider 8-severity mode** for finer-grained control
7. **Pre-moderate AI outputs** before showing to users
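Best practice 2 amounts to a small post-processing step on `response.categories_analysis`. The `CategoryResult` stand-in below is hypothetical and only mirrors the `.category`/`.severity` attributes of the real result objects:

```python
from collections import namedtuple

# Hypothetical stand-in for the SDK's per-category result objects.
CategoryResult = namedtuple("CategoryResult", ["category", "severity"])


def flagged_categories(categories_analysis, threshold=4):
    """Return the category names at or above the severity threshold."""
    return [r.category for r in categories_analysis if r.severity >= threshold]
```

In real code you would pass `response.categories_analysis` and reject or escalate content whenever the returned list is non-empty.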
@@ -0,0 +1,273 @@
---
name: azure-ai-contentunderstanding-py
description: |
  Azure AI Content Understanding SDK for Python. Use for multimodal content extraction from documents, images, audio, and video.
  Triggers: "azure-ai-contentunderstanding", "ContentUnderstandingClient", "multimodal analysis", "document extraction", "video analysis", "audio transcription".
package: azure-ai-contentunderstanding
---

# Azure AI Content Understanding SDK for Python

Multimodal AI service that extracts semantic content from documents, video, audio, and image files for RAG and automated workflows.

## Installation

```bash
pip install azure-ai-contentunderstanding
```

## Environment Variables

```bash
CONTENTUNDERSTANDING_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
```

## Authentication

```python
import os

from azure.ai.contentunderstanding import ContentUnderstandingClient
from azure.identity import DefaultAzureCredential

endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
credential = DefaultAzureCredential()
client = ContentUnderstandingClient(endpoint=endpoint, credential=credential)
```

## Core Workflow

Content Understanding operations are asynchronous long-running operations:

1. **Begin Analysis** — Start the analysis operation with `begin_analyze()` (returns a poller)
2. **Poll for Results** — Poll until analysis completes (the SDK handles this with `.result()`)
3. **Process Results** — Extract structured results from `AnalyzeResult.contents`

## Prebuilt Analyzers

| Analyzer | Content Type | Purpose |
|----------|--------------|---------|
| `prebuilt-documentSearch` | Documents | Extract markdown for RAG applications |
| `prebuilt-imageSearch` | Images | Extract content from images |
| `prebuilt-audioSearch` | Audio | Transcribe audio with timing |
| `prebuilt-videoSearch` | Video | Extract frames, transcripts, summaries |
| `prebuilt-invoice` | Documents | Extract invoice fields |

## Analyze Document

```python
import os

from azure.ai.contentunderstanding import ContentUnderstandingClient
from azure.ai.contentunderstanding.models import AnalyzeInput
from azure.identity import DefaultAzureCredential

endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
client = ContentUnderstandingClient(
    endpoint=endpoint,
    credential=DefaultAzureCredential()
)

# Analyze document from URL
poller = client.begin_analyze(
    analyzer_id="prebuilt-documentSearch",
    inputs=[AnalyzeInput(url="https://example.com/document.pdf")]
)

result = poller.result()

# Access markdown content (contents is a list)
content = result.contents[0]
print(content.markdown)
```

## Access Document Content Details

```python
from azure.ai.contentunderstanding.models import MediaContentKind, DocumentContent

content = result.contents[0]
if content.kind == MediaContentKind.DOCUMENT:
    document_content: DocumentContent = content  # type: ignore
    print(document_content.start_page_number)
```

## Analyze Image

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="prebuilt-imageSearch",
    inputs=[AnalyzeInput(url="https://example.com/image.jpg")]
)
result = poller.result()
content = result.contents[0]
print(content.markdown)
```

## Analyze Video

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="prebuilt-videoSearch",
    inputs=[AnalyzeInput(url="https://example.com/video.mp4")]
)

result = poller.result()

# Access video content (AudioVisualContent)
content = result.contents[0]

# Get transcript phrases with timing
for phrase in content.transcript_phrases:
    print(f"[{phrase.start_time} - {phrase.end_time}]: {phrase.text}")

# Get key frames (for video)
for frame in content.key_frames:
    print(f"Frame at {frame.time}: {frame.description}")
```

## Analyze Audio

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="prebuilt-audioSearch",
    inputs=[AnalyzeInput(url="https://example.com/audio.mp3")]
)

result = poller.result()

# Access audio transcript
content = result.contents[0]
for phrase in content.transcript_phrases:
    print(f"[{phrase.start_time}] {phrase.text}")
```

## Custom Analyzers

Create custom analyzers with field schemas for specialized extraction:

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

# Create custom analyzer
analyzer = client.create_analyzer(
    analyzer_id="my-invoice-analyzer",
    analyzer={
        "description": "Custom invoice analyzer",
        "base_analyzer_id": "prebuilt-documentSearch",
        "field_schema": {
            "fields": {
                "vendor_name": {"type": "string"},
                "invoice_total": {"type": "number"},
                "line_items": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "description": {"type": "string"},
                            "amount": {"type": "number"}
                        }
                    }
                }
            }
        }
    }
)

# Use custom analyzer
poller = client.begin_analyze(
    analyzer_id="my-invoice-analyzer",
    inputs=[AnalyzeInput(url="https://example.com/invoice.pdf")]
)

result = poller.result()

# Access extracted fields
print(result.fields["vendor_name"])
print(result.fields["invoice_total"])
```

## Analyzer Management

```python
# List all analyzers
analyzers = client.list_analyzers()
for analyzer in analyzers:
    print(f"{analyzer.analyzer_id}: {analyzer.description}")

# Get specific analyzer
analyzer = client.get_analyzer("prebuilt-documentSearch")

# Delete custom analyzer
client.delete_analyzer("my-custom-analyzer")
```

## Async Client

```python
import asyncio
import os

from azure.ai.contentunderstanding.aio import ContentUnderstandingClient
from azure.ai.contentunderstanding.models import AnalyzeInput
from azure.identity.aio import DefaultAzureCredential


async def analyze_document():
    endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
    credential = DefaultAzureCredential()

    async with ContentUnderstandingClient(
        endpoint=endpoint,
        credential=credential
    ) as client:
        poller = await client.begin_analyze(
            analyzer_id="prebuilt-documentSearch",
            inputs=[AnalyzeInput(url="https://example.com/doc.pdf")]
        )
        result = await poller.result()
        content = result.contents[0]
        return content.markdown


asyncio.run(analyze_document())
```

## Content Types

| Class | For | Provides |
|-------|-----|----------|
| `DocumentContent` | PDF, images, Office docs | Pages, tables, figures, paragraphs |
| `AudioVisualContent` | Audio, video files | Transcript phrases, timing, key frames |

Both derive from `MediaContent`, which provides basic info and a markdown representation.
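Branching on the content kind from the table above can be sketched as follows. The `"document"`/`"audioVisual"` strings and attribute names are assumptions standing in for `MediaContentKind` members and the real content classes:

```python
def summarize_content(content):
    """Return a one-line summary for a MediaContent-like object."""
    if content.kind == "document":
        return f"document starting at page {content.start_page_number}"
    if content.kind == "audioVisual":
        return f"audio/visual with {len(content.transcript_phrases)} transcript phrase(s)"
    return "unknown content kind"
```

Real code would compare `content.kind` against `MediaContentKind.DOCUMENT` and friends instead of raw strings.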
## Model Imports

```python
from azure.ai.contentunderstanding.models import (
    AnalyzeInput,
    AnalyzeResult,
    MediaContentKind,
    DocumentContent,
    AudioVisualContent,
)
```

## Client Types

| Client | Purpose |
|--------|---------|
| `ContentUnderstandingClient` | Sync client for all operations |
| `ContentUnderstandingClient` (aio) | Async client for all operations |

## Best Practices

1. **Use `begin_analyze` with `AnalyzeInput`** — this is the correct method signature
2. **Access results via `result.contents[0]`** — results are returned as a list
3. **Use prebuilt analyzers** for common scenarios (document/image/audio/video search)
4. **Create custom analyzers** only for domain-specific field extraction
5. **Use the async client** for high-throughput scenarios with `azure.identity.aio` credentials
6. **Handle long-running operations** — video/audio analysis can take minutes
7. **Use URL sources** when possible to avoid upload overhead

271 skills/official/microsoft/python/foundry/ml/SKILL.md Normal file
@@ -0,0 +1,271 @@
---
name: azure-ai-ml-py
description: |
  Azure Machine Learning SDK v2 for Python. Use for ML workspaces, jobs, models, datasets, compute, and pipelines.
  Triggers: "azure-ai-ml", "MLClient", "workspace", "model registry", "training jobs", "datasets".
package: azure-ai-ml
---

# Azure Machine Learning SDK v2 for Python

Client library for managing Azure ML resources: workspaces, jobs, models, data, and compute.

## Installation

```bash
pip install azure-ai-ml
```

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_ML_WORKSPACE_NAME=<your-workspace-name>
```

## Authentication

```python
import os

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"],
    resource_group_name=os.environ["AZURE_RESOURCE_GROUP"],
    workspace_name=os.environ["AZURE_ML_WORKSPACE_NAME"]
)
```

### From Config File

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Uses config.json in the current directory or a parent
ml_client = MLClient.from_config(
    credential=DefaultAzureCredential()
)
```

## Workspace Management

### Create Workspace

```python
from azure.ai.ml.entities import Workspace

ws = Workspace(
    name="my-workspace",
    location="eastus",
    display_name="My Workspace",
    description="ML workspace for experiments",
    tags={"purpose": "demo"}
)

ml_client.workspaces.begin_create(ws).result()
```

### List Workspaces

```python
for ws in ml_client.workspaces.list():
    print(f"{ws.name}: {ws.location}")
```

## Data Assets

### Register Data

```python
from azure.ai.ml.entities import Data
from azure.ai.ml.constants import AssetTypes

# Register a file
my_data = Data(
    name="my-dataset",
    version="1",
    path="azureml://datastores/workspaceblobstore/paths/data/train.csv",
    type=AssetTypes.URI_FILE,
    description="Training data"
)

ml_client.data.create_or_update(my_data)
```

### Register Folder

```python
my_data = Data(
    name="my-folder-dataset",
    version="1",
    path="azureml://datastores/workspaceblobstore/paths/data/",
    type=AssetTypes.URI_FOLDER
)

ml_client.data.create_or_update(my_data)
```

## Model Registry

### Register Model

```python
from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes

model = Model(
    name="my-model",
    version="1",
    path="./model/",
    type=AssetTypes.CUSTOM_MODEL,
    description="My trained model"
)

ml_client.models.create_or_update(model)
```

### List Models

```python
for model in ml_client.models.list(name="my-model"):
    print(f"{model.name} v{model.version}")
```

## Compute

### Create Compute Cluster

```python
from azure.ai.ml.entities import AmlCompute

cluster = AmlCompute(
    name="cpu-cluster",
    type="amlcompute",
    size="Standard_DS3_v2",
    min_instances=0,
    max_instances=4,
    idle_time_before_scale_down=120
)

ml_client.compute.begin_create_or_update(cluster).result()
```

### List Compute

```python
for compute in ml_client.compute.list():
    print(f"{compute.name}: {compute.type}")
```

## Jobs

### Command Job

```python
from azure.ai.ml import command, Input

job = command(
    code="./src",
    command="python train.py --data ${{inputs.data}} --lr ${{inputs.learning_rate}}",
    inputs={
        "data": Input(type="uri_folder", path="azureml:my-dataset:1"),
        "learning_rate": 0.01
    },
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",
    display_name="training-job"
)

returned_job = ml_client.jobs.create_or_update(job)
print(f"Job URL: {returned_job.studio_url}")
```

### Monitor Job

```python
ml_client.jobs.stream(returned_job.name)
```

## Pipelines

```python
from azure.ai.ml import dsl, Input

# prep_component and train_component are previously loaded/registered components

@dsl.pipeline(
    compute="cpu-cluster",
    description="Training pipeline"
)
def training_pipeline(data_input):
    prep_step = prep_component(data=data_input)
    train_step = train_component(
        data=prep_step.outputs.output_data,
        learning_rate=0.01
    )
    return {"model": train_step.outputs.model}


pipeline = training_pipeline(
    data_input=Input(type="uri_folder", path="azureml:my-dataset:1")
)

pipeline_job = ml_client.jobs.create_or_update(pipeline)
```

## Environments

### Create Custom Environment

```python
from azure.ai.ml.entities import Environment

env = Environment(
    name="my-env",
    version="1",
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
    conda_file="./environment.yml"
)

ml_client.environments.create_or_update(env)
```

## Datastores

### List Datastores

```python
for ds in ml_client.datastores.list():
    print(f"{ds.name}: {ds.type}")
```

### Get Default Datastore

```python
default_ds = ml_client.datastores.get_default()
print(f"Default: {default_ds.name}")
```

## MLClient Operations

| Property | Operations |
|----------|------------|
| `workspaces` | create, get, list, delete |
| `jobs` | create_or_update, get, list, stream, cancel |
| `models` | create_or_update, get, list, archive |
| `data` | create_or_update, get, list |
| `compute` | begin_create_or_update, get, list, delete |
| `environments` | create_or_update, get, list |
| `datastores` | create_or_update, get, list, get_default |
| `components` | create_or_update, get, list |

## Best Practices

1. **Use versioning** for data, models, and environments
2. **Configure idle scale-down** to reduce compute costs
3. **Use environments** for reproducible training
4. **Stream job logs** to monitor progress
5. **Register models** after successful training jobs
6. **Use pipelines** for multi-step workflows
7. **Tag resources** for organization and cost tracking
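Best practice 1 often means picking the next integer version automatically. In real code the existing version strings would come from `ml_client.models.list(name=...)`; the helper itself is a hypothetical sketch:

```python
def next_model_version(existing_versions):
    """Return the next integer version string, starting at "1"."""
    numeric = [int(v) for v in existing_versions if str(v).isdigit()]
    return str(max(numeric) + 1) if numeric else "1"
```

Non-numeric labels (e.g. `"experimental"`) are simply skipped when computing the next number.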
295 skills/official/microsoft/python/foundry/projects/SKILL.md Normal file
@@ -0,0 +1,295 @@
---
name: azure-ai-projects-py
description: Build AI applications using the Azure AI Projects Python SDK (azure-ai-projects). Use when working with Foundry project clients, creating versioned agents with PromptAgentDefinition, running evaluations, managing connections/deployments/datasets/indexes, or using OpenAI-compatible clients. This is the high-level Foundry SDK - for low-level agent operations, use the azure-ai-agents-python skill.
package: azure-ai-projects
---

# Azure AI Projects Python SDK (Foundry SDK)

Build AI applications on Microsoft Foundry using the `azure-ai-projects` SDK.

## Installation

```bash
pip install azure-ai-projects azure-identity
```

## Environment Variables

```bash
AZURE_AI_PROJECT_ENDPOINT="https://<resource>.services.ai.azure.com/api/projects/<project>"
AZURE_AI_MODEL_DEPLOYMENT_NAME="gpt-4o-mini"
```

## Authentication

```python
import os

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

credential = DefaultAzureCredential()
client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=credential,
)
```

## Client Operations Overview

| Operation | Access | Purpose |
|-----------|--------|---------|
| `client.agents` | `.agents.*` | Agent CRUD, versions, threads, runs |
| `client.connections` | `.connections.*` | List/get project connections |
| `client.deployments` | `.deployments.*` | List model deployments |
| `client.datasets` | `.datasets.*` | Dataset management |
| `client.indexes` | `.indexes.*` | Index management |
| `client.evaluations` | `.evaluations.*` | Run evaluations |
| `client.red_teams` | `.red_teams.*` | Red team operations |

## Two Client Approaches

### 1. AIProjectClient (Native Foundry)

```python
from azure.ai.projects import AIProjectClient

client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

# Use Foundry-native operations
agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="my-agent",
    instructions="You are helpful.",
)
```

### 2. OpenAI-Compatible Client

```python
# Get OpenAI-compatible client from the project
openai_client = client.get_openai_client()

# Use the standard OpenAI API
response = openai_client.chat.completions.create(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    messages=[{"role": "user", "content": "Hello!"}],
)
```

## Agent Operations

### Create Agent (Basic)

```python
agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="my-agent",
    instructions="You are a helpful assistant.",
)
```

### Create Agent with Tools

```python
from azure.ai.agents import CodeInterpreterTool, FileSearchTool

agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="tool-agent",
    instructions="You can execute code and search files.",
    tools=[CodeInterpreterTool(), FileSearchTool()],
)
```

### Versioned Agents with PromptAgentDefinition

```python
from azure.ai.projects.models import PromptAgentDefinition

# Create a versioned agent
agent_version = client.agents.create_version(
    agent_name="customer-support-agent",
    definition=PromptAgentDefinition(
        model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
        instructions="You are a customer support specialist.",
        tools=[],  # Add tools as needed
    ),
    version_label="v1.0",
)
```

See [references/agents.md](references/agents.md) for detailed agent patterns.

## Tools Overview

| Tool | Class | Use Case |
|------|-------|----------|
| Code Interpreter | `CodeInterpreterTool` | Execute Python, generate files |
| File Search | `FileSearchTool` | RAG over uploaded documents |
| Bing Grounding | `BingGroundingTool` | Web search (requires connection) |
| Azure AI Search | `AzureAISearchTool` | Search your indexes |
| Function Calling | `FunctionTool` | Call your Python functions |
| OpenAPI | `OpenApiTool` | Call REST APIs |
| MCP | `McpTool` | Model Context Protocol servers |
| Memory Search | `MemorySearchTool` | Search agent memory stores |
| SharePoint | `SharepointGroundingTool` | Search SharePoint content |

See [references/tools.md](references/tools.md) for all tool patterns.

## Thread and Message Flow

```python
# 1. Create thread
thread = client.agents.threads.create()

# 2. Add message
client.agents.messages.create(
    thread_id=thread.id,
    role="user",
    content="What's the weather like?",
)

# 3. Create and process run
run = client.agents.runs.create_and_process(
    thread_id=thread.id,
    agent_id=agent.id,
)

# 4. Get response
if run.status == "completed":
    messages = client.agents.messages.list(thread_id=thread.id)
    for msg in messages:
        if msg.role == "assistant":
            print(msg.content[0].text.value)
```
|
||||
|
||||
## Connections
|
||||
|
||||
```python
|
||||
# List all connections
|
||||
connections = client.connections.list()
|
||||
for conn in connections:
|
||||
print(f"{conn.name}: {conn.connection_type}")
|
||||
|
||||
# Get specific connection
|
||||
connection = client.connections.get(connection_name="my-search-connection")
|
||||
```
|
||||
|
||||
See [references/connections.md](references/connections.md) for connection patterns.
|
||||
|
||||
## Deployments
|
||||
|
||||
```python
|
||||
# List available model deployments
|
||||
deployments = client.deployments.list()
|
||||
for deployment in deployments:
|
||||
print(f"{deployment.name}: {deployment.model}")
|
||||
```
|
||||
|
||||
See [references/deployments.md](references/deployments.md) for deployment patterns.
|
||||
|
||||
## Datasets and Indexes
|
||||
|
||||
```python
|
||||
# List datasets
|
||||
datasets = client.datasets.list()
|
||||
|
||||
# List indexes
|
||||
indexes = client.indexes.list()
|
||||
```
|
||||
|
||||
See [references/datasets-indexes.md](references/datasets-indexes.md) for data operations.
|
||||
|
||||
## Evaluation
|
||||
|
||||
```python
|
||||
# Using OpenAI client for evals
|
||||
openai_client = client.get_openai_client()
|
||||
|
||||
# Create evaluation with built-in evaluators
|
||||
eval_run = openai_client.evals.runs.create(
|
||||
eval_id="my-eval",
|
||||
name="quality-check",
|
||||
data_source={
|
||||
"type": "custom",
|
||||
"item_references": [{"item_id": "test-1"}],
|
||||
},
|
||||
testing_criteria=[
|
||||
{"type": "fluency"},
|
||||
{"type": "task_adherence"},
|
||||
],
|
||||
)
|
||||
```
|
||||
|
||||
See [references/evaluation.md](references/evaluation.md) for evaluation patterns.
|
||||
|
||||
## Async Client
|
||||
|
||||
```python
|
||||
from azure.ai.projects.aio import AIProjectClient
|
||||
|
||||
async with AIProjectClient(
|
||||
endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
|
||||
credential=DefaultAzureCredential(),
|
||||
) as client:
|
||||
agent = await client.agents.create_agent(...)
|
||||
# ... async operations
|
||||
```
|
||||
|
||||
See [references/async-patterns.md](references/async-patterns.md) for async patterns.
|
||||
|
||||
## Memory Stores
|
||||
|
||||
```python
|
||||
# Create memory store for agent
|
||||
memory_store = client.agents.create_memory_store(
|
||||
name="conversation-memory",
|
||||
)
|
||||
|
||||
# Attach to agent for persistent memory
|
||||
agent = client.agents.create_agent(
|
||||
model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
|
||||
name="memory-agent",
|
||||
tools=[MemorySearchTool()],
|
||||
tool_resources={"memory": {"store_ids": [memory_store.id]}},
|
||||
)
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Use context managers** for async client: `async with AIProjectClient(...) as client:`
|
||||
2. **Clean up agents** when done: `client.agents.delete_agent(agent.id)`
|
||||
3. **Use `create_and_process`** for simple runs, **streaming** for real-time UX
|
||||
4. **Use versioned agents** for production deployments
|
||||
5. **Prefer connections** for external service integration (AI Search, Bing, etc.)
|
||||
|
||||
## SDK Comparison
|
||||
|
||||
| Feature | `azure-ai-projects` | `azure-ai-agents` |
|
||||
|---------|---------------------|-------------------|
|
||||
| Level | High-level (Foundry) | Low-level (Agents) |
|
||||
| Client | `AIProjectClient` | `AgentsClient` |
|
||||
| Versioning | `create_version()` | Not available |
|
||||
| Connections | Yes | No |
|
||||
| Deployments | Yes | No |
|
||||
| Datasets/Indexes | Yes | No |
|
||||
| Evaluation | Via OpenAI client | No |
|
||||
| When to use | Full Foundry integration | Standalone agent apps |
|
||||
|
||||
## Reference Files
|
||||
|
||||
- [references/agents.md](references/agents.md): Agent operations with PromptAgentDefinition
|
||||
- [references/tools.md](references/tools.md): All agent tools with examples
|
||||
- [references/evaluation.md](references/evaluation.md): Evaluation operations overview
|
||||
- [references/built-in-evaluators.md](references/built-in-evaluators.md): Complete built-in evaluator reference
|
||||
- [references/custom-evaluators.md](references/custom-evaluators.md): Code and prompt-based evaluator patterns
|
||||
- [references/connections.md](references/connections.md): Connection operations
|
||||
- [references/deployments.md](references/deployments.md): Deployment enumeration
|
||||
- [references/datasets-indexes.md](references/datasets-indexes.md): Dataset and index operations
|
||||
- [references/async-patterns.md](references/async-patterns.md): Async client usage
|
||||
- [references/api-reference.md](references/api-reference.md): Complete API reference for all 373 SDK exports (v2.0.0b4)
|
||||
- [scripts/run_batch_evaluation.py](scripts/run_batch_evaluation.py): CLI tool for batch evaluations
|
||||
@@ -0,0 +1,528 @@
|
||||
---
|
||||
name: azure-search-documents-py
|
||||
description: |
|
||||
Azure AI Search SDK for Python. Use for vector search, hybrid search, semantic ranking, indexing, and skillsets.
|
||||
Triggers: "azure-search-documents", "SearchClient", "SearchIndexClient", "vector search", "hybrid search", "semantic search".
|
||||
package: azure-search-documents
|
||||
---
|
||||
|
||||
# Azure AI Search SDK for Python
|
||||
|
||||
Full-text, vector, and hybrid search with AI enrichment capabilities.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
pip install azure-search-documents
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_SEARCH_ENDPOINT=https://<service-name>.search.windows.net
|
||||
AZURE_SEARCH_API_KEY=<your-api-key>
|
||||
AZURE_SEARCH_INDEX_NAME=<your-index-name>
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### API Key
|
||||
|
||||
```python
|
||||
from azure.search.documents import SearchClient
|
||||
from azure.core.credentials import AzureKeyCredential
|
||||
|
||||
client = SearchClient(
|
||||
endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
|
||||
index_name=os.environ["AZURE_SEARCH_INDEX_NAME"],
|
||||
credential=AzureKeyCredential(os.environ["AZURE_SEARCH_API_KEY"])
|
||||
)
|
||||
```
|
||||
|
||||
### Entra ID (Recommended)
|
||||
|
||||
```python
|
||||
from azure.search.documents import SearchClient
|
||||
from azure.identity import DefaultAzureCredential
|
||||
|
||||
client = SearchClient(
|
||||
endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
|
||||
index_name=os.environ["AZURE_SEARCH_INDEX_NAME"],
|
||||
credential=DefaultAzureCredential()
|
||||
)
|
||||
```
|
||||
|
||||
## Client Types
|
||||
|
||||
| Client | Purpose |
|
||||
|--------|---------|
|
||||
| `SearchClient` | Search and document operations |
|
||||
| `SearchIndexClient` | Index management, synonym maps |
|
||||
| `SearchIndexerClient` | Indexers, data sources, skillsets |
|
||||
|
||||
## Create Index with Vector Field
|
||||
|
||||
```python
|
||||
from azure.search.documents.indexes import SearchIndexClient
|
||||
from azure.search.documents.indexes.models import (
|
||||
SearchIndex,
|
||||
SearchField,
|
||||
SearchFieldDataType,
|
||||
VectorSearch,
|
||||
HnswAlgorithmConfiguration,
|
||||
VectorSearchProfile,
|
||||
SearchableField,
|
||||
SimpleField
|
||||
)
|
||||
|
||||
index_client = SearchIndexClient(endpoint, AzureKeyCredential(key))
|
||||
|
||||
fields = [
|
||||
SimpleField(name="id", type=SearchFieldDataType.String, key=True),
|
||||
SearchableField(name="title", type=SearchFieldDataType.String),
|
||||
SearchableField(name="content", type=SearchFieldDataType.String),
|
||||
SearchField(
|
||||
name="content_vector",
|
||||
type=SearchFieldDataType.Collection(SearchFieldDataType.Single),
|
||||
searchable=True,
|
||||
vector_search_dimensions=1536,
|
||||
vector_search_profile_name="my-vector-profile"
|
||||
)
|
||||
]
|
||||
|
||||
vector_search = VectorSearch(
|
||||
algorithms=[
|
||||
HnswAlgorithmConfiguration(name="my-hnsw")
|
||||
],
|
||||
profiles=[
|
||||
VectorSearchProfile(
|
||||
name="my-vector-profile",
|
||||
algorithm_configuration_name="my-hnsw"
|
||||
)
|
||||
]
|
||||
)
|
||||
|
||||
index = SearchIndex(
|
||||
name="my-index",
|
||||
fields=fields,
|
||||
vector_search=vector_search
|
||||
)
|
||||
|
||||
index_client.create_or_update_index(index)
|
||||
```
|
||||
|
||||
## Upload Documents
|
||||
|
||||
```python
|
||||
from azure.search.documents import SearchClient
|
||||
|
||||
client = SearchClient(endpoint, "my-index", AzureKeyCredential(key))
|
||||
|
||||
documents = [
|
||||
{
|
||||
"id": "1",
|
||||
"title": "Azure AI Search",
|
||||
"content": "Full-text and vector search service",
|
||||
"content_vector": [0.1, 0.2, ...] # 1536 dimensions
|
||||
}
|
||||
]
|
||||
|
||||
result = client.upload_documents(documents)
|
||||
print(f"Uploaded {len(result)} documents")
|
||||
```
|
||||
|
||||
## Keyword Search
|
||||
|
||||
```python
|
||||
results = client.search(
|
||||
search_text="azure search",
|
||||
select=["id", "title", "content"],
|
||||
top=10
|
||||
)
|
||||
|
||||
for result in results:
|
||||
print(f"{result['title']}: {result['@search.score']}")
|
||||
```
|
||||
|
||||
## Vector Search
|
||||
|
||||
```python
|
||||
from azure.search.documents.models import VectorizedQuery
|
||||
|
||||
# Your query embedding (1536 dimensions)
|
||||
query_vector = get_embedding("semantic search capabilities")
|
||||
|
||||
vector_query = VectorizedQuery(
|
||||
vector=query_vector,
|
||||
k_nearest_neighbors=10,
|
||||
fields="content_vector"
|
||||
)
|
||||
|
||||
results = client.search(
|
||||
vector_queries=[vector_query],
|
||||
select=["id", "title", "content"]
|
||||
)
|
||||
|
||||
for result in results:
|
||||
print(f"{result['title']}: {result['@search.score']}")
|
||||
```
|
||||
|
||||
## Hybrid Search (Vector + Keyword)
|
||||
|
||||
```python
|
||||
from azure.search.documents.models import VectorizedQuery
|
||||
|
||||
vector_query = VectorizedQuery(
|
||||
vector=query_vector,
|
||||
k_nearest_neighbors=10,
|
||||
fields="content_vector"
|
||||
)
|
||||
|
||||
results = client.search(
|
||||
search_text="azure search",
|
||||
vector_queries=[vector_query],
|
||||
select=["id", "title", "content"],
|
||||
top=10
|
||||
)
|
||||
```
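Under the hood, hybrid search merges the keyword and vector result lists with Reciprocal Rank Fusion (RRF). A minimal stdlib sketch of the fusion step, assuming each input is a ranked list of document ids (the service does this for you; this only illustrates the scoring):

```python
def rrf(rankings: list[list[str]], k: int = 60) -> list[tuple[str, float]]:
    """Fuse ranked lists of doc ids with Reciprocal Rank Fusion.

    Each document scores sum(1 / (k + rank)) over the lists it appears in,
    so documents ranked highly in either list rise to the top of the fusion.
    """
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```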

## Semantic Ranking

```python
from azure.search.documents.models import QueryType

results = client.search(
    search_text="what is azure search",
    query_type=QueryType.SEMANTIC,
    semantic_configuration_name="my-semantic-config",
    select=["id", "title", "content"],
    top=10
)

for result in results:
    print(f"{result['title']}")
    if result.get("@search.captions"):
        print(f"  Caption: {result['@search.captions'][0].text}")
```

## Filters

```python
results = client.search(
    search_text="*",
    filter="category eq 'Technology' and rating gt 4",
    order_by=["rating desc"],
    select=["id", "title", "category", "rating"]
)
```
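Filter values are OData string literals, so a single quote inside a value must be doubled or the filter expression breaks. A small helper for building safe equality clauses (the helper name is ours, not part of the SDK):

```python
def odata_eq(field: str, value: str) -> str:
    """Build a `field eq 'value'` OData filter clause.

    Single quotes inside the value are escaped by doubling them,
    per OData string-literal rules.
    """
    escaped = value.replace("'", "''")
    return f"{field} eq '{escaped}'"
```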

## Facets

```python
results = client.search(
    search_text="*",
    facets=["category,count:10", "rating"],
    top=0  # Only get facets, no documents
)

for facet_name, facet_values in results.get_facets().items():
    print(f"{facet_name}:")
    for facet in facet_values:
        print(f"  {facet['value']}: {facet['count']}")
```

## Autocomplete & Suggest

```python
# Autocomplete
results = client.autocomplete(
    search_text="sea",
    suggester_name="my-suggester",
    mode="twoTerms"
)

# Suggest
results = client.suggest(
    search_text="sea",
    suggester_name="my-suggester",
    select=["title"]
)
```

## Indexer with Skillset

```python
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    SearchIndexer,
    SearchIndexerDataSourceConnection,
    SearchIndexerSkillset,
    EntityRecognitionSkill,
    InputFieldMappingEntry,
    OutputFieldMappingEntry
)

indexer_client = SearchIndexerClient(endpoint, AzureKeyCredential(key))

# Create data source
data_source = SearchIndexerDataSourceConnection(
    name="my-datasource",
    type="azureblob",
    connection_string=connection_string,
    container={"name": "documents"}
)
indexer_client.create_or_update_data_source_connection(data_source)

# Create skillset
skillset = SearchIndexerSkillset(
    name="my-skillset",
    skills=[
        EntityRecognitionSkill(
            inputs=[InputFieldMappingEntry(name="text", source="/document/content")],
            outputs=[OutputFieldMappingEntry(name="organizations", target_name="organizations")]
        )
    ]
)
indexer_client.create_or_update_skillset(skillset)

# Create indexer
indexer = SearchIndexer(
    name="my-indexer",
    data_source_name="my-datasource",
    target_index_name="my-index",
    skillset_name="my-skillset"
)
indexer_client.create_or_update_indexer(indexer)
```

## Best Practices

1. **Use hybrid search** for best relevance, combining vector and keyword signals
2. **Enable semantic ranking** for natural language queries
3. **Index in batches** of 100-1000 documents for efficiency
4. **Use filters** to narrow results before ranking
5. **Configure vector dimensions** to match your embedding model
6. **Use the HNSW algorithm** for large-scale vector search
7. **Create suggesters** at index creation time (they cannot be added later)

## Reference Files

| File | Contents |
|------|----------|
| [references/vector-search.md](references/vector-search.md) | HNSW configuration, integrated vectorization, multi-vector queries |
| [references/semantic-ranking.md](references/semantic-ranking.md) | Semantic configuration, captions, answers, hybrid patterns |
| [scripts/setup_vector_index.py](scripts/setup_vector_index.py) | CLI script to create vector-enabled search index |

---

## Additional Azure AI Search Patterns

# Azure AI Search Python SDK

Write clean, idiomatic Python code for Azure AI Search using `azure-search-documents`.

## Installation

```bash
pip install azure-search-documents azure-identity
```

## Environment Variables

```bash
AZURE_SEARCH_ENDPOINT=https://<search-service>.search.windows.net
AZURE_SEARCH_INDEX_NAME=<index-name>
# For API key auth (not recommended for production)
AZURE_SEARCH_API_KEY=<api-key>
```

## Authentication

**DefaultAzureCredential (preferred)**:
```python
from azure.identity import DefaultAzureCredential
from azure.search.documents import SearchClient

credential = DefaultAzureCredential()
client = SearchClient(endpoint, index_name, credential)
```

**API Key**:
```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(endpoint, index_name, AzureKeyCredential(api_key))
```

## Client Selection

| Client | Purpose |
|--------|---------|
| `SearchClient` | Query indexes, upload/update/delete documents |
| `SearchIndexClient` | Create/manage indexes, knowledge sources, knowledge bases |
| `SearchIndexerClient` | Manage indexers, skillsets, data sources |
| `KnowledgeBaseRetrievalClient` | Agentic retrieval with LLM-powered Q&A |

## Index Creation Pattern

```python
from azure.search.documents.indexes import SearchIndexClient
from azure.search.documents.indexes.models import (
    SearchIndex, SearchField, VectorSearch, VectorSearchProfile,
    HnswAlgorithmConfiguration, AzureOpenAIVectorizer,
    AzureOpenAIVectorizerParameters, SemanticSearch,
    SemanticConfiguration, SemanticPrioritizedFields, SemanticField
)

index = SearchIndex(
    name=index_name,
    fields=[
        SearchField(name="id", type="Edm.String", key=True),
        SearchField(name="content", type="Edm.String", searchable=True),
        SearchField(name="embedding", type="Collection(Edm.Single)",
                    vector_search_dimensions=3072,
                    vector_search_profile_name="vector-profile"),
    ],
    vector_search=VectorSearch(
        profiles=[VectorSearchProfile(
            name="vector-profile",
            algorithm_configuration_name="hnsw-algo",
            vectorizer_name="openai-vectorizer"
        )],
        algorithms=[HnswAlgorithmConfiguration(name="hnsw-algo")],
        vectorizers=[AzureOpenAIVectorizer(
            vectorizer_name="openai-vectorizer",
            parameters=AzureOpenAIVectorizerParameters(
                resource_url=aoai_endpoint,
                deployment_name=embedding_deployment,
                model_name=embedding_model
            )
        )]
    ),
    semantic_search=SemanticSearch(
        default_configuration_name="semantic-config",
        configurations=[SemanticConfiguration(
            name="semantic-config",
            prioritized_fields=SemanticPrioritizedFields(
                content_fields=[SemanticField(field_name="content")]
            )
        )]
    )
)

index_client = SearchIndexClient(endpoint, credential)
index_client.create_or_update_index(index)
```

## Document Operations

```python
from azure.search.documents import SearchIndexingBufferedSender

# Batch upload with automatic batching
with SearchIndexingBufferedSender(endpoint, index_name, credential) as sender:
    sender.upload_documents(documents)

# Direct operations via SearchClient
search_client = SearchClient(endpoint, index_name, credential)
search_client.upload_documents(documents)           # Add new
search_client.merge_documents(documents)            # Update existing
search_client.merge_or_upload_documents(documents)  # Upsert
search_client.delete_documents(documents)           # Remove
```

## Search Patterns

```python
# Basic search
results = search_client.search(search_text="query")

# Vector search
from azure.search.documents.models import VectorizedQuery

results = search_client.search(
    search_text=None,
    vector_queries=[VectorizedQuery(
        vector=embedding,
        k_nearest_neighbors=5,
        fields="embedding"
    )]
)

# Hybrid search (vector + keyword)
results = search_client.search(
    search_text="query",
    vector_queries=[VectorizedQuery(vector=embedding, k_nearest_neighbors=5, fields="embedding")],
    query_type="semantic",
    semantic_configuration_name="semantic-config"
)

# With filters
results = search_client.search(
    search_text="query",
    filter="category eq 'technology'",
    select=["id", "title", "content"],
    top=10
)
```

## Agentic Retrieval (Knowledge Bases)

For LLM-powered Q&A with answer synthesis, see [references/agentic-retrieval.md](references/agentic-retrieval.md).

Key concepts:
- **Knowledge Source**: Points to a search index
- **Knowledge Base**: Wraps knowledge sources + LLM for query planning and synthesis
- **Output modes**: `EXTRACTIVE_DATA` (raw chunks) or `ANSWER_SYNTHESIS` (LLM-generated answers)

## Async Pattern

```python
from azure.search.documents.aio import SearchClient

async with SearchClient(endpoint, index_name, credential) as client:
    results = await client.search(search_text="query")
    async for result in results:
        print(result["title"])
```

## Best Practices

1. **Use environment variables** for endpoints, keys, and deployment names
2. **Prefer `DefaultAzureCredential`** over API keys for production
3. **Use `SearchIndexingBufferedSender`** for batch uploads (handles batching/retries)
4. **Always define a semantic configuration** for agentic retrieval indexes
5. **Use `create_or_update_index`** for idempotent index creation
6. **Close clients** with context managers or an explicit `close()`

## Field Types Reference

| EDM Type | Python | Notes |
|----------|--------|-------|
| `Edm.String` | str | Searchable text |
| `Edm.Int32` | int | Integer |
| `Edm.Int64` | int | Long integer |
| `Edm.Double` | float | Floating point |
| `Edm.Boolean` | bool | True/False |
| `Edm.DateTimeOffset` | datetime | ISO 8601 |
| `Collection(Edm.Single)` | List[float] | Vector embeddings |
| `Collection(Edm.String)` | List[str] | String arrays |

## Error Handling

```python
from azure.core.exceptions import (
    HttpResponseError,
    ResourceNotFoundError,
    ResourceExistsError
)

try:
    result = search_client.get_document(key="123")
except ResourceNotFoundError:
    print("Document not found")
except HttpResponseError as e:
    print(f"Search error: {e.message}")
```

---
name: azure-speech-to-text-rest-py
description: |
  Azure Speech to Text REST API for short audio (Python). Use for simple speech recognition of audio files up to 60 seconds without the Speech SDK.
  Triggers: "speech to text REST", "short audio transcription", "speech recognition REST API", "STT REST", "recognize speech REST".
  DO NOT USE FOR: Long audio (>60 seconds), real-time streaming, batch transcription, custom speech models, speech translation. Use the Speech SDK or the Batch Transcription API instead.
---

# Azure Speech to Text REST API for Short Audio

Simple REST API for speech-to-text transcription of short audio files (up to 60 seconds). No SDK required - just HTTP requests.

## Prerequisites

1. **Azure subscription** - [Create one free](https://azure.microsoft.com/free/)
2. **Speech resource** - Create in [Azure Portal](https://portal.azure.com/#create/Microsoft.CognitiveServicesSpeechServices)
3. **Get credentials** - After deployment, go to resource > Keys and Endpoint

## Environment Variables

```bash
# Required
AZURE_SPEECH_KEY=<your-speech-resource-key>
AZURE_SPEECH_REGION=<region>  # e.g., eastus, westus2, westeurope

# Alternative: Use endpoint directly
AZURE_SPEECH_ENDPOINT=https://<region>.stt.speech.microsoft.com
```

## Installation

```bash
pip install requests
```

## Quick Start

```python
import os
import requests

def transcribe_audio(audio_file_path: str, language: str = "en-US") -> dict:
    """Transcribe short audio file (max 60 seconds) using REST API."""
    region = os.environ["AZURE_SPEECH_REGION"]
    api_key = os.environ["AZURE_SPEECH_KEY"]

    url = f"https://{region}.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1"

    headers = {
        "Ocp-Apim-Subscription-Key": api_key,
        "Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
        "Accept": "application/json"
    }

    params = {
        "language": language,
        "format": "detailed"  # or "simple"
    }

    with open(audio_file_path, "rb") as audio_file:
        response = requests.post(url, headers=headers, params=params, data=audio_file)

    response.raise_for_status()
    return response.json()

# Usage
result = transcribe_audio("audio.wav", "en-US")
print(result["DisplayText"])
```

## Audio Requirements

| Format | Codec | Sample Rate | Notes |
|--------|-------|-------------|-------|
| WAV | PCM | 16 kHz, mono | **Recommended** |
| OGG | OPUS | 16 kHz, mono | Smaller file size |

**Limitations:**
- Maximum 60 seconds of audio
- For pronunciation assessment: maximum 30 seconds
- No partial/interim results (final only)

## Content-Type Headers

```python
# WAV PCM 16kHz
"Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000"

# OGG OPUS
"Content-Type": "audio/ogg; codecs=opus"
```

## Response Formats

### Simple Format (default)

```python
params = {"language": "en-US", "format": "simple"}
```

```json
{
  "RecognitionStatus": "Success",
  "DisplayText": "Remind me to buy 5 pencils.",
  "Offset": "1236645672289",
  "Duration": "1236645672289"
}
```

### Detailed Format

```python
params = {"language": "en-US", "format": "detailed"}
```

```json
{
  "RecognitionStatus": "Success",
  "Offset": "1236645672289",
  "Duration": "1236645672289",
  "NBest": [
    {
      "Confidence": 0.9052885,
      "Display": "What's the weather like?",
      "ITN": "what's the weather like",
      "Lexical": "what's the weather like",
      "MaskedITN": "what's the weather like"
    }
  ]
}
```
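With the detailed format, the preferred transcription is the `NBest` entry with the highest `Confidence`. A small stdlib sketch assuming the response shape shown above (the helper name is ours):

```python
def best_hypothesis(response: dict) -> tuple[str, float]:
    """Pick the highest-confidence NBest entry from a detailed response."""
    if response.get("RecognitionStatus") != "Success" or not response.get("NBest"):
        raise ValueError(f"no transcription: {response.get('RecognitionStatus')}")
    top = max(response["NBest"], key=lambda h: h["Confidence"])
    return top["Display"], top["Confidence"]
```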

## Chunked Transfer (Recommended)

For lower latency, stream audio in chunks:

```python
import os
import requests

def transcribe_chunked(audio_file_path: str, language: str = "en-US") -> dict:
    """Stream audio in chunks for lower latency."""
    region = os.environ["AZURE_SPEECH_REGION"]
    api_key = os.environ["AZURE_SPEECH_KEY"]

    url = f"https://{region}.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1"

    headers = {
        "Ocp-Apim-Subscription-Key": api_key,
        "Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
        "Accept": "application/json",
        "Transfer-Encoding": "chunked",
        "Expect": "100-continue"
    }

    params = {"language": language, "format": "detailed"}

    def generate_chunks(file_path: str, chunk_size: int = 1024):
        with open(file_path, "rb") as f:
            while chunk := f.read(chunk_size):
                yield chunk

    response = requests.post(
        url,
        headers=headers,
        params=params,
        data=generate_chunks(audio_file_path)
    )

    response.raise_for_status()
    return response.json()
```

## Authentication Options

### Option 1: Subscription Key (Simple)

```python
headers = {
    "Ocp-Apim-Subscription-Key": os.environ["AZURE_SPEECH_KEY"]
}
```

### Option 2: Bearer Token

```python
import os
import requests

def get_access_token() -> str:
    """Get access token from the token endpoint."""
    region = os.environ["AZURE_SPEECH_REGION"]
    api_key = os.environ["AZURE_SPEECH_KEY"]

    token_url = f"https://{region}.api.cognitive.microsoft.com/sts/v1.0/issueToken"

    response = requests.post(
        token_url,
        headers={
            "Ocp-Apim-Subscription-Key": api_key,
            "Content-Type": "application/x-www-form-urlencoded",
            "Content-Length": "0"
        }
    )
    response.raise_for_status()
    return response.text

# Use the token in requests (valid for 10 minutes)
token = get_access_token()
headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
    "Accept": "application/json"
}
```
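Since tokens are valid for ten minutes, the best practice below is to cache them for nine. A minimal cache sketch with an injectable fetcher so it can be exercised offline; in practice `fetch` would be `get_access_token` from above (the class is our own, not part of any SDK):

```python
import time
from typing import Callable, Optional

class TokenCache:
    """Cache a bearer token and refresh it shortly before it expires."""

    def __init__(self, fetch: Callable[[], str], ttl_seconds: float = 9 * 60):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._token: Optional[str] = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.monotonic()
        if self._token is None or now >= self._expires_at:
            self._token = self._fetch()
            self._expires_at = now + self._ttl
        return self._token
```

Usage sketch: `cache = TokenCache(get_access_token)` and then `cache.get()` per request, refreshing only when the nine-minute window lapses.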
|
||||
|
||||
## Query Parameters
|
||||
|
||||
| Parameter | Required | Values | Description |
|
||||
|-----------|----------|--------|-------------|
|
||||
| `language` | **Yes** | `en-US`, `de-DE`, etc. | Language of speech |
|
||||
| `format` | No | `simple`, `detailed` | Result format (default: simple) |
|
||||
| `profanity` | No | `masked`, `removed`, `raw` | Profanity handling (default: masked) |
|
||||
|
||||
## Recognition Status Values
|
||||
|
||||
| Status | Description |
|
||||
|--------|-------------|
|
||||
| `Success` | Recognition succeeded |
|
||||
| `NoMatch` | Speech detected but no words matched |
|
||||
| `InitialSilenceTimeout` | Only silence detected |
|
||||
| `BabbleTimeout` | Only noise detected |
|
||||
| `Error` | Internal service error |
|
||||
|
||||
## Profanity Handling
|
||||
|
||||
```python
|
||||
# Mask profanity with asterisks (default)
|
||||
params = {"language": "en-US", "profanity": "masked"}
|
||||
|
||||
# Remove profanity entirely
|
||||
params = {"language": "en-US", "profanity": "removed"}
|
||||
|
||||
# Include profanity as-is
|
||||
params = {"language": "en-US", "profanity": "raw"}
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
```python
|
||||
import requests
|
||||
|
||||
def transcribe_with_error_handling(audio_path: str, language: str = "en-US") -> dict | None:
|
||||
"""Transcribe with proper error handling."""
|
||||
region = os.environ["AZURE_SPEECH_REGION"]
|
||||
api_key = os.environ["AZURE_SPEECH_KEY"]
|
||||
|
||||
url = f"https://{region}.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1"
|
||||
|
||||
try:
|
||||
with open(audio_path, "rb") as audio_file:
|
||||
response = requests.post(
|
||||
url,
|
||||
headers={
|
||||
"Ocp-Apim-Subscription-Key": api_key,
|
||||
"Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
|
||||
"Accept": "application/json"
|
||||
},
|
||||
params={"language": language, "format": "detailed"},
|
||||
data=audio_file
|
||||
)
|
||||
|
||||
if response.status_code == 200:
|
||||
result = response.json()
|
||||
if result.get("RecognitionStatus") == "Success":
|
||||
return result
|
||||
else:
|
||||
print(f"Recognition failed: {result.get('RecognitionStatus')}")
|
||||
return None
|
||||
elif response.status_code == 400:
|
||||
print(f"Bad request: Check language code or audio format")
|
||||
elif response.status_code == 401:
|
||||
print(f"Unauthorized: Check API key or token")
|
||||
elif response.status_code == 403:
|
||||
print(f"Forbidden: Missing authorization header")
|
||||
else:
|
||||
print(f"Error {response.status_code}: {response.text}")
|
||||
|
||||
return None
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
print(f"Request failed: {e}")
|
||||
return None
|
||||
```

## Async Version

```python
import os
import aiohttp
import asyncio


async def transcribe_async(audio_file_path: str, language: str = "en-US") -> dict:
    """Async version using aiohttp."""
    region = os.environ["AZURE_SPEECH_REGION"]
    api_key = os.environ["AZURE_SPEECH_KEY"]

    url = f"https://{region}.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1"

    headers = {
        "Ocp-Apim-Subscription-Key": api_key,
        "Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
        "Accept": "application/json"
    }

    params = {"language": language, "format": "detailed"}

    async with aiohttp.ClientSession() as session:
        with open(audio_file_path, "rb") as f:
            audio_data = f.read()

        async with session.post(url, headers=headers, params=params, data=audio_data) as response:
            response.raise_for_status()
            return await response.json()


# Usage
result = asyncio.run(transcribe_async("audio.wav", "en-US"))
print(result["DisplayText"])
```

## Supported Languages

Common language codes (see [full list](https://learn.microsoft.com/azure/ai-services/speech-service/language-support)):

| Code | Language |
|------|----------|
| `en-US` | English (US) |
| `en-GB` | English (UK) |
| `de-DE` | German |
| `fr-FR` | French |
| `es-ES` | Spanish (Spain) |
| `es-MX` | Spanish (Mexico) |
| `zh-CN` | Chinese (Mandarin) |
| `ja-JP` | Japanese |
| `ko-KR` | Korean |
| `pt-BR` | Portuguese (Brazil) |

## Best Practices

1. **Use WAV PCM 16kHz mono** for best compatibility
2. **Enable chunked transfer** for lower latency
3. **Cache access tokens** for 9 minutes (valid for 10)
4. **Specify the correct language** for accurate recognition
5. **Use detailed format** when you need confidence scores
6. **Handle all RecognitionStatus values** in production code
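
The token-caching practice above (item 3) can be sketched with a small time-based cache. This is a minimal sketch: the `fetch_token` callable is a hypothetical stand-in for a real call to the token-issuing endpoint.

```python
import time
from typing import Callable


class TokenCache:
    """Cache an access token slightly shorter than its real lifetime."""

    def __init__(self, fetch_token: Callable[[], str], ttl_seconds: float = 9 * 60):
        self._fetch = fetch_token        # hypothetical: e.g. a call to the token endpoint
        self._ttl = ttl_seconds          # refresh at 9 min; tokens are valid for 10
        self._token: str | None = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.monotonic()
        if self._token is None or now >= self._expires_at:
            self._token = self._fetch()  # only hit the service when the token is stale
            self._expires_at = now + self._ttl
        return self._token


# Usage with a stand-in fetcher: repeated calls reuse a single token
calls = []
cache = TokenCache(lambda: calls.append(1) or f"token-{len(calls)}")
assert cache.get() == cache.get() == "token-1"
assert len(calls) == 1
```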

## When NOT to Use This API

Use the Speech SDK or Batch Transcription API instead when you need:

- Audio longer than 60 seconds
- Real-time streaming transcription
- Partial/interim results
- Speech translation
- Custom speech models
- Batch transcription of many files

## Reference Files

| File | Contents |
|------|----------|
| [references/pronunciation-assessment.md](references/pronunciation-assessment.md) | Pronunciation assessment parameters and scoring |

skills/official/microsoft/python/foundry/textanalytics/SKILL.md (new file)

@@ -0,0 +1,227 @@

---
name: azure-ai-textanalytics-py
description: |
  Azure AI Text Analytics SDK for sentiment analysis, entity recognition, key phrases, language detection, PII, and healthcare NLP. Use for natural language processing on text.
  Triggers: "text analytics", "sentiment analysis", "entity recognition", "key phrase", "PII detection", "TextAnalyticsClient".
package: azure-ai-textanalytics
---

# Azure AI Text Analytics SDK for Python

Client library for Azure AI Language service NLP capabilities including sentiment, entities, key phrases, and more.

## Installation

```bash
pip install azure-ai-textanalytics
```

## Environment Variables

```bash
AZURE_LANGUAGE_ENDPOINT=https://<resource>.cognitiveservices.azure.com
AZURE_LANGUAGE_KEY=<your-api-key>  # If using API key
```

## Authentication

### API Key

```python
import os

from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

endpoint = os.environ["AZURE_LANGUAGE_ENDPOINT"]
key = os.environ["AZURE_LANGUAGE_KEY"]

client = TextAnalyticsClient(endpoint, AzureKeyCredential(key))
```

### Entra ID (Recommended)

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.identity import DefaultAzureCredential

client = TextAnalyticsClient(
    endpoint=os.environ["AZURE_LANGUAGE_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Sentiment Analysis

```python
documents = [
    "I had a wonderful trip to Seattle last week!",
    "The food was terrible and the service was slow."
]

result = client.analyze_sentiment(documents, show_opinion_mining=True)

for doc in result:
    if not doc.is_error:
        print(f"Sentiment: {doc.sentiment}")
        print(f"Scores: pos={doc.confidence_scores.positive:.2f}, "
              f"neg={doc.confidence_scores.negative:.2f}, "
              f"neu={doc.confidence_scores.neutral:.2f}")

        # Opinion mining (aspect-based sentiment)
        for sentence in doc.sentences:
            for opinion in sentence.mined_opinions:
                target = opinion.target
                print(f"  Target: '{target.text}' - {target.sentiment}")
                for assessment in opinion.assessments:
                    print(f"    Assessment: '{assessment.text}' - {assessment.sentiment}")
```

## Entity Recognition

```python
documents = ["Microsoft was founded by Bill Gates and Paul Allen in Albuquerque."]

result = client.recognize_entities(documents)

for doc in result:
    if not doc.is_error:
        for entity in doc.entities:
            print(f"Entity: {entity.text}")
            print(f"  Category: {entity.category}")
            print(f"  Subcategory: {entity.subcategory}")
            print(f"  Confidence: {entity.confidence_score:.2f}")
```

## PII Detection

```python
documents = ["My SSN is 123-45-6789 and my email is john@example.com"]

result = client.recognize_pii_entities(documents)

for doc in result:
    if not doc.is_error:
        print(f"Redacted: {doc.redacted_text}")
        for entity in doc.entities:
            print(f"PII: {entity.text} ({entity.category})")
```

## Key Phrase Extraction

```python
documents = ["Azure AI provides powerful machine learning capabilities for developers."]

result = client.extract_key_phrases(documents)

for doc in result:
    if not doc.is_error:
        print(f"Key phrases: {doc.key_phrases}")
```

## Language Detection

```python
documents = ["Ce document est en français.", "This is written in English."]

result = client.detect_language(documents)

for doc in result:
    if not doc.is_error:
        print(f"Language: {doc.primary_language.name} ({doc.primary_language.iso6391_name})")
        print(f"Confidence: {doc.primary_language.confidence_score:.2f}")
```

## Healthcare Text Analytics

```python
documents = ["Patient has diabetes and was prescribed metformin 500mg twice daily."]

poller = client.begin_analyze_healthcare_entities(documents)
result = poller.result()

for doc in result:
    if not doc.is_error:
        for entity in doc.entities:
            print(f"Entity: {entity.text}")
            print(f"  Category: {entity.category}")
            print(f"  Normalized: {entity.normalized_text}")

            # Entity links (UMLS, etc.); data_sources may be None
            for link in entity.data_sources or []:
                print(f"  Link: {link.name} - {link.entity_id}")
```

## Multiple Analysis (Batch)

```python
from azure.ai.textanalytics import (
    RecognizeEntitiesAction,
    ExtractKeyPhrasesAction,
    AnalyzeSentimentAction
)

documents = ["Microsoft announced new Azure AI features at Build conference."]

poller = client.begin_analyze_actions(
    documents,
    actions=[
        RecognizeEntitiesAction(),
        ExtractKeyPhrasesAction(),
        AnalyzeSentimentAction()
    ]
)

results = poller.result()
for doc_results in results:
    for result in doc_results:
        if result.kind == "EntityRecognition":
            print(f"Entities: {[e.text for e in result.entities]}")
        elif result.kind == "KeyPhraseExtraction":
            print(f"Key phrases: {result.key_phrases}")
        elif result.kind == "SentimentAnalysis":
            print(f"Sentiment: {result.sentiment}")
```

## Async Client

```python
from azure.ai.textanalytics.aio import TextAnalyticsClient
from azure.identity.aio import DefaultAzureCredential


async def analyze():
    async with TextAnalyticsClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        result = await client.analyze_sentiment(documents)
        # Process results...
```

## Client Types

| Client | Purpose |
|--------|---------|
| `TextAnalyticsClient` | All text analytics operations |
| `TextAnalyticsClient` (aio) | Async version |

## Available Operations

| Method | Description |
|--------|-------------|
| `analyze_sentiment` | Sentiment analysis with opinion mining |
| `recognize_entities` | Named entity recognition |
| `recognize_pii_entities` | PII detection and redaction |
| `recognize_linked_entities` | Entity linking to Wikipedia |
| `extract_key_phrases` | Key phrase extraction |
| `detect_language` | Language detection |
| `begin_analyze_healthcare_entities` | Healthcare NLP (long-running) |
| `begin_analyze_actions` | Multiple analyses in batch |

## Best Practices

1. **Use batch operations** for multiple documents (up to 10 per request)
2. **Enable opinion mining** for detailed aspect-based sentiment
3. **Use async client** for high-throughput scenarios
4. **Handle document errors** — results list may contain errors for some docs
5. **Specify language** when known to improve accuracy
6. **Use context manager** or close client explicitly
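
Item 4 can be sketched generically: split a results list into successes and errors before further processing. The stand-in objects below only mimic the SDK's `is_error` attribute; they are not real result types.

```python
from types import SimpleNamespace


def partition_results(results):
    """Split Text Analytics-style results into (successes, errors) by is_error."""
    successes = [r for r in results if not r.is_error]
    errors = [r for r in results if r.is_error]
    return successes, errors


# Usage with stand-in result objects
results = [
    SimpleNamespace(is_error=False, sentiment="positive"),
    SimpleNamespace(is_error=True, error="InvalidDocument"),
]
ok, bad = partition_results(results)
assert len(ok) == 1 and len(bad) == 1
```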

@@ -0,0 +1,69 @@

---
name: azure-ai-transcription-py
description: |
  Azure AI Transcription SDK for Python. Use for real-time and batch speech-to-text transcription with timestamps and diarization.
  Triggers: "transcription", "speech to text", "Azure AI Transcription", "TranscriptionClient".
package: azure-ai-transcription
---

# Azure AI Transcription SDK for Python

Client library for Azure AI Transcription (speech-to-text) with real-time and batch transcription.

## Installation

```bash
pip install azure-ai-transcription
```

## Environment Variables

```bash
TRANSCRIPTION_ENDPOINT=https://<resource>.cognitiveservices.azure.com
TRANSCRIPTION_KEY=<your-key>
```

## Authentication

Use subscription key authentication (DefaultAzureCredential is not supported for this client):

```python
import os
from azure.ai.transcription import TranscriptionClient

client = TranscriptionClient(
    endpoint=os.environ["TRANSCRIPTION_ENDPOINT"],
    credential=os.environ["TRANSCRIPTION_KEY"]
)
```

## Transcription (Batch)

```python
job = client.begin_transcription(
    name="meeting-transcription",
    locale="en-US",
    content_urls=["https://<storage>/audio.wav"],
    diarization_enabled=True
)
result = job.result()
print(result.status)
```

## Transcription (Real-time)

```python
stream = client.begin_stream_transcription(locale="en-US")
stream.send_audio_file("audio.wav")
for event in stream:
    print(event.text)
```

## Best Practices

1. **Enable diarization** when multiple speakers are present
2. **Use batch transcription** for long files stored in blob storage
3. **Capture timestamps** for subtitle generation
4. **Specify language** to improve recognition accuracy
5. **Handle streaming backpressure** for real-time transcription
6. **Close transcription sessions** when complete
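
Backpressure handling (item 5) can be sketched with a bounded queue between an audio producer and a transcription consumer. This is an illustrative stdlib sketch, not SDK API: the consumer stands in for sending chunks to the service.

```python
import queue
import threading

# A bounded queue makes the producer block when the consumer falls behind,
# instead of buffering unbounded audio in memory.
audio_chunks: "queue.Queue[bytes | None]" = queue.Queue(maxsize=8)
transcribed: list[int] = []


def consumer() -> None:
    while True:
        chunk = audio_chunks.get()       # blocks until a chunk is available
        if chunk is None:                # sentinel: end of stream
            break
        transcribed.append(len(chunk))   # stand-in for sending to the service
        audio_chunks.task_done()


worker = threading.Thread(target=consumer)
worker.start()
for _ in range(20):
    audio_chunks.put(b"\x00" * 3200)     # put() blocks when the queue is full
audio_chunks.put(None)
worker.join()
assert len(transcribed) == 20
```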

@@ -0,0 +1,249 @@

---
name: azure-ai-translation-document-py
description: |
  Azure AI Document Translation SDK for batch translation of documents with format preservation. Use for translating Word, PDF, Excel, PowerPoint, and other document formats at scale.
  Triggers: "document translation", "batch translation", "translate documents", "DocumentTranslationClient".
package: azure-ai-translation-document
---

# Azure AI Document Translation SDK for Python

Client library for the Azure AI Translator document translation service: batch document translation with format preservation.

## Installation

```bash
pip install azure-ai-translation-document
```

## Environment Variables

```bash
AZURE_DOCUMENT_TRANSLATION_ENDPOINT=https://<resource>.cognitiveservices.azure.com
AZURE_DOCUMENT_TRANSLATION_KEY=<your-api-key>  # If using API key

# Storage for source and target documents
AZURE_SOURCE_CONTAINER_URL=https://<storage>.blob.core.windows.net/<container>?<sas>
AZURE_TARGET_CONTAINER_URL=https://<storage>.blob.core.windows.net/<container>?<sas>
```

## Authentication

### API Key

```python
import os

from azure.ai.translation.document import DocumentTranslationClient
from azure.core.credentials import AzureKeyCredential

endpoint = os.environ["AZURE_DOCUMENT_TRANSLATION_ENDPOINT"]
key = os.environ["AZURE_DOCUMENT_TRANSLATION_KEY"]

client = DocumentTranslationClient(endpoint, AzureKeyCredential(key))
```

### Entra ID (Recommended)

```python
from azure.ai.translation.document import DocumentTranslationClient
from azure.identity import DefaultAzureCredential

client = DocumentTranslationClient(
    endpoint=os.environ["AZURE_DOCUMENT_TRANSLATION_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Basic Document Translation

```python
from azure.ai.translation.document import DocumentTranslationInput, TranslationTarget

source_url = os.environ["AZURE_SOURCE_CONTAINER_URL"]
target_url = os.environ["AZURE_TARGET_CONTAINER_URL"]

# Start translation job
poller = client.begin_translation(
    inputs=[
        DocumentTranslationInput(
            source_url=source_url,
            targets=[
                TranslationTarget(
                    target_url=target_url,
                    language="es"  # Translate to Spanish
                )
            ]
        )
    ]
)

# Wait for completion
result = poller.result()

print(f"Status: {poller.status()}")
print(f"Documents translated: {poller.details.documents_succeeded_count}")
print(f"Documents failed: {poller.details.documents_failed_count}")
```

## Multiple Target Languages

```python
poller = client.begin_translation(
    inputs=[
        DocumentTranslationInput(
            source_url=source_url,
            targets=[
                TranslationTarget(target_url=target_url_es, language="es"),
                TranslationTarget(target_url=target_url_fr, language="fr"),
                TranslationTarget(target_url=target_url_de, language="de")
            ]
        )
    ]
)
```

## Translate Single Document

```python
from azure.ai.translation.document import SingleDocumentTranslationClient

single_client = SingleDocumentTranslationClient(endpoint, AzureKeyCredential(key))

with open("document.docx", "rb") as f:
    document_content = f.read()

result = single_client.translate(
    body=document_content,
    target_language="es",
    content_type="application/vnd.openxmlformats-officedocument.wordprocessingml.document"
)

# Save translated document
with open("document_es.docx", "wb") as f:
    f.write(result)
```

## Check Translation Status

```python
# Get all translation operations
operations = client.list_translation_statuses()

for op in operations:
    print(f"Operation ID: {op.id}")
    print(f"Status: {op.status}")
    print(f"Created: {op.created_on}")
    print(f"Total documents: {op.documents_total_count}")
    print(f"Succeeded: {op.documents_succeeded_count}")
    print(f"Failed: {op.documents_failed_count}")
```

## List Document Statuses

```python
# Get status of individual documents in a job
operation_id = poller.id
document_statuses = client.list_document_statuses(operation_id)

for doc in document_statuses:
    print(f"Document: {doc.source_document_url}")
    print(f"  Status: {doc.status}")
    print(f"  Translated to: {doc.translated_to}")
    if doc.error:
        print(f"  Error: {doc.error.message}")
```

## Cancel Translation

```python
# Cancel a running translation
client.cancel_translation(operation_id)
```

## Using a Glossary

```python
from azure.ai.translation.document import TranslationGlossary

poller = client.begin_translation(
    inputs=[
        DocumentTranslationInput(
            source_url=source_url,
            targets=[
                TranslationTarget(
                    target_url=target_url,
                    language="es",
                    glossaries=[
                        TranslationGlossary(
                            glossary_url="https://<storage>.blob.core.windows.net/glossary/terms.csv?<sas>",
                            file_format="csv"
                        )
                    ]
                )
            ]
        )
    ]
)
```

## Supported Document Formats

```python
# Get supported formats
formats = client.get_supported_document_formats()

for fmt in formats:
    print(f"Format: {fmt.format}")
    print(f"  Extensions: {fmt.file_extensions}")
    print(f"  Content types: {fmt.content_types}")
```

## Supported Languages

```python
# Get supported languages
languages = client.get_supported_languages()

for lang in languages:
    print(f"Language: {lang.name} ({lang.code})")
```

## Async Client

```python
from azure.ai.translation.document.aio import DocumentTranslationClient
from azure.identity.aio import DefaultAzureCredential


async def translate_documents():
    async with DocumentTranslationClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        poller = await client.begin_translation(inputs=[...])
        result = await poller.result()
```

## Supported Formats

| Category | Formats |
|----------|---------|
| Documents | DOCX, PDF, PPTX, XLSX, HTML, TXT, RTF |
| Structured | CSV, TSV, JSON, XML |
| Localization | XLIFF, XLF, MHTML |

## Storage Requirements

- Source and target containers must be Azure Blob Storage
- Use SAS tokens with appropriate permissions:
  - Source: Read, List
  - Target: Write, List

## Best Practices

1. **Use SAS tokens** with minimal required permissions
2. **Monitor long-running operations** with `poller.status()`
3. **Handle document-level errors** by iterating document statuses
4. **Use glossaries** for domain-specific terminology
5. **Separate target containers** for each language
6. **Use async client** for multiple concurrent jobs
7. **Check supported formats** before submitting documents
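
Monitoring a long-running job (item 2) reduces to a generic poll loop with capped exponential backoff. In this sketch, `get_status` stands in for `poller.status()`; the helper and terminal states are illustrative, not SDK API.

```python
import time
from typing import Callable

TERMINAL = {"Succeeded", "Failed", "Canceled"}


def poll_until_done(get_status: Callable[[], str],
                    initial_delay: float = 1.0,
                    max_delay: float = 30.0,
                    sleep=time.sleep) -> str:
    """Poll a status callable with capped exponential backoff."""
    delay = initial_delay
    while True:
        status = get_status()
        if status in TERMINAL:
            return status
        sleep(delay)                     # back off between polls
        delay = min(delay * 2, max_delay)


# Usage with a scripted status sequence (no real service calls)
statuses = iter(["NotStarted", "Running", "Running", "Succeeded"])
final = poll_until_done(lambda: next(statuses), sleep=lambda _: None)
assert final == "Succeeded"
```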

@@ -0,0 +1,274 @@

---
name: azure-ai-translation-text-py
description: |
  Azure AI Text Translation SDK for real-time text translation, transliteration, language detection, and dictionary lookup. Use for translating text content in applications.
  Triggers: "text translation", "translator", "translate text", "transliterate", "TextTranslationClient".
package: azure-ai-translation-text
---

# Azure AI Text Translation SDK for Python

Client library for the Azure AI Translator text translation service: real-time text translation, transliteration, and language operations.

## Installation

```bash
pip install azure-ai-translation-text
```

## Environment Variables

```bash
AZURE_TRANSLATOR_KEY=<your-api-key>
AZURE_TRANSLATOR_REGION=<your-region>  # e.g., eastus, westus2
# Or use custom endpoint
AZURE_TRANSLATOR_ENDPOINT=https://<resource>.cognitiveservices.azure.com
```

## Authentication

### API Key with Region

```python
import os

from azure.ai.translation.text import TextTranslationClient
from azure.core.credentials import AzureKeyCredential

key = os.environ["AZURE_TRANSLATOR_KEY"]
region = os.environ["AZURE_TRANSLATOR_REGION"]

# Create credential with region
credential = AzureKeyCredential(key)
client = TextTranslationClient(credential=credential, region=region)
```

### API Key with Custom Endpoint

```python
endpoint = os.environ["AZURE_TRANSLATOR_ENDPOINT"]

client = TextTranslationClient(
    credential=AzureKeyCredential(key),
    endpoint=endpoint
)
```

### Entra ID (Recommended)

```python
from azure.ai.translation.text import TextTranslationClient
from azure.identity import DefaultAzureCredential

client = TextTranslationClient(
    credential=DefaultAzureCredential(),
    endpoint=os.environ["AZURE_TRANSLATOR_ENDPOINT"]
)
```

## Basic Translation

```python
# Translate to a single language
result = client.translate(
    body=["Hello, how are you?", "Welcome to Azure!"],
    to=["es"]  # Spanish
)

for item in result:
    for translation in item.translations:
        print(f"Translated: {translation.text}")
        print(f"Target language: {translation.to}")
```

## Translate to Multiple Languages

```python
result = client.translate(
    body=["Hello, world!"],
    to=["es", "fr", "de", "ja"]  # Spanish, French, German, Japanese
)

for item in result:
    print(f"Source: {item.detected_language.language if item.detected_language else 'unknown'}")
    for translation in item.translations:
        print(f"  {translation.to}: {translation.text}")
```

## Specify Source Language

```python
result = client.translate(
    body=["Bonjour le monde"],
    from_parameter="fr",  # Source is French
    to=["en", "es"]
)
```

## Language Detection

```python
result = client.translate(
    body=["Hola, como estas?"],
    to=["en"]
)

for item in result:
    if item.detected_language:
        print(f"Detected language: {item.detected_language.language}")
        print(f"Confidence: {item.detected_language.score:.2f}")
```

## Transliteration

Convert text from one script to another:

```python
result = client.transliterate(
    body=["konnichiwa"],
    language="ja",
    from_script="Latn",  # From Latin script
    to_script="Jpan"     # To Japanese script
)

for item in result:
    print(f"Transliterated: {item.text}")
    print(f"Script: {item.script}")
```

## Dictionary Lookup

Find alternate translations and definitions:

```python
result = client.lookup_dictionary_entries(
    body=["fly"],
    from_parameter="en",
    to="es"
)

for item in result:
    print(f"Source: {item.normalized_source} ({item.display_source})")
    for translation in item.translations:
        print(f"  Translation: {translation.normalized_target}")
        print(f"  Part of speech: {translation.pos_tag}")
        print(f"  Confidence: {translation.confidence:.2f}")
```

## Dictionary Examples

Get usage examples for translations:

```python
from azure.ai.translation.text.models import DictionaryExampleTextItem

result = client.lookup_dictionary_examples(
    body=[DictionaryExampleTextItem(text="fly", translation="volar")],
    from_parameter="en",
    to="es"
)

for item in result:
    for example in item.examples:
        print(f"Source: {example.source_prefix}{example.source_term}{example.source_suffix}")
        print(f"Target: {example.target_prefix}{example.target_term}{example.target_suffix}")
```

## Get Supported Languages

```python
# Get all supported languages
languages = client.get_supported_languages()

# Translation languages
print("Translation languages:")
for code, lang in languages.translation.items():
    print(f"  {code}: {lang.name} ({lang.native_name})")

# Transliteration languages
print("\nTransliteration languages:")
for code, lang in languages.transliteration.items():
    print(f"  {code}: {lang.name}")
    for script in lang.scripts:
        print(f"    {script.code} -> {[t.code for t in script.to_scripts]}")

# Dictionary languages
print("\nDictionary languages:")
for code, lang in languages.dictionary.items():
    print(f"  {code}: {lang.name}")
```

## Break Sentence

Identify sentence boundaries:

```python
result = client.find_sentence_boundaries(
    body=["Hello! How are you? I hope you are well."],
    language="en"
)

for item in result:
    print(f"Sentence lengths: {item.sent_len}")
```

## Translation Options

```python
result = client.translate(
    body=["Hello, world!"],
    to=["de"],
    text_type="html",             # "plain" or "html"
    profanity_action="Marked",    # "NoAction", "Deleted", "Marked"
    profanity_marker="Asterisk",  # "Asterisk", "Tag"
    include_alignment=True,       # Include word alignment
    include_sentence_length=True  # Include sentence boundaries
)

for item in result:
    translation = item.translations[0]
    print(f"Translated: {translation.text}")
    if translation.alignment:
        print(f"Alignment: {translation.alignment.proj}")
    if translation.sent_len:
        print(f"Sentence lengths: {translation.sent_len.src_sent_len}")
```
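
The `alignment.proj` string maps source to target character spans in the Translator service's `srcStart:srcEnd-tgtStart:tgtEnd` space-separated format. A small parser (pure Python, no SDK required) might look like this; the example spans are illustrative:

```python
def parse_alignment(proj: str) -> list[tuple[tuple[int, int], tuple[int, int]]]:
    """Parse an alignment projection like '0:4-0:3 6:11-5:9' into
    ((src_start, src_end), (tgt_start, tgt_end)) character-span pairs."""
    pairs = []
    for token in proj.split():
        src, tgt = token.split("-")
        src_start, src_end = (int(n) for n in src.split(":"))
        tgt_start, tgt_end = (int(n) for n in tgt.split(":"))
        pairs.append(((src_start, src_end), (tgt_start, tgt_end)))
    return pairs


# Usage: map each aligned source span onto its target span
spans = parse_alignment("0:4-0:4 6:10-6:9")
assert spans[0] == ((0, 4), (0, 4))
assert spans[1] == ((6, 10), (6, 9))
```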

## Async Client

```python
from azure.ai.translation.text.aio import TextTranslationClient
from azure.core.credentials import AzureKeyCredential


async def translate_text():
    async with TextTranslationClient(
        credential=AzureKeyCredential(key),
        region=region
    ) as client:
        result = await client.translate(
            body=["Hello, world!"],
            to=["es"]
        )
        print(result[0].translations[0].text)
```

## Client Methods

| Method | Description |
|--------|-------------|
| `translate` | Translate text to one or more languages |
| `transliterate` | Convert text between scripts |
| `detect` | Detect language of text |
| `find_sentence_boundaries` | Identify sentence boundaries |
| `lookup_dictionary_entries` | Dictionary lookup for translations |
| `lookup_dictionary_examples` | Get usage examples |
| `get_supported_languages` | List supported languages |

## Best Practices

1. **Batch translations** — Send multiple texts in one request (up to 100)
2. **Specify source language** when known to improve accuracy
3. **Use async client** for high-throughput scenarios
4. **Cache language list** — Supported languages don't change frequently
5. **Handle profanity** appropriately for your application
6. **Use html text_type** when translating HTML content
7. **Include alignment** for applications needing word mapping
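
Item 1 can be sketched with a simple chunker that respects a per-request item limit. The 100-item limit mirrors the practice above but is an assumption here; check current service quotas before relying on it.

```python
from typing import Iterable, Iterator

MAX_TEXTS_PER_REQUEST = 100  # assumed per-request item limit for translate calls


def batched(texts: Iterable[str], size: int = MAX_TEXTS_PER_REQUEST) -> Iterator[list[str]]:
    """Yield successive lists of at most `size` texts."""
    batch: list[str] = []
    for text in texts:
        batch.append(text)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch   # trailing partial batch


# Usage: 250 texts become three requests of 100, 100, and 50
batches = list(batched([f"text {i}" for i in range(250)]))
assert [len(b) for b in batches] == [100, 100, 50]
```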

@@ -0,0 +1,260 @@

---
name: azure-ai-vision-imageanalysis-py
description: |
  Azure AI Vision Image Analysis SDK for captions, tags, objects, OCR, people detection, and smart cropping. Use for computer vision and image understanding tasks.
  Triggers: "image analysis", "computer vision", "OCR", "object detection", "ImageAnalysisClient", "image caption".
package: azure-ai-vision-imageanalysis
---

# Azure AI Vision Image Analysis SDK for Python

Client library for Azure AI Vision 4.0 image analysis including captions, tags, objects, OCR, and more.

## Installation

```bash
pip install azure-ai-vision-imageanalysis
```

## Environment Variables

```bash
VISION_ENDPOINT=https://<resource>.cognitiveservices.azure.com
VISION_KEY=<your-api-key>  # If using API key
```

## Authentication

### API Key

```python
import os

from azure.ai.vision.imageanalysis import ImageAnalysisClient
from azure.core.credentials import AzureKeyCredential

endpoint = os.environ["VISION_ENDPOINT"]
key = os.environ["VISION_KEY"]

client = ImageAnalysisClient(
    endpoint=endpoint,
    credential=AzureKeyCredential(key)
)
```

### Entra ID (Recommended)

```python
from azure.ai.vision.imageanalysis import ImageAnalysisClient
from azure.identity import DefaultAzureCredential

client = ImageAnalysisClient(
    endpoint=os.environ["VISION_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Analyze Image from URL

```python
from azure.ai.vision.imageanalysis.models import VisualFeatures

image_url = "https://example.com/image.jpg"

result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[
        VisualFeatures.CAPTION,
        VisualFeatures.TAGS,
        VisualFeatures.OBJECTS,
        VisualFeatures.READ,
        VisualFeatures.PEOPLE,
        VisualFeatures.SMART_CROPS,
        VisualFeatures.DENSE_CAPTIONS
    ],
    gender_neutral_caption=True,
    language="en"
)
```

## Analyze Image from File

```python
with open("image.jpg", "rb") as f:
    image_data = f.read()

result = client.analyze(
    image_data=image_data,
    visual_features=[VisualFeatures.CAPTION, VisualFeatures.TAGS]
)
```
|
||||
|
||||
## Image Caption

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.CAPTION],
    gender_neutral_caption=True
)

if result.caption:
    print(f"Caption: {result.caption.text}")
    print(f"Confidence: {result.caption.confidence:.2f}")
```

## Dense Captions (Multiple Regions)

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.DENSE_CAPTIONS]
)

if result.dense_captions:
    for caption in result.dense_captions.list:
        print(f"Caption: {caption.text}")
        print(f"  Confidence: {caption.confidence:.2f}")
        print(f"  Bounding box: {caption.bounding_box}")
```

## Tags

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.TAGS]
)

if result.tags:
    for tag in result.tags.list:
        print(f"Tag: {tag.name} (confidence: {tag.confidence:.2f})")
```

## Object Detection

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.OBJECTS]
)

if result.objects:
    for obj in result.objects.list:
        print(f"Object: {obj.tags[0].name}")
        print(f"  Confidence: {obj.tags[0].confidence:.2f}")
        box = obj.bounding_box
        print(f"  Bounding box: x={box.x}, y={box.y}, w={box.width}, h={box.height}")
```

## OCR (Text Extraction)

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.READ]
)

if result.read:
    for block in result.read.blocks:
        for line in block.lines:
            print(f"Line: {line.text}")
            print(f"  Bounding polygon: {line.bounding_polygon}")

            # Word-level details
            for word in line.words:
                print(f"  Word: {word.text} (confidence: {word.confidence:.2f})")
```

## People Detection

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.PEOPLE]
)

if result.people:
    for person in result.people.list:
        print("Person detected:")
        print(f"  Confidence: {person.confidence:.2f}")
        box = person.bounding_box
        print(f"  Bounding box: x={box.x}, y={box.y}, w={box.width}, h={box.height}")
```

## Smart Cropping

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.SMART_CROPS],
    smart_crops_aspect_ratios=[0.9, 1.33, 1.78]  # Portrait, 4:3, 16:9
)

if result.smart_crops:
    for crop in result.smart_crops.list:
        print(f"Aspect ratio: {crop.aspect_ratio}")
        box = crop.bounding_box
        print(f"  Crop region: x={box.x}, y={box.y}, w={box.width}, h={box.height}")
```

## Async Client

```python
import asyncio

from azure.ai.vision.imageanalysis.aio import ImageAnalysisClient
from azure.identity.aio import DefaultAzureCredential

async def analyze_image():
    async with ImageAnalysisClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        result = await client.analyze_from_url(
            image_url=image_url,
            visual_features=[VisualFeatures.CAPTION]
        )
        if result.caption:
            print(result.caption.text)

asyncio.run(analyze_image())
```

## Visual Features

| Feature | Description |
|---------|-------------|
| `CAPTION` | Single sentence describing the image |
| `DENSE_CAPTIONS` | Captions for multiple regions |
| `TAGS` | Content tags (objects, scenes, actions) |
| `OBJECTS` | Object detection with bounding boxes |
| `READ` | OCR text extraction |
| `PEOPLE` | People detection with bounding boxes |
| `SMART_CROPS` | Suggested crop regions for thumbnails |

## Error Handling

```python
from azure.core.exceptions import HttpResponseError

try:
    result = client.analyze_from_url(
        image_url=image_url,
        visual_features=[VisualFeatures.CAPTION]
    )
except HttpResponseError as e:
    print(f"Status code: {e.status_code}")
    print(f"Reason: {e.reason}")
    if e.error:  # may be None when the response carries no error details
        print(f"Message: {e.error.message}")
```

## Image Requirements

- Formats: JPEG, PNG, GIF, BMP, WEBP, ICO, TIFF, MPO
- Max size: 20 MB
- Dimensions: 50x50 to 16000x16000 pixels

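A cheap client-side pre-check against these limits avoids a wasted round trip for obviously invalid files. The sketch below is an illustrative stdlib-only helper (not part of the SDK): the limits are hard-coded from the list above, and it checks only file size and extension, not pixel dimensions.

```python
import os

# Extensions corresponding to the supported formats listed above (illustrative mapping)
ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".bmp", ".webp", ".ico", ".tif", ".tiff", ".mpo"}
MAX_BYTES = 20 * 1024 * 1024  # 20 MB service limit

def precheck_image(path: str) -> None:
    """Raise ValueError if the file obviously violates the service limits."""
    ext = os.path.splitext(path)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError(f"Unsupported format: {ext or '(no extension)'}")
    size = os.path.getsize(path)
    if size > MAX_BYTES:
        raise ValueError(f"File is {size} bytes; limit is {MAX_BYTES}")
```
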
## Best Practices

1. **Select only needed features** to optimize latency and cost
2. **Use async client** for high-throughput scenarios
3. **Handle HttpResponseError** for invalid images or auth issues
4. **Enable gender_neutral_caption** for inclusive descriptions
5. **Specify language** for localized captions
6. **Use smart_crops_aspect_ratios** matching your thumbnail requirements
7. **Cache results** when analyzing the same image multiple times
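
Practice 7 can be sketched with a content-addressed cache: key each result by a hash of the image bytes, so re-analyzing identical images costs nothing. This is an illustrative stdlib-only sketch; the `analyze` callable passed to the constructor is a stand-in for a real client call.

```python
import hashlib

class AnalysisCache:
    """Memoize analysis results keyed by the SHA-256 of the image bytes."""

    def __init__(self, analyze):
        # analyze: e.g. lambda data: client.analyze(image_data=data, visual_features=[...])
        self._analyze = analyze
        self._cache = {}

    def analyze(self, image_data: bytes):
        key = hashlib.sha256(image_data).hexdigest()
        if key not in self._cache:
            self._cache[key] = self._analyze(image_data)
        return self._cache[key]
```

An in-memory dict is enough for a single process; a persistent store (keyed the same way) would extend this across runs.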