Compare commits

26 Commits

| SHA1 |
| --- |
| 7c6abdfb72 |
| 768290ebd1 |
| 5ac9d8b9b7 |
| 3186c43cd9 |
| 7c8481bcb4 |
| 6670ca074f |
| 9dc93c1cb9 |
| 44e51f0ea9 |
| b0a8a59124 |
| 45bb3e5617 |
| 58489dfbaf |
| 35556e0306 |
| e7ae616385 |
| e06454dafd |
| 17bce709de |
| 167d7c97c7 |
| 817b7fe635 |
| 5f1f624b7f |
| cc2946b6d5 |
| c44f0f6505 |
| dd60bb2940 |
| b4e952d2a8 |
| d6fd03cea7 |
| ef994f7e5d |
| 183c792fef |
| 56720c9e1b |
.github/MAINTENANCE.md (vendored), 24 changed lines
```diff
@@ -1,4 +1,4 @@
-# 🛠️ Repository Maintenance Guide (V4)
+# 🛠️ Repository Maintenance Guide (V5)
 
 > **"If it's not documented, it's broken."**
 
```
```diff
@@ -145,6 +145,24 @@ Locations to check:
 - **Antigravity Badge**: Must point to `https://github.com/sickn33/antigravity-awesome-skills`, NOT `anthropics/antigravity`.
 - **License**: Ensure the link points to `LICENSE` file.
 
+### F. Workflows Consistency (NEW in V5)
+
+If you touch any Workflows-related artifact, keep all workflow surfaces in sync:
+
+1. `docs/WORKFLOWS.md` (human-readable playbooks)
+2. `data/workflows.json` (machine-readable schema)
+3. `skills/antigravity-workflows/SKILL.md` (orchestration entrypoint)
+
+Rules:
+
+- Every workflow id referenced in docs must exist in `data/workflows.json`.
+- If you add/remove a workflow step category, update prompt examples accordingly.
+- If a workflow references optional skills not yet merged (example: `go-playwright`), mark them explicitly as **optional** in docs.
+- If workflow onboarding text is changed, update the docs trinity:
+  - `README.md`
+  - `docs/GETTING_STARTED.md`
+  - `docs/FAQ.md`
+
 ---
 
 ## 3. 🛡️ Governance & Quality Bar
```
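The id-consistency rule in the hunk above can be checked mechanically. A minimal sketch, assuming a hypothetical `workflows.json` shape (a top-level `workflows` array with `id` fields) and a hypothetical `` `workflow:<id>` `` reference convention in docs; the repo's actual schema and reference style may differ:

```python
import json
import re

def missing_workflow_ids(doc_text: str, workflows_json: str) -> set:
    """Return workflow ids referenced in docs but absent from workflows.json."""
    # Assumed schema: {"workflows": [{"id": "..."}, ...]}
    known = {w["id"] for w in json.loads(workflows_json)["workflows"]}
    # Assumed convention: docs reference workflows as `workflow:<id>` code spans.
    referenced = set(re.findall(r"`workflow:([a-z0-9-]+)`", doc_text))
    return referenced - known

# Hypothetical data, for illustration only.
workflows = json.dumps({"workflows": [{"id": "release"}, {"id": "triage"}]})
docs = "Run `workflow:release`, then `workflow:hotfix` if needed."
print(sorted(missing_workflow_ids(docs, workflows)))  # ['hotfix']
```

Wired into CI, a non-empty result would fail the build and enforce the doc/schema sync rule automatically.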
```diff
@@ -198,6 +216,10 @@ When cutting a new version (e.g., V4):
   You cannot republish the same version; always bump `package.json` before publishing.
 - **Option B (CI):** On GitHub, create a **Release** (tag e.g. `v4.6.1`). The workflow [Publish to npm](.github/workflows/publish-npm.yml) runs on **Release published** and runs `npm publish` if the repo secret `NPM_TOKEN` is set (npm → Access Tokens → Granular token with Publish, then add as repo secret `NPM_TOKEN`).
 
+6. **Close linked issue(s)**:
+   - If the release completes an issue scope (feature/fix), close it with `gh issue close <id> --comment "..."`
+   - Include release tag reference in the closing note when applicable.
+
 ### 📋 Changelog Entry Template
 
 Each new release section in `CHANGELOG.md` should follow [Keep a Changelog](https://keepachangelog.com/) and this structure:
```
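The template itself is cut off in this excerpt. For orientation only, a generic Keep a Changelog release section (not this repo's actual template; version and date below are placeholders) looks like:

```markdown
## [4.6.1] - YYYY-MM-DD

### Added
- New skills or features.

### Changed
- Behavior or documentation updates.

### Fixed
- Bug fixes.
```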
CATALOG.md, 209 changed lines
```diff
@@ -1,10 +1,10 @@
 # Skill Catalog
 
-Generated at: 2026-02-08T10:02:49.626Z
+Generated at: 2026-02-08T00:00:00.000Z
 
-Total skills: 713
+Total skills: 856
 
-## architecture (63)
+## architecture (64)
 
 | Skill | Description | Tags | Triggers |
 | --- | --- | --- | --- |
```
```diff
@@ -68,6 +68,7 @@ Total skills: 713
 | `tool-design` | Build tools that agents can use effectively, including architectural reduction patterns | | agents, effectively, including, architectural, reduction |
 | `unreal-engine-cpp-pro` | Expert guide for Unreal Engine 5.x C++ development, covering UObject hygiene, performance patterns, and best practices. | unreal, engine, cpp | unreal, engine, cpp, pro, development, covering, uobject, hygiene, performance |
 | `wcag-audit-patterns` | Conduct WCAG 2.2 accessibility audits with automated testing, manual verification, and remediation guidance. Use when auditing websites for accessibility, fi... | wcag, audit | wcag, audit, conduct, accessibility, audits, automated, testing, manual, verification, remediation, guidance, auditing |
+| `wiki-architect` | Analyzes code repositories and generates hierarchical documentation structures with onboarding guides. Use when the user wants to create a wiki, generate doc... | wiki | wiki, architect, analyzes, code, repositories, generates, hierarchical, documentation, structures, onboarding, guides, user |
 | `workflow-orchestration-patterns` | Design durable workflows with Temporal for distributed systems. Covers workflow vs activity separation, saga patterns, state management, and determinism cons... | | orchestration, durable, temporal, distributed, covers, vs, activity, separation, saga, state, determinism, constraints |
 | `workflow-patterns` | Use this skill when implementing tasks according to Conductor's TDD workflow, handling phase checkpoints, managing git commits for tasks, or understanding th... | | skill, implementing, tasks, according, conductor, tdd, handling, phase, checkpoints, managing, git, commits |
 | `zapier-make-patterns` | No-code automation democratizes workflow building. Zapier and Make (formerly Integromat) let non-developers automate business processes without writing code.... | zapier, make | zapier, make, no, code, automation, democratizes, building, formerly, integromat, let, non, developers |
```
```diff
@@ -115,12 +116,15 @@ Total skills: 713
 | `team-composition-analysis` | This skill should be used when the user asks to "plan team structure", "determine hiring needs", "design org chart", "calculate compensation", "plan equity a... | team, composition | team, composition, analysis, skill, should, used, user, asks, plan, structure, determine, hiring |
 | `whatsapp-automation` | Automate WhatsApp Business tasks via Rube MCP (Composio): send messages, manage templates, upload media, and handle contacts. Always search tools first for c... | whatsapp | whatsapp, automation, automate, business, tasks, via, rube, mcp, composio, send, messages, upload |
 
-## data-ai (99)
+## data-ai (159)
 
 | Skill | Description | Tags | Triggers |
 | --- | --- | --- | --- |
+| `agent-framework-azure-ai-py` | Build Azure AI Foundry agents using the Microsoft Agent Framework Python SDK (agent-framework-azure-ai). Use when creating persistent agents with AzureAIAgen... | agent, framework, azure, ai, py | agent, framework, azure, ai, py, foundry, agents, microsoft, python, sdk, creating, persistent |
 | `agent-memory-mcp` | A hybrid memory system that provides persistent, searchable knowledge management for AI agents (Architecture, Patterns, Decisions). | agent, memory, mcp | agent, memory, mcp, hybrid, provides, persistent, searchable, knowledge, ai, agents, architecture, decisions |
 | `agent-tool-builder` | Tools are how AI agents interact with the world. A well-designed tool is the difference between an agent that works and one that hallucinates, fails silently... | agent, builder | agent, builder, how, ai, agents, interact, world, well, designed, difference, between, works |
+| `agents-v2-py` | Build container-based Foundry Agents using Azure AI Projects SDK with ImageBasedHostedAgentDefinition. Use when creating hosted agents that run custom code i... | agents, v2, py | agents, v2, py, container, foundry, azure, ai, sdk, imagebasedhostedagentdefinition, creating, hosted, run |
 | `ai-agents-architect` | Expert in designing and building autonomous AI agents. Masters tool use, memory systems, planning strategies, and multi-agent orchestration. Use when: build ... | ai, agents | ai, agents, architect, designing, building, autonomous, masters, memory, planning, multi, agent, orchestration |
 | `ai-engineer` | Build production-ready LLM applications, advanced RAG systems, and intelligent agents. Implements vector search, multimodal AI, agent orchestration, and ente... | ai | ai, engineer, llm, applications, rag, intelligent, agents, implements, vector, search, multimodal, agent |
 | `ai-wrapper-product` | Expert in building products that wrap AI APIs (OpenAI, Anthropic, etc.) into focused tools people will pay for. Not just 'ChatGPT but different' - products t... | ai, wrapper, product | ai, wrapper, product, building, products, wrap, apis, openai, anthropic, etc, people, pay |
```
```diff
@@ -130,6 +134,77 @@ Total skills: 713
 | `audio-transcriber` | Transform audio recordings into professional Markdown documentation with intelligent summaries using LLM integration | audio, transcription, whisper, meeting-minutes, speech-to-text | audio, transcription, whisper, meeting-minutes, speech-to-text, transcriber, transform, recordings, professional, markdown, documentation, intelligent |
 | `autonomous-agent-patterns` | Design patterns for building autonomous coding agents. Covers tool integration, permission systems, browser automation, and human-in-the-loop workflows. Use ... | autonomous, agent | autonomous, agent, building, coding, agents, covers, integration, permission, browser, automation, human, loop |
 | `autonomous-agents` | Autonomous agents are AI systems that can independently decompose goals, plan actions, execute tools, and self-correct without constant human guidance. The c... | autonomous, agents | autonomous, agents, ai, independently, decompose, goals, plan, actions, execute, self, correct, without |
+| `azure-ai-agents-persistent-dotnet` | Azure AI Agents Persistent SDK for .NET. Low-level SDK for creating and managing AI agents with threads, messages, runs, and tools. Use for agent CRUD, conve... | azure, ai, agents, persistent, dotnet | azure, ai, agents, persistent, dotnet, sdk, net, low, level, creating, managing, threads |
+| `azure-ai-agents-persistent-java` | Azure AI Agents Persistent SDK for Java. Low-level SDK for creating and managing AI agents with threads, messages, runs, and tools. Triggers: "PersistentAgen... | azure, ai, agents, persistent, java | azure, ai, agents, persistent, java, sdk, low, level, creating, managing, threads, messages |
+| `azure-ai-contentsafety-java` | Build content moderation applications with Azure AI Content Safety SDK for Java. Use when implementing text/image analysis, blocklist management, or harm det... | azure, ai, contentsafety, java | azure, ai, contentsafety, java, content, moderation, applications, safety, sdk, implementing, text, image |
+| `azure-ai-contentsafety-py` | Azure AI Content Safety SDK for Python. Use for detecting harmful content in text and images with multi-severity classification. Triggers: "azure-ai-contents... | azure, ai, contentsafety, py | azure, ai, contentsafety, py, content, safety, sdk, python, detecting, harmful, text, images |
+| `azure-ai-contentsafety-ts` | Analyze text and images for harmful content using Azure AI Content Safety (@azure-rest/ai-content-safety). Use when moderating user-generated content, detect... | azure, ai, contentsafety, ts | azure, ai, contentsafety, ts, analyze, text, images, harmful, content, safety, rest, moderating |
+| `azure-ai-contentunderstanding-py` | Azure AI Content Understanding SDK for Python. Use for multimodal content extraction from documents, images, audio, and video. Triggers: "azure-ai-contentund... | azure, ai, contentunderstanding, py | azure, ai, contentunderstanding, py, content, understanding, sdk, python, multimodal, extraction, documents, images |
+| `azure-ai-document-intelligence-dotnet` | Azure AI Document Intelligence SDK for .NET. Extract text, tables, and structured data from documents using prebuilt and custom models. Use for invoice proce... | azure, ai, document, intelligence, dotnet | azure, ai, document, intelligence, dotnet, sdk, net, extract, text, tables, structured, data |
+| `azure-ai-document-intelligence-ts` | Extract text, tables, and structured data from documents using Azure Document Intelligence (@azure-rest/ai-document-intelligence). Use when processing invoic... | azure, ai, document, intelligence, ts | azure, ai, document, intelligence, ts, extract, text, tables, structured, data, documents, rest |
+| `azure-ai-formrecognizer-java` | Build document analysis applications with Azure Document Intelligence (Form Recognizer) SDK for Java. Use when extracting text, tables, key-value pairs from ... | azure, ai, formrecognizer, java | azure, ai, formrecognizer, java, document, analysis, applications, intelligence, form, recognizer, sdk, extracting |
+| `azure-ai-ml-py` | Azure Machine Learning SDK v2 for Python. Use for ML workspaces, jobs, models, datasets, compute, and pipelines. Triggers: "azure-ai-ml", "MLClient", "worksp... | azure, ai, ml, py | azure, ai, ml, py, machine, learning, sdk, v2, python, workspaces, jobs, models |
+| `azure-ai-openai-dotnet` | Azure OpenAI SDK for .NET. Client library for Azure OpenAI and OpenAI services. Use for chat completions, embeddings, image generation, audio transcription, ... | azure, ai, openai, dotnet | azure, ai, openai, dotnet, sdk, net, client, library, chat, completions, embeddings, image |
+| `azure-ai-projects-dotnet` | Azure AI Projects SDK for .NET. High-level client for Azure AI Foundry projects including agents, connections, datasets, deployments, evaluations, and indexe... | azure, ai, dotnet | azure, ai, dotnet, sdk, net, high, level, client, foundry, including, agents, connections |
+| `azure-ai-projects-java` | Azure AI Projects SDK for Java. High-level SDK for Azure AI Foundry project management including connections, datasets, indexes, and evaluations. Triggers: "... | azure, ai, java | azure, ai, java, sdk, high, level, foundry, including, connections, datasets, indexes, evaluations |
+| `azure-ai-projects-py` | Build AI applications using the Azure AI Projects Python SDK (azure-ai-projects). Use when working with Foundry project clients, creating versioned agents wi... | azure, ai, py | azure, ai, py, applications, python, sdk, working, foundry, clients, creating, versioned, agents |
+| `azure-ai-projects-ts` | Build AI applications using Azure AI Projects SDK for JavaScript (@azure/ai-projects). Use when working with Foundry project clients, agents, connections, de... | azure, ai, ts | azure, ai, ts, applications, sdk, javascript, working, foundry, clients, agents, connections, deployments |
+| `azure-ai-textanalytics-py` | Azure AI Text Analytics SDK for sentiment analysis, entity recognition, key phrases, language detection, PII, and healthcare NLP. Use for natural language pr... | azure, ai, textanalytics, py | azure, ai, textanalytics, py, text, analytics, sdk, sentiment, analysis, entity, recognition, key |
+| `azure-ai-transcription-py` | Azure AI Transcription SDK for Python. Use for real-time and batch speech-to-text transcription with timestamps and diarization. Triggers: "transcription", "... | azure, ai, transcription, py | azure, ai, transcription, py, sdk, python, real, time, batch, speech, text, timestamps |
+| `azure-ai-translation-document-py` | Azure AI Document Translation SDK for batch translation of documents with format preservation. Use for translating Word, PDF, Excel, PowerPoint, and other do... | azure, ai, translation, document, py | azure, ai, translation, document, py, sdk, batch, documents, format, preservation, translating, word |
+| `azure-ai-translation-text-py` | Azure AI Text Translation SDK for real-time text translation, transliteration, language detection, and dictionary lookup. Use for translating text content in... | azure, ai, translation, text, py | azure, ai, translation, text, py, sdk, real, time, transliteration, language, detection, dictionary |
+| `azure-ai-translation-ts` | Build translation applications using Azure Translation SDKs for JavaScript (@azure-rest/ai-translation-text, @azure-rest/ai-translation-document). Use when i... | azure, ai, translation, ts | azure, ai, translation, ts, applications, sdks, javascript, rest, text, document, implementing, transliteration |
+| `azure-ai-vision-imageanalysis-java` | Build image analysis applications with Azure AI Vision SDK for Java. Use when implementing image captioning, OCR text extraction, object detection, tagging, ... | azure, ai, vision, imageanalysis, java | azure, ai, vision, imageanalysis, java, image, analysis, applications, sdk, implementing, captioning, ocr |
+| `azure-ai-vision-imageanalysis-py` | Azure AI Vision Image Analysis SDK for captions, tags, objects, OCR, people detection, and smart cropping. Use for computer vision and image understanding ta... | azure, ai, vision, imageanalysis, py | azure, ai, vision, imageanalysis, py, image, analysis, sdk, captions, tags, objects, ocr |
+| `azure-ai-voicelive-dotnet` | Azure AI Voice Live SDK for .NET. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants, conversational ... | azure, ai, voicelive, dotnet | azure, ai, voicelive, dotnet, voice, live, sdk, net, real, time, applications, bidirectional |
+| `azure-ai-voicelive-java` | Azure AI VoiceLive SDK for Java. Real-time bidirectional voice conversations with AI assistants using WebSocket. Triggers: "VoiceLiveClient java", "voice ass... | azure, ai, voicelive, java | azure, ai, voicelive, java, sdk, real, time, bidirectional, voice, conversations, assistants, websocket |
+| `azure-ai-voicelive-py` | Build real-time voice AI applications using Azure AI Voice Live SDK (azure-ai-voicelive). Use this skill when creating Python applications that need real-tim... | azure, ai, voicelive, py | azure, ai, voicelive, py, real, time, voice, applications, live, sdk, skill, creating |
+| `azure-ai-voicelive-ts` | Azure AI Voice Live SDK for JavaScript/TypeScript. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants... | azure, ai, voicelive, ts | azure, ai, voicelive, ts, voice, live, sdk, javascript, typescript, real, time, applications |
+| `azure-communication-callautomation-java` | Build call automation workflows with Azure Communication Services Call Automation Java SDK. Use when implementing IVR systems, call routing, call recording, ... | azure, communication, callautomation, java | azure, communication, callautomation, java, call, automation, sdk, implementing, ivr, routing, recording, dtmf |
+| `azure-cosmos-java` | Azure Cosmos DB SDK for Java. NoSQL database operations with global distribution, multi-model support, and reactive patterns. Triggers: "CosmosClient java", ... | azure, cosmos, java | azure, cosmos, java, db, sdk, nosql, database, operations, global, distribution, multi, model |
+| `azure-cosmos-py` | Azure Cosmos DB SDK for Python (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data. Triggers: "cosmos db", "CosmosClient",... | azure, cosmos, py | azure, cosmos, py, db, sdk, python, nosql, api, document, crud, queries, containers |
+| `azure-cosmos-rust` | Azure Cosmos DB SDK for Rust (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data. Triggers: "cosmos db rust", "CosmosClien... | azure, cosmos, rust | azure, cosmos, rust, db, sdk, nosql, api, document, crud, queries, containers, globally |
+| `azure-cosmos-ts` | Azure Cosmos DB JavaScript/TypeScript SDK (@azure/cosmos) for data plane operations. Use for CRUD operations on documents, queries, bulk operations, and cont... | azure, cosmos, ts | azure, cosmos, ts, db, javascript, typescript, sdk, data, plane, operations, crud, documents |
+| `azure-data-tables-java` | Build table storage applications with Azure Tables SDK for Java. Use when working with Azure Table Storage or Cosmos DB Table API for NoSQL key-value data, s... | azure, data, tables, java | azure, data, tables, java, table, storage, applications, sdk, working, cosmos, db, api |
+| `azure-data-tables-py` | Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations. Triggers: "table storage", "TableSer... | azure, data, tables, py | azure, data, tables, py, sdk, python, storage, cosmos, db, nosql, key, value |
+| `azure-eventhub-dotnet` | Azure Event Hubs SDK for .NET. Use for high-throughput event streaming: sending events (EventHubProducerClient, EventHubBufferedProducerClient), receiving ev... | azure, eventhub, dotnet | azure, eventhub, dotnet, event, hubs, sdk, net, high, throughput, streaming, sending, events |
+| `azure-eventhub-java` | Build real-time streaming applications with Azure Event Hubs SDK for Java. Use when implementing event streaming, high-throughput data ingestion, or building... | azure, eventhub, java | azure, eventhub, java, real, time, streaming, applications, event, hubs, sdk, implementing, high |
+| `azure-eventhub-rust` | Azure Event Hubs SDK for Rust. Use for sending and receiving events, streaming data ingestion. Triggers: "event hubs rust", "ProducerClient rust", "ConsumerC... | azure, eventhub, rust | azure, eventhub, rust, event, hubs, sdk, sending, receiving, events, streaming, data, ingestion |
+| `azure-eventhub-ts` | Build event streaming applications using Azure Event Hubs SDK for JavaScript (@azure/event-hubs). Use when implementing high-throughput event ingestion, real... | azure, eventhub, ts | azure, eventhub, ts, event, streaming, applications, hubs, sdk, javascript, implementing, high, throughput |
+| `azure-maps-search-dotnet` | Azure Maps SDK for .NET. Location-based services including geocoding, routing, rendering, geolocation, and weather. Use for address search, directions, map t... | azure, maps, search, dotnet | azure, maps, search, dotnet, sdk, net, location, including, geocoding, routing, rendering, geolocation |
+| `azure-monitor-ingestion-java` | Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE). Triggers: "LogsI... | azure, monitor, ingestion, java | azure, monitor, ingestion, java, sdk, send, custom, logs, via, data, collection, rules |
+| `azure-monitor-ingestion-py` | Azure Monitor Ingestion SDK for Python. Use for sending custom logs to Log Analytics workspace via Logs Ingestion API. Triggers: "azure-monitor-ingestion", "... | azure, monitor, ingestion, py | azure, monitor, ingestion, py, sdk, python, sending, custom, logs, log, analytics, workspace |
+| `azure-monitor-query-java` | Azure Monitor Query SDK for Java. Execute Kusto queries against Log Analytics workspaces and query metrics from Azure resources. Triggers: "LogsQueryClient j... | azure, monitor, query, java | azure, monitor, query, java, sdk, execute, kusto, queries, against, log, analytics, workspaces |
+| `azure-monitor-query-py` | Azure Monitor Query SDK for Python. Use for querying Log Analytics workspaces and Azure Monitor metrics. Triggers: "azure-monitor-query", "LogsQueryClient", ... | azure, monitor, query, py | azure, monitor, query, py, sdk, python, querying, log, analytics, workspaces, metrics, triggers |
+| `azure-postgres-ts` | Connect to Azure Database for PostgreSQL Flexible Server from Node.js/TypeScript using the pg (node-postgres) package. Use for PostgreSQL queries, connection... | azure, postgres, ts | azure, postgres, ts, connect, database, postgresql, flexible, server, node, js, typescript, pg |
+| `azure-resource-manager-cosmosdb-dotnet` | Azure Resource Manager SDK for Cosmos DB in .NET. Use for MANAGEMENT PLANE operations: creating/managing Cosmos DB accounts, databases, containers, throughpu... | azure, resource, manager, cosmosdb, dotnet | azure, resource, manager, cosmosdb, dotnet, sdk, cosmos, db, net, plane, operations, creating |
+| `azure-resource-manager-mysql-dotnet` | Azure MySQL Flexible Server SDK for .NET. Database management for MySQL Flexible Server deployments. Use for creating servers, databases, firewall rules, con... | azure, resource, manager, mysql, dotnet | azure, resource, manager, mysql, dotnet, flexible, server, sdk, net, database, deployments, creating |
+| `azure-resource-manager-postgresql-dotnet` | Azure PostgreSQL Flexible Server SDK for .NET. Database management for PostgreSQL Flexible Server deployments. Use for creating servers, databases, firewall ... | azure, resource, manager, postgresql, dotnet | azure, resource, manager, postgresql, dotnet, flexible, server, sdk, net, database, deployments, creating |
+| `azure-resource-manager-redis-dotnet` | Azure Resource Manager SDK for Redis in .NET. Use for MANAGEMENT PLANE operations: creating/managing Azure Cache for Redis instances, firewall rules, access ... | azure, resource, manager, redis, dotnet | azure, resource, manager, redis, dotnet, sdk, net, plane, operations, creating, managing, cache |
+| `azure-resource-manager-sql-dotnet` | Azure Resource Manager SDK for Azure SQL in .NET. Use for MANAGEMENT PLANE operations: creating/managing SQL servers, databases, elastic pools, firewall rule... | azure, resource, manager, sql, dotnet | azure, resource, manager, sql, dotnet, sdk, net, plane, operations, creating, managing, servers |
+| `azure-search-documents-dotnet` | Azure AI Search SDK for .NET (Azure.Search.Documents). Use for building search applications with full-text, vector, semantic, and hybrid search. Covers Searc... | azure, search, documents, dotnet | azure, search, documents, dotnet, ai, sdk, net, building, applications, full, text, vector |
+| `azure-search-documents-py` | Azure AI Search SDK for Python. Use for vector search, hybrid search, semantic ranking, indexing, and skillsets. Triggers: "azure-search-documents", "SearchC... | azure, search, documents, py | azure, search, documents, py, ai, sdk, python, vector, hybrid, semantic, ranking, indexing |
+| `azure-search-documents-ts` | Build search applications using Azure AI Search SDK for JavaScript (@azure/search-documents). Use when creating/managing indexes, implementing vector/hybrid ... | azure, search, documents, ts | azure, search, documents, ts, applications, ai, sdk, javascript, creating, managing, indexes, implementing |
+| `azure-storage-blob-java` | Build blob storage applications with Azure Storage Blob SDK for Java. Use when uploading, downloading, or managing files in Azure Blob Storage, working with ... | azure, storage, blob, java | azure, storage, blob, java, applications, sdk, uploading, downloading, managing, files, working, containers |
+| `azure-storage-file-datalake-py` | Azure Data Lake Storage Gen2 SDK for Python. Use for hierarchical file systems, big data analytics, and file/directory operations. Triggers: "data lake", "Da... | azure, storage, file, datalake, py | azure, storage, file, datalake, py, data, lake, gen2, sdk, python, hierarchical, big |
 | `beautiful-prose` | Hard-edged writing style contract for timeless, forceful English prose without AI tics | beautiful, prose | beautiful, prose, hard, edged, writing, style, contract, timeless, forceful, english, without, ai |
 | `behavioral-modes` | AI operational modes (brainstorm, implement, debug, review, teach, ship, orchestrate). Use to adapt behavior based on task type. | behavioral, modes | behavioral, modes, ai, operational, brainstorm, debug, review, teach, ship, orchestrate, adapt, behavior |
 | `blockrun` | Use when user needs capabilities Claude lacks (image generation, real-time X/Twitter data) or explicitly requests external models ("blockrun", "use grok", "u... | blockrun | blockrun, user, capabilities, claude, lacks, image, generation, real, time, twitter, data, explicitly |
```
```diff
@@ -163,10 +238,13 @@ Total skills: 713
 | `fal-workflow` | Generate workflow JSON files for chaining AI models | fal | fal, generate, json, files, chaining, ai, models |
 | `fp-ts-react` | Practical patterns for using fp-ts with React - hooks, state, forms, data fetching. Use when building React apps with functional programming patterns. Works ... | fp, ts, react | fp, ts, react, practical, hooks, state, forms, data, fetching, building, apps, functional |
 | `frontend-dev-guidelines` | Opinionated frontend development standards for modern React + TypeScript applications. Covers Suspense-first data fetching, lazy loading, feature-based archi... | frontend, dev, guidelines | frontend, dev, guidelines, opinionated, development, standards, react, typescript, applications, covers, suspense, first |
 | `frontend-ui-dark-ts` | Build dark-themed React applications using Tailwind CSS with custom theming, glassmorphism effects, and Framer Motion animations. Use when creating dashboard... | frontend, ui, dark, ts | frontend, ui, dark, ts, themed, react, applications, tailwind, css, custom, theming, glassmorphism |
 | `geo-fundamentals` | Generative Engine Optimization for AI search engines (ChatGPT, Claude, Perplexity). | geo, fundamentals | geo, fundamentals, generative, engine, optimization, ai, search, engines, chatgpt, claude, perplexity |
 | `google-analytics-automation` | Automate Google Analytics tasks via Rube MCP (Composio): run reports, list accounts/properties, funnels, pivots, key events. Always search tools first for cu... | google, analytics | google, analytics, automation, automate, tasks, via, rube, mcp, composio, run, reports, list |
 | `googlesheets-automation` | Automate Google Sheets operations (read, write, format, filter, manage spreadsheets) via Rube MCP (Composio). Read/write data, manage tabs, apply formatting,... | googlesheets | googlesheets, automation, automate, google, sheets, operations, read, write, format, filter, spreadsheets, via |
 | `graphql` | GraphQL gives clients exactly the data they need - no more, no less. One endpoint, typed schema, introspection. But the flexibility that makes it powerful al... | graphql | graphql, gives, clients, exactly, data, no, less, one, endpoint, typed, schema, introspection |
 | `hosted-agents-v2-py` | Build hosted agents using Azure AI Projects SDK with ImageBasedHostedAgentDefinition. Use when creating container-based agents that run custom code in Azure ... | hosted, agents, v2, py | hosted, agents, v2, py, azure, ai, sdk, imagebasedhostedagentdefinition, creating, container, run, custom |
 | `hybrid-search-implementation` | Combine vector and keyword search for improved retrieval. Use when implementing RAG systems, building search engines, or when neither approach alone provides... | hybrid, search | hybrid, search, combine, vector, keyword, improved, retrieval, implementing, rag, building, engines, neither |
 | `ios-developer` | Develop native iOS applications with Swift/SwiftUI. Masters iOS 18, SwiftUI, UIKit integration, Core Data, networking, and App Store optimization. Use PROACT... | ios | ios, developer, develop, native, applications, swift, swiftui, masters, 18, uikit, integration, core |
 | `langchain-architecture` | Design LLM applications using the LangChain framework with agents, memory, and tool integration patterns. Use when building LangChain applications, implement... | langchain, architecture | langchain, architecture, llm, applications, framework, agents, memory, integration, building, implementing, ai, creating |
```
@@ -182,12 +260,14 @@ Total skills: 713
| `nextjs-best-practices` | Next.js App Router principles. Server Components, data fetching, routing patterns. | nextjs, best, practices | nextjs, best, practices, next, js, app, router, principles, server, components, data, fetching |
| `nodejs-backend-patterns` | Build production-ready Node.js backend services with Express/Fastify, implementing middleware patterns, error handling, authentication, database integration,... | nodejs, backend | nodejs, backend, node, js, express, fastify, implementing, middleware, error, handling, authentication, database |
| `php-pro` | Write idiomatic PHP code with generators, iterators, SPL data structures, and modern OOP features. Use PROACTIVELY for high-performance PHP applications. | php | php, pro, write, idiomatic, code, generators, iterators, spl, data, structures, oop, features |
| `podcast-generation` | Generate AI-powered podcast-style audio narratives using Azure OpenAI's GPT Realtime Mini model via WebSocket. Use when building text-to-speech features, aud... | podcast, generation | podcast, generation, generate, ai, powered, style, audio, narratives, azure, openai, gpt, realtime |
| `postgres-best-practices` | Postgres performance optimization and best practices from Supabase. Use this skill when writing, reviewing, or optimizing Postgres queries, schema designs, o... | postgres, best, practices | postgres, best, practices, supabase, performance, optimization, skill, writing, reviewing, optimizing, queries, schema |
| `postgresql` | Design a PostgreSQL-specific schema. Covers best-practices, data types, indexing, constraints, performance patterns, and advanced features | postgresql | postgresql, specific, schema, covers, data, types, indexing, constraints, performance, features |
| `prisma-expert` | Prisma ORM expert for schema design, migrations, query optimization, relations modeling, and database operations. Use PROACTIVELY for Prisma schema issues, m... | prisma | prisma, orm, schema, migrations, query, optimization, relations, modeling, database, operations, proactively, issues |
| `programmatic-seo` | Design and evaluate programmatic SEO strategies for creating SEO-driven pages at scale using templates and structured data. Use when the user mentions progra... | programmatic, seo | programmatic, seo, evaluate, creating, driven, pages, scale, structured, data, user, mentions, directory |
| `prompt-caching` | Caching strategies for LLM prompts including Anthropic prompt caching, response caching, and CAG (Cache Augmented Generation). Use when: prompt caching, cache... | prompt, caching | prompt, caching, llm, prompts, including, anthropic, response, cag, cache, augmented, generation |
| `prompt-engineering-patterns` | Master advanced prompt engineering techniques to maximize LLM performance, reliability, and controllability in production. Use when optimizing prompts, impro... | prompt, engineering | prompt, engineering, techniques, maximize, llm, performance, reliability, controllability, optimizing, prompts, improving, outputs |
| `pydantic-models-py` | Create Pydantic models following the multi-model pattern with Base, Create, Update, Response, and InDB variants. Use when defining API request/response schem... | pydantic, models, py | pydantic, models, py, following, multi, model, base, update, response, indb, variants, defining |
| `rag-engineer` | Expert in building Retrieval-Augmented Generation systems. Masters embedding models, vector databases, chunking strategies, and retrieval optimization for LL... | rag | rag, engineer, building, retrieval, augmented, generation, masters, embedding, models, vector, databases, chunking |
| `rag-implementation` | Build Retrieval-Augmented Generation (RAG) systems for LLM applications with vector databases and semantic search. Use when implementing knowledge-grounded A... | rag | rag, retrieval, augmented, generation, llm, applications, vector, databases, semantic, search, implementing, knowledge |
| `react-best-practices` | React and Next.js performance optimization guidelines from Vercel Engineering. This skill should be used when writing, reviewing, or refactoring React/Next.j... | react, best, practices | react, best, practices, vercel, next, js, performance, optimization, guidelines, engineering, skill, should |
@@ -199,6 +279,7 @@ Total skills: 713
| `senior-architect` | Comprehensive software architecture skill for designing scalable, maintainable systems using ReactJS, NextJS, NodeJS, Express, React Native, Swift, Kotlin, F... | senior | senior, architect, software, architecture, skill, designing, scalable, maintainable, reactjs, nextjs, nodejs, express |
| `seo-audit` | Diagnose and audit SEO issues affecting crawlability, indexation, rankings, and organic performance. Use when the user asks for an SEO audit, technical SEO r... | seo, audit | seo, audit, diagnose, issues, affecting, crawlability, indexation, rankings, organic, performance, user, asks |
| `similarity-search-patterns` | Implement efficient similarity search with vector databases. Use when building semantic search, implementing nearest neighbor queries, or optimizing retrieva... | similarity, search | similarity, search, efficient, vector, databases, building, semantic, implementing, nearest, neighbor, queries, optimizing |
| `skill-creator-ms` | Guide for creating effective skills for AI coding agents working with Azure SDKs and Microsoft Foundry services. Use when creating new skills or updating exi... | skill, creator, ms | skill, creator, ms, creating, effective, skills, ai, coding, agents, working, azure, sdks |
| `skill-seekers` | Automatically convert documentation websites, GitHub repositories, and PDFs into Claude AI skills in minutes. | skill, seekers | skill, seekers, automatically, convert, documentation, websites, github, repositories, pdfs, claude, ai, skills |
| `spark-optimization` | Optimize Apache Spark jobs with partitioning, caching, shuffle optimization, and memory tuning. Use when improving Spark performance, debugging slow jobs, or... | spark, optimization | spark, optimization, optimize, apache, jobs, partitioning, caching, shuffle, memory, tuning, improving, performance |
| `sql-optimization-patterns` | Master SQL query optimization, indexing strategies, and EXPLAIN analysis to dramatically improve database performance and eliminate slow queries. Use when de... | sql, optimization | sql, optimization, query, indexing, explain, analysis, dramatically, improve, database, performance, eliminate, slow |
@@ -219,7 +300,7 @@ Total skills: 713
| `xlsx-official` | Comprehensive spreadsheet creation, editing, and analysis with support for formulas, formatting, data analysis, and visualization. When Claude needs to work ... | xlsx, official | xlsx, official, spreadsheet, creation, editing, analysis, formulas, formatting, data, visualization, claude, work |
| `youtube-automation` | Automate YouTube tasks via Rube MCP (Composio): upload videos, manage playlists, search content, get analytics, and handle comments. Always search tools firs... | youtube | youtube, automation, automate, tasks, via, rube, mcp, composio, upload, videos, playlists, search |

## development (82)

## development (127)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -231,19 +312,74 @@ Total skills: 713
| `app-store-optimization` | Complete App Store Optimization (ASO) toolkit for researching, optimizing, and tracking mobile app performance on Apple App Store and Google Play Store | app, store, optimization | app, store, optimization, complete, aso, toolkit, researching, optimizing, tracking, mobile, performance, apple |
| `architecture-patterns` | Implement proven backend architecture patterns including Clean Architecture, Hexagonal Architecture, and Domain-Driven Design. Use when architecting complex ... | architecture | architecture, proven, backend, including, clean, hexagonal, domain, driven, architecting, complex, refactoring, existing |
| `async-python-patterns` | Master Python asyncio, concurrent programming, and async/await patterns for high-performance applications. Use when building async APIs, concurrent systems, ... | async, python | async, python, asyncio, concurrent, programming, await, high, performance, applications, building, apis, bound |
| `azure-appconfiguration-java` | Azure App Configuration SDK for Java. Centralized application configuration management with key-value settings, feature flags, and snapshots. Triggers: "Conf... | azure, appconfiguration, java | azure, appconfiguration, java, app, configuration, sdk, centralized, application, key, value, settings, feature |
| `azure-appconfiguration-py` | Azure App Configuration SDK for Python. Use for centralized configuration management, feature flags, and dynamic settings. Triggers: "azure-appconfiguration"... | azure, appconfiguration, py | azure, appconfiguration, py, app, configuration, sdk, python, centralized, feature, flags, dynamic, settings |
| `azure-appconfiguration-ts` | Build applications using Azure App Configuration SDK for JavaScript (@azure/app-configuration). Use when working with configuration settings, feature flags, ... | azure, appconfiguration, ts | azure, appconfiguration, ts, applications, app, configuration, sdk, javascript, working, settings, feature, flags |
| `azure-communication-callingserver-java` | Azure Communication Services CallingServer (legacy) Java SDK. Note - This SDK is deprecated. Use azure-communication-callautomation instead for new projects.... | azure, communication, callingserver, java | azure, communication, callingserver, java, legacy, sdk, note, deprecated, callautomation, instead, new, skill |
| `azure-communication-chat-java` | Build real-time chat applications with Azure Communication Services Chat Java SDK. Use when implementing chat threads, messaging, participants, read receipts... | azure, communication, chat, java | azure, communication, chat, java, real, time, applications, sdk, implementing, threads, messaging, participants |
| `azure-communication-common-java` | Azure Communication Services common utilities for Java. Use when working with CommunicationTokenCredential, user identifiers, token refresh, or shared authen... | azure, communication, common, java | azure, communication, common, java, utilities, working, communicationtokencredential, user, identifiers, token, refresh, shared |
| `azure-communication-sms-java` | Send SMS messages with Azure Communication Services SMS Java SDK. Use when implementing SMS notifications, alerts, OTP delivery, bulk messaging, or delivery ... | azure, communication, sms, java | azure, communication, sms, java, send, messages, sdk, implementing, notifications, alerts, otp, delivery |
| `azure-compute-batch-java` | Azure Batch SDK for Java. Run large-scale parallel and HPC batch jobs with pools, jobs, tasks, and compute nodes. Triggers: "BatchClient java", "azure batch ... | azure, compute, batch, java | azure, compute, batch, java, sdk, run, large, scale, parallel, hpc, jobs, pools |
| `azure-containerregistry-py` | Azure Container Registry SDK for Python. Use for managing container images, artifacts, and repositories. Triggers: "azure-containerregistry", "ContainerRegis... | azure, containerregistry, py | azure, containerregistry, py, container, registry, sdk, python, managing, images, artifacts, repositories, triggers |
| `azure-eventgrid-dotnet` | Azure Event Grid SDK for .NET. Client library for publishing and consuming events with Azure Event Grid. Use for event-driven architectures, pub/sub messagin... | azure, eventgrid, dotnet | azure, eventgrid, dotnet, event, grid, sdk, net, client, library, publishing, consuming, events |
| `azure-eventgrid-java` | Build event-driven applications with Azure Event Grid SDK for Java. Use when publishing events, implementing pub/sub patterns, or integrating with Azure serv... | azure, eventgrid, java | azure, eventgrid, java, event, driven, applications, grid, sdk, publishing, events, implementing, pub |
| `azure-eventgrid-py` | Azure Event Grid SDK for Python. Use for publishing events, handling CloudEvents, and event-driven architectures. Triggers: "event grid", "EventGridPublisher... | azure, eventgrid, py | azure, eventgrid, py, event, grid, sdk, python, publishing, events, handling, cloudevents, driven |
| `azure-eventhub-py` | Azure Event Hubs SDK for Python streaming. Use for high-throughput event ingestion, producers, consumers, and checkpointing. Triggers: "event hubs", "EventHu... | azure, eventhub, py | azure, eventhub, py, event, hubs, sdk, python, streaming, high, throughput, ingestion, producers |
| `azure-functions` | Expert patterns for Azure Functions development including isolated worker model, Durable Functions orchestration, cold start optimization, and production pat... | azure, functions | azure, functions, development, including, isolated, worker, model, durable, orchestration, cold, start, optimization |
| `azure-identity-rust` | Azure Identity SDK for Rust authentication. Use for DeveloperToolsCredential, ManagedIdentityCredential, ClientSecretCredential, and token-based authenticati... | azure, identity, rust | azure, identity, rust, sdk, authentication, developertoolscredential, managedidentitycredential, clientsecretcredential, token, triggers, managed, credential |
| `azure-keyvault-certificates-rust` | Azure Key Vault Certificates SDK for Rust. Use for creating, importing, and managing certificates. Triggers: "keyvault certificates rust", "CertificateClient... | azure, keyvault, certificates, rust | azure, keyvault, certificates, rust, key, vault, sdk, creating, importing, managing, triggers, certificateclient |
| `azure-keyvault-keys-rust` | Azure Key Vault Keys SDK for Rust. Use for creating, managing, and using cryptographic keys. Triggers: "keyvault keys rust", "KeyClient rust", "create key ru... | azure, keyvault, keys, rust | azure, keyvault, keys, rust, key, vault, sdk, creating, managing, cryptographic, triggers, keyclient |
| `azure-keyvault-keys-ts` | Manage cryptographic keys using Azure Key Vault Keys SDK for JavaScript (@azure/keyvault-keys). Use when creating, encrypting/decrypting, signing, or rotatin... | azure, keyvault, keys, ts | azure, keyvault, keys, ts, cryptographic, key, vault, sdk, javascript, creating, encrypting, decrypting |
| `azure-messaging-webpubsub-java` | Build real-time web applications with Azure Web PubSub SDK for Java. Use when implementing WebSocket-based messaging, live updates, chat applications, or ser... | azure, messaging, webpubsub, java | azure, messaging, webpubsub, java, real, time, web, applications, pubsub, sdk, implementing, websocket |
| `azure-mgmt-apicenter-dotnet` | Azure API Center SDK for .NET. Centralized API inventory management with governance, versioning, and discovery. Use for creating API services, workspaces, AP... | azure, mgmt, apicenter, dotnet | azure, mgmt, apicenter, dotnet, api, center, sdk, net, centralized, inventory, governance, versioning |
| `azure-mgmt-apicenter-py` | Azure API Center Management SDK for Python. Use for managing API inventory, metadata, and governance across your organization. Triggers: "azure-mgmt-apicente... | azure, mgmt, apicenter, py | azure, mgmt, apicenter, py, api, center, sdk, python, managing, inventory, metadata, governance |
| `azure-mgmt-apimanagement-py` | Azure API Management SDK for Python. Use for managing APIM services, APIs, products, subscriptions, and policies. Triggers: "azure-mgmt-apimanagement", "ApiM... | azure, mgmt, apimanagement, py | azure, mgmt, apimanagement, py, api, sdk, python, managing, apim, apis, products, subscriptions |
| `azure-mgmt-fabric-dotnet` | Azure Resource Manager SDK for Fabric in .NET. Use for MANAGEMENT PLANE operations: provisioning, scaling, suspending/resuming Microsoft Fabric capacities, c... | azure, mgmt, fabric, dotnet | azure, mgmt, fabric, dotnet, resource, manager, sdk, net, plane, operations, provisioning, scaling |
| `azure-mgmt-fabric-py` | Azure Fabric Management SDK for Python. Use for managing Microsoft Fabric capacities and resources. Triggers: "azure-mgmt-fabric", "FabricMgmtClient", "Fabri... | azure, mgmt, fabric, py | azure, mgmt, fabric, py, sdk, python, managing, microsoft, capacities, resources, triggers, fabricmgmtclient |
| `azure-mgmt-mongodbatlas-dotnet` | Manage MongoDB Atlas Organizations as Azure ARM resources using Azure.ResourceManager.MongoDBAtlas SDK. Use when creating, updating, listing, or deleting Mon... | azure, mgmt, mongodbatlas, dotnet | azure, mgmt, mongodbatlas, dotnet, mongodb, atlas, organizations, arm, resources, resourcemanager, sdk, creating |
| `azure-monitor-opentelemetry-exporter-py` | Azure Monitor OpenTelemetry Exporter for Python. Use for low-level OpenTelemetry export to Application Insights. Triggers: "azure-monitor-opentelemetry-expor... | azure, monitor, opentelemetry, exporter, py | azure, monitor, opentelemetry, exporter, py, python, low, level, export, application, insights, triggers |
| `azure-monitor-opentelemetry-py` | Azure Monitor OpenTelemetry Distro for Python. Use for one-line Application Insights setup with auto-instrumentation. Triggers: "azure-monitor-opentelemetry"... | azure, monitor, opentelemetry, py | azure, monitor, opentelemetry, py, distro, python, one, line, application, insights, setup, auto |
| `azure-resource-manager-durabletask-dotnet` | Azure Resource Manager SDK for Durable Task Scheduler in .NET. Use for MANAGEMENT PLANE operations: creating/managing Durable Task Schedulers, Task Hubs, and... | azure, resource, manager, durabletask, dotnet | azure, resource, manager, durabletask, dotnet, sdk, durable, task, scheduler, net, plane, operations |
| `azure-resource-manager-playwright-dotnet` | Azure Resource Manager SDK for Microsoft Playwright Testing in .NET. Use for MANAGEMENT PLANE operations: creating/managing Playwright Testing workspaces, ch... | azure, resource, manager, playwright, dotnet | azure, resource, manager, playwright, dotnet, sdk, microsoft, testing, net, plane, operations, creating |
| `azure-speech-to-text-rest-py` | Azure Speech to Text REST API for short audio (Python). Use for simple speech recognition of audio files up to 60 seconds without the Speech SDK. Triggers: "... | azure, speech, to, text, rest, py | azure, speech, to, text, rest, py, api, short, audio, python, simple, recognition |
| `azure-storage-blob-py` | Azure Blob Storage SDK for Python. Use for uploading, downloading, listing blobs, managing containers, and blob lifecycle. Triggers: "blob storage", "BlobSer... | azure, storage, blob, py | azure, storage, blob, py, sdk, python, uploading, downloading, listing, blobs, managing, containers |
| `azure-storage-blob-rust` | Azure Blob Storage SDK for Rust. Use for uploading, downloading, and managing blobs and containers. Triggers: "blob storage rust", "BlobClient rust", "upload... | azure, storage, blob, rust | azure, storage, blob, rust, sdk, uploading, downloading, managing, blobs, containers, triggers, blobclient |
| `azure-storage-blob-ts` | Azure Blob Storage JavaScript/TypeScript SDK (@azure/storage-blob) for blob operations. Use for uploading, downloading, listing, and managing blobs and conta... | azure, storage, blob, ts | azure, storage, blob, ts, javascript, typescript, sdk, operations, uploading, downloading, listing, managing |
| `azure-storage-file-share-ts` | Azure File Share JavaScript/TypeScript SDK (@azure/storage-file-share) for SMB file share operations. Use for creating shares, managing directories, uploadin... | azure, storage, file, share, ts | azure, storage, file, share, ts, javascript, typescript, sdk, smb, operations, creating, shares |
| `azure-storage-queue-py` | Azure Queue Storage SDK for Python. Use for reliable message queuing, task distribution, and asynchronous processing. Triggers: "queue storage", "QueueServic... | azure, storage, queue, py | azure, storage, queue, py, sdk, python, reliable, message, queuing, task, distribution, asynchronous |
| `azure-storage-queue-ts` | Azure Queue Storage JavaScript/TypeScript SDK (@azure/storage-queue) for message queue operations. Use for sending, receiving, peeking, and deleting messages... | azure, storage, queue, ts | azure, storage, queue, ts, javascript, typescript, sdk, message, operations, sending, receiving, peeking |
| `azure-web-pubsub-ts` | Build real-time messaging applications using Azure Web PubSub SDKs for JavaScript (@azure/web-pubsub, @azure/web-pubsub-client). Use when implementing WebSoc... | azure, web, pubsub, ts | azure, web, pubsub, ts, real, time, messaging, applications, sdks, javascript, client, implementing |
| `backend-dev-guidelines` | Opinionated backend development standards for Node.js + Express + TypeScript microservices. Covers layered architecture, BaseController pattern, dependency i... | backend, dev, guidelines | backend, dev, guidelines, opinionated, development, standards, node, js, express, typescript, microservices, covers |
| `bullmq-specialist` | BullMQ expert for Redis-backed job queues, background processing, and reliable async execution in Node.js/TypeScript applications. Use when: bullmq, bull que... | bullmq | bullmq, redis, backed, job, queues, background, processing, reliable, async, execution, node, js |
| `bun-development` | Modern JavaScript/TypeScript development with Bun runtime. Covers package management, bundling, testing, and migration from Node.js. Use when working with Bu... | bun | bun, development, javascript, typescript, runtime, covers, package, bundling, testing, migration, node, js |
| `cc-skill-coding-standards` | Universal coding standards, best practices, and patterns for TypeScript, JavaScript, React, and Node.js development. | cc, skill, coding, standards | cc, skill, coding, standards, universal, typescript, javascript, react, node, js, development |
| `cc-skill-frontend-patterns` | Frontend development patterns for React, Next.js, state management, performance optimization, and UI best practices. | cc, skill, frontend | cc, skill, frontend, development, react, next, js, state, performance, optimization, ui |
| `context7-auto-research` | Automatically fetch the latest library/framework documentation for Claude Code via the Context7 API | context7, auto, research | context7, auto, research, automatically, fetch, latest, library, framework, documentation, claude, code, via |
| `copilot-sdk` | Build applications powered by GitHub Copilot using the Copilot SDK. Use when creating programmatic integrations with Copilot across Node.js/TypeScript, Pytho... | copilot, sdk | copilot, sdk, applications, powered, github, creating, programmatic, integrations, node, js, typescript, python |
| `csharp-pro` | Write modern C# code with advanced features like records, pattern matching, and async/await. Optimizes .NET applications, implements enterprise patterns, and... | csharp | csharp, pro, write, code, features, like, records, matching, async, await, optimizes, net |
| `discord-bot-architect` | Specialized skill for building production-ready Discord bots. Covers Discord.js (JavaScript) and Pycord (Python), gateway intents, slash commands, interactiv... | discord, bot | discord, bot, architect, specialized, skill, building, bots, covers, js, javascript, pycord, python |
| `dotnet-architect` | Expert .NET backend architect specializing in C#, ASP.NET Core, Entity Framework, Dapper, and enterprise application patterns. Masters async/await, dependenc... | dotnet | dotnet, architect, net, backend, specializing, asp, core, entity, framework, dapper, enterprise, application |
| `dotnet-backend-patterns` | Master C#/.NET backend development patterns for building robust APIs, MCP servers, and enterprise applications. Covers async/await, dependency injection, Ent... | dotnet, backend | dotnet, backend, net, development, building, robust, apis, mcp, servers, enterprise, applications, covers |
| `exa-search` | Semantic search, similar content discovery, and structured research using the Exa API | exa, search | exa, search, semantic, similar, content, discovery, structured, research, api |
| `fastapi-pro` | Build high-performance async APIs with FastAPI, SQLAlchemy 2.0, and Pydantic V2. Master microservices, WebSockets, and modern Python async patterns. Use PROA... | fastapi | fastapi, pro, high, performance, async, apis, sqlalchemy, pydantic, v2, microservices, websockets, python |
| `fastapi-router-py` | Create FastAPI routers with CRUD operations, authentication dependencies, and proper response models. Use when building REST API endpoints, creating new rout... | fastapi, router, py | fastapi, router, py, routers, crud, operations, authentication, dependencies, proper, response, models, building |
| `fastapi-templates` | Create production-ready FastAPI projects with async patterns, dependency injection, and comprehensive error handling. Use when building new FastAPI applicati... | fastapi | fastapi, async, dependency, injection, error, handling, building, new, applications, setting, up, backend |
| `firecrawl-scraper` | Deep web scraping, screenshots, PDF parsing, and website crawling using the Firecrawl API | firecrawl, scraper | firecrawl, scraper, deep, web, scraping, screenshots, pdf, parsing, website, crawling, api |
| `fp-ts-errors` | Handle errors as values using fp-ts Either and TaskEither for cleaner, more predictable TypeScript code. Use when implementing error handling patterns with f... | fp, ts, errors | fp, ts, errors, handle, values, either, taskeither, cleaner, predictable, typescript, code, implementing |
@@ -253,7 +389,9 @@ Total skills: 713
| `frontend-mobile-development-component-scaffold` | You are a React component architecture expert specializing in scaffolding production-ready, accessible, and performant components. Generate complete componen... | frontend, mobile, component | frontend, mobile, component, development, scaffold, react, architecture, specializing, scaffolding, accessible, performant, components |
| `frontend-slides` | Create stunning, animation-rich HTML presentations from scratch or by converting PowerPoint files. Use when the user wants to build a presentation, convert a... | frontend, slides | frontend, slides, stunning, animation, rich, html, presentations, scratch, converting, powerpoint, files, user |
| `game-development/mobile-games` | Mobile game development principles. Touch input, battery, performance, app stores. | game, development/mobile, games | game, development/mobile, games, mobile, development, principles, touch, input, battery, performance, app, stores |
| `gemini-api-dev` | Use this skill when building applications with Gemini models, Gemini API, working with multimodal content (text, images, audio, video), implementing function... | gemini, api, dev | gemini, api, dev, skill, building, applications, models, working, multimodal, content, text, images |
| `go-concurrency-patterns` | Master Go concurrency with goroutines, channels, sync primitives, and context. Use when building concurrent Go applications, implementing worker pools, or de... | go, concurrency | go, concurrency, goroutines, channels, sync, primitives, context, building, concurrent, applications, implementing, worker |
| `go-playwright` | Expert capability for robust, stealthy, and efficient browser automation using Playwright Go. | go, playwright | go, playwright, capability, robust, stealthy, efficient, browser, automation |
| `golang-pro` | Master Go 1.21+ with modern patterns, advanced concurrency, performance optimization, and production-ready microservices. Expert in the latest Go ecosystem i... | golang | golang, pro, go, 21, concurrency, performance, optimization, microservices, latest, ecosystem, including, generics |
| `hubspot-integration` | Expert patterns for HubSpot CRM integration including OAuth authentication, CRM objects, associations, batch operations, webhooks, and custom objects. Covers... | hubspot, integration | hubspot, integration, crm, including, oauth, authentication, objects, associations, batch, operations, webhooks, custom |
| `javascript-mastery` | Comprehensive JavaScript reference covering 33+ essential concepts every developer should know. From fundamentals like primitives and closures to advanced pa... | javascript, mastery | javascript, mastery, reference, covering, 33, essential, concepts, every, developer, should, know, fundamentals |
@@ -261,9 +399,12 @@ Total skills: 713
| `javascript-testing-patterns` | Implement comprehensive testing strategies using Jest, Vitest, and Testing Library for unit tests, integration tests, and end-to-end testing with mocking, fi... | javascript | javascript, testing, jest, vitest, library, unit, tests, integration, mocking, fixtures, test, driven |
| `javascript-typescript-typescript-scaffold` | You are a TypeScript project architecture expert specializing in scaffolding production-ready Node.js and frontend applications. Generate complete project st... | javascript, typescript | javascript, typescript, scaffold, architecture, specializing, scaffolding, node, js, frontend, applications, generate, complete |
| `launch-strategy` | When the user wants to plan a product launch, feature announcement, or release strategy. Also use when the user mentions 'launch,' 'Product Hunt,' 'feature r... | launch | launch, user, wants, plan, product, feature, announcement, release, mentions, hunt, go, market |
| `m365-agents-ts` | Microsoft 365 Agents SDK for TypeScript/Node.js. Build multichannel agents for Teams/M365/Copilot Studio with AgentApplication routing, Express hosting, stre... | m365, agents, ts | m365, agents, ts, microsoft, 365, sdk, typescript, node, js, multichannel, teams, copilot |
| `makepad-skills` | Makepad UI development skills for Rust apps: setup, patterns, shaders, packaging, and troubleshooting. | makepad, skills | makepad, skills, ui, development, rust, apps, setup, shaders, packaging, troubleshooting |
| `mcp-builder` | Guide for creating high-quality MCP (Model Context Protocol) servers that enable LLMs to interact with external services through well-designed tools. Use whe... | mcp, builder | mcp, builder, creating, high, quality, model, context, protocol, servers, enable, llms, interact |
| `mcp-builder-ms` | Guide for creating high-quality MCP (Model Context Protocol) servers that enable LLMs to interact with external services through well-designed tools. Use whe... | mcp, builder, ms | mcp, builder, ms, creating, high, quality, model, context, protocol, servers, enable, llms |
| `memory-safety-patterns` | Implement memory-safe programming with RAII, ownership, smart pointers, and resource management across Rust, C++, and C. Use when writing safe systems code, ... | memory, safety | memory, safety, safe, programming, raii, ownership, smart, pointers, resource, rust, writing, code |
| `microsoft-azure-webjobs-extensions-authentication-events-dotnet` | Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions. Use for token enrichment, custom claims, a... | microsoft, azure, webjobs, extensions, authentication, events, dotnet | microsoft, azure, webjobs, extensions, authentication, events, dotnet, entra, sdk, net, functions, triggers |
| `mobile-design` | Mobile-first design and engineering doctrine for iOS and Android apps. Covers touch interaction, performance, platform conventions, offline behavior, and mob... | mobile | mobile, first, engineering, doctrine, ios, android, apps, covers, touch, interaction, performance, platform |
| `mobile-developer` | Develop React Native, Flutter, or native mobile apps with modern architecture patterns. Masters cross-platform development, native integrations, offline sync... | mobile | mobile, developer, develop, react, native, flutter, apps, architecture, masters, cross, platform, development |
| `modern-javascript-patterns` | Master ES6+ features including async/await, destructuring, spread operators, arrow functions, promises, modules, iterators, generators, and functional progra... | modern, javascript | modern, javascript, es6, features, including, async, await, destructuring, spread, operators, arrow, functions |
@@ -278,6 +419,7 @@ Total skills: 713
| `python-performance-optimization` | Profile and optimize Python code using cProfile, memory profilers, and performance best practices. Use when debugging slow Python code, optimizing bottleneck... | python, performance, optimization | python, performance, optimization, profile, optimize, code, cprofile, memory, profilers, debugging, slow, optimizing |
| `python-pro` | Master Python 3.12+ with modern features, async programming, performance optimization, and production-ready practices. Expert in the latest Python ecosystem ... | python | python, pro, 12, features, async, programming, performance, optimization, latest, ecosystem, including, uv |
| `python-testing-patterns` | Implement comprehensive testing strategies with pytest, fixtures, mocking, and test-driven development. Use when writing Python tests, setting up test suites... | python | python, testing, pytest, fixtures, mocking, test, driven, development, writing, tests, setting, up |
| `react-flow-node-ts` | Create React Flow node components with TypeScript types, handles, and Zustand integration. Use when building custom nodes for React Flow canvas, creating vis... | react, flow, node, ts | react, flow, node, ts, components, typescript, types, zustand, integration, building, custom, nodes |
| `react-modernization` | Upgrade React applications to latest versions, migrate from class components to hooks, and adopt concurrent features. Use when modernizing React codebases, m... | react, modernization | react, modernization, upgrade, applications, latest, versions, migrate, class, components, hooks, adopt, concurrent |
| `react-native-architecture` | Build production React Native apps with Expo, navigation, native modules, offline sync, and cross-platform patterns. Use when developing mobile apps, impleme... | react, native, architecture | react, native, architecture, apps, expo, navigation, modules, offline, sync, cross, platform, developing |
| `react-patterns` | Modern React patterns and principles. Hooks, composition, performance, TypeScript best practices. | react | react, principles, hooks, composition, performance, typescript |
@@ -306,8 +448,9 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `uv-package-manager` | Master the uv package manager for fast Python dependency management, virtual environments, and modern Python project workflows. Use when setting up Python pr... | uv, package, manager | uv, package, manager, fast, python, dependency, virtual, environments, setting, up, managing, dependencies |
| `viral-generator-builder` | Expert in building shareable generator tools that go viral - name generators, quiz makers, avatar creators, personality tests, and calculator tools. Covers t... | viral, generator, builder | viral, generator, builder, building, shareable, go, name, generators, quiz, makers, avatar, creators |
| `webapp-testing` | Toolkit for interacting with and testing local web applications using Playwright. Supports verifying frontend functionality, debugging UI behavior, capturing... | webapp | webapp, testing, toolkit, interacting, local, web, applications, playwright, supports, verifying, frontend, functionality |
| `zustand-store-ts` | Create Zustand stores with TypeScript, subscribeWithSelector middleware, and proper state/action separation. Use when building React state management, creati... | zustand, store, ts | zustand, store, ts, stores, typescript, subscribewithselector, middleware, proper, state, action, separation, building |

## general (131)
## general (135)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -385,6 +528,7 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `git-advanced-workflows` | Master advanced Git workflows including rebasing, cherry-picking, bisect, worktrees, and reflog to maintain clean history and recover from any situation. Use... | git, advanced | git, advanced, including, rebasing, cherry, picking, bisect, worktrees, reflog, maintain, clean, history |
| `git-pr-workflows-onboard` | You are an **expert onboarding specialist and knowledge transfer architect** with deep experience in remote-first organizations, technical team integration, ... | git, pr, onboard | git, pr, onboard, onboarding, knowledge, transfer, architect, deep, experience, remote, first, organizations |
| `git-pr-workflows-pr-enhance` | You are a PR optimization expert specializing in creating high-quality pull requests that facilitate efficient code reviews. Generate comprehensive PR descri... | git, pr, enhance | git, pr, enhance, optimization, specializing, creating, high, quality, pull, requests, facilitate, efficient |
| `github-issue-creator` | Convert raw notes, error logs, voice dictation, or screenshots into crisp GitHub-flavored markdown issue reports. Use when the user pastes bug info, error me... | github, issue, creator | github, issue, creator, convert, raw, notes, error, logs, voice, dictation, screenshots, crisp |
| `imagen` | | imagen | imagen |
| `infinite-gratitude` | Multi-agent research skill for parallel research execution (10 agents, battle-tested with real case studies). | infinite, gratitude | infinite, gratitude, multi, agent, research, skill, parallel, execution, 10, agents, battle, tested |
| `interactive-portfolio` | Expert in building portfolios that actually land jobs and clients - not just showing work, but creating memorable experiences. Covers developer portfolios, d... | interactive, portfolio | interactive, portfolio, building, portfolios, actually, land, jobs, clients, just, showing, work, creating |
@@ -426,7 +570,7 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `subagent-driven-development` | Use when executing implementation plans with independent tasks in the current session | subagent, driven | subagent, driven, development, executing, plans, independent, tasks, current, session |
| `superpowers-lab` | Lab environment for Claude superpowers | superpowers, lab | superpowers, lab, environment, claude |
| `theme-factory` | Toolkit for styling artifacts with a theme. These artifacts can be slides, docs, reportings, HTML landing pages, etc. There are 10 pre-set themes with colors... | theme, factory | theme, factory, toolkit, styling, artifacts, these, slides, docs, reportings, html, landing, pages |
| `threejs-skills` | Three.js skills for creating 3D elements and interactive experiences | threejs, skills | threejs, skills, three, js, creating, 3d, elements, interactive, experiences |
| `threejs-skills` | Create 3D scenes, interactive experiences, and visual effects using Three.js. Use when user requests 3D graphics, WebGL experiences, 3D visualizations, anima... | threejs, skills | threejs, skills, 3d, scenes, interactive, experiences, visual, effects, three, js, user, requests |
| `turborepo-caching` | Configure Turborepo for efficient monorepo builds with local and remote caching. Use when setting up Turborepo, optimizing build pipelines, or implementing d... | turborepo, caching | turborepo, caching, configure, efficient, monorepo, local, remote, setting, up, optimizing, pipelines, implementing |
| `tutorial-engineer` | Creates step-by-step tutorials and educational content from code. Transforms complex concepts into progressive learning experiences with hands-on examples. U... | tutorial | tutorial, engineer, creates, step, tutorials, educational, content, code, transforms, complex, concepts, progressive |
| `ui-skills` | Opinionated, evolving constraints to guide agents when building interfaces | ui, skills | ui, skills, opinionated, evolving, constraints, agents, building, interfaces |
@@ -437,13 +581,16 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `using-superpowers` | Use when starting any conversation - establishes how to find and use skills, requiring Skill tool invocation before ANY response including clarifying questions | using, superpowers | using, superpowers, starting, any, conversation, establishes, how, find, skills, requiring, skill, invocation |
| `verification-before-completion` | Use when about to claim work is complete, fixed, or passing, before committing or creating PRs - requires running verification commands and confirming output... | verification, before, completion | verification, before, completion, about, claim, work, complete, fixed, passing, committing, creating, prs |
| `web-performance-optimization` | Optimize website and web application performance including loading speed, Core Web Vitals, bundle size, caching strategies, and runtime performance | web, performance, optimization | web, performance, optimization, optimize, website, application, including, loading, speed, core, vitals, bundle |
| `wiki-changelog` | Analyzes git commit history and generates structured changelogs categorized by change type. Use when the user asks about recent changes, wants a changelog, o... | wiki, changelog | wiki, changelog, analyzes, git, commit, history, generates, structured, changelogs, categorized, change, type |
| `wiki-page-writer` | Generates rich technical documentation pages with dark-mode Mermaid diagrams, source code citations, and first-principles depth. Use when writing documentati... | wiki, page, writer | wiki, page, writer, generates, rich, technical, documentation, pages, dark, mode, mermaid, diagrams |
| `wiki-vitepress` | Packages generated wiki Markdown into a VitePress static site with dark theme, dark-mode Mermaid diagrams with click-to-zoom, and production build output. Us... | wiki, vitepress | wiki, vitepress, packages, generated, markdown, static, site, dark, theme, mode, mermaid, diagrams |
| `windows-privilege-escalation` | This skill should be used when the user asks to "escalate privileges on Windows," "find Windows privesc vectors," "enumerate Windows for privilege escalation... | windows, privilege, escalation | windows, privilege, escalation, skill, should, used, user, asks, escalate, privileges, find, privesc |
| `writing-plans` | Use when you have a spec or requirements for a multi-step task, before touching code | writing, plans | writing, plans, spec, requirements, multi, step, task, before, touching, code |
| `writing-skills` | Use when creating, updating, or improving agent skills. | writing, skills | writing, skills, creating, updating, improving, agent |
| `x-article-publisher-skill` | Publish articles to X/Twitter | x, article, publisher, skill | x, article, publisher, skill, publish, articles, twitter |
| `youtube-summarizer` | Extract transcripts from YouTube videos and generate comprehensive, detailed summaries using intelligent analysis frameworks | video, summarization, transcription, youtube, content-analysis | video, summarization, transcription, youtube, content-analysis, summarizer, extract, transcripts, videos, generate, detailed, summaries |

## infrastructure (83)
## infrastructure (102)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -453,6 +600,32 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `application-performance-performance-optimization` | Optimize end-to-end application performance with profiling, observability, and backend/frontend tuning. Use when coordinating performance optimization across... | application, performance, optimization | application, performance, optimization, optimize, profiling, observability, backend, frontend, tuning, coordinating, stack |
| `aws-serverless` | Specialized skill for building production-ready serverless applications on AWS. Covers Lambda functions, API Gateway, DynamoDB, SQS/SNS event-driven patterns... | aws, serverless | aws, serverless, specialized, skill, building, applications, covers, lambda, functions, api, gateway, dynamodb |
| `aws-skills` | AWS development with infrastructure automation and cloud architecture patterns | aws, skills | aws, skills, development, infrastructure, automation, cloud, architecture |
| `azd-deployment` | Deploy containerized applications to Azure Container Apps using Azure Developer CLI (azd). Use when setting up azd projects, writing azure.yaml configuration... | azd, deployment | azd, deployment, deploy, containerized, applications, azure, container, apps, developer, cli, setting, up |
| `azure-ai-anomalydetector-java` | Build anomaly detection applications with Azure AI Anomaly Detector SDK for Java. Use when implementing univariate/multivariate anomaly detection, time-serie... | azure, ai, anomalydetector, java | azure, ai, anomalydetector, java, anomaly, detection, applications, detector, sdk, implementing, univariate, multivariate |
| `azure-identity-java` | Azure Identity Java SDK for authentication with Azure services. Use when implementing DefaultAzureCredential, managed identity, service principal, or any Azu... | azure, identity, java | azure, identity, java, sdk, authentication, implementing, defaultazurecredential, managed, principal, any, applications |
| `azure-identity-py` | Azure Identity SDK for Python authentication. Use for DefaultAzureCredential, managed identity, service principals, and token caching. Triggers: "azure-ident... | azure, identity, py | azure, identity, py, sdk, python, authentication, defaultazurecredential, managed, principals, token, caching, triggers |
| `azure-identity-ts` | Authenticate to Azure services using Azure Identity SDK for JavaScript (@azure/identity). Use when configuring authentication with DefaultAzureCredential, ma... | azure, identity, ts | azure, identity, ts, authenticate, sdk, javascript, configuring, authentication, defaultazurecredential, managed, principals, interactive |
| `azure-messaging-webpubsubservice-py` | Azure Web PubSub Service SDK for Python. Use for real-time messaging, WebSocket connections, and pub/sub patterns. Triggers: "azure-messaging-webpubsubservic... | azure, messaging, webpubsubservice, py | azure, messaging, webpubsubservice, py, web, pubsub, sdk, python, real, time, websocket, connections |
| `azure-mgmt-apimanagement-dotnet` | Azure Resource Manager SDK for API Management in .NET. Use for MANAGEMENT PLANE operations: creating/managing APIM services, APIs, products, subscriptions, p... | azure, mgmt, apimanagement, dotnet | azure, mgmt, apimanagement, dotnet, resource, manager, sdk, api, net, plane, operations, creating |
| `azure-mgmt-applicationinsights-dotnet` | Azure Application Insights SDK for .NET. Application performance monitoring and observability resource management. Use for creating Application Insights comp... | azure, mgmt, applicationinsights, dotnet | azure, mgmt, applicationinsights, dotnet, application, insights, sdk, net, performance, monitoring, observability, resource |
| `azure-mgmt-arizeaiobservabilityeval-dotnet` | Azure Resource Manager SDK for Arize AI Observability and Evaluation (.NET). Use when managing Arize AI organizations on Azure via Azure Marketplace, creati... | azure, mgmt, arizeaiobservabilityeval, dotnet | azure, mgmt, arizeaiobservabilityeval, dotnet, resource, manager, sdk, arize, ai, observability, evaluation, net |
| `azure-mgmt-botservice-dotnet` | Azure Resource Manager SDK for Bot Service in .NET. Management plane operations for creating and managing Azure Bot resources, channels (Teams, DirectLine, S... | azure, mgmt, botservice, dotnet | azure, mgmt, botservice, dotnet, resource, manager, sdk, bot, net, plane, operations, creating |
| `azure-mgmt-botservice-py` | Azure Bot Service Management SDK for Python. Use for creating, managing, and configuring Azure Bot Service resources. Triggers: "azure-mgmt-botservice", "Azu... | azure, mgmt, botservice, py | azure, mgmt, botservice, py, bot, sdk, python, creating, managing, configuring, resources, triggers |
| `azure-mgmt-weightsandbiases-dotnet` | Azure Weights & Biases SDK for .NET. ML experiment tracking and model management via Azure Marketplace. Use for creating W&B instances, managing SSO, marketp... | azure, mgmt, weightsandbiases, dotnet | azure, mgmt, weightsandbiases, dotnet, weights, biases, sdk, net, ml, experiment, tracking, model |
| `azure-microsoft-playwright-testing-ts` | Run Playwright tests at scale using Azure Playwright Workspaces (formerly Microsoft Playwright Testing). Use when scaling browser tests across cloud-hosted b... | azure, microsoft, playwright, ts | azure, microsoft, playwright, ts, testing, run, tests, scale, workspaces, formerly, scaling, browser |
| `azure-monitor-opentelemetry-exporter-java` | Azure Monitor OpenTelemetry Exporter for Java. Export OpenTelemetry traces, metrics, and logs to Azure Monitor/Application Insights. Triggers: "AzureMonitorE... | azure, monitor, opentelemetry, exporter, java | azure, monitor, opentelemetry, exporter, java, export, traces, metrics, logs, application, insights, triggers |
| `azure-monitor-opentelemetry-ts` | Instrument applications with Azure Monitor and OpenTelemetry for JavaScript (@azure/monitor-opentelemetry). Use when adding distributed tracing, metrics, and... | azure, monitor, opentelemetry, ts | azure, monitor, opentelemetry, ts, instrument, applications, javascript, adding, distributed, tracing, metrics, logs |
| `azure-servicebus-dotnet` | Azure Service Bus SDK for .NET. Enterprise messaging with queues, topics, subscriptions, and sessions. Use for reliable message delivery, pub/sub patterns, d... | azure, servicebus, dotnet | azure, servicebus, dotnet, bus, sdk, net, enterprise, messaging, queues, topics, subscriptions, sessions |
| `azure-servicebus-py` | Azure Service Bus SDK for Python messaging. Use for queues, topics, subscriptions, and enterprise messaging patterns. Triggers: "service bus", "ServiceBusCli... | azure, servicebus, py | azure, servicebus, py, bus, sdk, python, messaging, queues, topics, subscriptions, enterprise, triggers |
| `azure-servicebus-ts` | Build messaging applications using Azure Service Bus SDK for JavaScript (@azure/service-bus). Use when implementing queues, topics/subscriptions, message ses... | azure, servicebus, ts | azure, servicebus, ts, messaging, applications, bus, sdk, javascript, implementing, queues, topics, subscriptions |
| `azure-storage-file-share-py` | Azure Storage File Share SDK for Python. Use for SMB file shares, directories, and file operations in the cloud. Triggers: "azure-storage-file-share", "Share... | azure, storage, file, share, py | azure, storage, file, share, py, sdk, python, smb, shares, directories, operations, cloud |
| `backend-architect` | Expert backend architect specializing in scalable API design, microservices architecture, and distributed systems. Masters REST/GraphQL/gRPC APIs, event-driv... | backend | backend, architect, specializing, scalable, api, microservices, architecture, distributed, masters, rest, graphql, grpc |
| `backend-development-feature-development` | Orchestrate end-to-end backend feature development from requirements to deployment. Use when coordinating multi-phase feature delivery across teams and servi... | backend | backend, development, feature, orchestrate, requirements, deployment, coordinating, multi, phase, delivery, teams |
| `bash-defensive-patterns` | Master defensive Bash programming techniques for production-grade scripts. Use when writing robust shell scripts, CI/CD pipelines, or system utilities requir... | bash, defensive | bash, defensive, programming, techniques, grade, scripts, writing, robust, shell, ci, cd, pipelines |
@@ -531,7 +704,7 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `wireshark-analysis` | This skill should be used when the user asks to "analyze network traffic with Wireshark", "capture packets for troubleshooting", "filter PCAP files", "follow... | wireshark | wireshark, network, traffic, analysis, skill, should, used, user, asks, analyze, capture, packets |
| `workflow-automation` | Workflow automation is the infrastructure that makes AI agents reliable. Without durable execution, a network hiccup during a 10-step payment flow means lost... | | automation, infrastructure, makes, ai, agents, reliable, without, durable, execution, network, hiccup, during |

## security (113)
## security (126)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -539,11 +712,22 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `active-directory-attacks` | This skill should be used when the user asks to "attack Active Directory", "exploit AD", "Kerberoasting", "DCSync", "pass-the-hash", "BloodHound enumeration"... | active, directory, attacks | active, directory, attacks, skill, should, used, user, asks, attack, exploit, ad, kerberoasting |
| `agent-memory-systems` | Memory is the cornerstone of intelligent agents. Without it, every interaction starts from zero. This skill covers the architecture of agent memory: short-te... | agent, memory | agent, memory, cornerstone, intelligent, agents, without, every, interaction, starts, zero, skill, covers |
| `ai-product` | Every product will be AI-powered. The question is whether you'll build it right or ship a demo that falls apart in production. This skill covers LLM integra... | ai, product | ai, product, every, powered, question, whether, ll, right, ship, demo, falls, apart |
| `antigravity-workflows` | Orchestrate multiple Antigravity skills through guided workflows for SaaS MVP delivery, security audits, AI agent builds, and browser QA. | antigravity | antigravity, orchestrate, multiple, skills, through, guided, saas, mvp, delivery, security, audits, ai |
| `api-fuzzing-bug-bounty` | This skill should be used when the user asks to "test API security", "fuzz APIs", "find IDOR vulnerabilities", "test REST API", "test GraphQL", "API penetrat... | api, fuzzing, bug, bounty | api, fuzzing, bug, bounty, skill, should, used, user, asks, test, security, fuzz |
| `api-security-best-practices` | Implement secure API design patterns including authentication, authorization, input validation, rate limiting, and protection against common API vulnerabilities | api, security, best, practices | api, security, best, practices, secure, including, authentication, authorization, input, validation, rate, limiting |
| `attack-tree-construction` | Build comprehensive attack trees to visualize threat paths. Use when mapping attack scenarios, identifying defense gaps, or communicating security risks to s... | attack, tree, construction | attack, tree, construction, trees, visualize, threat, paths, mapping, scenarios, identifying, defense, gaps |
| `auth-implementation-patterns` | Master authentication and authorization patterns including JWT, OAuth2, session management, and RBAC to build secure, scalable access control systems. Use wh... | auth | auth, authentication, authorization, including, jwt, oauth2, session, rbac, secure, scalable, access, control |
| `aws-penetration-testing` | This skill should be used when the user asks to "pentest AWS", "test AWS security", "enumerate IAM", "exploit cloud infrastructure", "AWS privilege escalatio... | aws, penetration | aws, penetration, testing, skill, should, used, user, asks, pentest, test, security, enumerate |
| `azure-cosmos-db-py` | Build Azure Cosmos DB NoSQL services with Python/FastAPI following production-grade patterns. Use when implementing database client setup with dual auth (Def... | azure, cosmos, db, py | azure, cosmos, db, py, nosql, python, fastapi, following, grade, implementing, database, client |
| `azure-identity-dotnet` | Azure Identity SDK for .NET. Authentication library for Azure SDK clients using Microsoft Entra ID. Use for DefaultAzureCredential, managed identity, service... | azure, identity, dotnet | azure, identity, dotnet, sdk, net, authentication, library, clients, microsoft, entra, id, defaultazurecredential |
| `azure-keyvault-py` | Azure Key Vault SDK for Python. Use for secrets, keys, and certificates management with secure storage. Triggers: "key vault", "SecretClient", "KeyClient", "... | azure, keyvault, py | azure, keyvault, py, key, vault, sdk, python, secrets, keys, certificates, secure, storage |
| `azure-keyvault-secrets-rust` | Azure Key Vault Secrets SDK for Rust. Use for storing and retrieving secrets, passwords, and API keys. Triggers: "keyvault secrets rust", "SecretClient rust"... | azure, keyvault, secrets, rust | azure, keyvault, secrets, rust, key, vault, sdk, storing, retrieving, passwords, api, keys |
| `azure-keyvault-secrets-ts` | Manage secrets using Azure Key Vault Secrets SDK for JavaScript (@azure/keyvault-secrets). Use when storing and retrieving application secrets or configurati... | azure, keyvault, secrets, ts | azure, keyvault, secrets, ts, key, vault, sdk, javascript, storing, retrieving, application, configuration |
| `azure-security-keyvault-keys-dotnet` | Azure Key Vault Keys SDK for .NET. Client library for managing cryptographic keys in Azure Key Vault and Managed HSM. Use for key creation, rotation, encrypt... | azure, security, keyvault, keys, dotnet | azure, security, keyvault, keys, dotnet, key, vault, sdk, net, client, library, managing |
| `azure-security-keyvault-keys-java` | Azure Key Vault Keys Java SDK for cryptographic key management. Use when creating, managing, or using RSA/EC keys, performing encrypt/decrypt/sign/verify ope... | azure, security, keyvault, keys, java | azure, security, keyvault, keys, java, key, vault, sdk, cryptographic, creating, managing, rsa |
| `azure-security-keyvault-secrets-java` | Azure Key Vault Secrets Java SDK for secret management. Use when storing, retrieving, or managing passwords, API keys, connection strings, or other sensitive... | azure, security, keyvault, secrets, java | azure, security, keyvault, secrets, java, key, vault, sdk, secret, storing, retrieving, managing |
| `backend-security-coder` | Expert in secure backend coding practices specializing in input validation, authentication, and API security. Use PROACTIVELY for backend security implementa... | backend, security, coder | backend, security, coder, secure, coding, specializing, input, validation, authentication, api, proactively, implementations |
| `broken-authentication` | This skill should be used when the user asks to "test for broken authentication vulnerabilities", "assess session management security", "perform credential s... | broken, authentication | broken, authentication, testing, skill, should, used, user, asks, test, vulnerabilities, assess, session |
| `burp-suite-testing` | This skill should be used when the user asks to "intercept HTTP traffic", "modify web requests", "use Burp Suite for testing", "perform web vulnerability sca... | burp, suite | burp, suite, web, application, testing, skill, should, used, user, asks, intercept, http |
@@ -593,6 +777,8 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `legal-advisor` | Draft privacy policies, terms of service, disclaimers, and legal notices. Creates GDPR-compliant texts, cookie policies, and data processing agreements. Use ... | legal, advisor | legal, advisor, draft, privacy, policies, terms, disclaimers, notices, creates, gdpr, compliant, texts |
| `linkerd-patterns` | Implement Linkerd service mesh patterns for lightweight, security-focused service mesh deployments. Use when setting up Linkerd, configuring traffic policies... | linkerd | linkerd, mesh, lightweight, security, deployments, setting, up, configuring, traffic, policies, implementing, zero |
| `loki-mode` | Multi-agent autonomous startup system for Claude Code. Triggers on "Loki Mode". Orchestrates 100+ specialized agents across engineering, QA, DevOps, security... | loki, mode | loki, mode, multi, agent, autonomous, startup, claude, code, triggers, orchestrates, 100, specialized |
| `m365-agents-dotnet` | Microsoft 365 Agents SDK for .NET. Build multichannel agents for Teams/M365/Copilot Studio with ASP.NET Core hosting, AgentApplication routing, and MSAL-base... | m365, agents, dotnet | m365, agents, dotnet, microsoft, 365, sdk, net, multichannel, teams, copilot, studio, asp |
| `m365-agents-py` | Microsoft 365 Agents SDK for Python. Build multichannel agents for Teams/M365/Copilot Studio with aiohttp hosting, AgentApplication routing, streaming respon... | m365, agents, py | m365, agents, py, microsoft, 365, sdk, python, multichannel, teams, copilot, studio, aiohttp |
| `malware-analyst` | Expert malware analyst specializing in defensive malware research, threat intelligence, and incident response. Masters sandbox analysis, behavioral analysis,... | malware, analyst | malware, analyst, specializing, defensive, research, threat, intelligence, incident, response, masters, sandbox, analysis |
| `memory-forensics` | Master memory forensics techniques including memory acquisition, process analysis, and artifact extraction using Volatility and related tools. Use when analy... | memory, forensics | memory, forensics, techniques, including, acquisition, process, analysis, artifact, extraction, volatility, related, analyzing |
| `metasploit-framework` | This skill should be used when the user asks to "use Metasploit for penetration testing", "exploit vulnerabilities with msfconsole", "create payloads with ms... | metasploit, framework | metasploit, framework, skill, should, used, user, asks, penetration, testing, exploit, vulnerabilities, msfconsole |
@@ -646,10 +832,12 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
| `varlock-claude-skill` | Secure environment variable management ensuring secrets are never exposed in Claude sessions, terminals, logs, or git commits | varlock, claude, skill | varlock, claude, skill, secure, environment, variable, ensuring, secrets, never, exposed, sessions, terminals |
| `vulnerability-scanner` | Advanced vulnerability analysis principles. OWASP 2025, Supply Chain Security, attack surface mapping, risk prioritization. | vulnerability, scanner | vulnerability, scanner, analysis, principles, owasp, 2025, supply, chain, security, attack, surface, mapping |
| `web-design-guidelines` | Review UI code for Web Interface Guidelines compliance. Use when asked to "review my UI", "check accessibility", "audit design", "review UX", or "check my si... | web, guidelines | web, guidelines, review, ui, code, interface, compliance, asked, my, check, accessibility, audit |
| `wiki-onboarding` | Generates two complementary onboarding guides — a Principal-Level architectural deep-dive and a Zero-to-Hero contributor walkthrough. Use when the user wants... | wiki, onboarding | wiki, onboarding, generates, two, complementary, guides, principal, level, architectural, deep, dive, zero |
| `wiki-researcher` | Conducts multi-turn iterative deep research on specific topics within a codebase with zero tolerance for shallow analysis. Use when the user wants an in-dept... | wiki, researcher | wiki, researcher, conducts, multi, turn, iterative, deep, research, specific, topics, within, codebase |
| `wordpress-penetration-testing` | This skill should be used when the user asks to "pentest WordPress sites", "scan WordPress for vulnerabilities", "enumerate WordPress users, themes, or plugi... | wordpress, penetration | wordpress, penetration, testing, skill, should, used, user, asks, pentest, sites, scan, vulnerabilities |
| `xss-html-injection` | This skill should be used when the user asks to "test for XSS vulnerabilities", "perform cross-site scripting attacks", "identify HTML injection flaws", "exp... | xss, html, injection | xss, html, injection, cross, site, scripting, testing, skill, should, used, user, asks |

## testing (23)
## testing (24)

| Skill | Description | Tags | Triggers |
| --- | --- | --- | --- |
@@ -676,6 +864,7 @@ TRIGGER: "shopify", "shopify app", "checkout extension",... | shopify | shopify,
|
||||
| `test-fixing` | Run tests and systematically fix all failing tests using smart error grouping. Use when user asks to fix failing tests, mentions test failures, runs test sui... | fixing | fixing, test, run, tests, systematically, fix, all, failing, smart, error, grouping, user |
|
||||
| `unit-testing-test-generate` | Generate comprehensive, maintainable unit tests across languages with strong coverage and edge case focus. | unit, generate | unit, generate, testing, test, maintainable, tests, languages, strong, coverage, edge, case |
|
||||
| `web3-testing` | Test smart contracts comprehensively using Hardhat and Foundry with unit tests, integration tests, and mainnet forking. Use when testing Solidity contracts, ... | web3 | web3, testing, test, smart, contracts, comprehensively, hardhat, foundry, unit, tests, integration, mainnet |
|
||||
| `wiki-qa` | Answers questions about a code repository using source file analysis. Use when the user asks a question about how something works, wants to understand a comp... | wiki, qa | wiki, qa, answers, questions, about, code, repository, source, file, analysis, user, asks |
|
||||
|
||||
## workflow (81)
|
||||
|
||||
|
||||
CHANGELOG.md
@@ -7,6 +7,130 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
---

## [5.3.0] - 2026-02-13 - "Advanced Three.js & Modern Graphics"

> **Enhanced Three.js patterns: performance, visual polish, and production practices.**

This release significantly upgrades our 3D visualization capabilities with a comprehensive Three.js skill upgrade, focusing on CDN-compatible patterns, performance optimizations, and modern graphics techniques like shadows, fog, and GSAP integration.

### Added

- **Modern Three.js Patterns**: Comprehensive guide for `r128` (CDN) and production environments.
- **Visual Polish**: Advanced sections for shadows, environment maps, and tone mapping.
- **Interaction Models**: Custom camera controls (OrbitControls alternative) and raycasting for object selection.
- **Production Readiness**: Integration patterns for GSAP, scroll-based animations, and build tool optimizations.

### Registry

- **Total Skills**: 856.
- **Metadata**: Fixed missing source and risk fields for `threejs-skills`.
- **Sync**: All discovery artifacts (README, Catalog, Index) updated and synced.

### Contributors

- **[@Krishna-hehe](https://github.com/Krishna-hehe)** - Advanced Three.js skill overhaul (PR #78).

---

## [5.2.0]

> **New AI capabilities: Podcast Generation, Azure Identity, and Self-Evolving Agents.**

### Added

- **New Skill**: `podcast-generation` - Create multi-speaker podcasts from text/URLs using OpenAI Text-to-Speech (TTS) and pydub.
- **New Skill**: `weevolve` - Self-evolving knowledge engine with recursive improvement protocol.
- **Azure Skills Expansion**:
  - `azure-ai-agents-persistent-dotnet`: Persistent agent patterns for .NET.
  - `azure-ai-agents-persistent-java`: Persistent agent patterns for Java.
  - `azd-deployment`: Azure Developer CLI deployment strategies.
- **Python Enhancements**:
  - `pydantic-models-py`: Robust data validation patterns.
  - `fastapi-router-py`: Scalable API routing structures.

### Registry

- **Total Skills**: 856 (from 845).
- **Generated Files**: Synced `skills_index.json`, `data/catalog.json`, and `README.md`.

### Contributors

- **[@sickn33](https://github.com/sickn33)** - Podcast Generation & Azure skills sync (PR #74).
- **[@aro-brez](https://github.com/aro-brez)** - WeEvolve skill (Issue #75).

---

## [5.1.0] - 2026-02-12 - "Official Microsoft & Gemini Skills"

> **845+ skills: the largest single-PR expansion ever, powered by official vendor collections.**

Integrates the full official Microsoft skills collection (129 skills) and Google Gemini API development skills, significantly expanding Azure SDK coverage across .NET, Python, TypeScript, Java, and Rust, plus M365 Agents, Semantic Kernel, and wiki plugin skills.

### Added

- **129 Microsoft Official Skills** from [microsoft/skills](https://github.com/microsoft/skills):
  - Azure SDKs across .NET, Python, TypeScript, Java, and Rust
  - M365 Agents, Semantic Kernel, and wiki plugin skills
  - Flat structure using YAML `name` field as directory name
  - Attribution files: `docs/LICENSE-MICROSOFT`, `docs/microsoft-skills-attribution.json`
- **Gemini API Skills**: Official Gemini API development skill under `skills/gemini-api-dev/`
- **New Scripts & Tooling**:
  - `scripts/sync_microsoft_skills.py` (v4): Flat-structure sync with collision detection, stale cleanup, and attribution metadata
  - `scripts/tests/inspect_microsoft_repo.py`: Remote repo inspection
  - `scripts/tests/test_comprehensive_coverage.py`: Coverage verification
- **New npm scripts**: `sync:microsoft` and `sync:all-official` in `package.json`

### Fixed

- **`scripts/generate_index.py`**: Enhanced frontmatter parsing for unquoted `@` symbols and commas
- **`scripts/build-catalog.js`**: Deterministic `generatedAt` timestamp (prevents CI drift)
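The deterministic `generatedAt` fix can be sketched as follows. The real implementation lives in `scripts/build-catalog.js` (Node), so this Python version is only an illustration of the idea: pin the timestamp to a fixed release date instead of the wall clock, so repeated builds emit byte-identical catalog files. The helper name and date format are assumptions based on the `2026-02-08T00:00:00.000Z` values visible in the JSON diffs further down.

```python
from datetime import datetime, timezone

def deterministic_generated_at(release_date: str) -> str:
    """Pin generatedAt to the release date at midnight UTC, so that
    rebuilding the catalog never produces a spurious one-line diff in CI."""
    dt = datetime.strptime(release_date, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.000Z")

print(deterministic_generated_at("2026-02-08"))  # 2026-02-08T00:00:00.000Z
```

The key design point is that the timestamp is a function of the release, not of the build machine.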
### Registry

- **Total Skills**: 845 (from 626). All generated files synced.

### Contributors

- [@ar27111994](https://github.com/ar27111994) - Microsoft & Gemini skills integration (PR #73)

---

## [5.0.0] - 2026-02-10 - "Antigravity Workflows Foundation"

> Workflows are now first-class: users can run guided, multi-skill playbooks instead of manually composing skills one by one.

### Added

- **New orchestration skill**: `antigravity-workflows`
  - `skills/antigravity-workflows/SKILL.md`
  - `skills/antigravity-workflows/resources/implementation-playbook.md`
- **New workflow documentation**: `docs/WORKFLOWS.md`
  - Introduces the Workflows model and differentiates it from Bundles.
  - Provides execution playbooks with prerequisites, ordered steps, and prompt examples.
- **New machine-readable workflow registry**: `data/workflows.json`
  - `ship-saas-mvp`
  - `security-audit-web-app`
  - `build-ai-agent-system`
  - `qa-browser-automation`

### Changed

- **README / Onboarding docs** updated to include Workflows discovery and usage:
  - `README.md` (TOC + "Antigravity Workflows" section)
  - `docs/GETTING_STARTED.md` (Bundles vs Workflows guidance)
  - `docs/FAQ.md` (new Q&A: Bundles vs Workflows)
- **Go browser automation alignment**:
  - Workflow playbooks now include optional `@go-playwright` hooks for Go-based QA/E2E flows.
- **Registry sync** after workflow skill addition:
  - `CATALOG.md`
  - `skills_index.json`
  - `data/catalog.json`
  - `data/bundles.json`

### Contributors

- [@sickn33](https://github.com/sickn33) - Workflows architecture, docs, and release integration

---

## [4.11.0] - 2026-02-08 - "Clean Code & Registry Stability"

> Quality improvements: Clean Code principles and deterministic builds.
README.md
@@ -1,6 +1,6 @@
# 🌌 Antigravity Awesome Skills: 713+ Agentic Skills for Claude Code, Gemini CLI, Cursor, Copilot & More
# 🌌 Antigravity Awesome Skills: 856+ Agentic Skills for Claude Code, Gemini CLI, Cursor, Copilot & More

> **The Ultimate Collection of 713+ Universal Agentic Skills for AI Coding Assistants — Claude Code, Gemini CLI, Codex CLI, Antigravity IDE, GitHub Copilot, Cursor, OpenCode, AdaL**
> **The Ultimate Collection of 856+ Universal Agentic Skills for AI Coding Assistants — Claude Code, Gemini CLI, Codex CLI, Antigravity IDE, GitHub Copilot, Cursor, OpenCode, AdaL**

[](https://opensource.org/licenses/MIT)
[](https://claude.ai)

@@ -16,7 +16,7 @@

If this project helps you, you can [support it here](https://buymeacoffee.com/sickn33) or simply ⭐ the repo.

**Antigravity Awesome Skills** is a curated, battle-tested library of **713 high-performance agentic skills** designed to work seamlessly across all major AI coding assistants:
**Antigravity Awesome Skills** is a curated, battle-tested library of **856 high-performance agentic skills** designed to work seamlessly across all major AI coding assistants:

- 🟣 **Claude Code** (Anthropic CLI)
- 🔵 **Gemini CLI** (Google DeepMind)

@@ -27,7 +27,7 @@ If this project helps you, you can [support it here](https://buymeacoffee.com/si

- ⚪ **OpenCode** (Open-source CLI)
- 🌸 **AdaL CLI** (Self-evolving Coding Agent)

This repository provides essential skills to transform your AI assistant into a **full-stack digital agency**, including official capabilities from **Anthropic**, **OpenAI**, **Google**, **Supabase**, and **Vercel Labs**.
This repository provides essential skills to transform your AI assistant into a **full-stack digital agency**, including official capabilities from **Anthropic**, **OpenAI**, **Google**, **Microsoft**, **Supabase**, and **Vercel Labs**.

## Table of Contents

@@ -36,8 +36,9 @@ This repository provides essential skills to transform your AI assistant into a

- [🛠️ Installation](#installation)
- [🧯 Troubleshooting](#troubleshooting)
- [🎁 Curated Collections (Bundles)](#curated-collections)
- [🧭 Antigravity Workflows](#antigravity-workflows)
- [📦 Features & Categories](#features--categories)
- [📚 Browse 713+ Skills](#browse-713-skills)
- [📚 Browse 856+ Skills](#browse-856-skills)
- [🤝 How to Contribute](#how-to-contribute)
- [🤝 Community](#community)
- [☕ Support the Project](#support-the-project)

@@ -51,11 +52,11 @@ This repository provides essential skills to transform your AI assistant into a

## New Here? Start Here!

**Welcome to the V4.0.0 Enterprise Edition.** This isn't just a list of scripts; it's a complete operating system for your AI Agent.
**Welcome to the V5.2.0 Workflows Edition.** This isn't just a list of scripts; it's a complete operating system for your AI Agent.

### 1. 🐣 Context: What is this?

**Antigravity Awesome Skills** (Release 4.0.0) is a massive upgrade to your AI's capabilities.
**Antigravity Awesome Skills** (Release 5.2.0) is a massive upgrade to your AI's capabilities.

AI Agents (like Claude Code, Cursor, or Gemini) are smart, but they lack **specific tools**. They don't know your company's "Deployment Protocol" or the specific syntax for "AWS CloudFormation".
**Skills** are small markdown files that teach them how to do these specific tasks perfectly, every time.

@@ -220,24 +221,47 @@ npx antigravity-awesome-skills

They help you avoid picking from 700+ skills one by one.

What bundles are:

- Recommended starting sets for common workflows.
- A shortcut for onboarding and faster execution.

What bundles are not:

- Not a separate install.
- Not a locked preset.

How to use bundles:

1. Install the repository once.
2. Pick one bundle in [docs/BUNDLES.md](docs/BUNDLES.md).
3. Start with 3-5 skills from that bundle in your prompt.
4. Add more only when needed.

Examples:

- Building a SaaS MVP: `Essentials` + `Full-Stack Developer` + `QA & Testing`.
- Hardening production: `Security Developer` + `DevOps & Cloud` + `Observability & Monitoring`.
- Shipping OSS changes: `Essentials` + `OSS Maintainer`.

## Antigravity Workflows

Bundles help you choose skills. Workflows help you execute them in order.

- Use bundles when you need curated recommendations by role.
- Use workflows when you need step-by-step execution for a concrete goal.

Start here:

- [docs/WORKFLOWS.md](docs/WORKFLOWS.md): human-readable playbooks.
- [data/workflows.json](data/workflows.json): machine-readable workflow metadata.

Initial workflows include:

- Ship a SaaS MVP
- Security Audit for a Web App
- Build an AI Agent System
- QA and Browser Automation (with optional `@go-playwright` support for Go stacks)

## Features & Categories

The repository is organized into specialized domains to transform your AI into an expert across the entire software development lifecycle:

@@ -256,7 +280,7 @@ The repository is organized into specialized domains to transform your AI into a

Counts change as new skills are added. For the current full registry, see [CATALOG.md](CATALOG.md).

## Browse 713+ Skills
## Browse 856+ Skills

We have moved the full skill registry to a dedicated catalog to keep this README clean.

@@ -290,14 +314,17 @@ Please ensure your skill follows the Antigravity/Claude Code best practices.

Support is optional. This project stays free and open-source for everyone.

If this repository saves you time or helps you ship faster, you can support ongoing maintenance:

- [☕ Buy me a book on Buy Me a Coffee](https://buymeacoffee.com/sickn33)

Where support goes:

- Skill curation, testing, and quality validation.
- Documentation updates, examples, and onboarding improvements.
- Faster triage and review of community issues and PRs.

Prefer non-financial support:

- Star the repository.
- Open clear, reproducible issues.
- Submit PRs (skills, docs, fixes).

@@ -328,6 +355,8 @@ This collection would not be possible without the incredible work of the Claude

- **[vercel-labs/agent-skills](https://github.com/vercel-labs/agent-skills)**: Vercel Labs official skills - React Best Practices, Web Design Guidelines.
- **[openai/skills](https://github.com/openai/skills)**: OpenAI Codex skills catalog - Agent skills, Skill Creator, Concise Planning.
- **[supabase/agent-skills](https://github.com/supabase/agent-skills)**: Supabase official skills - Postgres Best Practices.
- **[microsoft/skills](https://github.com/microsoft/skills)**: Official Microsoft skills - Azure cloud services, Bot Framework, Cognitive Services, and enterprise development patterns across .NET, Python, TypeScript, Go, Rust, and Java.
- **[google-gemini/gemini-skills](https://github.com/google-gemini/gemini-skills)**: Official Gemini skills - Gemini API, SDK and model interactions.

### Community Contributors
Binary file not shown (image changed: 49 KiB before, 50 KiB after).
@@ -1,5 +1,5 @@

{
  "generatedAt": "2026-02-08T10:02:49.626Z",
  "generatedAt": "2026-02-08T00:00:00.000Z",
  "aliases": {
    "accessibility-compliance-audit": "accessibility-compliance-accessibility-audit",
    "active directory attacks": "active-directory-attacks",

@@ -10,6 +10,25 @@

    "templates": "app-builder/templates",
    "application-performance-optimization": "application-performance-performance-optimization",
    "aws penetration testing": "aws-penetration-testing",
    "azure-ai-dotnet": "azure-ai-agents-persistent-dotnet",
    "azure-ai-java": "azure-ai-agents-persistent-java",
    "azure-ai-py": "azure-ai-contentunderstanding-py",
    "azure-ai-ts": "azure-ai-document-intelligence-ts",
    "azure-communication-java": "azure-communication-callautomation-java",
    "azure-keyvault-rust": "azure-keyvault-certificates-rust",
    "azure-messaging-java": "azure-messaging-webpubsub-java",
    "azure-messaging-py": "azure-messaging-webpubsubservice-py",
    "azure-mgmt-dotnet": "azure-mgmt-apimanagement-dotnet",
    "azure-microsoft-ts": "azure-microsoft-playwright-testing-ts",
    "azure-monitor-java": "azure-monitor-ingestion-java",
    "azure-monitor-py": "azure-monitor-opentelemetry-exporter-py",
    "azure-monitor-ts": "azure-monitor-opentelemetry-ts",
    "azure-resource-dotnet": "azure-resource-manager-cosmosdb-dotnet",
    "azure-search-dotnet": "azure-search-documents-dotnet",
    "azure-security-dotnet": "azure-security-keyvault-keys-dotnet",
    "azure-security-java": "azure-security-keyvault-keys-java",
    "azure-speech-py": "azure-speech-to-text-rest-py",
    "azure-storage-py": "azure-storage-file-datalake-py",
    "backend-development-feature": "backend-development-feature-development",
    "brand-guidelines": "brand-guidelines-anthropic",
    "broken authentication testing": "broken-authentication",

@@ -85,6 +104,7 @@

    "llm-application-optimize": "llm-application-dev-prompt-optimize",
    "machine-learning-pipeline": "machine-learning-ops-ml-pipeline",
    "metasploit framework": "metasploit-framework",
    "microsoft-azure-dotnet": "microsoft-azure-webjobs-extensions-authentication-events-dotnet",
    "moodle-external-development": "moodle-external-api-development",
    "multi-platform-apps": "multi-platform-apps-multi-platform",
    "network 101": "network-101",
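The alias map above resolves alternate spellings (spaces, shorthand suffixes) to canonical skill ids. A minimal lookup sketch, using a tiny illustrative subset of the map; the `resolve_skill` helper is hypothetical and not part of the repo:

```python
# Illustrative subset of the aliases map shown in the diff above.
ALIASES = {
    "metasploit framework": "metasploit-framework",
    "azure-ai-dotnet": "azure-ai-agents-persistent-dotnet",
    "network 101": "network-101",
}

def resolve_skill(name: str) -> str:
    """Map a user-supplied skill name to its canonical id.
    Unknown names pass through unchanged (they may already be canonical)."""
    key = name.strip().lower()
    return ALIASES.get(key, key)

print(resolve_skill("Metasploit Framework"))  # metasploit-framework
```

Normalizing case and whitespace before the lookup is what lets free-form prompt text like "Metasploit Framework" hit the alias table.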
@@ -1,10 +1,11 @@

{
  "generatedAt": "2026-02-08T10:02:49.626Z",
  "generatedAt": "2026-02-08T00:00:00.000Z",
  "bundles": {
    "core-dev": {
      "description": "Core development skills across languages, frameworks, and backend/frontend fundamentals.",
      "skills": [
        "3d-web-experience",
        "agent-framework-azure-ai-py",
        "algolia-search",
        "api-design-principles",
        "api-documentation-generator",

@@ -19,7 +20,91 @@

        "async-python-patterns",
        "autonomous-agents",
        "aws-serverless",
        "azure-ai-agents-persistent-java",
        "azure-ai-anomalydetector-java",
        "azure-ai-contentsafety-java",
        "azure-ai-contentsafety-py",
        "azure-ai-contentunderstanding-py",
        "azure-ai-formrecognizer-java",
        "azure-ai-ml-py",
        "azure-ai-projects-java",
        "azure-ai-projects-py",
        "azure-ai-projects-ts",
        "azure-ai-transcription-py",
        "azure-ai-translation-ts",
        "azure-ai-vision-imageanalysis-java",
        "azure-ai-voicelive-java",
        "azure-ai-voicelive-py",
        "azure-ai-voicelive-ts",
        "azure-appconfiguration-java",
        "azure-appconfiguration-py",
        "azure-appconfiguration-ts",
        "azure-communication-callautomation-java",
        "azure-communication-callingserver-java",
        "azure-communication-chat-java",
        "azure-communication-common-java",
        "azure-communication-sms-java",
        "azure-compute-batch-java",
        "azure-containerregistry-py",
        "azure-cosmos-db-py",
        "azure-cosmos-java",
        "azure-cosmos-py",
        "azure-cosmos-rust",
        "azure-cosmos-ts",
        "azure-data-tables-java",
        "azure-data-tables-py",
        "azure-eventgrid-java",
        "azure-eventgrid-py",
        "azure-eventhub-java",
        "azure-eventhub-py",
        "azure-eventhub-rust",
        "azure-eventhub-ts",
        "azure-functions",
        "azure-identity-java",
        "azure-identity-py",
        "azure-identity-rust",
        "azure-identity-ts",
        "azure-keyvault-certificates-rust",
        "azure-keyvault-keys-rust",
        "azure-keyvault-keys-ts",
        "azure-keyvault-py",
        "azure-keyvault-secrets-rust",
        "azure-keyvault-secrets-ts",
        "azure-messaging-webpubsub-java",
        "azure-messaging-webpubsubservice-py",
        "azure-mgmt-apicenter-dotnet",
        "azure-mgmt-apicenter-py",
        "azure-mgmt-apimanagement-dotnet",
        "azure-mgmt-apimanagement-py",
        "azure-mgmt-applicationinsights-dotnet",
        "azure-mgmt-botservice-py",
        "azure-mgmt-fabric-py",
        "azure-monitor-ingestion-java",
        "azure-monitor-ingestion-py",
        "azure-monitor-opentelemetry-exporter-java",
        "azure-monitor-opentelemetry-exporter-py",
        "azure-monitor-opentelemetry-py",
        "azure-monitor-opentelemetry-ts",
        "azure-monitor-query-java",
        "azure-monitor-query-py",
        "azure-postgres-ts",
        "azure-search-documents-py",
        "azure-search-documents-ts",
        "azure-security-keyvault-keys-java",
        "azure-security-keyvault-secrets-java",
        "azure-servicebus-py",
        "azure-servicebus-ts",
        "azure-speech-to-text-rest-py",
        "azure-storage-blob-java",
        "azure-storage-blob-py",
        "azure-storage-blob-rust",
        "azure-storage-blob-ts",
        "azure-storage-file-datalake-py",
        "azure-storage-file-share-py",
        "azure-storage-file-share-ts",
        "azure-storage-queue-py",
        "azure-storage-queue-ts",
        "azure-web-pubsub-ts",
        "backend-architect",
        "backend-dev-guidelines",
        "backend-development-feature-development",

@@ -33,6 +118,7 @@

        "claude-d3js-skill",
        "code-documentation-doc-generate",
        "context7-auto-research",
        "copilot-sdk",
        "discord-bot-architect",
        "django-pro",
        "documentation-generation-doc-generate",

@@ -42,6 +128,7 @@

        "dotnet-backend-patterns",
        "exa-search",
        "fastapi-pro",
        "fastapi-router-py",
        "fastapi-templates",
        "firebase",
        "firecrawl-scraper",

@@ -56,8 +143,11 @@

        "frontend-mobile-security-xss-scan",
        "frontend-security-coder",
        "frontend-slides",
        "frontend-ui-dark-ts",
        "game-development/mobile-games",
        "gemini-api-dev",
        "go-concurrency-patterns",
        "go-playwright",
        "golang-pro",
        "graphql",
        "hubspot-integration",

@@ -70,8 +160,11 @@

        "javascript-typescript-typescript-scaffold",
        "langgraph",
        "launch-strategy",
        "m365-agents-py",
        "m365-agents-ts",
        "makepad-skills",
        "mcp-builder",
        "mcp-builder-ms",
        "memory-safety-patterns",
        "mobile-design",
        "mobile-developer",

@@ -90,7 +183,9 @@

        "openapi-spec-generation",
        "php-pro",
        "plaid-fintech",
        "podcast-generation",
        "product-manager-toolkit",
        "pydantic-models-py",
        "python-development-python-scaffold",
        "python-packaging",
        "python-patterns",

@@ -98,6 +193,7 @@

        "python-pro",
        "python-testing-patterns",
        "react-best-practices",
        "react-flow-node-ts",
        "react-modernization",
        "react-native-architecture",
        "react-patterns",

@@ -136,18 +232,28 @@

        "voice-agents",
        "voice-ai-development",
        "web-artifacts-builder",
        "webapp-testing"
        "webapp-testing",
        "zustand-store-ts"
      ]
    },
    "security-core": {
      "description": "Security, privacy, and compliance essentials.",
      "skills": [
        "accessibility-compliance-accessibility-audit",
        "antigravity-workflows",
        "api-fuzzing-bug-bounty",
        "api-security-best-practices",
        "attack-tree-construction",
        "auth-implementation-patterns",
        "aws-penetration-testing",
        "azure-cosmos-db-py",
        "azure-identity-dotnet",
        "azure-keyvault-py",
        "azure-keyvault-secrets-rust",
        "azure-keyvault-secrets-ts",
        "azure-security-keyvault-keys-dotnet",
        "azure-security-keyvault-keys-java",
        "azure-security-keyvault-secrets-java",
        "backend-security-coder",
        "broken-authentication",
        "burp-suite-testing",

@@ -186,6 +292,8 @@

        "legal-advisor",
        "linkerd-patterns",
        "loki-mode",
        "m365-agents-dotnet",
        "m365-agents-py",
        "malware-analyst",
        "metasploit-framework",
        "mobile-security-coder",

@@ -237,6 +345,19 @@

    "k8s-core": {
      "description": "Kubernetes and service mesh essentials.",
      "skills": [
        "azd-deployment",
        "azure-cosmos-db-py",
        "azure-identity-dotnet",
        "azure-identity-java",
        "azure-identity-py",
        "azure-identity-ts",
        "azure-messaging-webpubsubservice-py",
        "azure-mgmt-apimanagement-dotnet",
        "azure-mgmt-botservice-dotnet",
        "azure-mgmt-botservice-py",
        "azure-servicebus-dotnet",
        "azure-servicebus-py",
        "azure-servicebus-ts",
        "backend-architect",
        "devops-troubleshooter",
        "freshservice-automation",

@@ -264,6 +385,35 @@

        "airflow-dag-patterns",
        "analytics-tracking",
        "angular-ui-patterns",
        "azure-ai-document-intelligence-dotnet",
        "azure-ai-document-intelligence-ts",
        "azure-ai-textanalytics-py",
        "azure-cosmos-db-py",
        "azure-cosmos-java",
        "azure-cosmos-py",
        "azure-cosmos-rust",
        "azure-cosmos-ts",
        "azure-data-tables-java",
        "azure-data-tables-py",
        "azure-eventhub-dotnet",
        "azure-eventhub-java",
        "azure-eventhub-rust",
        "azure-eventhub-ts",
        "azure-maps-search-dotnet",
        "azure-mgmt-applicationinsights-dotnet",
        "azure-monitor-ingestion-java",
        "azure-monitor-ingestion-py",
        "azure-monitor-query-java",
        "azure-monitor-query-py",
        "azure-postgres-ts",
        "azure-resource-manager-cosmosdb-dotnet",
        "azure-resource-manager-mysql-dotnet",
        "azure-resource-manager-postgresql-dotnet",
        "azure-resource-manager-redis-dotnet",
        "azure-resource-manager-sql-dotnet",
        "azure-security-keyvault-secrets-java",
        "azure-storage-blob-java",
        "azure-storage-file-datalake-py",
        "blockrun",
        "business-analyst",
        "cc-skill-backend-patterns",

@@ -288,6 +438,7 @@

        "firebase",
        "fp-ts-react",
        "frontend-dev-guidelines",
        "frontend-ui-dark-ts",
        "gdpr-data-handling",
        "google-analytics-automation",
        "googlesheets-automation",

@@ -312,6 +463,7 @@

        "postgresql",
        "prisma-expert",
        "programmatic-seo",
        "pydantic-models-py",
        "quant-analyst",
        "react-best-practices",
        "react-ui-patterns",

@@ -342,6 +494,13 @@

        "api-testing-observability-api-mock",
        "application-performance-performance-optimization",
        "aws-serverless",
        "azd-deployment",
        "azure-ai-anomalydetector-java",
        "azure-mgmt-applicationinsights-dotnet",
        "azure-mgmt-arizeaiobservabilityeval-dotnet",
        "azure-mgmt-weightsandbiases-dotnet",
        "azure-monitor-opentelemetry-exporter-java",
        "azure-monitor-opentelemetry-ts",
        "backend-architect",
        "backend-development-feature-development",
        "c4-container",
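Each bundle above is a plain list of skill ids, so combining several bundles (as the README suggests, e.g. a core bundle plus a security bundle) reduces to an order-preserving, de-duplicated union. A sketch with illustrative subsets of the data; the `combined_skills` helper is hypothetical, not part of the repo:

```python
# Illustrative subsets of data/bundles.json; real bundles list many more skills.
BUNDLES = {
    "core-dev": {"skills": ["backend-architect", "react-patterns", "python-pro"]},
    "security-core": {"skills": ["backend-security-coder", "python-pro", "threat-modeling-expert"]},
}

def combined_skills(*bundle_names: str) -> list[str]:
    """Union of the named bundles' skills, keeping first-seen order and
    dropping duplicates (skills shared by several bundles appear once)."""
    seen: set[str] = set()
    out: list[str] = []
    for name in bundle_names:
        for skill in BUNDLES[name]["skills"]:
            if skill not in seen:
                seen.add(skill)
                out.append(skill)
    return out

print(combined_skills("core-dev", "security-core"))
```

Deduplicating matters because bundles deliberately overlap (for instance, several bundles above list `azure-cosmos-db-py`).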
data/catalog.json (3838 changes): file diff suppressed because it is too large.

data/workflows.json (216 lines, new file)
@@ -0,0 +1,216 @@
{
  "generatedAt": "2026-02-10T00:00:00.000Z",
  "version": 1,
  "workflows": [
    {
      "id": "ship-saas-mvp",
      "name": "Ship a SaaS MVP",
      "description": "End-to-end workflow to scope, build, test, and ship a SaaS MVP quickly.",
      "category": "web",
      "relatedBundles": [
        "core-dev",
        "ops-core"
      ],
      "steps": [
        {
          "title": "Plan the scope",
          "goal": "Convert the idea into a clear implementation plan and milestones.",
          "recommendedSkills": [
            "brainstorming",
            "concise-planning",
            "writing-plans"
          ],
          "notes": "Define problem, user persona, MVP boundaries, and acceptance criteria before coding."
        },
        {
          "title": "Build backend and API",
          "goal": "Implement the core data model, API contracts, and auth baseline.",
          "recommendedSkills": [
            "backend-dev-guidelines",
            "api-patterns",
            "database-design",
            "auth-implementation-patterns"
          ],
          "notes": "Prefer small vertical slices; keep API contracts explicit and testable."
        },
        {
          "title": "Build frontend",
          "goal": "Deliver the primary user flows with production-grade UX patterns.",
          "recommendedSkills": [
            "frontend-developer",
            "react-patterns",
            "frontend-design"
          ],
          "notes": "Prioritize onboarding, empty states, and one complete happy-path flow."
        },
        {
          "title": "Test and validate",
          "goal": "Catch regressions and ensure key flows work before release.",
          "recommendedSkills": [
            "test-driven-development",
            "systematic-debugging",
            "browser-automation",
            "go-playwright"
          ],
          "notes": "Use go-playwright when the product stack or QA tooling is Go-based."
        },
        {
          "title": "Ship safely",
          "goal": "Release with basic observability and rollback readiness.",
          "recommendedSkills": [
            "deployment-procedures",
            "observability-engineer",
            "postmortem-writing"
          ],
          "notes": "Define release checklist, minimum telemetry, and rollback triggers."
        }
      ]
    },
    {
      "id": "security-audit-web-app",
      "name": "Security Audit for a Web App",
      "description": "Structured workflow for baseline AppSec review and risk triage.",
      "category": "security",
      "relatedBundles": [
        "security-core",
        "ops-core"
      ],
      "steps": [
        {
          "title": "Define scope and threat model",
          "goal": "Identify critical assets, trust boundaries, and threat scenarios.",
          "recommendedSkills": [
            "ethical-hacking-methodology",
            "threat-modeling-expert",
            "attack-tree-construction"
          ],
          "notes": "Document in-scope targets, assumptions, and out-of-scope constraints."
        },
        {
          "title": "Review authentication and authorization",
          "goal": "Find broken auth patterns and access-control weaknesses.",
          "recommendedSkills": [
            "broken-authentication",
            "auth-implementation-patterns",
            "idor-testing"
          ],
          "notes": "Prioritize account takeover and privilege escalation paths."
        },
        {
          "title": "Assess API and input security",
          "goal": "Detect high-impact API and injection risks.",
          "recommendedSkills": [
            "api-security-best-practices",
            "api-fuzzing-bug-bounty",
            "top-web-vulnerabilities"
          ],
          "notes": "Map findings to severity and exploitability, not only CVSS."
        },
        {
          "title": "Harden and verify",
          "goal": "Translate findings into concrete remediations and retest.",
          "recommendedSkills": [
            "security-auditor",
            "sast-configuration",
            "verification-before-completion"
          ],
          "notes": "Track remediation owners and target dates; verify each fix with evidence."
        }
      ]
    },
    {
      "id": "build-ai-agent-system",
      "name": "Build an AI Agent System",
      "description": "Workflow to design, implement, and evaluate a production-ready AI agent.",
      "category": "ai-agents",
      "relatedBundles": [
        "core-dev",
        "data-core"
      ],
      "steps": [
        {
          "title": "Define use case and reliability targets",
          "goal": "Choose a narrow use case and measurable quality goals.",
          "recommendedSkills": [
            "ai-agents-architect",
            "agent-evaluation",
            "product-manager-toolkit"
          ],
          "notes": "Set latency, quality, and failure-rate thresholds before implementation."
        },
        {
          "title": "Design architecture and retrieval",
          "goal": "Design tools, memory, and retrieval strategy for the agent.",
          "recommendedSkills": [
            "llm-app-patterns",
            "rag-implementation",
            "vector-database-engineer",
            "embedding-strategies"
          ],
          "notes": "Keep retrieval quality measurable and version prompt/tool contracts."
        },
        {
          "title": "Implement orchestration",
          "goal": "Implement the orchestration loop and production safeguards.
|
||||
"recommendedSkills": [
|
||||
"langgraph",
|
||||
"mcp-builder",
|
||||
"workflow-automation"
|
||||
],
|
||||
"notes": "Start with constrained tool permissions and explicit fallback behavior."
|
||||
},
|
||||
{
|
||||
"title": "Evaluate and iterate",
|
||||
"goal": "Run benchmark scenarios and improve weak areas systematically.",
|
||||
"recommendedSkills": [
|
||||
"agent-evaluation",
|
||||
"langfuse",
|
||||
"kaizen"
|
||||
],
|
||||
"notes": "Use test datasets and failure buckets to guide each iteration cycle."
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"id": "qa-browser-automation",
|
||||
"name": "QA and Browser Automation",
|
||||
"description": "Workflow for robust E2E and browser-driven validation across stacks.",
|
||||
"category": "testing",
|
||||
"relatedBundles": [
|
||||
"core-dev",
|
||||
"ops-core"
|
||||
],
|
||||
"steps": [
|
||||
{
|
||||
"title": "Prepare test strategy",
|
||||
"goal": "Define critical user journeys, environments, and test data.",
|
||||
"recommendedSkills": [
|
||||
"e2e-testing-patterns",
|
||||
"test-driven-development",
|
||||
"code-review-checklist"
|
||||
],
|
||||
"notes": "Focus on business-critical flows and keep setup deterministic."
|
||||
},
|
||||
{
|
||||
"title": "Implement browser tests",
|
||||
"goal": "Automate key flows with resilient locators and stable waits.",
|
||||
"recommendedSkills": [
|
||||
"browser-automation",
|
||||
"go-playwright"
|
||||
],
|
||||
"notes": "Use go-playwright for Go-native automation projects and Playwright for JS/TS stacks."
|
||||
},
|
||||
{
|
||||
"title": "Triage failures and harden",
|
||||
"goal": "Stabilize flaky tests and establish repeatable CI execution.",
|
||||
"recommendedSkills": [
|
||||
"systematic-debugging",
|
||||
"test-fixing",
|
||||
"verification-before-completion"
|
||||
],
|
||||
"notes": "Classify failures by root cause: selector drift, timing, environment, data."
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
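Every step object in `data/workflows.json` above carries the same four fields (`title`, `goal`, `recommendedSkills`, `notes`). A minimal sketch of a consistency check over that shape — the field names are read off the JSON above, while the function name and the trimmed-down sample entry are hypothetical:

```python
import json

# Field names taken from the step objects in data/workflows.json above.
REQUIRED_STEP_FIELDS = {"title", "goal", "recommendedSkills", "notes"}

def validate_workflow(workflow: dict) -> list[str]:
    """Return human-readable problems found in one workflow entry."""
    problems = []
    for key in ("id", "name", "description", "category", "steps"):
        if key not in workflow:
            problems.append(f"missing top-level field: {key}")
    for i, step in enumerate(workflow.get("steps", [])):
        missing = REQUIRED_STEP_FIELDS - step.keys()
        if missing:
            problems.append(f"step {i} missing: {sorted(missing)}")
        if not step.get("recommendedSkills"):
            problems.append(f"step {i} lists no recommended skills")
    return problems

# Hypothetical entry mirroring the shape above.
sample = json.loads("""
{
  "id": "qa-browser-automation",
  "name": "QA and Browser Automation",
  "description": "Workflow for robust E2E validation.",
  "category": "testing",
  "steps": [
    {
      "title": "Prepare test strategy",
      "goal": "Define critical user journeys.",
      "recommendedSkills": ["e2e-testing-patterns"],
      "notes": "Keep setup deterministic."
    }
  ]
}
""")
print(validate_workflow(sample))  # → []
```

A check like this could run in CI so that docs and schema stay in sync whenever a workflow is added or renamed.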
13
docs/FAQ.md
@@ -11,12 +11,23 @@
Skills are specialized instruction files that teach AI assistants how to handle specific tasks. Think of them as expert knowledge modules that your AI can load on-demand.
**Simple analogy:** Just like you might consult different experts (a lawyer, a doctor, a mechanic), these skills let your AI become an expert in different areas when you need them.

### Do I need to install all 626+ skills?
### Do I need to install all 700+ skills?

**No!** When you clone the repository, all skills are available, but your AI only loads them when you explicitly invoke them with `@skill-name`.
It's like having a library - all books are there, but you only read the ones you need.
**Pro Tip:** Use [Starter Packs](BUNDLES.md) to install only what matches your role.

### What is the difference between Bundles and Workflows?

- **Bundles** are curated recommendations grouped by role or domain.
- **Workflows** are ordered execution playbooks for concrete outcomes.

Use bundles when you are deciding *which skills* to include. Use workflows when you need *step-by-step execution*.

Start from:
- [BUNDLES.md](BUNDLES.md)
- [WORKFLOWS.md](WORKFLOWS.md)

### Which AI tools work with these skills?

- ✅ **Claude Code** (Anthropic CLI)
@@ -15,7 +15,7 @@ AI Agents (like **Claude Code**, **Gemini**, **Cursor**) are smart, but they lac

## ⚡️ Quick Start: The "Starter Packs"

Don't panic about the 626+ skills. You don't need them all at once.
Don't panic about the 700+ skills. You don't need them all at once.
We have curated **Starter Packs** to get you running immediately.

You **install the full repo once** (npx or clone); Starter Packs are curated lists to help you **pick which skills to use** by role (e.g. Web Wizard, Hacker Pack)—they are not a different way to install.
@@ -52,6 +52,21 @@ Find the bundle that matches your role (see [BUNDLES.md](BUNDLES.md)):

---

## 🧭 Bundles vs Workflows

Bundles and workflows solve different problems:

- **Bundles** = curated sets by role (what to pick).
- **Workflows** = step-by-step playbooks (how to execute).

Start with bundles in [BUNDLES.md](BUNDLES.md), then run a workflow from [WORKFLOWS.md](WORKFLOWS.md) when you need guided execution.

Example:

> "Use **@antigravity-workflows** and run `ship-saas-mvp` for my project idea."

---

## 🚀 How to Use a Skill

Once installed, just talk to your AI naturally.
@@ -103,7 +118,7 @@ _Check the [Skill Catalog](../CATALOG.md) for the full list._

## ❓ FAQ

**Q: Do I need to install all 626 skills?**
**Q: Do I need to install all 700+ skills?**
A: You clone the whole repo once; your AI only _reads_ the skills you invoke (or that are relevant), so it stays lightweight. **Starter Packs** in [BUNDLES.md](BUNDLES.md) are curated lists to help you discover the right skills for your role—they don't change how you install.

**Q: Can I make my own skills?**
21
docs/LICENSE-MICROSOFT
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) Microsoft Corporation.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE
@@ -3,16 +3,16 @@
We believe in giving credit where credit is due.
If you recognize your work here and it is not properly attributed, please open an Issue.

| Skill / Category | Original Source | License | Notes |
| :--- | :--- | :--- | :--- |
| `cloud-penetration-testing` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `active-directory-attacks` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `owasp-top-10` | [OWASP](https://owasp.org/) | CC-BY-SA | Methodology adapted. |
| `burp-suite-testing` | [PortSwigger](https://portswigger.net/burp) | N/A | Usage guide only (no binary). |
| `crewai` | [CrewAI](https://github.com/joaomdmoura/crewAI) | MIT | Framework guides. |
| `langgraph` | [LangGraph](https://github.com/langchain-ai/langgraph) | MIT | Framework guides. |
| `react-patterns` | [React Docs](https://react.dev/) | CC-BY | Official patterns. |
| **All Official Skills** | [Anthropic / Google / OpenAI] | Proprietary | Usage encouraged by vendors. |
| Skill / Category | Original Source | License | Notes |
| :--- | :--- | :--- | :--- |
| `cloud-penetration-testing` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `active-directory-attacks` | [HackTricks](https://book.hacktricks.xyz/) | MIT / CC-BY-SA | Adapted for agentic use. |
| `owasp-top-10` | [OWASP](https://owasp.org/) | CC-BY-SA | Methodology adapted. |
| `burp-suite-testing` | [PortSwigger](https://portswigger.net/burp) | N/A | Usage guide only (no binary). |
| `crewai` | [CrewAI](https://github.com/joaomdmoura/crewAI) | MIT | Framework guides. |
| `langgraph` | [LangGraph](https://github.com/langchain-ai/langgraph) | MIT | Framework guides. |
| `react-patterns` | [React Docs](https://react.dev/) | CC-BY | Official patterns. |
| **All Official Skills** | [Anthropic / Google / OpenAI / Microsoft / Supabase / Vercel Labs] | Proprietary | Usage encouraged by vendors. |

## Skills from VoltAgent/awesome-agent-skills

@@ -20,44 +20,44 @@ The following skills were added from the curated collection at [VoltAgent/awesom

### Official Team Skills

| Skill | Original Source | License | Notes |
| :--- | :--- | :--- | :--- |
| `vercel-deploy-claimable` | [Vercel Labs](https://github.com/vercel-labs/agent-skills) | MIT | Official Vercel skill |
| `design-md` | [Google Labs (Stitch)](https://github.com/google-labs-code/stitch-skills) | Compatible | Google Labs Stitch skills |
| `hugging-face-cli`, `hugging-face-jobs` | [Hugging Face](https://github.com/huggingface/skills) | Compatible | Official Hugging Face skills |
| `culture-index`, `fix-review`, `sharp-edges` | [Trail of Bits](https://github.com/trailofbits/skills) | Compatible | Security skills from Trail of Bits |
| `expo-deployment`, `upgrading-expo` | [Expo](https://github.com/expo/skills) | Compatible | Official Expo skills |
| `commit`, `create-pr`, `find-bugs`, `iterate-pr` | [Sentry](https://github.com/getsentry/skills) | Compatible | Sentry dev team skills |
| `using-neon` | [Neon](https://github.com/neondatabase/agent-skills) | Compatible | Neon Postgres best practices |
| `fal-audio`, `fal-generate`, `fal-image-edit`, `fal-platform`, `fal-upscale`, `fal-workflow` | [fal.ai Community](https://github.com/fal-ai-community/skills) | Compatible | fal.ai AI model skills |

### Community Skills

| Skill | Original Source | License | Notes |
| :--- | :--- | :--- | :--- |
| `automate-whatsapp`, `observe-whatsapp` | [gokapso](https://github.com/gokapso/agent-skills) | Compatible | WhatsApp automation skills |
| `readme` | [Shpigford](https://github.com/Shpigford/skills) | Compatible | README generation |
| `screenshots` | [Shpigford](https://github.com/Shpigford/skills) | Compatible | Marketing screenshots |
| `aws-skills` | [zxkane](https://github.com/zxkane/aws-skills) | Compatible | AWS development patterns |
| `deep-research` | [sanjay3290](https://github.com/sanjay3290/ai-skills) | Compatible | Gemini Deep Research Agent |
| `ffuf-claude-skill` | [jthack](https://github.com/jthack/ffuf_claude_skill) | Compatible | Web fuzzing with ffuf |
| `ui-skills` | [ibelick](https://github.com/ibelick/ui-skills) | Compatible | UI development constraints |
| `vexor` | [scarletkc](https://github.com/scarletkc/vexor) | Compatible | Vector-powered CLI |
| `pypict-skill` | [omkamal](https://github.com/omkamal/pypict-claude-skill) | Compatible | Pairwise test generation |
| `makepad-skills` | [ZhangHanDong](https://github.com/ZhangHanDong/makepad-skills) | Compatible | Makepad UI development |
| `swiftui-expert-skill` | [AvdLee](https://github.com/AvdLee/SwiftUI-Agent-Skill) | Compatible | SwiftUI best practices |
| `threejs-skills` | [CloudAI-X](https://github.com/CloudAI-X/threejs-skills) | Compatible | Three.js 3D experiences |
| `claude-scientific-skills` | [K-Dense-AI](https://github.com/K-Dense-AI/claude-scientific-skills) | Compatible | Scientific research skills |
| `claude-win11-speckit-update-skill` | [NotMyself](https://github.com/NotMyself/claude-win11-speckit-update-skill) | Compatible | Windows 11 management |
| `imagen` | [sanjay3290](https://github.com/sanjay3290/ai-skills) | Compatible | Google Gemini image generation |
| `security-bluebook-builder` | [SHADOWPR0](https://github.com/SHADOWPR0/security-bluebook-builder) | Compatible | Security documentation |
| `claude-ally-health` | [huifer](https://github.com/huifer/Claude-Ally-Health) | Compatible | Health assistant |
| `clarity-gate` | [frmoretto](https://github.com/frmoretto/clarity-gate) | Compatible | RAG quality verification |
| `n8n-code-python`, `n8n-mcp-tools-expert`, `n8n-node-configuration` | [czlonkowski](https://github.com/czlonkowski/n8n-skills) | Compatible | n8n automation skills |
| `varlock-claude-skill` | [wrsmith108](https://github.com/wrsmith108/varlock-claude-skill) | Compatible | Secure environment variables |
| `beautiful-prose` | [SHADOWPR0](https://github.com/SHADOWPR0/beautiful_prose) | Compatible | Writing style guide |
| `claude-speed-reader` | [SeanZoR](https://github.com/SeanZoR/claude-speed-reader) | Compatible | Speed reading tool |
| `skill-seekers` | [yusufkaraaslan](https://github.com/yusufkaraaslan/Skill_Seekers) | Compatible | Skill conversion tool |

- **frontend-slides** - [zarazhangrui](https://github.com/zarazhangrui/frontend-slides)
- **linear-claude-skill** - [wrsmith108](https://github.com/wrsmith108/linear-claude-skill)
@@ -74,11 +74,11 @@ The following skills were added from the curated collection at [VoltAgent/awesom

## Skills from whatiskadudoing/fp-ts-skills (v4.4.0)

| Skill | Original Source | License | Notes |
| :--- | :--- | :--- | :--- |
| `fp-ts-pragmatic` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | Pragmatic fp-ts guide – pipe, Option, Either, TaskEither |
| `fp-ts-react` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | fp-ts with React 18/19 and Next.js |
| `fp-ts-errors` | [whatiskadudoing/fp-ts-skills](https://github.com/whatiskadudoing/fp-ts-skills) | Compatible | Type-safe error handling with Either and TaskEither |

## License Policy

174
docs/WORKFLOWS.md
Normal file
@@ -0,0 +1,174 @@
# Antigravity Workflows

> Workflow playbooks to orchestrate multiple skills with less friction.

## What Is a Workflow?

A workflow is a guided, step-by-step execution path that combines multiple skills for one concrete outcome.

- **Bundles** tell you which skills are relevant for a role.
- **Workflows** tell you how to use those skills in sequence to complete a real objective.

If bundles are your toolbox, workflows are your execution playbook.

---

## How to Use Workflows

1. Install the repository once (`npx antigravity-awesome-skills`).
2. Pick a workflow matching your immediate goal.
3. Execute steps in order and invoke the listed skills in each step.
4. Keep output artifacts at each step (plan, decisions, tests, validation evidence).

You can combine workflows with bundles from [BUNDLES.md](BUNDLES.md) when you need broader coverage.

---

## Workflow: Ship a SaaS MVP

Build and ship a minimal but production-minded SaaS product.

**Related bundles:** `Essentials`, `Full-Stack Developer`, `QA & Testing`, `DevOps & Cloud`

### Prerequisites

- Local repository and runtime configured.
- Clear user problem and MVP scope.
- Basic deployment target selected.

### Steps

1. **Plan the scope**
   - **Goal:** Define MVP boundaries and acceptance criteria.
   - **Skills:** [`@brainstorming`](../skills/brainstorming/), [`@concise-planning`](../skills/concise-planning/), [`@writing-plans`](../skills/writing-plans/)
   - **Prompt example:** `Use @concise-planning to define milestones and acceptance criteria for my SaaS MVP.`

2. **Build backend and API**
   - **Goal:** Implement core entities, APIs, and auth baseline.
   - **Skills:** [`@backend-dev-guidelines`](../skills/backend-dev-guidelines/), [`@api-patterns`](../skills/api-patterns/), [`@database-design`](../skills/database-design/)
   - **Prompt example:** `Use @backend-dev-guidelines to create the APIs and services for the billing domain.`

3. **Build frontend**
   - **Goal:** Ship core user flow with clear UX states.
   - **Skills:** [`@frontend-developer`](../skills/frontend-developer/), [`@react-patterns`](../skills/react-patterns/), [`@frontend-design`](../skills/frontend-design/)
   - **Prompt example:** `Use @frontend-developer to implement onboarding, empty states, and the initial dashboard.`

4. **Test and validate**
   - **Goal:** Cover critical user journeys before release.
   - **Skills:** [`@test-driven-development`](../skills/test-driven-development/), [`@browser-automation`](../skills/browser-automation/), `@go-playwright` (optional, Go stack)
   - **Prompt example:** `Use @browser-automation to create E2E tests for the signup and checkout flows.`
   - **Go note:** If the QA project and tooling are Go-based, prefer `@go-playwright`.

5. **Ship safely**
   - **Goal:** Release with observability and rollback plan.
   - **Skills:** [`@deployment-procedures`](../skills/deployment-procedures/), [`@observability-engineer`](../skills/observability-engineer/)
   - **Prompt example:** `Use @deployment-procedures to build a release checklist with rollback.`

---

## Workflow: Security Audit for a Web App

Run a focused security review from scope definition to remediation validation.

**Related bundles:** `Security Engineer`, `Security Developer`, `Observability & Monitoring`

### Prerequisites

- Explicit authorization for testing.
- In-scope targets documented.
- Logging and environment details available.

### Steps

1. **Define scope and threat model**
   - **Goal:** Identify assets, trust boundaries, and attack paths.
   - **Skills:** [`@ethical-hacking-methodology`](../skills/ethical-hacking-methodology/), [`@threat-modeling-expert`](../skills/threat-modeling-expert/), [`@attack-tree-construction`](../skills/attack-tree-construction/)
   - **Prompt example:** `Use @threat-modeling-expert to map my web app's critical assets and trust boundaries.`

2. **Review auth and access control**
   - **Goal:** Detect account takeover and authorization flaws.
   - **Skills:** [`@broken-authentication`](../skills/broken-authentication/), [`@auth-implementation-patterns`](../skills/auth-implementation-patterns/), [`@idor-testing`](../skills/idor-testing/)
   - **Prompt example:** `Use @idor-testing to check for unauthorized access on multitenant endpoints.`

3. **Assess API and input security**
   - **Goal:** Uncover high-impact API and injection vulnerabilities.
   - **Skills:** [`@api-security-best-practices`](../skills/api-security-best-practices/), [`@api-fuzzing-bug-bounty`](../skills/api-fuzzing-bug-bounty/), [`@top-web-vulnerabilities`](../skills/top-web-vulnerabilities/)
   - **Prompt example:** `Use @api-security-best-practices to audit the auth, billing, and admin endpoints.`

4. **Harden and verify**
   - **Goal:** Convert findings into fixes and verify evidence of mitigation.
   - **Skills:** [`@security-auditor`](../skills/security-auditor/), [`@sast-configuration`](../skills/sast-configuration/), [`@verification-before-completion`](../skills/verification-before-completion/)
   - **Prompt example:** `Use @verification-before-completion to prove the mitigations are effective.`

---

## Workflow: Build an AI Agent System

Design and deliver a production-grade agent with measurable reliability.

**Related bundles:** `Agent Architect`, `LLM Application Developer`, `Data Engineering`

### Prerequisites

- Narrow use case with measurable outcomes.
- Access to model provider(s) and observability tooling.
- Initial dataset or knowledge corpus.

### Steps

1. **Define target behavior and KPIs**
   - **Goal:** Set quality, latency, and failure thresholds.
   - **Skills:** [`@ai-agents-architect`](../skills/ai-agents-architect/), [`@agent-evaluation`](../skills/agent-evaluation/), [`@product-manager-toolkit`](../skills/product-manager-toolkit/)
   - **Prompt example:** `Use @agent-evaluation to define benchmarks and success criteria for my agent.`

2. **Design retrieval and memory**
   - **Goal:** Build reliable retrieval and context architecture.
   - **Skills:** [`@llm-app-patterns`](../skills/llm-app-patterns/), [`@rag-implementation`](../skills/rag-implementation/), [`@vector-database-engineer`](../skills/vector-database-engineer/)
   - **Prompt example:** `Use @rag-implementation to design the chunking, embedding, and retrieval pipeline.`

3. **Implement orchestration**
   - **Goal:** Implement deterministic orchestration and tool boundaries.
   - **Skills:** [`@langgraph`](../skills/langgraph/), [`@mcp-builder`](../skills/mcp-builder/), [`@workflow-automation`](../skills/workflow-automation/)
   - **Prompt example:** `Use @langgraph to implement the agent graph with fallbacks and human-in-the-loop.`

4. **Evaluate and iterate**
   - **Goal:** Improve weak points with a structured loop.
   - **Skills:** [`@agent-evaluation`](../skills/agent-evaluation/), [`@langfuse`](../skills/langfuse/), [`@kaizen`](../skills/kaizen/)
   - **Prompt example:** `Use @kaizen to prioritize fixes for the failure modes the tests surface.`

---

## Workflow: QA and Browser Automation

Create resilient browser automation with deterministic execution in CI.

**Related bundles:** `QA & Testing`, `Full-Stack Developer`

### Prerequisites

- Test environments and stable credentials.
- Critical user journeys identified.
- CI pipeline available.

### Steps

1. **Prepare test strategy**
   - **Goal:** Scope journeys, fixtures, and execution environments.
   - **Skills:** [`@e2e-testing-patterns`](../skills/e2e-testing-patterns/), [`@test-driven-development`](../skills/test-driven-development/)
   - **Prompt example:** `Use @e2e-testing-patterns to define a minimal but high-impact E2E suite.`

2. **Implement browser tests**
   - **Goal:** Build robust test coverage with stable selectors.
   - **Skills:** [`@browser-automation`](../skills/browser-automation/), `@go-playwright` (optional, Go stack)
   - **Prompt example:** `Use @go-playwright to implement browser automation in a Go project.`

3. **Triage and harden**
   - **Goal:** Remove flaky behavior and enforce repeatability.
   - **Skills:** [`@systematic-debugging`](../skills/systematic-debugging/), [`@test-fixing`](../skills/test-fixing/), [`@verification-before-completion`](../skills/verification-before-completion/)
   - **Prompt example:** `Use @systematic-debugging to classify and fix flakiness in CI.`

---

## Machine-Readable Workflows

For tooling and automation, workflow metadata is available in [data/workflows.json](../data/workflows.json).
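Consuming that file from tooling can be sketched in a few lines — assuming, as the fragment above suggests, that the workflow array sits under a top-level `workflows` key; the function name and the trimmed-down inline payload are hypothetical:

```python
import json

def index_workflows(raw: str) -> dict[str, dict]:
    """Parse the workflows JSON and index entries by their id."""
    data = json.loads(raw)
    return {wf["id"]: wf for wf in data["workflows"]}

# Hypothetical, trimmed-down payload with the same shape as data/workflows.json.
raw = '{"workflows": [{"id": "ship-saas-mvp", "steps": []}, {"id": "qa-browser-automation", "steps": []}]}'
by_id = index_workflows(raw)
print(sorted(by_id))  # → ['qa-browser-automation', 'ship-saas-mvp']
```

Indexing by `id` makes it easy for a CLI or CI check to look up the workflow a user named in a prompt.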
709
docs/microsoft-skills-attribution.json
Normal file
@@ -0,0 +1,709 @@
{
  "source": "microsoft/skills",
  "repository": "https://github.com/microsoft/skills",
  "license": "MIT",
  "synced_skills": 140,
  "structure": "flat (frontmatter name as directory name)",
  "skills": [
    { "flat_name": "azure-ai-voicelive-dotnet", "original_path": "dotnet/foundry/voicelive", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-document-intelligence-dotnet", "original_path": "dotnet/foundry/document-intelligence", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-openai-dotnet", "original_path": "dotnet/foundry/openai", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-weightsandbiases-dotnet", "original_path": "dotnet/foundry/weightsandbiases", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-projects-dotnet", "original_path": "dotnet/foundry/projects", "source": "microsoft/skills" },
    { "flat_name": "azure-search-documents-dotnet", "original_path": "dotnet/foundry/search-documents", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-applicationinsights-dotnet", "original_path": "dotnet/monitoring/applicationinsights", "source": "microsoft/skills" },
    { "flat_name": "m365-agents-dotnet", "original_path": "dotnet/m365/m365-agents", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-apimanagement-dotnet", "original_path": "dotnet/integration/apimanagement", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-apicenter-dotnet", "original_path": "dotnet/integration/apicenter", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-playwright-dotnet", "original_path": "dotnet/compute/playwright", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-durabletask-dotnet", "original_path": "dotnet/compute/durabletask", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-botservice-dotnet", "original_path": "dotnet/compute/botservice", "source": "microsoft/skills" },
    { "flat_name": "azure-identity-dotnet", "original_path": "dotnet/entra/azure-identity", "source": "microsoft/skills" },
    { "flat_name": "microsoft-azure-webjobs-extensions-authentication-events-dotnet", "original_path": "dotnet/entra/authentication-events", "source": "microsoft/skills" },
    { "flat_name": "azure-security-keyvault-keys-dotnet", "original_path": "dotnet/entra/keyvault", "source": "microsoft/skills" },
    { "flat_name": "azure-maps-search-dotnet", "original_path": "dotnet/general/maps", "source": "microsoft/skills" },
    { "flat_name": "azure-eventgrid-dotnet", "original_path": "dotnet/messaging/eventgrid", "source": "microsoft/skills" },
    { "flat_name": "azure-servicebus-dotnet", "original_path": "dotnet/messaging/servicebus", "source": "microsoft/skills" },
    { "flat_name": "azure-eventhub-dotnet", "original_path": "dotnet/messaging/eventhubs", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-redis-dotnet", "original_path": "dotnet/data/redis", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-postgresql-dotnet", "original_path": "dotnet/data/postgresql", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-mysql-dotnet", "original_path": "dotnet/data/mysql", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-cosmosdb-dotnet", "original_path": "dotnet/data/cosmosdb", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-fabric-dotnet", "original_path": "dotnet/data/fabric", "source": "microsoft/skills" },
    { "flat_name": "azure-resource-manager-sql-dotnet", "original_path": "dotnet/data/sql", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-arizeaiobservabilityeval-dotnet", "original_path": "dotnet/partner/arize-ai-observability-eval", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-mongodbatlas-dotnet", "original_path": "dotnet/partner/mongodbatlas", "source": "microsoft/skills" },
    { "flat_name": "azure-keyvault-keys-rust", "original_path": "rust/entra/azure-keyvault-keys-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-keyvault-secrets-rust", "original_path": "rust/entra/azure-keyvault-secrets-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-identity-rust", "original_path": "rust/entra/azure-identity-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-keyvault-certificates-rust", "original_path": "rust/entra/azure-keyvault-certificates-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-eventhub-rust", "original_path": "rust/messaging/azure-eventhub-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-cosmos-rust", "original_path": "rust/data/azure-cosmos-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-blob-rust", "original_path": "rust/data/azure-storage-blob-rust", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-voicelive-ts", "original_path": "typescript/foundry/voicelive", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-contentsafety-ts", "original_path": "typescript/foundry/contentsafety", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-document-intelligence-ts", "original_path": "typescript/foundry/document-intelligence", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-projects-ts", "original_path": "typescript/foundry/projects", "source": "microsoft/skills" },
    { "flat_name": "azure-search-documents-ts", "original_path": "typescript/foundry/search-documents", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-translation-ts", "original_path": "typescript/foundry/translation", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-opentelemetry-ts", "original_path": "typescript/monitoring/opentelemetry", "source": "microsoft/skills" },
    { "flat_name": "zustand-store-ts", "original_path": "typescript/frontend/zustand-store", "source": "microsoft/skills" },
    { "flat_name": "frontend-ui-dark-ts", "original_path": "typescript/frontend/frontend-ui-dark", "source": "microsoft/skills" },
    { "flat_name": "react-flow-node-ts", "original_path": "typescript/frontend/react-flow-node", "source": "microsoft/skills" },
    { "flat_name": "m365-agents-ts", "original_path": "typescript/m365/m365-agents", "source": "microsoft/skills" },
    { "flat_name": "azure-appconfiguration-ts", "original_path": "typescript/integration/appconfiguration", "source": "microsoft/skills" },
    { "flat_name": "azure-microsoft-playwright-testing-ts", "original_path": "typescript/compute/playwright", "source": "microsoft/skills" },
    { "flat_name": "azure-identity-ts", "original_path": "typescript/entra/azure-identity", "source": "microsoft/skills" },
    { "flat_name": "azure-keyvault-keys-ts", "original_path": "typescript/entra/keyvault-keys", "source": "microsoft/skills" },
    { "flat_name": "azure-keyvault-secrets-ts", "original_path": "typescript/entra/keyvault-secrets", "source": "microsoft/skills" },
    { "flat_name": "azure-servicebus-ts", "original_path": "typescript/messaging/servicebus", "source": "microsoft/skills" },
    { "flat_name": "azure-web-pubsub-ts", "original_path": "typescript/messaging/webpubsub", "source": "microsoft/skills" },
    { "flat_name": "azure-eventhub-ts", "original_path": "typescript/messaging/eventhubs", "source": "microsoft/skills" },
    { "flat_name": "azure-cosmos-ts", "original_path": "typescript/data/cosmosdb", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-blob-ts", "original_path": "typescript/data/blob", "source": "microsoft/skills" },
    { "flat_name": "azure-postgres-ts", "original_path": "typescript/data/postgres", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-queue-ts", "original_path": "typescript/data/queue", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-file-share-ts", "original_path": "typescript/data/fileshare", "source": "microsoft/skills" },
    { "flat_name": "azure-speech-to-text-rest-py", "original_path": "python/foundry/speech-to-text-rest", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-transcription-py", "original_path": "python/foundry/transcription", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-vision-imageanalysis-py", "original_path": "python/foundry/vision-imageanalysis", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-contentunderstanding-py", "original_path": "python/foundry/contentunderstanding", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-voicelive-py", "original_path": "python/foundry/voicelive", "source": "microsoft/skills" },
    { "flat_name": "agent-framework-azure-ai-py", "original_path": "python/foundry/agent-framework", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-contentsafety-py", "original_path": "python/foundry/contentsafety", "source": "microsoft/skills" },
    { "flat_name": "agents-v2-py", "original_path": "python/foundry/agents-v2", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-translation-document-py", "original_path": "python/foundry/translation-document", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-translation-text-py", "original_path": "python/foundry/translation-text", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-textanalytics-py", "original_path": "python/foundry/textanalytics", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-ml-py", "original_path": "python/foundry/ml", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-projects-py", "original_path": "python/foundry/projects", "source": "microsoft/skills" },
    { "flat_name": "azure-search-documents-py", "original_path": "python/foundry/search-documents", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-opentelemetry-py", "original_path": "python/monitoring/opentelemetry", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-ingestion-py", "original_path": "python/monitoring/ingestion", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-query-py", "original_path": "python/monitoring/query", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-opentelemetry-exporter-py", "original_path": "python/monitoring/opentelemetry-exporter", "source": "microsoft/skills" },
    { "flat_name": "m365-agents-py", "original_path": "python/m365/m365-agents", "source": "microsoft/skills" },
    { "flat_name": "azure-appconfiguration-py", "original_path": "python/integration/appconfiguration", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-apimanagement-py", "original_path": "python/integration/apimanagement", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-apicenter-py", "original_path": "python/integration/apicenter", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-fabric-py", "original_path": "python/compute/fabric", "source": "microsoft/skills" },
    { "flat_name": "azure-mgmt-botservice-py", "original_path": "python/compute/botservice", "source": "microsoft/skills" },
    { "flat_name": "azure-containerregistry-py", "original_path": "python/compute/containerregistry", "source": "microsoft/skills" },
    { "flat_name": "azure-identity-py", "original_path": "python/entra/azure-identity", "source": "microsoft/skills" },
    { "flat_name": "azure-keyvault-py", "original_path": "python/entra/keyvault", "source": "microsoft/skills" },
    { "flat_name": "azure-eventgrid-py", "original_path": "python/messaging/eventgrid", "source": "microsoft/skills" },
    { "flat_name": "azure-servicebus-py", "original_path": "python/messaging/servicebus", "source": "microsoft/skills" },
    { "flat_name": "azure-messaging-webpubsubservice-py", "original_path": "python/messaging/webpubsub-service", "source": "microsoft/skills" },
    { "flat_name": "azure-eventhub-py", "original_path": "python/messaging/eventhub", "source": "microsoft/skills" },
    { "flat_name": "azure-data-tables-py", "original_path": "python/data/tables", "source": "microsoft/skills" },
    { "flat_name": "azure-cosmos-py", "original_path": "python/data/cosmos", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-blob-py", "original_path": "python/data/blob", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-file-datalake-py", "original_path": "python/data/datalake", "source": "microsoft/skills" },
    { "flat_name": "azure-cosmos-db-py", "original_path": "python/data/cosmos-db", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-queue-py", "original_path": "python/data/queue", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-file-share-py", "original_path": "python/data/fileshare", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-formrecognizer-java", "original_path": "java/foundry/formrecognizer", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-vision-imageanalysis-java", "original_path": "java/foundry/vision-imageanalysis", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-voicelive-java", "original_path": "java/foundry/voicelive", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-contentsafety-java", "original_path": "java/foundry/contentsafety", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-projects-java", "original_path": "java/foundry/projects", "source": "microsoft/skills" },
    { "flat_name": "azure-ai-anomalydetector-java", "original_path": "java/foundry/anomalydetector", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-ingestion-java", "original_path": "java/monitoring/ingestion", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-query-java", "original_path": "java/monitoring/query", "source": "microsoft/skills" },
    { "flat_name": "azure-monitor-opentelemetry-exporter-java", "original_path": "java/monitoring/opentelemetry-exporter", "source": "microsoft/skills" },
    { "flat_name": "azure-appconfiguration-java", "original_path": "java/integration/appconfiguration", "source": "microsoft/skills" },
    { "flat_name": "azure-communication-common-java", "original_path": "java/communication/common", "source": "microsoft/skills" },
    { "flat_name": "azure-communication-callingserver-java", "original_path": "java/communication/callingserver", "source": "microsoft/skills" },
    { "flat_name": "azure-communication-sms-java", "original_path": "java/communication/sms", "source": "microsoft/skills" },
    { "flat_name": "azure-communication-callautomation-java", "original_path": "java/communication/callautomation", "source": "microsoft/skills" },
    { "flat_name": "azure-communication-chat-java", "original_path": "java/communication/chat", "source": "microsoft/skills" },
    { "flat_name": "azure-compute-batch-java", "original_path": "java/compute/batch", "source": "microsoft/skills" },
    { "flat_name": "azure-identity-java", "original_path": "java/entra/azure-identity", "source": "microsoft/skills" },
    { "flat_name": "azure-security-keyvault-keys-java", "original_path": "java/entra/keyvault-keys", "source": "microsoft/skills" },
    { "flat_name": "azure-security-keyvault-secrets-java", "original_path": "java/entra/keyvault-secrets", "source": "microsoft/skills" },
    { "flat_name": "azure-eventgrid-java", "original_path": "java/messaging/eventgrid", "source": "microsoft/skills" },
    { "flat_name": "azure-messaging-webpubsub-java", "original_path": "java/messaging/webpubsub", "source": "microsoft/skills" },
    { "flat_name": "azure-eventhub-java", "original_path": "java/messaging/eventhubs", "source": "microsoft/skills" },
    { "flat_name": "azure-data-tables-java", "original_path": "java/data/tables", "source": "microsoft/skills" },
    { "flat_name": "azure-cosmos-java", "original_path": "java/data/cosmos", "source": "microsoft/skills" },
    { "flat_name": "azure-storage-blob-java", "original_path": "java/data/blob", "source": "microsoft/skills" },
    { "flat_name": "wiki-page-writer", "original_path": "plugins/wiki-page-writer", "source": "microsoft/skills (plugin)" },
    { "flat_name": "wiki-vitepress", "original_path": "plugins/wiki-vitepress", "source": "microsoft/skills (plugin)" },
    { "flat_name": "wiki-researcher", "original_path": "plugins/wiki-researcher", "source": "microsoft/skills (plugin)" },
    { "flat_name": "wiki-qa", "original_path": "plugins/wiki-qa", "source": "microsoft/skills (plugin)" },
    { "flat_name": "wiki-onboarding", "original_path": "plugins/wiki-onboarding", "source": "microsoft/skills (plugin)" },
    { "flat_name": "wiki-architect", "original_path": "plugins/wiki-architect", "source": "microsoft/skills (plugin)" },
    { "flat_name": "wiki-changelog", "original_path": "plugins/wiki-changelog", "source": "microsoft/skills (plugin)" },
    { "flat_name": "fastapi-router-py", "original_path": ".github/skills/fastapi-router-py", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "azd-deployment", "original_path": ".github/skills/azd-deployment", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "copilot-sdk", "original_path": ".github/skills/copilot-sdk", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "azure-ai-agents-persistent-dotnet", "original_path": ".github/skills/azure-ai-agents-persistent-dotnet", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "hosted-agents-v2-py", "original_path": ".github/skills/hosted-agents-v2-py", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "pydantic-models-py", "original_path": ".github/skills/pydantic-models-py", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "skill-creator-ms", "original_path": ".github/skills/skill-creator", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "podcast-generation", "original_path": ".github/skills/podcast-generation", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "github-issue-creator", "original_path": ".github/skills/github-issue-creator", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "azure-ai-agents-persistent-java", "original_path": ".github/skills/azure-ai-agents-persistent-java", "source": "microsoft/skills (.github/skills)" },
    { "flat_name": "mcp-builder-ms", "original_path": ".github/skills/mcp-builder", "source": "microsoft/skills (.github/skills)" }
  ]
}
package-lock.json (generated)
@@ -1,12 +1,12 @@
 {
   "name": "antigravity-awesome-skills",
-  "version": "4.10.0",
+  "version": "5.2.0",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "antigravity-awesome-skills",
-      "version": "4.10.0",
+      "version": "5.2.0",
       "license": "MIT",
       "bin": {
         "antigravity-awesome-skills": "bin/install.js"
package.json
@@ -1,7 +1,7 @@
 {
   "name": "antigravity-awesome-skills",
-  "version": "4.11.0",
-  "description": "626+ agentic skills for Claude Code, Gemini CLI, Cursor, Antigravity & more. Installer CLI.",
+  "version": "5.3.0",
+  "description": "845+ agentic skills for Claude Code, Gemini CLI, Cursor, Antigravity & more. Installer CLI.",
   "license": "MIT",
   "scripts": {
     "validate": "python3 scripts/validate_skills.py",
@@ -11,7 +11,9 @@
     "chain": "npm run validate && npm run index && npm run readme",
     "catalog": "node scripts/build-catalog.js",
     "build": "npm run chain && npm run catalog",
-    "test": "node scripts/tests/validate_skills_headings.test.js && python3 scripts/tests/test_validate_skills_headings.py"
+    "test": "node scripts/tests/validate_skills_headings.test.js && python3 scripts/tests/test_validate_skills_headings.py && python3 scripts/tests/inspect_microsoft_repo.py && python3 scripts/tests/test_comprehensive_coverage.py",
+    "sync:microsoft": "python3 scripts/sync_microsoft_skills.py",
+    "sync:all-official": "npm run sync:microsoft && npm run chain"
   },
   "devDependencies": {
     "yaml": "^2.8.2"
CHANGELOG.md
@@ -1,36 +1,36 @@
-## [4.10.0] - 2026-02-06 - "Composio Automation + .NET Backend"
+## [5.0.0] - 2026-02-10 - "Antigravity Workflows Foundation"
 
-> This release significantly expands hands-on automation coverage while adding a production-oriented .NET backend skill.
+> First-class Workflows are now available to orchestrate multiple skills through guided execution playbooks.
 
-### Added
+### 🚀 New Skills
 
-- **79 new skills total**.
-- **78 Composio/Rube automation skills** (PR #64), covering practical workflows across:
-  - CRM/sales (`HubSpot`, `Salesforce`, `Pipedrive`, `Zoho CRM`, `Close`)
-  - Collaboration/project ops (`Notion`, `ClickUp`, `Asana`, `Jira`, `Confluence`, `Trello`, `Monday`)
-  - Messaging/support (`Slack`, `Discord`, `Teams`, `Intercom`, `Freshdesk`, `Zendesk`)
-  - Analytics/marketing (`Google Analytics`, `Mixpanel`, `PostHog`, `Segment`, `Mailchimp`, `Klaviyo`)
-  - Dev/infra operations (`GitHub`, `GitLab`, `CircleCI`, `Datadog`, `PagerDuty`, `Vercel`, `Render`)
-- **1 new `dotnet-backend` skill** (PR #65) with detailed ASP.NET Core 8+ guidance:
-  - Minimal API and controller patterns
-  - EF Core data-access patterns
-  - JWT authentication implementation
-  - Background service templates
-  - Explicit "When to Use" and "Limitations" sections
-- **Registry growth**: 634 -> 713 indexed skills.
+### 🧭 [antigravity-workflows](skills/antigravity-workflows/)
 
-### Changed
+**Orchestrates multi-step outcomes using curated workflow playbooks.**
+This new skill routes users from high-level goals to concrete execution steps across related skills and bundles.
 
-- Regenerated and synchronized discovery artifacts:
-  - `README.md`
-  - `skills_index.json`
-  - `CATALOG.md`
-  - `data/catalog.json`
-  - `data/bundles.json`
-  - `data/aliases.json`
-- Updated release metadata and published tag/release `v4.10.0`.
+- **Key Feature 1**: Workflow routing for SaaS MVP, Security Audit, AI Agent Systems, and Browser QA.
+- **Key Feature 2**: Explicit step-by-step outputs with prerequisites, recommended skills, and validation checkpoints.
 
-### Contributors
+> **Try it:** `Use @antigravity-workflows to run ship-saas-mvp for my project.`
 
-- [@sohamganatra](https://github.com/sohamganatra)
-- [@Nguyen-Van-Chan](https://github.com/Nguyen-Van-Chan)
+---
+
+## 📦 Improvements
+
+- **Workflow Registry**: Added `data/workflows.json` for machine-readable workflow metadata.
+- **Workflow Docs**: Added `docs/WORKFLOWS.md` to distinguish Bundles vs Workflows and provide practical execution playbooks.
+- **Trinity Sync**: Updated `README.md`, `docs/GETTING_STARTED.md`, and `docs/FAQ.md` for workflow onboarding.
+- **Go QA Path**: Added optional `@go-playwright` wiring in QA/E2E workflow steps.
+- **Registry Update**: Catalog regenerated; repository now tracks 714 skills.
+
+## 👥 Credits
+
+A huge shoutout to our community and maintainers:
+
+- **@Walapalam** for the Workflows concept request ([Issue #72](https://github.com/sickn33/antigravity-awesome-skills/issues/72))
+- **@sickn33** for workflow integration, release preparation, and maintenance updates
 
 ---
 
 _Upgrade now: `git pull origin main` to fetch the latest skills._
@@ -1,161 +1,454 @@
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const fs = require("fs");
|
||||
const path = require("path");
|
||||
const {
|
||||
listSkillIdsRecursive,
|
||||
readSkill,
|
||||
tokenize,
|
||||
unique,
|
||||
} = require('../lib/skill-utils');
|
||||
} = require("../lib/skill-utils");
|
||||
|
||||
const ROOT = path.resolve(__dirname, '..');
|
||||
const SKILLS_DIR = path.join(ROOT, 'skills');
|
||||
const ROOT = path.resolve(__dirname, "..");
|
||||
const SKILLS_DIR = path.join(ROOT, "skills");
|
||||
|
||||
const STOPWORDS = new Set([
|
||||
'a', 'an', 'and', 'are', 'as', 'at', 'be', 'but', 'by', 'for', 'from', 'has', 'have', 'in', 'into',
|
||||
'is', 'it', 'its', 'of', 'on', 'or', 'our', 'out', 'over', 'that', 'the', 'their', 'they', 'this',
|
||||
'to', 'use', 'when', 'with', 'you', 'your', 'will', 'can', 'if', 'not', 'only', 'also', 'more',
|
||||
'best', 'practice', 'practices', 'expert', 'specialist', 'focused', 'focus', 'master', 'modern',
|
||||
'advanced', 'comprehensive', 'production', 'production-ready', 'ready', 'build', 'create', 'deliver',
|
||||
'design', 'implement', 'implementation', 'strategy', 'strategies', 'patterns', 'pattern', 'workflow',
|
||||
'workflows', 'guide', 'template', 'templates', 'tool', 'tools', 'project', 'projects', 'support',
|
||||
'manage', 'management', 'system', 'systems', 'services', 'service', 'across', 'end', 'end-to-end',
|
||||
'using', 'based', 'ensure', 'ensure', 'help', 'needs', 'need', 'focuses', 'handles', 'builds', 'make',
|
||||
"a",
|
||||
"an",
|
||||
"and",
|
||||
"are",
|
||||
"as",
|
||||
"at",
|
||||
"be",
|
||||
"but",
|
||||
"by",
|
||||
"for",
|
||||
"from",
|
||||
"has",
|
||||
"have",
|
||||
"in",
|
||||
"into",
|
||||
"is",
|
||||
"it",
|
||||
"its",
|
||||
"of",
|
||||
"on",
|
||||
"or",
|
||||
"our",
|
||||
"out",
|
||||
"over",
|
||||
"that",
|
||||
"the",
|
||||
"their",
|
||||
"they",
|
||||
"this",
|
||||
"to",
|
||||
"use",
|
||||
"when",
|
||||
"with",
|
||||
"you",
|
||||
"your",
|
||||
"will",
|
||||
"can",
|
||||
"if",
|
||||
"not",
|
||||
"only",
|
||||
"also",
|
||||
"more",
|
||||
"best",
|
||||
"practice",
|
||||
"practices",
|
||||
"expert",
|
||||
"specialist",
|
||||
"focused",
|
||||
"focus",
|
||||
"master",
|
||||
"modern",
|
||||
"advanced",
|
||||
"comprehensive",
|
||||
"production",
|
||||
"production-ready",
|
||||
"ready",
|
||||
"build",
|
||||
"create",
|
||||
"deliver",
|
||||
"design",
|
||||
"implement",
|
||||
"implementation",
|
||||
"strategy",
|
||||
"strategies",
|
||||
"patterns",
|
||||
"pattern",
|
||||
"workflow",
|
||||
"workflows",
|
||||
"guide",
|
||||
"template",
|
||||
"templates",
|
||||
"tool",
|
||||
"tools",
|
||||
"project",
|
||||
"projects",
|
||||
"support",
|
||||
"manage",
|
||||
"management",
|
||||
"system",
|
||||
"systems",
|
||||
"services",
|
||||
"service",
|
||||
"across",
|
||||
"end",
|
||||
"end-to-end",
|
||||
"using",
|
||||
"based",
|
||||
"ensure",
|
||||
"ensure",
|
||||
"help",
|
||||
"needs",
|
||||
"need",
|
||||
"focuses",
|
||||
"handles",
|
||||
"builds",
|
||||
"make",
|
||||
]);

const TAG_STOPWORDS = new Set([
  "pro",
  "expert",
  "patterns",
  "pattern",
  "workflow",
  "workflows",
  "templates",
  "template",
  "toolkit",
  "tools",
  "tool",
  "project",
  "projects",
  "guide",
  "management",
  "engineer",
  "architect",
  "developer",
  "specialist",
  "assistant",
  "analysis",
  "review",
  "reviewer",
  "automation",
  "orchestration",
  "scaffold",
  "scaffolding",
  "implementation",
  "strategy",
  "context",
  "feature",
  "features",
  "smart",
  "system",
  "systems",
  "design",
  "development",
  "test",
  "testing",
]);

const CATEGORY_RULES = [
  {
    name: "security",
    keywords: [
      "security",
      "sast",
      "compliance",
      "privacy",
      "threat",
      "vulnerability",
      "owasp",
      "pci",
      "gdpr",
      "secrets",
      "risk",
      "malware",
      "forensics",
      "attack",
      "incident",
      "auth",
      "mtls",
      "zero",
      "trust",
    ],
  },
  {
    name: "infrastructure",
    keywords: [
      "kubernetes",
      "k8s",
      "helm",
      "terraform",
      "cloud",
      "network",
      "devops",
      "gitops",
      "prometheus",
      "grafana",
      "observability",
      "monitoring",
      "logging",
      "tracing",
      "deployment",
      "istio",
      "linkerd",
      "service",
      "mesh",
      "slo",
      "sre",
      "oncall",
      "incident",
      "pipeline",
      "cicd",
      "ci",
      "cd",
      "kafka",
    ],
  },
  {
    name: "data-ai",
    keywords: [
      "data",
      "database",
      "db",
      "sql",
      "postgres",
      "mysql",
      "analytics",
      "etl",
      "warehouse",
      "dbt",
      "ml",
      "ai",
      "llm",
      "rag",
      "vector",
      "embedding",
      "spark",
      "airflow",
      "cdc",
      "pipeline",
    ],
  },
  {
    name: "development",
    keywords: [
      "python",
      "javascript",
      "typescript",
      "java",
      "golang",
      "go",
      "rust",
      "csharp",
      "dotnet",
      "php",
      "ruby",
      "node",
      "react",
      "frontend",
      "backend",
      "mobile",
      "ios",
      "android",
      "flutter",
      "fastapi",
      "django",
      "nextjs",
      "vue",
      "api",
    ],
  },
  {
    name: "architecture",
    keywords: [
      "architecture",
      "c4",
      "microservices",
      "event",
      "cqrs",
      "saga",
      "domain",
      "ddd",
      "patterns",
      "decision",
      "adr",
    ],
  },
  {
    name: "testing",
    keywords: ["testing", "tdd", "unit", "e2e", "qa", "test"],
  },
  {
    name: "business",
    keywords: [
      "business",
      "market",
      "sales",
      "finance",
      "startup",
      "legal",
      "hr",
      "product",
      "customer",
      "seo",
      "marketing",
      "kpi",
      "contract",
      "employment",
    ],
  },
  {
    name: "workflow",
    keywords: [
      "workflow",
      "orchestration",
      "conductor",
      "automation",
      "process",
      "collaboration",
    ],
  },
];

const BUNDLE_RULES = {
  "core-dev": {
    description:
      "Core development skills across languages, frameworks, and backend/frontend fundamentals.",
    keywords: [
      "python",
      "javascript",
      "typescript",
      "go",
      "golang",
      "rust",
      "java",
      "node",
      "frontend",
      "backend",
      "react",
      "fastapi",
      "django",
      "nextjs",
      "api",
      "mobile",
      "ios",
      "android",
      "flutter",
      "php",
      "ruby",
    ],
  },
  "security-core": {
    description: "Security, privacy, and compliance essentials.",
    keywords: [
      "security",
      "sast",
      "compliance",
      "threat",
      "risk",
      "privacy",
      "secrets",
      "owasp",
      "gdpr",
      "pci",
      "vulnerability",
      "auth",
    ],
  },
  "k8s-core": {
    description: "Kubernetes and service mesh essentials.",
    keywords: [
      "kubernetes",
      "k8s",
      "helm",
      "istio",
      "linkerd",
      "service",
      "mesh",
    ],
  },
  "data-core": {
    description: "Data engineering and analytics foundations.",
    keywords: [
      "data",
      "database",
      "sql",
      "dbt",
      "airflow",
      "spark",
      "analytics",
      "etl",
      "warehouse",
      "postgres",
      "mysql",
      "kafka",
    ],
  },
  "ops-core": {
    description: "Operations, observability, and delivery pipelines.",
    keywords: [
      "observability",
      "monitoring",
      "logging",
      "tracing",
      "prometheus",
      "grafana",
      "devops",
      "gitops",
      "deployment",
      "cicd",
      "pipeline",
      "slo",
      "sre",
      "incident",
    ],
  },
};

const CURATED_COMMON = [
  "bash-pro",
  "python-pro",
  "javascript-pro",
  "typescript-pro",
  "golang-pro",
  "rust-pro",
  "java-pro",
  "frontend-developer",
  "backend-architect",
  "nodejs-backend-patterns",
  "fastapi-pro",
  "api-design-principles",
  "sql-pro",
  "database-architect",
  "kubernetes-architect",
  "terraform-specialist",
  "observability-engineer",
  "security-auditor",
  "sast-configuration",
  "gitops-workflow",
];

function normalizeTokens(tokens) {
  return unique(tokens.map((token) => token.toLowerCase())).filter(Boolean);
}

function deriveTags(skill) {
  let tags = Array.isArray(skill.tags) ? skill.tags : [];
  tags = tags.map((tag) => tag.toLowerCase()).filter(Boolean);

  if (!tags.length) {
    tags = skill.id
      .split("-")
      .map((tag) => tag.toLowerCase())
      .filter((tag) => tag && !TAG_STOPWORDS.has(tag));
  }

  return normalizeTokens(tags);
@@ -177,17 +470,18 @@ function detectCategory(skill, tags) {
    }
  }

  return "general";
}

function buildTriggers(skill, tags) {
  const tokens = tokenize(`${skill.name} ${skill.description}`).filter(
    (token) => token.length >= 2 && !STOPWORDS.has(token),
  );
  return unique([...tags, ...tokens]).slice(0, 12);
}

function buildAliases(skills) {
  const existingIds = new Set(skills.map((skill) => skill.id));
  const aliases = {};
  const used = new Set();

@@ -200,7 +494,7 @@ function buildAliases(skills) {
      }
    }

    const tokens = skill.id.split("-").filter(Boolean);
    if (skill.id.length < 28 || tokens.length < 4) continue;

    const deduped = [];
@@ -211,10 +505,11 @@ function buildAliases(skills) {
      deduped.push(token);
    }

    const aliasTokens =
      deduped.length > 3
        ? [deduped[0], deduped[1], deduped[deduped.length - 1]]
        : deduped;
    const alias = unique(aliasTokens).join("-");

    if (!alias || alias === skill.id) continue;
    if (existingIds.has(alias) || used.has(alias)) continue;
@@ -241,11 +536,11 @@ function buildBundles(skills) {

  for (const [bundleName, rule] of Object.entries(BUNDLE_RULES)) {
    const bundleSkills = [];
    const keywords = rule.keywords.map((keyword) => keyword.toLowerCase());

    for (const skill of skills) {
      const tokenSet = skillTokens.get(skill.id) || new Set();
      if (keywords.some((keyword) => tokenSet.has(keyword))) {
        bundleSkills.push(skill.id);
      }
    }
@@ -256,49 +551,58 @@ function buildBundles(skills) {
    };
  }

  const common = CURATED_COMMON.filter((skillId) => skillTokens.has(skillId));

  return { bundles, common };
}

function truncate(value, limit) {
  if (!value || value.length <= limit) return value || "";
  return `${value.slice(0, limit - 3)}...`;
}

function renderCatalogMarkdown(catalog) {
  const lines = [];
  lines.push("# Skill Catalog");
  lines.push("");
  lines.push(`Generated at: ${catalog.generatedAt}`);
  lines.push("");
  lines.push(`Total skills: ${catalog.total}`);
  lines.push("");

  const categories = Array.from(
    new Set(catalog.skills.map((skill) => skill.category)),
  ).sort();
  for (const category of categories) {
    const grouped = catalog.skills.filter(
      (skill) => skill.category === category,
    );
    lines.push(`## ${category} (${grouped.length})`);
    lines.push("");
    lines.push("| Skill | Description | Tags | Triggers |");
    lines.push("| --- | --- | --- | --- |");

    for (const skill of grouped) {
      const description = truncate(skill.description, 160).replace(
        /\|/g,
        "\\|",
      );
      const tags = skill.tags.join(", ");
      const triggers = skill.triggers.join(", ");
      lines.push(
        `| \`${skill.id}\` | ${description} | ${tags} | ${triggers} |`,
      );
    }

    lines.push("");
  }

  return lines.join("\n");
}

function buildCatalog() {
  const skillRelPaths = listSkillIdsRecursive(SKILLS_DIR);
  const skills = skillRelPaths.map((relPath) => readSkill(SKILLS_DIR, relPath));
  const catalogSkills = [];

  for (const skill of skills) {
@@ -318,24 +622,32 @@ function buildCatalog() {
  }

  const catalog = {
    generatedAt: process.env.SOURCE_DATE_EPOCH
      ? new Date(process.env.SOURCE_DATE_EPOCH * 1000).toISOString()
      : "2026-02-08T00:00:00.000Z",
    total: catalogSkills.length,
    skills: catalogSkills.sort((a, b) =>
      a.id < b.id ? -1 : a.id > b.id ? 1 : 0,
    ),
  };

  const aliases = buildAliases(catalog.skills);
  const bundleData = buildBundles(catalog.skills);

  const catalogPath = path.join(ROOT, "data", "catalog.json");
  const catalogMarkdownPath = path.join(ROOT, "CATALOG.md");
  const bundlesPath = path.join(ROOT, "data", "bundles.json");
  const aliasesPath = path.join(ROOT, "data", "aliases.json");

  fs.writeFileSync(catalogPath, JSON.stringify(catalog, null, 2));
  fs.writeFileSync(catalogMarkdownPath, renderCatalogMarkdown(catalog));
  fs.writeFileSync(
    bundlesPath,
    JSON.stringify(
      { generatedAt: catalog.generatedAt, ...bundleData },
      null,
      2,
    ),
  );
  fs.writeFileSync(
    aliasesPath,

@@ -6,14 +6,34 @@ import yaml

def parse_frontmatter(content):
    """
    Parses YAML frontmatter, sanitizing unquoted values containing @.
    Handles single values and comma-separated lists by quoting the entire line.
    """
    fm_match = re.search(r'^---\s*\n(.*?)\n---', content, re.DOTALL)
    if not fm_match:
        return {}

    yaml_text = fm_match.group(1)

    # Process line by line to handle values containing @ and commas
    sanitized_lines = []
    for line in yaml_text.splitlines():
        # Match "key: value" (handles keys with dashes like 'package-name')
        match = re.match(r'^(\s*[\w-]+):\s*(.*)$', line)
        if match:
            key, val = match.groups()
            val_s = val.strip()
            # If value contains @ and isn't already quoted, wrap the whole string in double quotes
            if '@' in val_s and not (val_s.startswith('"') or val_s.startswith("'")):
                # Escape any existing double quotes within the value string
                safe_val = val_s.replace('"', '\\"')
                line = f'{key}: "{safe_val}"'
        sanitized_lines.append(line)

    sanitized_yaml = '\n'.join(sanitized_lines)

    try:
        return yaml.safe_load(sanitized_yaml) or {}
    except yaml.YAMLError as e:
        print(f"⚠️ YAML parsing error: {e}")
        return {}
424  scripts/sync_microsoft_skills.py  Normal file
@@ -0,0 +1,424 @@
#!/usr/bin/env python3
"""
Sync Microsoft Skills Repository - v4 (Flat Structure)
Reads each SKILL.md frontmatter 'name' field and uses it as a flat directory
name under skills/ to comply with the repository's indexing conventions.
"""

import re
import shutil
import subprocess
import tempfile
import json
from pathlib import Path

MS_REPO = "https://github.com/microsoft/skills.git"
REPO_ROOT = Path(__file__).parent.parent
TARGET_DIR = REPO_ROOT / "skills"
DOCS_DIR = REPO_ROOT / "docs"
ATTRIBUTION_FILE = DOCS_DIR / "microsoft-skills-attribution.json"


def clone_repo(temp_dir: Path):
    """Clone Microsoft skills repository (shallow)."""
    print("🔄 Cloning Microsoft Skills repository...")
    subprocess.run(
        ["git", "clone", "--depth", "1", MS_REPO, str(temp_dir)],
        check=True,
    )


def cleanup_previous_sync():
    """Remove skill directories from a previous sync using the attribution manifest."""
    if not ATTRIBUTION_FILE.exists():
        print(" ℹ️ No previous attribution file found — skipping cleanup.")
        return 0

    try:
        with open(ATTRIBUTION_FILE) as f:
            attribution = json.load(f)
    except (json.JSONDecodeError, OSError) as e:
        print(f" ⚠️ Could not read attribution file: {e}")
        return 0

    previous_skills = attribution.get("skills", [])
    removed_count = 0

    for skill in previous_skills:
        flat_name = skill.get("flat_name", "")
        if not flat_name:
            continue

        skill_dir = TARGET_DIR / flat_name
        if skill_dir.exists() and skill_dir.is_dir():
            shutil.rmtree(skill_dir)
            removed_count += 1

    print(
        f" 🗑️ Removed {removed_count} previously synced skill directories.")
    return removed_count


def extract_skill_name(skill_md_path: Path) -> str | None:
    """Extract the 'name' field from SKILL.md YAML frontmatter."""
    try:
        content = skill_md_path.read_text(encoding="utf-8")
    except Exception:
        return None

    fm_match = re.search(r"^---\s*\n(.*?)\n---", content, re.DOTALL)
    if not fm_match:
        return None

    for line in fm_match.group(1).splitlines():
        match = re.match(r"^name:\s*(.+)$", line)
        if match:
            value = match.group(1).strip().strip("\"'")
            if value:
                return value
    return None


def generate_fallback_name(relative_path: Path) -> str:
    """
    Generate a fallback directory name when frontmatter 'name' is missing.
    Converts a path like 'dotnet/compute/botservice' to 'ms-dotnet-compute-botservice'.
    """
    parts = [p for p in relative_path.parts if p]
    return "ms-" + "-".join(parts)


def find_skills_in_directory(source_dir: Path):
    """
    Walk the Microsoft repo's skills/ directory (which uses symlinks)
    and resolve each to its actual SKILL.md content.
    Returns list of dicts: {relative_path, skill_md_path, source_dir}.
    """
    skills_source = source_dir / "skills"
    results = []

    if not skills_source.exists():
        return results

    for item in skills_source.rglob("*"):
        if not item.is_dir():
            continue

        skill_md = None
        actual_dir = None

        if item.is_symlink():
            try:
                resolved = item.resolve()
                if (resolved / "SKILL.md").exists():
                    skill_md = resolved / "SKILL.md"
                    actual_dir = resolved
            except Exception:
                continue
        elif (item / "SKILL.md").exists():
            skill_md = item / "SKILL.md"
            actual_dir = item

        if skill_md is None:
            continue

        try:
            relative_path = item.relative_to(skills_source)
        except ValueError:
            continue

        results.append({
            "relative_path": relative_path,
            "skill_md": skill_md,
            "source_dir": actual_dir,
        })

    return results


def find_plugin_skills(source_dir: Path, already_synced_names: set):
    """Find plugin skills in .github/plugins/ that haven't been synced yet."""
    results = []
    github_plugins = source_dir / ".github" / "plugins"

    if not github_plugins.exists():
        return results

    for skill_file in github_plugins.rglob("SKILL.md"):
        skill_dir = skill_file.parent
        skill_name = skill_dir.name

        if skill_name not in already_synced_names:
            results.append({
                "relative_path": Path("plugins") / skill_name,
                "skill_md": skill_file,
                "source_dir": skill_dir,
            })

    return results


def find_github_skills(source_dir: Path, already_synced_names: set):
    """Find skills in .github/skills/ not reachable via the skills/ symlink tree."""
    results = []
    github_skills = source_dir / ".github" / "skills"

    if not github_skills.exists():
        return results

    for skill_dir in github_skills.iterdir():
        if not skill_dir.is_dir() or not (skill_dir / "SKILL.md").exists():
            continue

        if skill_dir.name not in already_synced_names:
            results.append({
                "relative_path": Path(".github/skills") / skill_dir.name,
                "skill_md": skill_dir / "SKILL.md",
                "source_dir": skill_dir,
            })

    return results


def sync_skills_flat(source_dir: Path, target_dir: Path):
    """
    Sync all Microsoft skills into a flat structure under skills/.
    Uses frontmatter 'name' as directory name, with collision detection.
    Protects existing non-Microsoft skills from being overwritten.
    """
    # Load previous attribution to know which dirs are Microsoft-owned
    previously_synced_names = set()
    if ATTRIBUTION_FILE.exists():
        try:
            with open(ATTRIBUTION_FILE) as f:
                prev = json.load(f)
            previously_synced_names = {
                s["flat_name"] for s in prev.get("skills", []) if s.get("flat_name")
            }
        except (json.JSONDecodeError, OSError):
            pass

    all_skill_entries = find_skills_in_directory(source_dir)
    print(f" 📂 Found {len(all_skill_entries)} skills in skills/ directory")

    synced_count = 0
    skill_metadata = []
    # name -> original relative_path (for collision logging)
    used_names: dict[str, str] = {}

    for entry in all_skill_entries:
        skill_name = extract_skill_name(entry["skill_md"])

        if not skill_name:
            skill_name = generate_fallback_name(entry["relative_path"])
            print(
                f" ⚠️ No frontmatter name for {entry['relative_path']}, using fallback: {skill_name}")

        # Internal collision detection (two Microsoft skills with same name)
        if skill_name in used_names:
            original = used_names[skill_name]
            print(
                f" ⚠️ Name collision '{skill_name}': {entry['relative_path']} vs {original}")
            lang = entry["relative_path"].parts[0] if entry["relative_path"].parts else "unknown"
            skill_name = f"{skill_name}-{lang}"
            print(f" Resolved to: {skill_name}")

        # Protect existing non-Microsoft skills from being overwritten
        target_skill_dir = target_dir / skill_name
        if target_skill_dir.exists() and skill_name not in previously_synced_names:
            original_name = skill_name
            skill_name = f"{skill_name}-ms"
            print(
                f" ⚠️ '{original_name}' exists as a non-Microsoft skill, using: {skill_name}")

        used_names[skill_name] = str(entry["relative_path"])

        # Create flat target directory
        target_skill_dir = target_dir / skill_name
        target_skill_dir.mkdir(parents=True, exist_ok=True)

        # Copy SKILL.md
        shutil.copy2(entry["skill_md"], target_skill_dir / "SKILL.md")

        # Copy other files from the skill directory
        for file_item in entry["source_dir"].iterdir():
            if file_item.name != "SKILL.md" and file_item.is_file():
                shutil.copy2(file_item, target_skill_dir / file_item.name)

        skill_metadata.append({
            "flat_name": skill_name,
            "original_path": str(entry["relative_path"]),
            "source": "microsoft/skills",
        })

        synced_count += 1
        print(f" ✅ {entry['relative_path']} → skills/{skill_name}/")

    # Collect all source directory names already synced (for dedup)
    synced_names = set(used_names.keys())
    already_synced_dir_names = {
        e["source_dir"].name for e in all_skill_entries}

    # Sync plugin skills from .github/plugins/
    plugin_entries = find_plugin_skills(source_dir, already_synced_dir_names)

    if plugin_entries:
        print(f"\n 📦 Found {len(plugin_entries)} additional plugin skills")
        for entry in plugin_entries:
            skill_name = extract_skill_name(entry["skill_md"])
            if not skill_name:
                skill_name = entry["source_dir"].name

            if skill_name in synced_names:
                skill_name = f"{skill_name}-plugin"

            # Protect existing non-Microsoft skills
            target_skill_dir = target_dir / skill_name
            if target_skill_dir.exists() and skill_name not in previously_synced_names:
                original_name = skill_name
                skill_name = f"{skill_name}-ms"
                target_skill_dir = target_dir / skill_name
                print(
                    f" ⚠️ '{original_name}' exists as a non-Microsoft skill, using: {skill_name}")

            synced_names.add(skill_name)
            already_synced_dir_names.add(entry["source_dir"].name)

            target_skill_dir.mkdir(parents=True, exist_ok=True)

            shutil.copy2(entry["skill_md"], target_skill_dir / "SKILL.md")

            for file_item in entry["source_dir"].iterdir():
                if file_item.name != "SKILL.md" and file_item.is_file():
                    shutil.copy2(file_item, target_skill_dir / file_item.name)

            skill_metadata.append({
                "flat_name": skill_name,
                "original_path": str(entry["relative_path"]),
                "source": "microsoft/skills (plugin)",
            })

            synced_count += 1
            print(f" ✅ {entry['relative_path']} → skills/{skill_name}/")

    # Sync skills in .github/skills/ not reachable via the skills/ symlink tree
    github_skill_entries = find_github_skills(
        source_dir, already_synced_dir_names)

    if github_skill_entries:
        print(
            f"\n 📂 Found {len(github_skill_entries)} skills in .github/skills/ not linked from skills/")
        for entry in github_skill_entries:
            skill_name = extract_skill_name(entry["skill_md"])
            if not skill_name:
                skill_name = entry["source_dir"].name

            if skill_name in synced_names:
                skill_name = f"{skill_name}-github"

            # Protect existing non-Microsoft skills
            target_skill_dir = target_dir / skill_name
            if target_skill_dir.exists() and skill_name not in previously_synced_names:
                original_name = skill_name
                skill_name = f"{skill_name}-ms"
                target_skill_dir = target_dir / skill_name
                print(
                    f" ⚠️ '{original_name}' exists as a non-Microsoft skill, using: {skill_name}")

            synced_names.add(skill_name)

            target_skill_dir.mkdir(parents=True, exist_ok=True)

            shutil.copy2(entry["skill_md"], target_skill_dir / "SKILL.md")

            for file_item in entry["source_dir"].iterdir():
                if file_item.name != "SKILL.md" and file_item.is_file():
                    shutil.copy2(file_item, target_skill_dir / file_item.name)

            skill_metadata.append({
                "flat_name": skill_name,
                "original_path": str(entry["relative_path"]),
                "source": "microsoft/skills (.github/skills)",
            })

            synced_count += 1
            print(f" ✅ {entry['relative_path']} → skills/{skill_name}/")

    return synced_count, skill_metadata


def save_attribution(metadata: list):
    """Save attribution metadata to docs/."""
    DOCS_DIR.mkdir(parents=True, exist_ok=True)
    attribution = {
        "source": "microsoft/skills",
        "repository": "https://github.com/microsoft/skills",
        "license": "MIT",
        "synced_skills": len(metadata),
        "structure": "flat (frontmatter name as directory name)",
        "skills": metadata,
    }
    with open(DOCS_DIR / "microsoft-skills-attribution.json", "w") as f:
        json.dump(attribution, f, indent=2)


def copy_license(source_dir: Path):
    """Copy the Microsoft LICENSE to docs/."""
    DOCS_DIR.mkdir(parents=True, exist_ok=True)
    if (source_dir / "LICENSE").exists():
        shutil.copy2(source_dir / "LICENSE", DOCS_DIR / "LICENSE-MICROSOFT")


def main():
    """Main sync function."""
    print("🚀 Microsoft Skills Sync Script v4 (Flat Structure)")
    print("=" * 55)

    with tempfile.TemporaryDirectory() as temp_dir:
        temp_path = Path(temp_dir)

        try:
            clone_repo(temp_path)

            TARGET_DIR.mkdir(parents=True, exist_ok=True)

            print("\n🧹 Cleaning up previous sync...")
            cleanup_previous_sync()

            print("\n🔗 Resolving symlinks and flattening into skills/<name>/...")
            count, metadata = sync_skills_flat(temp_path, TARGET_DIR)

            print("\n📄 Saving attribution...")
            save_attribution(metadata)
            copy_license(temp_path)

            print(
                f"\n✨ Success! Synced {count} Microsoft skills (flat structure)")
            print(f"📁 Location: {TARGET_DIR}/")

            # Show summary of languages
            languages = set()
            for skill in metadata:
                parts = skill["original_path"].split("/")
                if len(parts) >= 1 and parts[0] != "plugins":
                    languages.add(parts[0])

            print("\n📊 Organization:")
            print(f" Total skills: {count}")
            print(f" Languages: {', '.join(sorted(languages))}")

            print("\n📋 Next steps:")
            print("1. Run: npm run build")
            print("2. Commit changes and create PR")

        except Exception as e:
            print(f"\n❌ Error: {e}")
            import traceback
            traceback.print_exc()
            return 1

    return 0


if __name__ == "__main__":
    exit(main())
98
scripts/tests/inspect_microsoft_repo.py
Normal file
@@ -0,0 +1,98 @@
#!/usr/bin/env python3
"""
Inspect Microsoft Skills Repository Structure

Shows the repository layout, skill locations, and what flat names would be generated.
"""

import re
import subprocess
import tempfile
from pathlib import Path

MS_REPO = "https://github.com/microsoft/skills.git"


def extract_skill_name(skill_md_path: Path) -> str | None:
    """Extract the 'name' field from SKILL.md YAML frontmatter."""
    try:
        content = skill_md_path.read_text(encoding="utf-8")
    except Exception:
        return None

    fm_match = re.search(r"^---\s*\n(.*?)\n---", content, re.DOTALL)
    if not fm_match:
        return None

    for line in fm_match.group(1).splitlines():
        match = re.match(r"^name:\s*(.+)$", line)
        if match:
            value = match.group(1).strip().strip("\"'")
            if value:
                return value
    return None


def inspect_repo():
    """Inspect the Microsoft skills repository structure."""
    print("🔍 Inspecting Microsoft Skills Repository Structure")
    print("=" * 60)

    with tempfile.TemporaryDirectory() as temp_dir:
        temp_path = Path(temp_dir)

        print("\n1️⃣ Cloning repository...")
        subprocess.run(
            ["git", "clone", "--depth", "1", MS_REPO, str(temp_path)],
            check=True,
            capture_output=True,
        )

        # Find all SKILL.md files
        all_skill_mds = list(temp_path.rglob("SKILL.md"))
        print(f"\n2️⃣ Total SKILL.md files found: {len(all_skill_mds)}")

        # Show flat name mapping
        print("\n3️⃣ Flat Name Mapping (frontmatter 'name' → directory name):")
        print("-" * 60)

        names_seen: dict[str, list[str]] = {}

        for skill_md in sorted(all_skill_mds, key=lambda p: str(p)):
            try:
                rel = skill_md.parent.relative_to(temp_path)
            except ValueError:
                rel = skill_md.parent

            name = extract_skill_name(skill_md)
            display_name = name if name else f"(no name → ms-{'-'.join(rel.parts[1:])})"
            print(f"  {rel} → {display_name}")

            effective_name = name if name else f"ms-{'-'.join(rel.parts[1:])}"
            if effective_name not in names_seen:
                names_seen[effective_name] = []
            names_seen[effective_name].append(str(rel))

        # Collision check
        collisions = {n: paths for n, paths in names_seen.items() if len(paths) > 1}
        if collisions:
            print(f"\n4️⃣ ⚠️ Name Collisions Detected ({len(collisions)}):")
            for name, paths in collisions.items():
                print(f"  '{name}':")
                for p in paths:
                    print(f"    - {p}")
        else:
            print(f"\n4️⃣ ✅ No name collisions — all {len(names_seen)} names are unique!")

    print("\n✨ Inspection complete!")


if __name__ == "__main__":
    try:
        inspect_repo()
    except Exception as e:
        print(f"\n❌ Error: {e}")
        import traceback
        traceback.print_exc()
189
scripts/tests/test_comprehensive_coverage.py
Normal file
@@ -0,0 +1,189 @@
#!/usr/bin/env python3
"""
Test Script: Verify Microsoft Skills Sync Coverage and Flat Name Uniqueness

Ensures all skills are captured and no directory name collisions exist.
"""

import re
import subprocess
import tempfile
from pathlib import Path
from collections import defaultdict

MS_REPO = "https://github.com/microsoft/skills.git"


def extract_skill_name(skill_md_path: Path) -> str | None:
    """Extract the 'name' field from SKILL.md YAML frontmatter."""
    try:
        content = skill_md_path.read_text(encoding="utf-8")
    except Exception:
        return None

    fm_match = re.search(r"^---\s*\n(.*?)\n---", content, re.DOTALL)
    if not fm_match:
        return None

    for line in fm_match.group(1).splitlines():
        match = re.match(r"^name:\s*(.+)$", line)
        if match:
            value = match.group(1).strip().strip("\"'")
            if value:
                return value
    return None


def analyze_skill_locations():
    """
    Comprehensive analysis of all skill locations in the Microsoft repo.

    Verifies flat name uniqueness and coverage.
    """
    print("🔬 Comprehensive Skill Coverage & Uniqueness Analysis")
    print("=" * 60)

    with tempfile.TemporaryDirectory() as temp_dir:
        temp_path = Path(temp_dir)

        print("\n1️⃣ Cloning repository...")
        subprocess.run(
            ["git", "clone", "--depth", "1", MS_REPO, str(temp_path)],
            check=True,
            capture_output=True,
        )

        # Find ALL SKILL.md files
        all_skill_files = list(temp_path.rglob("SKILL.md"))
        print(f"\n2️⃣ Total SKILL.md files found: {len(all_skill_files)}")

        # Categorize by location
        location_types = defaultdict(list)
        for skill_file in all_skill_files:
            path_str = str(skill_file)
            if ".github/skills" in path_str:
                location_types["github_skills"].append(skill_file)
            elif ".github/plugins" in path_str:
                location_types["github_plugins"].append(skill_file)
            elif "/skills/" in path_str:
                location_types["skills_dir"].append(skill_file)
            else:
                location_types["other"].append(skill_file)

        print("\n3️⃣ Skills by Location Type:")
        for loc_type, files in sorted(location_types.items()):
            print(f"  📍 {loc_type}: {len(files)} skills")

        # Flat name uniqueness check
        print("\n4️⃣ Flat Name Uniqueness Check:")
        print("-" * 60)

        name_map: dict[str, list[str]] = {}
        missing_names = []

        for skill_file in all_skill_files:
            try:
                rel = skill_file.parent.relative_to(temp_path)
            except ValueError:
                rel = skill_file.parent

            name = extract_skill_name(skill_file)
            if not name:
                missing_names.append(str(rel))
                # Generate fallback
                parts = [p for p in rel.parts if p not in (".github", "skills", "plugins")]
                name = "ms-" + "-".join(parts) if parts else str(rel)

            if name not in name_map:
                name_map[name] = []
            name_map[name].append(str(rel))

        # Report results
        collisions = {n: paths for n, paths in name_map.items() if len(paths) > 1}
        unique_names = {n: paths for n, paths in name_map.items() if len(paths) == 1}

        print(f"\n  ✅ Unique names: {len(unique_names)}")

        if missing_names:
            print(f"\n  ⚠️ Skills missing frontmatter 'name' ({len(missing_names)}):")
            for path in missing_names[:5]:
                print(f"    - {path}")
            if len(missing_names) > 5:
                print(f"    ... and {len(missing_names) - 5} more")

        if collisions:
            print(f"\n  ❌ Name collisions ({len(collisions)}):")
            for name, paths in collisions.items():
                print(f"    '{name}':")
                for p in paths:
                    print(f"      - {p}")
        else:
            print("\n  ✅ No collisions detected!")

        # Validate all names are valid directory names
        print("\n5️⃣ Directory Name Validation:")
        invalid_names = []
        for name in name_map:
            if not re.match(r"^[a-zA-Z0-9][a-zA-Z0-9._-]*$", name):
                invalid_names.append(name)

        if invalid_names:
            print(f"  ❌ Invalid directory names ({len(invalid_names)}):")
            for name in invalid_names[:5]:
                print(f"    - '{name}'")
        else:
            print(f"  ✅ All {len(name_map)} names are valid directory names!")

        # Summary
        print("\n6️⃣ Summary:")
        print("-" * 60)
        total = len(all_skill_files)

        print(f"  Total SKILL.md files: {total}")
        print(f"  Unique flat names: {len(unique_names)}")
        print(f"  Collisions: {len(collisions)}")
        print(f"  Missing names: {len(missing_names)}")

        is_pass = len(collisions) == 0 and len(invalid_names) == 0
        if is_pass:
            print("\n  ✅ ALL CHECKS PASSED")
        else:
            print("\n  ⚠️ SOME CHECKS NEED ATTENTION")

    print("\n✨ Analysis complete!")

    return {
        "total": total,
        "unique": len(unique_names),
        "collisions": len(collisions),
        "missing_names": len(missing_names),
        "invalid_names": len(invalid_names),
        "passed": is_pass,
    }


if __name__ == "__main__":
    try:
        results = analyze_skill_locations()

        print("\n" + "=" * 60)
        print("FINAL VERDICT")
        print("=" * 60)

        if results["passed"]:
            print("\n✅ V4 FLAT STRUCTURE IS VALID")
            print("   All names are unique and valid directory names!")
        else:
            print("\n⚠️ V4 FLAT STRUCTURE NEEDS FIXES")
            if results["collisions"] > 0:
                print(f"   {results['collisions']} name collisions to resolve")
            if results["invalid_names"] > 0:
                print(f"   {results['invalid_names']} invalid directory names")

    except Exception as e:
        print(f"\n❌ Error: {e}")
        import traceback
        traceback.print_exc()
333
skills/agent-framework-azure-ai-py/SKILL.md
Normal file
@@ -0,0 +1,333 @@
---
name: agent-framework-azure-ai-py
description: Build Azure AI Foundry agents using the Microsoft Agent Framework Python SDK (agent-framework-azure-ai). Use when creating persistent agents with AzureAIAgentsProvider, using hosted tools (code interpreter, file search, web search), integrating MCP servers, managing conversation threads, or implementing streaming responses. Covers function tools, structured outputs, and multi-tool agents.
package: agent-framework-azure-ai
---

# Agent Framework Azure Hosted Agents

Build persistent agents on Azure AI Foundry using the Microsoft Agent Framework Python SDK.

## Architecture

```
User Query → AzureAIAgentsProvider → Azure AI Agent Service (Persistent)
                      ↓
          Agent.run() / Agent.run_stream()
                      ↓
     Tools: Functions | Hosted (Code/Search/Web) | MCP
                      ↓
          AgentThread (conversation persistence)
```

## Installation

```bash
# Full framework (recommended)
pip install agent-framework --pre

# Or Azure-specific package only
pip install agent-framework-azure-ai --pre
```

## Environment Variables

```bash
export AZURE_AI_PROJECT_ENDPOINT="https://<project>.services.ai.azure.com/api/projects/<project-id>"
export AZURE_AI_MODEL_DEPLOYMENT_NAME="gpt-4o-mini"
export BING_CONNECTION_ID="your-bing-connection-id"  # For web search
```

## Authentication

```python
from azure.identity.aio import AzureCliCredential, DefaultAzureCredential

# Development
credential = AzureCliCredential()

# Production
credential = DefaultAzureCredential()
```

## Core Workflow

### Basic Agent

```python
import asyncio
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="MyAgent",
            instructions="You are a helpful assistant.",
        )

        result = await agent.run("Hello!")
        print(result.text)

asyncio.run(main())
```

### Agent with Function Tools

```python
from typing import Annotated
from pydantic import Field
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

def get_weather(
    location: Annotated[str, Field(description="City name to get weather for")],
) -> str:
    """Get the current weather for a location."""
    return f"Weather in {location}: 72°F, sunny"

def get_current_time() -> str:
    """Get the current UTC time."""
    from datetime import datetime, timezone
    return datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S UTC")

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="WeatherAgent",
            instructions="You help with weather and time queries.",
            tools=[get_weather, get_current_time],  # Pass functions directly
        )

        result = await agent.run("What's the weather in Seattle?")
        print(result.text)
```

### Agent with Hosted Tools

```python
from agent_framework import (
    HostedCodeInterpreterTool,
    HostedFileSearchTool,
    HostedWebSearchTool,
)
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="MultiToolAgent",
            instructions="You can execute code, search files, and search the web.",
            tools=[
                HostedCodeInterpreterTool(),
                HostedWebSearchTool(name="Bing"),
            ],
        )

        result = await agent.run("Calculate the factorial of 20 in Python")
        print(result.text)
```

### Streaming Responses

```python
async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="StreamingAgent",
            instructions="You are a helpful assistant.",
        )

        print("Agent: ", end="", flush=True)
        async for chunk in agent.run_stream("Tell me a short story"):
            if chunk.text:
                print(chunk.text, end="", flush=True)
        print()
```

### Conversation Threads

```python
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="ChatAgent",
            instructions="You are a helpful assistant.",
            tools=[get_weather],
        )

        # Create thread for conversation persistence
        thread = agent.get_new_thread()

        # First turn
        result1 = await agent.run("What's the weather in Seattle?", thread=thread)
        print(f"Agent: {result1.text}")

        # Second turn - context is maintained
        result2 = await agent.run("What about Portland?", thread=thread)
        print(f"Agent: {result2.text}")

        # Save thread ID for later resumption
        print(f"Conversation ID: {thread.conversation_id}")
```

### Structured Outputs

```python
from pydantic import BaseModel, ConfigDict
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential

class WeatherResponse(BaseModel):
    model_config = ConfigDict(extra="forbid")

    location: str
    temperature: float
    unit: str
    conditions: str

async def main():
    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="StructuredAgent",
            instructions="Provide weather information in structured format.",
            response_format=WeatherResponse,
        )

        result = await agent.run("Weather in Seattle?")
        weather = WeatherResponse.model_validate_json(result.text)
        print(f"{weather.location}: {weather.temperature}°{weather.unit}")
```

## Provider Methods

| Method | Description |
|--------|-------------|
| `create_agent()` | Create new agent on Azure AI service |
| `get_agent(agent_id)` | Retrieve existing agent by ID |
| `as_agent(sdk_agent)` | Wrap SDK Agent object (no HTTP call) |

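The table above implies a simple reuse pattern: wrap an in-memory SDK agent when you already hold one, look up by ID when you only have an identifier, and create a new agent only as a last resort. A minimal sketch of that decision (the `AZURE_AI_AGENT_ID` variable name is an assumption for illustration, not part of the SDK):

```python
import asyncio
import os

def pick_provider_method(have_sdk_agent: bool, have_agent_id: bool) -> str:
    """Choose the cheapest AzureAIAgentsProvider call for obtaining an Agent."""
    if have_sdk_agent:
        return "as_agent"      # wrap in memory, no HTTP call
    if have_agent_id:
        return "get_agent"     # one HTTP lookup
    return "create_agent"      # creates a new persistent agent

async def main():
    # Azure imports are deferred so the helper above stays dependency-free.
    from agent_framework.azure import AzureAIAgentsProvider
    from azure.identity.aio import AzureCliCredential

    async with (
        AzureCliCredential() as credential,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent_id = os.environ.get("AZURE_AI_AGENT_ID")  # illustrative variable name
        if agent_id:
            agent = await provider.get_agent(agent_id)  # reuse the existing agent
        else:
            agent = await provider.create_agent(
                name="ReusableAgent",
                instructions="You are a helpful assistant.",
            )
        result = await agent.run("Hello again!")
        print(result.text)

if __name__ == "__main__":
    asyncio.run(main())
```

Reusing an agent by ID avoids accumulating duplicate persistent agents in the Azure AI project across runs.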
## Hosted Tools Quick Reference

| Tool | Import | Purpose |
|------|--------|---------|
| `HostedCodeInterpreterTool` | `from agent_framework import HostedCodeInterpreterTool` | Execute Python code |
| `HostedFileSearchTool` | `from agent_framework import HostedFileSearchTool` | Search vector stores |
| `HostedWebSearchTool` | `from agent_framework import HostedWebSearchTool` | Bing web search |
| `HostedMCPTool` | `from agent_framework import HostedMCPTool` | Service-managed MCP |
| `MCPStreamableHTTPTool` | `from agent_framework import MCPStreamableHTTPTool` | Client-managed MCP |

## Complete Example

```python
import asyncio
from typing import Annotated
from pydantic import BaseModel, Field
from agent_framework import (
    HostedCodeInterpreterTool,
    HostedWebSearchTool,
    MCPStreamableHTTPTool,
)
from agent_framework.azure import AzureAIAgentsProvider
from azure.identity.aio import AzureCliCredential


def get_weather(
    location: Annotated[str, Field(description="City name")],
) -> str:
    """Get weather for a location."""
    return f"Weather in {location}: 72°F, sunny"


class AnalysisResult(BaseModel):
    summary: str
    key_findings: list[str]
    confidence: float


async def main():
    async with (
        AzureCliCredential() as credential,
        MCPStreamableHTTPTool(
            name="Docs MCP",
            url="https://learn.microsoft.com/api/mcp",
        ) as mcp_tool,
        AzureAIAgentsProvider(credential=credential) as provider,
    ):
        agent = await provider.create_agent(
            name="ResearchAssistant",
            instructions="You are a research assistant with multiple capabilities.",
            tools=[
                get_weather,
                HostedCodeInterpreterTool(),
                HostedWebSearchTool(name="Bing"),
                mcp_tool,
            ],
        )

        thread = agent.get_new_thread()

        # Non-streaming
        result = await agent.run(
            "Search for Python best practices and summarize",
            thread=thread,
        )
        print(f"Response: {result.text}")

        # Streaming
        print("\nStreaming: ", end="")
        async for chunk in agent.run_stream("Continue with examples", thread=thread):
            if chunk.text:
                print(chunk.text, end="", flush=True)
        print()

        # Structured output
        result = await agent.run(
            "Analyze findings",
            thread=thread,
            response_format=AnalysisResult,
        )
        analysis = AnalysisResult.model_validate_json(result.text)
        print(f"\nConfidence: {analysis.confidence}")


if __name__ == "__main__":
    asyncio.run(main())
```

## Conventions

- Always use async context managers: `async with provider:`
- Pass functions directly to the `tools=` parameter (auto-converted to AIFunction)
- Use `Annotated[type, Field(description=...)]` for function parameters
- Use `get_new_thread()` for multi-turn conversations
- Prefer `HostedMCPTool` for service-managed MCP, `MCPStreamableHTTPTool` for client-managed

## Reference Files

- [references/tools.md](references/tools.md): Detailed hosted tool patterns
- [references/mcp.md](references/mcp.md): MCP integration (hosted + local)
- [references/threads.md](references/threads.md): Thread and conversation management
- [references/advanced.md](references/advanced.md): OpenAPI, citations, structured outputs
325
skills/agents-v2-py/SKILL.md
Normal file
@@ -0,0 +1,325 @@
---
name: agents-v2-py
description: |
  Build container-based Foundry Agents using Azure AI Projects SDK with ImageBasedHostedAgentDefinition.
  Use when creating hosted agents that run custom code in Azure AI Foundry with your own container images.
  Triggers: "ImageBasedHostedAgentDefinition", "hosted agent", "container agent", "Foundry Agent",
  "create_version", "ProtocolVersionRecord", "AgentProtocol.RESPONSES", "custom agent image".
package: azure-ai-projects
---

# Azure AI Hosted Agents (Python)

Build container-based hosted agents using `ImageBasedHostedAgentDefinition` from the Azure AI Projects SDK.

## Installation

```bash
pip install "azure-ai-projects>=2.0.0b3" azure-identity
```

**Minimum SDK Version:** `2.0.0b3` or later is required for hosted agent support.

## Environment Variables

```bash
AZURE_AI_PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
```

## Prerequisites

Before creating hosted agents:

1. **Container Image** - Build and push to Azure Container Registry (ACR)
2. **ACR Pull Permissions** - Grant your project's managed identity the `AcrPull` role on the ACR
3. **Capability Host** - Account-level capability host with `enablePublicHostingEnvironment=true`
4. **SDK Version** - Ensure `azure-ai-projects>=2.0.0b3`

## Authentication

Always use `DefaultAzureCredential`:

```python
import os
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

credential = DefaultAzureCredential()
client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=credential
)
```

## Core Workflow

### 1. Imports

```python
import os
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import (
    ImageBasedHostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)
```

### 2. Create Hosted Agent

```python
client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential()
)

agent = client.agents.create_version(
    agent_name="my-hosted-agent",
    definition=ImageBasedHostedAgentDefinition(
        container_protocol_versions=[
            ProtocolVersionRecord(protocol=AgentProtocol.RESPONSES, version="v1")
        ],
        cpu="1",
        memory="2Gi",
        image="myregistry.azurecr.io/my-agent:latest",
        tools=[{"type": "code_interpreter"}],
        environment_variables={
            "AZURE_AI_PROJECT_ENDPOINT": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
            "MODEL_NAME": "gpt-4o-mini"
        }
    )
)

print(f"Created agent: {agent.name} (version: {agent.version})")
```

### 3. List Agent Versions

```python
versions = client.agents.list_versions(agent_name="my-hosted-agent")
for version in versions:
    print(f"Version: {version.version}, State: {version.state}")
```

### 4. Delete Agent Version

```python
client.agents.delete_version(
    agent_name="my-hosted-agent",
    version=agent.version
)
```

## ImageBasedHostedAgentDefinition Parameters

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `container_protocol_versions` | `list[ProtocolVersionRecord]` | Yes | Protocol versions the agent supports |
| `image` | `str` | Yes | Full container image path (registry/image:tag) |
| `cpu` | `str` | No | CPU allocation (e.g., "1", "2") |
| `memory` | `str` | No | Memory allocation (e.g., "2Gi", "4Gi") |
| `tools` | `list[dict]` | No | Tools available to the agent |
| `environment_variables` | `dict[str, str]` | No | Environment variables for the container |

## Protocol Versions

The `container_protocol_versions` parameter specifies which protocols your agent supports:

```python
from azure.ai.projects.models import ProtocolVersionRecord, AgentProtocol

# RESPONSES protocol - standard agent responses
container_protocol_versions=[
    ProtocolVersionRecord(protocol=AgentProtocol.RESPONSES, version="v1")
]
```

**Available Protocols:**

| Protocol | Description |
|----------|-------------|
| `AgentProtocol.RESPONSES` | Standard response protocol for agent interactions |

## Resource Allocation

Specify CPU and memory for your container:

```python
definition=ImageBasedHostedAgentDefinition(
    container_protocol_versions=[...],
    image="myregistry.azurecr.io/my-agent:latest",
    cpu="2",       # 2 CPU cores
    memory="4Gi"   # 4 GiB memory
)
```

**Resource Limits:**

| Resource | Min | Max | Default |
|----------|-----|-----|---------|
| CPU | 0.5 | 4 | 1 |
| Memory | 1Gi | 8Gi | 2Gi |

## Tools Configuration

Add tools to your hosted agent:

### Code Interpreter

```python
tools=[{"type": "code_interpreter"}]
```

### MCP Tools

```python
tools=[
    {"type": "code_interpreter"},
    {
        "type": "mcp",
        "server_label": "my-mcp-server",
        "server_url": "https://my-mcp-server.example.com"
    }
]
```

### Multiple Tools

```python
tools=[
    {"type": "code_interpreter"},
    {"type": "file_search"},
    {
        "type": "mcp",
        "server_label": "custom-tool",
        "server_url": "https://custom-tool.example.com"
    }
]
```

## Environment Variables

Pass configuration to your container:

```python
environment_variables={
    "AZURE_AI_PROJECT_ENDPOINT": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    "MODEL_NAME": "gpt-4o-mini",
    "LOG_LEVEL": "INFO",
    "CUSTOM_CONFIG": "value"
}
```

**Best Practice:** Never hardcode secrets. Use environment variables or Azure Key Vault.
## Complete Example

```python
import os
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import (
    ImageBasedHostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)


def create_hosted_agent():
    """Create a hosted agent with a custom container image."""

    client = AIProjectClient(
        endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
        credential=DefaultAzureCredential()
    )

    agent = client.agents.create_version(
        agent_name="data-processor-agent",
        definition=ImageBasedHostedAgentDefinition(
            container_protocol_versions=[
                ProtocolVersionRecord(
                    protocol=AgentProtocol.RESPONSES,
                    version="v1"
                )
            ],
            image="myregistry.azurecr.io/data-processor:v1.0",
            cpu="2",
            memory="4Gi",
            tools=[
                {"type": "code_interpreter"},
                {"type": "file_search"}
            ],
            environment_variables={
                "AZURE_AI_PROJECT_ENDPOINT": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
                "MODEL_NAME": "gpt-4o-mini",
                "MAX_RETRIES": "3"
            }
        )
    )

    print(f"Created hosted agent: {agent.name}")
    print(f"Version: {agent.version}")
    print(f"State: {agent.state}")

    return agent


if __name__ == "__main__":
    create_hosted_agent()
```

## Async Pattern

```python
import os
from azure.identity.aio import DefaultAzureCredential
from azure.ai.projects.aio import AIProjectClient
from azure.ai.projects.models import (
    ImageBasedHostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)


async def create_hosted_agent_async():
    """Create a hosted agent asynchronously."""

    async with DefaultAzureCredential() as credential:
        async with AIProjectClient(
            endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
            credential=credential
        ) as client:
            agent = await client.agents.create_version(
                agent_name="async-agent",
                definition=ImageBasedHostedAgentDefinition(
                    container_protocol_versions=[
                        ProtocolVersionRecord(
                            protocol=AgentProtocol.RESPONSES,
                            version="v1"
                        )
                    ],
                    image="myregistry.azurecr.io/async-agent:latest",
                    cpu="1",
                    memory="2Gi"
                )
            )
            return agent
```

## Common Errors

| Error | Cause | Solution |
|-------|-------|----------|
| `ImagePullBackOff` | ACR pull permission denied | Grant `AcrPull` role to the project's managed identity |
| `InvalidContainerImage` | Image not found | Verify image path and tag exist in ACR |
| `CapabilityHostNotFound` | No capability host configured | Create account-level capability host |
| `ProtocolVersionNotSupported` | Invalid protocol version | Use `AgentProtocol.RESPONSES` with version `"v1"` |

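The errors in the table split into two groups: configuration mistakes that must be fixed before redeploying, and conditions like `ImagePullBackOff` that can clear on their own once the `AcrPull` role assignment has propagated. A small triage helper sketching that distinction (the grouping mirrors the table above; confirm exact error codes against the SDK version you use):

```python
def is_retryable(error_code: str) -> bool:
    """Rough triage for hosted-agent deployment errors.

    Configuration errors need a manual fix and a redeploy; ImagePullBackOff
    may resolve once the AcrPull role assignment has propagated.
    """
    needs_manual_fix = {
        "InvalidContainerImage",        # fix the image path/tag in ACR
        "CapabilityHostNotFound",       # create the account-level capability host
        "ProtocolVersionNotSupported",  # use AgentProtocol.RESPONSES, version "v1"
    }
    return error_code not in needs_manual_fix
```

A deployment script can use this to decide between retrying with backoff and failing fast with a pointer to the table's "Solution" column.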
## Best Practices
|
||||
|
||||
1. **Version Your Images** - Use specific tags, not `latest` in production
|
||||
2. **Minimal Resources** - Start with minimum CPU/memory, scale up as needed
|
||||
3. **Environment Variables** - Use for all configuration, never hardcode
|
||||
4. **Error Handling** - Wrap agent creation in try/except blocks
|
||||
5. **Cleanup** - Delete unused agent versions to free resources
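Practice 4 above can be made concrete with a small retry wrapper. This is an illustrative sketch, not part of the SDK: the helper name, parameters, and the flaky stand-in operation are all hypothetical; transient failures are retried a bounded number of times and the last error is re-raised.

```python
import time

def create_with_retries(create_fn, attempts=3, delay_s=0.0, retriable=(ConnectionError,)):
    """Call create_fn, retrying transient failures; re-raise on the final attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return create_fn()
        except retriable:
            if attempt == attempts:
                raise
            time.sleep(delay_s)

# Hypothetical usage with a flaky operation standing in for agent creation:
calls = {"n": 0}

def flaky_create():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "agent-version-1"

print(create_with_retries(flaky_create))  # prints "agent-version-1"
```

In real code, `create_fn` would wrap the `client.agents.create_version(...)` call shown above, with the SDK's own exception types in `retriable`.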
## Reference Links

- [Azure AI Projects SDK](https://pypi.org/project/azure-ai-projects/)
- [Hosted Agents Documentation](https://learn.microsoft.com/azure/ai-services/agents/how-to/hosted-agents)
- [Azure Container Registry](https://learn.microsoft.com/azure/container-registry/)
skills/antigravity-workflows/SKILL.md (new file, +80 lines)
---
name: antigravity-workflows
description: "Orchestrate multiple Antigravity skills through guided workflows for SaaS MVP delivery, security audits, AI agent builds, and browser QA."
source: self
risk: none
---

# Antigravity Workflows

Use this skill to turn a complex objective into a guided sequence of skill invocations.

## When to Use This Skill

Use this skill when:

- The user wants to combine several skills without manually selecting each one.
- The goal is multi-phase (for example: plan, build, test, ship).
- The user asks for best-practice execution of common scenarios such as:
  - Shipping a SaaS MVP
  - Running a web security audit
  - Building an AI agent system
  - Implementing browser automation and E2E QA

## Workflow Source of Truth

Read workflows in this order:

1. `docs/WORKFLOWS.md` for human-readable playbooks.
2. `data/workflows.json` for machine-readable workflow metadata.

## How to Run This Skill

1. Identify the user's concrete outcome.
2. Propose the one or two best-matching workflows.
3. Ask the user to choose one.
4. Execute step by step:
   - Announce the current step and its expected artifact.
   - Invoke the recommended skills for that step.
   - Verify completion criteria before moving to the next step.
5. At the end, provide:
   - Completed artifacts
   - Validation evidence
   - Remaining risks and next actions

## Default Workflow Routing

- Product delivery request -> `ship-saas-mvp`
- Security review request -> `security-audit-web-app`
- Agent/LLM product request -> `build-ai-agent-system`
- E2E/browser testing request -> `qa-browser-automation`
## Copy-Paste Prompts

```text
Use @antigravity-workflows to run the "Ship a SaaS MVP" workflow for my project idea.
```

```text
Use @antigravity-workflows and execute a full "Security Audit for a Web App" workflow.
```

```text
Use @antigravity-workflows to guide me through "Build an AI Agent System" with checkpoints.
```

```text
Use @antigravity-workflows to execute the "QA and Browser Automation" workflow and stabilize flaky tests.
```

## Limitations

- This skill orchestrates; it does not replace specialized skills.
- It depends on the local availability of referenced skills.
- It does not guarantee success without environment access, credentials, or required infrastructure.
- For stack-specific browser automation in Go, `go-playwright` may require the corresponding skill to be present in your local skills repository.

## Related Skills

- `concise-planning`
- `brainstorming`
- `workflow-automation`
- `verification-before-completion`
@@ -0,0 +1,36 @@
# Antigravity Workflows Implementation Playbook

This document explains how an agent should execute workflow-based orchestration.

## Execution Contract

For every workflow:

1. Confirm objective and scope.
2. Select the best-matching workflow.
3. Execute workflow steps in order.
4. Produce one concrete artifact per step.
5. Validate before continuing.

## Step Artifact Examples

- Plan step -> scope document or milestone checklist.
- Build step -> code changes and implementation notes.
- Test step -> test results and failure triage.
- Release step -> rollout checklist and risk log.

## Safety Guardrails

- Never run destructive actions without explicit user approval.
- If a required skill is missing, state the gap and fall back to the closest available skill.
- When security testing is involved, ensure authorization is explicit.

## Suggested Completion Format

At workflow completion, return:

1. Completed steps
2. Artifacts produced
3. Validation evidence
4. Open risks
5. Suggested next action
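One way to hold the five-part completion format is a small data holder like the following sketch. The class and field names are hypothetical, not part of the playbook; it simply renders the five sections in order:

```python
from dataclasses import dataclass, field

@dataclass
class WorkflowReport:
    """Illustrative container for the five-part completion format above."""
    completed_steps: list = field(default_factory=list)
    artifacts: list = field(default_factory=list)
    validation_evidence: list = field(default_factory=list)
    open_risks: list = field(default_factory=list)
    next_action: str = ""

    def render(self) -> str:
        sections = [
            ("Completed steps", self.completed_steps),
            ("Artifacts produced", self.artifacts),
            ("Validation evidence", self.validation_evidence),
            ("Open risks", self.open_risks),
        ]
        lines = []
        for title, items in sections:
            lines.append(f"## {title}")
            lines.extend(f"- {item}" for item in items)
        lines.append(f"## Suggested next action\n- {self.next_action}")
        return "\n".join(lines)

report = WorkflowReport(
    completed_steps=["plan", "build"],
    artifacts=["scope.md"],
    validation_evidence=["unit tests green"],
    open_risks=["no load test yet"],
    next_action="run release checklist",
)
print(report.render())
```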
skills/azd-deployment/SKILL.md (new file, +296 lines)
---
name: azd-deployment
description: Deploy containerized applications to Azure Container Apps using Azure Developer CLI (azd). Use when setting up azd projects, writing azure.yaml configuration, creating Bicep infrastructure for Container Apps, configuring remote builds with ACR, implementing idempotent deployments, managing environment variables across local/.azure/Bicep, or troubleshooting azd up failures. Triggers on requests for azd configuration, Container Apps deployment, multi-service deployments, and infrastructure-as-code with Bicep.
---

# Azure Developer CLI (azd) Container Apps Deployment

Deploy containerized frontend + backend applications to Azure Container Apps with remote builds, managed identity, and idempotent infrastructure.

## Quick Start

```bash
# Initialize and deploy
azd auth login
azd init                 # Creates azure.yaml and .azure/ folder
azd env new <env-name>   # Create environment (dev, staging, prod)
azd up                   # Provision infra + build + deploy
```

## Core File Structure

```
project/
├── azure.yaml                 # azd service definitions + hooks
├── infra/
│   ├── main.bicep             # Root infrastructure module
│   ├── main.parameters.json   # Parameter injection from env vars
│   └── modules/
│       ├── container-apps-environment.bicep
│       └── container-app.bicep
├── .azure/
│   ├── config.json            # Default environment pointer
│   └── <env-name>/
│       ├── .env               # Environment-specific values (azd-managed)
│       └── config.json        # Environment metadata
└── src/
    ├── frontend/Dockerfile
    └── backend/Dockerfile
```

## azure.yaml Configuration

### Minimal Configuration

```yaml
name: azd-deployment
services:
  backend:
    project: ./src/backend
    language: python
    host: containerapp
    docker:
      path: ./Dockerfile
      remoteBuild: true
```

### Full Configuration with Hooks

```yaml
name: azd-deployment
metadata:
  template: my-project@1.0.0

infra:
  provider: bicep
  path: ./infra

azure:
  location: eastus2

services:
  frontend:
    project: ./src/frontend
    language: ts
    host: containerapp
    docker:
      path: ./Dockerfile
      context: .
      remoteBuild: true

  backend:
    project: ./src/backend
    language: python
    host: containerapp
    docker:
      path: ./Dockerfile
      context: .
      remoteBuild: true

hooks:
  preprovision:
    shell: sh
    run: |
      echo "Before provisioning..."

  postprovision:
    shell: sh
    run: |
      echo "After provisioning - set up RBAC, etc."

  postdeploy:
    shell: sh
    run: |
      echo "Frontend: ${SERVICE_FRONTEND_URI}"
      echo "Backend: ${SERVICE_BACKEND_URI}"
```

### Key azure.yaml Options

| Option | Description |
|--------|-------------|
| `remoteBuild: true` | Build images in Azure Container Registry (recommended) |
| `context: .` | Docker build context relative to project path |
| `host: containerapp` | Deploy to Azure Container Apps |
| `infra.provider: bicep` | Use Bicep for infrastructure |

## Environment Variables Flow

### Three-Level Configuration

1. **Local `.env`** - For local development only
2. **`.azure/<env>/.env`** - azd-managed, auto-populated from Bicep outputs
3. **`main.parameters.json`** - Maps env vars to Bicep parameters

### Parameter Injection Pattern

```json
// infra/main.parameters.json
{
  "parameters": {
    "environmentName": { "value": "${AZURE_ENV_NAME}" },
    "location": { "value": "${AZURE_LOCATION=eastus2}" },
    "azureOpenAiEndpoint": { "value": "${AZURE_OPENAI_ENDPOINT}" }
  }
}
```

Syntax: `${VAR_NAME}` or `${VAR_NAME=default_value}`
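The substitution rule can be illustrated with a short re-implementation. This is a sketch of the observable behavior only, not azd's actual code: use the environment value when set, otherwise the inline default, otherwise an empty string.

```python
import re

def substitute(template: str, env: dict) -> str:
    """Resolve ${VAR} and ${VAR=default} placeholders as described above."""
    def repl(match):
        name, _, default = match.group(1).partition("=")
        return env.get(name, default)
    return re.sub(r"\$\{([^}]+)\}", repl, template)

print(substitute("${AZURE_LOCATION=eastus2}", {}))                            # eastus2
print(substitute("${AZURE_LOCATION=eastus2}", {"AZURE_LOCATION": "westus"}))  # westus
```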
### Setting Environment Variables

```bash
# Set for current environment
azd env set AZURE_OPENAI_ENDPOINT "https://my-openai.openai.azure.com"
azd env set AZURE_SEARCH_ENDPOINT "https://my-search.search.windows.net"

# Set during init
azd env new prod
azd env set AZURE_OPENAI_ENDPOINT "..."
```

### Bicep Output → Environment Variable

```bicep
// In main.bicep - outputs auto-populate .azure/<env>/.env
output SERVICE_FRONTEND_URI string = frontend.outputs.uri
output SERVICE_BACKEND_URI string = backend.outputs.uri
output BACKEND_PRINCIPAL_ID string = backend.outputs.principalId
```

## Idempotent Deployments

### Why azd up is Idempotent

1. **Bicep is declarative** - Resources reconcile to desired state
2. **Remote builds tag uniquely** - Image tags include deployment timestamp
3. **ACR reuses layers** - Only changed layers upload
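Point 2 can be illustrated with a sketch of timestamp-based tagging. The tag format below is hypothetical — azd's actual scheme may differ — but it shows why every deployment produces a distinct, sortable image reference:

```python
from datetime import datetime, timezone

def deployment_tag(service: str, now=None) -> str:
    """Build a unique, sortable image tag from a UTC timestamp
    (illustrative format, not azd's exact scheme)."""
    now = now or datetime.now(timezone.utc)
    return f"{service}:azd-{now.strftime('%Y%m%d%H%M%S')}"

fixed = datetime(2024, 5, 1, 12, 30, 0, tzinfo=timezone.utc)
print(deployment_tag("backend", fixed))  # backend:azd-20240501123000
```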
### Preserving Manual Changes

Custom domains added via Portal can be lost on redeploy. Preserve them with hooks:

```yaml
hooks:
  preprovision:
    shell: sh
    run: |
      # Save custom domains before provision
      if az containerapp show --name "$FRONTEND_NAME" -g "$RG" &>/dev/null; then
        az containerapp show --name "$FRONTEND_NAME" -g "$RG" \
          --query "properties.configuration.ingress.customDomains" \
          -o json > /tmp/domains.json
      fi

  postprovision:
    shell: sh
    run: |
      # Verify/restore custom domains
      if [ -f /tmp/domains.json ]; then
        echo "Saved domains: $(cat /tmp/domains.json)"
      fi
```

### Handling Existing Resources

```bicep
// Reference existing ACR (don't recreate)
resource containerRegistry 'Microsoft.ContainerRegistry/registries@2023-07-01' existing = {
  name: containerRegistryName
}

// Set customDomains to null to preserve Portal-added domains
customDomains: empty(customDomainsParam) ? null : customDomainsParam
```

## Container App Service Discovery

Internal HTTP routing between Container Apps in the same environment:

```bicep
// Backend reference in frontend env vars
env: [
  {
    name: 'BACKEND_URL'
    value: 'http://ca-backend-${resourceToken}' // Internal DNS
  }
]
```

Frontend nginx proxies to the internal URL:

```nginx
location /api {
    proxy_pass $BACKEND_URL;
}
```

## Managed Identity & RBAC

### Enable System-Assigned Identity

```bicep
resource containerApp 'Microsoft.App/containerApps@2024-03-01' = {
  identity: {
    type: 'SystemAssigned'
  }
}

output principalId string = containerApp.identity.principalId
```

### Post-Provision RBAC Assignment

```yaml
hooks:
  postprovision:
    shell: sh
    run: |
      PRINCIPAL_ID="${BACKEND_PRINCIPAL_ID}"

      # Azure OpenAI access
      az role assignment create \
        --assignee-object-id "$PRINCIPAL_ID" \
        --assignee-principal-type ServicePrincipal \
        --role "Cognitive Services OpenAI User" \
        --scope "$OPENAI_RESOURCE_ID" 2>/dev/null || true

      # Azure AI Search access
      az role assignment create \
        --assignee-object-id "$PRINCIPAL_ID" \
        --role "Search Index Data Reader" \
        --scope "$SEARCH_RESOURCE_ID" 2>/dev/null || true
```

## Common Commands

```bash
# Environment management
azd env list              # List environments
azd env select <name>     # Switch environment
azd env get-values        # Show all env vars
azd env set KEY value     # Set variable

# Deployment
azd up                        # Full provision + deploy
azd provision                 # Infrastructure only
azd deploy                    # Code deployment only
azd deploy --service backend  # Deploy single service

# Debugging
azd show                                             # Show project status
az containerapp logs show -n <app> -g <rg> --follow  # Stream logs
```

## Reference Files

- **Bicep patterns**: See [references/bicep-patterns.md](references/bicep-patterns.md) for Container Apps modules
- **Troubleshooting**: See [references/troubleshooting.md](references/troubleshooting.md) for common issues
- **azure.yaml schema**: See [references/azure-yaml-schema.md](references/azure-yaml-schema.md) for full options

## Critical Reminders

1. **Always use `remoteBuild: true`** - Local builds fail on M1/ARM Macs deploying to AMD64
2. **Bicep outputs auto-populate `.azure/<env>/.env`** - Don't edit it manually
3. **Use `azd env set` for secrets** - Not main.parameters.json defaults
4. **Service tags (`azd-service-name`)** - Required for azd to find Container Apps
5. **`|| true` in hooks** - Prevents RBAC "already exists" errors from failing the deploy
skills/azure-ai-agents-persistent-dotnet/SKILL.md (new file, +349 lines)
---
name: azure-ai-agents-persistent-dotnet
description: |
  Azure AI Agents Persistent SDK for .NET. Low-level SDK for creating and managing AI agents with threads, messages, runs, and tools. Use for agent CRUD, conversation threads, streaming responses, function calling, file search, and code interpreter. Triggers: "PersistentAgentsClient", "persistent agents", "agent threads", "agent runs", "streaming agents", "function calling agents .NET".
package: Azure.AI.Agents.Persistent
---

# Azure.AI.Agents.Persistent (.NET)

Low-level SDK for creating and managing persistent AI agents with threads, messages, runs, and tools.

## Installation

```bash
dotnet add package Azure.AI.Agents.Persistent --prerelease
dotnet add package Azure.Identity
```

**Current Versions**: Stable v1.1.0, Preview v1.2.0-beta.8

## Environment Variables

```bash
PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
MODEL_DEPLOYMENT_NAME=gpt-4o-mini
AZURE_BING_CONNECTION_ID=<bing-connection-resource-id>
AZURE_AI_SEARCH_CONNECTION_ID=<search-connection-resource-id>
```

## Authentication

```csharp
using Azure.AI.Agents.Persistent;
using Azure.Identity;

var projectEndpoint = Environment.GetEnvironmentVariable("PROJECT_ENDPOINT");
PersistentAgentsClient client = new(projectEndpoint, new DefaultAzureCredential());
```

## Client Hierarchy

```
PersistentAgentsClient
├── Administration → Agent CRUD operations
├── Threads        → Thread management
├── Messages       → Message operations
├── Runs           → Run execution and streaming
├── Files          → File upload/download
└── VectorStores   → Vector store management
```

## Core Workflow

### 1. Create Agent

```csharp
var modelDeploymentName = Environment.GetEnvironmentVariable("MODEL_DEPLOYMENT_NAME");

PersistentAgent agent = await client.Administration.CreateAgentAsync(
    model: modelDeploymentName,
    name: "Math Tutor",
    instructions: "You are a personal math tutor. Write and run code to answer math questions.",
    tools: [new CodeInterpreterToolDefinition()]
);
```

### 2. Create Thread and Message

```csharp
// Create thread
PersistentAgentThread thread = await client.Threads.CreateThreadAsync();

// Create message
await client.Messages.CreateMessageAsync(
    thread.Id,
    MessageRole.User,
    "I need to solve the equation `3x + 11 = 14`. Can you help me?"
);
```

### 3. Run Agent (Polling)

```csharp
// Create run
ThreadRun run = await client.Runs.CreateRunAsync(
    thread.Id,
    agent.Id,
    additionalInstructions: "Please address the user as Jane Doe."
);

// Poll for completion
do
{
    await Task.Delay(TimeSpan.FromMilliseconds(500));
    run = await client.Runs.GetRunAsync(thread.Id, run.Id);
}
while (run.Status == RunStatus.Queued || run.Status == RunStatus.InProgress);

// Retrieve messages
await foreach (PersistentThreadMessage message in client.Messages.GetMessagesAsync(
    threadId: thread.Id,
    order: ListSortOrder.Ascending))
{
    Console.Write($"{message.Role}: ");
    foreach (MessageContent content in message.ContentItems)
    {
        if (content is MessageTextContent textContent)
            Console.WriteLine(textContent.Text);
    }
}
```

### 4. Streaming Response

```csharp
AsyncCollectionResult<StreamingUpdate> stream = client.Runs.CreateRunStreamingAsync(
    thread.Id,
    agent.Id
);

await foreach (StreamingUpdate update in stream)
{
    if (update.UpdateKind == StreamingUpdateReason.RunCreated)
    {
        Console.WriteLine("--- Run started! ---");
    }
    else if (update is MessageContentUpdate contentUpdate)
    {
        Console.Write(contentUpdate.Text);
    }
    else if (update.UpdateKind == StreamingUpdateReason.RunCompleted)
    {
        Console.WriteLine("\n--- Run completed! ---");
    }
}
```

### 5. Function Calling

```csharp
// Define function tool
FunctionToolDefinition weatherTool = new(
    name: "getCurrentWeather",
    description: "Gets the current weather at a location.",
    parameters: BinaryData.FromObjectAsJson(new
    {
        Type = "object",
        Properties = new
        {
            Location = new { Type = "string", Description = "City and state, e.g. San Francisco, CA" },
            Unit = new { Type = "string", Enum = new[] { "c", "f" } }
        },
        Required = new[] { "location" }
    }, new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })
);

// Create agent with function
PersistentAgent agent = await client.Administration.CreateAgentAsync(
    model: modelDeploymentName,
    name: "Weather Bot",
    instructions: "You are a weather bot.",
    tools: [weatherTool]
);

// Handle function calls during polling
do
{
    await Task.Delay(500);
    run = await client.Runs.GetRunAsync(thread.Id, run.Id);

    if (run.Status == RunStatus.RequiresAction
        && run.RequiredAction is SubmitToolOutputsAction submitAction)
    {
        List<ToolOutput> outputs = [];
        foreach (RequiredToolCall toolCall in submitAction.ToolCalls)
        {
            if (toolCall is RequiredFunctionToolCall funcCall)
            {
                // Execute function and get result
                string result = ExecuteFunction(funcCall.Name, funcCall.Arguments);
                outputs.Add(new ToolOutput(toolCall, result));
            }
        }
        run = await client.Runs.SubmitToolOutputsToRunAsync(run, outputs, toolApprovals: null);
    }
}
while (run.Status == RunStatus.Queued || run.Status == RunStatus.InProgress);
```

### 6. File Search with Vector Store

```csharp
// Upload file
PersistentAgentFileInfo file = await client.Files.UploadFileAsync(
    filePath: "document.txt",
    purpose: PersistentAgentFilePurpose.Agents
);

// Create vector store
PersistentAgentsVectorStore vectorStore = await client.VectorStores.CreateVectorStoreAsync(
    fileIds: [file.Id],
    name: "my_vector_store"
);

// Create file search resource
FileSearchToolResource fileSearchResource = new();
fileSearchResource.VectorStoreIds.Add(vectorStore.Id);

// Create agent with file search
PersistentAgent agent = await client.Administration.CreateAgentAsync(
    model: modelDeploymentName,
    name: "Document Assistant",
    instructions: "You help users find information in documents.",
    tools: [new FileSearchToolDefinition()],
    toolResources: new ToolResources { FileSearch = fileSearchResource }
);
```

### 7. Bing Grounding

```csharp
var bingConnectionId = Environment.GetEnvironmentVariable("AZURE_BING_CONNECTION_ID");

BingGroundingToolDefinition bingTool = new(
    new BingGroundingSearchToolParameters(
        [new BingGroundingSearchConfiguration(bingConnectionId)]
    )
);

PersistentAgent agent = await client.Administration.CreateAgentAsync(
    model: modelDeploymentName,
    name: "Search Agent",
    instructions: "Use Bing to answer questions about current events.",
    tools: [bingTool]
);
```

### 8. Azure AI Search

```csharp
AzureAISearchToolResource searchResource = new(
    connectionId: searchConnectionId,
    indexName: "my_index",
    topK: 5,
    filter: "category eq 'documentation'",
    queryType: AzureAISearchQueryType.Simple
);

PersistentAgent agent = await client.Administration.CreateAgentAsync(
    model: modelDeploymentName,
    name: "Search Agent",
    instructions: "Search the documentation index to answer questions.",
    tools: [new AzureAISearchToolDefinition()],
    toolResources: new ToolResources { AzureAISearch = searchResource }
);
```

### 9. Cleanup

```csharp
await client.Threads.DeleteThreadAsync(thread.Id);
await client.Administration.DeleteAgentAsync(agent.Id);
await client.VectorStores.DeleteVectorStoreAsync(vectorStore.Id);
await client.Files.DeleteFileAsync(file.Id);
```

## Available Tools

| Tool | Class | Purpose |
|------|-------|---------|
| Code Interpreter | `CodeInterpreterToolDefinition` | Execute Python code, generate visualizations |
| File Search | `FileSearchToolDefinition` | Search uploaded files via vector stores |
| Function Calling | `FunctionToolDefinition` | Call custom functions |
| Bing Grounding | `BingGroundingToolDefinition` | Web search via Bing |
| Azure AI Search | `AzureAISearchToolDefinition` | Search Azure AI Search indexes |
| OpenAPI | `OpenApiToolDefinition` | Call external APIs via OpenAPI spec |
| Azure Functions | `AzureFunctionToolDefinition` | Invoke Azure Functions |
| MCP | `MCPToolDefinition` | Model Context Protocol tools |
| SharePoint | `SharepointToolDefinition` | Access SharePoint content |
| Microsoft Fabric | `MicrosoftFabricToolDefinition` | Access Fabric data |

## Streaming Update Types

| Update Type | Description |
|-------------|-------------|
| `StreamingUpdateReason.RunCreated` | Run started |
| `StreamingUpdateReason.RunInProgress` | Run processing |
| `StreamingUpdateReason.RunCompleted` | Run finished |
| `StreamingUpdateReason.RunFailed` | Run errored |
| `MessageContentUpdate` | Text content chunk |
| `RunStepUpdate` | Step status change |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `PersistentAgentsClient` | Main entry point |
| `PersistentAgent` | Agent with model, instructions, tools |
| `PersistentAgentThread` | Conversation thread |
| `PersistentThreadMessage` | Message in thread |
| `ThreadRun` | Execution of agent against thread |
| `RunStatus` | Queued, InProgress, RequiresAction, Completed, Failed |
| `ToolResources` | Combined tool resources |
| `ToolOutput` | Function call response |

## Best Practices

1. **Always dispose clients** — Use `using` statements or explicit disposal
2. **Poll with appropriate delays** — 500ms recommended between status checks
3. **Clean up resources** — Delete threads and agents when done
4. **Handle all run statuses** — Check for `RequiresAction`, `Failed`, `Cancelled`
5. **Use streaming for real-time UX** — Better user experience than polling
6. **Store IDs, not objects** — Reference agents/threads by ID
7. **Use async methods** — All operations should be async

## Error Handling

```csharp
using Azure;

try
{
    var agent = await client.Administration.CreateAgentAsync(...);
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    Console.WriteLine("Resource not found");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.Agents.Persistent` | Low-level agents (this SDK) | `dotnet add package Azure.AI.Agents.Persistent` |
| `Azure.AI.Projects` | High-level project client | `dotnet add package Azure.AI.Projects` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.Agents.Persistent |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.agents.persistent |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Agents.Persistent |
| Samples | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Agents.Persistent/samples |
skills/azure-ai-agents-persistent-java/SKILL.md (new file, +137 lines)
---
name: azure-ai-agents-persistent-java
description: |
  Azure AI Agents Persistent SDK for Java. Low-level SDK for creating and managing AI agents with threads, messages, runs, and tools.
  Triggers: "PersistentAgentsClient", "persistent agents java", "agent threads java", "agent runs java", "streaming agents java".
package: com.azure:azure-ai-agents-persistent
---

# Azure AI Agents Persistent SDK for Java

Low-level SDK for creating and managing persistent AI agents with threads, messages, runs, and tools.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-agents-persistent</artifactId>
    <version>1.0.0-beta.1</version>
</dependency>
```

## Environment Variables

```bash
PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
MODEL_DEPLOYMENT_NAME=gpt-4o-mini
```

## Authentication

```java
import com.azure.ai.agents.persistent.PersistentAgentsClient;
import com.azure.ai.agents.persistent.PersistentAgentsClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

String endpoint = System.getenv("PROJECT_ENDPOINT");
PersistentAgentsClient client = new PersistentAgentsClientBuilder()
    .endpoint(endpoint)
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Key Concepts

The Azure AI Agents Persistent SDK provides a low-level API for managing persistent agents that can be reused across sessions.

### Client Hierarchy

| Client | Purpose |
|--------|---------|
| `PersistentAgentsClient` | Sync client for agent operations |
| `PersistentAgentsAsyncClient` | Async client for agent operations |

## Core Workflow

### 1. Create Agent

```java
// Create agent with tools
PersistentAgent agent = client.createAgent(
    modelDeploymentName,
    "Math Tutor",
    "You are a personal math tutor."
);
```

### 2. Create Thread

```java
PersistentAgentThread thread = client.createThread();
```

### 3. Add Message

```java
client.createMessage(
    thread.getId(),
    MessageRole.USER,
    "I need help with equations."
);
```

### 4. Run Agent

```java
ThreadRun run = client.createRun(thread.getId(), agent.getId());

// Poll for completion
while (run.getStatus() == RunStatus.QUEUED || run.getStatus() == RunStatus.IN_PROGRESS) {
    Thread.sleep(500);
    run = client.getRun(thread.getId(), run.getId());
}
```

### 5. Get Response

```java
PagedIterable<PersistentThreadMessage> messages = client.listMessages(thread.getId());
for (PersistentThreadMessage message : messages) {
    System.out.println(message.getRole() + ": " + message.getContent());
}
```

### 6. Cleanup

```java
client.deleteThread(thread.getId());
client.deleteAgent(agent.getId());
```

## Best Practices

1. **Use DefaultAzureCredential** for production authentication
2. **Poll with appropriate delays** — 500ms recommended between status checks
3. **Clean up resources** — Delete threads and agents when done
4. **Handle all run statuses** — Check for RequiresAction, Failed, Cancelled
5. **Use the async client** for better throughput in high-concurrency scenarios

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    PersistentAgent agent = client.createAgent(modelName, name, instructions);
} catch (HttpResponseException e) {
    System.err.println("Error: " + e.getResponse().getStatusCode() + " - " + e.getMessage());
}
```

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-ai-agents-persistent |
| GitHub Source | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-agents-persistent |
256
skills/azure-ai-anomalydetector-java/SKILL.md
Normal file
@@ -0,0 +1,256 @@
---
name: azure-ai-anomalydetector-java
description: Build anomaly detection applications with Azure AI Anomaly Detector SDK for Java. Use when implementing univariate/multivariate anomaly detection, time-series analysis, or AI-powered monitoring.
package: com.azure:azure-ai-anomalydetector
---

# Azure AI Anomaly Detector SDK for Java

Build anomaly detection applications using the Azure AI Anomaly Detector SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-anomalydetector</artifactId>
    <version>3.0.0-beta.6</version>
</dependency>
```

## Client Creation

### Sync and Async Clients

```java
import com.azure.ai.anomalydetector.AnomalyDetectorClientBuilder;
import com.azure.ai.anomalydetector.MultivariateClient;
import com.azure.ai.anomalydetector.UnivariateClient;
import com.azure.core.credential.AzureKeyCredential;

String endpoint = System.getenv("AZURE_ANOMALY_DETECTOR_ENDPOINT");
String key = System.getenv("AZURE_ANOMALY_DETECTOR_API_KEY");

// Multivariate client for multiple correlated signals
MultivariateClient multivariateClient = new AnomalyDetectorClientBuilder()
    .credential(new AzureKeyCredential(key))
    .endpoint(endpoint)
    .buildMultivariateClient();

// Univariate client for single-variable analysis
UnivariateClient univariateClient = new AnomalyDetectorClientBuilder()
    .credential(new AzureKeyCredential(key))
    .endpoint(endpoint)
    .buildUnivariateClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

MultivariateClient client = new AnomalyDetectorClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(endpoint)
    .buildMultivariateClient();
```

## Key Concepts

### Univariate Anomaly Detection

- **Batch Detection**: Analyze an entire time series at once
- **Streaming Detection**: Real-time detection on the latest data point
- **Change Point Detection**: Detect trend changes in a time series

### Multivariate Anomaly Detection

- Detects anomalies across up to 300 correlated signals
- Uses a Graph Attention Network to model inter-signal correlations
- Three-step process: Train → Inference → Results

## Core Patterns

### Univariate Batch Detection

```java
import com.azure.ai.anomalydetector.models.*;
import java.time.OffsetDateTime;
import java.util.List;

List<TimeSeriesPoint> series = List.of(
    new TimeSeriesPoint(OffsetDateTime.parse("2023-01-01T00:00:00Z"), 1.0),
    new TimeSeriesPoint(OffsetDateTime.parse("2023-01-02T00:00:00Z"), 2.5)
    // ... more data points (minimum 12 points required)
);

UnivariateDetectionOptions options = new UnivariateDetectionOptions(series)
    .setGranularity(TimeGranularity.DAILY)
    .setSensitivity(95);

UnivariateEntireDetectionResult result = univariateClient.detectUnivariateEntireSeries(options);

// Check for anomalies
for (int i = 0; i < result.getIsAnomaly().size(); i++) {
    if (result.getIsAnomaly().get(i)) {
        System.out.printf("Anomaly detected at index %d with value %.2f%n",
            i, series.get(i).getValue());
    }
}
```
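The result object returns parallel lists indexed by position in the input series; a small pure helper keeps that bookkeeping out of application code. A minimal sketch using only `java.util` types — the `List<Boolean>` shape of `getIsAnomaly()` is the only SDK assumption, and `AnomalyIndexHelper` is illustrative, not part of the SDK:

```java
import java.util.ArrayList;
import java.util.List;

public class AnomalyIndexHelper {
    // Collect the indices flagged true in the parallel isAnomaly list.
    static List<Integer> anomalyIndices(List<Boolean> isAnomaly) {
        List<Integer> indices = new ArrayList<>();
        for (int i = 0; i < isAnomaly.size(); i++) {
            if (Boolean.TRUE.equals(isAnomaly.get(i))) {
                indices.add(i);
            }
        }
        return indices;
    }

    public static void main(String[] args) {
        // Stand-in for result.getIsAnomaly()
        System.out.println(anomalyIndices(List.of(false, true, false, true))); // [1, 3]
    }
}
```

The returned indices map back to `series.get(i)` for the matching timestamps and values.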

### Univariate Last Point Detection (Streaming)

```java
UnivariateLastDetectionResult lastResult = univariateClient.detectUnivariateLastPoint(options);

if (lastResult.isAnomaly()) {
    System.out.println("Latest point is an anomaly!");
    System.out.printf("Expected: %.2f, Upper: %.2f, Lower: %.2f%n",
        lastResult.getExpectedValue(),
        lastResult.getUpperMargin(),
        lastResult.getLowerMargin());
}
```

### Change Point Detection

```java
UnivariateChangePointDetectionOptions changeOptions =
    new UnivariateChangePointDetectionOptions(series, TimeGranularity.DAILY);

UnivariateChangePointDetectionResult changeResult =
    univariateClient.detectUnivariateChangePoint(changeOptions);

for (int i = 0; i < changeResult.getIsChangePoint().size(); i++) {
    if (changeResult.getIsChangePoint().get(i)) {
        System.out.printf("Change point at index %d with confidence %.2f%n",
            i, changeResult.getConfidenceScores().get(i));
    }
}
```

### Multivariate Model Training

```java
import com.azure.ai.anomalydetector.models.*;

// Prepare training request with blob storage data
ModelInfo modelInfo = new ModelInfo()
    .setDataSource("https://storage.blob.core.windows.net/container/data.zip?sasToken")
    .setStartTime(OffsetDateTime.parse("2023-01-01T00:00:00Z"))
    .setEndTime(OffsetDateTime.parse("2023-06-01T00:00:00Z"))
    .setSlidingWindow(200)
    .setDisplayName("MyMultivariateModel");

// Train model (long-running operation)
AnomalyDetectionModel trainedModel = multivariateClient.trainMultivariateModel(modelInfo);

String modelId = trainedModel.getModelId();
System.out.println("Model ID: " + modelId);

// Check training status
AnomalyDetectionModel model = multivariateClient.getMultivariateModel(modelId);
System.out.println("Status: " + model.getModelInfo().getStatus());
```

### Multivariate Batch Inference

```java
MultivariateBatchDetectionOptions detectionOptions = new MultivariateBatchDetectionOptions()
    .setDataSource("https://storage.blob.core.windows.net/container/inference-data.zip?sasToken")
    .setStartTime(OffsetDateTime.parse("2023-07-01T00:00:00Z"))
    .setEndTime(OffsetDateTime.parse("2023-07-31T00:00:00Z"))
    .setTopContributorCount(10);

MultivariateDetectionResult detectionResult =
    multivariateClient.detectMultivariateBatchAnomaly(modelId, detectionOptions);

String resultId = detectionResult.getResultId();

// Poll for results
MultivariateDetectionResult result = multivariateClient.getBatchDetectionResult(resultId);
for (AnomalyState state : result.getResults()) {
    if (state.getValue().isAnomaly()) {
        System.out.printf("Anomaly at %s, severity: %.2f%n",
            state.getTimestamp(),
            state.getValue().getSeverity());
    }
}
```

### Multivariate Last Point Detection

```java
MultivariateLastDetectionOptions lastOptions = new MultivariateLastDetectionOptions()
    .setVariables(List.of(
        new VariableValues("variable1", List.of("timestamp1"), List.of(1.0f)),
        new VariableValues("variable2", List.of("timestamp1"), List.of(2.5f))
    ))
    .setTopContributorCount(5);

MultivariateLastDetectionResult lastResult =
    multivariateClient.detectMultivariateLastAnomaly(modelId, lastOptions);

if (lastResult.getValue().isAnomaly()) {
    System.out.println("Anomaly detected!");
    // Check contributing variables
    for (AnomalyContributor contributor : lastResult.getValue().getInterpretation()) {
        System.out.printf("Variable: %s, Contribution: %.2f%n",
            contributor.getVariable(),
            contributor.getContributionScore());
    }
}
```

### Model Management

```java
// List all models
PagedIterable<AnomalyDetectionModel> models = multivariateClient.listMultivariateModels();
for (AnomalyDetectionModel m : models) {
    System.out.printf("Model: %s, Status: %s%n",
        m.getModelId(),
        m.getModelInfo().getStatus());
}

// Delete a model
multivariateClient.deleteMultivariateModel(modelId);
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    univariateClient.detectUnivariateEntireSeries(options);
} catch (HttpResponseException e) {
    System.out.println("Status code: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
AZURE_ANOMALY_DETECTOR_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
AZURE_ANOMALY_DETECTOR_API_KEY=<your-api-key>
```

## Best Practices

1. **Minimum Data Points**: Univariate requires at least 12 points; more data improves accuracy
2. **Granularity Alignment**: Match `TimeGranularity` to your actual data frequency
3. **Sensitivity Tuning**: Higher values (0-99) detect more anomalies
4. **Multivariate Training**: Choose a sliding window (for example, 200-1000) based on pattern complexity
5. **Error Handling**: Always handle `HttpResponseException` for API errors

## Trigger Phrases

- "anomaly detection Java"
- "detect anomalies time series"
- "multivariate anomaly Java"
- "univariate anomaly detection"
- "streaming anomaly detection"
- "change point detection"
- "Azure AI Anomaly Detector"
282
skills/azure-ai-contentsafety-java/SKILL.md
Normal file
@@ -0,0 +1,282 @@
---
name: azure-ai-contentsafety-java
description: Build content moderation applications with Azure AI Content Safety SDK for Java. Use when implementing text/image analysis, blocklist management, or harm detection for hate, violence, sexual content, and self-harm.
package: com.azure:azure-ai-contentsafety
---

# Azure AI Content Safety SDK for Java

Build content moderation applications using the Azure AI Content Safety SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-contentsafety</artifactId>
    <version>1.1.0-beta.1</version>
</dependency>
```

## Client Creation

### With API Key

```java
import com.azure.ai.contentsafety.ContentSafetyClient;
import com.azure.ai.contentsafety.ContentSafetyClientBuilder;
import com.azure.ai.contentsafety.BlocklistClient;
import com.azure.ai.contentsafety.BlocklistClientBuilder;
import com.azure.core.credential.KeyCredential;

String endpoint = System.getenv("CONTENT_SAFETY_ENDPOINT");
String key = System.getenv("CONTENT_SAFETY_KEY");

ContentSafetyClient contentSafetyClient = new ContentSafetyClientBuilder()
    .credential(new KeyCredential(key))
    .endpoint(endpoint)
    .buildClient();

BlocklistClient blocklistClient = new BlocklistClientBuilder()
    .credential(new KeyCredential(key))
    .endpoint(endpoint)
    .buildClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

ContentSafetyClient client = new ContentSafetyClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(endpoint)
    .buildClient();
```

## Key Concepts

### Harm Categories

| Category | Description |
|----------|-------------|
| Hate | Discriminatory language based on identity groups |
| Sexual | Sexual content, relationships, acts |
| Violence | Physical harm, weapons, injury |
| Self-harm | Self-injury, suicide-related content |

### Severity Levels

- Text: 0-7 scale (default output is 0, 2, 4, 6)
- Image: 0, 2, 4, 6 (trimmed scale)

## Core Patterns

### Analyze Text

```java
import com.azure.ai.contentsafety.models.*;

AnalyzeTextResult result = contentSafetyClient.analyzeText(
    new AnalyzeTextOptions("This is text to analyze"));

for (TextCategoriesAnalysis category : result.getCategoriesAnalysis()) {
    System.out.printf("Category: %s, Severity: %d%n",
        category.getCategory(),
        category.getSeverity());
}
```

### Analyze Text with Options

```java
import java.util.Arrays;

AnalyzeTextOptions options = new AnalyzeTextOptions("Text to analyze")
    .setCategories(Arrays.asList(
        TextCategory.HATE,
        TextCategory.VIOLENCE))
    .setOutputType(AnalyzeTextOutputType.EIGHT_SEVERITY_LEVELS);

AnalyzeTextResult result = contentSafetyClient.analyzeText(options);
```

### Analyze Text with Blocklist

```java
AnalyzeTextOptions options = new AnalyzeTextOptions("I h*te you and want to k*ll you")
    .setBlocklistNames(Arrays.asList("my-blocklist"))
    .setHaltOnBlocklistHit(true);

AnalyzeTextResult result = contentSafetyClient.analyzeText(options);

if (result.getBlocklistsMatch() != null) {
    for (TextBlocklistMatch match : result.getBlocklistsMatch()) {
        System.out.printf("Blocklist: %s, Item: %s, Text: %s%n",
            match.getBlocklistName(),
            match.getBlocklistItemId(),
            match.getBlocklistItemText());
    }
}
```

### Analyze Image

```java
import com.azure.ai.contentsafety.models.*;
import com.azure.core.util.BinaryData;
import java.nio.file.Files;
import java.nio.file.Paths;

// From file
byte[] imageBytes = Files.readAllBytes(Paths.get("image.png"));
ContentSafetyImageData imageData = new ContentSafetyImageData()
    .setContent(BinaryData.fromBytes(imageBytes));

AnalyzeImageResult result = contentSafetyClient.analyzeImage(
    new AnalyzeImageOptions(imageData));

for (ImageCategoriesAnalysis category : result.getCategoriesAnalysis()) {
    System.out.printf("Category: %s, Severity: %d%n",
        category.getCategory(),
        category.getSeverity());
}
```

### Analyze Image from URL

```java
ContentSafetyImageData imageData = new ContentSafetyImageData()
    .setBlobUrl("https://example.com/image.jpg");

AnalyzeImageResult result = contentSafetyClient.analyzeImage(
    new AnalyzeImageOptions(imageData));
```

## Blocklist Management

### Create or Update Blocklist

```java
import com.azure.core.http.rest.RequestOptions;
import com.azure.core.http.rest.Response;
import com.azure.core.util.BinaryData;
import java.util.Map;

Map<String, String> description = Map.of("description", "Custom blocklist");
BinaryData resource = BinaryData.fromObject(description);

Response<BinaryData> response = blocklistClient.createOrUpdateTextBlocklistWithResponse(
    "my-blocklist", resource, new RequestOptions());

if (response.getStatusCode() == 201) {
    System.out.println("Blocklist created");
} else if (response.getStatusCode() == 200) {
    System.out.println("Blocklist updated");
}
```

### Add Block Items

```java
import com.azure.ai.contentsafety.models.*;
import java.util.Arrays;
import java.util.List;

List<TextBlocklistItem> items = Arrays.asList(
    new TextBlocklistItem("badword1").setDescription("Offensive term"),
    new TextBlocklistItem("badword2").setDescription("Another term")
);

AddOrUpdateTextBlocklistItemsResult result = blocklistClient.addOrUpdateBlocklistItems(
    "my-blocklist",
    new AddOrUpdateTextBlocklistItemsOptions(items));

for (TextBlocklistItem item : result.getBlocklistItems()) {
    System.out.printf("Added: %s (ID: %s)%n",
        item.getText(),
        item.getBlocklistItemId());
}
```

### List Blocklists

```java
PagedIterable<TextBlocklist> blocklists = blocklistClient.listTextBlocklists();

for (TextBlocklist blocklist : blocklists) {
    System.out.printf("Blocklist: %s, Description: %s%n",
        blocklist.getName(),
        blocklist.getDescription());
}
```

### Get Blocklist

```java
TextBlocklist blocklist = blocklistClient.getTextBlocklist("my-blocklist");
System.out.println("Name: " + blocklist.getName());
```

### List Block Items

```java
PagedIterable<TextBlocklistItem> items =
    blocklistClient.listTextBlocklistItems("my-blocklist");

for (TextBlocklistItem item : items) {
    System.out.printf("ID: %s, Text: %s%n",
        item.getBlocklistItemId(),
        item.getText());
}
```

### Remove Block Items

```java
List<String> itemIds = Arrays.asList("item-id-1", "item-id-2");

blocklistClient.removeBlocklistItems(
    "my-blocklist",
    new RemoveTextBlocklistItemsOptions(itemIds));
```

### Delete Blocklist

```java
blocklistClient.deleteTextBlocklist("my-blocklist");
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    contentSafetyClient.analyzeText(new AnalyzeTextOptions("test"));
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
    // Common codes: InvalidRequestBody, ResourceNotFound, TooManyRequests
}
```

## Environment Variables

```bash
CONTENT_SAFETY_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
CONTENT_SAFETY_KEY=<your-api-key>
```

## Best Practices

1. **Blocklist Delay**: Changes take ~5 minutes to take effect
2. **Category Selection**: Only request needed categories to reduce latency
3. **Severity Thresholds**: Typically block severity >= 4 for strict moderation
4. **Batch Processing**: Process multiple items in parallel for throughput
5. **Caching**: Cache blocklist results where appropriate
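The threshold rule in item 3 can be isolated as a pure function so the gate is testable without calling the service. A minimal sketch, assuming only a map of category name to severity — the `ModerationGate` class is illustrative, not part of the SDK:

```java
import java.util.Map;

public class ModerationGate {
    // Block when any category's severity reaches the threshold
    // (>= 4 here, matching the strict-moderation guidance above).
    static boolean isAllowed(Map<String, Integer> severities, int blockAt) {
        return severities.values().stream().allMatch(s -> s < blockAt);
    }

    public static void main(String[] args) {
        Map<String, Integer> severities = Map.of("Hate", 2, "Violence", 4);
        System.out.println(isAllowed(severities, 4)); // false: Violence hits the threshold
    }
}
```

Populate the map from `result.getCategoriesAnalysis()` using `category.getCategory()` and `category.getSeverity()`.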

## Trigger Phrases

- "content safety Java"
- "content moderation Azure"
- "analyze text safety"
- "image moderation Java"
- "blocklist management"
- "hate speech detection"
- "harmful content filter"
214
skills/azure-ai-contentsafety-py/SKILL.md
Normal file
@@ -0,0 +1,214 @@
---
name: azure-ai-contentsafety-py
description: |
  Azure AI Content Safety SDK for Python. Use for detecting harmful content in text and images with multi-severity classification.
  Triggers: "azure-ai-contentsafety", "ContentSafetyClient", "content moderation", "harmful content", "text analysis", "image analysis".
package: azure-ai-contentsafety
---

# Azure AI Content Safety SDK for Python

Detect harmful user-generated and AI-generated content in applications.

## Installation

```bash
pip install azure-ai-contentsafety
```

## Environment Variables

```bash
CONTENT_SAFETY_ENDPOINT=https://<resource>.cognitiveservices.azure.com
CONTENT_SAFETY_KEY=<your-api-key>
```

## Authentication

### API Key

```python
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"])
)
```

### Entra ID

```python
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.identity import DefaultAzureCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Analyze Text

```python
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions, TextCategory
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(endpoint, AzureKeyCredential(key))

request = AnalyzeTextOptions(text="Your text content to analyze")
response = client.analyze_text(request)

# Check each category
for category in [TextCategory.HATE, TextCategory.SELF_HARM,
                 TextCategory.SEXUAL, TextCategory.VIOLENCE]:
    result = next((r for r in response.categories_analysis
                   if r.category == category), None)
    if result:
        print(f"{category}: severity {result.severity}")
```

## Analyze Image

```python
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(endpoint, AzureKeyCredential(key))

# From file: ImageData.content takes the raw bytes (the SDK encodes on the wire)
with open("image.jpg", "rb") as f:
    request = AnalyzeImageOptions(
        image=ImageData(content=f.read())
    )

response = client.analyze_image(request)

for result in response.categories_analysis:
    print(f"{result.category}: severity {result.severity}")
```

### Image from URL

```python
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData

request = AnalyzeImageOptions(
    image=ImageData(blob_url="https://example.com/image.jpg")
)

response = client.analyze_image(request)
```

## Text Blocklist Management

### Create Blocklist

```python
from azure.ai.contentsafety import BlocklistClient
from azure.ai.contentsafety.models import TextBlocklist
from azure.core.credentials import AzureKeyCredential

blocklist_client = BlocklistClient(endpoint, AzureKeyCredential(key))

blocklist = TextBlocklist(
    blocklist_name="my-blocklist",
    description="Custom terms to block"
)

result = blocklist_client.create_or_update_text_blocklist(
    blocklist_name="my-blocklist",
    options=blocklist
)
```

### Add Block Items

```python
from azure.ai.contentsafety.models import AddOrUpdateTextBlocklistItemsOptions, TextBlocklistItem

items = AddOrUpdateTextBlocklistItemsOptions(
    blocklist_items=[
        TextBlocklistItem(text="blocked-term-1"),
        TextBlocklistItem(text="blocked-term-2")
    ]
)

result = blocklist_client.add_or_update_blocklist_items(
    blocklist_name="my-blocklist",
    options=items
)
```

### Analyze with Blocklist

```python
from azure.ai.contentsafety.models import AnalyzeTextOptions

request = AnalyzeTextOptions(
    text="Text containing blocked-term-1",
    blocklist_names=["my-blocklist"],
    halt_on_blocklist_hit=True
)

response = client.analyze_text(request)

if response.blocklists_match:
    for match in response.blocklists_match:
        print(f"Blocked: {match.blocklist_item_text}")
```

## Severity Levels

Text analysis returns 4 severity levels (0, 2, 4, 6) by default. For 8 levels (0-7):

```python
from azure.ai.contentsafety.models import AnalyzeTextOptions, AnalyzeTextOutputType

request = AnalyzeTextOptions(
    text="Your text",
    output_type=AnalyzeTextOutputType.EIGHT_SEVERITY_LEVELS
)
```

## Harm Categories

| Category | Description |
|----------|-------------|
| `Hate` | Attacks based on identity (race, religion, gender, etc.) |
| `Sexual` | Sexual content, relationships, anatomy |
| `Violence` | Physical harm, weapons, injury |
| `SelfHarm` | Self-injury, suicide, eating disorders |

## Severity Scale

| Level | Text Range | Image Range | Meaning |
|-------|------------|-------------|---------|
| 0 | Safe | Safe | No harmful content |
| 2 | Low | Low | Mild references |
| 4 | Medium | Medium | Moderate content |
| 6 | High | High | Severe content |

## Client Types

| Client | Purpose |
|--------|---------|
| `ContentSafetyClient` | Analyze text and images |
| `BlocklistClient` | Manage custom blocklists |

## Best Practices

1. **Use blocklists** for domain-specific terms
2. **Set severity thresholds** appropriate for your use case
3. **Handle multiple categories** — content can be harmful in multiple ways
4. **Use halt_on_blocklist_hit** for immediate rejection
5. **Log analysis results** for audit and improvement
6. **Consider 8-severity mode** for finer-grained control
7. **Pre-moderate AI outputs** before showing to users
300
skills/azure-ai-contentsafety-ts/SKILL.md
Normal file
@@ -0,0 +1,300 @@
|
||||
---
|
||||
name: azure-ai-contentsafety-ts
|
||||
description: Analyze text and images for harmful content using Azure AI Content Safety (@azure-rest/ai-content-safety). Use when moderating user-generated content, detecting hate speech, violence, sexual content, or self-harm, or managing custom blocklists.
|
||||
package: @azure-rest/ai-content-safety
|
||||
---
|
||||
|
||||
# Azure AI Content Safety REST SDK for TypeScript
|
||||
|
||||
Analyze text and images for harmful content with customizable blocklists.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
npm install @azure-rest/ai-content-safety @azure/identity @azure/core-auth
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
CONTENT_SAFETY_ENDPOINT=https://<resource>.cognitiveservices.azure.com
|
||||
CONTENT_SAFETY_KEY=<api-key>
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
**Important**: This is a REST client. `ContentSafetyClient` is a **function**, not a class.
|
||||
|
||||
### API Key
|
||||
|
||||
```typescript
|
||||
import ContentSafetyClient from "@azure-rest/ai-content-safety";
|
||||
import { AzureKeyCredential } from "@azure/core-auth";
|
||||
|
||||
const client = ContentSafetyClient(
|
||||
process.env.CONTENT_SAFETY_ENDPOINT!,
|
||||
new AzureKeyCredential(process.env.CONTENT_SAFETY_KEY!)
|
||||
);
|
||||
```
|
||||
|
||||
### DefaultAzureCredential
|
||||
|
||||
```typescript
|
||||
import ContentSafetyClient from "@azure-rest/ai-content-safety";
|
||||
import { DefaultAzureCredential } from "@azure/identity";
|
||||
|
||||
const client = ContentSafetyClient(
|
||||
process.env.CONTENT_SAFETY_ENDPOINT!,
|
||||
new DefaultAzureCredential()
|
||||
);
|
||||
```
|
||||
|
||||
## Analyze Text
|
||||
|
||||
```typescript
|
||||
import ContentSafetyClient, { isUnexpected } from "@azure-rest/ai-content-safety";
|
||||
|
||||
const result = await client.path("/text:analyze").post({
|
||||
body: {
|
||||
text: "Text content to analyze",
|
||||
categories: ["Hate", "Sexual", "Violence", "SelfHarm"],
|
||||
outputType: "FourSeverityLevels" // or "EightSeverityLevels"
|
||||
}
|
||||
});
|
||||
|
||||
if (isUnexpected(result)) {
|
||||
throw result.body;
|
||||
}
|
||||
|
||||
for (const analysis of result.body.categoriesAnalysis) {
|
||||
console.log(`${analysis.category}: severity ${analysis.severity}`);
|
||||
}
|
||||
```
|
||||
|
||||
## Analyze Image
|
||||
|
||||
### Base64 Content
|
||||
|
||||
```typescript
|
||||
import { readFileSync } from "node:fs";
|
||||
|
||||
const imageBuffer = readFileSync("./image.png");
|
||||
const base64Image = imageBuffer.toString("base64");
|
||||
|
||||
const result = await client.path("/image:analyze").post({
|
||||
body: {
|
||||
image: { content: base64Image }
|
||||
}
|
||||
});
|
||||
|
||||
if (isUnexpected(result)) {
|
||||
throw result.body;
|
||||
}
|
||||
|
||||
for (const analysis of result.body.categoriesAnalysis) {
|
||||
console.log(`${analysis.category}: severity ${analysis.severity}`);
|
||||
}
|
||||
```
|
||||
|
||||
### Blob URL
|
||||
|
||||
```typescript
|
||||
const result = await client.path("/image:analyze").post({
|
||||
body: {
|
||||
image: { blobUrl: "https://storage.blob.core.windows.net/container/image.png" }
|
||||
}
|
||||
});
|
||||
```
|
||||
|
||||
## Blocklist Management
|
||||
|
||||
### Create Blocklist

```typescript
const result = await client
  .path("/text/blocklists/{blocklistName}", "my-blocklist")
  .patch({
    contentType: "application/merge-patch+json",
    body: {
      description: "Custom blocklist for prohibited terms"
    }
  });

if (isUnexpected(result)) {
  throw result.body;
}

console.log(`Created: ${result.body.blocklistName}`);
```

### Add Items to Blocklist

```typescript
const result = await client
  .path("/text/blocklists/{blocklistName}:addOrUpdateBlocklistItems", "my-blocklist")
  .post({
    body: {
      blocklistItems: [
        { text: "prohibited-term-1", description: "First blocked term" },
        { text: "prohibited-term-2", description: "Second blocked term" }
      ]
    }
  });

if (isUnexpected(result)) {
  throw result.body;
}

for (const item of result.body.blocklistItems ?? []) {
  console.log(`Added: ${item.blocklistItemId}`);
}
```

### Analyze with Blocklist

```typescript
const result = await client.path("/text:analyze").post({
  body: {
    text: "Text that might contain blocked terms",
    blocklistNames: ["my-blocklist"],
    haltOnBlocklistHit: false
  }
});

if (isUnexpected(result)) {
  throw result.body;
}

// Check blocklist matches
if (result.body.blocklistsMatch) {
  for (const match of result.body.blocklistsMatch) {
    console.log(`Blocked: "${match.blocklistItemText}" from ${match.blocklistName}`);
  }
}
```

### List Blocklists

```typescript
const result = await client.path("/text/blocklists").get();

if (isUnexpected(result)) {
  throw result.body;
}

for (const blocklist of result.body.value ?? []) {
  console.log(`${blocklist.blocklistName}: ${blocklist.description}`);
}
```

### Delete Blocklist

```typescript
const result = await client.path("/text/blocklists/{blocklistName}", "my-blocklist").delete();

if (isUnexpected(result)) {
  throw result.body;
}
```

## Harm Categories

| Category | API Term | Description |
|----------|----------|-------------|
| Hate and Fairness | `Hate` | Discriminatory language targeting identity groups |
| Sexual | `Sexual` | Sexual content, nudity, pornography |
| Violence | `Violence` | Physical harm, weapons, terrorism |
| Self-Harm | `SelfHarm` | Self-injury, suicide, eating disorders |

## Severity Levels

| Level | Risk | Recommended Action |
|-------|------|-------------------|
| 0 | Safe | Allow |
| 2 | Low | Review or allow with warning |
| 4 | Medium | Block or require human review |
| 6 | High | Block immediately |

**Output Types**:
- `FourSeverityLevels` (default): Returns 0, 2, 4, 6
- `EightSeverityLevels`: Returns 0-7
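
The severity table above can be encoded as a small decision helper. This is a sketch: `severityAction` and its action names are ours, not part of the SDK, and the thresholds should be tuned per category for your application.

```typescript
type ModerationAction = "allow" | "review" | "block";

// Map a severity score to an action following the table above.
// Works for both FourSeverityLevels (0, 2, 4, 6) and
// EightSeverityLevels (0-7) outputs, since it uses thresholds.
function severityAction(severity: number): ModerationAction {
  if (severity <= 0) return "allow";   // safe
  if (severity <= 2) return "review";  // low: review or allow with warning
  return "block";                      // medium/high: block
}

console.log(severityAction(0)); // "allow"
console.log(severityAction(2)); // "review"
console.log(severityAction(6)); // "block"
```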

## Content Moderation Helper

```typescript
import ContentSafetyClient, {
  isUnexpected,
  TextCategoriesAnalysisOutput
} from "@azure-rest/ai-content-safety";

interface ModerationResult {
  isAllowed: boolean;
  flaggedCategories: string[];
  maxSeverity: number;
  blocklistMatches: string[];
}

async function moderateContent(
  client: ReturnType<typeof ContentSafetyClient>,
  text: string,
  maxAllowedSeverity = 2,
  blocklistNames: string[] = []
): Promise<ModerationResult> {
  const result = await client.path("/text:analyze").post({
    body: { text, blocklistNames, haltOnBlocklistHit: false }
  });

  if (isUnexpected(result)) {
    throw result.body;
  }

  const flaggedCategories = result.body.categoriesAnalysis
    .filter(c => (c.severity ?? 0) > maxAllowedSeverity)
    .map(c => c.category!);

  const maxSeverity = Math.max(
    ...result.body.categoriesAnalysis.map(c => c.severity ?? 0)
  );

  const blocklistMatches = (result.body.blocklistsMatch ?? [])
    .map(m => m.blocklistItemText!);

  return {
    isAllowed: flaggedCategories.length === 0 && blocklistMatches.length === 0,
    flaggedCategories,
    maxSeverity,
    blocklistMatches
  };
}
```

## API Endpoints

| Operation | Method | Path |
|-----------|--------|------|
| Analyze Text | POST | `/text:analyze` |
| Analyze Image | POST | `/image:analyze` |
| Create/Update Blocklist | PATCH | `/text/blocklists/{blocklistName}` |
| List Blocklists | GET | `/text/blocklists` |
| Delete Blocklist | DELETE | `/text/blocklists/{blocklistName}` |
| Add Blocklist Items | POST | `/text/blocklists/{blocklistName}:addOrUpdateBlocklistItems` |
| List Blocklist Items | GET | `/text/blocklists/{blocklistName}/blocklistItems` |
| Remove Blocklist Items | POST | `/text/blocklists/{blocklistName}:removeBlocklistItems` |

## Key Types

```typescript
import ContentSafetyClient, {
  isUnexpected,
  AnalyzeTextParameters,
  AnalyzeImageParameters,
  TextCategoriesAnalysisOutput,
  ImageCategoriesAnalysisOutput,
  TextBlocklist,
  TextBlocklistItem
} from "@azure-rest/ai-content-safety";
```

## Best Practices

1. **Always use isUnexpected()** - Type guard for error handling
2. **Set appropriate thresholds** - Different categories may need different severity thresholds
3. **Use blocklists for domain-specific terms** - Supplement AI detection with custom rules
4. **Log moderation decisions** - Keep audit trail for compliance
5. **Handle edge cases** - Empty text, very long text, unsupported image formats
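
One of those edge cases, over-long input, can be handled before calling the service. A minimal sketch; `chunkText` is our helper, and the 10,000-character limit is an assumption you should verify against the current service limits for your API version:

```typescript
// Assumed per-request character limit for /text:analyze; verify in service docs.
const MAX_CHARS = 10_000;

// Split text into pieces that each fit within the analyze request limit,
// so each chunk can be sent in a separate /text:analyze call.
function chunkText(text: string, maxChars: number = MAX_CHARS): string[] {
  if (text.length === 0) return []; // nothing to analyze
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}
```

Note that naive slicing can split a word or sentence across chunks; splitting on sentence boundaries gives the classifier better context.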

skills/azure-ai-contentunderstanding-py/SKILL.md (new file, 273 lines)

---
name: azure-ai-contentunderstanding-py
description: |
  Azure AI Content Understanding SDK for Python. Use for multimodal content extraction from documents, images, audio, and video.
  Triggers: "azure-ai-contentunderstanding", "ContentUnderstandingClient", "multimodal analysis", "document extraction", "video analysis", "audio transcription".
package: azure-ai-contentunderstanding
---

# Azure AI Content Understanding SDK for Python

Multimodal AI service that extracts semantic content from documents, video, audio, and image files for RAG and automated workflows.

## Installation

```bash
pip install azure-ai-contentunderstanding
```

## Environment Variables

```bash
CONTENTUNDERSTANDING_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
```

## Authentication

```python
import os
from azure.ai.contentunderstanding import ContentUnderstandingClient
from azure.identity import DefaultAzureCredential

endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
credential = DefaultAzureCredential()
client = ContentUnderstandingClient(endpoint=endpoint, credential=credential)
```

## Core Workflow

Content Understanding operations are asynchronous long-running operations:

1. **Begin Analysis** — Start the analysis operation with `begin_analyze()` (returns a poller)
2. **Poll for Results** — Poll until analysis completes (SDK handles this with `.result()`)
3. **Process Results** — Extract structured results from `AnalyzeResult.contents`

## Prebuilt Analyzers

| Analyzer | Content Type | Purpose |
|----------|--------------|---------|
| `prebuilt-documentSearch` | Documents | Extract markdown for RAG applications |
| `prebuilt-imageSearch` | Images | Extract content from images |
| `prebuilt-audioSearch` | Audio | Transcribe audio with timing |
| `prebuilt-videoSearch` | Video | Extract frames, transcripts, summaries |
| `prebuilt-invoice` | Documents | Extract invoice fields |

## Analyze Document

```python
import os
from azure.ai.contentunderstanding import ContentUnderstandingClient
from azure.ai.contentunderstanding.models import AnalyzeInput
from azure.identity import DefaultAzureCredential

endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
client = ContentUnderstandingClient(
    endpoint=endpoint,
    credential=DefaultAzureCredential()
)

# Analyze document from URL
poller = client.begin_analyze(
    analyzer_id="prebuilt-documentSearch",
    inputs=[AnalyzeInput(url="https://example.com/document.pdf")]
)

result = poller.result()

# Access markdown content (contents is a list)
content = result.contents[0]
print(content.markdown)
```

## Access Document Content Details

```python
from azure.ai.contentunderstanding.models import MediaContentKind, DocumentContent

content = result.contents[0]
if content.kind == MediaContentKind.DOCUMENT:
    document_content: DocumentContent = content  # type: ignore
    print(document_content.start_page_number)
```

## Analyze Image

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="prebuilt-imageSearch",
    inputs=[AnalyzeInput(url="https://example.com/image.jpg")]
)
result = poller.result()
content = result.contents[0]
print(content.markdown)
```

## Analyze Video

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="prebuilt-videoSearch",
    inputs=[AnalyzeInput(url="https://example.com/video.mp4")]
)

result = poller.result()

# Access video content (AudioVisualContent)
content = result.contents[0]

# Get transcript phrases with timing
for phrase in content.transcript_phrases:
    print(f"[{phrase.start_time} - {phrase.end_time}]: {phrase.text}")

# Get key frames (for video)
for frame in content.key_frames:
    print(f"Frame at {frame.time}: {frame.description}")
```

## Analyze Audio

```python
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="prebuilt-audioSearch",
    inputs=[AnalyzeInput(url="https://example.com/audio.mp3")]
)

result = poller.result()

# Access audio transcript
content = result.contents[0]
for phrase in content.transcript_phrases:
    print(f"[{phrase.start_time}] {phrase.text}")
```

## Custom Analyzers

Create custom analyzers with field schemas for specialized extraction:

```python
# Create custom analyzer
analyzer = client.create_analyzer(
    analyzer_id="my-invoice-analyzer",
    analyzer={
        "description": "Custom invoice analyzer",
        "base_analyzer_id": "prebuilt-documentSearch",
        "field_schema": {
            "fields": {
                "vendor_name": {"type": "string"},
                "invoice_total": {"type": "number"},
                "line_items": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "description": {"type": "string"},
                            "amount": {"type": "number"}
                        }
                    }
                }
            }
        }
    }
)

# Use custom analyzer
from azure.ai.contentunderstanding.models import AnalyzeInput

poller = client.begin_analyze(
    analyzer_id="my-invoice-analyzer",
    inputs=[AnalyzeInput(url="https://example.com/invoice.pdf")]
)

result = poller.result()

# Access extracted fields
print(result.fields["vendor_name"])
print(result.fields["invoice_total"])
```

## Analyzer Management

```python
# List all analyzers
analyzers = client.list_analyzers()
for analyzer in analyzers:
    print(f"{analyzer.analyzer_id}: {analyzer.description}")

# Get specific analyzer
analyzer = client.get_analyzer("prebuilt-documentSearch")

# Delete custom analyzer
client.delete_analyzer("my-custom-analyzer")
```

## Async Client

```python
import asyncio
import os
from azure.ai.contentunderstanding.aio import ContentUnderstandingClient
from azure.ai.contentunderstanding.models import AnalyzeInput
from azure.identity.aio import DefaultAzureCredential

async def analyze_document():
    endpoint = os.environ["CONTENTUNDERSTANDING_ENDPOINT"]
    credential = DefaultAzureCredential()

    async with ContentUnderstandingClient(
        endpoint=endpoint,
        credential=credential
    ) as client:
        poller = await client.begin_analyze(
            analyzer_id="prebuilt-documentSearch",
            inputs=[AnalyzeInput(url="https://example.com/doc.pdf")]
        )
        result = await poller.result()
        content = result.contents[0]
        return content.markdown

asyncio.run(analyze_document())
```

## Content Types

| Class | For | Provides |
|-------|-----|----------|
| `DocumentContent` | PDF, images, Office docs | Pages, tables, figures, paragraphs |
| `AudioVisualContent` | Audio, video files | Transcript phrases, timing, key frames |

Both derive from `MediaContent`, which provides basic info and a markdown representation.

## Model Imports

```python
from azure.ai.contentunderstanding.models import (
    AnalyzeInput,
    AnalyzeResult,
    MediaContentKind,
    DocumentContent,
    AudioVisualContent,
)
```

## Client Types

| Client | Purpose |
|--------|---------|
| `ContentUnderstandingClient` | Sync client for all operations |
| `ContentUnderstandingClient` (aio) | Async client for all operations |

## Best Practices

1. **Use `begin_analyze` with `AnalyzeInput`** — this is the correct method signature
2. **Access results via `result.contents[0]`** — results are returned as a list
3. **Use prebuilt analyzers** for common scenarios (document/image/audio/video search)
4. **Create custom analyzers** only for domain-specific field extraction
5. **Use async client** for high-throughput scenarios with `azure.identity.aio` credentials
6. **Handle long-running operations** — video/audio analysis can take minutes
7. **Use URL sources** when possible to avoid upload overhead

skills/azure-ai-document-intelligence-dotnet/SKILL.md (new file, 337 lines)

---
name: azure-ai-document-intelligence-dotnet
description: |
  Azure AI Document Intelligence SDK for .NET. Extract text, tables, and structured data from documents using prebuilt and custom models. Use for invoice processing, receipt extraction, ID document analysis, and custom document models. Triggers: "Document Intelligence", "DocumentIntelligenceClient", "form recognizer", "invoice extraction", "receipt OCR", "document analysis .NET".
package: Azure.AI.DocumentIntelligence
---

# Azure.AI.DocumentIntelligence (.NET)

Extract text, tables, and structured data from documents using prebuilt and custom models.

## Installation

```bash
dotnet add package Azure.AI.DocumentIntelligence
dotnet add package Azure.Identity
```

**Current Version**: v1.0.0 (GA)

## Environment Variables

```bash
DOCUMENT_INTELLIGENCE_ENDPOINT=https://<resource-name>.cognitiveservices.azure.com/
DOCUMENT_INTELLIGENCE_API_KEY=<your-api-key>
BLOB_CONTAINER_SAS_URL=https://<storage>.blob.core.windows.net/<container>?<sas-token>
```

## Authentication

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.AI.DocumentIntelligence;

string endpoint = Environment.GetEnvironmentVariable("DOCUMENT_INTELLIGENCE_ENDPOINT");
var credential = new DefaultAzureCredential();
var client = new DocumentIntelligenceClient(new Uri(endpoint), credential);
```

> **Note**: Entra ID requires a **custom subdomain** (e.g., `https://<resource-name>.cognitiveservices.azure.com/`), not a regional endpoint.

### API Key

```csharp
using Azure;
using Azure.AI.DocumentIntelligence;

string endpoint = Environment.GetEnvironmentVariable("DOCUMENT_INTELLIGENCE_ENDPOINT");
string apiKey = Environment.GetEnvironmentVariable("DOCUMENT_INTELLIGENCE_API_KEY");
var client = new DocumentIntelligenceClient(new Uri(endpoint), new AzureKeyCredential(apiKey));
```

## Client Types

| Client | Purpose |
|--------|---------|
| `DocumentIntelligenceClient` | Analyze documents, classify documents |
| `DocumentIntelligenceAdministrationClient` | Build/manage custom models and classifiers |

## Prebuilt Models

| Model ID | Description |
|----------|-------------|
| `prebuilt-read` | Extract text, languages, handwriting |
| `prebuilt-layout` | Extract text, tables, selection marks, structure |
| `prebuilt-invoice` | Extract invoice fields (vendor, items, totals) |
| `prebuilt-receipt` | Extract receipt fields (merchant, items, total) |
| `prebuilt-idDocument` | Extract ID document fields (name, DOB, address) |
| `prebuilt-businessCard` | Extract business card fields |
| `prebuilt-tax.us.w2` | Extract W-2 tax form fields |
| `prebuilt-healthInsuranceCard.us` | Extract health insurance card fields |

## Core Workflows

### 1. Analyze Invoice

```csharp
using Azure.AI.DocumentIntelligence;

Uri invoiceUri = new Uri("https://example.com/invoice.pdf");

Operation<AnalyzeResult> operation = await client.AnalyzeDocumentAsync(
    WaitUntil.Completed,
    "prebuilt-invoice",
    invoiceUri);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    if (document.Fields.TryGetValue("VendorName", out DocumentField vendorNameField)
        && vendorNameField.FieldType == DocumentFieldType.String)
    {
        string vendorName = vendorNameField.ValueString;
        Console.WriteLine($"Vendor Name: '{vendorName}', confidence: {vendorNameField.Confidence}");
    }

    if (document.Fields.TryGetValue("InvoiceTotal", out DocumentField invoiceTotalField)
        && invoiceTotalField.FieldType == DocumentFieldType.Currency)
    {
        CurrencyValue invoiceTotal = invoiceTotalField.ValueCurrency;
        Console.WriteLine($"Invoice Total: '{invoiceTotal.CurrencySymbol}{invoiceTotal.Amount}'");
    }

    // Extract line items
    if (document.Fields.TryGetValue("Items", out DocumentField itemsField)
        && itemsField.FieldType == DocumentFieldType.List)
    {
        foreach (DocumentField item in itemsField.ValueList)
        {
            var itemFields = item.ValueDictionary;
            if (itemFields.TryGetValue("Description", out DocumentField descField))
                Console.WriteLine($"  Item: {descField.ValueString}");
        }
    }
}
```

### 2. Extract Layout (Text, Tables, Structure)

```csharp
Uri fileUri = new Uri("https://example.com/document.pdf");

Operation<AnalyzeResult> operation = await client.AnalyzeDocumentAsync(
    WaitUntil.Completed,
    "prebuilt-layout",
    fileUri);

AnalyzeResult result = operation.Value;

// Extract text by page
foreach (DocumentPage page in result.Pages)
{
    Console.WriteLine($"Page {page.PageNumber}: {page.Lines.Count} lines, {page.Words.Count} words");

    foreach (DocumentLine line in page.Lines)
    {
        Console.WriteLine($"  Line: '{line.Content}'");
    }
}

// Extract tables
foreach (DocumentTable table in result.Tables)
{
    Console.WriteLine($"Table: {table.RowCount} rows x {table.ColumnCount} columns");
    foreach (DocumentTableCell cell in table.Cells)
    {
        Console.WriteLine($"  Cell ({cell.RowIndex}, {cell.ColumnIndex}): {cell.Content}");
    }
}
```

### 3. Analyze Receipt

```csharp
Operation<AnalyzeResult> operation = await client.AnalyzeDocumentAsync(
    WaitUntil.Completed,
    "prebuilt-receipt",
    receiptUri);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    if (document.Fields.TryGetValue("MerchantName", out DocumentField merchantField))
        Console.WriteLine($"Merchant: {merchantField.ValueString}");

    if (document.Fields.TryGetValue("Total", out DocumentField totalField))
        Console.WriteLine($"Total: {totalField.ValueCurrency.Amount}");

    if (document.Fields.TryGetValue("TransactionDate", out DocumentField dateField))
        Console.WriteLine($"Date: {dateField.ValueDate}");
}
```

### 4. Build Custom Model

```csharp
var adminClient = new DocumentIntelligenceAdministrationClient(
    new Uri(endpoint),
    new AzureKeyCredential(apiKey));

string modelId = "my-custom-model";
Uri blobContainerUri = new Uri("<blob-container-sas-url>");

var blobSource = new BlobContentSource(blobContainerUri);
var options = new BuildDocumentModelOptions(modelId, DocumentBuildMode.Template, blobSource);

Operation<DocumentModelDetails> operation = await adminClient.BuildDocumentModelAsync(
    WaitUntil.Completed,
    options);

DocumentModelDetails model = operation.Value;

Console.WriteLine($"Model ID: {model.ModelId}");
Console.WriteLine($"Created: {model.CreatedOn}");

foreach (var docType in model.DocumentTypes)
{
    Console.WriteLine($"Document type: {docType.Key}");
    foreach (var field in docType.Value.FieldSchema)
    {
        Console.WriteLine($"  Field: {field.Key}, Confidence: {docType.Value.FieldConfidence[field.Key]}");
    }
}
```

### 5. Build Document Classifier

```csharp
string classifierId = "my-classifier";
Uri blobContainerUri = new Uri("<blob-container-sas-url>");

var sourceA = new BlobContentSource(blobContainerUri) { Prefix = "TypeA/train" };
var sourceB = new BlobContentSource(blobContainerUri) { Prefix = "TypeB/train" };

var docTypes = new Dictionary<string, ClassifierDocumentTypeDetails>()
{
    { "TypeA", new ClassifierDocumentTypeDetails(sourceA) },
    { "TypeB", new ClassifierDocumentTypeDetails(sourceB) }
};

var options = new BuildClassifierOptions(classifierId, docTypes);

Operation<DocumentClassifierDetails> operation = await adminClient.BuildClassifierAsync(
    WaitUntil.Completed,
    options);

DocumentClassifierDetails classifier = operation.Value;
Console.WriteLine($"Classifier ID: {classifier.ClassifierId}");
```

### 6. Classify Document

```csharp
string classifierId = "my-classifier";
Uri documentUri = new Uri("https://example.com/document.pdf");

var options = new ClassifyDocumentOptions(classifierId, documentUri);

Operation<AnalyzeResult> operation = await client.ClassifyDocumentAsync(
    WaitUntil.Completed,
    options);

AnalyzeResult result = operation.Value;

foreach (AnalyzedDocument document in result.Documents)
{
    Console.WriteLine($"Document type: {document.DocumentType}, confidence: {document.Confidence}");
}
```

### 7. Manage Models

```csharp
// Get resource details
DocumentIntelligenceResourceDetails resourceDetails = await adminClient.GetResourceDetailsAsync();
Console.WriteLine($"Custom models: {resourceDetails.CustomDocumentModels.Count}/{resourceDetails.CustomDocumentModels.Limit}");

// Get specific model
DocumentModelDetails model = await adminClient.GetModelAsync("my-model-id");
Console.WriteLine($"Model: {model.ModelId}, Created: {model.CreatedOn}");

// List models
await foreach (DocumentModelDetails modelItem in adminClient.GetModelsAsync())
{
    Console.WriteLine($"Model: {modelItem.ModelId}");
}

// Delete model
await adminClient.DeleteModelAsync("my-model-id");
```

## Key Types Reference

| Type | Description |
|------|-------------|
| `DocumentIntelligenceClient` | Main client for analysis |
| `DocumentIntelligenceAdministrationClient` | Model management |
| `AnalyzeResult` | Result of document analysis |
| `AnalyzedDocument` | Single document within result |
| `DocumentField` | Extracted field with value and confidence |
| `DocumentFieldType` | String, Date, Number, Currency, etc. |
| `DocumentPage` | Page info (lines, words, selection marks) |
| `DocumentTable` | Extracted table with cells |
| `DocumentModelDetails` | Custom model metadata |
| `BlobContentSource` | Training data source |

## Build Modes

| Mode | Use Case |
|------|----------|
| `DocumentBuildMode.Template` | Fixed layout documents (forms) |
| `DocumentBuildMode.Neural` | Variable layout documents |

## Best Practices

1. **Use DefaultAzureCredential** for production
2. **Reuse client instances** — clients are thread-safe
3. **Handle long-running operations** — Use `WaitUntil.Completed` for simplicity
4. **Check field confidence** — Always verify `Confidence` property
5. **Use appropriate model** — Prebuilt for common docs, custom for specialized
6. **Use custom subdomain** — Required for Entra ID authentication

## Error Handling

```csharp
using Azure;

try
{
    var operation = await client.AnalyzeDocumentAsync(
        WaitUntil.Completed,
        "prebuilt-invoice",
        documentUri);
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Error: {ex.Status} - {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.DocumentIntelligence` | Document analysis (this SDK) | `dotnet add package Azure.AI.DocumentIntelligence` |
| `Azure.AI.FormRecognizer` | Legacy SDK (deprecated) | Use DocumentIntelligence instead |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.DocumentIntelligence |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.documentintelligence |
| GitHub Samples | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/documentintelligence/Azure.AI.DocumentIntelligence/samples |
| Document Intelligence Studio | https://documentintelligence.ai.azure.com/ |
| Prebuilt Models | https://aka.ms/azsdk/formrecognizer/models |

skills/azure-ai-document-intelligence-ts/SKILL.md (new file, 323 lines)

|
||||
---
|
||||
name: azure-ai-document-intelligence-ts
|
||||
description: Extract text, tables, and structured data from documents using Azure Document Intelligence (@azure-rest/ai-document-intelligence). Use when processing invoices, receipts, IDs, forms, or building custom document models.
|
||||
package: @azure-rest/ai-document-intelligence
|
||||
---
|
||||
|
||||
# Azure Document Intelligence REST SDK for TypeScript
|
||||
|
||||
Extract text, tables, and structured data from documents using prebuilt and custom models.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
npm install @azure-rest/ai-document-intelligence @azure/identity
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
DOCUMENT_INTELLIGENCE_ENDPOINT=https://<resource>.cognitiveservices.azure.com
|
||||
DOCUMENT_INTELLIGENCE_API_KEY=<api-key>
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
**Important**: This is a REST client. `DocumentIntelligence` is a **function**, not a class.
|
||||
|
||||
### DefaultAzureCredential
|
||||
|
||||
```typescript
|
||||
import DocumentIntelligence from "@azure-rest/ai-document-intelligence";
|
||||
import { DefaultAzureCredential } from "@azure/identity";
|
||||
|
||||
const client = DocumentIntelligence(
|
||||
process.env.DOCUMENT_INTELLIGENCE_ENDPOINT!,
|
||||
new DefaultAzureCredential()
|
||||
);
|
||||
```
|
||||
|
||||
### API Key
|
||||
|
||||
```typescript
|
||||
import DocumentIntelligence from "@azure-rest/ai-document-intelligence";
|
||||
|
||||
const client = DocumentIntelligence(
|
||||
process.env.DOCUMENT_INTELLIGENCE_ENDPOINT!,
|
||||
{ key: process.env.DOCUMENT_INTELLIGENCE_API_KEY! }
|
||||
);
|
||||
```
|
||||
|
||||
## Analyze Document (URL)
|
||||
|
||||
```typescript
|
||||
import DocumentIntelligence, {
|
||||
isUnexpected,
|
||||
getLongRunningPoller,
|
||||
AnalyzeOperationOutput
|
||||
} from "@azure-rest/ai-document-intelligence";
|
||||
|
||||
const initialResponse = await client
|
||||
.path("/documentModels/{modelId}:analyze", "prebuilt-layout")
|
||||
.post({
|
||||
contentType: "application/json",
|
||||
body: {
|
||||
urlSource: "https://example.com/document.pdf"
|
||||
},
|
||||
queryParameters: { locale: "en-US" }
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = (await poller.pollUntilDone()).body as AnalyzeOperationOutput;
|
||||
|
||||
console.log("Pages:", result.analyzeResult?.pages?.length);
|
||||
console.log("Tables:", result.analyzeResult?.tables?.length);
|
||||
```
|
||||
|
||||
## Analyze Document (Local File)
|
||||
|
||||
```typescript
|
||||
import { readFile } from "node:fs/promises";
|
||||
|
||||
const fileBuffer = await readFile("./document.pdf");
|
||||
const base64Source = fileBuffer.toString("base64");
|
||||
|
||||
const initialResponse = await client
|
||||
.path("/documentModels/{modelId}:analyze", "prebuilt-invoice")
|
||||
.post({
|
||||
contentType: "application/json",
|
||||
body: { base64Source }
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = (await poller.pollUntilDone()).body as AnalyzeOperationOutput;
|
||||
```
|
||||
|
||||
## Prebuilt Models
|
||||
|
||||
| Model ID | Description |
|
||||
|----------|-------------|
|
||||
| `prebuilt-read` | OCR - text and language extraction |
|
||||
| `prebuilt-layout` | Text, tables, selection marks, structure |
|
||||
| `prebuilt-invoice` | Invoice fields |
|
||||
| `prebuilt-receipt` | Receipt fields |
|
||||
| `prebuilt-idDocument` | ID document fields |
|
||||
| `prebuilt-tax.us.w2` | W-2 tax form fields |
|
||||
| `prebuilt-healthInsuranceCard.us` | Health insurance card fields |
|
||||
| `prebuilt-contract` | Contract fields |
|
||||
| `prebuilt-bankStatement.us` | Bank statement fields |
|
||||
|
||||
## Extract Invoice Fields
|
||||
|
||||
```typescript
|
||||
const initialResponse = await client
|
||||
.path("/documentModels/{modelId}:analyze", "prebuilt-invoice")
|
||||
.post({
|
||||
contentType: "application/json",
|
||||
body: { urlSource: invoiceUrl }
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = (await poller.pollUntilDone()).body as AnalyzeOperationOutput;
|
||||
|
||||
const invoice = result.analyzeResult?.documents?.[0];
|
||||
if (invoice) {
|
||||
console.log("Vendor:", invoice.fields?.VendorName?.content);
|
||||
console.log("Total:", invoice.fields?.InvoiceTotal?.content);
|
||||
console.log("Due Date:", invoice.fields?.DueDate?.content);
|
||||
}
|
||||
```
|
||||
|
||||
## Extract Receipt Fields
|
||||
|
||||
```typescript
|
||||
const initialResponse = await client
|
||||
.path("/documentModels/{modelId}:analyze", "prebuilt-receipt")
|
||||
.post({
|
||||
contentType: "application/json",
|
||||
body: { urlSource: receiptUrl }
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
  throw initialResponse.body.error;
}

const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = (await poller.pollUntilDone()).body as AnalyzeOperationOutput;
|
||||
|
||||
const receipt = result.analyzeResult?.documents?.[0];
|
||||
if (receipt) {
|
||||
console.log("Merchant:", receipt.fields?.MerchantName?.content);
|
||||
console.log("Total:", receipt.fields?.Total?.content);
|
||||
|
||||
  for (const item of receipt.fields?.Items?.valueArray || []) {
    console.log("Item:", item.valueObject?.Description?.content);
    console.log("Price:", item.valueObject?.TotalPrice?.content);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## List Document Models
|
||||
|
||||
```typescript
|
||||
import DocumentIntelligence, { isUnexpected, paginate } from "@azure-rest/ai-document-intelligence";
|
||||
|
||||
const response = await client.path("/documentModels").get();
|
||||
|
||||
if (isUnexpected(response)) {
|
||||
throw response.body.error;
|
||||
}
|
||||
|
||||
for await (const model of paginate(client, response)) {
|
||||
console.log(model.modelId);
|
||||
}
|
||||
```
|
||||
|
||||
## Build Custom Model
|
||||
|
||||
```typescript
|
||||
const initialResponse = await client.path("/documentModels:build").post({
|
||||
body: {
|
||||
modelId: "my-custom-model",
|
||||
description: "Custom model for purchase orders",
|
||||
buildMode: "template", // or "neural"
|
||||
azureBlobSource: {
|
||||
containerUrl: process.env.TRAINING_CONTAINER_SAS_URL!,
|
||||
prefix: "training-data/"
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = await poller.pollUntilDone();
|
||||
console.log("Model built:", result.body);
|
||||
```
|
||||
|
||||
## Build Document Classifier
|
||||
|
||||
```typescript
|
||||
import { DocumentClassifierBuildOperationDetailsOutput } from "@azure-rest/ai-document-intelligence";
|
||||
|
||||
const containerSasUrl = process.env.TRAINING_CONTAINER_SAS_URL!;
|
||||
|
||||
const initialResponse = await client.path("/documentClassifiers:build").post({
|
||||
body: {
|
||||
classifierId: "my-classifier",
|
||||
description: "Invoice vs Receipt classifier",
|
||||
docTypes: {
|
||||
invoices: {
|
||||
azureBlobSource: { containerUrl: containerSasUrl, prefix: "invoices/" }
|
||||
},
|
||||
receipts: {
|
||||
azureBlobSource: { containerUrl: containerSasUrl, prefix: "receipts/" }
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = (await poller.pollUntilDone()).body as DocumentClassifierBuildOperationDetailsOutput;
|
||||
console.log("Classifier:", result.result?.classifierId);
|
||||
```
|
||||
|
||||
## Classify Document
|
||||
|
||||
```typescript
|
||||
const initialResponse = await client
|
||||
.path("/documentClassifiers/{classifierId}:analyze", "my-classifier")
|
||||
.post({
|
||||
contentType: "application/json",
|
||||
body: { urlSource: documentUrl },
|
||||
queryParameters: { split: "auto" }
|
||||
});
|
||||
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
const result = await poller.pollUntilDone();
|
||||
console.log("Classification:", result.body.analyzeResult?.documents);
|
||||
```
|
||||
|
||||
## Get Service Info
|
||||
|
||||
```typescript
|
||||
const response = await client.path("/info").get();
|
||||
|
||||
if (isUnexpected(response)) {
|
||||
throw response.body.error;
|
||||
}
|
||||
|
||||
console.log("Custom model limit:", response.body.customDocumentModels.limit);
|
||||
console.log("Custom model count:", response.body.customDocumentModels.count);
|
||||
```
|
||||
|
||||
## Polling Pattern
|
||||
|
||||
```typescript
|
||||
import DocumentIntelligence, {
|
||||
isUnexpected,
|
||||
getLongRunningPoller,
|
||||
AnalyzeOperationOutput
|
||||
} from "@azure-rest/ai-document-intelligence";
|
||||
|
||||
// 1. Start operation
|
||||
const initialResponse = await client
|
||||
.path("/documentModels/{modelId}:analyze", "prebuilt-layout")
|
||||
.post({ contentType: "application/json", body: { urlSource } });
|
||||
|
||||
// 2. Check for errors
|
||||
if (isUnexpected(initialResponse)) {
|
||||
throw initialResponse.body.error;
|
||||
}
|
||||
|
||||
// 3. Create poller
|
||||
const poller = getLongRunningPoller(client, initialResponse);
|
||||
|
||||
// 4. Optional: Monitor progress
|
||||
poller.onProgress((state) => {
|
||||
console.log("Status:", state.status);
|
||||
});
|
||||
|
||||
// 5. Wait for completion
|
||||
const result = (await poller.pollUntilDone()).body as AnalyzeOperationOutput;
|
||||
```
|
||||
|
||||
## Key Types
|
||||
|
||||
```typescript
|
||||
import DocumentIntelligence, {
|
||||
isUnexpected,
|
||||
getLongRunningPoller,
|
||||
paginate,
|
||||
parseResultIdFromResponse,
|
||||
AnalyzeOperationOutput,
|
||||
DocumentClassifierBuildOperationDetailsOutput
|
||||
} from "@azure-rest/ai-document-intelligence";
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Use `getLongRunningPoller()`** - Document analysis is asynchronous; always poll for results
2. **Check `isUnexpected()`** - Use this type guard for proper error handling
3. **Choose the right model** - Prefer prebuilt models when possible; use custom models for specialized documents
4. **Handle confidence scores** - Fields carry confidence values; set thresholds suited to your use case
5. **Use pagination** - Use the `paginate()` helper when listing models
6. **Prefer neural mode** - For custom models, neural build mode handles more layout variation than template
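Best practice 4 can be sketched as a small post-processing step. The field shape below mirrors the analyzer output (`content` plus `confidence`); the `fieldsAboveThreshold` helper and the 0.8 default threshold are illustrative assumptions, not part of the SDK.

```typescript
// Minimal sketch: keep only extracted fields whose confidence meets a threshold.
interface ExtractedField {
  content?: string;
  confidence?: number;
}

function fieldsAboveThreshold(
  fields: Record<string, ExtractedField>,
  threshold = 0.8
): Record<string, string> {
  const accepted: Record<string, string> = {};
  for (const [name, field] of Object.entries(fields)) {
    // Drop fields with missing content or confidence below the threshold
    if (field.content !== undefined && (field.confidence ?? 0) >= threshold) {
      accepted[name] = field.content;
    }
  }
  return accepted;
}

// Example with mock analyzer output: InvoiceTotal (0.52) is dropped
const accepted = fieldsAboveThreshold({
  VendorName: { content: "Contoso", confidence: 0.97 },
  InvoiceTotal: { content: "$110.00", confidence: 0.52 },
});
console.log(accepted);
```

Tune the threshold per field and per document type; a total amount usually warrants a stricter cutoff than a free-text description.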
|
||||
skills/azure-ai-formrecognizer-java/SKILL.md (new file, 341 lines)
@@ -0,0 +1,341 @@
|
||||
---
|
||||
name: azure-ai-formrecognizer-java
|
||||
description: Build document analysis applications with Azure Document Intelligence (Form Recognizer) SDK for Java. Use when extracting text, tables, key-value pairs from documents, receipts, invoices, or building custom document models.
|
||||
package: com.azure:azure-ai-formrecognizer
|
||||
---
|
||||
|
||||
# Azure Document Intelligence (Form Recognizer) SDK for Java
|
||||
|
||||
Build document analysis applications using the Azure AI Document Intelligence SDK for Java.
|
||||
|
||||
## Installation
|
||||
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>com.azure</groupId>
|
||||
<artifactId>azure-ai-formrecognizer</artifactId>
|
||||
<version>4.2.0-beta.1</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
## Client Creation
|
||||
|
||||
### DocumentAnalysisClient
|
||||
|
||||
```java
|
||||
import com.azure.ai.formrecognizer.documentanalysis.DocumentAnalysisClient;
|
||||
import com.azure.ai.formrecognizer.documentanalysis.DocumentAnalysisClientBuilder;
|
||||
import com.azure.core.credential.AzureKeyCredential;
|
||||
|
||||
DocumentAnalysisClient client = new DocumentAnalysisClientBuilder()
|
||||
.credential(new AzureKeyCredential("{key}"))
|
||||
.endpoint("{endpoint}")
|
||||
.buildClient();
|
||||
```
|
||||
|
||||
### DocumentModelAdministrationClient
|
||||
|
||||
```java
|
||||
import com.azure.ai.formrecognizer.documentanalysis.administration.DocumentModelAdministrationClient;
|
||||
import com.azure.ai.formrecognizer.documentanalysis.administration.DocumentModelAdministrationClientBuilder;
|
||||
|
||||
DocumentModelAdministrationClient adminClient = new DocumentModelAdministrationClientBuilder()
|
||||
.credential(new AzureKeyCredential("{key}"))
|
||||
.endpoint("{endpoint}")
|
||||
.buildClient();
|
||||
```
|
||||
|
||||
### With DefaultAzureCredential
|
||||
|
||||
```java
|
||||
import com.azure.identity.DefaultAzureCredentialBuilder;
|
||||
|
||||
DocumentAnalysisClient client = new DocumentAnalysisClientBuilder()
|
||||
.endpoint("{endpoint}")
|
||||
.credential(new DefaultAzureCredentialBuilder().build())
|
||||
.buildClient();
|
||||
```
|
||||
|
||||
## Prebuilt Models
|
||||
|
||||
| Model ID | Purpose |
|
||||
|----------|---------|
|
||||
| `prebuilt-layout` | Extract text, tables, selection marks |
|
||||
| `prebuilt-document` | General document with key-value pairs |
|
||||
| `prebuilt-receipt` | Receipt data extraction |
|
||||
| `prebuilt-invoice` | Invoice field extraction |
|
||||
| `prebuilt-businessCard` | Business card parsing |
|
||||
| `prebuilt-idDocument` | ID document (passport, license) |
|
||||
| `prebuilt-tax.us.w2` | US W2 tax forms |
|
||||
|
||||
## Core Patterns
|
||||
|
||||
### Extract Layout
|
||||
|
||||
```java
|
||||
import com.azure.ai.formrecognizer.documentanalysis.models.*;
|
||||
import com.azure.core.util.BinaryData;
|
||||
import com.azure.core.util.polling.SyncPoller;
|
||||
import java.io.File;
|
||||
|
||||
File document = new File("document.pdf");
|
||||
BinaryData documentData = BinaryData.fromFile(document.toPath());
|
||||
|
||||
SyncPoller<OperationResult, AnalyzeResult> poller =
|
||||
client.beginAnalyzeDocument("prebuilt-layout", documentData);
|
||||
|
||||
AnalyzeResult result = poller.getFinalResult();
|
||||
|
||||
// Process pages
|
||||
for (DocumentPage page : result.getPages()) {
|
||||
System.out.printf("Page %d: %.2f x %.2f %s%n",
|
||||
page.getPageNumber(),
|
||||
page.getWidth(),
|
||||
page.getHeight(),
|
||||
page.getUnit());
|
||||
|
||||
// Lines
|
||||
for (DocumentLine line : page.getLines()) {
|
||||
System.out.println("Line: " + line.getContent());
|
||||
}
|
||||
|
||||
// Selection marks (checkboxes)
|
||||
for (DocumentSelectionMark mark : page.getSelectionMarks()) {
|
||||
System.out.printf("Checkbox: %s (confidence: %.2f)%n",
|
||||
mark.getSelectionMarkState(),
|
||||
mark.getConfidence());
|
||||
}
|
||||
}
|
||||
|
||||
// Tables
|
||||
for (DocumentTable table : result.getTables()) {
|
||||
System.out.printf("Table: %d rows x %d columns%n",
|
||||
table.getRowCount(),
|
||||
table.getColumnCount());
|
||||
|
||||
for (DocumentTableCell cell : table.getCells()) {
|
||||
System.out.printf("Cell[%d,%d]: %s%n",
|
||||
cell.getRowIndex(),
|
||||
cell.getColumnIndex(),
|
||||
cell.getContent());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Analyze from URL
|
||||
|
||||
```java
|
||||
String documentUrl = "https://example.com/invoice.pdf";
|
||||
|
||||
SyncPoller<OperationResult, AnalyzeResult> poller =
|
||||
client.beginAnalyzeDocumentFromUrl("prebuilt-invoice", documentUrl);
|
||||
|
||||
AnalyzeResult result = poller.getFinalResult();
|
||||
```
|
||||
|
||||
### Analyze Receipt
|
||||
|
||||
```java
|
||||
SyncPoller<OperationResult, AnalyzeResult> poller =
|
||||
client.beginAnalyzeDocumentFromUrl("prebuilt-receipt", receiptUrl);
|
||||
|
||||
AnalyzeResult result = poller.getFinalResult();
|
||||
|
||||
for (AnalyzedDocument doc : result.getDocuments()) {
|
||||
Map<String, DocumentField> fields = doc.getFields();
|
||||
|
||||
DocumentField merchantName = fields.get("MerchantName");
|
||||
if (merchantName != null && merchantName.getType() == DocumentFieldType.STRING) {
|
||||
System.out.printf("Merchant: %s (confidence: %.2f)%n",
|
||||
merchantName.getValueAsString(),
|
||||
merchantName.getConfidence());
|
||||
}
|
||||
|
||||
DocumentField transactionDate = fields.get("TransactionDate");
|
||||
if (transactionDate != null && transactionDate.getType() == DocumentFieldType.DATE) {
|
||||
System.out.printf("Date: %s%n", transactionDate.getValueAsDate());
|
||||
}
|
||||
|
||||
DocumentField items = fields.get("Items");
|
||||
if (items != null && items.getType() == DocumentFieldType.LIST) {
|
||||
for (DocumentField item : items.getValueAsList()) {
|
||||
Map<String, DocumentField> itemFields = item.getValueAsMap();
|
||||
System.out.printf("Item: %s, Price: %.2f%n",
|
||||
itemFields.get("Name").getValueAsString(),
|
||||
itemFields.get("Price").getValueAsDouble());
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### General Document Analysis
|
||||
|
||||
```java
|
||||
SyncPoller<OperationResult, AnalyzeResult> poller =
|
||||
client.beginAnalyzeDocumentFromUrl("prebuilt-document", documentUrl);
|
||||
|
||||
AnalyzeResult result = poller.getFinalResult();
|
||||
|
||||
// Key-value pairs
|
||||
for (DocumentKeyValuePair kvp : result.getKeyValuePairs()) {
|
||||
System.out.printf("Key: %s => Value: %s%n",
|
||||
kvp.getKey().getContent(),
|
||||
kvp.getValue() != null ? kvp.getValue().getContent() : "null");
|
||||
}
|
||||
```
|
||||
|
||||
## Custom Models
|
||||
|
||||
### Build Custom Model
|
||||
|
||||
```java
|
||||
import com.azure.ai.formrecognizer.documentanalysis.administration.models.*;
import com.azure.core.util.Context;
|
||||
|
||||
String blobContainerUrl = "{SAS_URL_of_training_data}";
|
||||
String prefix = "training-docs/";
|
||||
|
||||
SyncPoller<OperationResult, DocumentModelDetails> poller = adminClient.beginBuildDocumentModel(
|
||||
blobContainerUrl,
|
||||
DocumentModelBuildMode.TEMPLATE,
|
||||
prefix,
|
||||
new BuildDocumentModelOptions()
|
||||
.setModelId("my-custom-model")
|
||||
.setDescription("Custom invoice model"),
|
||||
Context.NONE);
|
||||
|
||||
DocumentModelDetails model = poller.getFinalResult();
|
||||
|
||||
System.out.println("Model ID: " + model.getModelId());
|
||||
System.out.println("Created: " + model.getCreatedOn());
|
||||
|
||||
model.getDocumentTypes().forEach((docType, details) -> {
|
||||
System.out.println("Document type: " + docType);
|
||||
details.getFieldSchema().forEach((field, schema) -> {
|
||||
System.out.printf(" Field: %s (%s)%n", field, schema.getType());
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
### Analyze with Custom Model
|
||||
|
||||
```java
|
||||
SyncPoller<OperationResult, AnalyzeResult> poller =
|
||||
client.beginAnalyzeDocumentFromUrl("my-custom-model", documentUrl);
|
||||
|
||||
AnalyzeResult result = poller.getFinalResult();
|
||||
|
||||
for (AnalyzedDocument doc : result.getDocuments()) {
|
||||
System.out.printf("Document type: %s (confidence: %.2f)%n",
|
||||
doc.getDocType(),
|
||||
doc.getConfidence());
|
||||
|
||||
doc.getFields().forEach((name, field) -> {
|
||||
System.out.printf("Field '%s': %s (confidence: %.2f)%n",
|
||||
name,
|
||||
field.getContent(),
|
||||
field.getConfidence());
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
### Compose Models
|
||||
|
||||
```java
|
||||
List<String> modelIds = Arrays.asList("model-1", "model-2", "model-3");
|
||||
|
||||
SyncPoller<OperationResult, DocumentModelDetails> poller =
|
||||
adminClient.beginComposeDocumentModel(
|
||||
modelIds,
|
||||
new ComposeDocumentModelOptions()
|
||||
.setModelId("composed-model")
|
||||
.setDescription("Composed from multiple models"));
|
||||
|
||||
DocumentModelDetails composedModel = poller.getFinalResult();
|
||||
```
|
||||
|
||||
### Manage Models
|
||||
|
||||
```java
|
||||
// List models
|
||||
PagedIterable<DocumentModelSummary> models = adminClient.listDocumentModels();
|
||||
for (DocumentModelSummary summary : models) {
|
||||
System.out.printf("Model: %s, Created: %s%n",
|
||||
summary.getModelId(),
|
||||
summary.getCreatedOn());
|
||||
}
|
||||
|
||||
// Get model details
|
||||
DocumentModelDetails model = adminClient.getDocumentModel("model-id");
|
||||
|
||||
// Delete model
|
||||
adminClient.deleteDocumentModel("model-id");
|
||||
|
||||
// Check resource limits
|
||||
ResourceDetails resources = adminClient.getResourceDetails();
|
||||
System.out.printf("Models: %d / %d%n",
|
||||
resources.getCustomDocumentModelCount(),
|
||||
resources.getCustomDocumentModelLimit());
|
||||
```
|
||||
|
||||
## Document Classification
|
||||
|
||||
### Build Classifier
|
||||
|
||||
```java
|
||||
Map<String, ClassifierDocumentTypeDetails> docTypes = new HashMap<>();
|
||||
docTypes.put("invoice", new ClassifierDocumentTypeDetails()
|
||||
.setAzureBlobSource(new AzureBlobContentSource(containerUrl).setPrefix("invoices/")));
|
||||
docTypes.put("receipt", new ClassifierDocumentTypeDetails()
|
||||
.setAzureBlobSource(new AzureBlobContentSource(containerUrl).setPrefix("receipts/")));
|
||||
|
||||
SyncPoller<OperationResult, DocumentClassifierDetails> poller =
|
||||
adminClient.beginBuildDocumentClassifier(docTypes,
|
||||
new BuildDocumentClassifierOptions().setClassifierId("my-classifier"));
|
||||
|
||||
DocumentClassifierDetails classifier = poller.getFinalResult();
|
||||
```
|
||||
|
||||
### Classify Document
|
||||
|
||||
```java
|
||||
SyncPoller<OperationResult, AnalyzeResult> poller =
|
||||
client.beginClassifyDocumentFromUrl("my-classifier", documentUrl, Context.NONE);
|
||||
|
||||
AnalyzeResult result = poller.getFinalResult();
|
||||
|
||||
for (AnalyzedDocument doc : result.getDocuments()) {
|
||||
System.out.printf("Classified as: %s (confidence: %.2f)%n",
|
||||
doc.getDocType(),
|
||||
doc.getConfidence());
|
||||
}
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
```java
|
||||
import com.azure.core.exception.HttpResponseException;
|
||||
|
||||
try {
|
||||
client.beginAnalyzeDocumentFromUrl("prebuilt-receipt", "invalid-url");
|
||||
} catch (HttpResponseException e) {
|
||||
System.out.println("Status: " + e.getResponse().getStatusCode());
|
||||
System.out.println("Error: " + e.getMessage());
|
||||
}
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
FORM_RECOGNIZER_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
|
||||
FORM_RECOGNIZER_KEY=<your-api-key>
|
||||
```
|
||||
|
||||
## Trigger Phrases
|
||||
|
||||
- "document intelligence Java"
|
||||
- "form recognizer SDK"
|
||||
- "extract text from PDF"
|
||||
- "OCR document Java"
|
||||
- "analyze invoice receipt"
|
||||
- "custom document model"
|
||||
- "document classification"
|
||||
skills/azure-ai-ml-py/SKILL.md (new file, 271 lines)
@@ -0,0 +1,271 @@
|
||||
---
|
||||
name: azure-ai-ml-py
|
||||
description: |
|
||||
Azure Machine Learning SDK v2 for Python. Use for ML workspaces, jobs, models, datasets, compute, and pipelines.
|
||||
Triggers: "azure-ai-ml", "MLClient", "workspace", "model registry", "training jobs", "datasets".
|
||||
package: azure-ai-ml
|
||||
---
|
||||
|
||||
# Azure Machine Learning SDK v2 for Python
|
||||
|
||||
Client library for managing Azure ML resources: workspaces, jobs, models, data, and compute.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
pip install azure-ai-ml
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
|
||||
AZURE_RESOURCE_GROUP=<your-resource-group>
|
||||
AZURE_ML_WORKSPACE_NAME=<your-workspace-name>
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
```python
|
||||
import os

from azure.ai.ml import MLClient
|
||||
from azure.identity import DefaultAzureCredential
|
||||
|
||||
ml_client = MLClient(
|
||||
credential=DefaultAzureCredential(),
|
||||
subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"],
|
||||
resource_group_name=os.environ["AZURE_RESOURCE_GROUP"],
|
||||
workspace_name=os.environ["AZURE_ML_WORKSPACE_NAME"]
|
||||
)
|
||||
```
|
||||
|
||||
### From Config File
|
||||
|
||||
```python
|
||||
from azure.ai.ml import MLClient
|
||||
from azure.identity import DefaultAzureCredential
|
||||
|
||||
# Uses config.json in current directory or parent
|
||||
ml_client = MLClient.from_config(
|
||||
credential=DefaultAzureCredential()
|
||||
)
|
||||
```
|
||||
|
||||
## Workspace Management
|
||||
|
||||
### Create Workspace
|
||||
|
||||
```python
|
||||
from azure.ai.ml.entities import Workspace
|
||||
|
||||
ws = Workspace(
|
||||
name="my-workspace",
|
||||
location="eastus",
|
||||
display_name="My Workspace",
|
||||
description="ML workspace for experiments",
|
||||
tags={"purpose": "demo"}
|
||||
)
|
||||
|
||||
ml_client.workspaces.begin_create(ws).result()
|
||||
```
|
||||
|
||||
### List Workspaces
|
||||
|
||||
```python
|
||||
for ws in ml_client.workspaces.list():
|
||||
print(f"{ws.name}: {ws.location}")
|
||||
```
|
||||
|
||||
## Data Assets
|
||||
|
||||
### Register Data
|
||||
|
||||
```python
|
||||
from azure.ai.ml.entities import Data
|
||||
from azure.ai.ml.constants import AssetTypes
|
||||
|
||||
# Register a file
|
||||
my_data = Data(
|
||||
name="my-dataset",
|
||||
version="1",
|
||||
path="azureml://datastores/workspaceblobstore/paths/data/train.csv",
|
||||
type=AssetTypes.URI_FILE,
|
||||
description="Training data"
|
||||
)
|
||||
|
||||
ml_client.data.create_or_update(my_data)
|
||||
```
|
||||
|
||||
### Register Folder
|
||||
|
||||
```python
|
||||
my_data = Data(
|
||||
name="my-folder-dataset",
|
||||
version="1",
|
||||
path="azureml://datastores/workspaceblobstore/paths/data/",
|
||||
type=AssetTypes.URI_FOLDER
|
||||
)
|
||||
|
||||
ml_client.data.create_or_update(my_data)
|
||||
```
|
||||
|
||||
## Model Registry
|
||||
|
||||
### Register Model
|
||||
|
||||
```python
|
||||
from azure.ai.ml.entities import Model
|
||||
from azure.ai.ml.constants import AssetTypes
|
||||
|
||||
model = Model(
|
||||
name="my-model",
|
||||
version="1",
|
||||
path="./model/",
|
||||
type=AssetTypes.CUSTOM_MODEL,
|
||||
description="My trained model"
|
||||
)
|
||||
|
||||
ml_client.models.create_or_update(model)
|
||||
```
|
||||
|
||||
### List Models
|
||||
|
||||
```python
|
||||
for model in ml_client.models.list(name="my-model"):
|
||||
print(f"{model.name} v{model.version}")
|
||||
```
|
||||
|
||||
## Compute
|
||||
|
||||
### Create Compute Cluster
|
||||
|
||||
```python
|
||||
from azure.ai.ml.entities import AmlCompute
|
||||
|
||||
cluster = AmlCompute(
|
||||
name="cpu-cluster",
|
||||
type="amlcompute",
|
||||
size="Standard_DS3_v2",
|
||||
min_instances=0,
|
||||
max_instances=4,
|
||||
idle_time_before_scale_down=120
|
||||
)
|
||||
|
||||
ml_client.compute.begin_create_or_update(cluster).result()
|
||||
```
|
||||
|
||||
### List Compute
|
||||
|
||||
```python
|
||||
for compute in ml_client.compute.list():
|
||||
print(f"{compute.name}: {compute.type}")
|
||||
```
|
||||
|
||||
## Jobs
|
||||
|
||||
### Command Job
|
||||
|
||||
```python
|
||||
from azure.ai.ml import command, Input
|
||||
|
||||
job = command(
|
||||
code="./src",
|
||||
command="python train.py --data ${{inputs.data}} --lr ${{inputs.learning_rate}}",
|
||||
inputs={
|
||||
"data": Input(type="uri_folder", path="azureml:my-dataset:1"),
|
||||
"learning_rate": 0.01
|
||||
},
|
||||
environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
|
||||
compute="cpu-cluster",
|
||||
display_name="training-job"
|
||||
)
|
||||
|
||||
returned_job = ml_client.jobs.create_or_update(job)
|
||||
print(f"Job URL: {returned_job.studio_url}")
|
||||
```
|
||||
|
||||
### Monitor Job
|
||||
|
||||
```python
|
||||
ml_client.jobs.stream(returned_job.name)
|
||||
```
|
||||
|
||||
## Pipelines
|
||||
|
||||
```python
|
||||
from azure.ai.ml import dsl, Input

# prep_component and train_component are assumed to be components
# defined or loaded earlier (e.g. via load_component)
|
||||
|
||||
@dsl.pipeline(
|
||||
compute="cpu-cluster",
|
||||
description="Training pipeline"
|
||||
)
|
||||
def training_pipeline(data_input):
|
||||
prep_step = prep_component(data=data_input)
|
||||
train_step = train_component(
|
||||
data=prep_step.outputs.output_data,
|
||||
learning_rate=0.01
|
||||
)
|
||||
return {"model": train_step.outputs.model}
|
||||
|
||||
pipeline = training_pipeline(
|
||||
data_input=Input(type="uri_folder", path="azureml:my-dataset:1")
|
||||
)
|
||||
|
||||
pipeline_job = ml_client.jobs.create_or_update(pipeline)
|
||||
```
|
||||
|
||||
## Environments
|
||||
|
||||
### Create Custom Environment
|
||||
|
||||
```python
|
||||
from azure.ai.ml.entities import Environment
|
||||
|
||||
env = Environment(
|
||||
name="my-env",
|
||||
version="1",
|
||||
image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
|
||||
conda_file="./environment.yml"
|
||||
)
|
||||
|
||||
ml_client.environments.create_or_update(env)
|
||||
```
|
||||
|
||||
## Datastores
|
||||
|
||||
### List Datastores
|
||||
|
||||
```python
|
||||
for ds in ml_client.datastores.list():
|
||||
print(f"{ds.name}: {ds.type}")
|
||||
```
|
||||
|
||||
### Get Default Datastore
|
||||
|
||||
```python
|
||||
default_ds = ml_client.datastores.get_default()
|
||||
print(f"Default: {default_ds.name}")
|
||||
```
|
||||
|
||||
## MLClient Operations
|
||||
|
||||
| Property | Operations |
|
||||
|----------|------------|
|
||||
| `workspaces` | create, get, list, delete |
|
||||
| `jobs` | create_or_update, get, list, stream, cancel |
|
||||
| `models` | create_or_update, get, list, archive |
|
||||
| `data` | create_or_update, get, list |
|
||||
| `compute` | begin_create_or_update, get, list, delete |
|
||||
| `environments` | create_or_update, get, list |
|
||||
| `datastores` | create_or_update, get, list, get_default |
|
||||
| `components` | create_or_update, get, list |
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Use versioning** for data, models, and environments
|
||||
2. **Configure idle scale-down** to reduce compute costs
|
||||
3. **Use environments** for reproducible training
|
||||
4. **Stream job logs** to monitor progress
|
||||
5. **Register models** after successful training jobs
|
||||
6. **Use pipelines** for multi-step workflows
|
||||
7. **Tag resources** for organization and cost tracking
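Best practice 1 can be sketched as a small helper that derives the next version string from the versions already registered (in practice gathered via `ml_client.models.list(name=...)`); the `next_version` helper is an illustrative assumption, not an SDK API.

```python
def next_version(existing_versions: list[str]) -> str:
    """Return the next integer version string given existing version strings."""
    # Ignore non-numeric versions; default to 0 when nothing is registered yet
    numeric = [int(v) for v in existing_versions if v.isdigit()]
    return str(max(numeric, default=0) + 1)


# Example: picking the version for a new model registration
versions = ["1", "2", "3"]
print(next_version(versions))  # prints 4
```

Pinning explicit versions this way keeps jobs reproducible, since `azureml:my-model:3` always resolves to the same artifact.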
|
||||
skills/azure-ai-openai-dotnet/SKILL.md (new file, 455 lines)
@@ -0,0 +1,455 @@
|
||||
---
|
||||
name: azure-ai-openai-dotnet
|
||||
description: |
|
||||
Azure OpenAI SDK for .NET. Client library for Azure OpenAI and OpenAI services. Use for chat completions, embeddings, image generation, audio transcription, and assistants. Triggers: "Azure OpenAI", "AzureOpenAIClient", "ChatClient", "chat completions .NET", "GPT-4", "embeddings", "DALL-E", "Whisper", "OpenAI .NET".
|
||||
package: Azure.AI.OpenAI
|
||||
---
|
||||
|
||||
# Azure.AI.OpenAI (.NET)
|
||||
|
||||
Client library for Azure OpenAI Service providing access to OpenAI models including GPT-4, GPT-4o, embeddings, DALL-E, and Whisper.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
dotnet add package Azure.AI.OpenAI
|
||||
|
||||
# For OpenAI (non-Azure) compatibility
|
||||
dotnet add package OpenAI
|
||||
```
|
||||
|
||||
**Current Version**: 2.1.0 (stable)
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_OPENAI_ENDPOINT=https://<resource-name>.openai.azure.com
|
||||
AZURE_OPENAI_API_KEY=<api-key> # For key-based auth
|
||||
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o-mini # Your deployment name
|
||||
```
|
||||
|
||||
## Client Hierarchy
|
||||
|
||||
```
|
||||
AzureOpenAIClient (top-level)
|
||||
├── GetChatClient(deploymentName) → ChatClient
|
||||
├── GetEmbeddingClient(deploymentName) → EmbeddingClient
|
||||
├── GetImageClient(deploymentName) → ImageClient
|
||||
├── GetAudioClient(deploymentName) → AudioClient
|
||||
└── GetAssistantClient() → AssistantClient
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### API Key Authentication
|
||||
|
||||
```csharp
|
||||
using Azure;
|
||||
using Azure.AI.OpenAI;
|
||||
|
||||
AzureOpenAIClient client = new(
|
||||
new Uri(Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!),
|
||||
new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!));
|
||||
```
|
||||
|
||||
### Microsoft Entra ID (Recommended for Production)
|
||||
|
||||
```csharp
|
||||
using Azure.Identity;
|
||||
using Azure.AI.OpenAI;
|
||||
|
||||
AzureOpenAIClient client = new(
|
||||
new Uri(Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!),
|
||||
new DefaultAzureCredential());
|
||||
```
|
||||
|
||||
### Using OpenAI SDK Directly with Azure
|
||||
|
||||
```csharp
|
||||
using Azure.Identity;
|
||||
using OpenAI;
|
||||
using OpenAI.Chat;
|
||||
using System.ClientModel.Primitives;
|
||||
|
||||
#pragma warning disable OPENAI001
|
||||
|
||||
BearerTokenPolicy tokenPolicy = new(
|
||||
new DefaultAzureCredential(),
|
||||
"https://cognitiveservices.azure.com/.default");
|
||||
|
||||
ChatClient client = new(
|
||||
model: "gpt-4o-mini",
|
||||
authenticationPolicy: tokenPolicy,
|
||||
options: new OpenAIClientOptions()
|
||||
{
|
||||
Endpoint = new Uri("https://YOUR-RESOURCE.openai.azure.com/openai/v1")
|
||||
});
|
||||
```
|
||||
|
||||
## Chat Completions
|
||||
|
||||
### Basic Chat
|
||||
|
||||
```csharp
|
||||
using Azure.AI.OpenAI;
|
||||
using OpenAI.Chat;
|
||||
|
||||
AzureOpenAIClient azureClient = new(
|
||||
new Uri(endpoint),
|
||||
new DefaultAzureCredential());
|
||||
|
||||
ChatClient chatClient = azureClient.GetChatClient("gpt-4o-mini");
|
||||
|
||||
ChatCompletion completion = chatClient.CompleteChat(
|
||||
[
|
||||
new SystemChatMessage("You are a helpful assistant."),
|
||||
new UserChatMessage("What is Azure OpenAI?")
|
||||
]);
|
||||
|
||||
Console.WriteLine(completion.Content[0].Text);
|
||||
```
|
||||
|
||||
### Async Chat
|
||||
|
||||
```csharp
|
||||
ChatCompletion completion = await chatClient.CompleteChatAsync(
|
||||
[
|
||||
new SystemChatMessage("You are a helpful assistant."),
|
||||
new UserChatMessage("Explain cloud computing in simple terms.")
|
||||
]);
|
||||
|
||||
Console.WriteLine($"Response: {completion.Content[0].Text}");
|
||||
Console.WriteLine($"Tokens used: {completion.Usage.TotalTokenCount}");
|
||||
```
|
||||
|
||||
### Streaming Chat
|
||||
|
||||
```csharp
|
||||
await foreach (StreamingChatCompletionUpdate update
|
||||
in chatClient.CompleteChatStreamingAsync(messages))
|
||||
{
|
||||
if (update.ContentUpdate.Count > 0)
|
||||
{
|
||||
Console.Write(update.ContentUpdate[0].Text);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Chat with Options
|
||||
|
||||
```csharp
|
||||
ChatCompletionOptions options = new()
|
||||
{
|
||||
MaxOutputTokenCount = 1000,
|
||||
Temperature = 0.7f,
|
||||
TopP = 0.95f,
|
||||
FrequencyPenalty = 0,
|
||||
PresencePenalty = 0
|
||||
};
|
||||
|
||||
ChatCompletion completion = await chatClient.CompleteChatAsync(messages, options);
|
||||
```
|
||||
|
||||
### Multi-turn Conversation
|
||||
|
||||
```csharp
|
||||
List<ChatMessage> messages = new()
|
||||
{
|
||||
new SystemChatMessage("You are a helpful assistant."),
|
||||
new UserChatMessage("Hi, can you help me?"),
|
||||
new AssistantChatMessage("Of course! What do you need help with?"),
|
||||
new UserChatMessage("What's the capital of France?")
|
||||
};
|
||||
|
||||
ChatCompletion completion = await chatClient.CompleteChatAsync(messages);
|
||||
messages.Add(new AssistantChatMessage(completion.Content[0].Text));
|
||||
```
|
||||
|
||||
## Structured Outputs (JSON Schema)

```csharp
using System.Text.Json;

ChatCompletionOptions options = new()
{
    ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
        jsonSchemaFormatName: "math_reasoning",
        jsonSchema: BinaryData.FromBytes("""
            {
              "type": "object",
              "properties": {
                "steps": {
                  "type": "array",
                  "items": {
                    "type": "object",
                    "properties": {
                      "explanation": { "type": "string" },
                      "output": { "type": "string" }
                    },
                    "required": ["explanation", "output"],
                    "additionalProperties": false
                  }
                },
                "final_answer": { "type": "string" }
              },
              "required": ["steps", "final_answer"],
              "additionalProperties": false
            }
            """u8.ToArray()),
        jsonSchemaIsStrict: true)
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [new UserChatMessage("How can I solve 8x + 7 = -23?")],
    options);

using JsonDocument json = JsonDocument.Parse(completion.Content[0].Text);
Console.WriteLine($"Answer: {json.RootElement.GetProperty("final_answer")}");
```
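Even with `jsonSchemaIsStrict: true`, it is cheap to defensively check the parsed payload before trusting it downstream. A stdlib-only sketch in Python mirroring this schema's required keys (the sample reply string is illustrative):

```python
import json

def validate_math_reasoning(payload: str) -> dict:
    """Parse a structured-output reply and check the schema's required fields."""
    data = json.loads(payload)
    assert isinstance(data.get("steps"), list), "missing 'steps' array"
    assert isinstance(data.get("final_answer"), str), "missing 'final_answer'"
    for step in data["steps"]:
        assert "explanation" in step and "output" in step, "malformed step"
    return data

# A reply shaped like the math_reasoning schema above.
reply = ('{"steps": [{"explanation": "Subtract 7 from both sides",'
         ' "output": "8x = -30"}], "final_answer": "x = -15/4"}')
result = validate_math_reasoning(reply)
```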
## Reasoning Models (o1, o4-mini)

```csharp
ChatCompletionOptions options = new()
{
    ReasoningEffortLevel = ChatReasoningEffortLevel.Low,
    MaxOutputTokenCount = 100000
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [
        new DeveloperChatMessage("You are a helpful assistant"),
        new UserChatMessage("Explain the theory of relativity")
    ], options);
```

## Azure AI Search Integration (RAG)

```csharp
using Azure.AI.OpenAI.Chat;

#pragma warning disable AOAI001

ChatCompletionOptions options = new();
options.AddDataSource(new AzureSearchChatDataSource()
{
    Endpoint = new Uri(searchEndpoint),
    IndexName = searchIndex,
    Authentication = DataSourceAuthentication.FromApiKey(searchKey)
});

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [new UserChatMessage("What health plans are available?")],
    options);

ChatMessageContext context = completion.GetMessageContext();
if (context?.Intent is not null)
{
    Console.WriteLine($"Intent: {context.Intent}");
}
foreach (ChatCitation citation in context?.Citations ?? [])
{
    Console.WriteLine($"Citation: {citation.Content}");
}
```
## Embeddings

```csharp
using OpenAI.Embeddings;

EmbeddingClient embeddingClient = azureClient.GetEmbeddingClient("text-embedding-ada-002");

OpenAIEmbedding embedding = await embeddingClient.GenerateEmbeddingAsync("Hello, world!");
ReadOnlyMemory<float> vector = embedding.ToFloats();

Console.WriteLine($"Embedding dimensions: {vector.Length}");
```

### Batch Embeddings

```csharp
List<string> inputs = new()
{
    "First document text",
    "Second document text",
    "Third document text"
};

OpenAIEmbeddingCollection embeddings = await embeddingClient.GenerateEmbeddingsAsync(inputs);

foreach (OpenAIEmbedding emb in embeddings)
{
    Console.WriteLine($"Index {emb.Index}: {emb.ToFloats().Length} dimensions");
}
```
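Embedding vectors are typically compared with cosine similarity (`text-embedding-ada-002` vectors are unit-length, so a plain dot product ranks identically, but the general form is shown). A dependency-free sketch in Python, where toy 3-dimensional vectors stand in for real 1536-dimensional output:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embedding output.
query_vec = [0.1, 0.3, 0.5]
doc_vecs = {"doc1": [0.1, 0.3, 0.5], "doc2": [0.5, 0.1, 0.0]}

# Rank documents by similarity to the query.
best = max(doc_vecs, key=lambda k: cosine_similarity(query_vec, doc_vecs[k]))
# best == "doc1" (identical direction, similarity 1.0)
```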
## Image Generation (DALL-E)

```csharp
using OpenAI.Images;

ImageClient imageClient = azureClient.GetImageClient("dall-e-3");

GeneratedImage image = await imageClient.GenerateImageAsync(
    "A futuristic city skyline at sunset",
    new ImageGenerationOptions
    {
        Size = GeneratedImageSize.W1024xH1024,
        Quality = GeneratedImageQuality.High,
        Style = GeneratedImageStyle.Vivid
    });

Console.WriteLine($"Image URL: {image.ImageUri}");
```

## Audio (Whisper)

### Transcription

```csharp
using OpenAI.Audio;

AudioClient audioClient = azureClient.GetAudioClient("whisper");

AudioTranscription transcription = await audioClient.TranscribeAudioAsync(
    "audio.mp3",
    new AudioTranscriptionOptions
    {
        ResponseFormat = AudioTranscriptionFormat.Verbose,
        Language = "en"
    });

Console.WriteLine(transcription.Text);
```

### Text-to-Speech

```csharp
BinaryData speech = await audioClient.GenerateSpeechAsync(
    "Hello, welcome to Azure OpenAI!",
    GeneratedSpeechVoice.Alloy,
    new SpeechGenerationOptions
    {
        SpeedRatio = 1.0f,
        ResponseFormat = GeneratedSpeechFormat.Mp3
    });

await File.WriteAllBytesAsync("output.mp3", speech.ToArray());
```
## Function Calling (Tools)

```csharp
ChatTool getCurrentWeatherTool = ChatTool.CreateFunctionTool(
    functionName: "get_current_weather",
    functionDescription: "Get the current weather in a given location",
    functionParameters: BinaryData.FromString("""
        {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
        """));

ChatCompletionOptions options = new()
{
    Tools = { getCurrentWeatherTool }
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    [new UserChatMessage("What's the weather in Seattle?")],
    options);

if (completion.FinishReason == ChatFinishReason.ToolCalls)
{
    foreach (ChatToolCall toolCall in completion.ToolCalls)
    {
        Console.WriteLine($"Function: {toolCall.FunctionName}");
        Console.WriteLine($"Arguments: {toolCall.FunctionArguments}");
    }
}
```
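A `ToolCalls` finish reason means the model is waiting on your code: execute the named function with the parsed arguments, append the result as a tool message, then request the completion again. The dispatch step itself is SDK-agnostic; a sketch in Python (the weather function is a hypothetical stand-in for a real lookup):

```python
import json

def get_current_weather(location, unit="celsius"):
    # Stand-in for a real weather lookup.
    return {"location": location, "temperature": 12, "unit": unit}

# Registry mapping the model-facing tool name to the local implementation.
TOOLS = {"get_current_weather": get_current_weather}

def dispatch(tool_name, arguments_json):
    """Look up the requested tool and invoke it with the model-supplied arguments."""
    fn = TOOLS[tool_name]
    args = json.loads(arguments_json)
    return json.dumps(fn(**args))

result = dispatch("get_current_weather", '{"location": "Seattle, WA"}')
# result is a JSON string to send back as the tool message content.
```

Validating the arguments before calling `fn(**args)` (best practice #8 below) protects against malformed or adversarial tool-call payloads.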
## Key Types Reference

| Type | Purpose |
|------|---------|
| `AzureOpenAIClient` | Top-level client for Azure OpenAI |
| `ChatClient` | Chat completions |
| `EmbeddingClient` | Text embeddings |
| `ImageClient` | Image generation (DALL-E) |
| `AudioClient` | Audio transcription/TTS |
| `ChatCompletion` | Chat response |
| `ChatCompletionOptions` | Request configuration |
| `StreamingChatCompletionUpdate` | Streaming response chunk |
| `ChatMessage` | Base message type |
| `SystemChatMessage` | System prompt |
| `UserChatMessage` | User input |
| `AssistantChatMessage` | Assistant response |
| `DeveloperChatMessage` | Developer message (reasoning models) |
| `ChatTool` | Function/tool definition |
| `ChatToolCall` | Tool invocation request |

## Best Practices

1. **Use Entra ID in production** — Avoid API keys; use `DefaultAzureCredential`
2. **Reuse client instances** — Create once, share across requests
3. **Handle rate limits** — Implement exponential backoff for 429 errors
4. **Stream for long responses** — Use `CompleteChatStreamingAsync` for better UX
5. **Set appropriate timeouts** — Long completions may need extended timeouts
6. **Use structured outputs** — JSON schema ensures consistent response format
7. **Monitor token usage** — Track `completion.Usage` for cost management
8. **Validate tool calls** — Always validate function arguments before execution
## Error Handling

```csharp
using Azure;

try
{
    ChatCompletion completion = await chatClient.CompleteChatAsync(messages);
}
catch (RequestFailedException ex) when (ex.Status == 429)
{
    Console.WriteLine("Rate limited. Retry after delay.");
    await Task.Delay(TimeSpan.FromSeconds(10));
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Bad request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure OpenAI error: {ex.Status} - {ex.Message}");
}
```
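The fixed 10-second delay above is the simplest recovery; best practice #3 calls for exponential backoff, and adding jitter avoids synchronized retry storms across clients. The schedule itself is independent of the SDK — a sketch in Python (the base and cap values are illustrative):

```python
import random

def backoff_delays(base=1.0, cap=30.0, attempts=5, seed=None):
    """Exponential backoff with full jitter: delay_n is uniform in [0, min(cap, base * 2**n)]."""
    rng = random.Random(seed)
    delays = []
    for attempt in range(attempts):
        ceiling = min(cap, base * (2 ** attempt))
        delays.append(rng.uniform(0, ceiling))
    return delays

# Retry loop shape: after each 429, sleep(delays[attempt]) and retry the request.
delays = backoff_delays(seed=42)
# Five delays, bounded by 1, 2, 4, 8, and 16 seconds respectively.
```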
## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.OpenAI` | Azure OpenAI client (this SDK) | `dotnet add package Azure.AI.OpenAI` |
| `OpenAI` | OpenAI compatibility | `dotnet add package OpenAI` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |
| `Azure.Search.Documents` | AI Search for RAG | `dotnet add package Azure.Search.Documents` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.OpenAI |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.openai |
| Migration Guide (1.0→2.0) | https://learn.microsoft.com/azure/ai-services/openai/how-to/dotnet-migration |
| Quickstart | https://learn.microsoft.com/azure/ai-services/openai/quickstart |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/openai/Azure.AI.OpenAI |
348 skills/azure-ai-projects-dotnet/SKILL.md (new file)
@@ -0,0 +1,348 @@
---
name: azure-ai-projects-dotnet
description: |
  Azure AI Projects SDK for .NET. High-level client for Azure AI Foundry projects including agents, connections, datasets, deployments, evaluations, and indexes. Use for AI Foundry project management, versioned agents, and orchestration. Triggers: "AI Projects", "AIProjectClient", "Foundry project", "versioned agents", "evaluations", "datasets", "connections", "deployments .NET".
package: Azure.AI.Projects
---

# Azure.AI.Projects (.NET)

High-level SDK for Azure AI Foundry project operations including agents, connections, datasets, deployments, evaluations, and indexes.

## Installation

```bash
dotnet add package Azure.AI.Projects
dotnet add package Azure.Identity

# Optional: For versioned agents with OpenAI extensions
dotnet add package Azure.AI.Projects.OpenAI --prerelease

# Optional: For low-level agent operations
dotnet add package Azure.AI.Agents.Persistent --prerelease
```

**Current Versions**: GA v1.1.0, Preview v1.2.0-beta.5

## Environment Variables

```bash
PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
MODEL_DEPLOYMENT_NAME=gpt-4o-mini
CONNECTION_NAME=<your-connection-name>
AI_SEARCH_CONNECTION_NAME=<ai-search-connection>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.AI.Projects;

var endpoint = Environment.GetEnvironmentVariable("PROJECT_ENDPOINT");
AIProjectClient projectClient = new AIProjectClient(
    new Uri(endpoint),
    new DefaultAzureCredential());
```

## Client Hierarchy

```
AIProjectClient
├── Agents        → AIProjectAgentsOperations (versioned agents)
├── Connections   → ConnectionsClient
├── Datasets      → DatasetsClient
├── Deployments   → DeploymentsClient
├── Evaluations   → EvaluationsClient
├── Evaluators    → EvaluatorsClient
├── Indexes       → IndexesClient
├── Telemetry     → AIProjectTelemetry
├── OpenAI        → ProjectOpenAIClient (preview)
└── GetPersistentAgentsClient() → PersistentAgentsClient
```
## Core Workflows

### 1. Get Persistent Agents Client

```csharp
// Get low-level agents client from project client
PersistentAgentsClient agentsClient = projectClient.GetPersistentAgentsClient();

// Create agent
PersistentAgent agent = await agentsClient.Administration.CreateAgentAsync(
    model: "gpt-4o-mini",
    name: "Math Tutor",
    instructions: "You are a personal math tutor.");

// Create thread and run
PersistentAgentThread thread = await agentsClient.Threads.CreateThreadAsync();
await agentsClient.Messages.CreateMessageAsync(thread.Id, MessageRole.User, "Solve 3x + 11 = 14");
ThreadRun run = await agentsClient.Runs.CreateRunAsync(thread.Id, agent.Id);

// Poll for completion
do
{
    await Task.Delay(500);
    run = await agentsClient.Runs.GetRunAsync(thread.Id, run.Id);
}
while (run.Status == RunStatus.Queued || run.Status == RunStatus.InProgress);

// Get messages
await foreach (var msg in agentsClient.Messages.GetMessagesAsync(thread.Id))
{
    foreach (var content in msg.ContentItems)
    {
        if (content is MessageTextContent textContent)
            Console.WriteLine(textContent.Text);
    }
}

// Cleanup
await agentsClient.Threads.DeleteThreadAsync(thread.Id);
await agentsClient.Administration.DeleteAgentAsync(agent.Id);
```
### 2. Versioned Agents with Tools (Preview)

```csharp
using Azure.AI.Projects.OpenAI;

// Create agent with web search tool
PromptAgentDefinition agentDefinition = new(model: "gpt-4o-mini")
{
    Instructions = "You are a helpful assistant that can search the web",
    Tools = {
        ResponseTool.CreateWebSearchTool(
            userLocation: WebSearchToolLocation.CreateApproximateLocation(
                country: "US",
                city: "Seattle",
                region: "Washington"
            )
        ),
    }
};

AgentVersion agentVersion = await projectClient.Agents.CreateAgentVersionAsync(
    agentName: "myAgent",
    options: new(agentDefinition));

// Get response client
ProjectResponsesClient responseClient = projectClient.OpenAI.GetProjectResponsesClientForAgent(agentVersion.Name);

// Create response
ResponseResult response = responseClient.CreateResponse("What's the weather in Seattle?");
Console.WriteLine(response.GetOutputText());

// Cleanup
projectClient.Agents.DeleteAgentVersion(agentName: agentVersion.Name, agentVersion: agentVersion.Version);
```

### 3. Connections

```csharp
// List all connections
foreach (AIProjectConnection connection in projectClient.Connections.GetConnections())
{
    Console.WriteLine($"{connection.Name}: {connection.ConnectionType}");
}

// Get specific connection
AIProjectConnection conn = projectClient.Connections.GetConnection(
    connectionName,
    includeCredentials: true);

// Get default connection
AIProjectConnection defaultConn = projectClient.Connections.GetDefaultConnection(
    includeCredentials: false);
```
### 4. Deployments

```csharp
// List all deployments
foreach (AIProjectDeployment deployment in projectClient.Deployments.GetDeployments())
{
    Console.WriteLine($"{deployment.Name}: {deployment.ModelName}");
}

// Filter by publisher
foreach (var deployment in projectClient.Deployments.GetDeployments(modelPublisher: "Microsoft"))
{
    Console.WriteLine(deployment.Name);
}

// Get specific deployment
ModelDeployment details = (ModelDeployment)projectClient.Deployments.GetDeployment("gpt-4o-mini");
```

### 5. Datasets

```csharp
// Upload single file
FileDataset fileDataset = projectClient.Datasets.UploadFile(
    name: "my-dataset",
    version: "1.0",
    filePath: "data/training.txt",
    connectionName: connectionName);

// Upload folder
FolderDataset folderDataset = projectClient.Datasets.UploadFolder(
    name: "my-dataset",
    version: "2.0",
    folderPath: "data/training",
    connectionName: connectionName,
    filePattern: new Regex(".*\\.txt"));

// Get dataset
AIProjectDataset dataset = projectClient.Datasets.GetDataset("my-dataset", "1.0");

// Delete dataset
projectClient.Datasets.Delete("my-dataset", "1.0");
```

### 6. Indexes

```csharp
// Create Azure AI Search index
AzureAISearchIndex searchIndex = new(aiSearchConnectionName, aiSearchIndexName)
{
    Description = "Sample Index"
};

searchIndex = (AzureAISearchIndex)projectClient.Indexes.CreateOrUpdate(
    name: "my-index",
    version: "1.0",
    index: searchIndex);

// List indexes
foreach (AIProjectIndex index in projectClient.Indexes.GetIndexes())
{
    Console.WriteLine(index.Name);
}

// Delete index
projectClient.Indexes.Delete(name: "my-index", version: "1.0");
```
### 7. Evaluations

```csharp
// Create evaluation configuration
var evaluatorConfig = new EvaluatorConfiguration(id: EvaluatorIDs.Relevance);
evaluatorConfig.InitParams.Add("deployment_name", BinaryData.FromObjectAsJson("gpt-4o"));

// Create evaluation
Evaluation evaluation = new Evaluation(
    data: new InputDataset("<dataset_id>"),
    evaluators: new Dictionary<string, EvaluatorConfiguration>
    {
        { "relevance", evaluatorConfig }
    }
)
{
    DisplayName = "Sample Evaluation"
};

// Run evaluation
Evaluation result = projectClient.Evaluations.Create(evaluation: evaluation);

// Get evaluation
Evaluation getResult = projectClient.Evaluations.Get(result.Name);

// List evaluations
foreach (var eval in projectClient.Evaluations.GetAll())
{
    Console.WriteLine($"{eval.DisplayName}: {eval.Status}");
}
```

### 8. Get Azure OpenAI Chat Client

```csharp
using Azure.AI.OpenAI;
using OpenAI.Chat;

ClientConnection connection = projectClient.GetConnection(typeof(AzureOpenAIClient).FullName!);

if (!connection.TryGetLocatorAsUri(out Uri uri) || uri is null)
    throw new InvalidOperationException("Invalid URI.");

uri = new Uri($"https://{uri.Host}");

AzureOpenAIClient azureOpenAIClient = new AzureOpenAIClient(uri, new DefaultAzureCredential());
ChatClient chatClient = azureOpenAIClient.GetChatClient("gpt-4o-mini");

ChatCompletion result = chatClient.CompleteChat("List all rainbow colors");
Console.WriteLine(result.Content[0].Text);
```
## Available Agent Tools

| Tool | Class | Purpose |
|------|-------|---------|
| Code Interpreter | `CodeInterpreterToolDefinition` | Execute Python code |
| File Search | `FileSearchToolDefinition` | Search uploaded files |
| Function Calling | `FunctionToolDefinition` | Call custom functions |
| Bing Grounding | `BingGroundingToolDefinition` | Web search via Bing |
| Azure AI Search | `AzureAISearchToolDefinition` | Search Azure AI indexes |
| OpenAPI | `OpenApiToolDefinition` | Call external APIs |
| Azure Functions | `AzureFunctionToolDefinition` | Invoke Azure Functions |
| MCP | `MCPToolDefinition` | Model Context Protocol tools |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `AIProjectClient` | Main entry point |
| `PersistentAgentsClient` | Low-level agent operations |
| `PromptAgentDefinition` | Versioned agent definition |
| `AgentVersion` | Versioned agent instance |
| `AIProjectConnection` | Connection to Azure resource |
| `AIProjectDeployment` | Model deployment info |
| `AIProjectDataset` | Dataset metadata |
| `AIProjectIndex` | Search index metadata |
| `Evaluation` | Evaluation configuration and results |

## Best Practices

1. **Use `DefaultAzureCredential`** for production authentication
2. **Use async methods** (`*Async`) for all I/O operations
3. **Poll with appropriate delays** (500ms recommended) when waiting for runs
4. **Clean up resources** — delete threads, agents, and files when done
5. **Use versioned agents** (via `Azure.AI.Projects.OpenAI`) for production scenarios
6. **Store connection IDs** rather than names for tool configurations
7. **Use `includeCredentials: true`** only when credentials are needed
8. **Handle pagination** — use `AsyncPageable<T>` for listing operations
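The 500 ms polling loop in workflow 1 is easy to harden with a deadline so a stuck run cannot hang the caller. The loop shape, sketched in Python against a fake status source (status strings and function names are illustrative):

```python
import time

def poll_until_done(get_status, interval=0.5, timeout=60.0,
                    clock=time.monotonic, sleep=time.sleep):
    """Poll get_status() until it leaves the in-progress states or the timeout expires."""
    deadline = clock() + timeout
    while True:
        status = get_status()
        if status not in ("queued", "in_progress"):
            return status
        if clock() >= deadline:
            raise TimeoutError(f"run still {status} after {timeout}s")
        sleep(interval)

# Fake run that completes on the third status check.
statuses = iter(["queued", "in_progress", "completed"])
final = poll_until_done(lambda: next(statuses), interval=0, timeout=5)
# final == "completed"
```

Injecting `clock` and `sleep` keeps the loop testable without real waiting; in production code the defaults suffice.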
## Error Handling

```csharp
using Azure;

try
{
    var result = await projectClient.Evaluations.CreateAsync(evaluation);
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.Projects` | High-level project client (this SDK) | `dotnet add package Azure.AI.Projects` |
| `Azure.AI.Agents.Persistent` | Low-level agent operations | `dotnet add package Azure.AI.Agents.Persistent` |
| `Azure.AI.Projects.OpenAI` | Versioned agents with OpenAI | `dotnet add package Azure.AI.Projects.OpenAI` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.Projects |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.projects |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Projects |
| Samples | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.Projects/samples |
152 skills/azure-ai-projects-java/SKILL.md (new file)
@@ -0,0 +1,152 @@
---
name: azure-ai-projects-java
description: |
  Azure AI Projects SDK for Java. High-level SDK for Azure AI Foundry project management including connections, datasets, indexes, and evaluations.
  Triggers: "AIProjectClient java", "azure ai projects java", "Foundry project java", "ConnectionsClient", "DatasetsClient", "IndexesClient".
package: com.azure:azure-ai-projects
---

# Azure AI Projects SDK for Java

High-level SDK for Azure AI Foundry project management with access to connections, datasets, indexes, and evaluations.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-projects</artifactId>
    <version>1.0.0-beta.1</version>
</dependency>
```

## Environment Variables

```bash
PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
```

## Authentication

```java
import com.azure.ai.projects.AIProjectClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

AIProjectClientBuilder builder = new AIProjectClientBuilder()
    .endpoint(System.getenv("PROJECT_ENDPOINT"))
    .credential(new DefaultAzureCredentialBuilder().build());
```

## Client Hierarchy

The SDK provides multiple sub-clients for different operations:

| Client | Purpose |
|--------|---------|
| `ConnectionsClient` | Enumerate connected Azure resources |
| `DatasetsClient` | Upload documents and manage datasets |
| `DeploymentsClient` | Enumerate AI model deployments |
| `IndexesClient` | Create and manage search indexes |
| `EvaluationsClient` | Run AI model evaluations |
| `EvaluatorsClient` | Manage evaluator configurations |
| `SchedulesClient` | Manage scheduled operations |

```java
// Build sub-clients from builder
ConnectionsClient connectionsClient = builder.buildConnectionsClient();
DatasetsClient datasetsClient = builder.buildDatasetsClient();
DeploymentsClient deploymentsClient = builder.buildDeploymentsClient();
IndexesClient indexesClient = builder.buildIndexesClient();
EvaluationsClient evaluationsClient = builder.buildEvaluationsClient();
```
## Core Operations

### List Connections

```java
import com.azure.ai.projects.models.Connection;
import com.azure.core.http.rest.PagedIterable;

PagedIterable<Connection> connections = connectionsClient.listConnections();
for (Connection connection : connections) {
    System.out.println("Name: " + connection.getName());
    System.out.println("Type: " + connection.getType());
    System.out.println("Credential Type: " + connection.getCredentials().getType());
}
```

### List Indexes

```java
indexesClient.listLatest().forEach(index -> {
    System.out.println("Index name: " + index.getName());
    System.out.println("Version: " + index.getVersion());
    System.out.println("Description: " + index.getDescription());
});
```

### Create or Update Index

```java
import com.azure.ai.projects.models.AzureAISearchIndex;
import com.azure.ai.projects.models.Index;

String indexName = "my-index";
String indexVersion = "1.0";
String searchConnectionName = System.getenv("AI_SEARCH_CONNECTION_NAME");
String searchIndexName = System.getenv("AI_SEARCH_INDEX_NAME");

Index index = indexesClient.createOrUpdate(
    indexName,
    indexVersion,
    new AzureAISearchIndex()
        .setConnectionName(searchConnectionName)
        .setIndexName(searchIndexName)
);

System.out.println("Created index: " + index.getName());
```

### Access OpenAI Evaluations

The SDK exposes OpenAI's official SDK for evaluations:

```java
import com.openai.services.EvalService;

EvalService evalService = evaluationsClient.getOpenAIClient();
// Use OpenAI evaluation APIs directly
```
## Best Practices

1. **Use DefaultAzureCredential** for production authentication
2. **Reuse client builder** to create multiple sub-clients efficiently
3. **Handle pagination** when listing resources with `PagedIterable`
4. **Use environment variables** for connection names and configuration
5. **Check connection types** before accessing credentials

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;
import com.azure.core.exception.ResourceNotFoundException;

try {
    Index index = indexesClient.get(indexName, version);
} catch (ResourceNotFoundException e) {
    System.err.println("Index not found: " + indexName);
} catch (HttpResponseException e) {
    System.err.println("Error: " + e.getResponse().getStatusCode());
}
```

## Reference Links

| Resource | URL |
|----------|-----|
| Product Docs | https://learn.microsoft.com/azure/ai-studio/ |
| API Reference | https://learn.microsoft.com/rest/api/aifoundry/aiprojects/ |
| GitHub Source | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-projects |
| Samples | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-projects/src/samples |
295 skills/azure-ai-projects-py/SKILL.md (new file)
@@ -0,0 +1,295 @@
---
name: azure-ai-projects-py
description: Build AI applications using the Azure AI Projects Python SDK (azure-ai-projects). Use when working with Foundry project clients, creating versioned agents with PromptAgentDefinition, running evaluations, managing connections/deployments/datasets/indexes, or using OpenAI-compatible clients. This is the high-level Foundry SDK - for low-level agent operations, use azure-ai-agents-python skill.
package: azure-ai-projects
---

# Azure AI Projects Python SDK (Foundry SDK)

Build AI applications on Microsoft Foundry using the `azure-ai-projects` SDK.

## Installation

```bash
pip install azure-ai-projects azure-identity
```

## Environment Variables

```bash
AZURE_AI_PROJECT_ENDPOINT="https://<resource>.services.ai.azure.com/api/projects/<project>"
AZURE_AI_MODEL_DEPLOYMENT_NAME="gpt-4o-mini"
```

## Authentication

```python
import os
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

credential = DefaultAzureCredential()
client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=credential,
)
```
## Client Operations Overview

| Operation | Access | Purpose |
|-----------|--------|---------|
| `client.agents` | `.agents.*` | Agent CRUD, versions, threads, runs |
| `client.connections` | `.connections.*` | List/get project connections |
| `client.deployments` | `.deployments.*` | List model deployments |
| `client.datasets` | `.datasets.*` | Dataset management |
| `client.indexes` | `.indexes.*` | Index management |
| `client.evaluations` | `.evaluations.*` | Run evaluations |
| `client.red_teams` | `.red_teams.*` | Red team operations |

## Two Client Approaches

### 1. AIProjectClient (Native Foundry)

```python
from azure.ai.projects import AIProjectClient

client = AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

# Use Foundry-native operations
agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="my-agent",
    instructions="You are helpful.",
)
```

### 2. OpenAI-Compatible Client

```python
# Get OpenAI-compatible client from project
openai_client = client.get_openai_client()

# Use standard OpenAI API
response = openai_client.chat.completions.create(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    messages=[{"role": "user", "content": "Hello!"}],
)
```
## Agent Operations

### Create Agent (Basic)

```python
agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="my-agent",
    instructions="You are a helpful assistant.",
)
```

### Create Agent with Tools

```python
from azure.ai.agents import CodeInterpreterTool, FileSearchTool

agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="tool-agent",
    instructions="You can execute code and search files.",
    tools=[CodeInterpreterTool(), FileSearchTool()],
)
```

### Versioned Agents with PromptAgentDefinition

```python
from azure.ai.projects.models import PromptAgentDefinition

# Create a versioned agent
agent_version = client.agents.create_version(
    agent_name="customer-support-agent",
    definition=PromptAgentDefinition(
        model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
        instructions="You are a customer support specialist.",
        tools=[],  # Add tools as needed
    ),
    version_label="v1.0",
)
```

See [references/agents.md](references/agents.md) for detailed agent patterns.
## Tools Overview
|
||||
|
||||
| Tool | Class | Use Case |
|
||||
|------|-------|----------|
|
||||
| Code Interpreter | `CodeInterpreterTool` | Execute Python, generate files |
|
||||
| File Search | `FileSearchTool` | RAG over uploaded documents |
|
||||
| Bing Grounding | `BingGroundingTool` | Web search (requires connection) |
|
||||
| Azure AI Search | `AzureAISearchTool` | Search your indexes |
|
||||
| Function Calling | `FunctionTool` | Call your Python functions |
|
||||
| OpenAPI | `OpenApiTool` | Call REST APIs |
|
||||
| MCP | `McpTool` | Model Context Protocol servers |
|
||||
| Memory Search | `MemorySearchTool` | Search agent memory stores |
|
||||
| SharePoint | `SharepointGroundingTool` | Search SharePoint content |
|
||||
|
||||
See [references/tools.md](references/tools.md) for all tool patterns.
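
Function calling works by exposing a JSON schema for each Python callable so the model can request a call with typed arguments. A minimal, SDK-free sketch of how such a schema can be derived from a function's signature (the `function_schema` helper and the type mapping below are illustrative, not part of the SDK):

```python
import inspect

# Illustrative subset of Python-annotation -> JSON-schema type mapping
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def function_schema(fn):
    """Build a function-calling JSON schema from a callable's signature."""
    params = inspect.signature(fn).parameters
    properties = {
        name: {"type": _JSON_TYPES.get(p.annotation, "string")}
        for name, p in params.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(params),
        },
    }

def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"sunny in {location}"

schema = function_schema(get_weather)
```

The resulting dict has the same overall shape as the `function` tool definitions shown in the TypeScript file below; `FunctionTool` performs this kind of introspection for you.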

## Thread and Message Flow

```python
# 1. Create thread
thread = client.agents.threads.create()

# 2. Add message
client.agents.messages.create(
    thread_id=thread.id,
    role="user",
    content="What's the weather like?",
)

# 3. Create and process run
run = client.agents.runs.create_and_process(
    thread_id=thread.id,
    agent_id=agent.id,
)

# 4. Get response
if run.status == "completed":
    messages = client.agents.messages.list(thread_id=thread.id)
    for msg in messages:
        if msg.role == "assistant":
            print(msg.content[0].text.value)
```
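
Step 4 above is common enough to factor into a helper. A sketch using plain objects as stand-ins for SDK message items (the `SimpleNamespace` stand-ins only mirror the `role` / `content[0].text.value` attribute shape used above):

```python
from types import SimpleNamespace

def assistant_replies(messages):
    """Collect assistant message text from a message iterable."""
    return [
        msg.content[0].text.value
        for msg in messages
        if msg.role == "assistant" and msg.content
    ]

# Stand-in messages mirroring the attribute shape used above
def make_msg(role, text):
    return SimpleNamespace(
        role=role,
        content=[SimpleNamespace(text=SimpleNamespace(value=text))],
    )

replies = assistant_replies([make_msg("assistant", "Sunny today."), make_msg("user", "Thanks!")])
```

The same helper works unchanged on the objects returned by `client.agents.messages.list(...)`, since it only touches the attributes shown in the flow above.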

## Connections

```python
# List all connections
connections = client.connections.list()
for conn in connections:
    print(f"{conn.name}: {conn.connection_type}")

# Get specific connection
connection = client.connections.get(connection_name="my-search-connection")
```

See [references/connections.md](references/connections.md) for connection patterns.

## Deployments

```python
# List available model deployments
deployments = client.deployments.list()
for deployment in deployments:
    print(f"{deployment.name}: {deployment.model}")
```

See [references/deployments.md](references/deployments.md) for deployment patterns.

## Datasets and Indexes

```python
# List datasets
datasets = client.datasets.list()

# List indexes
indexes = client.indexes.list()
```

See [references/datasets-indexes.md](references/datasets-indexes.md) for data operations.

## Evaluation

```python
# Using OpenAI client for evals
openai_client = client.get_openai_client()

# Create evaluation with built-in evaluators
eval_run = openai_client.evals.runs.create(
    eval_id="my-eval",
    name="quality-check",
    data_source={
        "type": "custom",
        "item_references": [{"item_id": "test-1"}],
    },
    testing_criteria=[
        {"type": "fluency"},
        {"type": "task_adherence"},
    ],
)
```

See [references/evaluation.md](references/evaluation.md) for evaluation patterns.

## Async Client

```python
from azure.ai.projects.aio import AIProjectClient

async with AIProjectClient(
    endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
) as client:
    agent = await client.agents.create_agent(...)
    # ... async operations
```

See [references/async-patterns.md](references/async-patterns.md) for async patterns.

## Memory Stores

```python
from azure.ai.agents import MemorySearchTool

# Create memory store for agent
memory_store = client.agents.create_memory_store(
    name="conversation-memory",
)

# Attach to agent for persistent memory
agent = client.agents.create_agent(
    model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
    name="memory-agent",
    tools=[MemorySearchTool()],
    tool_resources={"memory": {"store_ids": [memory_store.id]}},
)
```

## Best Practices

1. **Use context managers** for async client: `async with AIProjectClient(...) as client:`
2. **Clean up agents** when done: `client.agents.delete_agent(agent.id)`
3. **Use `create_and_process`** for simple runs, **streaming** for real-time UX
4. **Use versioned agents** for production deployments
5. **Prefer connections** for external service integration (AI Search, Bing, etc.)
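
Practices 1 and 2 combine into a create/use/delete pattern that cleans up even when a run raises. A sketch with a stand-in object (the `StubAgents` class below is illustrative only; the real `client.agents` calls follow the same shape):

```python
from types import SimpleNamespace

class StubAgents:
    """Stand-in for client.agents, recording deletions for illustration."""
    def __init__(self):
        self.deleted = []

    def create_agent(self, **kwargs):
        return SimpleNamespace(id="agent-123", **kwargs)

    def delete_agent(self, agent_id):
        self.deleted.append(agent_id)

agents = StubAgents()
agent = agents.create_agent(name="demo-agent")
try:
    pass  # run threads/messages here
finally:
    # Always delete the agent, even if the work above raises
    agents.delete_agent(agent.id)
```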

## SDK Comparison

| Feature | `azure-ai-projects` | `azure-ai-agents` |
|---------|---------------------|-------------------|
| Level | High-level (Foundry) | Low-level (Agents) |
| Client | `AIProjectClient` | `AgentsClient` |
| Versioning | `create_version()` | Not available |
| Connections | Yes | No |
| Deployments | Yes | No |
| Datasets/Indexes | Yes | No |
| Evaluation | Via OpenAI client | No |
| When to use | Full Foundry integration | Standalone agent apps |

## Reference Files

- [references/agents.md](references/agents.md): Agent operations with PromptAgentDefinition
- [references/tools.md](references/tools.md): All agent tools with examples
- [references/evaluation.md](references/evaluation.md): Evaluation operations overview
- [references/built-in-evaluators.md](references/built-in-evaluators.md): Complete built-in evaluator reference
- [references/custom-evaluators.md](references/custom-evaluators.md): Code and prompt-based evaluator patterns
- [references/connections.md](references/connections.md): Connection operations
- [references/deployments.md](references/deployments.md): Deployment enumeration
- [references/datasets-indexes.md](references/datasets-indexes.md): Dataset and index operations
- [references/async-patterns.md](references/async-patterns.md): Async client usage
- [references/api-reference.md](references/api-reference.md): Complete API reference for all 373 SDK exports (v2.0.0b4)
- [scripts/run_batch_evaluation.py](scripts/run_batch_evaluation.py): CLI tool for batch evaluations

---

**skills/azure-ai-projects-ts/SKILL.md** (new file, 289 lines)

---
name: azure-ai-projects-ts
description: Build AI applications using Azure AI Projects SDK for JavaScript (@azure/ai-projects). Use when working with Foundry project clients, agents, connections, deployments, datasets, indexes, evaluations, or getting OpenAI clients.
package: @azure/ai-projects
---

# Azure AI Projects SDK for TypeScript

High-level SDK for Azure AI Foundry projects with agents, connections, deployments, and evaluations.

## Installation

```bash
npm install @azure/ai-projects @azure/identity
```

For tracing:

```bash
npm install @azure/monitor-opentelemetry @opentelemetry/api
```

## Environment Variables

```bash
AZURE_AI_PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
MODEL_DEPLOYMENT_NAME=gpt-4o
```

## Authentication

```typescript
import { AIProjectClient } from "@azure/ai-projects";
import { DefaultAzureCredential } from "@azure/identity";

const client = new AIProjectClient(
  process.env.AZURE_AI_PROJECT_ENDPOINT!,
  new DefaultAzureCredential()
);
```

## Operation Groups

| Group | Purpose |
|-------|---------|
| `client.agents` | Create and manage AI agents |
| `client.connections` | List connected Azure resources |
| `client.deployments` | List model deployments |
| `client.datasets` | Upload and manage datasets |
| `client.indexes` | Create and manage search indexes |
| `client.evaluators` | Manage evaluation metrics |
| `client.memoryStores` | Manage agent memory |

## Getting OpenAI Client

```typescript
const openAIClient = await client.getOpenAIClient();

// Use for responses
const response = await openAIClient.responses.create({
  model: "gpt-4o",
  input: "What is the capital of France?"
});

// Use for conversations
const conversation = await openAIClient.conversations.create({
  items: [{ type: "message", role: "user", content: "Hello!" }]
});
```

## Agents

### Create Agent

```typescript
const agent = await client.agents.createVersion("my-agent", {
  kind: "prompt",
  model: "gpt-4o",
  instructions: "You are a helpful assistant."
});
```

### Agent with Tools

```typescript
// Code Interpreter
const agent = await client.agents.createVersion("code-agent", {
  kind: "prompt",
  model: "gpt-4o",
  instructions: "You can execute code.",
  tools: [{ type: "code_interpreter", container: { type: "auto" } }]
});

// File Search
const agent = await client.agents.createVersion("search-agent", {
  kind: "prompt",
  model: "gpt-4o",
  tools: [{ type: "file_search", vector_store_ids: [vectorStoreId] }]
});

// Web Search
const agent = await client.agents.createVersion("web-agent", {
  kind: "prompt",
  model: "gpt-4o",
  tools: [{
    type: "web_search_preview",
    user_location: { type: "approximate", country: "US", city: "Seattle" }
  }]
});

// Azure AI Search
const agent = await client.agents.createVersion("aisearch-agent", {
  kind: "prompt",
  model: "gpt-4o",
  tools: [{
    type: "azure_ai_search",
    azure_ai_search: {
      indexes: [{
        project_connection_id: connectionId,
        index_name: "my-index",
        query_type: "simple"
      }]
    }
  }]
});

// Function Tool
const agent = await client.agents.createVersion("func-agent", {
  kind: "prompt",
  model: "gpt-4o",
  tools: [{
    type: "function",
    function: {
      name: "get_weather",
      description: "Get weather for a location",
      strict: true,
      parameters: {
        type: "object",
        properties: { location: { type: "string" } },
        required: ["location"]
      }
    }
  }]
});

// MCP Tool
const agent = await client.agents.createVersion("mcp-agent", {
  kind: "prompt",
  model: "gpt-4o",
  tools: [{
    type: "mcp",
    server_label: "my-mcp",
    server_url: "https://mcp-server.example.com",
    require_approval: "always"
  }]
});
```

### Run Agent

```typescript
const openAIClient = await client.getOpenAIClient();

// Create conversation
const conversation = await openAIClient.conversations.create({
  items: [{ type: "message", role: "user", content: "Hello!" }]
});

// Generate response using agent
const response = await openAIClient.responses.create(
  { conversation: conversation.id },
  { body: { agent: { name: agent.name, type: "agent_reference" } } }
);

// Cleanup
await openAIClient.conversations.delete(conversation.id);
await client.agents.deleteVersion(agent.name, agent.version);
```

## Connections

```typescript
// List all connections
for await (const conn of client.connections.list()) {
  console.log(conn.name, conn.type);
}

// Get connection by name
const conn = await client.connections.get("my-connection");

// Get connection with credentials
const connWithCreds = await client.connections.getWithCredentials("my-connection");

// Get default connection by type
const defaultAzureOpenAI = await client.connections.getDefault("AzureOpenAI", true);
```

## Deployments

```typescript
// List all deployments
for await (const deployment of client.deployments.list()) {
  if (deployment.type === "ModelDeployment") {
    console.log(deployment.name, deployment.modelName);
  }
}

// Filter by publisher
for await (const d of client.deployments.list({ modelPublisher: "OpenAI" })) {
  console.log(d.name);
}

// Get specific deployment
const deployment = await client.deployments.get("gpt-4o");
```

## Datasets

```typescript
// Upload single file
const dataset = await client.datasets.uploadFile(
  "my-dataset",
  "1.0",
  "./data/training.jsonl"
);

// Upload folder
const dataset = await client.datasets.uploadFolder(
  "my-dataset",
  "2.0",
  "./data/documents/"
);

// Get dataset
const ds = await client.datasets.get("my-dataset", "1.0");

// List versions
for await (const version of client.datasets.listVersions("my-dataset")) {
  console.log(version);
}

// Delete
await client.datasets.delete("my-dataset", "1.0");
```

## Indexes

```typescript
import { AzureAISearchIndex } from "@azure/ai-projects";

const indexConfig: AzureAISearchIndex = {
  name: "my-index",
  type: "AzureSearch",
  version: "1",
  indexName: "my-index",
  connectionName: "search-connection"
};

// Create index
const index = await client.indexes.createOrUpdate("my-index", "1", indexConfig);

// List indexes
for await (const idx of client.indexes.list()) {
  console.log(idx.name);
}

// Delete
await client.indexes.delete("my-index", "1");
```

## Key Types

```typescript
import {
  AIProjectClient,
  AIProjectClientOptionalParams,
  Connection,
  ModelDeployment,
  DatasetVersionUnion,
  AzureAISearchIndex
} from "@azure/ai-projects";
```

## Best Practices

1. **Use getOpenAIClient()** - For responses, conversations, files, and vector stores
2. **Version your agents** - Use `createVersion` for reproducible agent definitions
3. **Clean up resources** - Delete agents and conversations when done
4. **Use connections** - Get credentials from project connections, don't hardcode
5. **Filter deployments** - Use the `modelPublisher` filter to find specific models

---

**skills/azure-ai-textanalytics-py/SKILL.md** (new file, 227 lines)

---
name: azure-ai-textanalytics-py
description: |
  Azure AI Text Analytics SDK for sentiment analysis, entity recognition, key phrases, language detection, PII, and healthcare NLP. Use for natural language processing on text.
  Triggers: "text analytics", "sentiment analysis", "entity recognition", "key phrase", "PII detection", "TextAnalyticsClient".
package: azure-ai-textanalytics
---

# Azure AI Text Analytics SDK for Python

Client library for Azure AI Language service NLP capabilities including sentiment, entities, key phrases, and more.

## Installation

```bash
pip install azure-ai-textanalytics
```

## Environment Variables

```bash
AZURE_LANGUAGE_ENDPOINT=https://<resource>.cognitiveservices.azure.com
AZURE_LANGUAGE_KEY=<your-api-key>  # If using API key
```

## Authentication

### API Key

```python
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

endpoint = os.environ["AZURE_LANGUAGE_ENDPOINT"]
key = os.environ["AZURE_LANGUAGE_KEY"]

client = TextAnalyticsClient(endpoint, AzureKeyCredential(key))
```

### Entra ID (Recommended)

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.identity import DefaultAzureCredential

client = TextAnalyticsClient(
    endpoint=os.environ["AZURE_LANGUAGE_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Sentiment Analysis

```python
documents = [
    "I had a wonderful trip to Seattle last week!",
    "The food was terrible and the service was slow."
]

result = client.analyze_sentiment(documents, show_opinion_mining=True)

for doc in result:
    if not doc.is_error:
        print(f"Sentiment: {doc.sentiment}")
        print(f"Scores: pos={doc.confidence_scores.positive:.2f}, "
              f"neg={doc.confidence_scores.negative:.2f}, "
              f"neu={doc.confidence_scores.neutral:.2f}")

        # Opinion mining (aspect-based sentiment)
        for sentence in doc.sentences:
            for opinion in sentence.mined_opinions:
                target = opinion.target
                print(f"  Target: '{target.text}' - {target.sentiment}")
                for assessment in opinion.assessments:
                    print(f"    Assessment: '{assessment.text}' - {assessment.sentiment}")
```

## Entity Recognition

```python
documents = ["Microsoft was founded by Bill Gates and Paul Allen in Albuquerque."]

result = client.recognize_entities(documents)

for doc in result:
    if not doc.is_error:
        for entity in doc.entities:
            print(f"Entity: {entity.text}")
            print(f"  Category: {entity.category}")
            print(f"  Subcategory: {entity.subcategory}")
            print(f"  Confidence: {entity.confidence_score:.2f}")
```

## PII Detection

```python
documents = ["My SSN is 123-45-6789 and my email is john@example.com"]

result = client.recognize_pii_entities(documents)

for doc in result:
    if not doc.is_error:
        print(f"Redacted: {doc.redacted_text}")
        for entity in doc.entities:
            print(f"PII: {entity.text} ({entity.category})")
```

## Key Phrase Extraction

```python
documents = ["Azure AI provides powerful machine learning capabilities for developers."]

result = client.extract_key_phrases(documents)

for doc in result:
    if not doc.is_error:
        print(f"Key phrases: {doc.key_phrases}")
```

## Language Detection

```python
documents = ["Ce document est en français.", "This is written in English."]

result = client.detect_language(documents)

for doc in result:
    if not doc.is_error:
        print(f"Language: {doc.primary_language.name} ({doc.primary_language.iso6391_name})")
        print(f"Confidence: {doc.primary_language.confidence_score:.2f}")
```

## Healthcare Text Analytics

```python
documents = ["Patient has diabetes and was prescribed metformin 500mg twice daily."]

poller = client.begin_analyze_healthcare_entities(documents)
result = poller.result()

for doc in result:
    if not doc.is_error:
        for entity in doc.entities:
            print(f"Entity: {entity.text}")
            print(f"  Category: {entity.category}")
            print(f"  Normalized: {entity.normalized_text}")

            # Entity links (UMLS, etc.) -- may be absent for some entities
            if entity.data_sources:
                for link in entity.data_sources:
                    print(f"  Link: {link.name} - {link.entity_id}")
```

## Multiple Analysis (Batch)

```python
from azure.ai.textanalytics import (
    RecognizeEntitiesAction,
    ExtractKeyPhrasesAction,
    AnalyzeSentimentAction
)

documents = ["Microsoft announced new Azure AI features at Build conference."]

poller = client.begin_analyze_actions(
    documents,
    actions=[
        RecognizeEntitiesAction(),
        ExtractKeyPhrasesAction(),
        AnalyzeSentimentAction()
    ]
)

results = poller.result()
for doc_results in results:
    for result in doc_results:
        if result.kind == "EntityRecognition":
            print(f"Entities: {[e.text for e in result.entities]}")
        elif result.kind == "KeyPhraseExtraction":
            print(f"Key phrases: {result.key_phrases}")
        elif result.kind == "SentimentAnalysis":
            print(f"Sentiment: {result.sentiment}")
```

## Async Client

```python
from azure.ai.textanalytics.aio import TextAnalyticsClient
from azure.identity.aio import DefaultAzureCredential

async def analyze():
    async with TextAnalyticsClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        result = await client.analyze_sentiment(documents)
        # Process results...
```

## Client Types

| Client | Purpose |
|--------|---------|
| `TextAnalyticsClient` | All text analytics operations |
| `TextAnalyticsClient` (aio) | Async version |

## Available Operations

| Method | Description |
|--------|-------------|
| `analyze_sentiment` | Sentiment analysis with opinion mining |
| `recognize_entities` | Named entity recognition |
| `recognize_pii_entities` | PII detection and redaction |
| `recognize_linked_entities` | Entity linking to Wikipedia |
| `extract_key_phrases` | Key phrase extraction |
| `detect_language` | Language detection |
| `begin_analyze_healthcare_entities` | Healthcare NLP (long-running) |
| `begin_analyze_actions` | Multiple analyses in batch |

## Best Practices

1. **Use batch operations** for multiple documents (up to 10 per request)
2. **Enable opinion mining** for detailed aspect-based sentiment
3. **Use async client** for high-throughput scenarios
4. **Handle document errors** (results list may contain errors for some docs)
5. **Specify language** when known to improve accuracy
6. **Use context manager** or close client explicitly
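
Practice 1 implies chunking larger document sets on the client side. A small sketch (plain Python, no SDK required) of splitting documents into service-sized batches:

```python
def batched(documents, batch_size=10):
    """Yield successive batches of at most batch_size documents."""
    for start in range(0, len(documents), batch_size):
        yield documents[start:start + batch_size]

docs = [f"doc-{i}" for i in range(23)]
batches = list(batched(docs))
# Each batch can then be passed to client.analyze_sentiment(batch), etc.
```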

---

**skills/azure-ai-transcription-py/SKILL.md** (new file, 69 lines)

---
name: azure-ai-transcription-py
description: |
  Azure AI Transcription SDK for Python. Use for real-time and batch speech-to-text transcription with timestamps and diarization.
  Triggers: "transcription", "speech to text", "Azure AI Transcription", "TranscriptionClient".
package: azure-ai-transcription
---

# Azure AI Transcription SDK for Python

Client library for Azure AI Transcription (speech-to-text) with real-time and batch transcription.

## Installation

```bash
pip install azure-ai-transcription
```

## Environment Variables

```bash
TRANSCRIPTION_ENDPOINT=https://<resource>.cognitiveservices.azure.com
TRANSCRIPTION_KEY=<your-key>
```

## Authentication

Use subscription key authentication (DefaultAzureCredential is not supported for this client):

```python
import os
from azure.ai.transcription import TranscriptionClient

client = TranscriptionClient(
    endpoint=os.environ["TRANSCRIPTION_ENDPOINT"],
    credential=os.environ["TRANSCRIPTION_KEY"]
)
```

## Transcription (Batch)

```python
job = client.begin_transcription(
    name="meeting-transcription",
    locale="en-US",
    content_urls=["https://<storage>/audio.wav"],
    diarization_enabled=True
)
result = job.result()
print(result.status)
```

## Transcription (Real-time)

```python
stream = client.begin_stream_transcription(locale="en-US")
stream.send_audio_file("audio.wav")
for event in stream:
    print(event.text)
```

## Best Practices

1. **Enable diarization** when multiple speakers are present
2. **Use batch transcription** for long files stored in blob storage
3. **Capture timestamps** for subtitle generation
4. **Specify language** to improve recognition accuracy
5. **Handle streaming backpressure** for real-time transcription
6. **Close transcription sessions** when complete
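
For practice 3, subtitle formats such as SRT expect `HH:MM:SS,mmm` timestamps. A sketch (plain Python, independent of this SDK) of converting second offsets into that form:

```python
def srt_timestamp(seconds: float) -> str:
    """Format a second offset as an SRT timestamp (HH:MM:SS,mmm)."""
    millis = round(seconds * 1000)
    hours, rem = divmod(millis, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    secs, ms = divmod(rem, 1000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d},{ms:03d}"

# An SRT cue line for a phrase spoken from 3.5s to 7.25s
line = f"{srt_timestamp(3.5)} --> {srt_timestamp(7.25)}"
```

Word or phrase offsets from the transcription result can be fed straight into this helper when emitting subtitle files.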

---

**skills/azure-ai-translation-document-py/SKILL.md** (new file, 249 lines)

---
name: azure-ai-translation-document-py
description: |
  Azure AI Document Translation SDK for batch translation of documents with format preservation. Use for translating Word, PDF, Excel, PowerPoint, and other document formats at scale.
  Triggers: "document translation", "batch translation", "translate documents", "DocumentTranslationClient".
package: azure-ai-translation-document
---

# Azure AI Document Translation SDK for Python

Client library for the Azure AI Translator document translation service, providing batch document translation with format preservation.

## Installation

```bash
pip install azure-ai-translation-document
```

## Environment Variables

```bash
AZURE_DOCUMENT_TRANSLATION_ENDPOINT=https://<resource>.cognitiveservices.azure.com
AZURE_DOCUMENT_TRANSLATION_KEY=<your-api-key>  # If using API key

# Storage for source and target documents
AZURE_SOURCE_CONTAINER_URL=https://<storage>.blob.core.windows.net/<container>?<sas>
AZURE_TARGET_CONTAINER_URL=https://<storage>.blob.core.windows.net/<container>?<sas>
```

## Authentication

### API Key

```python
import os
from azure.ai.translation.document import DocumentTranslationClient
from azure.core.credentials import AzureKeyCredential

endpoint = os.environ["AZURE_DOCUMENT_TRANSLATION_ENDPOINT"]
key = os.environ["AZURE_DOCUMENT_TRANSLATION_KEY"]

client = DocumentTranslationClient(endpoint, AzureKeyCredential(key))
```

### Entra ID (Recommended)

```python
from azure.ai.translation.document import DocumentTranslationClient
from azure.identity import DefaultAzureCredential

client = DocumentTranslationClient(
    endpoint=os.environ["AZURE_DOCUMENT_TRANSLATION_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Basic Document Translation

```python
from azure.ai.translation.document import DocumentTranslationInput, TranslationTarget

source_url = os.environ["AZURE_SOURCE_CONTAINER_URL"]
target_url = os.environ["AZURE_TARGET_CONTAINER_URL"]

# Start translation job
poller = client.begin_translation(
    inputs=[
        DocumentTranslationInput(
            source_url=source_url,
            targets=[
                TranslationTarget(
                    target_url=target_url,
                    language="es"  # Translate to Spanish
                )
            ]
        )
    ]
)

# Wait for completion
result = poller.result()

print(f"Status: {poller.status()}")
print(f"Documents translated: {poller.details.documents_succeeded_count}")
print(f"Documents failed: {poller.details.documents_failed_count}")
```

## Multiple Target Languages

```python
poller = client.begin_translation(
    inputs=[
        DocumentTranslationInput(
            source_url=source_url,
            targets=[
                TranslationTarget(target_url=target_url_es, language="es"),
                TranslationTarget(target_url=target_url_fr, language="fr"),
                TranslationTarget(target_url=target_url_de, language="de")
            ]
        )
    ]
)
```

## Translate Single Document

```python
from azure.ai.translation.document import SingleDocumentTranslationClient

single_client = SingleDocumentTranslationClient(endpoint, AzureKeyCredential(key))

with open("document.docx", "rb") as f:
    document_content = f.read()

result = single_client.translate(
    body=document_content,
    target_language="es",
    content_type="application/vnd.openxmlformats-officedocument.wordprocessingml.document"
)

# Save translated document
with open("document_es.docx", "wb") as f:
    f.write(result)
```

## Check Translation Status

```python
# Get all translation operations
operations = client.list_translation_statuses()

for op in operations:
    print(f"Operation ID: {op.id}")
    print(f"Status: {op.status}")
    print(f"Created: {op.created_on}")
    print(f"Total documents: {op.documents_total_count}")
    print(f"Succeeded: {op.documents_succeeded_count}")
    print(f"Failed: {op.documents_failed_count}")
```

## List Document Statuses

```python
# Get status of individual documents in a job
operation_id = poller.id
document_statuses = client.list_document_statuses(operation_id)

for doc in document_statuses:
    print(f"Document: {doc.source_document_url}")
    print(f"  Status: {doc.status}")
    print(f"  Translated to: {doc.translated_to}")
    if doc.error:
        print(f"  Error: {doc.error.message}")
```

## Cancel Translation

```python
# Cancel a running translation
client.cancel_translation(operation_id)
```

## Using Glossary

```python
from azure.ai.translation.document import TranslationGlossary

poller = client.begin_translation(
    inputs=[
        DocumentTranslationInput(
            source_url=source_url,
            targets=[
                TranslationTarget(
                    target_url=target_url,
                    language="es",
                    glossaries=[
                        TranslationGlossary(
                            glossary_url="https://<storage>.blob.core.windows.net/glossary/terms.csv?<sas>",
                            file_format="csv"
                        )
                    ]
                )
            ]
        )
    ]
)
```

## Supported Document Formats

```python
# Get supported formats
formats = client.get_supported_document_formats()

for fmt in formats:
    print(f"Format: {fmt.format}")
    print(f"  Extensions: {fmt.file_extensions}")
    print(f"  Content types: {fmt.content_types}")
```

## Supported Languages

```python
# Get supported languages
languages = client.get_supported_languages()

for lang in languages:
    print(f"Language: {lang.name} ({lang.code})")
```

## Async Client

```python
from azure.ai.translation.document.aio import DocumentTranslationClient
from azure.identity.aio import DefaultAzureCredential

async def translate_documents():
    async with DocumentTranslationClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        poller = await client.begin_translation(inputs=[...])
        result = await poller.result()
```
|
||||
|
||||
## Supported Formats
|
||||
|
||||
| Category | Formats |
|
||||
|----------|---------|
|
||||
| Documents | DOCX, PDF, PPTX, XLSX, HTML, TXT, RTF |
|
||||
| Structured | CSV, TSV, JSON, XML |
|
||||
| Localization | XLIFF, XLF, MHTML |
|
||||
|
||||
## Storage Requirements
|
||||
|
||||
- Source and target containers must be Azure Blob Storage
|
||||
- Use SAS tokens with appropriate permissions:
|
||||
- Source: Read, List
|
||||
- Target: Write, List
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Use SAS tokens** with minimal required permissions
|
||||
2. **Monitor long-running operations** with `poller.status()`
|
||||
3. **Handle document-level errors** by iterating document statuses
|
||||
4. **Use glossaries** for domain-specific terminology
|
||||
5. **Separate target containers** for each language
|
||||
6. **Use async client** for multiple concurrent jobs
|
||||
7. **Check supported formats** before submitting documents
|
||||
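
A minimal sketch of best practice 2: poll the long-running operation until it reaches a terminal state. `wait_for_translation` and the state set are illustrative assumptions, not part of the SDK (the real poller returned by `begin_translation` also offers `poller.result()`, which blocks for you):

```python
import time

# Terminal states reported by the Document Translation service (assumed list)
TERMINAL_STATES = {"Succeeded", "Failed", "Cancelled", "ValidationFailed"}

def wait_for_translation(poller, interval=5.0):
    """Poll a translation operation until it finishes, then return its result."""
    while poller.status() not in TERMINAL_STATES:
        time.sleep(interval)
    return poller.result()
```

This keeps status checks in one place, so document-level error handling (best practice 3) can run on the returned statuses afterwards.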
274
skills/azure-ai-translation-text-py/SKILL.md
Normal file
@@ -0,0 +1,274 @@
---
name: azure-ai-translation-text-py
description: |
  Azure AI Text Translation SDK for real-time text translation, transliteration, language detection, and dictionary lookup. Use for translating text content in applications.
  Triggers: "text translation", "translator", "translate text", "transliterate", "TextTranslationClient".
package: azure-ai-translation-text
---

# Azure AI Text Translation SDK for Python

Client library for the Azure AI Translator service, covering real-time text translation, transliteration, and language operations.

## Installation

```bash
pip install azure-ai-translation-text
```

## Environment Variables

```bash
AZURE_TRANSLATOR_KEY=<your-api-key>
AZURE_TRANSLATOR_REGION=<your-region>  # e.g., eastus, westus2
# Or use a custom endpoint
AZURE_TRANSLATOR_ENDPOINT=https://<resource>.cognitiveservices.azure.com
```

## Authentication

### API Key with Region

```python
import os
from azure.ai.translation.text import TextTranslationClient
from azure.core.credentials import AzureKeyCredential

key = os.environ["AZURE_TRANSLATOR_KEY"]
region = os.environ["AZURE_TRANSLATOR_REGION"]

# Create credential with region
credential = AzureKeyCredential(key)
client = TextTranslationClient(credential=credential, region=region)
```

### API Key with Custom Endpoint

```python
endpoint = os.environ["AZURE_TRANSLATOR_ENDPOINT"]

client = TextTranslationClient(
    credential=AzureKeyCredential(key),
    endpoint=endpoint
)
```

### Entra ID (Recommended)

```python
from azure.ai.translation.text import TextTranslationClient
from azure.identity import DefaultAzureCredential

client = TextTranslationClient(
    credential=DefaultAzureCredential(),
    endpoint=os.environ["AZURE_TRANSLATOR_ENDPOINT"]
)
```

## Basic Translation

```python
# Translate to a single language
result = client.translate(
    body=["Hello, how are you?", "Welcome to Azure!"],
    to=["es"]  # Spanish
)

for item in result:
    for translation in item.translations:
        print(f"Translated: {translation.text}")
        print(f"Target language: {translation.to}")
```

## Translate to Multiple Languages

```python
result = client.translate(
    body=["Hello, world!"],
    to=["es", "fr", "de", "ja"]  # Spanish, French, German, Japanese
)

for item in result:
    print(f"Source: {item.detected_language.language if item.detected_language else 'unknown'}")
    for translation in item.translations:
        print(f"  {translation.to}: {translation.text}")
```

## Specify Source Language

```python
result = client.translate(
    body=["Bonjour le monde"],
    from_parameter="fr",  # Source is French
    to=["en", "es"]
)
```

## Language Detection

```python
result = client.translate(
    body=["Hola, como estas?"],
    to=["en"]
)

for item in result:
    if item.detected_language:
        print(f"Detected language: {item.detected_language.language}")
        print(f"Confidence: {item.detected_language.score:.2f}")
```

## Transliteration

Convert text from one script to another:

```python
result = client.transliterate(
    body=["konnichiwa"],
    language="ja",
    from_script="Latn",  # From Latin script
    to_script="Jpan"     # To Japanese script
)

for item in result:
    print(f"Transliterated: {item.text}")
    print(f"Script: {item.script}")
```

## Dictionary Lookup

Find alternate translations and definitions:

```python
result = client.lookup_dictionary_entries(
    body=["fly"],
    from_parameter="en",
    to="es"
)

for item in result:
    print(f"Source: {item.normalized_source} ({item.display_source})")
    for translation in item.translations:
        print(f"  Translation: {translation.normalized_target}")
        print(f"  Part of speech: {translation.pos_tag}")
        print(f"  Confidence: {translation.confidence:.2f}")
```

## Dictionary Examples

Get usage examples for translations:

```python
from azure.ai.translation.text.models import DictionaryExampleTextItem

result = client.lookup_dictionary_examples(
    body=[DictionaryExampleTextItem(text="fly", translation="volar")],
    from_parameter="en",
    to="es"
)

for item in result:
    for example in item.examples:
        print(f"Source: {example.source_prefix}{example.source_term}{example.source_suffix}")
        print(f"Target: {example.target_prefix}{example.target_term}{example.target_suffix}")
```

## Get Supported Languages

```python
# Get all supported languages
languages = client.get_supported_languages()

# Translation languages
print("Translation languages:")
for code, lang in languages.translation.items():
    print(f"  {code}: {lang.name} ({lang.native_name})")

# Transliteration languages
print("\nTransliteration languages:")
for code, lang in languages.transliteration.items():
    print(f"  {code}: {lang.name}")
    for script in lang.scripts:
        print(f"    {script.code} -> {[t.code for t in script.to_scripts]}")

# Dictionary languages
print("\nDictionary languages:")
for code, lang in languages.dictionary.items():
    print(f"  {code}: {lang.name}")
```

## Break Sentence

Identify sentence boundaries:

```python
result = client.find_sentence_boundaries(
    body=["Hello! How are you? I hope you are well."],
    language="en"
)

for item in result:
    print(f"Sentence lengths: {item.sent_len}")
```

## Translation Options

```python
result = client.translate(
    body=["Hello, world!"],
    to=["de"],
    text_type="html",              # "plain" or "html"
    profanity_action="Marked",     # "NoAction", "Deleted", "Marked"
    profanity_marker="Asterisk",   # "Asterisk", "Tag"
    include_alignment=True,        # Include word alignment
    include_sentence_length=True   # Include sentence boundaries
)

for item in result:
    translation = item.translations[0]
    print(f"Translated: {translation.text}")
    if translation.alignment:
        print(f"Alignment: {translation.alignment.proj}")
    if translation.sent_len:
        print(f"Sentence lengths: {translation.sent_len.src_sent_len}")
```

## Async Client

```python
from azure.ai.translation.text.aio import TextTranslationClient
from azure.core.credentials import AzureKeyCredential

async def translate_text():
    async with TextTranslationClient(
        credential=AzureKeyCredential(key),
        region=region
    ) as client:
        result = await client.translate(
            body=["Hello, world!"],
            to=["es"]
        )
        print(result[0].translations[0].text)
```

## Client Methods

| Method | Description |
|--------|-------------|
| `translate` | Translate text to one or more languages |
| `transliterate` | Convert text between scripts |
| `detect` | Detect language of text |
| `find_sentence_boundaries` | Identify sentence boundaries |
| `lookup_dictionary_entries` | Dictionary lookup for translations |
| `lookup_dictionary_examples` | Get usage examples |
| `get_supported_languages` | List supported languages |

## Best Practices

1. **Batch translations** — Send multiple texts in one request (up to 100)
2. **Specify the source language** when known to improve accuracy
3. **Use the async client** for high-throughput scenarios
4. **Cache the language list** — Supported languages don't change frequently
5. **Handle profanity** appropriately for your application
6. **Use `html` text_type** when translating HTML content
7. **Include alignment** for applications needing word mapping
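
Best practice 1 can be sketched as a small batching helper; `chunked` is a hypothetical utility (not part of the SDK) that splits a large list into requests of at most 100 texts:

```python
MAX_TEXTS_PER_REQUEST = 100  # per-request limit noted above

def chunked(texts, size=MAX_TEXTS_PER_REQUEST):
    """Yield successive batches of at most `size` texts."""
    for start in range(0, len(texts), size):
        yield texts[start:start + size]

# Hypothetical usage with the client from the earlier examples:
# for batch in chunked(all_texts):
#     results.extend(client.translate(body=batch, to=["es"]))
```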
286
skills/azure-ai-translation-ts/SKILL.md
Normal file
@@ -0,0 +1,286 @@
---
name: azure-ai-translation-ts
description: Build translation applications using Azure Translation SDKs for JavaScript (@azure-rest/ai-translation-text, @azure-rest/ai-translation-document). Use when implementing text translation, transliteration, language detection, or batch document translation.
package: @azure-rest/ai-translation-text, @azure-rest/ai-translation-document
---

# Azure Translation SDKs for TypeScript

Text and document translation with REST-style clients.

## Installation

```bash
# Text translation
npm install @azure-rest/ai-translation-text @azure/identity

# Document translation
npm install @azure-rest/ai-translation-document @azure/identity
```

## Environment Variables

```bash
TRANSLATOR_ENDPOINT=https://api.cognitive.microsofttranslator.com
TRANSLATOR_SUBSCRIPTION_KEY=<your-api-key>
TRANSLATOR_REGION=<your-region>  # e.g., westus, eastus
```

## Text Translation Client

### Authentication

```typescript
import TextTranslationClient, { TranslatorCredential } from "@azure-rest/ai-translation-text";

// API Key + Region
const credential: TranslatorCredential = {
  key: process.env.TRANSLATOR_SUBSCRIPTION_KEY!,
  region: process.env.TRANSLATOR_REGION!,
};
const client = TextTranslationClient(process.env.TRANSLATOR_ENDPOINT!, credential);

// Or just the credential (uses the global endpoint)
const client2 = TextTranslationClient(credential);
```

### Translate Text

```typescript
import TextTranslationClient, { isUnexpected } from "@azure-rest/ai-translation-text";

const response = await client.path("/translate").post({
  body: {
    inputs: [
      {
        text: "Hello, how are you?",
        language: "en", // source (optional, auto-detect)
        targets: [
          { language: "es" },
          { language: "fr" },
        ],
      },
    ],
  },
});

if (isUnexpected(response)) {
  throw response.body.error;
}

for (const result of response.body.value) {
  for (const translation of result.translations) {
    console.log(`${translation.language}: ${translation.text}`);
  }
}
```

### Translate with Options

```typescript
const response = await client.path("/translate").post({
  body: {
    inputs: [
      {
        text: "Hello world",
        language: "en",
        textType: "Plain", // or "Html"
        targets: [
          {
            language: "de",
            profanityAction: "NoAction", // "Marked" | "Deleted"
            tone: "formal", // LLM-specific
          },
        ],
      },
    ],
  },
});
```

### Get Supported Languages

```typescript
const response = await client.path("/languages").get();

if (isUnexpected(response)) {
  throw response.body.error;
}

// Translation languages
for (const [code, lang] of Object.entries(response.body.translation || {})) {
  console.log(`${code}: ${lang.name} (${lang.nativeName})`);
}
```

### Transliterate

```typescript
const response = await client.path("/transliterate").post({
  body: { inputs: [{ text: "这是个测试" }] },
  queryParameters: {
    language: "zh-Hans",
    fromScript: "Hans",
    toScript: "Latn",
  },
});

if (!isUnexpected(response)) {
  for (const t of response.body.value) {
    console.log(`${t.script}: ${t.text}`); // Latn: zhè shì gè cè shì
  }
}
```

### Detect Language

```typescript
const response = await client.path("/detect").post({
  body: { inputs: [{ text: "Bonjour le monde" }] },
});

if (!isUnexpected(response)) {
  for (const result of response.body.value) {
    console.log(`Language: ${result.language}, Score: ${result.score}`);
  }
}
```

## Document Translation Client

### Authentication

```typescript
import DocumentTranslationClient from "@azure-rest/ai-translation-document";
import { DefaultAzureCredential } from "@azure/identity";

const endpoint = "https://<translator>.cognitiveservices.azure.com";

// TokenCredential
const client = DocumentTranslationClient(endpoint, new DefaultAzureCredential());

// API Key
const client2 = DocumentTranslationClient(endpoint, { key: "<api-key>" });
```

### Single Document Translation

```typescript
import DocumentTranslationClient from "@azure-rest/ai-translation-document";
import { writeFile } from "node:fs/promises";

const response = await client.path("/document:translate").post({
  queryParameters: {
    targetLanguage: "es",
    sourceLanguage: "en", // optional
  },
  contentType: "multipart/form-data",
  body: [
    {
      name: "document",
      body: "Hello, this is a test document.",
      filename: "test.txt",
      contentType: "text/plain",
    },
  ],
}).asNodeStream();

if (response.status === "200") {
  await writeFile("translated.txt", response.body);
}
```

### Batch Document Translation

```typescript
import { ContainerSASPermissions, BlobServiceClient } from "@azure/storage-blob";

// Generate SAS URLs for source and target containers
const sourceSas = await sourceContainer.generateSasUrl({
  permissions: ContainerSASPermissions.parse("rl"),
  expiresOn: new Date(Date.now() + 24 * 60 * 60 * 1000),
});

const targetSas = await targetContainer.generateSasUrl({
  permissions: ContainerSASPermissions.parse("rwl"),
  expiresOn: new Date(Date.now() + 24 * 60 * 60 * 1000),
});

// Start batch translation
const response = await client.path("/document/batches").post({
  body: {
    inputs: [
      {
        source: { sourceUrl: sourceSas },
        targets: [
          { targetUrl: targetSas, language: "fr" },
        ],
      },
    ],
  },
});

// Get the operation ID from the response header
const operationId = new URL(response.headers["operation-location"])
  .pathname.split("/").pop();
```

### Get Translation Status

```typescript
import { isUnexpected, paginate } from "@azure-rest/ai-translation-document";

const statusResponse = await client.path("/document/batches/{id}", operationId).get();

if (!isUnexpected(statusResponse)) {
  const status = statusResponse.body;
  console.log(`Status: ${status.status}`);
  console.log(`Total: ${status.summary.total}`);
  console.log(`Success: ${status.summary.success}`);
}

// List documents with pagination
const docsResponse = await client.path("/document/batches/{id}/documents", operationId).get();
const documents = paginate(client, docsResponse);

for await (const doc of documents) {
  console.log(`${doc.id}: ${doc.status}`);
}
```

### Get Supported Formats

```typescript
const response = await client.path("/document/formats").get();

if (!isUnexpected(response)) {
  for (const format of response.body.value) {
    console.log(`${format.format}: ${format.fileExtensions.join(", ")}`);
  }
}
```

## Key Types

```typescript
// Text Translation
import type {
  TranslatorCredential,
  TranslatorTokenCredential,
} from "@azure-rest/ai-translation-text";

// Document Translation
import type {
  DocumentTranslateParameters,
  StartTranslationDetails,
  TranslationStatus,
} from "@azure-rest/ai-translation-document";
```

## Best Practices

1. **Auto-detect source** - Omit the `language` parameter to auto-detect
2. **Batch requests** - Translate multiple texts in one call for efficiency
3. **Use SAS tokens** - For document translation, use time-limited SAS URLs
4. **Handle errors** - Always check `isUnexpected(response)` before accessing the body
5. **Regional endpoints** - Use regional endpoints for lower latency
289
skills/azure-ai-vision-imageanalysis-java/SKILL.md
Normal file
@@ -0,0 +1,289 @@
---
name: azure-ai-vision-imageanalysis-java
description: Build image analysis applications with Azure AI Vision SDK for Java. Use when implementing image captioning, OCR text extraction, object detection, tagging, or smart cropping.
package: com.azure:azure-ai-vision-imageanalysis
---

# Azure AI Vision Image Analysis SDK for Java

Build image analysis applications using the Azure AI Vision Image Analysis SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-vision-imageanalysis</artifactId>
    <version>1.1.0-beta.1</version>
</dependency>
```

## Client Creation

### With API Key

```java
import com.azure.ai.vision.imageanalysis.ImageAnalysisClient;
import com.azure.ai.vision.imageanalysis.ImageAnalysisClientBuilder;
import com.azure.core.credential.KeyCredential;

String endpoint = System.getenv("VISION_ENDPOINT");
String key = System.getenv("VISION_KEY");

ImageAnalysisClient client = new ImageAnalysisClientBuilder()
    .endpoint(endpoint)
    .credential(new KeyCredential(key))
    .buildClient();
```

### Async Client

```java
import com.azure.ai.vision.imageanalysis.ImageAnalysisAsyncClient;

ImageAnalysisAsyncClient asyncClient = new ImageAnalysisClientBuilder()
    .endpoint(endpoint)
    .credential(new KeyCredential(key))
    .buildAsyncClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

ImageAnalysisClient client = new ImageAnalysisClientBuilder()
    .endpoint(endpoint)
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Visual Features

| Feature | Description |
|---------|-------------|
| `CAPTION` | Generate human-readable image description |
| `DENSE_CAPTIONS` | Captions for up to 10 regions |
| `READ` | OCR - Extract text from images |
| `TAGS` | Content tags for objects, scenes, actions |
| `OBJECTS` | Detect objects with bounding boxes |
| `SMART_CROPS` | Smart thumbnail regions |
| `PEOPLE` | Detect people with locations |

## Core Patterns

### Generate Caption

```java
import com.azure.ai.vision.imageanalysis.models.*;
import com.azure.core.util.BinaryData;
import java.io.File;
import java.util.Arrays;

// From file
BinaryData imageData = BinaryData.fromFile(new File("image.jpg").toPath());

ImageAnalysisResult result = client.analyze(
    imageData,
    Arrays.asList(VisualFeatures.CAPTION),
    new ImageAnalysisOptions().setGenderNeutralCaption(true));

System.out.printf("Caption: \"%s\" (confidence: %.4f)%n",
    result.getCaption().getText(),
    result.getCaption().getConfidence());
```

### Generate Caption from URL

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    "https://example.com/image.jpg",
    Arrays.asList(VisualFeatures.CAPTION),
    new ImageAnalysisOptions().setGenderNeutralCaption(true));

System.out.printf("Caption: \"%s\"%n", result.getCaption().getText());
```

### Extract Text (OCR)

```java
ImageAnalysisResult result = client.analyze(
    BinaryData.fromFile(new File("document.jpg").toPath()),
    Arrays.asList(VisualFeatures.READ),
    null);

for (DetectedTextBlock block : result.getRead().getBlocks()) {
    for (DetectedTextLine line : block.getLines()) {
        System.out.printf("Line: '%s'%n", line.getText());
        System.out.printf("  Bounding polygon: %s%n", line.getBoundingPolygon());

        for (DetectedTextWord word : line.getWords()) {
            System.out.printf("  Word: '%s' (confidence: %.4f)%n",
                word.getText(),
                word.getConfidence());
        }
    }
}
```

### Detect Objects

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.OBJECTS),
    null);

for (DetectedObject obj : result.getObjects()) {
    System.out.printf("Object: %s (confidence: %.4f)%n",
        obj.getTags().get(0).getName(),
        obj.getTags().get(0).getConfidence());

    ImageBoundingBox box = obj.getBoundingBox();
    System.out.printf("  Location: x=%d, y=%d, w=%d, h=%d%n",
        box.getX(), box.getY(), box.getWidth(), box.getHeight());
}
```

### Get Tags

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.TAGS),
    null);

for (DetectedTag tag : result.getTags()) {
    System.out.printf("Tag: %s (confidence: %.4f)%n",
        tag.getName(),
        tag.getConfidence());
}
```

### Detect People

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.PEOPLE),
    null);

for (DetectedPerson person : result.getPeople()) {
    ImageBoundingBox box = person.getBoundingBox();
    System.out.printf("Person at x=%d, y=%d (confidence: %.4f)%n",
        box.getX(), box.getY(), person.getConfidence());
}
```

### Smart Cropping

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.SMART_CROPS),
    new ImageAnalysisOptions().setSmartCropsAspectRatios(Arrays.asList(1.0, 1.5)));

for (CropRegion crop : result.getSmartCrops()) {
    System.out.printf("Crop region: aspect=%.2f, x=%d, y=%d, w=%d, h=%d%n",
        crop.getAspectRatio(),
        crop.getBoundingBox().getX(),
        crop.getBoundingBox().getY(),
        crop.getBoundingBox().getWidth(),
        crop.getBoundingBox().getHeight());
}
```

### Dense Captions

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.DENSE_CAPTIONS),
    new ImageAnalysisOptions().setGenderNeutralCaption(true));

for (DenseCaption caption : result.getDenseCaptions()) {
    System.out.printf("Caption: \"%s\" (confidence: %.4f)%n",
        caption.getText(),
        caption.getConfidence());
    System.out.printf("  Region: x=%d, y=%d, w=%d, h=%d%n",
        caption.getBoundingBox().getX(),
        caption.getBoundingBox().getY(),
        caption.getBoundingBox().getWidth(),
        caption.getBoundingBox().getHeight());
}
```

### Multiple Features

```java
ImageAnalysisResult result = client.analyzeFromUrl(
    imageUrl,
    Arrays.asList(
        VisualFeatures.CAPTION,
        VisualFeatures.TAGS,
        VisualFeatures.OBJECTS,
        VisualFeatures.READ),
    new ImageAnalysisOptions()
        .setGenderNeutralCaption(true)
        .setLanguage("en"));

// Access all results
System.out.println("Caption: " + result.getCaption().getText());
System.out.println("Tags: " + result.getTags().size());
System.out.println("Objects: " + result.getObjects().size());
System.out.println("Text blocks: " + result.getRead().getBlocks().size());
```

### Async Analysis

```java
asyncClient.analyzeFromUrl(
    imageUrl,
    Arrays.asList(VisualFeatures.CAPTION),
    null)
    .subscribe(
        result -> System.out.println("Caption: " + result.getCaption().getText()),
        error -> System.err.println("Error: " + error.getMessage()),
        () -> System.out.println("Complete")
    );
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.analyzeFromUrl(imageUrl, Arrays.asList(VisualFeatures.CAPTION), null);
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
VISION_ENDPOINT=https://<resource>.cognitiveservices.azure.com/
VISION_KEY=<your-api-key>
```

## Image Requirements

- Formats: JPEG, PNG, GIF, BMP, WEBP, ICO, TIFF, MPO
- Size: < 20 MB
- Dimensions: 50x50 to 16000x16000 pixels

## Regional Availability

Caption and Dense Captions require GPU-supported regions. Check [supported regions](https://learn.microsoft.com/azure/ai-services/computer-vision/concept-describe-images-40) before deployment.

## Trigger Phrases

- "image analysis Java"
- "Azure Vision SDK"
- "image captioning"
- "OCR image text extraction"
- "object detection image"
- "smart crop thumbnail"
- "detect people image"
260
skills/azure-ai-vision-imageanalysis-py/SKILL.md
Normal file
@@ -0,0 +1,260 @@
|
||||
---
|
||||
name: azure-ai-vision-imageanalysis-py
|
||||
description: |
|
||||
Azure AI Vision Image Analysis SDK for captions, tags, objects, OCR, people detection, and smart cropping. Use for computer vision and image understanding tasks.
|
||||
Triggers: "image analysis", "computer vision", "OCR", "object detection", "ImageAnalysisClient", "image caption".
|
||||
package: azure-ai-vision-imageanalysis
|
||||
---
|
||||
|
||||
# Azure AI Vision Image Analysis SDK for Python
|
||||
|
||||
Client library for Azure AI Vision 4.0 image analysis including captions, tags, objects, OCR, and more.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
pip install azure-ai-vision-imageanalysis
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
VISION_ENDPOINT=https://<resource>.cognitiveservices.azure.com
|
||||
VISION_KEY=<your-api-key> # If using API key
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### API Key
|
||||
|
||||
```python
|
||||
import os
|
||||
from azure.ai.vision.imageanalysis import ImageAnalysisClient
|
||||
from azure.core.credentials import AzureKeyCredential
|
||||
|
||||
endpoint = os.environ["VISION_ENDPOINT"]
|
||||
key = os.environ["VISION_KEY"]
|
||||
|
||||
client = ImageAnalysisClient(
|
||||
endpoint=endpoint,
|
||||
credential=AzureKeyCredential(key)
|
||||
)
|
||||
```
|
||||
|
||||
### Entra ID (Recommended)
|
||||
|
||||
```python
|
||||
from azure.ai.vision.imageanalysis import ImageAnalysisClient
|
||||
from azure.identity import DefaultAzureCredential
|
||||
|
||||
client = ImageAnalysisClient(
|
||||
endpoint=os.environ["VISION_ENDPOINT"],
|
||||
credential=DefaultAzureCredential()
|
||||
)
|
||||
```
|
||||
|
||||
## Analyze Image from URL
|
||||
|
||||
```python
|
||||
from azure.ai.vision.imageanalysis.models import VisualFeatures
|
||||
|
||||
image_url = "https://example.com/image.jpg"
|
||||
|
||||
result = client.analyze_from_url(
|
||||
image_url=image_url,
|
||||
visual_features=[
|
||||
VisualFeatures.CAPTION,
|
||||
VisualFeatures.TAGS,
|
||||
VisualFeatures.OBJECTS,
|
||||
VisualFeatures.READ,
|
||||
VisualFeatures.PEOPLE,
|
||||
VisualFeatures.SMART_CROPS,
|
||||
VisualFeatures.DENSE_CAPTIONS
|
||||
],
|
||||
gender_neutral_caption=True,
|
||||
language="en"
|
||||
)
|
||||
```
|
||||
|
||||
## Analyze Image from File
|
||||
|
||||
```python
|
||||
with open("image.jpg", "rb") as f:
|
||||
image_data = f.read()
|
||||
|
||||
result = client.analyze(
|
||||
image_data=image_data,
|
||||
visual_features=[VisualFeatures.CAPTION, VisualFeatures.TAGS]
|
||||
)
|
||||
```
|
||||
|
||||
## Image Caption
|
||||
|
||||
```python
|
||||
result = client.analyze_from_url(
|
||||
image_url=image_url,
|
||||
visual_features=[VisualFeatures.CAPTION],
|
||||
gender_neutral_caption=True
|
||||
)
|
||||
|
||||
if result.caption:
|
||||
print(f"Caption: {result.caption.text}")
|
||||
print(f"Confidence: {result.caption.confidence:.2f}")
|
||||
```
|
||||
|
||||
## Dense Captions (Multiple Regions)

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.DENSE_CAPTIONS]
)

if result.dense_captions:
    for caption in result.dense_captions.list:
        print(f"Caption: {caption.text}")
        print(f"  Confidence: {caption.confidence:.2f}")
        print(f"  Bounding box: {caption.bounding_box}")
```

## Tags

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.TAGS]
)

if result.tags:
    for tag in result.tags.list:
        print(f"Tag: {tag.name} (confidence: {tag.confidence:.2f})")
```

## Object Detection

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.OBJECTS]
)

if result.objects:
    for obj in result.objects.list:
        print(f"Object: {obj.tags[0].name}")
        print(f"  Confidence: {obj.tags[0].confidence:.2f}")
        box = obj.bounding_box
        print(f"  Bounding box: x={box.x}, y={box.y}, w={box.width}, h={box.height}")
```

## OCR (Text Extraction)

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.READ]
)

if result.read:
    for block in result.read.blocks:
        for line in block.lines:
            print(f"Line: {line.text}")
            print(f"  Bounding polygon: {line.bounding_polygon}")

            # Word-level details
            for word in line.words:
                print(f"  Word: {word.text} (confidence: {word.confidence:.2f})")
```

## People Detection

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.PEOPLE]
)

if result.people:
    for person in result.people.list:
        print("Person detected:")
        print(f"  Confidence: {person.confidence:.2f}")
        box = person.bounding_box
        print(f"  Bounding box: x={box.x}, y={box.y}, w={box.width}, h={box.height}")
```

## Smart Cropping

```python
result = client.analyze_from_url(
    image_url=image_url,
    visual_features=[VisualFeatures.SMART_CROPS],
    smart_crops_aspect_ratios=[0.9, 1.33, 1.78]  # Portrait, 4:3, 16:9
)

if result.smart_crops:
    for crop in result.smart_crops.list:
        print(f"Aspect ratio: {crop.aspect_ratio}")
        box = crop.bounding_box
        print(f"  Crop region: x={box.x}, y={box.y}, w={box.width}, h={box.height}")
```

## Async Client

```python
from azure.ai.vision.imageanalysis.aio import ImageAnalysisClient
from azure.identity.aio import DefaultAzureCredential

async def analyze_image():
    async with ImageAnalysisClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        result = await client.analyze_from_url(
            image_url=image_url,
            visual_features=[VisualFeatures.CAPTION]
        )
        print(result.caption.text)
```

## Visual Features

| Feature | Description |
|---------|-------------|
| `CAPTION` | Single sentence describing the image |
| `DENSE_CAPTIONS` | Captions for multiple regions |
| `TAGS` | Content tags (objects, scenes, actions) |
| `OBJECTS` | Object detection with bounding boxes |
| `READ` | OCR text extraction |
| `PEOPLE` | People detection with bounding boxes |
| `SMART_CROPS` | Suggested crop regions for thumbnails |

## Error Handling

```python
from azure.core.exceptions import HttpResponseError

try:
    result = client.analyze_from_url(
        image_url=image_url,
        visual_features=[VisualFeatures.CAPTION]
    )
except HttpResponseError as e:
    print(f"Status code: {e.status_code}")
    print(f"Reason: {e.reason}")
    print(f"Message: {e.error.message}")
```

## Image Requirements

- Formats: JPEG, PNG, GIF, BMP, WEBP, ICO, TIFF, MPO
- Max size: 20 MB
- Dimensions: 50x50 to 16000x16000 pixels

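These limits can be checked client-side before uploading, saving a round trip on inputs the service would reject. A minimal sketch for PNG input, assuming the limits above (the helper names `png_dimensions` and `is_within_limits` are illustrative, not part of the SDK):

```python
import struct

MAX_BYTES = 20 * 1024 * 1024
MIN_DIM, MAX_DIM = 50, 16000

def png_dimensions(data: bytes) -> tuple:
    # PNG layout: 8-byte signature, then the IHDR chunk whose
    # width and height are big-endian uint32s at byte offsets 16 and 20.
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    return struct.unpack(">II", data[16:24])

def is_within_limits(data: bytes) -> bool:
    # Reject anything over 20 MB or outside the 50-16000 px range.
    if len(data) > MAX_BYTES:
        return False
    width, height = png_dimensions(data)
    return all(MIN_DIM <= d <= MAX_DIM for d in (width, height))
```

Other formats (JPEG, TIFF, ...) store dimensions differently; a library such as Pillow covers them all if you need more than PNG.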
## Best Practices

1. **Select only needed features** to optimize latency and cost
2. **Use async client** for high-throughput scenarios
3. **Handle HttpResponseError** for invalid images or auth issues
4. **Enable gender_neutral_caption** for inclusive descriptions
5. **Specify language** for localized captions
6. **Use smart_crops_aspect_ratios** matching your thumbnail requirements
7. **Cache results** when analyzing the same image multiple times
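Practice 7 can be as simple as keying results on a hash of the image bytes plus the requested feature set. A hypothetical in-memory sketch (`cached_analyze` and `_cache` are illustrative, not part of the SDK):

```python
import hashlib

# Process-local cache; swap for Redis or similar in production.
_cache = {}

def cached_analyze(client, image_data: bytes, visual_features):
    # Key on the image content and the exact features requested.
    key = (hashlib.sha256(image_data).hexdigest(),
           tuple(sorted(str(f) for f in visual_features)))
    if key not in _cache:
        _cache[key] = client.analyze(image_data=image_data,
                                     visual_features=list(visual_features))
    return _cache[key]
```

Hashing the bytes (rather than a filename) means the cache survives renames and catches duplicate uploads.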
skills/azure-ai-voicelive-dotnet/SKILL.md (new file, 265 lines)
@@ -0,0 +1,265 @@
---
name: azure-ai-voicelive-dotnet
description: |
  Azure AI Voice Live SDK for .NET. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants, conversational AI, real-time speech-to-speech, and voice-enabled chatbots. Triggers: "voice live", "real-time voice", "VoiceLiveClient", "VoiceLiveSession", "voice assistant .NET", "bidirectional audio", "speech-to-speech".
package: Azure.AI.VoiceLive
---

# Azure.AI.VoiceLive (.NET)

Real-time voice AI SDK for building bidirectional voice assistants with Azure AI.

## Installation

```bash
dotnet add package Azure.AI.VoiceLive
dotnet add package Azure.Identity
dotnet add package NAudio  # For audio capture/playback
```

**Current Versions**: Stable v1.0.0, Preview v1.1.0-beta.1

## Environment Variables

```bash
AZURE_VOICELIVE_ENDPOINT=https://<resource>.services.ai.azure.com/
AZURE_VOICELIVE_MODEL=gpt-4o-realtime-preview
AZURE_VOICELIVE_VOICE=en-US-AvaNeural
# Optional: API key if not using Entra ID
AZURE_VOICELIVE_API_KEY=<your-api-key>
```

## Authentication

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.AI.VoiceLive;

Uri endpoint = new Uri("https://your-resource.cognitiveservices.azure.com");
DefaultAzureCredential credential = new DefaultAzureCredential();
VoiceLiveClient client = new VoiceLiveClient(endpoint, credential);
```

**Required Role**: `Cognitive Services User` (assign in Azure Portal → Access control)

### API Key

```csharp
using Azure;  // AzureKeyCredential

Uri endpoint = new Uri("https://your-resource.cognitiveservices.azure.com");
AzureKeyCredential credential = new AzureKeyCredential("your-api-key");
VoiceLiveClient client = new VoiceLiveClient(endpoint, credential);
```

## Client Hierarchy

```
VoiceLiveClient
└── VoiceLiveSession (WebSocket connection)
    ├── ConfigureSessionAsync()
    ├── GetUpdatesAsync() → SessionUpdate events
    ├── AddItemAsync() → UserMessageItem, FunctionCallOutputItem
    ├── SendAudioAsync()
    └── StartResponseAsync()
```

## Core Workflow

### 1. Start Session and Configure

```csharp
using Azure.Identity;
using Azure.AI.VoiceLive;

var endpoint = new Uri(Environment.GetEnvironmentVariable("AZURE_VOICELIVE_ENDPOINT"));
var client = new VoiceLiveClient(endpoint, new DefaultAzureCredential());

var model = "gpt-4o-mini-realtime-preview";

// Start session
using VoiceLiveSession session = await client.StartSessionAsync(model);

// Configure session
VoiceLiveSessionOptions sessionOptions = new()
{
    Model = model,
    Instructions = "You are a helpful AI assistant. Respond naturally.",
    Voice = new AzureStandardVoice("en-US-AvaNeural"),
    TurnDetection = new AzureSemanticVadTurnDetection()
    {
        Threshold = 0.5f,
        PrefixPadding = TimeSpan.FromMilliseconds(300),
        SilenceDuration = TimeSpan.FromMilliseconds(500)
    },
    InputAudioFormat = InputAudioFormat.Pcm16,
    OutputAudioFormat = OutputAudioFormat.Pcm16
};

// Set modalities (both text and audio for voice assistants)
sessionOptions.Modalities.Clear();
sessionOptions.Modalities.Add(InteractionModality.Text);
sessionOptions.Modalities.Add(InteractionModality.Audio);

await session.ConfigureSessionAsync(sessionOptions);
```

### 2. Process Events

```csharp
await foreach (SessionUpdate serverEvent in session.GetUpdatesAsync())
{
    switch (serverEvent)
    {
        case SessionUpdateResponseAudioDelta audioDelta:
            byte[] audioData = audioDelta.Delta.ToArray();
            // Play audio via NAudio or other audio library
            break;

        case SessionUpdateResponseTextDelta textDelta:
            Console.Write(textDelta.Delta);
            break;

        case SessionUpdateResponseFunctionCallArgumentsDone functionCall:
            // Handle function call (see Function Calling section)
            break;

        case SessionUpdateError error:
            Console.WriteLine($"Error: {error.Error.Message}");
            break;

        case SessionUpdateResponseDone:
            Console.WriteLine("\n--- Response complete ---");
            break;
    }
}
```

### 3. Send User Message

```csharp
await session.AddItemAsync(new UserMessageItem("Hello, can you help me?"));
await session.StartResponseAsync();
```

### 4. Function Calling

```csharp
using System.Text.Json;  // JsonSerializer

// Define function
var weatherFunction = new VoiceLiveFunctionDefinition("get_current_weather")
{
    Description = "Get the current weather for a given location",
    Parameters = BinaryData.FromString("""
    {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state or country"
            }
        },
        "required": ["location"]
    }
    """)
};

// Add to session options
sessionOptions.Tools.Add(weatherFunction);

// Handle function call in event loop
if (serverEvent is SessionUpdateResponseFunctionCallArgumentsDone functionCall)
{
    if (functionCall.Name == "get_current_weather")
    {
        var parameters = JsonSerializer.Deserialize<Dictionary<string, string>>(functionCall.Arguments);
        string location = parameters?["location"] ?? "";

        // Call external service
        string weatherInfo = $"The weather in {location} is sunny, 75°F.";

        // Send response
        await session.AddItemAsync(new FunctionCallOutputItem(functionCall.CallId, weatherInfo));
        await session.StartResponseAsync();
    }
}
```

## Voice Options

| Voice Type | Class | Example |
|------------|-------|---------|
| Azure Standard | `AzureStandardVoice` | `"en-US-AvaNeural"` |
| Azure HD | `AzureStandardVoice` | `"en-US-Ava:DragonHDLatestNeural"` |
| Azure Custom | `AzureCustomVoice` | Custom voice with endpoint ID |

## Supported Models

| Model | Description |
|-------|-------------|
| `gpt-4o-realtime-preview` | GPT-4o with real-time audio |
| `gpt-4o-mini-realtime-preview` | Lightweight, fast interactions |
| `phi4-mm-realtime` | Cost-effective multimodal |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `VoiceLiveClient` | Main client for creating sessions |
| `VoiceLiveSession` | Active WebSocket session |
| `VoiceLiveSessionOptions` | Session configuration |
| `AzureStandardVoice` | Standard Azure voice provider |
| `AzureSemanticVadTurnDetection` | Voice activity detection |
| `VoiceLiveFunctionDefinition` | Function tool definition |
| `UserMessageItem` | User text message |
| `FunctionCallOutputItem` | Function call response |
| `SessionUpdateResponseAudioDelta` | Audio chunk event |
| `SessionUpdateResponseTextDelta` | Text chunk event |

## Best Practices

1. **Always set both modalities** — Include `Text` and `Audio` for voice assistants
2. **Use `AzureSemanticVadTurnDetection`** — Provides natural conversation flow
3. **Configure appropriate silence duration** — 500ms typical to avoid premature cutoffs
4. **Use `using` statement** — Ensures proper session disposal
5. **Handle all event types** — Check for errors, audio, text, and function calls
6. **Use DefaultAzureCredential** — Never hardcode API keys

## Error Handling

```csharp
if (serverEvent is SessionUpdateError error)
{
    if (error.Error.Message.Contains("Cancellation failed: no active response"))
    {
        // Benign error, can ignore
    }
    else
    {
        Console.WriteLine($"Error: {error.Error.Message}");
    }
}
```

## Audio Configuration

- **Input Format**: `InputAudioFormat.Pcm16` (16-bit PCM)
- **Output Format**: `OutputAudioFormat.Pcm16`
- **Sample Rate**: 24kHz recommended
- **Channels**: Mono

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.AI.VoiceLive` | Real-time voice (this SDK) | `dotnet add package Azure.AI.VoiceLive` |
| `Microsoft.CognitiveServices.Speech` | Speech-to-text, text-to-speech | `dotnet add package Microsoft.CognitiveServices.Speech` |
| `NAudio` | Audio capture/playback | `dotnet add package NAudio` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.AI.VoiceLive |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.ai.voicelive |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/ai/Azure.AI.VoiceLive |
| Quickstart | https://learn.microsoft.com/azure/ai-services/speech-service/voice-live-quickstart |

skills/azure-ai-voicelive-java/SKILL.md (new file, 225 lines)
@@ -0,0 +1,225 @@
---
name: azure-ai-voicelive-java
description: |
  Azure AI VoiceLive SDK for Java. Real-time bidirectional voice conversations with AI assistants using WebSocket.
  Triggers: "VoiceLiveClient java", "voice assistant java", "real-time voice java", "audio streaming java", "voice activity detection java".
package: com.azure:azure-ai-voicelive
---

# Azure AI VoiceLive SDK for Java

Real-time, bidirectional voice conversations with AI assistants using WebSocket technology.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-voicelive</artifactId>
    <version>1.0.0-beta.2</version>
</dependency>
```

## Environment Variables

```bash
AZURE_VOICELIVE_ENDPOINT=https://<resource>.openai.azure.com/
AZURE_VOICELIVE_API_KEY=<your-api-key>
```

## Authentication

### API Key

```java
import com.azure.ai.voicelive.VoiceLiveAsyncClient;
import com.azure.ai.voicelive.VoiceLiveClientBuilder;
import com.azure.core.credential.AzureKeyCredential;

VoiceLiveAsyncClient client = new VoiceLiveClientBuilder()
    .endpoint(System.getenv("AZURE_VOICELIVE_ENDPOINT"))
    .credential(new AzureKeyCredential(System.getenv("AZURE_VOICELIVE_API_KEY")))
    .buildAsyncClient();
```

### DefaultAzureCredential (Recommended)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

VoiceLiveAsyncClient client = new VoiceLiveClientBuilder()
    .endpoint(System.getenv("AZURE_VOICELIVE_ENDPOINT"))
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| `VoiceLiveAsyncClient` | Main entry point for voice sessions |
| `VoiceLiveSessionAsyncClient` | Active WebSocket connection for streaming |
| `VoiceLiveSessionOptions` | Configuration for session behavior |

### Audio Requirements

- **Sample Rate**: 24kHz (24000 Hz)
- **Bit Depth**: 16-bit PCM
- **Channels**: Mono (1 channel)
- **Format**: Signed PCM, little-endian

## Core Workflow

### 1. Start Session

```java
import reactor.core.publisher.Mono;

client.startSession("gpt-4o-realtime-preview")
    .flatMap(session -> {
        System.out.println("Session started");

        // Subscribe to events
        session.receiveEvents()
            .subscribe(
                event -> System.out.println("Event: " + event.getType()),
                error -> System.err.println("Error: " + error.getMessage())
            );

        return Mono.just(session);
    })
    .block();
```

### 2. Configure Session Options

```java
import com.azure.ai.voicelive.models.*;
import com.azure.core.util.BinaryData;
import java.util.Arrays;

ServerVadTurnDetection turnDetection = new ServerVadTurnDetection()
    .setThreshold(0.5)            // Sensitivity (0.0-1.0)
    .setPrefixPaddingMs(300)      // Audio before speech
    .setSilenceDurationMs(500)    // Silence to end turn
    .setInterruptResponse(true)   // Allow interruptions
    .setAutoTruncate(true)
    .setCreateResponse(true);

AudioInputTranscriptionOptions transcription = new AudioInputTranscriptionOptions(
    AudioInputTranscriptionOptionsModel.WHISPER_1);

VoiceLiveSessionOptions options = new VoiceLiveSessionOptions()
    .setInstructions("You are a helpful AI voice assistant.")
    .setVoice(BinaryData.fromObject(new OpenAIVoice(OpenAIVoiceName.ALLOY)))
    .setModalities(Arrays.asList(InteractionModality.TEXT, InteractionModality.AUDIO))
    .setInputAudioFormat(InputAudioFormat.PCM16)
    .setOutputAudioFormat(OutputAudioFormat.PCM16)
    .setInputAudioSamplingRate(24000)
    .setInputAudioNoiseReduction(new AudioNoiseReduction(AudioNoiseReductionType.NEAR_FIELD))
    .setInputAudioEchoCancellation(new AudioEchoCancellation())
    .setInputAudioTranscription(transcription)
    .setTurnDetection(turnDetection);

// Send configuration
ClientEventSessionUpdate updateEvent = new ClientEventSessionUpdate(options);
session.sendEvent(updateEvent).subscribe();
```

### 3. Send Audio Input

```java
byte[] audioData = readAudioChunk(); // Your PCM16 audio data
session.sendInputAudio(BinaryData.fromBytes(audioData)).subscribe();
```

### 4. Handle Events

```java
session.receiveEvents().subscribe(event -> {
    ServerEventType eventType = event.getType();

    if (ServerEventType.SESSION_CREATED.equals(eventType)) {
        System.out.println("Session created");
    } else if (ServerEventType.INPUT_AUDIO_BUFFER_SPEECH_STARTED.equals(eventType)) {
        System.out.println("User started speaking");
    } else if (ServerEventType.INPUT_AUDIO_BUFFER_SPEECH_STOPPED.equals(eventType)) {
        System.out.println("User stopped speaking");
    } else if (ServerEventType.RESPONSE_AUDIO_DELTA.equals(eventType)) {
        if (event instanceof SessionUpdateResponseAudioDelta) {
            SessionUpdateResponseAudioDelta audioEvent = (SessionUpdateResponseAudioDelta) event;
            playAudioChunk(audioEvent.getDelta());
        }
    } else if (ServerEventType.RESPONSE_DONE.equals(eventType)) {
        System.out.println("Response complete");
    } else if (ServerEventType.ERROR.equals(eventType)) {
        if (event instanceof SessionUpdateError) {
            SessionUpdateError errorEvent = (SessionUpdateError) event;
            System.err.println("Error: " + errorEvent.getError().getMessage());
        }
    }
});
```

## Voice Configuration

### OpenAI Voices

```java
// Available: ALLOY, ASH, BALLAD, CORAL, ECHO, SAGE, SHIMMER, VERSE
VoiceLiveSessionOptions options = new VoiceLiveSessionOptions()
    .setVoice(BinaryData.fromObject(new OpenAIVoice(OpenAIVoiceName.ALLOY)));
```

### Azure Voices

```java
// Azure Standard Voice
options.setVoice(BinaryData.fromObject(new AzureStandardVoice("en-US-JennyNeural")));

// Azure Custom Voice
options.setVoice(BinaryData.fromObject(new AzureCustomVoice("myVoice", "endpointId")));

// Azure Personal Voice
options.setVoice(BinaryData.fromObject(
    new AzurePersonalVoice("speakerProfileId", PersonalVoiceModels.PHOENIX_LATEST_NEURAL)));
```

## Function Calling

```java
VoiceLiveFunctionDefinition weatherFunction = new VoiceLiveFunctionDefinition("get_weather")
    .setDescription("Get current weather for a location")
    .setParameters(BinaryData.fromObject(parametersSchema));

VoiceLiveSessionOptions options = new VoiceLiveSessionOptions()
    .setTools(Arrays.asList(weatherFunction))
    .setInstructions("You have access to weather information.");
```

## Best Practices

1. **Use async client** — VoiceLive requires reactive patterns
2. **Configure turn detection** for natural conversation flow
3. **Enable noise reduction** for better speech recognition
4. **Handle interruptions** gracefully with `setInterruptResponse(true)`
5. **Use Whisper transcription** for input audio transcription
6. **Close sessions** properly when conversation ends

## Error Handling

```java
session.receiveEvents()
    .doOnError(error -> System.err.println("Connection error: " + error.getMessage()))
    .onErrorResume(error -> {
        // Attempt reconnection or cleanup
        return Flux.empty();
    })
    .subscribe();
```

## Reference Links

| Resource | URL |
|----------|-----|
| GitHub Source | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-voicelive |
| Samples | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/ai/azure-ai-voicelive/src/samples |

skills/azure-ai-voicelive-py/SKILL.md (new file, 309 lines)
@@ -0,0 +1,309 @@
---
name: azure-ai-voicelive-py
description: Build real-time voice AI applications using Azure AI Voice Live SDK (azure-ai-voicelive). Use this skill when creating Python applications that need real-time bidirectional audio communication with Azure AI, including voice assistants, voice-enabled chatbots, real-time speech-to-speech translation, voice-driven avatars, or any WebSocket-based audio streaming with AI models. Supports Server VAD (Voice Activity Detection), turn-based conversation, function calling, MCP tools, avatar integration, and transcription.
package: azure-ai-voicelive
---

# Azure AI Voice Live SDK

Build real-time voice AI applications with bidirectional WebSocket communication.

## Installation

```bash
pip install azure-ai-voicelive aiohttp azure-identity
```

## Environment Variables

```bash
AZURE_COGNITIVE_SERVICES_ENDPOINT=https://<region>.api.cognitive.microsoft.com
# For API key auth (not recommended for production)
AZURE_COGNITIVE_SERVICES_KEY=<api-key>
```

## Authentication

**DefaultAzureCredential (preferred)**:

```python
import os

from azure.ai.voicelive.aio import connect
from azure.identity.aio import DefaultAzureCredential

async with connect(
    endpoint=os.environ["AZURE_COGNITIVE_SERVICES_ENDPOINT"],
    credential=DefaultAzureCredential(),
    model="gpt-4o-realtime-preview",
    credential_scopes=["https://cognitiveservices.azure.com/.default"]
) as conn:
    ...
```

**API Key**:

```python
import os

from azure.ai.voicelive.aio import connect
from azure.core.credentials import AzureKeyCredential

async with connect(
    endpoint=os.environ["AZURE_COGNITIVE_SERVICES_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_COGNITIVE_SERVICES_KEY"]),
    model="gpt-4o-realtime-preview"
) as conn:
    ...
```

## Quick Start

```python
import asyncio
import os

from azure.ai.voicelive.aio import connect
from azure.identity.aio import DefaultAzureCredential

async def main():
    async with connect(
        endpoint=os.environ["AZURE_COGNITIVE_SERVICES_ENDPOINT"],
        credential=DefaultAzureCredential(),
        model="gpt-4o-realtime-preview",
        credential_scopes=["https://cognitiveservices.azure.com/.default"]
    ) as conn:
        # Update session with instructions
        await conn.session.update(session={
            "instructions": "You are a helpful assistant.",
            "modalities": ["text", "audio"],
            "voice": "alloy"
        })

        # Listen for events
        async for event in conn:
            print(f"Event: {event.type}")
            if event.type == "response.audio_transcript.done":
                print(f"Transcript: {event.transcript}")
            elif event.type == "response.done":
                break

asyncio.run(main())
```

## Core Architecture

### Connection Resources

The `VoiceLiveConnection` exposes these resources:

| Resource | Purpose | Key Methods |
|----------|---------|-------------|
| `conn.session` | Session configuration | `update(session=...)` |
| `conn.response` | Model responses | `create()`, `cancel()` |
| `conn.input_audio_buffer` | Audio input | `append()`, `commit()`, `clear()` |
| `conn.output_audio_buffer` | Audio output | `clear()` |
| `conn.conversation` | Conversation state | `item.create()`, `item.delete()`, `item.truncate()` |
| `conn.transcription_session` | Transcription config | `update(session=...)` |

## Session Configuration

```python
from azure.ai.voicelive.models import RequestSession, FunctionTool

await conn.session.update(session=RequestSession(
    instructions="You are a helpful voice assistant.",
    modalities=["text", "audio"],
    voice="alloy",  # or "echo", "shimmer", "sage", etc.
    input_audio_format="pcm16",
    output_audio_format="pcm16",
    turn_detection={
        "type": "server_vad",
        "threshold": 0.5,
        "prefix_padding_ms": 300,
        "silence_duration_ms": 500
    },
    tools=[
        FunctionTool(
            type="function",
            name="get_weather",
            description="Get current weather",
            parameters={
                "type": "object",
                "properties": {
                    "location": {"type": "string"}
                },
                "required": ["location"]
            }
        )
    ]
))
```

## Audio Streaming

### Send Audio (Base64 PCM16)

```python
import base64

# Read audio chunk (16-bit PCM, 24kHz mono)
audio_chunk = await read_audio_from_microphone()
b64_audio = base64.b64encode(audio_chunk).decode()

await conn.input_audio_buffer.append(audio=b64_audio)
```

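When no microphone is available (unit tests, CI), a synthetic chunk in the same 24 kHz 16-bit mono format works just as well. A sketch under that assumption (the helper `sine_pcm16` is ours, not part of the SDK):

```python
import base64
import math
import struct

def sine_pcm16(freq_hz: float, duration_s: float, sample_rate: int = 24000) -> bytes:
    # 16-bit signed little-endian mono PCM at half amplitude,
    # matching the format the input audio buffer expects.
    n = int(sample_rate * duration_s)
    samples = (
        int(32767 * 0.5 * math.sin(2 * math.pi * freq_hz * i / sample_rate))
        for i in range(n)
    )
    return struct.pack(f"<{n}h", *samples)

# 100 ms of a 440 Hz test tone, base64-encoded like a microphone chunk:
b64_audio = base64.b64encode(sine_pcm16(440.0, 0.1)).decode()
# await conn.input_audio_buffer.append(audio=b64_audio)
```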
### Receive Audio

```python
async for event in conn:
    if event.type == "response.audio.delta":
        audio_bytes = base64.b64decode(event.delta)
        await play_audio(audio_bytes)
    elif event.type == "response.audio.done":
        print("Audio complete")
```

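Because deltas arrive as base64-encoded 24 kHz 16-bit mono PCM, the accumulated playback duration is easy to derive from the decoded byte count. A small helper (the name `pcm16_duration_seconds` is illustrative):

```python
import base64

SAMPLE_RATE = 24000   # Hz
SAMPLE_WIDTH = 2      # bytes per 16-bit mono sample

def pcm16_duration_seconds(b64_chunks) -> float:
    # Total decoded bytes divided by the byte rate of the stream.
    total_bytes = sum(len(base64.b64decode(c)) for c in b64_chunks)
    return total_bytes / (SAMPLE_RATE * SAMPLE_WIDTH)
```

This is handy for jitter buffering or for logging how much audio a response produced.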
## Event Handling

```python
import base64
import json

async for event in conn:
    match event.type:
        # Session events
        case "session.created":
            print(f"Session: {event.session}")
        case "session.updated":
            print("Session updated")

        # Audio input events
        case "input_audio_buffer.speech_started":
            print(f"Speech started at {event.audio_start_ms}ms")
        case "input_audio_buffer.speech_stopped":
            print(f"Speech stopped at {event.audio_end_ms}ms")

        # Transcription events
        case "conversation.item.input_audio_transcription.completed":
            print(f"User said: {event.transcript}")
        case "conversation.item.input_audio_transcription.delta":
            print(f"Partial: {event.delta}")

        # Response events
        case "response.created":
            print(f"Response started: {event.response.id}")
        case "response.audio_transcript.delta":
            print(event.delta, end="", flush=True)
        case "response.audio.delta":
            audio = base64.b64decode(event.delta)
        case "response.done":
            print(f"Response complete: {event.response.status}")

        # Function calls
        case "response.function_call_arguments.done":
            result = handle_function(event.name, event.arguments)
            await conn.conversation.item.create(item={
                "type": "function_call_output",
                "call_id": event.call_id,
                "output": json.dumps(result)
            })
            await conn.response.create()

        # Errors
        case "error":
            print(f"Error: {event.error.message}")
```

## Common Patterns

### Manual Turn Mode (No VAD)

```python
await conn.session.update(session={"turn_detection": None})

# Manually control turns
await conn.input_audio_buffer.append(audio=b64_audio)
await conn.input_audio_buffer.commit()  # End of user turn
await conn.response.create()            # Trigger response
```

### Interrupt Handling

```python
async for event in conn:
    if event.type == "input_audio_buffer.speech_started":
        # User interrupted - cancel current response
        await conn.response.cancel()
        await conn.output_audio_buffer.clear()
```

### Conversation History

```python
# Add system message
await conn.conversation.item.create(item={
    "type": "message",
    "role": "system",
    "content": [{"type": "input_text", "text": "Be concise."}]
})

# Add user message
await conn.conversation.item.create(item={
    "type": "message",
    "role": "user",
    "content": [{"type": "input_text", "text": "Hello!"}]
})

await conn.response.create()
```

## Voice Options

| Voice | Description |
|-------|-------------|
| `alloy` | Neutral, balanced |
| `echo` | Warm, conversational |
| `shimmer` | Clear, professional |
| `sage` | Calm, authoritative |
| `coral` | Friendly, upbeat |
| `ash` | Deep, measured |
| `ballad` | Expressive |
| `verse` | Storytelling |

Azure voices: Use `AzureStandardVoice`, `AzureCustomVoice`, or `AzurePersonalVoice` models.

## Audio Formats

| Format | Sample Rate | Use Case |
|--------|-------------|----------|
| `pcm16` | 24kHz | Default, high quality |
| `pcm16-8000hz` | 8kHz | Telephony |
| `pcm16-16000hz` | 16kHz | Voice assistants |
| `g711_ulaw` | 8kHz | Telephony (US) |
| `g711_alaw` | 8kHz | Telephony (EU) |
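Raw PCM16 audio must be base64-encoded before it is passed to `input_audio_buffer.append`, as in the manual-turn example above. A self-contained sketch of that packing step (the helper name is illustrative, not part of the SDK):

```python
import base64
import struct

def pcm16_to_b64(samples: list[int]) -> str:
    """Pack signed 16-bit samples little-endian (pcm16) and
    base64-encode them for input_audio_buffer.append(audio=...)."""
    raw = struct.pack(f"<{len(samples)}h", *samples)
    return base64.b64encode(raw).decode("ascii")

b64_audio = pcm16_to_b64([0, 1000, -1000, 32767])
# Then: await conn.input_audio_buffer.append(audio=b64_audio)
```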
## Turn Detection Options

```python
# Server VAD (default)
{"type": "server_vad", "threshold": 0.5, "silence_duration_ms": 500}

# Azure Semantic VAD (smarter detection)
{"type": "azure_semantic_vad"}
{"type": "azure_semantic_vad_en"}  # English optimized
{"type": "azure_semantic_vad_multilingual"}
```
## Error Handling

```python
from azure.ai.voicelive.aio import ConnectionError, ConnectionClosed

try:
    async with connect(...) as conn:
        async for event in conn:
            if event.type == "error":
                print(f"API Error: {event.error.code} - {event.error.message}")
except ConnectionClosed as e:
    print(f"Connection closed: {e.code} - {e.reason}")
except ConnectionError as e:
    print(f"Connection error: {e}")
```
## References

- **Detailed API Reference**: See [references/api-reference.md](references/api-reference.md)
- **Complete Examples**: See [references/examples.md](references/examples.md)
- **All Models & Types**: See [references/models.md](references/models.md)
465 skills/azure-ai-voicelive-ts/SKILL.md (new file)
@@ -0,0 +1,465 @@
---
name: azure-ai-voicelive-ts
description: |
  Azure AI Voice Live SDK for JavaScript/TypeScript. Build real-time voice AI applications with bidirectional WebSocket communication. Use for voice assistants, conversational AI, real-time speech-to-speech, and voice-enabled chatbots in Node.js or browser environments. Triggers: "voice live", "real-time voice", "VoiceLiveClient", "VoiceLiveSession", "voice assistant TypeScript", "bidirectional audio", "speech-to-speech JavaScript".
package: "@azure/ai-voicelive"
---

# @azure/ai-voicelive (JavaScript/TypeScript)

Real-time voice AI SDK for building bidirectional voice assistants with Azure AI in Node.js and browser environments.

## Installation

```bash
npm install @azure/ai-voicelive @azure/identity
# TypeScript users
npm install @types/node
```

**Current Version**: 1.0.0-beta.3

**Supported Environments**:

- Node.js LTS versions (20+)
- Modern browsers (Chrome, Firefox, Safari, Edge)

## Environment Variables

```bash
AZURE_VOICELIVE_ENDPOINT=https://<resource>.cognitiveservices.azure.com
# Optional: API key if not using Entra ID
AZURE_VOICELIVE_API_KEY=<your-api-key>
# Optional: Logging
AZURE_LOG_LEVEL=info
```

## Authentication

### Microsoft Entra ID (Recommended)

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { VoiceLiveClient } from "@azure/ai-voicelive";

const credential = new DefaultAzureCredential();
const endpoint = "https://your-resource.cognitiveservices.azure.com";

const client = new VoiceLiveClient(endpoint, credential);
```

### API Key

```typescript
import { AzureKeyCredential } from "@azure/core-auth";
import { VoiceLiveClient } from "@azure/ai-voicelive";

const endpoint = "https://your-resource.cognitiveservices.azure.com";
const credential = new AzureKeyCredential("your-api-key");

const client = new VoiceLiveClient(endpoint, credential);
```

## Client Hierarchy

```
VoiceLiveClient
└── VoiceLiveSession (WebSocket connection)
    ├── updateSession()       → Configure session options
    ├── subscribe()           → Event handlers (Azure SDK pattern)
    ├── sendAudio()           → Stream audio input
    ├── addConversationItem() → Add messages/function outputs
    └── sendEvent()           → Send raw protocol events
```
## Quick Start

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { VoiceLiveClient } from "@azure/ai-voicelive";

const credential = new DefaultAzureCredential();
const endpoint = process.env.AZURE_VOICELIVE_ENDPOINT!;

// Create client and start session
const client = new VoiceLiveClient(endpoint, credential);
const session = await client.startSession("gpt-4o-mini-realtime-preview");

// Configure session
await session.updateSession({
  modalities: ["text", "audio"],
  instructions: "You are a helpful AI assistant. Respond naturally.",
  voice: {
    type: "azure-standard",
    name: "en-US-AvaNeural",
  },
  turnDetection: {
    type: "server_vad",
    threshold: 0.5,
    prefixPaddingMs: 300,
    silenceDurationMs: 500,
  },
  inputAudioFormat: "pcm16",
  outputAudioFormat: "pcm16",
});

// Subscribe to events
const subscription = session.subscribe({
  onResponseAudioDelta: async (event, context) => {
    // Handle streaming audio output
    const audioData = event.delta;
    playAudioChunk(audioData);
  },
  onResponseTextDelta: async (event, context) => {
    // Handle streaming text
    process.stdout.write(event.delta);
  },
  onInputAudioTranscriptionCompleted: async (event, context) => {
    console.log("User said:", event.transcript);
  },
});

// Send audio from microphone
function sendAudioChunk(audioBuffer: ArrayBuffer) {
  session.sendAudio(audioBuffer);
}
```
## Session Configuration

```typescript
await session.updateSession({
  // Modalities
  modalities: ["audio", "text"],

  // System instructions
  instructions: "You are a customer service representative.",

  // Voice selection
  voice: {
    type: "azure-standard", // or "azure-custom", "openai"
    name: "en-US-AvaNeural",
  },

  // Turn detection (VAD)
  turnDetection: {
    type: "server_vad", // or "azure_semantic_vad"
    threshold: 0.5,
    prefixPaddingMs: 300,
    silenceDurationMs: 500,
  },

  // Audio formats
  inputAudioFormat: "pcm16",
  outputAudioFormat: "pcm16",

  // Tools (function calling)
  tools: [
    {
      type: "function",
      name: "get_weather",
      description: "Get current weather",
      parameters: {
        type: "object",
        properties: {
          location: { type: "string" },
        },
        required: ["location"],
      },
    },
  ],
  toolChoice: "auto",
});
```
## Event Handling (Azure SDK Pattern)

The SDK uses a subscription-based event handling pattern:

```typescript
const subscription = session.subscribe({
  // Connection lifecycle
  onConnected: async (args, context) => {
    console.log("Connected:", args.connectionId);
  },
  onDisconnected: async (args, context) => {
    console.log("Disconnected:", args.code, args.reason);
  },
  onError: async (args, context) => {
    console.error("Error:", args.error.message);
  },

  // Session events
  onSessionCreated: async (event, context) => {
    console.log("Session created:", context.sessionId);
  },
  onSessionUpdated: async (event, context) => {
    console.log("Session updated");
  },

  // Audio input events (VAD)
  onInputAudioBufferSpeechStarted: async (event, context) => {
    console.log("Speech started at:", event.audioStartMs);
  },
  onInputAudioBufferSpeechStopped: async (event, context) => {
    console.log("Speech stopped at:", event.audioEndMs);
  },

  // Transcription events
  onConversationItemInputAudioTranscriptionCompleted: async (event, context) => {
    console.log("User said:", event.transcript);
  },
  onConversationItemInputAudioTranscriptionDelta: async (event, context) => {
    process.stdout.write(event.delta);
  },

  // Response events
  onResponseCreated: async (event, context) => {
    console.log("Response started");
  },
  onResponseDone: async (event, context) => {
    console.log("Response complete");
  },

  // Streaming text
  onResponseTextDelta: async (event, context) => {
    process.stdout.write(event.delta);
  },
  onResponseTextDone: async (event, context) => {
    console.log("\n--- Text complete ---");
  },

  // Streaming audio
  onResponseAudioDelta: async (event, context) => {
    const audioData = event.delta;
    playAudioChunk(audioData);
  },
  onResponseAudioDone: async (event, context) => {
    console.log("Audio complete");
  },

  // Audio transcript (what the assistant said)
  onResponseAudioTranscriptDelta: async (event, context) => {
    process.stdout.write(event.delta);
  },

  // Function calling
  onResponseFunctionCallArgumentsDone: async (event, context) => {
    if (event.name === "get_weather") {
      const args = JSON.parse(event.arguments);
      const result = await getWeather(args.location);

      await session.addConversationItem({
        type: "function_call_output",
        callId: event.callId,
        output: JSON.stringify(result),
      });

      await session.sendEvent({ type: "response.create" });
    }
  },

  // Catch-all for debugging
  onServerEvent: async (event, context) => {
    console.log("Event:", event.type);
  },
});

// Clean up when done
await subscription.close();
```
## Function Calling

```typescript
// Define tools in session config
await session.updateSession({
  modalities: ["audio", "text"],
  instructions: "Help users with weather information.",
  tools: [
    {
      type: "function",
      name: "get_weather",
      description: "Get current weather for a location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "City and state or country",
          },
        },
        required: ["location"],
      },
    },
  ],
  toolChoice: "auto",
});

// Handle function calls
const subscription = session.subscribe({
  onResponseFunctionCallArgumentsDone: async (event, context) => {
    if (event.name === "get_weather") {
      const args = JSON.parse(event.arguments);
      const weatherData = await fetchWeather(args.location);

      // Send function result
      await session.addConversationItem({
        type: "function_call_output",
        callId: event.callId,
        output: JSON.stringify(weatherData),
      });

      // Trigger response generation
      await session.sendEvent({ type: "response.create" });
    }
  },
});
```
## Voice Options

| Voice Type | Config | Example |
|------------|--------|---------|
| Azure Standard | `{ type: "azure-standard", name: "..." }` | `"en-US-AvaNeural"` |
| Azure Custom | `{ type: "azure-custom", name: "...", endpointId: "..." }` | Custom voice endpoint |
| Azure Personal | `{ type: "azure-personal", speakerProfileId: "..." }` | Personal voice clone |
| OpenAI | `{ type: "openai", name: "..." }` | `"alloy"`, `"echo"`, `"shimmer"` |

## Supported Models

| Model | Description | Use Case |
|-------|-------------|----------|
| `gpt-4o-realtime-preview` | GPT-4o with real-time audio | High-quality conversational AI |
| `gpt-4o-mini-realtime-preview` | Lightweight GPT-4o | Fast, efficient interactions |
| `phi4-mm-realtime` | Phi multimodal | Cost-effective applications |
## Turn Detection Options

```typescript
// Server VAD (default)
turnDetection: {
  type: "server_vad",
  threshold: 0.5,
  prefixPaddingMs: 300,
  silenceDurationMs: 500,
}

// Azure Semantic VAD (smarter detection)
turnDetection: {
  type: "azure_semantic_vad",
}

// Azure Semantic VAD (English optimized)
turnDetection: {
  type: "azure_semantic_vad_en",
}

// Azure Semantic VAD (Multilingual)
turnDetection: {
  type: "azure_semantic_vad_multilingual",
}
```
## Audio Formats

| Format | Sample Rate | Use Case |
|--------|-------------|----------|
| `pcm16` | 24kHz | Default, high quality |
| `pcm16-8000hz` | 8kHz | Telephony |
| `pcm16-16000hz` | 16kHz | Voice assistants |
| `g711_ulaw` | 8kHz | Telephony (US) |
| `g711_alaw` | 8kHz | Telephony (EU) |

## Key Types Reference

| Type | Purpose |
|------|---------|
| `VoiceLiveClient` | Main client for creating sessions |
| `VoiceLiveSession` | Active WebSocket session |
| `VoiceLiveSessionHandlers` | Event handler interface |
| `VoiceLiveSubscription` | Active event subscription |
| `ConnectionContext` | Context for connection events |
| `SessionContext` | Context for session events |
| `ServerEventUnion` | Union of all server events |
## Error Handling

```typescript
import {
  VoiceLiveError,
  VoiceLiveConnectionError,
  VoiceLiveAuthenticationError,
  VoiceLiveProtocolError,
} from "@azure/ai-voicelive";

const subscription = session.subscribe({
  onError: async (args, context) => {
    const { error } = args;

    if (error instanceof VoiceLiveConnectionError) {
      console.error("Connection error:", error.message);
    } else if (error instanceof VoiceLiveAuthenticationError) {
      console.error("Auth error:", error.message);
    } else if (error instanceof VoiceLiveProtocolError) {
      console.error("Protocol error:", error.message);
    }
  },

  onServerError: async (event, context) => {
    console.error("Server error:", event.error?.message);
  },
});
```
## Logging

```typescript
import { setLogLevel } from "@azure/logger";

// Enable verbose logging
setLogLevel("info");

// Or via environment variable
// AZURE_LOG_LEVEL=info
```

## Browser Usage

```typescript
// Browser usage requires a bundler (Vite, webpack, etc.)
import { VoiceLiveClient } from "@azure/ai-voicelive";
import { InteractiveBrowserCredential } from "@azure/identity";

// Use a browser-compatible credential
const credential = new InteractiveBrowserCredential({
  clientId: "your-client-id",
  tenantId: "your-tenant-id",
});

const client = new VoiceLiveClient(endpoint, credential);

// Request microphone access
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const audioContext = new AudioContext({ sampleRate: 24000 });

// Process audio and send it to the session
// ... (see samples for the full implementation)
```
## Best Practices

1. **Always use `DefaultAzureCredential`** — Never hardcode API keys
2. **Set both modalities** — Include `["text", "audio"]` for voice assistants
3. **Use Azure Semantic VAD** — Better turn detection than basic server VAD
4. **Handle all error types** — Connection, auth, and protocol errors
5. **Clean up subscriptions** — Call `subscription.close()` when done
6. **Use an appropriate audio format** — PCM16 at 24kHz for best quality

## Reference Links

| Resource | URL |
|----------|-----|
| npm Package | https://www.npmjs.com/package/@azure/ai-voicelive |
| GitHub Source | https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-voicelive |
| Samples | https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-voicelive/samples |
| API Reference | https://learn.microsoft.com/javascript/api/@azure/ai-voicelive |
470 skills/azure-appconfiguration-java/SKILL.md (new file)
@@ -0,0 +1,470 @@
---
name: azure-appconfiguration-java
description: |
  Azure App Configuration SDK for Java. Centralized application configuration management with key-value settings, feature flags, and snapshots.
  Triggers: "ConfigurationClient java", "app configuration java", "feature flag java", "configuration setting java", "azure config java".
package: com.azure:azure-data-appconfiguration
---

# Azure App Configuration SDK for Java

Client library for Azure App Configuration, a managed service for centralizing application configuration.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-data-appconfiguration</artifactId>
  <version>1.8.0</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.azure</groupId>
      <artifactId>azure-sdk-bom</artifactId>
      <version>{bom_version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-data-appconfiguration</artifactId>
  </dependency>
</dependencies>
```

## Prerequisites

- Azure App Configuration store
- Connection string or Entra ID credentials

## Environment Variables

```bash
AZURE_APPCONFIG_CONNECTION_STRING=Endpoint=https://<store>.azconfig.io;Id=<id>;Secret=<secret>
AZURE_APPCONFIG_ENDPOINT=https://<store>.azconfig.io
```

## Client Creation

### With Connection String

```java
import com.azure.data.appconfiguration.ConfigurationClient;
import com.azure.data.appconfiguration.ConfigurationClientBuilder;

ConfigurationClient configClient = new ConfigurationClientBuilder()
    .connectionString(System.getenv("AZURE_APPCONFIG_CONNECTION_STRING"))
    .buildClient();
```

### Async Client

```java
import com.azure.data.appconfiguration.ConfigurationAsyncClient;

ConfigurationAsyncClient asyncClient = new ConfigurationClientBuilder()
    .connectionString(connectionString)
    .buildAsyncClient();
```

### With Entra ID (Recommended)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

ConfigurationClient configClient = new ConfigurationClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(System.getenv("AZURE_APPCONFIG_ENDPOINT"))
    .buildClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Configuration Setting | Key-value pair with optional label |
| Label | Dimension for separating settings (e.g., environments) |
| Feature Flag | Special setting for feature management |
| Secret Reference | Setting pointing to a Key Vault secret |
| Snapshot | Point-in-time immutable view of settings |
## Configuration Setting Operations

### Create Setting (Add)

Creates only if the setting doesn't exist:

```java
import com.azure.data.appconfiguration.models.ConfigurationSetting;

ConfigurationSetting setting = configClient.addConfigurationSetting(
    "app/database/connection",
    "Production",
    "Server=prod.db.com;Database=myapp"
);
```

### Create or Update Setting (Set)

Creates or overwrites:

```java
ConfigurationSetting setting = configClient.setConfigurationSetting(
    "app/cache/enabled",
    "Production",
    "true"
);
```

### Get Setting

```java
ConfigurationSetting setting = configClient.getConfigurationSetting(
    "app/database/connection",
    "Production"
);
System.out.println("Value: " + setting.getValue());
System.out.println("Content-Type: " + setting.getContentType());
System.out.println("Last Modified: " + setting.getLastModified());
```

### Conditional Get (If Changed)

```java
import com.azure.core.http.rest.Response;
import com.azure.core.util.Context;

Response<ConfigurationSetting> response = configClient.getConfigurationSettingWithResponse(
    setting,      // Setting with ETag
    null,         // Accept datetime
    true,         // ifChanged - only fetch if modified
    Context.NONE
);

if (response.getStatusCode() == 304) {
    System.out.println("Setting not modified");
} else {
    ConfigurationSetting updated = response.getValue();
}
```

### Update Setting

```java
ConfigurationSetting updated = configClient.setConfigurationSetting(
    "app/cache/enabled",
    "Production",
    "false"
);
```

### Conditional Update (If Unchanged)

```java
// Only update if the ETag matches (no concurrent modifications)
Response<ConfigurationSetting> response = configClient.setConfigurationSettingWithResponse(
    setting,      // Setting with current ETag
    true,         // ifUnchanged
    Context.NONE
);
```

### Delete Setting

```java
ConfigurationSetting deleted = configClient.deleteConfigurationSetting(
    "app/cache/enabled",
    "Production"
);
```

### Conditional Delete

```java
Response<ConfigurationSetting> response = configClient.deleteConfigurationSettingWithResponse(
    setting,      // Setting with ETag
    true,         // ifUnchanged
    Context.NONE
);
```
## List and Filter Settings

### List by Key Pattern

```java
import com.azure.data.appconfiguration.models.SettingSelector;
import com.azure.core.http.rest.PagedIterable;

SettingSelector selector = new SettingSelector()
    .setKeyFilter("app/*");

PagedIterable<ConfigurationSetting> settings = configClient.listConfigurationSettings(selector);
for (ConfigurationSetting s : settings) {
    System.out.println(s.getKey() + " = " + s.getValue());
}
```

### List by Label

```java
SettingSelector selector = new SettingSelector()
    .setKeyFilter("*")
    .setLabelFilter("Production");

PagedIterable<ConfigurationSetting> settings = configClient.listConfigurationSettings(selector);
```

### List by Multiple Keys

```java
SettingSelector selector = new SettingSelector()
    .setKeyFilter("app/database/*,app/cache/*");

PagedIterable<ConfigurationSetting> settings = configClient.listConfigurationSettings(selector);
```

### List Revisions

```java
SettingSelector selector = new SettingSelector()
    .setKeyFilter("app/database/connection");

PagedIterable<ConfigurationSetting> revisions = configClient.listRevisions(selector);
for (ConfigurationSetting revision : revisions) {
    System.out.println("Value: " + revision.getValue() + ", Modified: " + revision.getLastModified());
}
```
## Feature Flags

### Create Feature Flag

```java
import com.azure.data.appconfiguration.models.FeatureFlagConfigurationSetting;
import com.azure.data.appconfiguration.models.FeatureFlagFilter;
import java.util.Arrays;

FeatureFlagFilter percentageFilter = new FeatureFlagFilter("Microsoft.Percentage")
    .addParameter("Value", 50);

FeatureFlagConfigurationSetting featureFlag = new FeatureFlagConfigurationSetting("beta-feature", true)
    .setDescription("Beta feature rollout")
    .setClientFilters(Arrays.asList(percentageFilter));

FeatureFlagConfigurationSetting created = (FeatureFlagConfigurationSetting)
    configClient.addConfigurationSetting(featureFlag);
```

### Get Feature Flag

```java
FeatureFlagConfigurationSetting flag = (FeatureFlagConfigurationSetting)
    configClient.getConfigurationSetting(featureFlag);

System.out.println("Feature: " + flag.getFeatureId());
System.out.println("Enabled: " + flag.isEnabled());
System.out.println("Filters: " + flag.getClientFilters());
```

### Update Feature Flag

```java
featureFlag.setEnabled(false);
FeatureFlagConfigurationSetting updated = (FeatureFlagConfigurationSetting)
    configClient.setConfigurationSetting(featureFlag);
```
## Secret References

### Create Secret Reference

```java
import com.azure.data.appconfiguration.models.SecretReferenceConfigurationSetting;

SecretReferenceConfigurationSetting secretRef = new SecretReferenceConfigurationSetting(
    "app/secrets/api-key",
    "https://myvault.vault.azure.net/secrets/api-key"
);

SecretReferenceConfigurationSetting created = (SecretReferenceConfigurationSetting)
    configClient.addConfigurationSetting(secretRef);
```

### Get Secret Reference

```java
SecretReferenceConfigurationSetting ref = (SecretReferenceConfigurationSetting)
    configClient.getConfigurationSetting(secretRef);

System.out.println("Secret URI: " + ref.getSecretId());
```
## Read-Only Settings

### Set Read-Only

```java
ConfigurationSetting readOnly = configClient.setReadOnly(
    "app/critical/setting",
    "Production",
    true
);
```

### Clear Read-Only

```java
ConfigurationSetting writable = configClient.setReadOnly(
    "app/critical/setting",
    "Production",
    false
);
```
## Snapshots

### Create Snapshot

```java
import com.azure.data.appconfiguration.models.ConfigurationSnapshot;
import com.azure.data.appconfiguration.models.ConfigurationSettingsFilter;
import com.azure.core.util.polling.SyncPoller;
import com.azure.core.util.polling.PollOperationDetails;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;

List<ConfigurationSettingsFilter> filters = new ArrayList<>();
filters.add(new ConfigurationSettingsFilter("app/*"));

SyncPoller<PollOperationDetails, ConfigurationSnapshot> poller = configClient.beginCreateSnapshot(
    "release-v1.0",
    new ConfigurationSnapshot(filters),
    Context.NONE
);
poller.setPollInterval(Duration.ofSeconds(10));
poller.waitForCompletion();

ConfigurationSnapshot snapshot = poller.getFinalResult();
System.out.println("Snapshot: " + snapshot.getName() + ", Status: " + snapshot.getStatus());
```

### Get Snapshot

```java
ConfigurationSnapshot snapshot = configClient.getSnapshot("release-v1.0");
System.out.println("Created: " + snapshot.getCreatedAt());
System.out.println("Items: " + snapshot.getItemCount());
```

### List Settings in Snapshot

```java
PagedIterable<ConfigurationSetting> settings =
    configClient.listConfigurationSettingsForSnapshot("release-v1.0");

for (ConfigurationSetting setting : settings) {
    System.out.println(setting.getKey() + " = " + setting.getValue());
}
```

### Archive Snapshot

```java
ConfigurationSnapshot archived = configClient.archiveSnapshot("release-v1.0");
System.out.println("Status: " + archived.getStatus()); // archived
```

### Recover Snapshot

```java
ConfigurationSnapshot recovered = configClient.recoverSnapshot("release-v1.0");
System.out.println("Status: " + recovered.getStatus()); // ready
```

### List All Snapshots

```java
import com.azure.data.appconfiguration.models.SnapshotSelector;

SnapshotSelector selector = new SnapshotSelector().setNameFilter("release-*");
PagedIterable<ConfigurationSnapshot> snapshots = configClient.listSnapshots(selector);

for (ConfigurationSnapshot snap : snapshots) {
    System.out.println(snap.getName() + " - " + snap.getStatus());
}
```
## Labels

### List Labels

```java
import com.azure.data.appconfiguration.models.SettingLabelSelector;

configClient.listLabels(new SettingLabelSelector().setNameFilter("*"))
    .forEach(label -> System.out.println("Label: " + label.getName()));
```

## Async Operations

```java
ConfigurationAsyncClient asyncClient = new ConfigurationClientBuilder()
    .connectionString(connectionString)
    .buildAsyncClient();

// Async list with reactive streams
asyncClient.listConfigurationSettings(new SettingSelector().setLabelFilter("Production"))
    .subscribe(
        setting -> System.out.println(setting.getKey() + " = " + setting.getValue()),
        error -> System.err.println("Error: " + error.getMessage()),
        () -> System.out.println("Completed")
    );
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    configClient.getConfigurationSetting("nonexistent", null);
} catch (HttpResponseException e) {
    if (e.getResponse().getStatusCode() == 404) {
        System.err.println("Setting not found");
    } else {
        System.err.println("Error: " + e.getMessage());
    }
}
```

## Best Practices

1. **Use labels** — Separate configurations by environment (Dev, Staging, Production)
2. **Use snapshots** — Create immutable snapshots for releases
3. **Feature flags** — Use for gradual rollouts and A/B testing
4. **Secret references** — Store sensitive values in Key Vault
5. **Conditional requests** — Use ETags for optimistic concurrency
6. **Read-only protection** — Lock critical production settings
7. **Use Entra ID** — Preferred over connection strings
8. **Async client** — Use for high-throughput scenarios

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-data-appconfiguration |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/appconfiguration/azure-data-appconfiguration |
| API Documentation | https://aka.ms/java-docs |
| Product Docs | https://learn.microsoft.com/azure/azure-app-configuration |
| Samples | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/appconfiguration/azure-data-appconfiguration/src/samples |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/appconfiguration/azure-data-appconfiguration/TROUBLESHOOTING.md |
249
skills/azure-appconfiguration-py/SKILL.md
Normal file
249
skills/azure-appconfiguration-py/SKILL.md
Normal file
@@ -0,0 +1,249 @@
|
||||
---
name: azure-appconfiguration-py
description: |
  Azure App Configuration SDK for Python. Use for centralized configuration management, feature flags, and dynamic settings.
  Triggers: "azure-appconfiguration", "AzureAppConfigurationClient", "feature flags", "configuration", "key-value settings".
package: azure-appconfiguration
---

# Azure App Configuration SDK for Python

Centralized configuration management with feature flags and dynamic settings.

## Installation

```bash
pip install azure-appconfiguration
```

## Environment Variables

```bash
AZURE_APPCONFIGURATION_CONNECTION_STRING=Endpoint=https://<name>.azconfig.io;Id=...;Secret=...
# Or for Entra ID:
AZURE_APPCONFIGURATION_ENDPOINT=https://<name>.azconfig.io
```

## Authentication

### Connection String

```python
import os

from azure.appconfiguration import AzureAppConfigurationClient

client = AzureAppConfigurationClient.from_connection_string(
    os.environ["AZURE_APPCONFIGURATION_CONNECTION_STRING"]
)
```
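The connection string shown above is three semicolon-separated `key=value` fields. A standard-library sketch (no SDK required; the helper name and sample values are illustrative, and the SDK does this parsing for you) shows how it decomposes:

```python
def parse_app_config_connection_string(conn_str: str) -> dict:
    """Illustrative only: split an App Configuration connection string
    into its Endpoint, Id, and Secret fields."""
    parts = {}
    for segment in conn_str.split(";"):
        # partition splits at the first "=", so secrets containing "=" survive
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

sample = "Endpoint=https://myconfig.azconfig.io;Id=abc-123;Secret=s3cr3t"
fields = parse_app_config_connection_string(sample)
print(fields["Endpoint"])  # https://myconfig.azconfig.io
```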
### Entra ID

```python
import os

from azure.appconfiguration import AzureAppConfigurationClient
from azure.identity import DefaultAzureCredential

client = AzureAppConfigurationClient(
    base_url=os.environ["AZURE_APPCONFIGURATION_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

## Configuration Settings

### Get Setting

```python
setting = client.get_configuration_setting(key="app:settings:message")
print(f"{setting.key} = {setting.value}")
```

### Get with Label

```python
# Labels allow environment-specific values
setting = client.get_configuration_setting(
    key="app:settings:message",
    label="production"
)
```

### Set Setting

```python
from azure.appconfiguration import ConfigurationSetting

setting = ConfigurationSetting(
    key="app:settings:message",
    value="Hello, World!",
    label="development",
    content_type="text/plain",
    tags={"environment": "dev"}
)

client.set_configuration_setting(setting)
```

### Delete Setting

```python
client.delete_configuration_setting(
    key="app:settings:message",
    label="development"
)
```

## List Settings

### All Settings

```python
settings = client.list_configuration_settings()
for setting in settings:
    print(f"{setting.key} [{setting.label}] = {setting.value}")
```

### Filter by Key Prefix

```python
settings = client.list_configuration_settings(
    key_filter="app:settings:*"
)
```

### Filter by Label

```python
settings = client.list_configuration_settings(
    label_filter="production"
)
```

## Feature Flags

### Set Feature Flag

```python
import json

from azure.appconfiguration import ConfigurationSetting

feature_flag = ConfigurationSetting(
    key=".appconfig.featureflag/beta-feature",
    value=json.dumps({
        "id": "beta-feature",
        "enabled": True,
        "conditions": {
            "client_filters": []
        }
    }),
    content_type="application/vnd.microsoft.appconfig.ff+json;charset=utf-8"
)

client.set_configuration_setting(feature_flag)
```

### Get Feature Flag

```python
setting = client.get_configuration_setting(
    key=".appconfig.featureflag/beta-feature"
)
flag_data = json.loads(setting.value)
print(f"Feature enabled: {flag_data['enabled']}")
```

### List Feature Flags

```python
flags = client.list_configuration_settings(
    key_filter=".appconfig.featureflag/*"
)
for flag in flags:
    data = json.loads(flag.value)
    print(f"{data['id']}: {'enabled' if data['enabled'] else 'disabled'}")
```
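Because the flag value round-trips as plain JSON, its shape can be exercised without any Azure resources. This sketch builds the same payload the "Set Feature Flag" example stores and evaluates it the way a consumer would (the local evaluation rule is illustrative; real client-filter evaluation is done by a feature-management library):

```python
import json

# Build the same payload the "Set Feature Flag" example stores.
payload = json.dumps({
    "id": "beta-feature",
    "enabled": True,
    "conditions": {"client_filters": []},
})

# Evaluate it the way a consumer would after get_configuration_setting():
# with no client filters, an enabled flag applies to everyone.
flag = json.loads(payload)
is_on = flag["enabled"] and not flag["conditions"]["client_filters"]
print(f"{flag['id']} active for everyone: {is_on}")
```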
## Read-Only Settings

```python
# Make setting read-only
client.set_read_only(
    configuration_setting=setting,
    read_only=True
)

# Remove read-only
client.set_read_only(
    configuration_setting=setting,
    read_only=False
)
```

## Snapshots

### Create Snapshot

```python
from azure.appconfiguration import ConfigurationSnapshot, ConfigurationSettingFilter

snapshot = ConfigurationSnapshot(
    name="v1-snapshot",
    filters=[
        ConfigurationSettingFilter(key="app:*", label="production")
    ]
)

created = client.begin_create_snapshot(
    name="v1-snapshot",
    snapshot=snapshot
).result()
```

### List Snapshot Settings

```python
settings = client.list_configuration_settings(
    snapshot_name="v1-snapshot"
)
```

## Async Client

```python
from azure.appconfiguration.aio import AzureAppConfigurationClient
from azure.identity.aio import DefaultAzureCredential

async def main():
    credential = DefaultAzureCredential()
    client = AzureAppConfigurationClient(
        base_url=endpoint,
        credential=credential
    )

    setting = await client.get_configuration_setting(key="app:message")
    print(setting.value)

    await client.close()
    await credential.close()
```

## Client Operations

| Operation | Description |
|-----------|-------------|
| `get_configuration_setting` | Get single setting |
| `set_configuration_setting` | Create or update setting |
| `delete_configuration_setting` | Delete setting |
| `list_configuration_settings` | List with filters |
| `set_read_only` | Lock/unlock setting |
| `begin_create_snapshot` | Create point-in-time snapshot |
| `list_snapshots` | List all snapshots |

## Best Practices

1. **Use labels** for environment separation (dev, staging, prod)
2. **Use key prefixes** for logical grouping (app:database:*, app:cache:*)
3. **Make production settings read-only** to prevent accidental changes
4. **Create snapshots** before deployments for rollback capability
5. **Use Entra ID** instead of connection strings in production
6. **Refresh settings periodically** in long-running applications
7. **Use feature flags** for gradual rollouts and A/B testing
349
skills/azure-appconfiguration-ts/SKILL.md
Normal file
@@ -0,0 +1,349 @@
---
name: azure-appconfiguration-ts
description: Build applications using Azure App Configuration SDK for JavaScript (@azure/app-configuration). Use when working with configuration settings, feature flags, Key Vault references, dynamic refresh, or centralized configuration management.
package: @azure/app-configuration
---

# Azure App Configuration SDK for TypeScript

Centralized configuration management with feature flags and dynamic refresh.

## Installation

```bash
# Low-level CRUD SDK
npm install @azure/app-configuration @azure/identity

# High-level provider (recommended for apps)
npm install @azure/app-configuration-provider @azure/identity

# Feature flag management
npm install @microsoft/feature-management
```

## Environment Variables

```bash
AZURE_APPCONFIG_ENDPOINT=https://<your-resource>.azconfig.io
# OR
AZURE_APPCONFIG_CONNECTION_STRING=Endpoint=https://...;Id=...;Secret=...
```

## Authentication

```typescript
import { AppConfigurationClient } from "@azure/app-configuration";
import { DefaultAzureCredential } from "@azure/identity";

// DefaultAzureCredential (recommended)
const client = new AppConfigurationClient(
  process.env.AZURE_APPCONFIG_ENDPOINT!,
  new DefaultAzureCredential()
);

// Connection string
const client2 = new AppConfigurationClient(
  process.env.AZURE_APPCONFIG_CONNECTION_STRING!
);
```
## CRUD Operations

### Create/Update Settings

```typescript
// Add new (fails if exists)
await client.addConfigurationSetting({
  key: "app:settings:message",
  value: "Hello World",
  label: "production",
  contentType: "text/plain",
  tags: { environment: "prod" },
});

// Set (create or update)
await client.setConfigurationSetting({
  key: "app:settings:message",
  value: "Updated value",
  label: "production",
});

// Update with optimistic concurrency
const existing = await client.getConfigurationSetting({ key: "myKey" });
existing.value = "new value";
await client.setConfigurationSetting(existing, { onlyIfUnchanged: true });
```

### Read Settings

```typescript
// Get single setting
const setting = await client.getConfigurationSetting({
  key: "app:settings:message",
  label: "production", // optional
});
console.log(setting.value);

// List with filters
const settings = client.listConfigurationSettings({
  keyFilter: "app:*",
  labelFilter: "production",
});

for await (const setting of settings) {
  console.log(`${setting.key}: ${setting.value}`);
}
```

### Delete Settings

```typescript
await client.deleteConfigurationSetting({
  key: "app:settings:message",
  label: "production",
});
```

### Lock/Unlock (Read-Only)

```typescript
// Lock
await client.setReadOnly({ key: "myKey", label: "prod" }, true);

// Unlock
await client.setReadOnly({ key: "myKey", label: "prod" }, false);
```
## App Configuration Provider

### Load Configuration

```typescript
import { load } from "@azure/app-configuration-provider";
import { DefaultAzureCredential } from "@azure/identity";

const appConfig = await load(
  process.env.AZURE_APPCONFIG_ENDPOINT!,
  new DefaultAzureCredential(),
  {
    selectors: [
      { keyFilter: "app:*", labelFilter: "production" },
    ],
    trimKeyPrefixes: ["app:"],
  }
);

// Map-style access
const value = appConfig.get("settings:message");

// Object-style access
const config = appConfig.constructConfigurationObject({ separator: ":" });
console.log(config.settings.message);
```

### Dynamic Refresh

```typescript
const appConfig = await load(endpoint, credential, {
  selectors: [{ keyFilter: "app:*" }],
  refreshOptions: {
    enabled: true,
    refreshIntervalInMs: 30_000, // 30 seconds
  },
});

// Trigger refresh (non-blocking)
appConfig.refresh();

// Listen for refresh events
const disposer = appConfig.onRefresh(() => {
  console.log("Configuration refreshed!");
});

// Express middleware pattern
app.use((req, res, next) => {
  appConfig.refresh();
  next();
});
```

### Key Vault References

```typescript
const appConfig = await load(endpoint, credential, {
  selectors: [{ keyFilter: "app:*" }],
  keyVaultOptions: {
    credential: new DefaultAzureCredential(),
    secretRefreshIntervalInMs: 7200_000, // 2 hours
  },
});

// Secrets are automatically resolved
const dbPassword = appConfig.get("database:password");
```
## Feature Flags

### Create Feature Flag (Low-Level)

```typescript
import {
  featureFlagPrefix,
  featureFlagContentType,
  FeatureFlagValue,
  ConfigurationSetting,
} from "@azure/app-configuration";

const flag: ConfigurationSetting<FeatureFlagValue> = {
  key: `${featureFlagPrefix}Beta`,
  contentType: featureFlagContentType,
  value: {
    id: "Beta",
    enabled: true,
    description: "Beta feature",
    conditions: {
      clientFilters: [
        {
          name: "Microsoft.Targeting",
          parameters: {
            Audience: {
              Users: ["user@example.com"],
              Groups: [{ Name: "beta-testers", RolloutPercentage: 50 }],
              DefaultRolloutPercentage: 0,
            },
          },
        },
      ],
    },
  },
};

await client.addConfigurationSetting(flag);
```

### Load and Evaluate Feature Flags

```typescript
import { load } from "@azure/app-configuration-provider";
import {
  ConfigurationMapFeatureFlagProvider,
  FeatureManager,
} from "@microsoft/feature-management";

const appConfig = await load(endpoint, credential, {
  featureFlagOptions: {
    enabled: true,
    selectors: [{ keyFilter: "*" }],
    refresh: {
      enabled: true,
      refreshIntervalInMs: 30_000,
    },
  },
});

const featureProvider = new ConfigurationMapFeatureFlagProvider(appConfig);
const featureManager = new FeatureManager(featureProvider);

// Simple check
const isEnabled = await featureManager.isEnabled("Beta");

// With targeting context
const isEnabledForUser = await featureManager.isEnabled("Beta", {
  userId: "user@example.com",
  groups: ["beta-testers"],
});
```
## Snapshots

```typescript
// Create snapshot
const snapshot = await client.beginCreateSnapshotAndWait({
  name: "release-v1.0",
  retentionPeriod: 2592000, // 30 days
  filters: [{ keyFilter: "app:*", labelFilter: "production" }],
});

// Get snapshot
const snap = await client.getSnapshot("release-v1.0");

// List settings in snapshot
const settings = client.listConfigurationSettingsForSnapshot("release-v1.0");
for await (const setting of settings) {
  console.log(`${setting.key}: ${setting.value}`);
}

// Archive/recover
await client.archiveSnapshot("release-v1.0");
await client.recoverSnapshot("release-v1.0");

// Load from snapshot (provider)
const config = await load(endpoint, credential, {
  selectors: [{ snapshotName: "release-v1.0" }],
});
```

## Labels

```typescript
// Create settings with labels
await client.setConfigurationSetting({
  key: "database:host",
  value: "dev-db.example.com",
  label: "development",
});

await client.setConfigurationSetting({
  key: "database:host",
  value: "prod-db.example.com",
  label: "production",
});

// Filter by label
const prodSettings = client.listConfigurationSettings({
  keyFilter: "*",
  labelFilter: "production",
});

// No label (null label)
const noLabelSettings = client.listConfigurationSettings({
  labelFilter: "\0",
});

// List available labels
for await (const label of client.listLabels()) {
  console.log(label.name);
}
```

## Key Types

```typescript
import {
  AppConfigurationClient,
  ConfigurationSetting,
  FeatureFlagValue,
  SecretReferenceValue,
  featureFlagPrefix,
  featureFlagContentType,
  secretReferenceContentType,
  ListConfigurationSettingsOptions,
} from "@azure/app-configuration";

import { load } from "@azure/app-configuration-provider";

import {
  FeatureManager,
  ConfigurationMapFeatureFlagProvider,
} from "@microsoft/feature-management";
```

## Best Practices

1. **Use provider for apps** - `@azure/app-configuration-provider` for runtime config
2. **Use low-level for management** - `@azure/app-configuration` for CRUD operations
3. **Enable refresh** - For dynamic configuration updates
4. **Use labels** - Separate configurations by environment
5. **Use snapshots** - For immutable release configurations
6. **Sentinel pattern** - Use a sentinel key to trigger full refresh
7. **RBAC roles** - `App Configuration Data Reader` for read-only access
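Practice 6's sentinel pattern can be sketched without the SDK: watch a single sentinel key's etag and trigger a full reload only when it changes. The function and etag values below are illustrative only.

```typescript
// Illustrative sentinel check: reload configuration only when the
// sentinel key's etag differs from the last one observed.
function shouldReload(lastEtag: string | undefined, currentEtag: string): boolean {
  return lastEtag !== currentEtag;
}

let lastSeen: string | undefined = undefined;

// First observation always triggers a load.
console.log(shouldReload(lastSeen, "etag-v1")); // true
lastSeen = "etag-v1";

// Unchanged sentinel: skip the expensive full reload.
console.log(shouldReload(lastSeen, "etag-v1")); // false

// Changed sentinel: reload all watched settings.
console.log(shouldReload(lastSeen, "etag-v2")); // true
```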
254
skills/azure-communication-callautomation-java/SKILL.md
Normal file
@@ -0,0 +1,254 @@
---
name: azure-communication-callautomation-java
description: Build call automation workflows with Azure Communication Services Call Automation Java SDK. Use when implementing IVR systems, call routing, call recording, DTMF recognition, text-to-speech, or AI-powered call flows.
package: com.azure:azure-communication-callautomation
---

# Azure Communication Call Automation (Java)

Build server-side call automation workflows including IVR systems, call routing, recording, and AI-powered interactions.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-callautomation</artifactId>
    <version>1.6.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.communication.callautomation.CallAutomationClient;
import com.azure.communication.callautomation.CallAutomationClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

// With DefaultAzureCredential
CallAutomationClient client = new CallAutomationClientBuilder()
    .endpoint("https://<resource>.communication.azure.com")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();

// With connection string
CallAutomationClient client = new CallAutomationClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();
```

## Key Concepts

| Class | Purpose |
|-------|---------|
| `CallAutomationClient` | Make calls, answer/reject incoming calls, redirect calls |
| `CallConnection` | Actions in established calls (add participants, terminate) |
| `CallMedia` | Media operations (play audio, recognize DTMF/speech) |
| `CallRecording` | Start/stop/pause recording |
| `CallAutomationEventParser` | Parse webhook events from ACS |

## Create Outbound Call

```java
import com.azure.communication.callautomation.models.*;
import com.azure.communication.common.CommunicationUserIdentifier;
import com.azure.communication.common.PhoneNumberIdentifier;

import java.util.List;

// Call to PSTN number
PhoneNumberIdentifier target = new PhoneNumberIdentifier("+14255551234");
PhoneNumberIdentifier caller = new PhoneNumberIdentifier("+14255550100");

CreateCallOptions options = new CreateCallOptions(
    new CommunicationUserIdentifier("<user-id>"), // Source
    List.of(target)) // Targets
    .setSourceCallerId(caller)
    .setCallbackUrl("https://your-app.com/api/callbacks");

CreateCallResult result = client.createCall(options);
String callConnectionId = result.getCallConnectionProperties().getCallConnectionId();
```

## Answer Incoming Call

```java
// From Event Grid webhook - IncomingCall event
String incomingCallContext = "<incoming-call-context-from-event>";

AnswerCallOptions options = new AnswerCallOptions(
    incomingCallContext,
    "https://your-app.com/api/callbacks");

AnswerCallResult result = client.answerCall(options);
CallConnection callConnection = result.getCallConnection();
```

## Play Audio (Text-to-Speech)

```java
CallConnection callConnection = client.getCallConnection(callConnectionId);
CallMedia callMedia = callConnection.getCallMedia();

// Play text-to-speech
TextSource textSource = new TextSource()
    .setText("Welcome to Contoso. Press 1 for sales, 2 for support.")
    .setVoiceName("en-US-JennyNeural");

PlayOptions playOptions = new PlayOptions(
    List.of(textSource),
    List.of(new CommunicationUserIdentifier("<target-user>")));

callMedia.play(playOptions);

// Play audio file
FileSource fileSource = new FileSource()
    .setUrl("https://storage.blob.core.windows.net/audio/greeting.wav");

callMedia.play(new PlayOptions(List.of(fileSource), List.of(target)));
```

## Recognize DTMF Input

```java
import java.time.Duration;

// Recognize DTMF tones
DtmfTone stopTones = DtmfTone.POUND;

CallMediaRecognizeDtmfOptions recognizeOptions = new CallMediaRecognizeDtmfOptions(
    new CommunicationUserIdentifier("<target-user>"),
    5) // Max tones to collect
    .setInterToneTimeout(Duration.ofSeconds(5))
    .setStopTones(List.of(stopTones))
    .setInitialSilenceTimeout(Duration.ofSeconds(15))
    .setPlayPrompt(new TextSource().setText("Enter your account number followed by pound."));

callMedia.startRecognizing(recognizeOptions);
```
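Once a DTMF result arrives (see the webhook section below), routing it is plain string dispatch. This SDK-free sketch maps the prompt from the text-to-speech example to actions; the class name and menu entries are illustrative only.

```java
public class IvrMenu {
    // Illustrative routing for "Press 1 for sales, 2 for support."
    static String route(String tones) {
        switch (tones) {
            case "1":
                return "sales";
            case "2":
                return "support";
            default:
                return "replay-menu"; // unknown input: repeat the prompt
        }
    }

    public static void main(String[] args) {
        System.out.println(route("1")); // sales
        System.out.println(route("9")); // replay-menu
    }
}
```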
## Recognize Speech

```java
// Speech recognition with AI
CallMediaRecognizeSpeechOptions speechOptions = new CallMediaRecognizeSpeechOptions(
    new CommunicationUserIdentifier("<target-user>"))
    .setEndSilenceTimeout(Duration.ofSeconds(2))
    .setSpeechLanguage("en-US")
    .setPlayPrompt(new TextSource().setText("How can I help you today?"));

callMedia.startRecognizing(speechOptions);
```

## Call Recording

```java
CallRecording callRecording = client.getCallRecording();

// Start recording
StartRecordingOptions recordingOptions = new StartRecordingOptions(
    new ServerCallLocator("<server-call-id>"))
    .setRecordingChannel(RecordingChannel.MIXED)
    .setRecordingContent(RecordingContent.AUDIO_VIDEO)
    .setRecordingFormat(RecordingFormat.MP4);

RecordingStateResult recordingResult = callRecording.start(recordingOptions);
String recordingId = recordingResult.getRecordingId();

// Pause/resume/stop
callRecording.pause(recordingId);
callRecording.resume(recordingId);
callRecording.stop(recordingId);

// Download recording (after RecordingFileStatusUpdated event)
callRecording.downloadTo(recordingUrl, Paths.get("recording.mp4"));
```

## Add Participant to Call

```java
CallConnection callConnection = client.getCallConnection(callConnectionId);

CommunicationUserIdentifier participant = new CommunicationUserIdentifier("<user-id>");
AddParticipantOptions addOptions = new AddParticipantOptions(participant)
    .setInvitationTimeout(Duration.ofSeconds(30));

AddParticipantResult result = callConnection.addParticipant(addOptions);
```

## Transfer Call

```java
// Blind transfer
PhoneNumberIdentifier transferTarget = new PhoneNumberIdentifier("+14255559999");
TransferCallToParticipantResult result = callConnection.transferCallToParticipant(transferTarget);
```

## Handle Events (Webhook)

```java
import com.azure.communication.callautomation.CallAutomationEventParser;
import com.azure.communication.callautomation.models.events.*;

// In your webhook endpoint
public void handleCallback(String requestBody) {
    List<CallAutomationEventBase> events = CallAutomationEventParser.parseEvents(requestBody);

    for (CallAutomationEventBase event : events) {
        if (event instanceof CallConnected) {
            CallConnected connected = (CallConnected) event;
            System.out.println("Call connected: " + connected.getCallConnectionId());
        } else if (event instanceof RecognizeCompleted) {
            RecognizeCompleted recognized = (RecognizeCompleted) event;
            // Handle DTMF or speech recognition result
            DtmfResult dtmfResult = (DtmfResult) recognized.getRecognizeResult();
            String tones = dtmfResult.getTones().stream()
                .map(DtmfTone::toString)
                .collect(Collectors.joining());
            System.out.println("DTMF received: " + tones);
        } else if (event instanceof PlayCompleted) {
            System.out.println("Audio playback completed");
        } else if (event instanceof CallDisconnected) {
            System.out.println("Call ended");
        }
    }
}
```

## Hang Up Call

```java
// Hang up for all participants
callConnection.hangUp(true);

// Hang up only this leg
callConnection.hangUp(false);
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.answerCall(options);
} catch (HttpResponseException e) {
    if (e.getResponse().getStatusCode() == 404) {
        System.out.println("Call not found or already ended");
    } else if (e.getResponse().getStatusCode() == 400) {
        System.out.println("Invalid request: " + e.getMessage());
    }
}
```

## Environment Variables

```bash
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
AZURE_COMMUNICATION_CONNECTION_STRING=endpoint=https://...;accesskey=...
CALLBACK_BASE_URL=https://your-app.com/api/callbacks
```

## Trigger Phrases

- "call automation Java", "IVR Java", "interactive voice response"
- "call recording Java", "DTMF recognition Java"
- "text to speech call", "speech recognition call"
- "answer incoming call", "transfer call Java"
- "Azure Communication Services call automation"
91
skills/azure-communication-callingserver-java/SKILL.md
Normal file
@@ -0,0 +1,91 @@
---
name: azure-communication-callingserver-java
description: Azure Communication Services CallingServer (legacy) Java SDK. Note - This SDK is deprecated. Use azure-communication-callautomation instead for new projects. Only use this skill when maintaining legacy code.
package: com.azure:azure-communication-callingserver
---

# Azure Communication CallingServer (Java) - DEPRECATED

> **⚠️ DEPRECATED**: This SDK has been renamed to **Call Automation**. For new projects, use `azure-communication-callautomation` instead. This skill is for maintaining legacy code only.

## Migration to Call Automation

```xml
<!-- OLD (deprecated) -->
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-callingserver</artifactId>
    <version>1.0.0-beta.5</version>
</dependency>

<!-- NEW (use this instead) -->
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-callautomation</artifactId>
    <version>1.6.0</version>
</dependency>
```

## Class Name Changes

| CallingServer (Old) | Call Automation (New) |
|---------------------|----------------------|
| `CallingServerClient` | `CallAutomationClient` |
| `CallingServerClientBuilder` | `CallAutomationClientBuilder` |
| `CallConnection` | `CallConnection` (same) |
| `ServerCall` | Removed - use `CallConnection` |

## Legacy Client Creation

```java
// OLD WAY (deprecated)
import com.azure.communication.callingserver.CallingServerClient;
import com.azure.communication.callingserver.CallingServerClientBuilder;

CallingServerClient client = new CallingServerClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();

// NEW WAY
import com.azure.communication.callautomation.CallAutomationClient;
import com.azure.communication.callautomation.CallAutomationClientBuilder;

CallAutomationClient client = new CallAutomationClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();
```

## Legacy Recording

```java
// OLD WAY
StartRecordingOptions options = new StartRecordingOptions(serverCallId)
    .setRecordingStateCallbackUri(callbackUri);

StartCallRecordingResult result = client.startRecording(options);
String recordingId = result.getRecordingId();

client.pauseRecording(recordingId);
client.resumeRecording(recordingId);
client.stopRecording(recordingId);

// NEW WAY - see azure-communication-callautomation skill
```

## For New Development

**Do not use this SDK for new projects.**

See the `azure-communication-callautomation-java` skill for:
- Making outbound calls
- Answering incoming calls
- Call recording
- DTMF recognition
- Text-to-speech / speech-to-text
- Adding/removing participants
- Call transfer

## Trigger Phrases

- "callingserver legacy", "deprecated calling SDK"
- "migrate callingserver to callautomation"
310
skills/azure-communication-chat-java/SKILL.md
Normal file
@@ -0,0 +1,310 @@
---
name: azure-communication-chat-java
description: Build real-time chat applications with Azure Communication Services Chat Java SDK. Use when implementing chat threads, messaging, participants, read receipts, typing notifications, or real-time chat features.
package: com.azure:azure-communication-chat
---

# Azure Communication Chat (Java)

Build real-time chat applications with thread management, messaging, participants, and read receipts.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-chat</artifactId>
    <version>1.6.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.communication.chat.ChatAsyncClient;
import com.azure.communication.chat.ChatClient;
import com.azure.communication.chat.ChatClientBuilder;
import com.azure.communication.chat.ChatThreadClient;
import com.azure.communication.common.CommunicationTokenCredential;

// ChatClient requires a CommunicationTokenCredential (user access token)
String endpoint = "https://<resource>.communication.azure.com";
String userAccessToken = "<user-access-token>";

CommunicationTokenCredential credential = new CommunicationTokenCredential(userAccessToken);

ChatClient chatClient = new ChatClientBuilder()
    .endpoint(endpoint)
    .credential(credential)
    .buildClient();

// Async client
ChatAsyncClient chatAsyncClient = new ChatClientBuilder()
    .endpoint(endpoint)
    .credential(credential)
    .buildAsyncClient();
```

## Key Concepts

| Class | Purpose |
|-------|---------|
| `ChatClient` | Create/delete chat threads, get thread clients |
| `ChatThreadClient` | Operations within a thread (messages, participants, receipts) |
| `ChatParticipant` | User in a chat thread with display name |
| `ChatMessage` | Message content, type, sender info, timestamps |
| `ChatMessageReadReceipt` | Read receipt tracking per participant |

## Create Chat Thread

```java
import com.azure.communication.chat.models.*;
import com.azure.communication.common.CommunicationUserIdentifier;
import java.util.ArrayList;
import java.util.List;

// Define participants
List<ChatParticipant> participants = new ArrayList<>();

ChatParticipant participant1 = new ChatParticipant()
    .setCommunicationIdentifier(new CommunicationUserIdentifier("<user-id-1>"))
    .setDisplayName("Alice");

ChatParticipant participant2 = new ChatParticipant()
    .setCommunicationIdentifier(new CommunicationUserIdentifier("<user-id-2>"))
    .setDisplayName("Bob");

participants.add(participant1);
participants.add(participant2);

// Create thread
CreateChatThreadOptions options = new CreateChatThreadOptions("Project Discussion")
    .setParticipants(participants);

CreateChatThreadResult result = chatClient.createChatThread(options);
String threadId = result.getChatThread().getId();

// Get thread client for operations
ChatThreadClient threadClient = chatClient.getChatThreadClient(threadId);
```

## Send Messages

```java
// Send text message
SendChatMessageOptions messageOptions = new SendChatMessageOptions()
    .setContent("Hello, team!")
    .setSenderDisplayName("Alice")
    .setType(ChatMessageType.TEXT);

SendChatMessageResult sendResult = threadClient.sendMessage(messageOptions);
String messageId = sendResult.getId();

// Send HTML message
SendChatMessageOptions htmlOptions = new SendChatMessageOptions()
    .setContent("<strong>Important:</strong> Meeting at 3pm")
    .setType(ChatMessageType.HTML);

threadClient.sendMessage(htmlOptions);
```

## Get Messages

```java
import com.azure.core.http.rest.PagedIterable;

// List all messages
PagedIterable<ChatMessage> messages = threadClient.listMessages();

for (ChatMessage message : messages) {
    System.out.println("ID: " + message.getId());
    System.out.println("Type: " + message.getType());
    System.out.println("Content: " + message.getContent().getMessage());
    System.out.println("Sender: " + message.getSenderDisplayName());
    System.out.println("Created: " + message.getCreatedOn());

    // Check if edited or deleted
    if (message.getEditedOn() != null) {
        System.out.println("Edited: " + message.getEditedOn());
    }
    if (message.getDeletedOn() != null) {
        System.out.println("Deleted: " + message.getDeletedOn());
    }
}

// Get specific message
ChatMessage message = threadClient.getMessage(messageId);
```

## Update and Delete Messages

```java
// Update message
UpdateChatMessageOptions updateOptions = new UpdateChatMessageOptions()
    .setContent("Updated message content");

threadClient.updateMessage(messageId, updateOptions);

// Delete message
threadClient.deleteMessage(messageId);
```

## Manage Participants

```java
import java.time.OffsetDateTime;

// List participants
PagedIterable<ChatParticipant> participants = threadClient.listParticipants();

for (ChatParticipant participant : participants) {
    CommunicationUserIdentifier user =
        (CommunicationUserIdentifier) participant.getCommunicationIdentifier();
    System.out.println("User: " + user.getId());
    System.out.println("Display Name: " + participant.getDisplayName());
}

// Add participants
List<ChatParticipant> newParticipants = new ArrayList<>();
newParticipants.add(new ChatParticipant()
    .setCommunicationIdentifier(new CommunicationUserIdentifier("<new-user-id>"))
    .setDisplayName("Charlie")
    .setShareHistoryTime(OffsetDateTime.now().minusDays(7))); // Share last 7 days

threadClient.addParticipants(newParticipants);

// Remove participant
CommunicationUserIdentifier userToRemove = new CommunicationUserIdentifier("<user-id>");
threadClient.removeParticipant(userToRemove);
```

## Read Receipts

```java
// Send read receipt
threadClient.sendReadReceipt(messageId);

// Get read receipts
PagedIterable<ChatMessageReadReceipt> receipts = threadClient.listReadReceipts();

for (ChatMessageReadReceipt receipt : receipts) {
    System.out.println("Message ID: " + receipt.getChatMessageId());
    System.out.println("Read by: " + receipt.getSenderCommunicationIdentifier());
    System.out.println("Read at: " + receipt.getReadOn());
}
```

## Typing Notifications

```java
import com.azure.communication.chat.models.TypingNotificationOptions;
import com.azure.core.util.Context;

// Send typing notification
TypingNotificationOptions typingOptions = new TypingNotificationOptions()
    .setSenderDisplayName("Alice");

threadClient.sendTypingNotificationWithResponse(typingOptions, Context.NONE);

// Simple typing notification
threadClient.sendTypingNotification();
```

## Thread Operations

```java
// Get thread properties
ChatThreadProperties properties = threadClient.getProperties();
System.out.println("Topic: " + properties.getTopic());
System.out.println("Created: " + properties.getCreatedOn());

// Update topic
threadClient.updateTopic("New Project Discussion Topic");

// Delete thread
chatClient.deleteChatThread(threadId);
```

## List Threads

```java
// List all chat threads for the user
PagedIterable<ChatThreadItem> threads = chatClient.listChatThreads();

for (ChatThreadItem thread : threads) {
    System.out.println("Thread ID: " + thread.getId());
    System.out.println("Topic: " + thread.getTopic());
    System.out.println("Last message: " + thread.getLastMessageReceivedOn());
}
```

## Pagination

```java
import com.azure.core.http.rest.PagedResponse;

// Paginate through messages
int maxPageSize = 10;
ListChatMessagesOptions listOptions = new ListChatMessagesOptions()
    .setMaxPageSize(maxPageSize);

PagedIterable<ChatMessage> pagedMessages = threadClient.listMessages(listOptions);

pagedMessages.iterableByPage().forEach(page -> {
    System.out.println("Page status code: " + page.getStatusCode());
    page.getElements().forEach(msg ->
        System.out.println("Message: " + msg.getContent().getMessage()));
});
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    threadClient.sendMessage(messageOptions);
} catch (HttpResponseException e) {
    switch (e.getResponse().getStatusCode()) {
        case 401:
            System.out.println("Unauthorized - check token");
            break;
        case 403:
            System.out.println("Forbidden - user not in thread");
            break;
        case 404:
            System.out.println("Thread not found");
            break;
        default:
            System.out.println("Error: " + e.getMessage());
    }
}
```

## Message Types

| Type | Description |
|------|-------------|
| `TEXT` | Regular chat message |
| `HTML` | HTML-formatted message |
| `TOPIC_UPDATED` | System message - topic changed |
| `PARTICIPANT_ADDED` | System message - participant joined |
| `PARTICIPANT_REMOVED` | System message - participant left |
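When rendering a conversation, the system message types above usually need to be separated from user-authored content. The sketch below shows one way to do that filtering; it is a minimal illustration in which the message type is modeled as a plain string, whereas real code would compare `message.getType()` against the SDK's `ChatMessageType` constants.

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Hypothetical helper: separate user-authored messages from system messages.
// Only TEXT and HTML are user-authored; everything else is a system event.
public class MessageTypeFilter {

    private static final Set<String> USER_TYPES = Set.of("text", "html");

    public static boolean isUserMessage(String type) {
        return USER_TYPES.contains(type);
    }

    public static List<String> userMessagesOnly(List<String> types) {
        return types.stream()
            .filter(MessageTypeFilter::isUserMessage)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> types = List.of("text", "participantAdded", "html", "topicUpdated");
        System.out.println(userMessagesOnly(types)); // keeps only "text" and "html"
    }
}
```

In a real handler the same predicate would run inside the `listMessages()` loop shown earlier, skipping system entries before display.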
## Environment Variables

```bash
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
AZURE_COMMUNICATION_USER_TOKEN=<user-access-token>
```

## Best Practices

1. **Token Management** - User tokens expire; implement refresh logic with `CommunicationTokenRefreshOptions`
2. **Pagination** - Use `listMessages(options)` with `maxPageSize` for large threads
3. **Share History** - Set `shareHistoryTime` when adding participants to control message visibility
4. **Message Types** - Filter system messages (`PARTICIPANT_ADDED`, etc.) from user messages
5. **Read Receipts** - Send receipts only when messages are actually viewed by the user

## Trigger Phrases

- "chat application Java", "real-time messaging Java"
- "chat thread", "chat participants", "chat messages"
- "read receipts", "typing notifications"
- "Azure Communication Services chat"
**New file:** `skills/azure-communication-common-java/SKILL.md` (+304 lines)

---
name: azure-communication-common-java
description: Azure Communication Services common utilities for Java. Use when working with CommunicationTokenCredential, user identifiers, token refresh, or shared authentication across ACS services.
package: com.azure:azure-communication-common
---

# Azure Communication Common (Java)

Shared authentication utilities and data structures for Azure Communication Services.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-common</artifactId>
    <version>1.4.0</version>
</dependency>
```

## Key Concepts

| Class | Purpose |
|-------|---------|
| `CommunicationTokenCredential` | Authenticate users with ACS services |
| `CommunicationTokenRefreshOptions` | Configure automatic token refresh |
| `CommunicationUserIdentifier` | Identify ACS users |
| `PhoneNumberIdentifier` | Identify PSTN phone numbers |
| `MicrosoftTeamsUserIdentifier` | Identify Teams users |
| `UnknownIdentifier` | Generic identifier for unknown types |

## CommunicationTokenCredential

### Static Token (Short-lived Clients)

```java
import com.azure.communication.common.CommunicationTokenCredential;

// Simple static token - no refresh
String userToken = "<user-access-token>";
CommunicationTokenCredential credential = new CommunicationTokenCredential(userToken);

// Use with Chat, Calling, etc.
ChatClient chatClient = new ChatClientBuilder()
    .endpoint("https://<resource>.communication.azure.com")
    .credential(credential)
    .buildClient();
```

### Proactive Token Refresh (Long-lived Clients)

```java
import com.azure.communication.common.CommunicationTokenRefreshOptions;
import java.util.concurrent.Callable;

// Token refresher callback - called when the token is about to expire
Callable<String> tokenRefresher = () -> {
    // Call your server to get a fresh token
    return fetchNewTokenFromServer();
};

// With proactive refresh
CommunicationTokenRefreshOptions refreshOptions = new CommunicationTokenRefreshOptions(tokenRefresher)
    .setRefreshProactively(true)    // Refresh before expiry
    .setInitialToken(currentToken); // Optional initial token

CommunicationTokenCredential credential = new CommunicationTokenCredential(refreshOptions);
```

### Async Token Refresh

```java
import java.util.concurrent.Callable;
import java.util.concurrent.CompletableFuture;

// Async token fetcher
Callable<String> asyncRefresher = () -> {
    CompletableFuture<String> future = fetchTokenAsync();
    return future.get(); // Block until token is available
};

CommunicationTokenRefreshOptions options = new CommunicationTokenRefreshOptions(asyncRefresher)
    .setRefreshProactively(true);

CommunicationTokenCredential credential = new CommunicationTokenCredential(options);
```

## Entra ID (Azure AD) Authentication

```java
import com.azure.communication.common.EntraCommunicationTokenCredentialOptions;
import com.azure.identity.InteractiveBrowserCredential;
import com.azure.identity.InteractiveBrowserCredentialBuilder;
import java.util.Arrays;
import java.util.List;

// For Teams Phone Extensibility
InteractiveBrowserCredential entraCredential = new InteractiveBrowserCredentialBuilder()
    .clientId("<your-client-id>")
    .tenantId("<your-tenant-id>")
    .redirectUrl("<your-redirect-uri>")
    .build();

String resourceEndpoint = "https://<resource>.communication.azure.com";
List<String> scopes = Arrays.asList(
    "https://auth.msft.communication.azure.com/TeamsExtension.ManageCalls"
);

EntraCommunicationTokenCredentialOptions entraOptions =
    new EntraCommunicationTokenCredentialOptions(entraCredential, resourceEndpoint)
        .setScopes(scopes);

CommunicationTokenCredential credential = new CommunicationTokenCredential(entraOptions);
```

## Communication Identifiers

### CommunicationUserIdentifier

```java
import com.azure.communication.common.CommunicationUserIdentifier;

// Create identifier for an ACS user
CommunicationUserIdentifier user = new CommunicationUserIdentifier("8:acs:resource-id_user-id");

// Get raw ID
String rawId = user.getId();
```

### PhoneNumberIdentifier

```java
import com.azure.communication.common.PhoneNumberIdentifier;

// E.164 format phone number
PhoneNumberIdentifier phone = new PhoneNumberIdentifier("+14255551234");

String phoneNumber = phone.getPhoneNumber(); // "+14255551234"
String rawId = phone.getRawId();             // "4:+14255551234"
```

### MicrosoftTeamsUserIdentifier

```java
import com.azure.communication.common.CommunicationCloudEnvironment;
import com.azure.communication.common.MicrosoftTeamsUserIdentifier;

// Teams user identifier
MicrosoftTeamsUserIdentifier teamsUser = new MicrosoftTeamsUserIdentifier("<teams-user-id>")
    .setCloudEnvironment(CommunicationCloudEnvironment.PUBLIC);

// For anonymous Teams users
MicrosoftTeamsUserIdentifier anonymousTeamsUser = new MicrosoftTeamsUserIdentifier("<teams-user-id>")
    .setAnonymous(true);
```

### UnknownIdentifier

```java
import com.azure.communication.common.UnknownIdentifier;

// For identifiers of unknown type
UnknownIdentifier unknown = new UnknownIdentifier("some-raw-id");
```

## Identifier Parsing

```java
import com.azure.communication.common.CommunicationIdentifier;

// Parse a raw ID to the appropriate type
public CommunicationIdentifier parseIdentifier(String rawId) {
    if (rawId.startsWith("8:acs:")) {
        return new CommunicationUserIdentifier(rawId);
    } else if (rawId.startsWith("4:")) {
        String phone = rawId.substring(2);
        return new PhoneNumberIdentifier(phone);
    } else if (rawId.startsWith("8:orgid:")) {
        String teamsId = rawId.substring(8);
        return new MicrosoftTeamsUserIdentifier(teamsId);
    } else {
        return new UnknownIdentifier(rawId);
    }
}
```

## Type Checking Identifiers

```java
import com.azure.communication.common.CommunicationIdentifier;

public void processIdentifier(CommunicationIdentifier identifier) {
    if (identifier instanceof CommunicationUserIdentifier) {
        CommunicationUserIdentifier user = (CommunicationUserIdentifier) identifier;
        System.out.println("ACS User: " + user.getId());

    } else if (identifier instanceof PhoneNumberIdentifier) {
        PhoneNumberIdentifier phone = (PhoneNumberIdentifier) identifier;
        System.out.println("Phone: " + phone.getPhoneNumber());

    } else if (identifier instanceof MicrosoftTeamsUserIdentifier) {
        MicrosoftTeamsUserIdentifier teams = (MicrosoftTeamsUserIdentifier) identifier;
        System.out.println("Teams User: " + teams.getUserId());
        System.out.println("Anonymous: " + teams.isAnonymous());

    } else if (identifier instanceof UnknownIdentifier) {
        UnknownIdentifier unknown = (UnknownIdentifier) identifier;
        System.out.println("Unknown: " + unknown.getId());
    }
}
```

## Token Access

```java
import com.azure.core.credential.AccessToken;

// Inspect the current token (for debugging/logging - don't expose full tokens!)
CommunicationTokenCredential credential = new CommunicationTokenCredential(token);

// getToken() returns a Mono<AccessToken>; block for sync access
AccessToken accessToken = credential.getToken().block();
System.out.println("Token expires: " + accessToken.getExpiresAt());

// Async access
credential.getToken()
    .subscribe(t -> {
        System.out.println("Token: " + t.getToken().substring(0, 20) + "...");
        System.out.println("Expires: " + t.getExpiresAt());
    });
```

## Dispose Credential

```java
// Clean up when done
credential.close();

// Or use try-with-resources
try (CommunicationTokenCredential cred = new CommunicationTokenCredential(options)) {
    // Use credential
    chatClient.doSomething();
}
```

## Cloud Environments

```java
import com.azure.communication.common.CommunicationCloudEnvironment;

// Available environments
CommunicationCloudEnvironment publicCloud = CommunicationCloudEnvironment.PUBLIC;
CommunicationCloudEnvironment govCloud = CommunicationCloudEnvironment.GCCH;
CommunicationCloudEnvironment dodCloud = CommunicationCloudEnvironment.DOD;

// Set on Teams identifier
MicrosoftTeamsUserIdentifier teamsUser = new MicrosoftTeamsUserIdentifier("<user-id>")
    .setCloudEnvironment(CommunicationCloudEnvironment.GCCH);
```

## Environment Variables

```bash
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
AZURE_COMMUNICATION_USER_TOKEN=<user-access-token>
```

## Best Practices

1. **Proactive Refresh** - Always use `setRefreshProactively(true)` for long-lived clients
2. **Token Security** - Never log or expose full tokens
3. **Close Credentials** - Dispose of credentials when no longer needed
4. **Error Handling** - Handle token refresh failures gracefully
5. **Identifier Types** - Use specific identifier types, not raw strings
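One way to apply practice 4 is to wrap the token fetch in a retrying refresher, so a transient network failure does not immediately surface as an expired credential. The sketch below is a minimal illustration: `fetchToken` is a hypothetical stand-in for a call to your token endpoint, and the wrapper is a plain `Supplier<String>` (the document's examples use `Callable<String>`; adapt to whichever form your SDK version's `CommunicationTokenRefreshOptions` constructor accepts).

```java
import java.util.function.Supplier;

// Sketch of a fault-tolerant token refresher (best practice 4).
// fetchToken is a hypothetical stand-in for your token endpoint call;
// transient failures are retried before the error reaches the credential.
public class ResilientRefresher {

    public static Supplier<String> withRetries(Supplier<String> fetchToken, int maxAttempts) {
        return () -> {
            RuntimeException last = null;
            for (int attempt = 0; attempt < maxAttempts; attempt++) {
                try {
                    return fetchToken.get();
                } catch (RuntimeException e) {
                    last = e; // transient failure: try again
                }
            }
            throw last; // all attempts failed - surface the error
        };
    }

    public static void main(String[] args) {
        int[] calls = {0};
        // Simulated endpoint that fails twice, then returns a token
        Supplier<String> refresher = withRetries(() -> {
            if (calls[0]++ < 2) {
                throw new RuntimeException("transient network error");
            }
            return "fresh-token";
        }, 5);
        System.out.println(refresher.call() == null ? "" : "");
    }
}
```

The wrapped supplier plugs into `CommunicationTokenRefreshOptions` in place of a bare fetch, keeping retry policy out of the credential itself.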
## Common Usage Patterns

```java
// Pattern: Create credential for Chat/Calling client
public ChatClient createChatClient(String token, String endpoint) {
    CommunicationTokenRefreshOptions refreshOptions =
        new CommunicationTokenRefreshOptions(this::refreshToken)
            .setRefreshProactively(true)
            .setInitialToken(token);

    CommunicationTokenCredential credential =
        new CommunicationTokenCredential(refreshOptions);

    return new ChatClientBuilder()
        .endpoint(endpoint)
        .credential(credential)
        .buildClient();
}

private String refreshToken() {
    // Call your token endpoint
    return tokenService.getNewToken();
}
```

## Trigger Phrases

- "ACS authentication", "communication token credential"
- "user access token", "token refresh"
- "CommunicationUserIdentifier", "PhoneNumberIdentifier"
- "Azure Communication Services authentication"
**New file:** `skills/azure-communication-sms-java/SKILL.md` (+274 lines)

---
name: azure-communication-sms-java
description: Send SMS messages with Azure Communication Services SMS Java SDK. Use when implementing SMS notifications, alerts, OTP delivery, bulk messaging, or delivery reports.
package: com.azure:azure-communication-sms
---

# Azure Communication SMS (Java)

Send SMS messages to single or multiple recipients with delivery reporting.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-communication-sms</artifactId>
    <version>1.2.0</version>
</dependency>
```

## Client Creation

```java
import com.azure.communication.sms.SmsAsyncClient;
import com.azure.communication.sms.SmsClient;
import com.azure.communication.sms.SmsClientBuilder;
import com.azure.core.credential.AzureKeyCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;

// With DefaultAzureCredential (recommended)
SmsClient smsClient = new SmsClientBuilder()
    .endpoint("https://<resource>.communication.azure.com")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();

// With connection string
SmsClient smsClient = new SmsClientBuilder()
    .connectionString("<connection-string>")
    .buildClient();

// With AzureKeyCredential
SmsClient smsClient = new SmsClientBuilder()
    .endpoint("https://<resource>.communication.azure.com")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildClient();

// Async client
SmsAsyncClient smsAsyncClient = new SmsClientBuilder()
    .connectionString("<connection-string>")
    .buildAsyncClient();
```

## Send SMS to Single Recipient

```java
import com.azure.communication.sms.models.SmsSendResult;

// Simple send
SmsSendResult result = smsClient.send(
    "+14255550100", // From (your ACS phone number)
    "+14255551234", // To
    "Your verification code is 123456");

System.out.println("Message ID: " + result.getMessageId());
System.out.println("To: " + result.getTo());
System.out.println("Success: " + result.isSuccessful());

if (!result.isSuccessful()) {
    System.out.println("Error: " + result.getErrorMessage());
    System.out.println("Status: " + result.getHttpStatusCode());
}
```

## Send SMS to Multiple Recipients

```java
import com.azure.communication.sms.models.SmsSendOptions;
import com.azure.core.util.Context;
import java.util.Arrays;
import java.util.List;

List<String> recipients = Arrays.asList(
    "+14255551111",
    "+14255552222",
    "+14255553333"
);

// With options
SmsSendOptions options = new SmsSendOptions()
    .setDeliveryReportEnabled(true)
    .setTag("marketing-campaign-001");

Iterable<SmsSendResult> results = smsClient.sendWithResponse(
    "+14255550100", // From
    recipients,     // To list
    "Flash sale! 50% off today only.",
    options,
    Context.NONE
).getValue();

for (SmsSendResult result : results) {
    if (result.isSuccessful()) {
        System.out.println("Sent to " + result.getTo() + ": " + result.getMessageId());
    } else {
        System.out.println("Failed to " + result.getTo() + ": " + result.getErrorMessage());
    }
}
```

## Send Options

```java
SmsSendOptions options = new SmsSendOptions();

// Enable delivery reports (sent via Event Grid)
options.setDeliveryReportEnabled(true);

// Add custom tag for tracking
options.setTag("order-confirmation-12345");
```

## Response Handling

```java
import com.azure.core.http.rest.Response;

Response<Iterable<SmsSendResult>> response = smsClient.sendWithResponse(
    "+14255550100",
    Arrays.asList("+14255551234"),
    "Hello!",
    new SmsSendOptions().setDeliveryReportEnabled(true),
    Context.NONE
);

// Check HTTP response
System.out.println("Status code: " + response.getStatusCode());
System.out.println("Headers: " + response.getHeaders());

// Process results
for (SmsSendResult result : response.getValue()) {
    System.out.println("Message ID: " + result.getMessageId());
    System.out.println("Successful: " + result.isSuccessful());

    if (!result.isSuccessful()) {
        System.out.println("HTTP Status: " + result.getHttpStatusCode());
        System.out.println("Error: " + result.getErrorMessage());
    }
}
```

## Async Operations

```java
SmsAsyncClient asyncClient = new SmsClientBuilder()
    .connectionString("<connection-string>")
    .buildAsyncClient();

// Send single message
asyncClient.send("+14255550100", "+14255551234", "Async message!")
    .subscribe(
        result -> System.out.println("Sent: " + result.getMessageId()),
        error -> System.out.println("Error: " + error.getMessage())
    );

// Send to multiple with options
SmsSendOptions options = new SmsSendOptions()
    .setDeliveryReportEnabled(true);

asyncClient.sendWithResponse(
    "+14255550100",
    Arrays.asList("+14255551111", "+14255552222"),
    "Bulk async message",
    options)
    .subscribe(response -> {
        for (SmsSendResult result : response.getValue()) {
            System.out.println("Result: " + result.getTo() + " - " + result.isSuccessful());
        }
    });
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    SmsSendResult result = smsClient.send(
        "+14255550100",
        "+14255551234",
        "Test message"
    );

    // Individual message errors don't throw exceptions
    if (!result.isSuccessful()) {
        handleMessageError(result);
    }

} catch (HttpResponseException e) {
    // Request-level failures (auth, network, etc.)
    System.out.println("Request failed: " + e.getMessage());
    System.out.println("Status: " + e.getResponse().getStatusCode());
} catch (RuntimeException e) {
    System.out.println("Unexpected error: " + e.getMessage());
}

private void handleMessageError(SmsSendResult result) {
    int status = result.getHttpStatusCode();
    String error = result.getErrorMessage();

    if (status == 400) {
        System.out.println("Invalid phone number: " + result.getTo());
    } else if (status == 429) {
        System.out.println("Rate limited - retry later");
    } else {
        System.out.println("Error " + status + ": " + error);
    }
}
```

## Delivery Reports

Delivery reports are sent via Azure Event Grid. Configure an Event Grid subscription for your ACS resource.

```java
// Event Grid webhook handler (in your endpoint)
public void handleDeliveryReport(String eventJson) {
    // Parse Event Grid event
    // Event type: Microsoft.Communication.SMSDeliveryReportReceived

    // Event data contains:
    // - messageId: correlates to SmsSendResult.getMessageId()
    // - from: sender number
    // - to: recipient number
    // - deliveryStatus: "Delivered", "Failed", etc.
    // - deliveryStatusDetails: detailed status
    // - receivedTimestamp: when status was received
    // - tag: your custom tag from SmsSendOptions
}
```
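The stub above lists the event-data fields without acting on them. The sketch below shows one shape the follow-up logic might take once the payload is deserialized: the `DeliveryReport` record is a hypothetical model of those fields (a real handler would parse the Event Grid JSON with a library such as azure-messaging-eventgrid or Jackson), and the status-to-action mapping is an illustrative assumption.

```java
// Hypothetical model of the SMSDeliveryReportReceived event-data fields
// listed above. Deserialization of the actual Event Grid payload is out of
// scope here; only the dispatch on deliveryStatus is sketched.
public class DeliveryReportHandler {

    public record DeliveryReport(String messageId, String to, String deliveryStatus, String tag) {}

    // Map a delivery status to a follow-up action for the application
    public static String actionFor(DeliveryReport report) {
        return switch (report.deliveryStatus()) {
            case "Delivered" -> "mark-delivered";
            case "Failed" -> "retry-or-alert";
            default -> "log-and-ignore";
        };
    }

    public static void main(String[] args) {
        DeliveryReport ok = new DeliveryReport("msg-1", "+14255551234", "Delivered", "otp");
        DeliveryReport bad = new DeliveryReport("msg-2", "+14255559999", "Failed", "otp");
        System.out.println(actionFor(ok));  // mark-delivered
        System.out.println(actionFor(bad)); // retry-or-alert
    }
}
```

Correlating on `messageId` (matching `SmsSendResult.getMessageId()`) and `tag` lets the handler tie each report back to the original business operation.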
## SmsSendResult Properties

| Property | Type | Description |
|----------|------|-------------|
| `getMessageId()` | String | Unique message identifier |
| `getTo()` | String | Recipient phone number |
| `isSuccessful()` | boolean | Whether send succeeded |
| `getHttpStatusCode()` | int | HTTP status for this recipient |
| `getErrorMessage()` | String | Error details if failed |
| `getRepeatabilityResult()` | RepeatabilityResult | Idempotency result |

## Environment Variables

```bash
AZURE_COMMUNICATION_ENDPOINT=https://<resource>.communication.azure.com
AZURE_COMMUNICATION_CONNECTION_STRING=endpoint=https://...;accesskey=...
SMS_FROM_NUMBER=+14255550100
```

## Best Practices

1. **Phone Number Format** - Use E.164 format: `+[country code][number]`
2. **Delivery Reports** - Enable for critical messages (OTP, alerts)
3. **Tagging** - Use tags to correlate messages with business context
4. **Error Handling** - Check `isSuccessful()` for each recipient individually
5. **Rate Limiting** - Implement retry with backoff for 429 responses
6. **Bulk Sending** - Use batch send for multiple recipients (more efficient)
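Practice 5 can be sketched as a small exponential-backoff loop. This is a minimal illustration, not SDK API: `sendOnce` is a hypothetical stand-in for a real `smsClient.send(...)` call that reports whether the attempt succeeded (e.g. not rate limited), and the 1s-doubling-to-30s schedule is an assumed policy.

```java
import java.util.function.IntPredicate;

// Exponential backoff sketch for rate-limited (HTTP 429) sends.
// sendOnce stands in for a real smsClient.send(...) call: it receives the
// attempt number and returns true when the send succeeded.
public class SmsRetry {

    // 1s, 2s, 4s, ... capped at 30s
    public static long backoffDelayMs(int attempt) {
        return Math.min(1000L << attempt, 30_000L);
    }

    public static boolean sendWithRetry(IntPredicate sendOnce, int maxAttempts) throws InterruptedException {
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            if (sendOnce.test(attempt)) {
                return true;
            }
            Thread.sleep(backoffDelayMs(attempt)); // back off before retrying
        }
        return false;
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulated send that succeeds on the first attempt (no backoff needed)
        boolean ok = sendWithRetry(attempt -> true, 5);
        System.out.println("Sent: " + ok);
    }
}
```

In production, jitter on top of the delay and honoring any `Retry-After` header are worth adding.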
## Trigger Phrases

- "send SMS Java", "text message Java"
- "SMS notification", "OTP SMS", "bulk SMS"
- "delivery report SMS", "Azure Communication Services SMS"
379
skills/azure-compute-batch-java/SKILL.md
Normal file
379
skills/azure-compute-batch-java/SKILL.md
Normal file
@@ -0,0 +1,379 @@

---
name: azure-compute-batch-java
description: |
  Azure Batch SDK for Java. Run large-scale parallel and HPC batch jobs with pools, jobs, tasks, and compute nodes.
  Triggers: "BatchClient java", "azure batch java", "batch pool java", "batch job java", "HPC java", "parallel computing java".
---

# Azure Batch SDK for Java

Client library for running large-scale parallel and high-performance computing (HPC) batch jobs in Azure.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-compute-batch</artifactId>
    <version>1.0.0-beta.5</version>
</dependency>
```

## Prerequisites

- Azure Batch account
- Pool configured with compute nodes
- Azure subscription

## Environment Variables

```bash
AZURE_BATCH_ENDPOINT=https://<account>.<region>.batch.azure.com
AZURE_BATCH_ACCOUNT=<account-name>
AZURE_BATCH_ACCESS_KEY=<account-key>
```

## Client Creation

### With Microsoft Entra ID (Recommended)

```java
import com.azure.compute.batch.BatchClient;
import com.azure.compute.batch.BatchClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

BatchClient batchClient = new BatchClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(System.getenv("AZURE_BATCH_ENDPOINT"))
    .buildClient();
```

### Async Client

```java
import com.azure.compute.batch.BatchAsyncClient;

BatchAsyncClient batchAsyncClient = new BatchClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint(System.getenv("AZURE_BATCH_ENDPOINT"))
    .buildAsyncClient();
```

### With Shared Key Credentials

```java
import com.azure.core.credential.AzureNamedKeyCredential;

String accountName = System.getenv("AZURE_BATCH_ACCOUNT");
String accountKey = System.getenv("AZURE_BATCH_ACCESS_KEY");
AzureNamedKeyCredential sharedKeyCreds = new AzureNamedKeyCredential(accountName, accountKey);

BatchClient batchClient = new BatchClientBuilder()
    .credential(sharedKeyCreds)
    .endpoint(System.getenv("AZURE_BATCH_ENDPOINT"))
    .buildClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Pool | Collection of compute nodes that run tasks |
| Job | Logical grouping of tasks |
| Task | Unit of computation (command/script) |
| Node | VM that executes tasks |
| Job Schedule | Recurring job creation |

## Pool Operations

### Create Pool

```java
import com.azure.compute.batch.models.*;

batchClient.createPool(new BatchPoolCreateParameters("myPoolId", "STANDARD_DC2s_V2")
    .setVirtualMachineConfiguration(
        new VirtualMachineConfiguration(
            new BatchVmImageReference()
                .setPublisher("Canonical")
                .setOffer("UbuntuServer")
                .setSku("22_04-lts")
                .setVersion("latest"),
            "batch.node.ubuntu 22.04"))
    .setTargetDedicatedNodes(2)
    .setTargetLowPriorityNodes(0), null);
```

### Get Pool

```java
BatchPool pool = batchClient.getPool("myPoolId");
System.out.println("Pool state: " + pool.getState());
System.out.println("Current dedicated nodes: " + pool.getCurrentDedicatedNodes());
```

### List Pools

```java
import com.azure.core.http.rest.PagedIterable;

PagedIterable<BatchPool> pools = batchClient.listPools();
for (BatchPool pool : pools) {
    System.out.println("Pool: " + pool.getId() + ", State: " + pool.getState());
}
```

### Resize Pool

```java
import com.azure.core.util.polling.SyncPoller;

BatchPoolResizeParameters resizeParams = new BatchPoolResizeParameters()
    .setTargetDedicatedNodes(4)
    .setTargetLowPriorityNodes(2);

SyncPoller<BatchPool, BatchPool> poller = batchClient.beginResizePool("myPoolId", resizeParams);
poller.waitForCompletion();
BatchPool resizedPool = poller.getFinalResult();
```

### Enable AutoScale

```java
import java.time.Duration;

BatchPoolEnableAutoScaleParameters autoScaleParams = new BatchPoolEnableAutoScaleParameters()
    .setAutoScaleEvaluationInterval(Duration.ofMinutes(5))
    .setAutoScaleFormula("$TargetDedicatedNodes = min(10, $PendingTasks.GetSample(TimeInterval_Minute * 5));");

batchClient.enablePoolAutoScale("myPoolId", autoScaleParams);
```

### Delete Pool

```java
SyncPoller<BatchPool, Void> deletePoller = batchClient.beginDeletePool("myPoolId");
deletePoller.waitForCompletion();
```

## Job Operations

### Create Job

```java
batchClient.createJob(
    new BatchJobCreateParameters("myJobId", new BatchPoolInfo().setPoolId("myPoolId"))
        .setPriority(100)
        .setConstraints(new BatchJobConstraints()
            .setMaxWallClockTime(Duration.ofHours(24))
            .setMaxTaskRetryCount(3)),
    null);
```

### Get Job

```java
BatchJob job = batchClient.getJob("myJobId", null, null);
System.out.println("Job state: " + job.getState());
```

### List Jobs

```java
PagedIterable<BatchJob> jobs = batchClient.listJobs(new BatchJobsListOptions());
for (BatchJob job : jobs) {
    System.out.println("Job: " + job.getId() + ", State: " + job.getState());
}
```

### Get Task Counts

```java
BatchTaskCountsResult counts = batchClient.getJobTaskCounts("myJobId");
System.out.println("Active: " + counts.getTaskCounts().getActive());
System.out.println("Running: " + counts.getTaskCounts().getRunning());
System.out.println("Completed: " + counts.getTaskCounts().getCompleted());
```

### Terminate Job

```java
BatchJobTerminateParameters terminateParams = new BatchJobTerminateParameters()
    .setTerminationReason("Manual termination");
BatchJobTerminateOptions options = new BatchJobTerminateOptions().setParameters(terminateParams);

SyncPoller<BatchJob, BatchJob> poller = batchClient.beginTerminateJob("myJobId", options, null);
poller.waitForCompletion();
```

### Delete Job

```java
SyncPoller<BatchJob, Void> deletePoller = batchClient.beginDeleteJob("myJobId");
deletePoller.waitForCompletion();
```

## Task Operations

### Create Single Task

```java
BatchTaskCreateParameters task = new BatchTaskCreateParameters("task1", "echo 'Hello World'");
batchClient.createTask("myJobId", task);
```

### Create Task with Exit Conditions

```java
batchClient.createTask("myJobId", new BatchTaskCreateParameters("task2", "cmd /c exit 3")
    .setExitConditions(new ExitConditions()
        .setExitCodeRanges(Arrays.asList(
            new ExitCodeRangeMapping(2, 4,
                new ExitOptions().setJobAction(BatchJobActionKind.TERMINATE)))))
    .setUserIdentity(new UserIdentity()
        .setAutoUser(new AutoUserSpecification()
            .setScope(AutoUserScope.TASK)
            .setElevationLevel(ElevationLevel.NON_ADMIN))),
    null);
```

### Create Task Collection (up to 100)

```java
List<BatchTaskCreateParameters> taskList = Arrays.asList(
    new BatchTaskCreateParameters("task1", "echo Task 1"),
    new BatchTaskCreateParameters("task2", "echo Task 2"),
    new BatchTaskCreateParameters("task3", "echo Task 3")
);
BatchTaskGroup taskGroup = new BatchTaskGroup(taskList);
BatchCreateTaskCollectionResult result = batchClient.createTaskCollection("myJobId", taskGroup);
```

### Create Many Tasks (no limit)

```java
List<BatchTaskCreateParameters> tasks = new ArrayList<>();
for (int i = 0; i < 1000; i++) {
    tasks.add(new BatchTaskCreateParameters("task" + i, "echo Task " + i));
}
batchClient.createTasks("myJobId", tasks);
```

### Get Task

```java
BatchTask task = batchClient.getTask("myJobId", "task1");
System.out.println("Task state: " + task.getState());
System.out.println("Exit code: " + task.getExecutionInfo().getExitCode());
```

### List Tasks

```java
PagedIterable<BatchTask> tasks = batchClient.listTasks("myJobId");
for (BatchTask task : tasks) {
    System.out.println("Task: " + task.getId() + ", State: " + task.getState());
}
```

### Get Task Output

```java
import com.azure.core.util.BinaryData;
import java.nio.charset.StandardCharsets;

BinaryData stdout = batchClient.getTaskFile("myJobId", "task1", "stdout.txt");
System.out.println(new String(stdout.toBytes(), StandardCharsets.UTF_8));
```

### Terminate Task

```java
batchClient.terminateTask("myJobId", "task1", null, null);
```

## Node Operations

### List Nodes

```java
PagedIterable<BatchNode> nodes = batchClient.listNodes("myPoolId", new BatchNodesListOptions());
for (BatchNode node : nodes) {
    System.out.println("Node: " + node.getId() + ", State: " + node.getState());
}
```

### Reboot Node

```java
SyncPoller<BatchNode, BatchNode> rebootPoller = batchClient.beginRebootNode("myPoolId", "nodeId");
rebootPoller.waitForCompletion();
```

### Get Remote Login Settings

```java
BatchNodeRemoteLoginSettings settings = batchClient.getNodeRemoteLoginSettings("myPoolId", "nodeId");
System.out.println("IP: " + settings.getRemoteLoginIpAddress());
System.out.println("Port: " + settings.getRemoteLoginPort());
```

## Job Schedule Operations

### Create Job Schedule

```java
import java.time.OffsetDateTime;

batchClient.createJobSchedule(new BatchJobScheduleCreateParameters("myScheduleId",
    new BatchJobScheduleConfiguration()
        .setRecurrenceInterval(Duration.ofHours(6))
        .setDoNotRunUntil(OffsetDateTime.now().plusDays(1)),
    new BatchJobSpecification(new BatchPoolInfo().setPoolId("myPoolId"))
        .setPriority(50)),
    null);
```

### Get Job Schedule

```java
BatchJobSchedule schedule = batchClient.getJobSchedule("myScheduleId");
System.out.println("Schedule state: " + schedule.getState());
```

## Error Handling

```java
import com.azure.compute.batch.models.BatchErrorException;
import com.azure.compute.batch.models.BatchError;

try {
    batchClient.getPool("nonexistent-pool");
} catch (BatchErrorException e) {
    BatchError error = e.getValue();
    System.err.println("Error code: " + error.getCode());
    System.err.println("Message: " + error.getMessage().getValue());

    if ("PoolNotFound".equals(error.getCode())) {
        System.err.println("The specified pool does not exist.");
    }
}
```

## Best Practices

1. **Use Entra ID** — Preferred over shared key for authentication
2. **Use management SDK for pools** — `azure-resourcemanager-batch` supports managed identities
3. **Batch task creation** — Use `createTaskCollection` or `createTasks` for multiple tasks
4. **Handle LRO properly** — Pool resize and delete operations are long-running
5. **Monitor task counts** — Use `getJobTaskCounts` to track progress
6. **Set constraints** — Configure `maxWallClockTime` and `maxTaskRetryCount`
7. **Use low-priority nodes** — Cost savings for fault-tolerant workloads
8. **Enable autoscale** — Dynamically adjust pool size based on workload

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-compute-batch |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/batch/azure-compute-batch |
| API Documentation | https://learn.microsoft.com/java/api/com.azure.compute.batch |
| Product Docs | https://learn.microsoft.com/azure/batch/ |
| REST API | https://learn.microsoft.com/rest/api/batchservice/ |
| Samples | https://github.com/azure/azure-batch-samples |

`skills/azure-containerregistry-py/SKILL.md` (new file, 252 lines)

---
name: azure-containerregistry-py
description: |
  Azure Container Registry SDK for Python. Use for managing container images, artifacts, and repositories.
  Triggers: "azure-containerregistry", "ContainerRegistryClient", "container images", "docker registry", "ACR".
package: azure-containerregistry
---

# Azure Container Registry SDK for Python

Manage container images, artifacts, and repositories in Azure Container Registry.

## Installation

```bash
pip install azure-containerregistry
```

## Environment Variables

```bash
AZURE_CONTAINERREGISTRY_ENDPOINT=https://<registry-name>.azurecr.io
```

## Authentication

### Entra ID (Recommended)

```python
import os

from azure.containerregistry import ContainerRegistryClient
from azure.identity import DefaultAzureCredential

client = ContainerRegistryClient(
    endpoint=os.environ["AZURE_CONTAINERREGISTRY_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

### Anonymous Access (Public Registry)

```python
from azure.containerregistry import ContainerRegistryClient

client = ContainerRegistryClient(
    endpoint="https://mcr.microsoft.com",
    credential=None,
    audience="https://mcr.microsoft.com"
)
```

## List Repositories

```python
client = ContainerRegistryClient(endpoint, DefaultAzureCredential())

for repository in client.list_repository_names():
    print(repository)
```

## Repository Operations

### Get Repository Properties

```python
properties = client.get_repository_properties("my-image")
print(f"Created: {properties.created_on}")
print(f"Modified: {properties.last_updated_on}")
print(f"Manifests: {properties.manifest_count}")
print(f"Tags: {properties.tag_count}")
```

### Update Repository Properties

```python
from azure.containerregistry import RepositoryProperties

client.update_repository_properties(
    "my-image",
    properties=RepositoryProperties(
        can_delete=False,
        can_write=False
    )
)
```

### Delete Repository

```python
client.delete_repository("my-image")
```

## List Tags

```python
for tag in client.list_tag_properties("my-image"):
    print(f"{tag.name}: {tag.created_on}")
```

### Filter by Order

```python
from azure.containerregistry import ArtifactTagOrder

# Most recent first
for tag in client.list_tag_properties(
    "my-image",
    order_by=ArtifactTagOrder.LAST_UPDATED_ON_DESCENDING
):
    print(f"{tag.name}: {tag.last_updated_on}")
```

## Manifest Operations

### List Manifests

```python
from azure.containerregistry import ArtifactManifestOrder

for manifest in client.list_manifest_properties(
    "my-image",
    order_by=ArtifactManifestOrder.LAST_UPDATED_ON_DESCENDING
):
    print(f"Digest: {manifest.digest}")
    print(f"Tags: {manifest.tags}")
    print(f"Size: {manifest.size_in_bytes}")
```

### Get Manifest Properties

```python
manifest = client.get_manifest_properties("my-image", "latest")
print(f"Digest: {manifest.digest}")
print(f"Architecture: {manifest.architecture}")
print(f"OS: {manifest.operating_system}")
```

### Update Manifest Properties

```python
from azure.containerregistry import ArtifactManifestProperties

client.update_manifest_properties(
    "my-image",
    "latest",
    properties=ArtifactManifestProperties(
        can_delete=False,
        can_write=False
    )
)
```

### Delete Manifest

```python
# Delete by digest
client.delete_manifest("my-image", "sha256:abc123...")

# Delete by tag: resolve the digest first
manifest = client.get_manifest_properties("my-image", "old-tag")
client.delete_manifest("my-image", manifest.digest)
```

## Tag Operations

### Get Tag Properties

```python
tag = client.get_tag_properties("my-image", "latest")
print(f"Digest: {tag.digest}")
print(f"Created: {tag.created_on}")
```

### Delete Tag

```python
client.delete_tag("my-image", "old-tag")
```

## Upload and Download Artifacts

```python
from azure.containerregistry import ContainerRegistryClient

client = ContainerRegistryClient(endpoint, DefaultAzureCredential())

# Download manifest
manifest = client.download_manifest("my-image", "latest")
print(f"Media type: {manifest.media_type}")
print(f"Digest: {manifest.digest}")

# Download blob
blob = client.download_blob("my-image", "sha256:abc123...")
with open("layer.tar.gz", "wb") as f:
    for chunk in blob:
        f.write(chunk)
```

## Async Client

```python
from azure.containerregistry.aio import ContainerRegistryClient
from azure.identity.aio import DefaultAzureCredential

async def list_repos():
    credential = DefaultAzureCredential()
    client = ContainerRegistryClient(endpoint, credential)

    async for repo in client.list_repository_names():
        print(repo)

    await client.close()
    await credential.close()
```

## Clean Up Old Images

```python
from datetime import datetime, timedelta, timezone

cutoff = datetime.now(timezone.utc) - timedelta(days=30)

for manifest in client.list_manifest_properties("my-image"):
    if manifest.last_updated_on < cutoff and not manifest.tags:
        print(f"Deleting {manifest.digest}")
        client.delete_manifest("my-image", manifest.digest)
```

## Client Operations

| Operation | Description |
|-----------|-------------|
| `list_repository_names` | List all repositories |
| `get_repository_properties` | Get repository metadata |
| `delete_repository` | Delete repository and all images |
| `list_tag_properties` | List tags in repository |
| `get_tag_properties` | Get tag metadata |
| `delete_tag` | Delete specific tag |
| `list_manifest_properties` | List manifests in repository |
| `get_manifest_properties` | Get manifest metadata |
| `delete_manifest` | Delete manifest by digest |
| `download_manifest` | Download manifest content |
| `download_blob` | Download layer blob |

## Best Practices

1. **Use Entra ID** for authentication in production
2. **Delete by digest**, not tag, to avoid orphaned images
3. **Lock production images** with `can_delete=False`
4. **Clean up untagged manifests** regularly
5. **Use async client** for high-throughput operations
6. **Order by last_updated** to find recent/old images
7. **Check `manifest.tags`** before deleting to avoid removing tagged images
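
Practices 2, 4, and 7 combine into a single guard. A minimal sketch, with the deletion policy pulled out as a pure function (the `is_safe_to_delete` helper and its parameters are illustrative, not part of the SDK); the commented lines show how it would plug into the client from the sections above:

```python
from datetime import datetime, timedelta, timezone

def is_safe_to_delete(tags, last_updated_on, max_age_days=30, now=None):
    """Return True only for untagged manifests older than max_age_days."""
    now = now or datetime.now(timezone.utc)
    if tags:  # never delete a manifest that is still tagged
        return False
    return last_updated_on < now - timedelta(days=max_age_days)

# Hedged usage against the client created earlier:
# for m in client.list_manifest_properties("my-image"):
#     if is_safe_to_delete(m.tags, m.last_updated_on):
#         client.delete_manifest("my-image", m.digest)
```

Keeping the policy pure makes it unit-testable without a registry.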

`skills/azure-cosmos-db-py/SKILL.md` (new file, 239 lines)

---
name: azure-cosmos-db-py
description: Build Azure Cosmos DB NoSQL services with Python/FastAPI following production-grade patterns. Use when implementing database client setup with dual auth (DefaultAzureCredential + emulator), service layer classes with CRUD operations, partition key strategies, parameterized queries, or TDD patterns for Cosmos. Triggers on phrases like "Cosmos DB", "NoSQL database", "document store", "add persistence", "database service layer", or "Python Cosmos SDK".
package: azure-cosmos
---

# Cosmos DB Service Implementation

Build production-grade Azure Cosmos DB NoSQL services following clean code, security best practices, and TDD principles.

## Installation

```bash
pip install azure-cosmos azure-identity
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_DATABASE_NAME=<database-name>
COSMOS_CONTAINER_ID=<container-id>
# For emulator only (not production)
COSMOS_KEY=<emulator-key>
```
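
The client and service snippets in this skill reference a `settings` object that is never defined here. A minimal sketch of one, loading the environment variables above (the field names mirror the snippets; swap in your own config layer, e.g. pydantic-settings, in a real app):

```python
import os
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Settings:
    cosmos_endpoint: str
    cosmos_database_name: str
    cosmos_container_id: str
    cosmos_key: Optional[str] = None  # emulator only, never set in Azure

def load_settings() -> Settings:
    # Raises KeyError early if a required variable is missing.
    return Settings(
        cosmos_endpoint=os.environ["COSMOS_ENDPOINT"],
        cosmos_database_name=os.environ["COSMOS_DATABASE_NAME"],
        cosmos_container_id=os.environ["COSMOS_CONTAINER_ID"],
        cosmos_key=os.environ.get("COSMOS_KEY"),
    )
```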

## Authentication

**DefaultAzureCredential (preferred)**:

```python
import os

from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential

client = CosmosClient(
    url=os.environ["COSMOS_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```

**Emulator (local development)**:

```python
import os

from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://localhost:8081",
    credential=os.environ["COSMOS_KEY"],
    connection_verify=False
)
```

## Architecture Overview

```
┌─────────────────────────────────────────────────────────────────┐
│ FastAPI Router                                                  │
│ - Auth dependencies (get_current_user, get_current_user_required) │
│ - HTTP error responses (HTTPException)                          │
└──────────────────────────────┬──────────────────────────────────┘
                               │
┌──────────────────────────────▼──────────────────────────────────┐
│ Service Layer                                                   │
│ - Business logic and validation                                 │
│ - Document ↔ Model conversion                                   │
│ - Graceful degradation when Cosmos unavailable                  │
└──────────────────────────────┬──────────────────────────────────┘
                               │
┌──────────────────────────────▼──────────────────────────────────┐
│ Cosmos DB Client Module                                         │
│ - Singleton container initialization                            │
│ - Dual auth: DefaultAzureCredential (Azure) / Key (emulator)    │
│ - Async wrapper via run_in_threadpool                           │
└─────────────────────────────────────────────────────────────────┘
```

## Quick Start

### 1. Client Module Setup

Create a singleton Cosmos client with dual authentication:

```python
# db/cosmos.py
from azure.cosmos import CosmosClient
from azure.identity import DefaultAzureCredential
from starlette.concurrency import run_in_threadpool

# `settings` is your application's config object (endpoint, key, names).
_cosmos_container = None

def _is_emulator_endpoint(endpoint: str) -> bool:
    return "localhost" in endpoint or "127.0.0.1" in endpoint

async def get_container():
    global _cosmos_container
    if _cosmos_container is None:
        if _is_emulator_endpoint(settings.cosmos_endpoint):
            client = CosmosClient(
                url=settings.cosmos_endpoint,
                credential=settings.cosmos_key,
                connection_verify=False
            )
        else:
            client = CosmosClient(
                url=settings.cosmos_endpoint,
                credential=DefaultAzureCredential()
            )
        db = client.get_database_client(settings.cosmos_database_name)
        _cosmos_container = db.get_container_client(settings.cosmos_container_id)
    return _cosmos_container
```

**Full implementation**: See [references/client-setup.md](references/client-setup.md)
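
The client module imports `run_in_threadpool` because the sync `azure-cosmos` client would otherwise block FastAPI's event loop. A minimal sketch of that wrapper, written here with the stdlib equivalent (`loop.run_in_executor`) so it runs without Starlette; with Starlette installed, `await run_in_threadpool(fn, *args)` plays the same role. The `read_item_sync` stand-in and the `get_document` signature are assumptions, not SDK API:

```python
import asyncio

def read_item_sync(container, item_id, partition_key):
    # Stand-in for the blocking SDK call:
    # return container.read_item(item=item_id, partition_key=partition_key)
    return {"id": item_id, "pk": partition_key}

async def get_document(container, item_id, partition_key):
    """Run a blocking Cosmos read off the event loop in a worker thread."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(
        None, read_item_sync, container, item_id, partition_key
    )
```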

### 2. Pydantic Model Hierarchy

Use a five-tier model pattern for clean separation:

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, Field

class ProjectBase(BaseModel):          # Shared fields
    name: str = Field(..., min_length=1, max_length=200)

class ProjectCreate(ProjectBase):      # Creation request
    workspace_id: str = Field(..., alias="workspaceId")

class ProjectUpdate(BaseModel):        # Partial updates (all optional)
    name: Optional[str] = Field(None, min_length=1)

class Project(ProjectBase):            # API response
    id: str
    created_at: datetime = Field(..., alias="createdAt")

class ProjectInDB(Project):            # Internal with docType
    doc_type: str = "project"
```

### 3. Service Layer Pattern

```python
class ProjectService:
    def _use_cosmos(self) -> bool:
        # Assumes a sync accessor that returns None when Cosmos is not configured.
        return get_container() is not None

    async def get_by_id(self, project_id: str, workspace_id: str) -> Project | None:
        if not self._use_cosmos():
            return None
        doc = await get_document(project_id, partition_key=workspace_id)
        if doc is None:
            return None
        return self._doc_to_model(doc)
```

**Full patterns**: See [references/service-layer.md](references/service-layer.md)

## Core Principles

### Security Requirements

1. **RBAC Authentication**: Use `DefaultAzureCredential` in Azure — never store keys in code
2. **Emulator-Only Keys**: Hardcode the well-known emulator key only for local development
3. **Parameterized Queries**: Always use `@parameter` syntax — never string concatenation
4. **Partition Key Validation**: Validate partition key access matches user authorization
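
Principle 3 in code form. The query-building helper below is illustrative (its name and the `docType` filter are assumptions); `query_items` with a `parameters` list of `@name`/`value` dicts is the actual `azure-cosmos` call:

```python
def build_projects_query(workspace_id: str):
    """Build a parameterized Cosmos SQL query; never interpolate user input."""
    query = (
        "SELECT * FROM c "
        "WHERE c.workspaceId = @workspaceId AND c.docType = @docType"
    )
    parameters = [
        {"name": "@workspaceId", "value": workspace_id},
        {"name": "@docType", "value": "project"},
    ]
    return query, parameters

# Hedged usage with a container client:
# query, parameters = build_projects_query(workspace_id)
# items = container.query_items(
#     query=query, parameters=parameters, partition_key=workspace_id
# )
```

Because the user-supplied value only ever appears in `parameters`, it can never rewrite the query text.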

### Clean Code Conventions

1. **Single Responsibility**: Client module handles connection; services handle business logic
2. **Graceful Degradation**: Services return `None`/`[]` when Cosmos unavailable
3. **Consistent Naming**: `_doc_to_model()`, `_model_to_doc()`, `_use_cosmos()`
4. **Type Hints**: Full typing on all public methods
5. **CamelCase Aliases**: Use `Field(alias="camelCase")` for JSON serialization

### TDD Requirements

Write tests BEFORE implementation using these patterns:

```python
import pytest

@pytest.fixture
def mock_cosmos_container(mocker):
    container = mocker.MagicMock()
    mocker.patch("app.db.cosmos.get_container", return_value=container)
    return container

@pytest.mark.asyncio
async def test_get_project_by_id_returns_project(mock_cosmos_container):
    # Arrange
    mock_cosmos_container.read_item.return_value = {"id": "123", "name": "Test"}

    # Act
    result = await project_service.get_by_id("123", "workspace-1")

    # Assert
    assert result.id == "123"
    assert result.name == "Test"
```

**Full testing guide**: See [references/testing.md](references/testing.md)

## Reference Files

| File | When to Read |
|------|--------------|
| [references/client-setup.md](references/client-setup.md) | Setting up Cosmos client with dual auth, SSL config, singleton pattern |
| [references/service-layer.md](references/service-layer.md) | Implementing full service class with CRUD, conversions, graceful degradation |
| [references/testing.md](references/testing.md) | Writing pytest tests, mocking Cosmos, integration test setup |
| [references/partitioning.md](references/partitioning.md) | Choosing partition keys, cross-partition queries, move operations |
| [references/error-handling.md](references/error-handling.md) | Handling CosmosResourceNotFoundError, logging, HTTP error mapping |

## Template Files

| File | Purpose |
|------|---------|
| [assets/cosmos_client_template.py](assets/cosmos_client_template.py) | Ready-to-use client module |
| [assets/service_template.py](assets/service_template.py) | Service class skeleton |
| [assets/conftest_template.py](assets/conftest_template.py) | pytest fixtures for Cosmos mocking |

## Quality Attributes (NFRs)

### Reliability
- Graceful degradation when Cosmos unavailable
- Retry logic with exponential backoff for transient failures
- Connection pooling via singleton pattern
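
A minimal sketch of the retry-with-backoff point above. The delay schedule and the retriable exception type are illustrative; note the `azure-cosmos` SDK also ships built-in retry policies, which you should prefer where they apply:

```python
import time

def with_backoff(fn, retries=3, base_delay=0.5, retriable=(TimeoutError,), sleep=time.sleep):
    """Call fn(), retrying transient failures with exponential backoff."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except retriable:
            if attempt == retries:
                raise  # out of retries: surface the error
            sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
```

Injecting `sleep` keeps the helper testable without real delays.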

### Security
- Zero secrets in code (RBAC via DefaultAzureCredential)
- Parameterized queries prevent injection
- Partition key isolation enforces data boundaries

### Maintainability
- Five-tier model pattern enables schema evolution
- Service layer decouples business logic from storage
- Consistent patterns across all entity services

### Testability
- Dependency injection via `get_container()`
- Easy mocking with module-level globals
- Clear separation enables unit testing without Cosmos

### Performance
- Partition key queries avoid cross-partition scans
- Async wrapping prevents blocking FastAPI event loop
- Minimal document conversion overhead

`skills/azure-cosmos-java/SKILL.md` (new file, 258 lines)

---
name: azure-cosmos-java
description: |
  Azure Cosmos DB SDK for Java. NoSQL database operations with global distribution, multi-model support, and reactive patterns.
  Triggers: "CosmosClient java", "CosmosAsyncClient", "cosmos database java", "cosmosdb java", "document database java".
package: azure-cosmos
---

# Azure Cosmos DB SDK for Java

Client library for Azure Cosmos DB NoSQL API with global distribution and reactive patterns.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-cosmos</artifactId>
    <version>LATEST</version>
</dependency>
```

Or use the Azure SDK BOM:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.azure</groupId>
            <artifactId>azure-sdk-bom</artifactId>
            <version>{bom_version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.azure</groupId>
        <artifactId>azure-cosmos</artifactId>
    </dependency>
</dependencies>
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_KEY=<your-primary-key>
```

## Authentication

### Key-based Authentication

```java
import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;

CosmosClient client = new CosmosClientBuilder()
    .endpoint(System.getenv("COSMOS_ENDPOINT"))
    .key(System.getenv("COSMOS_KEY"))
    .buildClient();
```

### Async Client

```java
import com.azure.cosmos.CosmosAsyncClient;

CosmosAsyncClient asyncClient = new CosmosClientBuilder()
    .endpoint(serviceEndpoint)
    .key(key)
    .buildAsyncClient();
```

### With Customizations

```java
import com.azure.cosmos.ConsistencyLevel;
import java.util.Arrays;

CosmosClient client = new CosmosClientBuilder()
    .endpoint(serviceEndpoint)
    .key(key)
    .directMode(directConnectionConfig, gatewayConnectionConfig)
    .consistencyLevel(ConsistencyLevel.SESSION)
    .connectionSharingAcrossClientsEnabled(true)
    .contentResponseOnWriteEnabled(true)
    .userAgentSuffix("my-application")
    .preferredRegions(Arrays.asList("West US", "East US"))
    .buildClient();
```

## Client Hierarchy

| Class | Purpose |
|-------|---------|
| `CosmosClient` / `CosmosAsyncClient` | Account-level operations |
| `CosmosDatabase` / `CosmosAsyncDatabase` | Database operations |
| `CosmosContainer` / `CosmosAsyncContainer` | Container/item operations |

## Core Workflow

### Create Database

```java
// Sync: the blocking client returns a response directly (no reactive map())
CosmosDatabaseResponse response = client.createDatabaseIfNotExists("myDatabase");
CosmosDatabase database = client.getDatabase(response.getProperties().getId());

// Async with chaining
asyncClient.createDatabaseIfNotExists("myDatabase")
    .map(response -> asyncClient.getDatabase(response.getProperties().getId()))
|
||||
.subscribe(database -> System.out.println("Created: " + database.getId()));
|
||||
```
|
||||
|
||||
### Create Container
|
||||
|
||||
```java
|
||||
asyncClient.createDatabaseIfNotExists("myDatabase")
|
||||
.flatMap(dbResponse -> {
|
||||
String databaseId = dbResponse.getProperties().getId();
|
||||
return asyncClient.getDatabase(databaseId)
|
||||
.createContainerIfNotExists("myContainer", "/partitionKey")
|
||||
.map(containerResponse -> asyncClient.getDatabase(databaseId)
|
||||
.getContainer(containerResponse.getProperties().getId()));
|
||||
})
|
||||
.subscribe(container -> System.out.println("Container: " + container.getId()));
|
||||
```
|
||||
|
||||
### CRUD Operations
|
||||
|
||||
```java
|
||||
import com.azure.cosmos.models.PartitionKey;
|
||||
|
||||
CosmosAsyncContainer container = asyncClient
|
||||
.getDatabase("myDatabase")
|
||||
.getContainer("myContainer");
|
||||
|
||||
// Create
|
||||
container.createItem(new User("1", "John Doe", "john@example.com"))
|
||||
.flatMap(response -> {
|
||||
System.out.println("Created: " + response.getItem());
|
||||
// Read
|
||||
return container.readItem(
|
||||
response.getItem().getId(),
|
||||
new PartitionKey(response.getItem().getId()),
|
||||
User.class);
|
||||
})
|
||||
.flatMap(response -> {
|
||||
System.out.println("Read: " + response.getItem());
|
||||
// Update
|
||||
User user = response.getItem();
|
||||
user.setEmail("john.doe@example.com");
|
||||
return container.replaceItem(
|
||||
user,
|
||||
user.getId(),
|
||||
new PartitionKey(user.getId()));
|
||||
})
|
||||
.flatMap(response -> {
|
||||
// Delete
|
||||
return container.deleteItem(
|
||||
response.getItem().getId(),
|
||||
new PartitionKey(response.getItem().getId()));
|
||||
})
|
||||
.block();
|
||||
```
|
||||
|
||||
### Query Documents
|
||||
|
||||
```java
|
||||
import com.azure.cosmos.models.CosmosQueryRequestOptions;
|
||||
import com.azure.cosmos.util.CosmosPagedIterable;
|
||||
|
||||
CosmosContainer container = client.getDatabase("myDatabase").getContainer("myContainer");
|
||||
|
||||
String query = "SELECT * FROM c WHERE c.status = @status";
|
||||
CosmosQueryRequestOptions options = new CosmosQueryRequestOptions();
|
||||
|
||||
CosmosPagedIterable<User> results = container.queryItems(
|
||||
query,
|
||||
options,
|
||||
User.class
|
||||
);
|
||||
|
||||
results.forEach(user -> System.out.println("User: " + user.getName()));
|
||||
```
|
||||
|
||||
## Key Concepts
|
||||
|
||||
### Partition Keys
|
||||
|
||||
Choose a partition key with:
|
||||
- High cardinality (many distinct values)
|
||||
- Even distribution of data and requests
|
||||
- Frequently used in queries
|
||||
|
||||
### Consistency Levels
|
||||
|
||||
| Level | Guarantee |
|
||||
|-------|-----------|
|
||||
| Strong | Linearizability |
|
||||
| Bounded Staleness | Consistent prefix with bounded lag |
|
||||
| Session | Consistent prefix within session |
|
||||
| Consistent Prefix | Reads never see out-of-order writes |
|
||||
| Eventual | No ordering guarantee |
|
||||
|
||||
### Request Units (RUs)
|
||||
|
||||
All operations consume RUs. Check response headers:
|
||||
|
||||
```java
|
||||
CosmosItemResponse<User> response = container.createItem(user);
|
||||
System.out.println("RU charge: " + response.getRequestCharge());
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Reuse CosmosClient** — Create once, reuse throughout application
|
||||
2. **Use async client** for high-throughput scenarios
|
||||
3. **Choose partition key carefully** — Affects performance and scalability
|
||||
4. **Enable content response on write** for immediate access to created items
|
||||
5. **Configure preferred regions** for geo-distributed applications
|
||||
6. **Handle 429 errors** with retry policies (built-in by default)
|
||||
7. **Use direct mode** for lowest latency in production
|
||||
|
||||
## Error Handling
|
||||
|
||||
```java
|
||||
import com.azure.cosmos.CosmosException;
|
||||
|
||||
try {
|
||||
container.createItem(item);
|
||||
} catch (CosmosException e) {
|
||||
System.err.println("Status: " + e.getStatusCode());
|
||||
System.err.println("Message: " + e.getMessage());
|
||||
System.err.println("Request charge: " + e.getRequestCharge());
|
||||
|
||||
if (e.getStatusCode() == 409) {
|
||||
System.err.println("Item already exists");
|
||||
} else if (e.getStatusCode() == 429) {
|
||||
System.err.println("Rate limited, retry after: " + e.getRetryAfterDuration());
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Reference Links
|
||||
|
||||
| Resource | URL |
|
||||
|----------|-----|
|
||||
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-cosmos |
|
||||
| API Documentation | https://azuresdkdocs.z19.web.core.windows.net/java/azure-cosmos/latest/index.html |
|
||||
| Product Docs | https://learn.microsoft.com/azure/cosmos-db/ |
|
||||
| Samples | https://github.com/Azure-Samples/azure-cosmos-java-sql-api-samples |
|
||||
| Performance Guide | https://learn.microsoft.com/azure/cosmos-db/performance-tips-java-sdk-v4-sql |
|
||||
| Troubleshooting | https://learn.microsoft.com/azure/cosmos-db/troubleshoot-java-sdk-v4-sql |
|
||||
**skills/azure-cosmos-py/SKILL.md** (new file, 280 lines)
---
name: azure-cosmos-py
description: |
  Azure Cosmos DB SDK for Python (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
  Triggers: "cosmos db", "CosmosClient", "container", "document", "NoSQL", "partition key".
package: azure-cosmos
---

# Azure Cosmos DB SDK for Python

Client library for Azure Cosmos DB NoSQL API — globally distributed, multi-model database.

## Installation

```bash
pip install azure-cosmos azure-identity
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_DATABASE=mydb
COSMOS_CONTAINER=mycontainer
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.cosmos import CosmosClient

credential = DefaultAzureCredential()
endpoint = "https://<account>.documents.azure.com:443/"

client = CosmosClient(url=endpoint, credential=credential)
```

## Client Hierarchy

| Client | Purpose | Get From |
|--------|---------|----------|
| `CosmosClient` | Account-level operations | Direct instantiation |
| `DatabaseProxy` | Database operations | `client.get_database_client()` |
| `ContainerProxy` | Container/item operations | `database.get_container_client()` |

## Core Workflow

### Setup Database and Container

```python
from azure.cosmos import PartitionKey

# Get or create database
database = client.create_database_if_not_exists(id="mydb")

# Get or create container with partition key
container = database.create_container_if_not_exists(
    id="mycontainer",
    partition_key=PartitionKey(path="/category")
)

# Get existing
database = client.get_database_client("mydb")
container = database.get_container_client("mycontainer")
```

### Create Item

```python
item = {
    "id": "item-001",           # Required: unique within partition
    "category": "electronics",  # Partition key value
    "name": "Laptop",
    "price": 999.99,
    "tags": ["computer", "portable"]
}

created = container.create_item(body=item)
print(f"Created: {created['id']}")
```

### Read Item

```python
# Read requires id AND partition key
item = container.read_item(
    item="item-001",
    partition_key="electronics"
)
print(f"Name: {item['name']}")
```

### Update Item (Replace)

```python
item = container.read_item(item="item-001", partition_key="electronics")
item["price"] = 899.99
item["on_sale"] = True

updated = container.replace_item(item=item["id"], body=item)
```

### Upsert Item

```python
# Create if not exists, replace if exists
item = {
    "id": "item-002",
    "category": "electronics",
    "name": "Tablet",
    "price": 499.99
}

result = container.upsert_item(body=item)
```

### Delete Item

```python
container.delete_item(
    item="item-001",
    partition_key="electronics"
)
```

## Queries

### Basic Query

```python
# Query within a partition (efficient)
query = "SELECT * FROM c WHERE c.price < @max_price"
items = container.query_items(
    query=query,
    parameters=[{"name": "@max_price", "value": 500}],
    partition_key="electronics"
)

for item in items:
    print(f"{item['name']}: ${item['price']}")
```

### Cross-Partition Query

```python
# Cross-partition (more expensive, use sparingly)
query = "SELECT * FROM c WHERE c.price < @max_price"
items = container.query_items(
    query=query,
    parameters=[{"name": "@max_price", "value": 500}],
    enable_cross_partition_query=True
)

for item in items:
    print(item)
```

### Query with Projection

```python
query = "SELECT c.id, c.name, c.price FROM c WHERE c.category = @category"
items = container.query_items(
    query=query,
    parameters=[{"name": "@category", "value": "electronics"}],
    partition_key="electronics"
)
```

### Read All Items

```python
# Read all in a partition
items = container.read_all_items()  # Cross-partition
# Or with partition key
items = container.query_items(
    query="SELECT * FROM c",
    partition_key="electronics"
)
```

## Partition Keys

**Critical**: Always include the partition key for efficient operations.

```python
from azure.cosmos import PartitionKey

# Single partition key
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customer_id")
)

# Hierarchical partition key (preview)
container = database.create_container_if_not_exists(
    id="events",
    partition_key=PartitionKey(path=["/tenant_id", "/user_id"])
)
```
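
Key choice is hard to validate by eye. A quick, SDK-free way to sanity-check a candidate key against the "even distribution" goal is to count documents per key value and look at the share taken by the hottest logical partition. The sketch below is illustrative only: `partition_skew` is a hypothetical helper, not part of azure-cosmos.

```python
from collections import Counter

def partition_skew(docs, key):
    """Count documents per partition key value and report the hot-partition ratio.

    Illustrative helper (not part of azure-cosmos). A ratio near
    1 / len(counts) means even distribution; a ratio near 1.0 means one
    logical partition receives most of the data.
    """
    counts = Counter(doc[key] for doc in docs)
    hottest = max(counts.values())
    return counts, hottest / len(docs)

docs = [
    {"id": "1", "customer_id": "a"},
    {"id": "2", "customer_id": "a"},
    {"id": "3", "customer_id": "b"},
    {"id": "4", "customer_id": "c"},
]
counts, ratio = partition_skew(docs, "customer_id")
print(counts["a"], ratio)  # → 2 0.5
```

Run the same check with a sample of real documents for each candidate path before committing to a container definition, since the partition key cannot be changed afterwards.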

## Throughput

```python
# Create container with provisioned throughput
container = database.create_container_if_not_exists(
    id="mycontainer",
    partition_key=PartitionKey(path="/pk"),
    offer_throughput=400  # RU/s
)

# Read current throughput
offer = container.read_offer()
print(f"Throughput: {offer.offer_throughput} RU/s")

# Update throughput
container.replace_throughput(throughput=1000)
```

## Async Client

```python
import asyncio

from azure.cosmos.aio import CosmosClient
from azure.identity.aio import DefaultAzureCredential

async def cosmos_operations():
    credential = DefaultAzureCredential()

    async with CosmosClient(endpoint, credential=credential) as client:
        database = client.get_database_client("mydb")
        container = database.get_container_client("mycontainer")

        # Create
        await container.create_item(body={"id": "1", "pk": "test"})

        # Read
        item = await container.read_item(item="1", partition_key="test")

        # Query
        async for item in container.query_items(
            query="SELECT * FROM c",
            partition_key="test"
        ):
            print(item)

asyncio.run(cosmos_operations())
```

## Error Handling

```python
from azure.cosmos.exceptions import CosmosHttpResponseError

try:
    item = container.read_item(item="nonexistent", partition_key="pk")
except CosmosHttpResponseError as e:
    if e.status_code == 404:
        print("Item not found")
    elif e.status_code == 429:
        print(f"Rate limited. Retry after: {e.headers.get('x-ms-retry-after-ms')}ms")
    else:
        raise
```
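
The SDK retries 429s internally, but if you add an application-level retry loop around throttled calls, the wait schedule usually honors the server's retry-after hint first and then falls back to capped exponential backoff. This is an SDK-free sketch: `backoff_delays` is a hypothetical helper, not part of azure-cosmos.

```python
def backoff_delays(retry_after_ms=None, max_retries=4, base_ms=100, cap_ms=5_000, jitter=None):
    """Yield wait times (ms) for retrying throttled (429) requests.

    Illustrative helper, not part of azure-cosmos. Honors the server's
    retry-after hint for the first wait, then falls back to capped
    exponential backoff; pass a callable as `jitter` to randomize waits.
    """
    rng = jitter or (lambda: 0)
    for attempt in range(max_retries):
        if attempt == 0 and retry_after_ms is not None:
            yield retry_after_ms
        else:
            yield min(cap_ms, base_ms * 2 ** attempt) + rng()

print(list(backoff_delays(retry_after_ms=250)))  # → [250, 200, 400, 800]
```

In practice you would sleep for each yielded delay and re-issue the request, giving up when the generator is exhausted.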

## Best Practices

1. **Always specify partition key** for point reads and queries
2. **Use parameterized queries** to prevent injection and improve caching
3. **Avoid cross-partition queries** when possible
4. **Use `upsert_item`** for idempotent writes
5. **Use async client** for high-throughput scenarios
6. **Design partition key** for even data distribution
7. **Use `read_item`** instead of query for single document retrieval

## Reference Files

| File | Contents |
|------|----------|
| [references/partitioning.md](references/partitioning.md) | Partition key strategies, hierarchical keys, hot partition detection and mitigation |
| [references/query-patterns.md](references/query-patterns.md) | Query optimization, aggregations, pagination, transactions, change feed |
| [scripts/setup_cosmos_container.py](scripts/setup_cosmos_container.py) | CLI tool for creating containers with partitioning, throughput, and indexing |
**skills/azure-cosmos-rust/SKILL.md** (new file, 135 lines)
---
name: azure-cosmos-rust
description: |
  Azure Cosmos DB SDK for Rust (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
  Triggers: "cosmos db rust", "CosmosClient rust", "container", "document rust", "NoSQL rust", "partition key".
package: azure_data_cosmos
---

# Azure Cosmos DB SDK for Rust

Client library for Azure Cosmos DB NoSQL API — globally distributed, multi-model database.

## Installation

```sh
cargo add azure_data_cosmos azure_identity
```

## Environment Variables

```bash
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
COSMOS_DATABASE=mydb
COSMOS_CONTAINER=mycontainer
```

## Authentication

```rust
use azure_identity::DeveloperToolsCredential;
use azure_data_cosmos::CosmosClient;

let credential = DeveloperToolsCredential::new(None)?;
let client = CosmosClient::new(
    "https://<account>.documents.azure.com:443/",
    credential.clone(),
    None,
)?;
```

## Client Hierarchy

| Client | Purpose | Get From |
|--------|---------|----------|
| `CosmosClient` | Account-level operations | Direct instantiation |
| `DatabaseClient` | Database operations | `client.database_client()` |
| `ContainerClient` | Container/item operations | `database.container_client()` |

## Core Workflow

### Get Database and Container Clients

```rust
let database = client.database_client("myDatabase");
let container = database.container_client("myContainer");
```

### Create Item

```rust
use serde::{Serialize, Deserialize};

#[derive(Serialize, Deserialize)]
struct Item {
    pub id: String,
    pub partition_key: String,
    pub value: String,
}

let item = Item {
    id: "1".into(),
    partition_key: "partition1".into(),
    value: "hello".into(),
};

container.create_item("partition1", item, None).await?;
```

### Read Item

```rust
let response = container.read_item("partition1", "1", None).await?;
let item: Item = response.into_model()?;
```

### Replace Item

```rust
let mut item: Item = container.read_item("partition1", "1", None).await?.into_model()?;
item.value = "updated".into();

container.replace_item("partition1", "1", item, None).await?;
```

### Patch Item

```rust
use azure_data_cosmos::models::PatchDocument;

let patch = PatchDocument::default()
    .with_add("/newField", "newValue")?
    .with_remove("/oldField")?;

container.patch_item("partition1", "1", patch, None).await?;
```

### Delete Item

```rust
container.delete_item("partition1", "1", None).await?;
```

## Key Auth (Optional)

Enable key-based authentication with the `key_auth` feature flag:

```sh
cargo add azure_data_cosmos --features key_auth
```

## Best Practices

1. **Always specify partition key** — required for point reads and writes
2. **Use `into_model()?`** — to deserialize responses into your types
3. **Derive `Serialize` and `Deserialize`** — for all document types
4. **Use Entra ID auth** — prefer `DeveloperToolsCredential` over key auth
5. **Reuse client instances** — clients are thread-safe and reusable

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_data_cosmos |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/cosmos/azure_data_cosmos |
| crates.io | https://crates.io/crates/azure_data_cosmos |
**skills/azure-cosmos-ts/SKILL.md** (new file, 471 lines)
|
||||
---
|
||||
name: azure-cosmos-ts
|
||||
description: |
|
||||
Azure Cosmos DB JavaScript/TypeScript SDK (@azure/cosmos) for data plane operations. Use for CRUD operations on documents, queries, bulk operations, and container management. Triggers: "Cosmos DB", "@azure/cosmos", "CosmosClient", "document CRUD", "NoSQL queries", "bulk operations", "partition key", "container.items".
|
||||
package: @azure/cosmos
|
||||
---
|
||||
|
||||
# @azure/cosmos (TypeScript/JavaScript)
|
||||
|
||||
Data plane SDK for Azure Cosmos DB NoSQL API operations — CRUD on documents, queries, bulk operations.
|
||||
|
||||
> **⚠️ Data vs Management Plane**
|
||||
> - **This SDK (@azure/cosmos)**: CRUD operations on documents, queries, stored procedures
|
||||
> - **Management SDK (@azure/arm-cosmosdb)**: Create accounts, databases, containers via ARM
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
npm install @azure/cosmos @azure/identity
|
||||
```
|
||||
|
||||
**Current Version**: 4.9.0
|
||||
**Node.js**: >= 20.0.0
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
COSMOS_ENDPOINT=https://<account>.documents.azure.com:443/
|
||||
COSMOS_DATABASE=<database-name>
|
||||
COSMOS_CONTAINER=<container-name>
|
||||
# For key-based auth only (prefer AAD)
|
||||
COSMOS_KEY=<account-key>
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### AAD with DefaultAzureCredential (Recommended)
|
||||
|
||||
```typescript
|
||||
import { CosmosClient } from "@azure/cosmos";
|
||||
import { DefaultAzureCredential } from "@azure/identity";
|
||||
|
||||
const client = new CosmosClient({
|
||||
endpoint: process.env.COSMOS_ENDPOINT!,
|
||||
aadCredentials: new DefaultAzureCredential(),
|
||||
});
|
||||
```
|
||||
|
||||
### Key-Based Authentication
|
||||
|
||||
```typescript
|
||||
import { CosmosClient } from "@azure/cosmos";
|
||||
|
||||
// Option 1: Endpoint + Key
|
||||
const client = new CosmosClient({
|
||||
endpoint: process.env.COSMOS_ENDPOINT!,
|
||||
key: process.env.COSMOS_KEY!,
|
||||
});
|
||||
|
||||
// Option 2: Connection String
|
||||
const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING!);
|
||||
```
|
||||
|
||||
## Resource Hierarchy
|
||||
|
||||
```
|
||||
CosmosClient
|
||||
└── Database
|
||||
└── Container
|
||||
├── Items (documents)
|
||||
├── Scripts (stored procedures, triggers, UDFs)
|
||||
└── Conflicts
|
||||
```
|
||||
|
||||
## Core Operations
|
||||
|
||||
### Database & Container Setup
|
||||
|
||||
```typescript
|
||||
const { database } = await client.databases.createIfNotExists({
|
||||
id: "my-database",
|
||||
});
|
||||
|
||||
const { container } = await database.containers.createIfNotExists({
|
||||
id: "my-container",
|
||||
partitionKey: { paths: ["/partitionKey"] },
|
||||
});
|
||||
```
|
||||
|
||||
### Create Document
|
||||
|
||||
```typescript
|
||||
interface Product {
|
||||
id: string;
|
||||
partitionKey: string;
|
||||
name: string;
|
||||
price: number;
|
||||
}
|
||||
|
||||
const item: Product = {
|
||||
id: "product-1",
|
||||
partitionKey: "electronics",
|
||||
name: "Laptop",
|
||||
price: 999.99,
|
||||
};
|
||||
|
||||
const { resource } = await container.items.create<Product>(item);
|
||||
```
|
||||
|
||||
### Read Document
|
||||
|
||||
```typescript
|
||||
const { resource } = await container
|
||||
.item("product-1", "electronics") // id, partitionKey
|
||||
.read<Product>();
|
||||
|
||||
if (resource) {
|
||||
console.log(resource.name);
|
||||
}
|
||||
```
|
||||
|
||||
### Update Document (Replace)
|
||||
|
||||
```typescript
|
||||
const { resource: existing } = await container
|
||||
.item("product-1", "electronics")
|
||||
.read<Product>();
|
||||
|
||||
if (existing) {
|
||||
existing.price = 899.99;
|
||||
const { resource: updated } = await container
|
||||
.item("product-1", "electronics")
|
||||
.replace<Product>(existing);
|
||||
}
|
||||
```
|
||||
|
||||
### Upsert Document
|
||||
|
||||
```typescript
|
||||
const item: Product = {
|
||||
id: "product-1",
|
||||
partitionKey: "electronics",
|
||||
name: "Laptop Pro",
|
||||
price: 1299.99,
|
||||
};
|
||||
|
||||
const { resource } = await container.items.upsert<Product>(item);
|
||||
```
|
||||
|
||||
### Delete Document
|
||||
|
||||
```typescript
|
||||
await container.item("product-1", "electronics").delete();
|
||||
```
|
||||
|
||||
### Patch Document (Partial Update)
|
||||
|
||||
```typescript
|
||||
import { PatchOperation } from "@azure/cosmos";
|
||||
|
||||
const operations: PatchOperation[] = [
|
||||
{ op: "replace", path: "/price", value: 799.99 },
|
||||
{ op: "add", path: "/discount", value: true },
|
||||
{ op: "remove", path: "/oldField" },
|
||||
];
|
||||
|
||||
const { resource } = await container
|
||||
.item("product-1", "electronics")
|
||||
.patch<Product>(operations);
|
||||
```
|
||||
|
||||
## Queries
|
||||
|
||||
### Simple Query
|
||||
|
||||
```typescript
|
||||
const { resources } = await container.items
|
||||
.query<Product>("SELECT * FROM c WHERE c.price < 1000")
|
||||
.fetchAll();
|
||||
```
|
||||
|
||||
### Parameterized Query (Recommended)
|
||||
|
||||
```typescript
|
||||
import { SqlQuerySpec } from "@azure/cosmos";
|
||||
|
||||
const querySpec: SqlQuerySpec = {
|
||||
query: "SELECT * FROM c WHERE c.partitionKey = @category AND c.price < @maxPrice",
|
||||
parameters: [
|
||||
{ name: "@category", value: "electronics" },
|
||||
{ name: "@maxPrice", value: 1000 },
|
||||
],
|
||||
};
|
||||
|
||||
const { resources } = await container.items
|
||||
.query<Product>(querySpec)
|
||||
.fetchAll();
|
||||
```
|
||||
|
||||
### Query with Pagination
|
||||
|
||||
```typescript
|
||||
const queryIterator = container.items.query<Product>(querySpec, {
|
||||
maxItemCount: 10, // Items per page
|
||||
});
|
||||
|
||||
while (queryIterator.hasMoreResults()) {
|
||||
const { resources, continuationToken } = await queryIterator.fetchNext();
|
||||
console.log(`Page with ${resources?.length} items`);
|
||||
// Use continuationToken for next page if needed
|
||||
}
|
||||
```
|
||||
|
||||
### Cross-Partition Query
|
||||
|
||||
```typescript
|
||||
const { resources } = await container.items
|
||||
.query<Product>(
|
||||
"SELECT * FROM c WHERE c.price > 500",
|
||||
{ enableCrossPartitionQuery: true }
|
||||
)
|
||||
.fetchAll();
|
||||
```
|
||||
|
||||
## Bulk Operations
|
||||
|
||||
### Execute Bulk Operations
|
||||
|
||||
```typescript
|
||||
import { BulkOperationType, OperationInput } from "@azure/cosmos";
|
||||
|
||||
const operations: OperationInput[] = [
|
||||
{
|
||||
operationType: BulkOperationType.Create,
|
||||
resourceBody: { id: "1", partitionKey: "cat-a", name: "Item 1" },
|
||||
},
|
||||
{
|
||||
operationType: BulkOperationType.Upsert,
|
||||
resourceBody: { id: "2", partitionKey: "cat-a", name: "Item 2" },
|
||||
},
|
||||
{
|
||||
operationType: BulkOperationType.Read,
|
||||
id: "3",
|
||||
partitionKey: "cat-b",
|
||||
},
|
||||
{
|
||||
operationType: BulkOperationType.Replace,
|
||||
id: "4",
|
||||
partitionKey: "cat-b",
|
||||
resourceBody: { id: "4", partitionKey: "cat-b", name: "Updated" },
|
||||
},
|
||||
{
|
||||
operationType: BulkOperationType.Delete,
|
||||
id: "5",
|
||||
partitionKey: "cat-c",
|
||||
},
|
||||
{
|
||||
operationType: BulkOperationType.Patch,
|
||||
id: "6",
|
||||
partitionKey: "cat-c",
|
||||
resourceBody: {
|
||||
operations: [{ op: "replace", path: "/name", value: "Patched" }],
|
||||
},
|
||||
},
|
||||
];
|
||||
|
||||
const response = await container.items.executeBulkOperations(operations);
|
||||
|
||||
response.forEach((result, index) => {
|
||||
if (result.statusCode >= 200 && result.statusCode < 300) {
|
||||
console.log(`Operation ${index} succeeded`);
|
||||
} else {
|
||||
console.error(`Operation ${index} failed: ${result.statusCode}`);
|
||||
}
|
||||
});
|
||||
```
|
||||
|
||||
## Partition Keys
|
||||
|
||||
### Simple Partition Key
|
||||
|
||||
```typescript
|
||||
const { container } = await database.containers.createIfNotExists({
|
||||
id: "products",
|
||||
partitionKey: { paths: ["/category"] },
|
||||
});
|
||||
```
|
||||
|
||||
### Hierarchical Partition Key (MultiHash)
|
||||
|
||||
```typescript
|
||||
import { PartitionKeyDefinitionVersion, PartitionKeyKind } from "@azure/cosmos";
|
||||
|
||||
const { container } = await database.containers.createIfNotExists({
|
||||
id: "orders",
|
||||
partitionKey: {
|
||||
paths: ["/tenantId", "/userId", "/sessionId"],
|
||||
version: PartitionKeyDefinitionVersion.V2,
|
||||
kind: PartitionKeyKind.MultiHash,
|
||||
},
|
||||
});
|
||||
|
||||
// Operations require array of partition key values
|
||||
const { resource } = await container.items.create({
|
||||
id: "order-1",
|
||||
tenantId: "tenant-a",
|
||||
userId: "user-123",
|
||||
sessionId: "session-xyz",
|
||||
total: 99.99,
|
||||
});
|
||||
|
||||
// Read with hierarchical partition key
|
||||
const { resource: order } = await container
|
||||
.item("order-1", ["tenant-a", "user-123", "session-xyz"])
|
||||
.read();
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
```typescript
|
||||
import { ErrorResponse } from "@azure/cosmos";
|
||||
|
||||
try {
|
||||
const { resource } = await container.item("missing", "pk").read();
|
||||
} catch (error) {
|
||||
if (error instanceof ErrorResponse) {
|
||||
switch (error.code) {
|
||||
case 404:
|
||||
console.log("Document not found");
|
||||
break;
|
||||
case 409:
|
||||
console.log("Conflict - document already exists");
|
||||
break;
|
||||
case 412:
|
||||
console.log("Precondition failed (ETag mismatch)");
|
||||
break;
|
||||
case 429:
|
||||
console.log("Rate limited - retry after:", error.retryAfterInMs);
|
||||
break;
|
||||
default:
|
||||
console.error(`Cosmos error ${error.code}: ${error.message}`);
|
||||
}
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
```
|
||||
|
||||
## Optimistic Concurrency (ETags)
|
||||
|
||||
```typescript
|
||||
// Read with ETag
|
||||
const { resource, etag } = await container
|
||||
.item("product-1", "electronics")
|
||||
.read<Product>();
|
||||
|
||||
if (resource && etag) {
|
||||
resource.price = 899.99;
|
||||
|
||||
try {
|
||||
// Replace only if ETag matches
|
||||
await container.item("product-1", "electronics").replace(resource, {
|
||||
accessCondition: { type: "IfMatch", condition: etag },
|
||||
});
|
||||
} catch (error) {
|
||||
if (error instanceof ErrorResponse && error.code === 412) {
|
||||
console.log("Document was modified by another process");
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## TypeScript Types Reference
|
||||
|
||||
```typescript
|
||||
import {
|
||||
// Client & Resources
|
||||
CosmosClient,
|
||||
Database,
|
||||
Container,
|
||||
Item,
|
||||
Items,
|
||||
|
||||
// Operations
|
||||
OperationInput,
|
||||
BulkOperationType,
|
||||
PatchOperation,
|
||||
|
||||
// Queries
|
||||
SqlQuerySpec,
|
||||
SqlParameter,
|
||||
FeedOptions,
|
||||
|
||||
// Partition Keys
|
||||
PartitionKeyDefinition,
|
||||
PartitionKeyDefinitionVersion,
|
||||
PartitionKeyKind,
|
||||
|
||||
// Responses
|
||||
ItemResponse,
|
||||
FeedResponse,
|
||||
ResourceResponse,
|
||||
|
||||
// Errors
|
||||
ErrorResponse,
|
||||
} from "@azure/cosmos";
|
||||
```
|
||||
|
||||
## Best Practices

1. **Use AAD authentication** — Prefer `DefaultAzureCredential` over keys
2. **Always use parameterized queries** — Prevents injection, improves plan caching
3. **Specify partition key** — Avoid cross-partition queries when possible
4. **Use bulk operations** — For multiple writes, use `executeBulkOperations`
5. **Handle 429 errors** — Implement retry logic with exponential backoff
6. **Use ETags for concurrency** — Prevent lost updates in concurrent scenarios
7. **Close client on shutdown** — Call `client.dispose()` in cleanup

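Practice #5 can be sketched as a small wrapper. This is a minimal illustration, not part of `@azure/cosmos` (the SDK also ships built-in retry options); `withRetries` and its parameters are hypothetical names, while the 429 status code and the `retryAfterInMs` hint mirror what the service reports on throttling.

```typescript
// Retry a throttled operation with exponential backoff.
// Illustrative helper; not an @azure/cosmos API.
async function withRetries<T>(
  op: () => Promise<T>,
  maxRetries = 5,
  baseDelayMs = 100
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await op();
    } catch (error: any) {
      // Only retry throttling errors, and only up to maxRetries times
      if (error?.code !== 429 || attempt >= maxRetries) throw error;
      // Honor the server's hint when present, else back off exponentially
      const delay = error.retryAfterInMs ?? baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Usage would look like `await withRetries(() => container.item(id, pk).read())`.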
## Common Patterns

### Service Layer Pattern

```typescript
export class ProductService {
  private container: Container;

  constructor(client: CosmosClient) {
    this.container = client
      .database(process.env.COSMOS_DATABASE!)
      .container(process.env.COSMOS_CONTAINER!);
  }

  async getById(id: string, category: string): Promise<Product | null> {
    try {
      const { resource } = await this.container
        .item(id, category)
        .read<Product>();
      return resource ?? null;
    } catch (error) {
      if (error instanceof ErrorResponse && error.code === 404) {
        return null;
      }
      throw error;
    }
  }

  async create(product: Omit<Product, "id">): Promise<Product> {
    const item = { ...product, id: crypto.randomUUID() };
    const { resource } = await this.container.items.create<Product>(item);
    return resource!;
  }

  async findByCategory(category: string): Promise<Product[]> {
    const querySpec: SqlQuerySpec = {
      query: "SELECT * FROM c WHERE c.partitionKey = @category",
      parameters: [{ name: "@category", value: category }],
    };
    const { resources } = await this.container.items
      .query<Product>(querySpec)
      .fetchAll();
    return resources;
  }
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `@azure/cosmos` | Data plane (this SDK) | `npm install @azure/cosmos` |
| `@azure/arm-cosmosdb` | Management plane (ARM) | `npm install @azure/arm-cosmosdb` |
| `@azure/identity` | Authentication | `npm install @azure/identity` |
**skills/azure-data-tables-java/SKILL.md** (new file, 334 lines)

---
name: azure-data-tables-java
description: Build table storage applications with Azure Tables SDK for Java. Use when working with Azure Table Storage or Cosmos DB Table API for NoSQL key-value data, schemaless storage, or structured data at scale.
package: com.azure:azure-data-tables
---

# Azure Tables SDK for Java

Build table storage applications using the Azure Tables SDK for Java. Works with both Azure Table Storage and Cosmos DB Table API.

## Installation

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-data-tables</artifactId>
  <version>12.6.0-beta.1</version>
</dependency>
```

## Client Creation

### With Connection String

```java
import com.azure.data.tables.TableServiceClient;
import com.azure.data.tables.TableServiceClientBuilder;
import com.azure.data.tables.TableClient;

TableServiceClient serviceClient = new TableServiceClientBuilder()
    .connectionString("<your-connection-string>")
    .buildClient();
```

### With Shared Key

```java
import com.azure.core.credential.AzureNamedKeyCredential;

AzureNamedKeyCredential credential = new AzureNamedKeyCredential(
    "<account-name>",
    "<account-key>");

TableServiceClient serviceClient = new TableServiceClientBuilder()
    .endpoint("<your-table-account-url>")
    .credential(credential)
    .buildClient();
```

### With SAS Token

```java
TableServiceClient serviceClient = new TableServiceClientBuilder()
    .endpoint("<your-table-account-url>")
    .sasToken("<sas-token>")
    .buildClient();
```

### With DefaultAzureCredential (Storage only)

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

TableServiceClient serviceClient = new TableServiceClientBuilder()
    .endpoint("<your-table-account-url>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildClient();
```

## Key Concepts

- **TableServiceClient**: Manage tables (create, list, delete)
- **TableClient**: Manage entities within a table (CRUD)
- **Partition Key**: Groups entities for efficient queries
- **Row Key**: Unique identifier within a partition
- **Entity**: A row with up to 252 properties (1MB Storage, 2MB Cosmos)

## Core Patterns

### Create Table

```java
// Create table (throws if exists)
TableClient tableClient = serviceClient.createTable("mytable");

// Create if not exists (no exception)
TableClient tableClient = serviceClient.createTableIfNotExists("mytable");
```

### Get Table Client

```java
// From service client
TableClient tableClient = serviceClient.getTableClient("mytable");

// Direct construction
TableClient tableClient = new TableClientBuilder()
    .connectionString("<connection-string>")
    .tableName("mytable")
    .buildClient();
```

### Create Entity

```java
import com.azure.data.tables.models.TableEntity;

TableEntity entity = new TableEntity("partitionKey", "rowKey")
    .addProperty("Name", "Product A")
    .addProperty("Price", 29.99)
    .addProperty("Quantity", 100)
    .addProperty("IsAvailable", true);

tableClient.createEntity(entity);
```

### Get Entity

```java
TableEntity entity = tableClient.getEntity("partitionKey", "rowKey");

String name = (String) entity.getProperty("Name");
Double price = (Double) entity.getProperty("Price");
System.out.printf("Product: %s, Price: %.2f%n", name, price);
```

### Update Entity

```java
import com.azure.data.tables.models.TableEntityUpdateMode;

// Merge (update only specified properties)
TableEntity updateEntity = new TableEntity("partitionKey", "rowKey")
    .addProperty("Price", 24.99);
tableClient.updateEntity(updateEntity, TableEntityUpdateMode.MERGE);

// Replace (replace entire entity)
TableEntity replaceEntity = new TableEntity("partitionKey", "rowKey")
    .addProperty("Name", "Product A Updated")
    .addProperty("Price", 24.99)
    .addProperty("Quantity", 150);
tableClient.updateEntity(replaceEntity, TableEntityUpdateMode.REPLACE);
```

### Upsert Entity

```java
// Insert or update (merge mode)
tableClient.upsertEntity(entity, TableEntityUpdateMode.MERGE);

// Insert or replace
tableClient.upsertEntity(entity, TableEntityUpdateMode.REPLACE);
```

### Delete Entity

```java
tableClient.deleteEntity("partitionKey", "rowKey");
```

### List Entities

```java
import com.azure.data.tables.models.ListEntitiesOptions;

// List all entities
for (TableEntity entity : tableClient.listEntities()) {
    System.out.printf("%s - %s%n",
        entity.getPartitionKey(),
        entity.getRowKey());
}

// With filtering and selection
ListEntitiesOptions options = new ListEntitiesOptions()
    .setFilter("PartitionKey eq 'sales'")
    .setSelect("Name", "Price");

for (TableEntity entity : tableClient.listEntities(options, null, null)) {
    System.out.printf("%s: %.2f%n",
        entity.getProperty("Name"),
        entity.getProperty("Price"));
}
```

### Query with OData Filter

```java
// Filter by partition key
ListEntitiesOptions options = new ListEntitiesOptions()
    .setFilter("PartitionKey eq 'electronics'");

// Filter with multiple conditions
options.setFilter("PartitionKey eq 'electronics' and Price gt 100");

// Filter with comparison operators
options.setFilter("Quantity ge 10 and Quantity le 100");

// Top N results
options.setTop(10);

for (TableEntity entity : tableClient.listEntities(options, null, null)) {
    System.out.println(entity.getRowKey());
}
```

### Batch Operations (Transactions)

```java
import com.azure.data.tables.models.TableTransactionAction;
import com.azure.data.tables.models.TableTransactionActionType;
import java.util.Arrays;
import java.util.List;

// All entities must have the same partition key
List<TableTransactionAction> actions = Arrays.asList(
    new TableTransactionAction(
        TableTransactionActionType.CREATE,
        new TableEntity("batch", "row1").addProperty("Name", "Item 1")),
    new TableTransactionAction(
        TableTransactionActionType.CREATE,
        new TableEntity("batch", "row2").addProperty("Name", "Item 2")),
    new TableTransactionAction(
        TableTransactionActionType.UPSERT_MERGE,
        new TableEntity("batch", "row3").addProperty("Name", "Item 3"))
);

tableClient.submitTransaction(actions);
```

### List Tables

```java
import com.azure.data.tables.models.TableItem;
import com.azure.data.tables.models.ListTablesOptions;

// List all tables
for (TableItem table : serviceClient.listTables()) {
    System.out.println(table.getName());
}

// Filter tables
ListTablesOptions options = new ListTablesOptions()
    .setFilter("TableName eq 'mytable'");

for (TableItem table : serviceClient.listTables(options, null, null)) {
    System.out.println(table.getName());
}
```

### Delete Table

```java
serviceClient.deleteTable("mytable");
```

## Typed Entities

`TableEntity` in `com.azure:azure-data-tables` is a final class, so a custom type cannot implement or extend it. Model your data as a plain POJO and convert to the SDK's `TableEntity`:

```java
public class Product {
    private String partitionKey;
    private String rowKey;
    private String name;
    private double price;

    // Getters and setters for all fields
    public String getPartitionKey() { return partitionKey; }
    public void setPartitionKey(String partitionKey) { this.partitionKey = partitionKey; }
    public String getRowKey() { return rowKey; }
    public void setRowKey(String rowKey) { this.rowKey = rowKey; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public double getPrice() { return price; }
    public void setPrice(double price) { this.price = price; }

    // Convert to the SDK's entity type for writes
    public TableEntity toTableEntity() {
        return new TableEntity(partitionKey, rowKey)
            .addProperty("Name", name)
            .addProperty("Price", price);
    }
}

// Usage
Product product = new Product();
product.setPartitionKey("electronics");
product.setRowKey("laptop-001");
product.setName("Laptop");
product.setPrice(999.99);

tableClient.createEntity(product.toTableEntity());
```

## Error Handling

```java
import com.azure.data.tables.models.TableServiceException;

try {
    tableClient.createEntity(entity);
} catch (TableServiceException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
    // 409 = Conflict (entity exists)
    // 404 = Not Found
}
```

## Environment Variables

```bash
# Storage Account
AZURE_TABLES_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
AZURE_TABLES_ENDPOINT=https://<account>.table.core.windows.net

# Cosmos DB Table API
COSMOS_TABLE_ENDPOINT=https://<account>.table.cosmosdb.azure.com
```

## Best Practices

1. **Partition Key Design**: Choose keys that distribute load evenly
2. **Batch Operations**: Use transactions for atomic multi-entity updates
3. **Query Optimization**: Always filter by PartitionKey when possible
4. **Select Projection**: Only select needed properties for performance
5. **Entity Size**: Keep entities under 1MB (Storage) or 2MB (Cosmos)

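Practice #2 comes with a service-side constraint: a single table transaction accepts at most 100 operations, all within one partition. A minimal sketch of splitting a large action list into transaction-sized chunks; `BatchUtil` is an illustrative helper, not part of the SDK:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchUtil {
    // Split a list into chunks of at most `size` elements, matching the
    // 100-operation limit of a single table transaction.
    public static <T> List<List<T>> chunk(List<T> items, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < items.size(); i += size) {
            chunks.add(items.subList(i, Math.min(i + size, items.size())));
        }
        return chunks;
    }
}
```

Each chunk can then be submitted with `tableClient.submitTransaction(chunk)`, with the actions built as in the Batch Operations section above.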
## Trigger Phrases

- "Azure Tables Java"
- "table storage SDK"
- "Cosmos DB Table API"
- "NoSQL key-value storage"
- "partition key row key"
- "table entity CRUD"
**skills/azure-data-tables-py/SKILL.md** (new file, 243 lines)

---
name: azure-data-tables-py
description: |
  Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations.
  Triggers: "table storage", "TableServiceClient", "TableClient", "entities", "PartitionKey", "RowKey".
package: azure-data-tables
---

# Azure Tables SDK for Python

NoSQL key-value store for structured data (Azure Storage Tables or Cosmos DB Table API).

## Installation

```bash
pip install azure-data-tables azure-identity
```

## Environment Variables

```bash
# Azure Storage Tables
AZURE_STORAGE_ACCOUNT_URL=https://<account>.table.core.windows.net

# Cosmos DB Table API
COSMOS_TABLE_ENDPOINT=https://<account>.table.cosmos.azure.com
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.data.tables import TableServiceClient, TableClient

credential = DefaultAzureCredential()
endpoint = "https://<account>.table.core.windows.net"

# Service client (manage tables)
service_client = TableServiceClient(endpoint=endpoint, credential=credential)

# Table client (work with entities)
table_client = TableClient(endpoint=endpoint, table_name="mytable", credential=credential)
```

## Client Types

| Client | Purpose |
|--------|---------|
| `TableServiceClient` | Create/delete tables, list tables |
| `TableClient` | Entity CRUD, queries |

## Table Operations

```python
# Create table
service_client.create_table("mytable")

# Create if not exists
service_client.create_table_if_not_exists("mytable")

# Delete table
service_client.delete_table("mytable")

# List tables
for table in service_client.list_tables():
    print(table.name)

# Get table client
table_client = service_client.get_table_client("mytable")
```

## Entity Operations

**Important**: Every entity requires `PartitionKey` and `RowKey` (together they form the unique ID).

### Create Entity

```python
entity = {
    "PartitionKey": "sales",
    "RowKey": "order-001",
    "product": "Widget",
    "quantity": 5,
    "price": 9.99,
    "shipped": False
}

# Create (fails if exists)
table_client.create_entity(entity=entity)

# Upsert (create or replace)
table_client.upsert_entity(entity=entity)
```

### Get Entity

```python
# Get by key (fastest)
entity = table_client.get_entity(
    partition_key="sales",
    row_key="order-001"
)
print(f"Product: {entity['product']}")
```

### Update Entity

```python
# Replace entire entity
entity["quantity"] = 10
table_client.update_entity(entity=entity, mode="replace")

# Merge (update specific fields only)
update = {
    "PartitionKey": "sales",
    "RowKey": "order-001",
    "shipped": True
}
table_client.update_entity(entity=update, mode="merge")
```

### Delete Entity

```python
table_client.delete_entity(
    partition_key="sales",
    row_key="order-001"
)
```

## Query Entities

### Query Within Partition

```python
# Query by partition (efficient)
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales'"
)
for entity in entities:
    print(entity)
```

### Query with Filters

```python
# Filter by properties
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales' and quantity gt 3"
)

# With parameters (safer)
entities = table_client.query_entities(
    query_filter="PartitionKey eq @pk and price lt @max_price",
    parameters={"pk": "sales", "max_price": 50.0}
)
```

### Select Specific Properties

```python
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales'",
    select=["RowKey", "product", "price"]
)
```

### List All Entities

```python
# List all (cross-partition - use sparingly)
for entity in table_client.list_entities():
    print(entity)
```

## Batch Operations

```python
from azure.data.tables import TableTransactionError

# Batch operations (same partition only!)
operations = [
    ("create", {"PartitionKey": "batch", "RowKey": "1", "data": "first"}),
    ("create", {"PartitionKey": "batch", "RowKey": "2", "data": "second"}),
    ("upsert", {"PartitionKey": "batch", "RowKey": "3", "data": "third"}),
]

try:
    table_client.submit_transaction(operations)
except TableTransactionError as e:
    print(f"Transaction failed: {e}")
```

## Async Client

```python
import asyncio

from azure.data.tables.aio import TableServiceClient, TableClient
from azure.identity.aio import DefaultAzureCredential

async def table_operations():
    credential = DefaultAzureCredential()

    async with TableClient(
        endpoint="https://<account>.table.core.windows.net",
        table_name="mytable",
        credential=credential
    ) as client:
        # Create
        await client.create_entity(entity={
            "PartitionKey": "async",
            "RowKey": "1",
            "data": "test"
        })

        # Query
        async for entity in client.query_entities("PartitionKey eq 'async'"):
            print(entity)

asyncio.run(table_operations())
```

## Data Types

| Python Type | Table Storage Type |
|-------------|-------------------|
| `str` | String |
| `int` | Int64 |
| `float` | Double |
| `bool` | Boolean |
| `datetime` | DateTime |
| `bytes` | Binary |
| `UUID` | Guid |

## Best Practices

1. **Design partition keys** for query patterns and even distribution
2. **Query within partitions** whenever possible (cross-partition is expensive)
3. **Use batch operations** for multiple entities in same partition
4. **Use `upsert_entity`** for idempotent writes
5. **Use parameterized queries** to prevent injection
6. **Keep entities small** — max 1MB per entity (2MB with Cosmos DB Table API)
7. **Use async client** for high-throughput scenarios
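Practice #1 can be sketched concretely. `make_partition_key` below is a hypothetical helper (not part of `azure-data-tables`) that fans a hot tenant's rows out over a fixed number of sub-partitions while keeping a queryable tenant prefix:

```python
import hashlib

def make_partition_key(tenant_id: str, row_key: str, buckets: int = 16) -> str:
    """Derive 'tenant-NN' so one tenant's writes spread over `buckets`
    partitions instead of concentrating on a single hot partition."""
    digest = hashlib.sha256(row_key.encode("utf-8")).hexdigest()
    return f"{tenant_id}-{int(digest, 16) % buckets:02d}"
```

The trade-off of this pattern is on reads: fetching all of a tenant's rows now takes `buckets` partition-scoped queries (`PartitionKey eq 'contoso-00'` through `'contoso-15'`) instead of one.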
**skills/azure-eventgrid-dotnet/SKILL.md** (new file, 488 lines)

---
name: azure-eventgrid-dotnet
description: |
  Azure Event Grid SDK for .NET. Client library for publishing and consuming events with Azure Event Grid. Use for event-driven architectures, pub/sub messaging, CloudEvents, and EventGridEvents. Triggers: "Event Grid", "EventGridPublisherClient", "CloudEvent", "EventGridEvent", "publish events .NET", "event-driven", "pub/sub".
package: Azure.Messaging.EventGrid
---

# Azure.Messaging.EventGrid (.NET)

Client library for publishing events to Azure Event Grid topics, domains, and namespaces.

## Installation

```bash
# For topics and domains (push delivery)
dotnet add package Azure.Messaging.EventGrid

# For namespaces (pull delivery)
dotnet add package Azure.Messaging.EventGrid.Namespaces

# For CloudNative CloudEvents interop
dotnet add package Microsoft.Azure.Messaging.EventGrid.CloudNativeCloudEvents
```

**Current Version**: 4.28.0 (stable)

## Environment Variables

```bash
# Topic/Domain endpoint
EVENT_GRID_TOPIC_ENDPOINT=https://<topic-name>.<region>.eventgrid.azure.net/api/events
EVENT_GRID_TOPIC_KEY=<access-key>

# Namespace endpoint (for pull delivery)
EVENT_GRID_NAMESPACE_ENDPOINT=https://<namespace>.<region>.eventgrid.azure.net
EVENT_GRID_TOPIC_NAME=<topic-name>
EVENT_GRID_SUBSCRIPTION_NAME=<subscription-name>
```

## Client Hierarchy

```
Push Delivery (Topics/Domains)
└── EventGridPublisherClient
    ├── SendEventAsync(EventGridEvent)
    ├── SendEventsAsync(IEnumerable<EventGridEvent>)
    ├── SendEventAsync(CloudEvent)
    └── SendEventsAsync(IEnumerable<CloudEvent>)

Pull Delivery (Namespaces)
├── EventGridSenderClient
│   └── SendAsync(CloudEvent)
└── EventGridReceiverClient
    ├── ReceiveAsync()
    ├── AcknowledgeAsync()
    ├── ReleaseAsync()
    └── RejectAsync()
```

## Authentication

### API Key Authentication

```csharp
using Azure;
using Azure.Messaging.EventGrid;

EventGridPublisherClient client = new(
    new Uri("https://mytopic.eastus-1.eventgrid.azure.net/api/events"),
    new AzureKeyCredential("<access-key>"));
```

### Microsoft Entra ID (Recommended)

```csharp
using Azure.Identity;
using Azure.Messaging.EventGrid;

EventGridPublisherClient client = new(
    new Uri("https://mytopic.eastus-1.eventgrid.azure.net/api/events"),
    new DefaultAzureCredential());
```

### SAS Token Authentication

```csharp
string sasToken = EventGridPublisherClient.BuildSharedAccessSignature(
    new Uri(topicEndpoint),
    DateTimeOffset.UtcNow.AddHours(1),
    new AzureKeyCredential(topicKey));

var sasCredential = new AzureSasCredential(sasToken);
EventGridPublisherClient client = new(
    new Uri(topicEndpoint),
    sasCredential);
```

## Publishing Events

### EventGridEvent Schema

```csharp
EventGridPublisherClient client = new(
    new Uri(topicEndpoint),
    new AzureKeyCredential(topicKey));

// Single event
EventGridEvent egEvent = new(
    subject: "orders/12345",
    eventType: "Order.Created",
    dataVersion: "1.0",
    data: new { OrderId = "12345", Amount = 99.99 });

await client.SendEventAsync(egEvent);

// Batch of events
List<EventGridEvent> events = new()
{
    new EventGridEvent(
        subject: "orders/12345",
        eventType: "Order.Created",
        dataVersion: "1.0",
        data: new OrderData { OrderId = "12345", Amount = 99.99 }),
    new EventGridEvent(
        subject: "orders/12346",
        eventType: "Order.Created",
        dataVersion: "1.0",
        data: new OrderData { OrderId = "12346", Amount = 149.99 })
};

await client.SendEventsAsync(events);
```

### CloudEvent Schema

```csharp
CloudEvent cloudEvent = new(
    source: "/orders/system",
    type: "Order.Created",
    data: new { OrderId = "12345", Amount = 99.99 });

cloudEvent.Subject = "orders/12345";
cloudEvent.Id = Guid.NewGuid().ToString();
cloudEvent.Time = DateTimeOffset.UtcNow;

await client.SendEventAsync(cloudEvent);

// Batch of CloudEvents
List<CloudEvent> cloudEvents = new()
{
    new CloudEvent("/orders", "Order.Created", new { OrderId = "1" }),
    new CloudEvent("/orders", "Order.Updated", new { OrderId = "2" })
};

await client.SendEventsAsync(cloudEvents);
```

### Publishing to Event Grid Domain

```csharp
// Events must specify the Topic property for domain routing
List<EventGridEvent> events = new()
{
    new EventGridEvent(
        subject: "orders/12345",
        eventType: "Order.Created",
        dataVersion: "1.0",
        data: new { OrderId = "12345" })
    {
        Topic = "orders-topic" // Domain topic name
    },
    new EventGridEvent(
        subject: "inventory/item-1",
        eventType: "Inventory.Updated",
        dataVersion: "1.0",
        data: new { ItemId = "item-1" })
    {
        Topic = "inventory-topic"
    }
};

await client.SendEventsAsync(events);
```

### Custom Serialization

```csharp
using System.Text.Json;
using Azure.Core.Serialization;

var serializerOptions = new JsonSerializerOptions
{
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase
};

var customSerializer = new JsonObjectSerializer(serializerOptions);

EventGridEvent egEvent = new(
    subject: "orders/12345",
    eventType: "Order.Created",
    dataVersion: "1.0",
    data: customSerializer.Serialize(new OrderData { OrderId = "12345" }));

await client.SendEventAsync(egEvent);
```

## Pull Delivery (Namespaces)

### Send Events to Namespace Topic

```csharp
using Azure;
using Azure.Messaging;
using Azure.Messaging.EventGrid.Namespaces;

var senderClient = new EventGridSenderClient(
    new Uri(namespaceEndpoint),
    topicName,
    new AzureKeyCredential(topicKey));

// Send single event
CloudEvent cloudEvent = new("employee_source", "Employee.Created",
    new { Name = "John", Age = 30 });
await senderClient.SendAsync(cloudEvent);

// Send batch
await senderClient.SendAsync(new[]
{
    new CloudEvent("source", "type", new { Name = "Alice" }),
    new CloudEvent("source", "type", new { Name = "Bob" })
});
```

### Receive and Process Events

```csharp
var receiverClient = new EventGridReceiverClient(
    new Uri(namespaceEndpoint),
    topicName,
    subscriptionName,
    new AzureKeyCredential(topicKey));

// Receive events
ReceiveResult result = await receiverClient.ReceiveAsync(maxEvents: 10);

List<string> lockTokensToAck = new();
List<string> lockTokensToRelease = new();

foreach (ReceiveDetails detail in result.Details)
{
    CloudEvent cloudEvent = detail.Event;
    string lockToken = detail.BrokerProperties.LockToken;

    try
    {
        // Process the event
        Console.WriteLine($"Event: {cloudEvent.Type}, Data: {cloudEvent.Data}");
        lockTokensToAck.Add(lockToken);
    }
    catch (Exception)
    {
        // Release for retry
        lockTokensToRelease.Add(lockToken);
    }
}

// Acknowledge successfully processed events
if (lockTokensToAck.Any())
{
    await receiverClient.AcknowledgeAsync(lockTokensToAck);
}

// Release events for retry
if (lockTokensToRelease.Any())
{
    await receiverClient.ReleaseAsync(lockTokensToRelease);
}
```

## Consuming Events (Azure Functions)

### EventGridEvent Trigger

```csharp
using Azure.Messaging.EventGrid;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;

public static class EventGridFunction
{
    [FunctionName("ProcessEventGridEvent")]
    public static void Run(
        [EventGridTrigger] EventGridEvent eventGridEvent,
        ILogger log)
    {
        log.LogInformation($"Event Type: {eventGridEvent.EventType}");
        log.LogInformation($"Subject: {eventGridEvent.Subject}");
        log.LogInformation($"Data: {eventGridEvent.Data}");
    }
}
```

### CloudEvent Trigger

```csharp
using Azure.Messaging;
using Microsoft.Azure.Functions.Worker;

public class CloudEventFunction
{
    [Function("ProcessCloudEvent")]
    public void Run(
        [EventGridTrigger] CloudEvent cloudEvent,
        FunctionContext context)
    {
        var logger = context.GetLogger("ProcessCloudEvent");
        logger.LogInformation($"Event Type: {cloudEvent.Type}");
        logger.LogInformation($"Source: {cloudEvent.Source}");
        logger.LogInformation($"Data: {cloudEvent.Data}");
    }
}
```

## Parsing Events

### Parse EventGridEvent

```csharp
// From JSON string
string json = "..."; // Event Grid webhook payload
EventGridEvent[] events = EventGridEvent.ParseMany(BinaryData.FromString(json));

foreach (EventGridEvent egEvent in events)
{
    if (egEvent.TryGetSystemEventData(out object systemEvent))
    {
        // Handle system event
        switch (systemEvent)
        {
            case StorageBlobCreatedEventData blobCreated:
                Console.WriteLine($"Blob created: {blobCreated.Url}");
                break;
        }
    }
    else
    {
        // Handle custom event
        var customData = egEvent.Data.ToObjectFromJson<MyCustomData>();
    }
}
```

### Parse CloudEvent

```csharp
CloudEvent[] cloudEvents = CloudEvent.ParseMany(BinaryData.FromString(json));

foreach (CloudEvent cloudEvent in cloudEvents)
{
    var data = cloudEvent.Data.ToObjectFromJson<MyEventData>();
    Console.WriteLine($"Type: {cloudEvent.Type}, Data: {data}");
}
```

## System Events

```csharp
// Common system event types
using Azure.Messaging.EventGrid.SystemEvents;

// Storage events
StorageBlobCreatedEventData blobCreated;
StorageBlobDeletedEventData blobDeleted;

// Resource events
ResourceWriteSuccessEventData resourceCreated;
ResourceDeleteSuccessEventData resourceDeleted;

// App Service events
WebAppUpdatedEventData webAppUpdated;

// Container Registry events
ContainerRegistryImagePushedEventData imagePushed;

// IoT Hub events
IotHubDeviceCreatedEventData deviceCreated;
```

## Key Types Reference
|
||||
|
||||
| Type | Purpose |
|
||||
|------|---------|
|
||||
| `EventGridPublisherClient` | Publish to topics/domains |
|
||||
| `EventGridSenderClient` | Send to namespace topics |
|
||||
| `EventGridReceiverClient` | Receive from namespace subscriptions |
|
||||
| `EventGridEvent` | Event Grid native schema |
|
||||
| `CloudEvent` | CloudEvents 1.0 schema |
|
||||
| `ReceiveResult` | Pull delivery response |
|
||||
| `ReceiveDetails` | Event with broker properties |
|
||||
| `BrokerProperties` | Lock token, delivery count |
|
||||
|
||||
## Event Schemas Comparison
|
||||
|
||||
| Feature | EventGridEvent | CloudEvent |
|
||||
|---------|----------------|------------|
|
||||
| Standard | Azure-specific | CNCF standard |
|
||||
| Required fields | subject, eventType, dataVersion, data | source, type |
|
||||
| Extensibility | Limited | Extension attributes |
|
||||
| Interoperability | Azure only | Cross-platform |
|
||||
|
||||
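To make the schema differences concrete, here is a minimal sketch of the two wire shapes side by side (illustrative payloads only, field values are hypothetical), shown as Python dictionaries for brevity:

```python
# Minimal example payloads for the two schemas (illustrative values only).
event_grid_event = {
    "id": "9aeb0fdf-c01e-0131-0922-9eb54906e209",
    "subject": "orders/12345",            # required
    "eventType": "MyApp.Order.Created",   # required
    "dataVersion": "1.0",                 # required
    "eventTime": "2024-01-01T00:00:00Z",
    "data": {"orderId": "12345"},         # required
}

cloud_event = {
    "specversion": "1.0",
    "id": "9aeb0fdf-c01e-0131-0922-9eb54906e209",
    "source": "/myapp/orders",            # required
    "type": "MyApp.Order.Created",        # required
    "subject": "orders/12345",            # optional
    "data": {"orderId": "12345"},
}

# CloudEvents replaces eventType/dataVersion with source/type.
print(sorted(set(event_grid_event) - set(cloud_event)))
# → ['dataVersion', 'eventTime', 'eventType']
```

The overlap (`id`, `subject`, `data`) is why translating custom events between the two schemas is usually mechanical.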
## Best Practices

1. **Use CloudEvents** — Prefer CloudEvents for new implementations (industry standard)
2. **Batch events** — Send multiple events in one call for efficiency
3. **Use Entra ID** — Prefer managed identity over access keys
4. **Idempotent handlers** — Events may be delivered more than once
5. **Set event TTL** — Configure time-to-live for namespace events
6. **Handle partial failures** — Acknowledge/release events individually
7. **Use dead-letter** — Configure dead-letter for failed events
8. **Validate schemas** — Validate event data before processing

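Practice 4 is the one that most often bites in production: Event Grid guarantees at-least-once delivery, so the same event can arrive twice. A minimal, language-neutral dedup sketch (shown in Python; the in-memory set stands in for a durable store such as a database table):

```python
processed_ids = set()  # stand-in for a durable store in a real handler

def handle_event(event: dict) -> bool:
    """Process an event at most once per id. Returns True if work was done."""
    event_id = event["id"]
    if event_id in processed_ids:
        return False  # duplicate delivery: skip silently
    # ... do the real work here ...
    processed_ids.add(event_id)
    return True

# A redelivered event is a no-op the second time.
assert handle_event({"id": "evt-1"}) is True
assert handle_event({"id": "evt-1"}) is False
```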
## Error Handling

```csharp
using Azure;

try
{
    await client.SendEventAsync(cloudEvent);
}
catch (RequestFailedException ex) when (ex.Status == 401)
{
    Console.WriteLine("Authentication failed - check credentials");
}
catch (RequestFailedException ex) when (ex.Status == 403)
{
    Console.WriteLine("Authorization failed - check RBAC permissions");
}
catch (RequestFailedException ex) when (ex.Status == 413)
{
    Console.WriteLine("Payload too large - max 1MB per event, 1MB total batch");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Event Grid error: {ex.Status} - {ex.Message}");
}
```

## Failover Pattern

```csharp
try
{
    var primaryClient = new EventGridPublisherClient(primaryUri, primaryKey);
    await primaryClient.SendEventsAsync(events);
}
catch (RequestFailedException)
{
    // Failover to secondary region
    var secondaryClient = new EventGridPublisherClient(secondaryUri, secondaryKey);
    await secondaryClient.SendEventsAsync(events);
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Messaging.EventGrid` | Topics/Domains (this SDK) | `dotnet add package Azure.Messaging.EventGrid` |
| `Azure.Messaging.EventGrid.Namespaces` | Pull delivery | `dotnet add package Azure.Messaging.EventGrid.Namespaces` |
| `Azure.Identity` | Authentication | `dotnet add package Azure.Identity` |
| `Microsoft.Azure.WebJobs.Extensions.EventGrid` | Azure Functions trigger | `dotnet add package Microsoft.Azure.WebJobs.Extensions.EventGrid` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.Messaging.EventGrid |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.messaging.eventgrid |
| Quickstart | https://learn.microsoft.com/azure/event-grid/custom-event-quickstart |
| Pull Delivery | https://learn.microsoft.com/azure/event-grid/pull-delivery-overview |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/eventgrid/Azure.Messaging.EventGrid |

305
skills/azure-eventgrid-java/SKILL.md
Normal file
@@ -0,0 +1,305 @@
---
name: azure-eventgrid-java
description: Build event-driven applications with Azure Event Grid SDK for Java. Use when publishing events, implementing pub/sub patterns, or integrating with Azure services via events.
package: com.azure:azure-messaging-eventgrid
---

# Azure Event Grid SDK for Java

Build event-driven applications using the Azure Event Grid SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-messaging-eventgrid</artifactId>
    <version>4.27.0</version>
</dependency>
```

## Client Creation

### EventGridPublisherClient

```java
import com.azure.messaging.eventgrid.EventGridPublisherClient;
import com.azure.messaging.eventgrid.EventGridPublisherClientBuilder;
import com.azure.core.credential.AzureKeyCredential;

// With API key
EventGridPublisherClient<EventGridEvent> client = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildEventGridEventPublisherClient();

// For CloudEvents
EventGridPublisherClient<CloudEvent> cloudClient = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildCloudEventPublisherClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

EventGridPublisherClient<EventGridEvent> client = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildEventGridEventPublisherClient();
```

### Async Client

```java
import com.azure.messaging.eventgrid.EventGridPublisherAsyncClient;

EventGridPublisherAsyncClient<EventGridEvent> asyncClient = new EventGridPublisherClientBuilder()
    .endpoint("<topic-endpoint>")
    .credential(new AzureKeyCredential("<access-key>"))
    .buildEventGridEventPublisherAsyncClient();
```

## Event Types

| Type | Description |
|------|-------------|
| `EventGridEvent` | Azure Event Grid native schema |
| `CloudEvent` | CNCF CloudEvents 1.0 specification |
| `BinaryData` | Custom schema events |

## Core Patterns

### Publish EventGridEvent

```java
import com.azure.messaging.eventgrid.EventGridEvent;
import com.azure.core.util.BinaryData;

EventGridEvent event = new EventGridEvent(
    "resource/path",                                          // subject
    "MyApp.Events.OrderCreated",                              // eventType
    BinaryData.fromObject(new OrderData("order-123", 99.99)), // data
    "1.0"                                                     // dataVersion
);

client.sendEvent(event);
```

### Publish Multiple Events

```java
List<EventGridEvent> events = Arrays.asList(
    new EventGridEvent("orders/1", "Order.Created",
        BinaryData.fromObject(order1), "1.0"),
    new EventGridEvent("orders/2", "Order.Created",
        BinaryData.fromObject(order2), "1.0")
);

client.sendEvents(events);
```

### Publish CloudEvent

```java
import com.azure.core.models.CloudEvent;
import com.azure.core.models.CloudEventDataFormat;

CloudEvent cloudEvent = new CloudEvent(
    "/myapp/orders",                  // source
    "order.created",                  // type
    BinaryData.fromObject(orderData), // data
    CloudEventDataFormat.JSON         // dataFormat
);
cloudEvent.setSubject("orders/12345");
cloudEvent.setId(UUID.randomUUID().toString());

cloudClient.sendEvent(cloudEvent);
```

### Publish CloudEvents Batch

```java
List<CloudEvent> cloudEvents = Arrays.asList(
    new CloudEvent("/app", "event.type1", BinaryData.fromString("data1"), CloudEventDataFormat.JSON),
    new CloudEvent("/app", "event.type2", BinaryData.fromString("data2"), CloudEventDataFormat.JSON)
);

cloudClient.sendEvents(cloudEvents);
```

### Async Publishing

```java
asyncClient.sendEvent(event)
    .subscribe(
        unused -> System.out.println("Event sent successfully"),
        error -> System.err.println("Error: " + error.getMessage())
    );

// With multiple events
asyncClient.sendEvents(events)
    .doOnSuccess(unused -> System.out.println("All events sent"))
    .doOnError(error -> System.err.println("Failed: " + error))
    .block(); // Block if needed
```

### Custom Event Data Class

```java
public class OrderData {
    private String orderId;
    private double amount;
    private String customerId;

    public OrderData(String orderId, double amount) {
        this.orderId = orderId;
        this.amount = amount;
    }

    // Getters and setters
}

// Usage
OrderData order = new OrderData("ORD-123", 150.00);
EventGridEvent event = new EventGridEvent(
    "orders/" + order.getOrderId(),
    "MyApp.Order.Created",
    BinaryData.fromObject(order),
    "1.0"
);
```

## Receiving Events

### Parse EventGridEvent

```java
import com.azure.messaging.eventgrid.EventGridEvent;

// From JSON string (e.g., webhook payload)
String jsonPayload = "[{\"id\": \"...\", ...}]";
List<EventGridEvent> events = EventGridEvent.fromString(jsonPayload);

for (EventGridEvent event : events) {
    System.out.println("Event Type: " + event.getEventType());
    System.out.println("Subject: " + event.getSubject());
    System.out.println("Event Time: " + event.getEventTime());

    // Get data
    BinaryData data = event.getData();
    OrderData orderData = data.toObject(OrderData.class);
}
```

### Parse CloudEvent

```java
import com.azure.core.models.CloudEvent;

String cloudEventJson = "[{\"specversion\": \"1.0\", ...}]";
List<CloudEvent> cloudEvents = CloudEvent.fromString(cloudEventJson);

for (CloudEvent event : cloudEvents) {
    System.out.println("Type: " + event.getType());
    System.out.println("Source: " + event.getSource());
    System.out.println("ID: " + event.getId());

    MyEventData data = event.getData().toObject(MyEventData.class);
}
```

### Handle System Events

```java
import com.azure.messaging.eventgrid.systemevents.*;

for (EventGridEvent event : events) {
    if (event.getEventType().equals("Microsoft.Storage.BlobCreated")) {
        StorageBlobCreatedEventData blobData =
            event.getData().toObject(StorageBlobCreatedEventData.class);
        System.out.println("Blob URL: " + blobData.getUrl());
    }
}
```

## Event Grid Namespaces (MQTT/Pull)

### Receive from Namespace Topic

```java
import com.azure.messaging.eventgrid.namespaces.EventGridReceiverClient;
import com.azure.messaging.eventgrid.namespaces.EventGridReceiverClientBuilder;
import com.azure.messaging.eventgrid.namespaces.models.*;

EventGridReceiverClient receiverClient = new EventGridReceiverClientBuilder()
    .endpoint("<namespace-endpoint>")
    .credential(new AzureKeyCredential("<key>"))
    .topicName("my-topic")
    .subscriptionName("my-subscription")
    .buildClient();

// Receive events
ReceiveResult result = receiverClient.receive(10, Duration.ofSeconds(30));

for (ReceiveDetails detail : result.getValue()) {
    CloudEvent event = detail.getEvent();
    System.out.println("Event: " + event.getType());

    // Acknowledge the event
    receiverClient.acknowledge(Arrays.asList(detail.getBrokerProperties().getLockToken()));
}
```

### Reject or Release Events

```java
// Reject (don't retry)
receiverClient.reject(Arrays.asList(lockToken));

// Release (retry later)
receiverClient.release(Arrays.asList(lockToken));

// Release with delay
receiverClient.release(Arrays.asList(lockToken),
    new ReleaseOptions().setDelay(ReleaseDelay.BY_60_SECONDS));
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.sendEvent(event);
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
EVENT_GRID_TOPIC_ENDPOINT=https://<topic-name>.<region>.eventgrid.azure.net/api/events
EVENT_GRID_ACCESS_KEY=<your-access-key>
```

## Best Practices

1. **Batch Events**: Send multiple events in one call when possible
2. **Idempotency**: Include unique event IDs for deduplication
3. **Schema Validation**: Use strongly-typed event data classes
4. **Retry Logic**: Built-in, but consider dead-letter for failures
5. **Event Size**: Keep events under 1MB (64KB for basic tier)

## Trigger Phrases

- "Event Grid Java"
- "publish events Azure"
- "CloudEvent SDK"
- "event-driven messaging"
- "pub/sub Azure"
- "webhook events"

168
skills/azure-eventgrid-py/SKILL.md
Normal file
@@ -0,0 +1,168 @@
---
name: azure-eventgrid-py
description: |
  Azure Event Grid SDK for Python. Use for publishing events, handling CloudEvents, and event-driven architectures.
  Triggers: "event grid", "EventGridPublisherClient", "CloudEvent", "EventGridEvent", "publish events".
package: azure-eventgrid
---

# Azure Event Grid SDK for Python

Event routing service for building event-driven applications with pub/sub semantics.

## Installation

```bash
pip install azure-eventgrid azure-identity
```

## Environment Variables

```bash
EVENTGRID_TOPIC_ENDPOINT=https://<topic-name>.<region>.eventgrid.azure.net/api/events
EVENTGRID_NAMESPACE_ENDPOINT=https://<namespace>.<region>.eventgrid.azure.net
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.eventgrid import EventGridPublisherClient

credential = DefaultAzureCredential()
endpoint = "https://<topic-name>.<region>.eventgrid.azure.net/api/events"

client = EventGridPublisherClient(endpoint, credential)
```

## Event Types

| Format | Class | Use Case |
|--------|-------|----------|
| Cloud Events 1.0 | `CloudEvent` | Standard, interoperable (recommended) |
| Event Grid Schema | `EventGridEvent` | Azure-native format |

## Publish CloudEvents

```python
from azure.core.messaging import CloudEvent
from azure.eventgrid import EventGridPublisherClient
from azure.identity import DefaultAzureCredential

client = EventGridPublisherClient(endpoint, DefaultAzureCredential())

# Single event
event = CloudEvent(
    type="MyApp.Events.OrderCreated",
    source="/myapp/orders",
    data={"order_id": "12345", "amount": 99.99}
)
client.send(event)

# Multiple events
events = [
    CloudEvent(
        type="MyApp.Events.OrderCreated",
        source="/myapp/orders",
        data={"order_id": f"order-{i}"}
    )
    for i in range(10)
]
client.send(events)
```

## Publish EventGridEvents

```python
from azure.eventgrid import EventGridEvent
from datetime import datetime, timezone

event = EventGridEvent(
    subject="/myapp/orders/12345",
    event_type="MyApp.Events.OrderCreated",
    data={"order_id": "12345", "amount": 99.99},
    data_version="1.0"
)

client.send(event)
```

## Event Properties

### CloudEvent Properties

```python
event = CloudEvent(
    type="MyApp.Events.ItemCreated",      # Required: event type
    source="/myapp/items",                # Required: event source
    data={"key": "value"},                # Event payload
    subject="items/123",                  # Optional: subject/path
    datacontenttype="application/json",   # Optional: content type
    dataschema="https://schema.example",  # Optional: schema URL
    time=datetime.now(timezone.utc),      # Optional: timestamp
    extensions={"custom": "value"}        # Optional: custom attributes
)
```

### EventGridEvent Properties

```python
event = EventGridEvent(
    subject="/myapp/items/123",             # Required: subject
    event_type="MyApp.ItemCreated",         # Required: event type
    data={"key": "value"},                  # Required: event payload
    data_version="1.0",                     # Required: schema version
    topic="/subscriptions/.../topics/...",  # Optional: auto-set
    event_time=datetime.now(timezone.utc)   # Optional: timestamp
)
```

## Async Client

```python
from azure.eventgrid.aio import EventGridPublisherClient
from azure.identity.aio import DefaultAzureCredential

async def publish_events():
    credential = DefaultAzureCredential()

    async with EventGridPublisherClient(endpoint, credential) as client:
        event = CloudEvent(
            type="MyApp.Events.Test",
            source="/myapp",
            data={"message": "hello"}
        )
        await client.send(event)

import asyncio
asyncio.run(publish_events())
```

## Namespace Topics (Event Grid Namespaces)

For Event Grid Namespaces (pull delivery):

```python
from azure.eventgrid.aio import EventGridPublisherClient

# Namespace endpoint (different from custom topic)
namespace_endpoint = "https://<namespace>.<region>.eventgrid.azure.net"
topic_name = "my-topic"

async with EventGridPublisherClient(
    endpoint=namespace_endpoint,
    credential=DefaultAzureCredential()
) as client:
    await client.send(
        event,
        namespace_topic=topic_name
    )
```

## Best Practices

1. **Use CloudEvents** for new applications (industry standard)
2. **Batch events** when publishing multiple events
3. **Include meaningful subjects** for filtering
4. **Use async client** for high-throughput scenarios
5. **Handle retries** — Event Grid has built-in retry
6. **Set appropriate event types** for routing and filtering

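Practice 3 pays off on the subscription side: Event Grid subscriptions can filter on subject prefixes, so well-structured subjects let consumers receive only what they need. A small sketch of the idea (the `matches` helper is hypothetical and just mirrors what a subscription's subject-begins-with filter does on the service side):

```python
# Sketch: meaningful subjects let subscribers filter by prefix.
events = [
    {"subject": "/orders/eu/123", "type": "Order.Created"},
    {"subject": "/orders/us/456", "type": "Order.Created"},
    {"subject": "/invoices/789", "type": "Invoice.Paid"},
]

def matches(subject: str, begins_with: str) -> bool:
    # Mirrors a subscription's subject-begins-with filter.
    return subject.startswith(begins_with)

eu_orders = [e for e in events if matches(e["subject"], "/orders/eu/")]
assert [e["subject"] for e in eu_orders] == ["/orders/eu/123"]
```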
362
skills/azure-eventhub-dotnet/SKILL.md
Normal file
@@ -0,0 +1,362 @@
---
name: azure-eventhub-dotnet
description: |
  Azure Event Hubs SDK for .NET. Use for high-throughput event streaming: sending events (EventHubProducerClient, EventHubBufferedProducerClient), receiving events (EventProcessorClient with checkpointing), partition management, and real-time data ingestion. Triggers: "Event Hubs", "event streaming", "EventHubProducerClient", "EventProcessorClient", "send events", "receive events", "checkpointing", "partition".
package: Azure.Messaging.EventHubs
---

# Azure.Messaging.EventHubs (.NET)

High-throughput event streaming SDK for sending and receiving events via Azure Event Hubs.

## Installation

```bash
# Core package (sending and simple receiving)
dotnet add package Azure.Messaging.EventHubs

# Processor package (production receiving with checkpointing)
dotnet add package Azure.Messaging.EventHubs.Processor

# Authentication
dotnet add package Azure.Identity

# For checkpointing (required by EventProcessorClient)
dotnet add package Azure.Storage.Blobs
```

**Current Versions**: Azure.Messaging.EventHubs v5.12.2, Azure.Messaging.EventHubs.Processor v5.12.2

## Environment Variables

```bash
EVENTHUB_FULLY_QUALIFIED_NAMESPACE=<namespace>.servicebus.windows.net
EVENTHUB_NAME=<event-hub-name>

# For checkpointing (EventProcessorClient)
BLOB_STORAGE_CONNECTION_STRING=<storage-connection-string>
BLOB_CONTAINER_NAME=<checkpoint-container>

# Alternative: Connection string auth (not recommended for production)
EVENTHUB_CONNECTION_STRING=Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...
```

## Authentication

```csharp
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

// Always use DefaultAzureCredential for production
var credential = new DefaultAzureCredential();

var fullyQualifiedNamespace = Environment.GetEnvironmentVariable("EVENTHUB_FULLY_QUALIFIED_NAMESPACE");
var eventHubName = Environment.GetEnvironmentVariable("EVENTHUB_NAME");

var producer = new EventHubProducerClient(
    fullyQualifiedNamespace,
    eventHubName,
    credential);
```

**Required RBAC Roles**:
- **Sending**: `Azure Event Hubs Data Sender`
- **Receiving**: `Azure Event Hubs Data Receiver`
- **Both**: `Azure Event Hubs Data Owner`

## Client Types

| Client | Purpose | When to Use |
|--------|---------|-------------|
| `EventHubProducerClient` | Send events immediately in batches | Real-time sending, full control over batching |
| `EventHubBufferedProducerClient` | Automatic batching with background sending | High-volume, fire-and-forget scenarios |
| `EventHubConsumerClient` | Simple event reading | Prototyping only, NOT for production |
| `EventProcessorClient` | Production event processing | **Always use this for receiving in production** |

## Core Workflow

### 1. Send Events (Batch)

```csharp
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

await using var producer = new EventHubProducerClient(
    fullyQualifiedNamespace,
    eventHubName,
    new DefaultAzureCredential());

// Create a batch (respects size limits automatically)
using EventDataBatch batch = await producer.CreateBatchAsync();

// Add events to batch
var events = new[]
{
    new EventData(BinaryData.FromString("{\"id\": 1, \"message\": \"Hello\"}")),
    new EventData(BinaryData.FromString("{\"id\": 2, \"message\": \"World\"}"))
};

foreach (var eventData in events)
{
    if (!batch.TryAdd(eventData))
    {
        // Batch is full - send it and create a new one
        await producer.SendAsync(batch);
        batch = await producer.CreateBatchAsync();

        if (!batch.TryAdd(eventData))
        {
            throw new Exception("Event too large for empty batch");
        }
    }
}

// Send remaining events
if (batch.Count > 0)
{
    await producer.SendAsync(batch);
}
```

### 2. Send Events (Buffered - High Volume)

```csharp
using Azure.Messaging.EventHubs.Producer;

var options = new EventHubBufferedProducerClientOptions
{
    MaximumWaitTime = TimeSpan.FromSeconds(1)
};

await using var producer = new EventHubBufferedProducerClient(
    fullyQualifiedNamespace,
    eventHubName,
    new DefaultAzureCredential(),
    options);

// Handle send success/failure
producer.SendEventBatchSucceededAsync += args =>
{
    Console.WriteLine($"Batch sent: {args.EventBatch.Count} events");
    return Task.CompletedTask;
};

producer.SendEventBatchFailedAsync += args =>
{
    Console.WriteLine($"Batch failed: {args.Exception.Message}");
    return Task.CompletedTask;
};

// Enqueue events (sent automatically in background)
for (int i = 0; i < 1000; i++)
{
    await producer.EnqueueEventAsync(new EventData($"Event {i}"));
}

// Flush remaining events before disposing
await producer.FlushAsync();
```

### 3. Receive Events (Production - EventProcessorClient)

```csharp
using Azure.Identity;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Processor;
using Azure.Storage.Blobs;

// Blob container for checkpointing
var blobClient = new BlobContainerClient(
    Environment.GetEnvironmentVariable("BLOB_STORAGE_CONNECTION_STRING"),
    Environment.GetEnvironmentVariable("BLOB_CONTAINER_NAME"));

await blobClient.CreateIfNotExistsAsync();

// Create processor
var processor = new EventProcessorClient(
    blobClient,
    EventHubConsumerClient.DefaultConsumerGroup,
    fullyQualifiedNamespace,
    eventHubName,
    new DefaultAzureCredential());

// Handle events
processor.ProcessEventAsync += async args =>
{
    Console.WriteLine($"Partition: {args.Partition.PartitionId}");
    Console.WriteLine($"Data: {args.Data.EventBody}");

    // Checkpoint after processing (or batch checkpoints)
    await args.UpdateCheckpointAsync();
};

// Handle errors
processor.ProcessErrorAsync += args =>
{
    Console.WriteLine($"Error: {args.Exception.Message}");
    Console.WriteLine($"Partition: {args.PartitionId}");
    return Task.CompletedTask;
};

// Start processing
await processor.StartProcessingAsync();

// Run until cancelled
await Task.Delay(Timeout.Infinite, cancellationToken);

// Stop gracefully
await processor.StopProcessingAsync();
```

### 4. Partition Operations

```csharp
// Get partition IDs
string[] partitionIds = await producer.GetPartitionIdsAsync();

// Send to specific partition (use sparingly)
var options = new SendEventOptions
{
    PartitionId = "0"
};
await producer.SendAsync(events, options);

// Use partition key (recommended for ordering)
var batchOptions = new CreateBatchOptions
{
    PartitionKey = "customer-123" // Events with same key go to same partition
};
using var batch = await producer.CreateBatchAsync(batchOptions);
```

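The partition-key guarantee (same key, same partition) comes from the service hashing the key to pick a partition. A rough illustration of the principle, sketched in Python; Event Hubs uses its own internal hash, so `partition_for` here is only a stand-in that shows why equal keys preserve per-key ordering:

```python
# Illustrative only: a partition key maps deterministically to one partition,
# so all events sharing a key are appended to the same ordered log.
import hashlib

def partition_for(key: str, partition_count: int) -> int:
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partition_count

# Equal keys always land on the same partition.
assert partition_for("customer-123", 4) == partition_for("customer-123", 4)
```

This is also why sending to an explicit `PartitionId` is discouraged: the key-based mapping spreads load while still keeping per-key order.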
## EventPosition Options

Control where to start reading:

```csharp
// Start from beginning
EventPosition.Earliest

// Start from end (new events only)
EventPosition.Latest

// Start from specific offset
EventPosition.FromOffset(12345)

// Start from specific sequence number
EventPosition.FromSequenceNumber(100)

// Start from specific time
EventPosition.FromEnqueuedTime(DateTimeOffset.UtcNow.AddHours(-1))
```

## ASP.NET Core Integration

```csharp
// Program.cs
using Azure.Identity;
using Azure.Messaging.EventHubs.Producer;
using Microsoft.Extensions.Azure;

builder.Services.AddAzureClients(clientBuilder =>
{
    clientBuilder.AddEventHubProducerClient(
        builder.Configuration["EventHub:FullyQualifiedNamespace"],
        builder.Configuration["EventHub:Name"]);

    clientBuilder.UseCredential(new DefaultAzureCredential());
});

// Inject in controller/service
public class EventService
{
    private readonly EventHubProducerClient _producer;

    public EventService(EventHubProducerClient producer)
    {
        _producer = producer;
    }

    public async Task SendAsync(string message)
    {
        using var batch = await _producer.CreateBatchAsync();
        batch.TryAdd(new EventData(message));
        await _producer.SendAsync(batch);
    }
}
```

## Best Practices

1. **Use `EventProcessorClient` for receiving** — Never use `EventHubConsumerClient` in production
2. **Checkpoint strategically** — After N events or time interval, not every event
3. **Use partition keys** — For ordering guarantees within a partition
4. **Reuse clients** — Create once, use as singleton (thread-safe)
5. **Use `await using`** — Ensures proper disposal
6. **Handle `ProcessErrorAsync`** — Always register error handler
7. **Batch events** — Use `CreateBatchAsync()` to respect size limits
8. **Use buffered producer** — For high-volume scenarios with automatic batching

## Error Handling

```csharp
using Azure.Messaging.EventHubs;

try
{
    await producer.SendAsync(batch);
}
catch (EventHubsException ex) when (ex.Reason == EventHubsException.FailureReason.ServiceBusy)
{
    // Retry with backoff
    await Task.Delay(TimeSpan.FromSeconds(5));
}
catch (EventHubsException ex) when (ex.IsTransient)
{
    // Transient error - safe to retry
    Console.WriteLine($"Transient error: {ex.Message}");
}
catch (EventHubsException ex)
{
    // Non-transient error
    Console.WriteLine($"Error: {ex.Reason} - {ex.Message}");
}
```

## Checkpointing Strategies

| Strategy | When to Use |
|----------|-------------|
| Every event | Low volume, critical data |
| Every N events | Balanced throughput/reliability |
| Time-based | Consistent checkpoint intervals |
| Batch completion | After processing a logical batch |

```csharp
// Checkpoint every 100 events
private int _eventCount = 0;

processor.ProcessEventAsync += async args =>
{
    // Process event...

    _eventCount++;
    if (_eventCount >= 100)
    {
        await args.UpdateCheckpointAsync();
        _eventCount = 0;
    }
};
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Messaging.EventHubs` | Core sending/receiving | `dotnet add package Azure.Messaging.EventHubs` |
| `Azure.Messaging.EventHubs.Processor` | Production processing | `dotnet add package Azure.Messaging.EventHubs.Processor` |
| `Azure.ResourceManager.EventHubs` | Management plane (create hubs) | `dotnet add package Azure.ResourceManager.EventHubs` |
| `Microsoft.Azure.WebJobs.Extensions.EventHubs` | Azure Functions binding | `dotnet add package Microsoft.Azure.WebJobs.Extensions.EventHubs` |

356
skills/azure-eventhub-java/SKILL.md
Normal file
@@ -0,0 +1,356 @@
|
||||
---
name: azure-eventhub-java
description: Build real-time streaming applications with Azure Event Hubs SDK for Java. Use when implementing event streaming, high-throughput data ingestion, or building event-driven architectures.
package: com.azure:azure-messaging-eventhubs
---

# Azure Event Hubs SDK for Java

Build real-time streaming applications using the Azure Event Hubs SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-messaging-eventhubs</artifactId>
    <version>5.19.0</version>
</dependency>

<!-- For checkpoint store (production) -->
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-messaging-eventhubs-checkpointstore-blob</artifactId>
    <version>1.20.0</version>
</dependency>
```

## Client Creation

### EventHubProducerClient

```java
import com.azure.messaging.eventhubs.EventHubProducerClient;
import com.azure.messaging.eventhubs.EventHubClientBuilder;

// With connection string
EventHubProducerClient producer = new EventHubClientBuilder()
    .connectionString("<connection-string>", "<event-hub-name>")
    .buildProducerClient();

// Full connection string with EntityPath
EventHubProducerClient producer = new EventHubClientBuilder()
    .connectionString("<connection-string-with-entity-path>")
    .buildProducerClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

EventHubProducerClient producer = new EventHubClientBuilder()
    .fullyQualifiedNamespace("<namespace>.servicebus.windows.net")
    .eventHubName("<event-hub-name>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildProducerClient();
```

### EventHubConsumerClient

```java
import com.azure.messaging.eventhubs.EventHubConsumerClient;

EventHubConsumerClient consumer = new EventHubClientBuilder()
    .connectionString("<connection-string>", "<event-hub-name>")
    .consumerGroup(EventHubClientBuilder.DEFAULT_CONSUMER_GROUP_NAME)
    .buildConsumerClient();
```

### Async Clients

```java
import com.azure.messaging.eventhubs.EventHubProducerAsyncClient;
import com.azure.messaging.eventhubs.EventHubConsumerAsyncClient;

EventHubProducerAsyncClient asyncProducer = new EventHubClientBuilder()
    .connectionString("<connection-string>", "<event-hub-name>")
    .buildAsyncProducerClient();

EventHubConsumerAsyncClient asyncConsumer = new EventHubClientBuilder()
    .connectionString("<connection-string>", "<event-hub-name>")
    .consumerGroup("$Default")
    .buildAsyncConsumerClient();
```

## Core Patterns

### Send Single Event

```java
import java.util.Collections;

import com.azure.messaging.eventhubs.EventData;

EventData eventData = new EventData("Hello, Event Hubs!");
producer.send(Collections.singletonList(eventData));
```

### Send Event Batch

```java
import com.azure.messaging.eventhubs.EventDataBatch;
import com.azure.messaging.eventhubs.models.CreateBatchOptions;

// Create batch
EventDataBatch batch = producer.createBatch();

// Add events (tryAdd returns false if the batch is full)
for (int i = 0; i < 100; i++) {
    EventData event = new EventData("Event " + i);
    if (!batch.tryAdd(event)) {
        // Batch is full, send and create new batch
        producer.send(batch);
        batch = producer.createBatch();
        batch.tryAdd(event);
    }
}

// Send remaining events
if (batch.getCount() > 0) {
    producer.send(batch);
}
```

### Send to Specific Partition

```java
CreateBatchOptions options = new CreateBatchOptions()
    .setPartitionId("0");

EventDataBatch batch = producer.createBatch(options);
batch.tryAdd(new EventData("Partition 0 event"));
producer.send(batch);
```

### Send with Partition Key

```java
CreateBatchOptions options = new CreateBatchOptions()
    .setPartitionKey("customer-123");

EventDataBatch batch = producer.createBatch(options);
batch.tryAdd(new EventData("Customer event"));
producer.send(batch);
```

### Event with Properties

```java
EventData event = new EventData("Order created");
event.getProperties().put("orderId", "ORD-123");
event.getProperties().put("customerId", "CUST-456");
event.getProperties().put("priority", 1);

producer.send(Collections.singletonList(event));
```

### Receive Events (Simple)

```java
import java.time.Duration;

import com.azure.messaging.eventhubs.models.EventPosition;
import com.azure.messaging.eventhubs.models.PartitionEvent;

// Receive from specific partition
Iterable<PartitionEvent> events = consumer.receiveFromPartition(
    "0",                      // partitionId
    10,                       // maxEvents
    EventPosition.earliest(), // startingPosition
    Duration.ofSeconds(30)    // timeout
);

for (PartitionEvent partitionEvent : events) {
    EventData event = partitionEvent.getData();
    System.out.println("Body: " + event.getBodyAsString());
    System.out.println("Sequence: " + event.getSequenceNumber());
    System.out.println("Offset: " + event.getOffset());
}
```

### EventProcessorClient (Production)

```java
import com.azure.messaging.eventhubs.EventProcessorClient;
import com.azure.messaging.eventhubs.EventProcessorClientBuilder;
import com.azure.messaging.eventhubs.checkpointstore.blob.BlobCheckpointStore;
import com.azure.storage.blob.BlobContainerAsyncClient;
import com.azure.storage.blob.BlobContainerClientBuilder;

// Create checkpoint store
BlobContainerAsyncClient blobClient = new BlobContainerClientBuilder()
    .connectionString("<storage-connection-string>")
    .containerName("checkpoints")
    .buildAsyncClient();

// Create processor
EventProcessorClient processor = new EventProcessorClientBuilder()
    .connectionString("<eventhub-connection-string>", "<event-hub-name>")
    .consumerGroup("$Default")
    .checkpointStore(new BlobCheckpointStore(blobClient))
    .processEvent(eventContext -> {
        EventData event = eventContext.getEventData();
        System.out.println("Processing: " + event.getBodyAsString());

        // Checkpoint after processing
        eventContext.updateCheckpoint();
    })
    .processError(errorContext -> {
        System.err.println("Error: " + errorContext.getThrowable().getMessage());
        System.err.println("Partition: " + errorContext.getPartitionContext().getPartitionId());
    })
    .buildEventProcessorClient();

// Start processing
processor.start();

// Keep running...
Thread.sleep(Duration.ofMinutes(5).toMillis());

// Stop gracefully
processor.stop();
```

### Batch Processing

```java
import java.util.List;

EventProcessorClient processor = new EventProcessorClientBuilder()
    .connectionString("<connection-string>", "<event-hub-name>")
    .consumerGroup("$Default")
    .checkpointStore(new BlobCheckpointStore(blobClient))
    .processEventBatch(eventBatchContext -> {
        List<EventData> events = eventBatchContext.getEvents();
        System.out.printf("Received %d events%n", events.size());

        for (EventData event : events) {
            // Process each event
            System.out.println(event.getBodyAsString());
        }

        // Checkpoint after batch
        eventBatchContext.updateCheckpoint();
    }, 50) // maxBatchSize
    .processError(errorContext -> {
        System.err.println("Error: " + errorContext.getThrowable());
    })
    .buildEventProcessorClient();
```

### Async Receiving

```java
asyncConsumer.receiveFromPartition("0", EventPosition.latest())
    .subscribe(
        partitionEvent -> {
            EventData event = partitionEvent.getData();
            System.out.println("Received: " + event.getBodyAsString());
        },
        error -> System.err.println("Error: " + error),
        () -> System.out.println("Complete")
    );
```

### Get Event Hub Properties

```java
import com.azure.messaging.eventhubs.EventHubProperties;
import com.azure.messaging.eventhubs.PartitionProperties;

// Get hub info
EventHubProperties hubProps = producer.getEventHubProperties();
System.out.println("Hub: " + hubProps.getName());
System.out.println("Partitions: " + hubProps.getPartitionIds());

// Get partition info
PartitionProperties partitionProps = producer.getPartitionProperties("0");
System.out.println("Begin sequence: " + partitionProps.getBeginningSequenceNumber());
System.out.println("Last sequence: " + partitionProps.getLastEnqueuedSequenceNumber());
System.out.println("Last offset: " + partitionProps.getLastEnqueuedOffset());
```

## Event Positions

```java
import java.time.Duration;
import java.time.Instant;

// Start from beginning
EventPosition.earliest()

// Start from end (new events only)
EventPosition.latest()

// From specific offset
EventPosition.fromOffset(12345L)

// From specific sequence number
EventPosition.fromSequenceNumber(100L)

// From specific time
EventPosition.fromEnqueuedTime(Instant.now().minus(Duration.ofHours(1)))
```

## Error Handling

```java
import com.azure.core.amqp.exception.AmqpException;
import com.azure.messaging.eventhubs.models.ErrorContext;

.processError(errorContext -> {
    Throwable error = errorContext.getThrowable();
    String partitionId = errorContext.getPartitionContext().getPartitionId();

    if (error instanceof AmqpException) {
        AmqpException amqpError = (AmqpException) error;
        if (amqpError.isTransient()) {
            System.out.println("Transient error, will retry");
        }
    }

    System.err.printf("Error on partition %s: %s%n", partitionId, error.getMessage());
})
```
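Transient failures (the `isTransient` branch above) are normally retried with exponential backoff; the SDK's retry policy handles this internally. A sketch of the delay schedule (Python for brevity):

```python
def backoff_delays(max_retries=5, base=1.0, cap=30.0):
    """Exponential backoff schedule in seconds: base * 2^attempt, capped at `cap`."""
    return [min(cap, base * (2 ** attempt)) for attempt in range(max_retries)]

delays = backoff_delays()  # 1, 2, 4, 8, 16 seconds
```

Production retry policies typically also add random jitter to each delay so many clients recovering at once do not retry in lockstep.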

## Resource Cleanup

```java
// Always close clients
try {
    producer.send(batch);
} finally {
    producer.close();
}

// Or use try-with-resources
try (EventHubProducerClient producer = new EventHubClientBuilder()
        .connectionString(connectionString, eventHubName)
        .buildProducerClient()) {
    producer.send(events);
}
```

## Environment Variables

```bash
EVENT_HUBS_CONNECTION_STRING=Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...
EVENT_HUBS_NAME=<event-hub-name>
STORAGE_CONNECTION_STRING=<for-checkpointing>
```

## Best Practices

1. **Use EventProcessorClient**: For production, provides load balancing and checkpointing
2. **Batch Events**: Use `EventDataBatch` for efficient sending
3. **Partition Keys**: Use for ordering guarantees within a partition
4. **Checkpointing**: Checkpoint after processing to avoid reprocessing
5. **Error Handling**: Handle transient errors with retries
6. **Close Clients**: Always close producer/consumer when done

## Trigger Phrases

- "Event Hubs Java"
- "event streaming Azure"
- "real-time data ingestion"
- "EventProcessorClient"
- "event hub producer consumer"
- "partition processing"
skills/azure-eventhub-py/SKILL.md (new file, 240 lines)
@@ -0,0 +1,240 @@
---
name: azure-eventhub-py
description: |
  Azure Event Hubs SDK for Python streaming. Use for high-throughput event ingestion, producers, consumers, and checkpointing.
  Triggers: "event hubs", "EventHubProducerClient", "EventHubConsumerClient", "streaming", "partitions".
package: azure-eventhub
---

# Azure Event Hubs SDK for Python

Big data streaming platform for high-throughput event ingestion.

## Installation

```bash
pip install azure-eventhub azure-identity
# For checkpointing with blob storage
pip install azure-eventhub-checkpointstoreblob       # sync client
pip install azure-eventhub-checkpointstoreblob-aio   # async client
```

## Environment Variables

```bash
EVENT_HUB_FULLY_QUALIFIED_NAMESPACE=<namespace>.servicebus.windows.net
EVENT_HUB_NAME=my-eventhub
STORAGE_ACCOUNT_URL=https://<account>.blob.core.windows.net
CHECKPOINT_CONTAINER=checkpoints
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.eventhub import EventHubProducerClient, EventHubConsumerClient

credential = DefaultAzureCredential()
namespace = "<namespace>.servicebus.windows.net"
eventhub_name = "my-eventhub"

# Producer
producer = EventHubProducerClient(
    fully_qualified_namespace=namespace,
    eventhub_name=eventhub_name,
    credential=credential
)

# Consumer
consumer = EventHubConsumerClient(
    fully_qualified_namespace=namespace,
    eventhub_name=eventhub_name,
    consumer_group="$Default",
    credential=credential
)
```

## Client Types

| Client | Purpose |
|--------|---------|
| `EventHubProducerClient` | Send events to Event Hub |
| `EventHubConsumerClient` | Receive events from Event Hub |
| `BlobCheckpointStore` | Track consumer progress |

## Send Events

```python
from azure.eventhub import EventHubProducerClient, EventData
from azure.identity import DefaultAzureCredential

producer = EventHubProducerClient(
    fully_qualified_namespace="<namespace>.servicebus.windows.net",
    eventhub_name="my-eventhub",
    credential=DefaultAzureCredential()
)

with producer:
    # Create batch (handles size limits)
    event_data_batch = producer.create_batch()

    for i in range(10):
        try:
            event_data_batch.add(EventData(f"Event {i}"))
        except ValueError:
            # Batch is full, send and create new one
            producer.send_batch(event_data_batch)
            event_data_batch = producer.create_batch()
            event_data_batch.add(EventData(f"Event {i}"))

    # Send remaining
    producer.send_batch(event_data_batch)
```
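The overflow pattern above can be exercised without a live hub; a toy stand-in for `EventDataBatch` (hypothetical `ToyBatch`, with a capacity in events rather than bytes) shows the same catch-`ValueError` flow:

```python
class ToyBatch:
    """Mimics EventDataBatch's size cap: add() raises ValueError when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.events = []

    def add(self, event):
        if len(self.events) >= self.capacity:
            raise ValueError("batch full")
        self.events.append(event)

def send_all(events, capacity=3):
    """Fill batches until they overflow, flushing each full batch as it fills."""
    sent = []  # batches "sent" to the hub
    batch = ToyBatch(capacity)
    for e in events:
        try:
            batch.add(e)
        except ValueError:
            sent.append(batch.events)  # flush the full batch
            batch = ToyBatch(capacity)
            batch.add(e)               # re-add the event that overflowed
    if batch.events:
        sent.append(batch.events)      # send the remainder
    return sent

batches = send_all(list(range(10)))
```

Note the re-add after creating the fresh batch: forgetting it silently drops the event that triggered the overflow, which is the most common bug in this pattern.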

### Send to Specific Partition

```python
# By partition ID
event_data_batch = producer.create_batch(partition_id="0")

# By partition key (consistent hashing)
event_data_batch = producer.create_batch(partition_key="user-123")
```
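Partition keys rely on consistent hashing: the service (not the client) hashes the key, but the guarantee — identical keys always land on the same partition — can be illustrated locally (illustrative hash only, not the algorithm Event Hubs actually uses):

```python
import hashlib

def partition_for(key: str, partition_count: int) -> int:
    """Stable key -> partition mapping (illustrative; the service's hash differs)."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# Identical keys always map to the same partition, so per-key ordering holds
p1 = partition_for("user-123", 4)
p2 = partition_for("user-123", 4)
```

This is why a partition key gives ordering per key without pinning you to a hard-coded partition ID.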

## Receive Events

### Simple Receive

```python
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    print(f"Partition: {partition_context.partition_id}")
    print(f"Data: {event.body_as_str()}")
    partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient(
    fully_qualified_namespace="<namespace>.servicebus.windows.net",
    eventhub_name="my-eventhub",
    consumer_group="$Default",
    credential=DefaultAzureCredential()
)

with consumer:
    consumer.receive(
        on_event=on_event,
        starting_position="-1",  # Beginning of stream
    )
```

### With Blob Checkpoint Store (Production)

```python
from azure.eventhub import EventHubConsumerClient
from azure.eventhub.extensions.checkpointstoreblob import BlobCheckpointStore
from azure.identity import DefaultAzureCredential

checkpoint_store = BlobCheckpointStore(
    blob_account_url="https://<account>.blob.core.windows.net",
    container_name="checkpoints",
    credential=DefaultAzureCredential()
)

consumer = EventHubConsumerClient(
    fully_qualified_namespace="<namespace>.servicebus.windows.net",
    eventhub_name="my-eventhub",
    consumer_group="$Default",
    credential=DefaultAzureCredential(),
    checkpoint_store=checkpoint_store
)

def on_event(partition_context, event):
    print(f"Received: {event.body_as_str()}")
    # Checkpoint after processing
    partition_context.update_checkpoint(event)

with consumer:
    consumer.receive(on_event=on_event)
```

## Async Client

```python
import asyncio

from azure.eventhub import EventData
from azure.eventhub.aio import EventHubProducerClient, EventHubConsumerClient
from azure.identity.aio import DefaultAzureCredential

async def send_events():
    credential = DefaultAzureCredential()

    async with EventHubProducerClient(
        fully_qualified_namespace="<namespace>.servicebus.windows.net",
        eventhub_name="my-eventhub",
        credential=credential
    ) as producer:
        batch = await producer.create_batch()
        batch.add(EventData("Async event"))
        await producer.send_batch(batch)

async def receive_events():
    async def on_event(partition_context, event):
        print(event.body_as_str())
        await partition_context.update_checkpoint(event)

    async with EventHubConsumerClient(
        fully_qualified_namespace="<namespace>.servicebus.windows.net",
        eventhub_name="my-eventhub",
        consumer_group="$Default",
        credential=DefaultAzureCredential()
    ) as consumer:
        await consumer.receive(on_event=on_event)

asyncio.run(send_events())
```

## Event Properties

```python
event = EventData("My event body")

# Set properties
event.properties = {"custom_property": "value"}
event.content_type = "application/json"

# Read properties (on receive)
print(event.body_as_str())
print(event.sequence_number)
print(event.offset)
print(event.enqueued_time)
print(event.partition_key)
```

## Get Event Hub Info

```python
with producer:
    info = producer.get_eventhub_properties()
    print(f"Name: {info['name']}")
    print(f"Partitions: {info['partition_ids']}")

    for partition_id in info['partition_ids']:
        partition_info = producer.get_partition_properties(partition_id)
        print(f"Partition {partition_id}: {partition_info['last_enqueued_sequence_number']}")
```

## Best Practices

1. **Use batches** for sending multiple events
2. **Use checkpoint store** in production for reliable processing
3. **Use async client** for high-throughput scenarios
4. **Use partition keys** for ordered delivery within a partition
5. **Handle batch size limits** — catch ValueError when batch is full
6. **Use context managers** (`with`/`async with`) for proper cleanup
7. **Set appropriate consumer groups** for different applications

## Reference Files

| File | Contents |
|------|----------|
| [references/checkpointing.md](references/checkpointing.md) | Checkpoint store patterns, blob checkpointing, checkpoint strategies |
| [references/partitions.md](references/partitions.md) | Partition management, load balancing, starting positions |
| [scripts/setup_consumer.py](scripts/setup_consumer.py) | CLI for Event Hub info, consumer setup, and event sending/receiving |
skills/azure-eventhub-rust/SKILL.md (new file, 127 lines)
@@ -0,0 +1,127 @@
---
name: azure-eventhub-rust
description: |
  Azure Event Hubs SDK for Rust. Use for sending and receiving events, streaming data ingestion.
  Triggers: "event hubs rust", "ProducerClient rust", "ConsumerClient rust", "send event rust", "streaming rust".
package: azure_messaging_eventhubs
---

# Azure Event Hubs SDK for Rust

Client library for Azure Event Hubs — big data streaming platform and event ingestion service.

## Installation

```sh
cargo add azure_messaging_eventhubs azure_identity
```

## Environment Variables

```bash
EVENTHUBS_HOST=<namespace>.servicebus.windows.net
EVENTHUB_NAME=<eventhub-name>
```

## Key Concepts

- **Namespace** — container for Event Hubs
- **Event Hub** — stream of events partitioned for parallel processing
- **Partition** — ordered sequence of events
- **Producer** — sends events to Event Hub
- **Consumer** — receives events from partitions

## Producer Client

### Create Producer

```rust
use azure_identity::DeveloperToolsCredential;
use azure_messaging_eventhubs::ProducerClient;

let credential = DeveloperToolsCredential::new(None)?;
let producer = ProducerClient::builder()
    .open("<namespace>.servicebus.windows.net", "eventhub-name", credential.clone())
    .await?;
```

### Send Single Event

```rust
producer.send_event(vec![1, 2, 3, 4], None).await?;
```

### Send Batch

```rust
let batch = producer.create_batch(None).await?;
batch.try_add_event_data(b"event 1".to_vec(), None)?;
batch.try_add_event_data(b"event 2".to_vec(), None)?;

producer.send_batch(batch, None).await?;
```

## Consumer Client

### Create Consumer

```rust
use azure_messaging_eventhubs::ConsumerClient;

let credential = DeveloperToolsCredential::new(None)?;
let consumer = ConsumerClient::builder()
    .open("<namespace>.servicebus.windows.net", "eventhub-name", credential.clone())
    .await?;
```

### Receive Events

```rust
// Open receiver for specific partition
let receiver = consumer.open_partition_receiver("0", None).await?;

// Receive events
let events = receiver.receive_events(100, None).await?;
for event in events {
    println!("Event data: {:?}", event.body());
}
```

### Get Event Hub Properties

```rust
let properties = consumer.get_eventhub_properties(None).await?;
println!("Partitions: {:?}", properties.partition_ids);
```

### Get Partition Properties

```rust
let partition_props = consumer.get_partition_properties("0", None).await?;
println!("Last sequence number: {}", partition_props.last_enqueued_sequence_number);
```

## Best Practices

1. **Reuse clients** — create once, send many events
2. **Use batches** — more efficient than individual sends
3. **Check batch capacity** — `try_add_event_data` returns false when full
4. **Process partitions in parallel** — each partition can be consumed independently
5. **Use consumer groups** — isolate different consuming applications
6. **Handle checkpointing** — use `azure_messaging_eventhubs_checkpointstore_blob` for distributed consumers

## Checkpoint Store (Optional)

For distributed consumers with checkpointing:

```sh
cargo add azure_messaging_eventhubs_checkpointstore_blob
```

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_messaging_eventhubs |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/eventhubs/azure_messaging_eventhubs |
| crates.io | https://crates.io/crates/azure_messaging_eventhubs |
skills/azure-eventhub-ts/SKILL.md (new file, 268 lines)
@@ -0,0 +1,268 @@
---
|
||||
name: azure-eventhub-ts
|
||||
description: Build event streaming applications using Azure Event Hubs SDK for JavaScript (@azure/event-hubs). Use when implementing high-throughput event ingestion, real-time analytics, IoT telemetry, or event-driven architectures with partitioned consumers.
|
||||
package: @azure/event-hubs
|
||||
---
|
||||
|
||||
# Azure Event Hubs SDK for TypeScript
|
||||
|
||||
High-throughput event streaming and real-time data ingestion.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
npm install @azure/event-hubs @azure/identity
|
||||
```
|
||||
|
||||
For checkpointing with consumer groups:
|
||||
```bash
|
||||
npm install @azure/eventhubs-checkpointstore-blob @azure/storage-blob
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
EVENTHUB_NAMESPACE=<namespace>.servicebus.windows.net
|
||||
EVENTHUB_NAME=my-eventhub
|
||||
STORAGE_ACCOUNT_NAME=<storage-account>
|
||||
STORAGE_CONTAINER_NAME=checkpoints
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
```typescript
|
||||
import { EventHubProducerClient, EventHubConsumerClient } from "@azure/event-hubs";
|
||||
import { DefaultAzureCredential } from "@azure/identity";
|
||||
|
||||
const fullyQualifiedNamespace = process.env.EVENTHUB_NAMESPACE!;
|
||||
const eventHubName = process.env.EVENTHUB_NAME!;
|
||||
const credential = new DefaultAzureCredential();
|
||||
|
||||
// Producer
|
||||
const producer = new EventHubProducerClient(fullyQualifiedNamespace, eventHubName, credential);
|
||||
|
||||
// Consumer
|
||||
const consumer = new EventHubConsumerClient(
|
||||
"$Default", // Consumer group
|
||||
fullyQualifiedNamespace,
|
||||
eventHubName,
|
||||
credential
|
||||
);
|
||||
```
|
||||
|
||||
## Core Workflow
|
||||
|
||||
### Send Events
|
||||
|
||||
```typescript
|
||||
const producer = new EventHubProducerClient(namespace, eventHubName, credential);
|
||||
|
||||
// Create batch and add events
|
||||
const batch = await producer.createBatch();
|
||||
batch.tryAdd({ body: { temperature: 72.5, deviceId: "sensor-1" } });
|
||||
batch.tryAdd({ body: { temperature: 68.2, deviceId: "sensor-2" } });
|
||||
|
||||
await producer.sendBatch(batch);
|
||||
await producer.close();
|
||||
```
|
||||
|
||||
### Send to Specific Partition
|
||||
|
||||
```typescript
|
||||
// By partition ID
|
||||
const batch = await producer.createBatch({ partitionId: "0" });
|
||||
|
||||
// By partition key (consistent hashing)
|
||||
const batch = await producer.createBatch({ partitionKey: "device-123" });
|
||||
```
|
||||
|
||||
### Receive Events (Simple)
|
||||
|
||||
```typescript
|
||||
const consumer = new EventHubConsumerClient("$Default", namespace, eventHubName, credential);
|
||||
|
||||
const subscription = consumer.subscribe({
|
||||
processEvents: async (events, context) => {
|
||||
for (const event of events) {
|
||||
console.log(`Partition: ${context.partitionId}, Body: ${JSON.stringify(event.body)}`);
|
||||
}
|
||||
},
|
||||
processError: async (err, context) => {
|
||||
console.error(`Error on partition ${context.partitionId}: ${err.message}`);
|
||||
},
|
||||
});
|
||||
|
||||
// Stop after some time
|
||||
setTimeout(async () => {
|
||||
await subscription.close();
|
||||
await consumer.close();
|
||||
}, 60000);
|
||||
```
|
||||
|
||||
### Receive with Checkpointing (Production)
|
||||
|
||||
```typescript
|
||||
import { EventHubConsumerClient } from "@azure/event-hubs";
|
||||
import { ContainerClient } from "@azure/storage-blob";
|
||||
import { BlobCheckpointStore } from "@azure/eventhubs-checkpointstore-blob";
|
||||
|
||||
const containerClient = new ContainerClient(
|
||||
`https://${storageAccount}.blob.core.windows.net/${containerName}`,
|
||||
credential
|
||||
);
|
||||
|
||||
const checkpointStore = new BlobCheckpointStore(containerClient);
|
||||
|
||||
const consumer = new EventHubConsumerClient(
|
||||
"$Default",
|
||||
namespace,
|
||||
eventHubName,
|
||||
credential,
|
||||
checkpointStore
|
||||
);
|
||||
|
||||
const subscription = consumer.subscribe({
|
||||
processEvents: async (events, context) => {
|
||||
for (const event of events) {
|
||||
console.log(`Processing: ${JSON.stringify(event.body)}`);
|
||||
}
|
||||
// Checkpoint after processing batch
|
||||
if (events.length > 0) {
|
||||
await context.updateCheckpoint(events[events.length - 1]);
|
||||
}
|
||||
},
|
||||
processError: async (err, context) => {
|
||||
console.error(`Error: ${err.message}`);
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
### Receive from Specific Position
|
||||
|
||||
```typescript
|
||||
const subscription = consumer.subscribe({
|
||||
processEvents: async (events, context) => { /* ... */ },
|
||||
processError: async (err, context) => { /* ... */ },
|
||||
}, {
|
||||
startPosition: {
|
||||
// Start from beginning
|
||||
"0": { offset: "@earliest" },
|
||||
// Start from end (new events only)
|
||||
"1": { offset: "@latest" },
|
||||
// Start from specific offset
|
||||
"2": { offset: "12345" },
|
||||
// Start from specific time
|
||||
"3": { enqueuedOn: new Date("2024-01-01") },
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Event Hub Properties
|
||||
|
||||
```typescript
|
||||
// Get hub info
|
||||
const hubProperties = await producer.getEventHubProperties();
|
||||
console.log(`Partitions: ${hubProperties.partitionIds}`);
|
||||
|
||||
// Get partition info
|
||||
const partitionProperties = await producer.getPartitionProperties("0");
|
||||
console.log(`Last sequence: ${partitionProperties.lastEnqueuedSequenceNumber}`);
|
||||
```
|
||||
|
||||
## Batch Processing Options
|
||||
|
||||
```typescript
|
||||
const subscription = consumer.subscribe(
|
||||
{
|
||||
processEvents: async (events, context) => { /* ... */ },
|
||||
processError: async (err, context) => { /* ... */ },
|
||||
},
|
||||
{
|
||||
maxBatchSize: 100, // Max events per batch
|
||||
maxWaitTimeInSeconds: 30, // Max wait for batch
|
||||
}
|
||||
);
|
||||
```
|
||||
|
||||
## Key Types
|
||||
|
||||
```typescript
|
||||
import {
|
||||
EventHubProducerClient,
|
||||
EventHubConsumerClient,
|
||||
EventData,
|
||||
ReceivedEventData,
|
||||
PartitionContext,
|
||||
Subscription,
|
||||
SubscriptionEventHandlers,
|
||||
CreateBatchOptions,
|
||||
EventPosition,
|
||||
} from "@azure/event-hubs";
|
||||
|
||||
import { BlobCheckpointStore } from "@azure/eventhubs-checkpointstore-blob";
|
||||
```
|
||||
|
||||
## Event Properties

```typescript
// Send with properties
const batch = await producer.createBatch();
batch.tryAdd({
  body: { data: "payload" },
  properties: {
    eventType: "telemetry",
    deviceId: "sensor-1",
  },
  contentType: "application/json",
  correlationId: "request-123",
});

// Access in receiver
consumer.subscribe({
  processEvents: async (events, context) => {
    for (const event of events) {
      console.log(`Type: ${event.properties?.eventType}`);
      console.log(`Sequence: ${event.sequenceNumber}`);
      console.log(`Enqueued: ${event.enqueuedTimeUtc}`);
      console.log(`Offset: ${event.offset}`);
    }
  },
});
```

## Error Handling

```typescript
consumer.subscribe({
  processEvents: async (events, context) => {
    if (events.length === 0) return; // Nothing to process or checkpoint
    try {
      for (const event of events) {
        await processEvent(event);
      }
      await context.updateCheckpoint(events[events.length - 1]);
    } catch (error) {
      // Don't checkpoint on error - events will be reprocessed
      console.error("Processing failed:", error);
    }
  },
  processError: async (err, context) => {
    if (err.name === "MessagingError") {
      // Transient error - SDK will retry
      console.warn("Transient error:", err.message);
    } else {
      // Fatal error
      console.error("Fatal error:", err);
    }
  },
});
```

## Best Practices

1. **Use checkpointing** - Always checkpoint in production; this gives at-least-once processing (exactly-once requires idempotent handlers)
2. **Batch sends** - Use `createBatch()` for efficient sending
3. **Partition keys** - Use partition keys to ensure ordering for related events
4. **Consumer groups** - Use separate consumer groups for different processing pipelines
5. **Handle errors gracefully** - Don't checkpoint on processing failures
6. **Close clients** - Always close producer/consumer when done
7. **Monitor lag** - Track `lastEnqueuedSequenceNumber` vs processed sequence

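Best practice 7 is simple arithmetic on sequence numbers. A minimal sketch, assuming you feed it from `context.lastEnqueuedEventProperties` and the last checkpointed event; `partitionLag` is a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical helper: consumer lag is the gap between the newest sequence
// number in the partition and the last sequence number we have processed.
function partitionLag(
  lastEnqueuedSequenceNumber: number,
  lastProcessedSequenceNumber: number
): number {
  // A negative gap can only come from stale metadata; clamp it to zero.
  return Math.max(0, lastEnqueuedSequenceNumber - lastProcessedSequenceNumber);
}

console.log(partitionLag(1500, 1420)); // → 80 events behind
```

Alerting when this gap grows steadily is usually a better signal than any single value, since lag naturally spikes during bursts.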
339
skills/azure-identity-dotnet/SKILL.md
Normal file
@@ -0,0 +1,339 @@
---
name: azure-identity-dotnet
description: |
  Azure Identity SDK for .NET. Authentication library for Azure SDK clients using Microsoft Entra ID. Use for DefaultAzureCredential, managed identity, service principals, and developer credentials. Triggers: "Azure Identity", "DefaultAzureCredential", "ManagedIdentityCredential", "ClientSecretCredential", "authentication .NET", "Azure auth", "credential chain".
package: Azure.Identity
---

# Azure.Identity (.NET)

Authentication library for Azure SDK clients using Microsoft Entra ID (formerly Azure AD).

## Installation

```bash
dotnet add package Azure.Identity

# For ASP.NET Core
dotnet add package Microsoft.Extensions.Azure

# For brokered authentication (Windows)
dotnet add package Azure.Identity.Broker
```

**Current Versions**: Stable v1.17.1, Preview v1.18.0-beta.2

## Environment Variables

### Service Principal with Secret

```bash
AZURE_CLIENT_ID=<application-client-id>
AZURE_TENANT_ID=<directory-tenant-id>
AZURE_CLIENT_SECRET=<client-secret-value>
```

### Service Principal with Certificate

```bash
AZURE_CLIENT_ID=<application-client-id>
AZURE_TENANT_ID=<directory-tenant-id>
AZURE_CLIENT_CERTIFICATE_PATH=<path-to-pfx-or-pem>
AZURE_CLIENT_CERTIFICATE_PASSWORD=<certificate-password> # Optional
```

### Managed Identity

```bash
AZURE_CLIENT_ID=<user-assigned-managed-identity-client-id> # Only for user-assigned
```

## DefaultAzureCredential

The recommended credential for most scenarios. Tries multiple authentication methods in order:

| Order | Credential | Enabled by Default |
|-------|------------|--------------------|
| 1 | EnvironmentCredential | Yes |
| 2 | WorkloadIdentityCredential | Yes |
| 3 | ManagedIdentityCredential | Yes |
| 4 | VisualStudioCredential | Yes |
| 5 | VisualStudioCodeCredential | Yes |
| 6 | AzureCliCredential | Yes |
| 7 | AzurePowerShellCredential | Yes |
| 8 | AzureDeveloperCliCredential | Yes |
| 9 | InteractiveBrowserCredential | **No** |

### Basic Usage

```csharp
using Azure.Identity;
using Azure.Storage.Blobs;

var credential = new DefaultAzureCredential();
var blobClient = new BlobServiceClient(
    new Uri("https://myaccount.blob.core.windows.net"),
    credential);
```

### ASP.NET Core with Dependency Injection

```csharp
using Azure.Identity;
using Microsoft.Extensions.Azure;

builder.Services.AddAzureClients(clientBuilder =>
{
    clientBuilder.AddBlobServiceClient(
        new Uri("https://myaccount.blob.core.windows.net"));
    clientBuilder.AddSecretClient(
        new Uri("https://myvault.vault.azure.net"));

    // Uses DefaultAzureCredential by default
    clientBuilder.UseCredential(new DefaultAzureCredential());
});
```

### Customizing DefaultAzureCredential

```csharp
var credential = new DefaultAzureCredential(
    new DefaultAzureCredentialOptions
    {
        ExcludeEnvironmentCredential = true,
        ExcludeManagedIdentityCredential = false,
        ExcludeVisualStudioCredential = false,
        ExcludeAzureCliCredential = false,
        ExcludeInteractiveBrowserCredential = false, // Enable interactive
        TenantId = "<tenant-id>",
        ManagedIdentityClientId = "<user-assigned-mi-client-id>"
    });
```

## Credential Types

### ManagedIdentityCredential (Production)

```csharp
// System-assigned managed identity
var credential = new ManagedIdentityCredential(ManagedIdentityId.SystemAssigned);

// User-assigned by client ID
var credential = new ManagedIdentityCredential(
    ManagedIdentityId.FromUserAssignedClientId("<client-id>"));

// User-assigned by resource ID
var credential = new ManagedIdentityCredential(
    ManagedIdentityId.FromUserAssignedResourceId("<resource-id>"));
```

### ClientSecretCredential

```csharp
var credential = new ClientSecretCredential(
    tenantId: "<tenant-id>",
    clientId: "<client-id>",
    clientSecret: "<client-secret>");

var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net"),
    credential);
```

### ClientCertificateCredential

```csharp
var certificate = X509CertificateLoader.LoadCertificateFromFile("MyCertificate.pfx");
var credential = new ClientCertificateCredential(
    tenantId: "<tenant-id>",
    clientId: "<client-id>",
    certificate);
```

### ChainedTokenCredential (Custom Chain)

```csharp
var credential = new ChainedTokenCredential(
    new ManagedIdentityCredential(),
    new AzureCliCredential());

var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net"),
    credential);
```

### Developer Credentials

```csharp
// Azure CLI
var credential = new AzureCliCredential();

// Azure PowerShell
var credential = new AzurePowerShellCredential();

// Azure Developer CLI (azd)
var credential = new AzureDeveloperCliCredential();

// Visual Studio
var credential = new VisualStudioCredential();

// Interactive Browser
var credential = new InteractiveBrowserCredential();
```

## Environment-Based Configuration

```csharp
// Production vs Development
TokenCredential credential = builder.Environment.IsProduction()
    ? new ManagedIdentityCredential("<client-id>")
    : new DefaultAzureCredential();
```

## Sovereign Clouds

```csharp
var credential = new DefaultAzureCredential(
    new DefaultAzureCredentialOptions
    {
        AuthorityHost = AzureAuthorityHosts.AzureGovernment
    });

// Available authority hosts:
// AzureAuthorityHosts.AzurePublicCloud (default)
// AzureAuthorityHosts.AzureGovernment
// AzureAuthorityHosts.AzureChina
// AzureAuthorityHosts.AzureGermany
```

## Credential Types Reference

| Category | Credential | Purpose |
|----------|------------|---------|
| **Chains** | `DefaultAzureCredential` | Preconfigured chain for dev-to-prod |
| | `ChainedTokenCredential` | Custom credential chain |
| **Azure-Hosted** | `ManagedIdentityCredential` | Azure managed identity |
| | `WorkloadIdentityCredential` | Kubernetes workload identity |
| | `EnvironmentCredential` | Environment variables |
| **Service Principal** | `ClientSecretCredential` | Client ID + secret |
| | `ClientCertificateCredential` | Client ID + certificate |
| | `ClientAssertionCredential` | Signed client assertion |
| **User** | `InteractiveBrowserCredential` | Browser-based auth |
| | `DeviceCodeCredential` | Device code flow |
| | `OnBehalfOfCredential` | Delegated identity |
| **Developer** | `AzureCliCredential` | Azure CLI |
| | `AzurePowerShellCredential` | Azure PowerShell |
| | `AzureDeveloperCliCredential` | Azure Developer CLI |
| | `VisualStudioCredential` | Visual Studio |

## Best Practices

### 1. Use Deterministic Credentials in Production

```csharp
// Development
var devCredential = new DefaultAzureCredential();

// Production - use specific credential
var prodCredential = new ManagedIdentityCredential("<client-id>");
```

### 2. Reuse Credential Instances

```csharp
// Good: Single credential instance shared across clients
var credential = new DefaultAzureCredential();
var blobClient = new BlobServiceClient(blobUri, credential);
var secretClient = new SecretClient(vaultUri, credential);
```

### 3. Configure Retry Policies

```csharp
var options = new ManagedIdentityCredentialOptions(
    ManagedIdentityId.FromUserAssignedClientId(clientId))
{
    Retry =
    {
        MaxRetries = 3,
        Delay = TimeSpan.FromSeconds(0.5),
    }
};
var credential = new ManagedIdentityCredential(options);
```

### 4. Enable Logging for Debugging

```csharp
using Azure.Core.Diagnostics;

using AzureEventSourceListener listener = new((args, message) =>
{
    if (args is { EventSource.Name: "Azure-Identity" })
    {
        Console.WriteLine(message);
    }
}, EventLevel.LogAlways);
```

## Error Handling

```csharp
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net"),
    new DefaultAzureCredential());

try
{
    KeyVaultSecret secret = await client.GetSecretAsync("secret1");
}
catch (CredentialUnavailableException e)
{
    // Catch the more derived exception first; it inherits
    // from AuthenticationFailedException
    Console.WriteLine($"Credential Unavailable: {e.Message}");
}
catch (AuthenticationFailedException e)
{
    Console.WriteLine($"Authentication Failed: {e.Message}");
}
```

## Key Exceptions

| Exception | Description |
|-----------|-------------|
| `AuthenticationFailedException` | Base exception for authentication errors |
| `CredentialUnavailableException` | Credential cannot authenticate in current environment |
| `AuthenticationRequiredException` | Interactive authentication is required |

## Managed Identity Support

Supported Azure services:

- Azure App Service and Azure Functions
- Azure Arc
- Azure Cloud Shell
- Azure Kubernetes Service (AKS)
- Azure Service Fabric
- Azure Virtual Machines
- Azure Virtual Machine Scale Sets

## Thread Safety

All credential implementations are thread-safe. A single credential instance can be safely shared across multiple clients and threads.

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Identity` | Authentication (this SDK) | `dotnet add package Azure.Identity` |
| `Microsoft.Extensions.Azure` | DI integration | `dotnet add package Microsoft.Extensions.Azure` |
| `Azure.Identity.Broker` | Brokered auth (Windows) | `dotnet add package Azure.Identity.Broker` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.Identity |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.identity |
| Credential Chains | https://learn.microsoft.com/dotnet/azure/sdk/authentication/credential-chains |
| Best Practices | https://learn.microsoft.com/dotnet/azure/sdk/authentication/best-practices |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/identity/Azure.Identity |
366
skills/azure-identity-java/SKILL.md
Normal file
@@ -0,0 +1,366 @@
---
name: azure-identity-java
description: Azure Identity Java SDK for authentication with Azure services. Use when implementing DefaultAzureCredential, managed identity, service principal, or any Azure authentication pattern in Java applications.
package: com.azure:azure-identity
---

# Azure Identity (Java)

Authenticate Java applications with Azure services using Microsoft Entra ID (Azure AD).

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
    <version>1.15.0</version>
</dependency>
```

## Key Concepts

| Credential | Use Case |
|------------|----------|
| `DefaultAzureCredential` | **Recommended** - Works in dev and production |
| `ManagedIdentityCredential` | Azure-hosted apps (App Service, Functions, VMs) |
| `EnvironmentCredential` | CI/CD pipelines with env vars |
| `ClientSecretCredential` | Service principals with secret |
| `ClientCertificateCredential` | Service principals with certificate |
| `AzureCliCredential` | Local dev using `az login` |
| `InteractiveBrowserCredential` | Interactive login flow |
| `DeviceCodeCredential` | Headless device authentication |

## DefaultAzureCredential (Recommended)

The `DefaultAzureCredential` tries multiple authentication methods in order:

1. Environment variables
2. Workload Identity
3. Managed Identity
4. Azure CLI
5. Azure PowerShell
6. Azure Developer CLI

```java
import com.azure.identity.DefaultAzureCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;

// Simple usage
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

// Use with any Azure client
BlobServiceClient blobClient = new BlobServiceClientBuilder()
    .endpoint("https://<storage-account>.blob.core.windows.net")
    .credential(credential)
    .buildClient();

KeyClient keyClient = new KeyClientBuilder()
    .vaultUrl("https://<vault-name>.vault.azure.net")
    .credential(credential)
    .buildClient();
```

### Configure DefaultAzureCredential

```java
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder()
    .managedIdentityClientId("<user-assigned-identity-client-id>") // For user-assigned MI
    .tenantId("<tenant-id>")        // Limit to specific tenant
    .build();
```

## Managed Identity

For Azure-hosted applications (App Service, Functions, AKS, VMs).

```java
import com.azure.identity.ManagedIdentityCredential;
import com.azure.identity.ManagedIdentityCredentialBuilder;

// System-assigned managed identity
ManagedIdentityCredential credential = new ManagedIdentityCredentialBuilder()
    .build();

// User-assigned managed identity (by client ID)
ManagedIdentityCredential credential = new ManagedIdentityCredentialBuilder()
    .clientId("<user-assigned-client-id>")
    .build();

// User-assigned managed identity (by resource ID)
ManagedIdentityCredential credential = new ManagedIdentityCredentialBuilder()
    .resourceId("/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<name>")
    .build();
```

## Service Principal with Secret

```java
import com.azure.identity.ClientSecretCredential;
import com.azure.identity.ClientSecretCredentialBuilder;

ClientSecretCredential credential = new ClientSecretCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .clientSecret("<client-secret>")
    .build();
```

## Service Principal with Certificate

```java
import com.azure.identity.ClientCertificateCredential;
import com.azure.identity.ClientCertificateCredentialBuilder;

// From PEM file
ClientCertificateCredential credential = new ClientCertificateCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .pemCertificate("<path-to-cert.pem>")
    .build();

// From PFX file with password
ClientCertificateCredential credential = new ClientCertificateCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .pfxCertificate("<path-to-cert.pfx>", "<pfx-password>")
    .build();

// Send certificate chain for SNI
ClientCertificateCredential credential = new ClientCertificateCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .pemCertificate("<path-to-cert.pem>")
    .sendCertificateChain(true)
    .build();
```

## Environment Credential

Reads credentials from environment variables.

```java
import com.azure.identity.EnvironmentCredential;
import com.azure.identity.EnvironmentCredentialBuilder;

EnvironmentCredential credential = new EnvironmentCredentialBuilder().build();
```

### Required Environment Variables

**For service principal with secret:**

```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

**For service principal with certificate:**

```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_CERTIFICATE_PATH=/path/to/cert.pem
AZURE_CLIENT_CERTIFICATE_PASSWORD=<optional-password>
```

**For username/password:**

```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_USERNAME=<username>
AZURE_PASSWORD=<password>
```

## Azure CLI Credential

For local development using `az login`.

```java
import com.azure.identity.AzureCliCredential;
import com.azure.identity.AzureCliCredentialBuilder;

AzureCliCredential credential = new AzureCliCredentialBuilder()
    .tenantId("<tenant-id>") // Optional: specific tenant
    .build();
```

## Interactive Browser

For desktop applications requiring user login.

```java
import com.azure.identity.InteractiveBrowserCredential;
import com.azure.identity.InteractiveBrowserCredentialBuilder;

InteractiveBrowserCredential credential = new InteractiveBrowserCredentialBuilder()
    .clientId("<client-id>")
    .redirectUrl("http://localhost:8080") // Must match app registration
    .build();
```

## Device Code

For headless devices (IoT, CLI tools).

```java
import com.azure.identity.DeviceCodeCredential;
import com.azure.identity.DeviceCodeCredentialBuilder;

DeviceCodeCredential credential = new DeviceCodeCredentialBuilder()
    .clientId("<client-id>")
    .challengeConsumer(challenge -> {
        // Display to user
        System.out.println(challenge.getMessage());
    })
    .build();
```

## Chained Credential

Create custom authentication chains.

```java
import com.azure.identity.ChainedTokenCredential;
import com.azure.identity.ChainedTokenCredentialBuilder;

ChainedTokenCredential credential = new ChainedTokenCredentialBuilder()
    .addFirst(new ManagedIdentityCredentialBuilder().build())
    .addLast(new AzureCliCredentialBuilder().build())
    .build();
```

## Workload Identity (AKS)

For Azure Kubernetes Service with workload identity.

```java
import com.azure.identity.WorkloadIdentityCredential;
import com.azure.identity.WorkloadIdentityCredentialBuilder;

// Reads from AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_FEDERATED_TOKEN_FILE
WorkloadIdentityCredential credential = new WorkloadIdentityCredentialBuilder().build();

// Or explicit configuration
WorkloadIdentityCredential credential = new WorkloadIdentityCredentialBuilder()
    .tenantId("<tenant-id>")
    .clientId("<client-id>")
    .tokenFilePath("/var/run/secrets/azure/tokens/azure-identity-token")
    .build();
```

## Token Caching

In-memory token caching is enabled by default, so repeated token requests within a process avoid extra round-trips.

```java
// In-memory token caching is on by default; no extra configuration needed
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder()
    .build();

// With shared token cache (for multi-credential scenarios)
SharedTokenCacheCredential credential = new SharedTokenCacheCredentialBuilder()
    .clientId("<client-id>")
    .build();
```

## Sovereign Clouds

```java
import com.azure.identity.AzureAuthorityHosts;

// Azure Government
DefaultAzureCredential govCredential = new DefaultAzureCredentialBuilder()
    .authorityHost(AzureAuthorityHosts.AZURE_GOVERNMENT)
    .build();

// Azure China
DefaultAzureCredential chinaCredential = new DefaultAzureCredentialBuilder()
    .authorityHost(AzureAuthorityHosts.AZURE_CHINA)
    .build();
```

## Error Handling

```java
import com.azure.core.credential.AccessToken;
import com.azure.core.credential.TokenRequestContext;
import com.azure.core.exception.ClientAuthenticationException;
import com.azure.identity.CredentialUnavailableException;

try {
    DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();
    AccessToken token = credential
        .getToken(new TokenRequestContext().addScopes("https://management.azure.com/.default"))
        .block(); // getToken returns a Mono<AccessToken>
} catch (CredentialUnavailableException e) {
    // No credential could authenticate
    System.out.println("Authentication failed: " + e.getMessage());
} catch (ClientAuthenticationException e) {
    // Authentication error (wrong credentials, expired, etc.)
    System.out.println("Auth error: " + e.getMessage());
}
```

## Logging

Enable authentication logging for debugging.

```java
// Via environment variable
// AZURE_LOG_LEVEL=verbose

// Or programmatically
DefaultAzureCredential credential = new DefaultAzureCredentialBuilder()
    .enableAccountIdentifierLogging() // Log account info
    .build();
```

## Environment Variables

```bash
# DefaultAzureCredential configuration
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>

# Managed Identity
AZURE_CLIENT_ID=<user-assigned-mi-client-id>

# Workload Identity (AKS)
AZURE_FEDERATED_TOKEN_FILE=/var/run/secrets/azure/tokens/azure-identity-token

# Logging
AZURE_LOG_LEVEL=verbose

# Authority host
AZURE_AUTHORITY_HOST=https://login.microsoftonline.com/
```

## Best Practices

1. **Use DefaultAzureCredential** - Works seamlessly from dev to production
2. **Managed Identity in Production** - No secrets to manage, automatic rotation
3. **Azure CLI for Local Dev** - Run `az login` before running your app
4. **Least Privilege** - Grant only required permissions to service principals
5. **Token Caching** - Enabled by default, reduces auth round-trips
6. **Environment Variables** - Use for CI/CD, not hardcoded secrets

## Credential Selection Matrix

| Environment | Recommended Credential |
|-------------|------------------------|
| Local Development | `DefaultAzureCredential` (uses Azure CLI) |
| Azure App Service | `DefaultAzureCredential` (uses Managed Identity) |
| Azure Functions | `DefaultAzureCredential` (uses Managed Identity) |
| Azure Kubernetes Service | `WorkloadIdentityCredential` |
| Azure VMs | `DefaultAzureCredential` (uses Managed Identity) |
| CI/CD Pipeline | `EnvironmentCredential` |
| Desktop App | `InteractiveBrowserCredential` |
| CLI Tool | `DeviceCodeCredential` |

## Trigger Phrases

- "Azure authentication Java", "DefaultAzureCredential Java"
- "managed identity Java", "service principal Java"
- "Azure login Java", "Azure credentials Java"
- "AZURE_CLIENT_ID", "AZURE_TENANT_ID"

192
skills/azure-identity-py/SKILL.md
Normal file
@@ -0,0 +1,192 @@
---
name: azure-identity-py
description: |
  Azure Identity SDK for Python authentication. Use for DefaultAzureCredential, managed identity, service principals, and token caching.
  Triggers: "azure-identity", "DefaultAzureCredential", "authentication", "managed identity", "service principal", "credential".
package: azure-identity
---

# Azure Identity SDK for Python

Authentication library for Azure SDK clients using Microsoft Entra ID (formerly Azure AD).

## Installation

```bash
pip install azure-identity
```

## Environment Variables

```bash
# Service Principal (for production/CI)
AZURE_TENANT_ID=<your-tenant-id>
AZURE_CLIENT_ID=<your-client-id>
AZURE_CLIENT_SECRET=<your-client-secret>

# User-assigned Managed Identity (optional)
AZURE_CLIENT_ID=<managed-identity-client-id>
```

## DefaultAzureCredential

The recommended credential for most scenarios. Tries multiple authentication methods in order:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Works in local dev AND production without code changes
credential = DefaultAzureCredential()

client = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=credential
)
```

### Credential Chain Order

| Order | Credential | Environment |
|-------|-----------|-------------|
| 1 | EnvironmentCredential | CI/CD, containers |
| 2 | WorkloadIdentityCredential | Kubernetes |
| 3 | ManagedIdentityCredential | Azure VMs, App Service, Functions |
| 4 | SharedTokenCacheCredential | Windows only |
| 5 | VisualStudioCodeCredential | VS Code with Azure extension |
| 6 | AzureCliCredential | `az login` |
| 7 | AzurePowerShellCredential | `Connect-AzAccount` |
| 8 | AzureDeveloperCliCredential | `azd auth login` |

### Customizing DefaultAzureCredential

```python
# Exclude credentials you don't need
credential = DefaultAzureCredential(
    exclude_environment_credential=True,
    exclude_shared_token_cache_credential=True,
    managed_identity_client_id="<user-assigned-mi-client-id>"  # For user-assigned MI
)

# Enable interactive browser (disabled by default)
credential = DefaultAzureCredential(
    exclude_interactive_browser_credential=False
)
```

## Specific Credential Types

### ManagedIdentityCredential

For Azure-hosted resources (VMs, App Service, Functions, AKS):

```python
from azure.identity import ManagedIdentityCredential

# System-assigned managed identity
credential = ManagedIdentityCredential()

# User-assigned managed identity
credential = ManagedIdentityCredential(
    client_id="<user-assigned-mi-client-id>"
)
```

### ClientSecretCredential

For service principal with secret:

```python
import os

from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"]
)
```

### AzureCliCredential

Uses the account from `az login`:

```python
from azure.identity import AzureCliCredential

credential = AzureCliCredential()
```

### ChainedTokenCredential

Custom credential chain:

```python
from azure.identity import (
    ChainedTokenCredential,
    ManagedIdentityCredential,
    AzureCliCredential
)

# Try managed identity first, fall back to CLI
credential = ChainedTokenCredential(
    ManagedIdentityCredential(client_id="<user-assigned-mi-client-id>"),
    AzureCliCredential()
)
```

## Credential Types Table

| Credential | Use Case | Auth Method |
|------------|----------|-------------|
| `DefaultAzureCredential` | Most scenarios | Auto-detect |
| `ManagedIdentityCredential` | Azure-hosted apps | Managed Identity |
| `ClientSecretCredential` | Service principal | Client secret |
| `ClientCertificateCredential` | Service principal | Certificate |
| `AzureCliCredential` | Local development | Azure CLI |
| `AzureDeveloperCliCredential` | Local development | Azure Developer CLI |
| `InteractiveBrowserCredential` | User sign-in | Browser OAuth |
| `DeviceCodeCredential` | Headless/SSH | Device code flow |

## Getting Tokens Directly

```python
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()

# Get token for a specific scope
token = credential.get_token("https://management.azure.com/.default")
print(f"Token expires: {token.expires_on}")

# For Azure Database for PostgreSQL
token = credential.get_token("https://ossrdbms-aad.database.windows.net/.default")
```

## Async Client

```python
from azure.identity.aio import DefaultAzureCredential
from azure.storage.blob.aio import BlobServiceClient

async def main():
    credential = DefaultAzureCredential()

    async with BlobServiceClient(
        account_url="https://<account>.blob.core.windows.net",
        credential=credential
    ) as client:
        # ... async operations
        pass

    await credential.close()
```

## Best Practices
|
||||
|
||||
1. **Use DefaultAzureCredential** for code that runs locally and in Azure
|
||||
2. **Never hardcode credentials** — use environment variables or managed identity
|
||||
3. **Prefer managed identity** in production Azure deployments
|
||||
4. **Use ChainedTokenCredential** when you need a custom credential order
|
||||
5. **Close async credentials** explicitly or use context managers
|
||||
6. **Set AZURE_CLIENT_ID** for user-assigned managed identities
|
||||
7. **Exclude unused credentials** to speed up authentication
|
||||
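The fallback behavior behind `ChainedTokenCredential` (practice 4) can be illustrated without Azure at all. This is a minimal sketch with hypothetical stand-in classes — `StaticCredential` and `SimpleChain` are not part of `azure.identity`, they only mimic its try-each-source-in-order logic:

```python
class CredentialUnavailableError(Exception):
    """Raised when a credential source is not configured."""

class StaticCredential:
    """Hypothetical stand-in for a real credential: returns a fixed token or fails."""
    def __init__(self, token=None):
        self.token = token

    def get_token(self, scope):
        if self.token is None:
            raise CredentialUnavailableError("not configured")
        return self.token

class SimpleChain:
    """Tries each credential in order; returns the first token that succeeds."""
    def __init__(self, *credentials):
        self.credentials = credentials

    def get_token(self, scope):
        for cred in self.credentials:
            try:
                return cred.get_token(scope)
            except CredentialUnavailableError:
                continue
        raise CredentialUnavailableError("no credential in the chain succeeded")

# Managed identity is "unavailable" here, so the chain falls through to the CLI token
chain = SimpleChain(StaticCredential(None), StaticCredential("cli-token"))
print(chain.get_token("https://management.azure.com/.default"))  # cli-token
```

The real SDK raises `CredentialUnavailableError` for "not configured" sources and stops the chain on genuine authentication failures; this sketch only models the first behavior.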
skills/azure-identity-rust/SKILL.md (new file)
@@ -0,0 +1,115 @@
---
name: azure-identity-rust
description: |
  Azure Identity SDK for Rust authentication. Use for DeveloperToolsCredential, ManagedIdentityCredential, ClientSecretCredential, and token-based authentication.
  Triggers: "azure-identity", "DeveloperToolsCredential", "authentication rust", "managed identity rust", "credential rust".
package: azure_identity
---

# Azure Identity SDK for Rust

Authentication library for Azure SDK clients using Microsoft Entra ID (formerly Azure AD).

## Installation

```sh
cargo add azure_identity
```

## Environment Variables

```bash
# Service Principal (for production/CI)
AZURE_TENANT_ID=<your-tenant-id>
AZURE_CLIENT_ID=<your-client-id>
AZURE_CLIENT_SECRET=<your-client-secret>

# User-assigned Managed Identity (optional)
AZURE_CLIENT_ID=<managed-identity-client-id>
```

## DeveloperToolsCredential

The recommended credential for local development. Tries developer tools in order (Azure CLI, Azure Developer CLI):

```rust
use azure_identity::DeveloperToolsCredential;
use azure_security_keyvault_secrets::SecretClient;

let credential = DeveloperToolsCredential::new(None)?;
let client = SecretClient::new(
    "https://my-vault.vault.azure.net/",
    credential.clone(),
    None,
)?;
```

### Credential Chain Order

| Order | Credential | Environment |
|-------|-----------|-------------|
| 1 | AzureCliCredential | `az login` |
| 2 | AzureDeveloperCliCredential | `azd auth login` |

## Credential Types

| Credential | Usage |
|------------|-------|
| `DeveloperToolsCredential` | Local development - tries CLI tools |
| `ManagedIdentityCredential` | Azure VMs, App Service, Functions, AKS |
| `WorkloadIdentityCredential` | Kubernetes workload identity |
| `ClientSecretCredential` | Service principal with secret |
| `ClientCertificateCredential` | Service principal with certificate |
| `AzureCliCredential` | Direct Azure CLI auth |
| `AzureDeveloperCliCredential` | Direct azd CLI auth |
| `AzurePipelinesCredential` | Azure Pipelines service connection |
| `ClientAssertionCredential` | Custom assertions (federated identity) |

## ManagedIdentityCredential

For Azure-hosted resources:
```rust
use azure_identity::{ManagedIdentityCredential, ManagedIdentityCredentialOptions};

// System-assigned managed identity
let credential = ManagedIdentityCredential::new(None)?;

// User-assigned managed identity
let options = ManagedIdentityCredentialOptions {
    client_id: Some("<user-assigned-mi-client-id>".into()),
    ..Default::default()
};
let credential = ManagedIdentityCredential::new(Some(options))?;
```

## ClientSecretCredential

For service principal with secret:

```rust
use azure_identity::ClientSecretCredential;

let credential = ClientSecretCredential::new(
    "<tenant-id>".into(),
    "<client-id>".into(),
    "<client-secret>".into(),
    None,
)?;
```

## Best Practices

1. **Use `DeveloperToolsCredential` for local dev** — automatically picks up Azure CLI
2. **Use `ManagedIdentityCredential` in production** — no secrets to manage
3. **Clone credentials** — credentials are `Arc`-wrapped and cheap to clone
4. **Reuse credential instances** — same credential can be used with multiple clients
5. **Use `tokio` feature** — `cargo add azure_identity --features tokio`

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_identity |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/identity/azure_identity |
| crates.io | https://crates.io/crates/azure_identity |
skills/azure-identity-ts/SKILL.md (new file)
@@ -0,0 +1,303 @@
---
name: azure-identity-ts
description: Authenticate to Azure services using Azure Identity SDK for JavaScript (@azure/identity). Use when configuring authentication with DefaultAzureCredential, managed identity, service principals, or interactive browser login.
package: "@azure/identity"
---

# Azure Identity SDK for TypeScript

Authenticate to Azure services with various credential types.

## Installation

```bash
npm install @azure/identity
```

## Environment Variables

### Service Principal (Secret)

```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

### Service Principal (Certificate)

```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_CERTIFICATE_PATH=/path/to/cert.pem
AZURE_CLIENT_CERTIFICATE_PASSWORD=<optional-password>
```

### Workload Identity (Kubernetes)

```bash
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_FEDERATED_TOKEN_FILE=/var/run/secrets/tokens/azure-identity
```

## DefaultAzureCredential (Recommended)

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { BlobServiceClient } from "@azure/storage-blob";

const credential = new DefaultAzureCredential();

// Use with any Azure SDK client
const blobClient = new BlobServiceClient(
  "https://<account>.blob.core.windows.net",
  credential
);
```

**Credential Chain Order:**
1. EnvironmentCredential
2. WorkloadIdentityCredential
3. ManagedIdentityCredential
4. VisualStudioCodeCredential
5. AzureCliCredential
6. AzurePowerShellCredential
7. AzureDeveloperCliCredential

## Managed Identity

### System-Assigned

```typescript
import { ManagedIdentityCredential } from "@azure/identity";

const credential = new ManagedIdentityCredential();
```

### User-Assigned (by Client ID)

```typescript
const credential = new ManagedIdentityCredential({
  clientId: "<user-assigned-client-id>"
});
```

### User-Assigned (by Resource ID)

```typescript
const credential = new ManagedIdentityCredential({
  resourceId: "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<name>"
});
```

## Service Principal

### Client Secret

```typescript
import { ClientSecretCredential } from "@azure/identity";

const credential = new ClientSecretCredential(
  "<tenant-id>",
  "<client-id>",
  "<client-secret>"
);
```

### Client Certificate

```typescript
import { ClientCertificateCredential } from "@azure/identity";

const credential = new ClientCertificateCredential(
  "<tenant-id>",
  "<client-id>",
  { certificatePath: "/path/to/cert.pem" }
);

// With password
const credentialWithPwd = new ClientCertificateCredential(
  "<tenant-id>",
  "<client-id>",
  {
    certificatePath: "/path/to/cert.pem",
    certificatePassword: "<password>"
  }
);
```

## Interactive Authentication

### Browser-Based Login

```typescript
import { InteractiveBrowserCredential } from "@azure/identity";

const credential = new InteractiveBrowserCredential({
  clientId: "<client-id>",
  tenantId: "<tenant-id>",
  loginHint: "user@example.com"
});
```

### Device Code Flow

```typescript
import { DeviceCodeCredential } from "@azure/identity";

const credential = new DeviceCodeCredential({
  clientId: "<client-id>",
  tenantId: "<tenant-id>",
  userPromptCallback: (info) => {
    console.log(info.message);
    // "To sign in, use a web browser to open..."
  }
});
```

## Custom Credential Chain

```typescript
import {
  ChainedTokenCredential,
  ManagedIdentityCredential,
  AzureCliCredential
} from "@azure/identity";

// Try managed identity first, fall back to CLI
const credential = new ChainedTokenCredential(
  new ManagedIdentityCredential(),
  new AzureCliCredential()
);
```

## Developer Credentials

### Azure CLI

```typescript
import { AzureCliCredential } from "@azure/identity";

const credential = new AzureCliCredential();
// Uses: az login
```

### Azure Developer CLI

```typescript
import { AzureDeveloperCliCredential } from "@azure/identity";

const credential = new AzureDeveloperCliCredential();
// Uses: azd auth login
```

### Azure PowerShell

```typescript
import { AzurePowerShellCredential } from "@azure/identity";

const credential = new AzurePowerShellCredential();
// Uses: Connect-AzAccount
```

## Sovereign Clouds

```typescript
import { ClientSecretCredential, AzureAuthorityHosts } from "@azure/identity";

// Azure Government
const credential = new ClientSecretCredential(
  "<tenant>", "<client>", "<secret>",
  { authorityHost: AzureAuthorityHosts.AzureGovernment }
);

// Azure China
const credentialChina = new ClientSecretCredential(
  "<tenant>", "<client>", "<secret>",
  { authorityHost: AzureAuthorityHosts.AzureChina }
);
```

## Bearer Token Provider

```typescript
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";

const credential = new DefaultAzureCredential();

// Create a function that returns tokens
const getAccessToken = getBearerTokenProvider(
  credential,
  "https://cognitiveservices.azure.com/.default"
);

// Use with APIs that need bearer tokens
const token = await getAccessToken();
```

## Key Types

```typescript
import type {
  TokenCredential,
  AccessToken,
  GetTokenOptions
} from "@azure/core-auth";

import {
  DefaultAzureCredential,
  DefaultAzureCredentialOptions,
  ManagedIdentityCredential,
  ClientSecretCredential,
  ClientCertificateCredential,
  InteractiveBrowserCredential,
  ChainedTokenCredential,
  AzureCliCredential,
  AzurePowerShellCredential,
  AzureDeveloperCliCredential,
  DeviceCodeCredential,
  AzureAuthorityHosts
} from "@azure/identity";
```

## Custom Credential Implementation

```typescript
import type { TokenCredential, AccessToken, GetTokenOptions } from "@azure/core-auth";

class CustomCredential implements TokenCredential {
  async getToken(
    scopes: string | string[],
    options?: GetTokenOptions
  ): Promise<AccessToken | null> {
    // Custom token acquisition logic
    return {
      token: "<access-token>",
      expiresOnTimestamp: Date.now() + 3600000
    };
  }
}
```

## Debugging

```typescript
import { setLogLevel, AzureLogger } from "@azure/logger";

setLogLevel("verbose");

// Custom log handler
AzureLogger.log = (...args) => {
  console.log("[Azure]", ...args);
};
```

## Best Practices

1. **Use DefaultAzureCredential** - Works in development (CLI) and production (managed identity)
2. **Never hardcode credentials** - Use environment variables or managed identity
3. **Prefer managed identity** - No secrets to manage in production
4. **Scope credentials appropriately** - Use user-assigned identity for multi-tenant scenarios
5. **Handle token refresh** - Azure SDK handles this automatically
6. **Use ChainedTokenCredential** - For custom fallback scenarios
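The automatic token refresh from practice 5 can be pictured as a small cache wrapper. This is a hypothetical sketch — `withCache` is not part of @azure/identity; the SDK's credentials do this caching internally when clients call `getToken`:

```typescript
type AccessToken = { token: string; expiresOnTimestamp: number };
type GetToken = () => Promise<AccessToken>;

// Wrap a token fetcher so callers reuse the token until it nears expiry.
function withCache(getToken: GetToken, refreshSkewMs = 120_000): GetToken {
  let cached: AccessToken | null = null;
  return async () => {
    if (cached && cached.expiresOnTimestamp - refreshSkewMs > Date.now()) {
      return cached; // still fresh: no network round-trip
    }
    cached = await getToken(); // expired or close to it: fetch a new token
    return cached;
  };
}

// Demo with a fake fetcher standing in for credential.getToken(scope)
let fetches = 0;
const provider = withCache(async () => ({
  token: `tok-${++fetches}`,
  expiresOnTimestamp: Date.now() + 3_600_000,
}));
provider()
  .then(() => provider())
  .then((t) => console.log(t.token, "fetches:", fetches)); // tok-1 fetches: 1
```

The skew window refreshes tokens slightly before expiry so in-flight requests never carry a token that lapses mid-call.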
skills/azure-keyvault-certificates-rust/SKILL.md (new file)
@@ -0,0 +1,177 @@
---
name: azure-keyvault-certificates-rust
description: |
  Azure Key Vault Certificates SDK for Rust. Use for creating, importing, and managing certificates.
  Triggers: "keyvault certificates rust", "CertificateClient rust", "create certificate rust", "import certificate rust".
package: azure_security_keyvault_certificates
---

# Azure Key Vault Certificates SDK for Rust

Client library for Azure Key Vault Certificates — secure storage and management of certificates.

## Installation

```sh
cargo add azure_security_keyvault_certificates azure_identity
```

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net/
```

## Authentication

```rust
use azure_identity::DeveloperToolsCredential;
use azure_security_keyvault_certificates::CertificateClient;

let credential = DeveloperToolsCredential::new(None)?;
let client = CertificateClient::new(
    "https://<vault-name>.vault.azure.net/",
    credential.clone(),
    None,
)?;
```

## Core Operations

### Get Certificate

```rust
use azure_core::base64;

let certificate = client
    .get_certificate("certificate-name", None)
    .await?
    .into_model()?;

println!(
    "Thumbprint: {:?}",
    certificate.x509_thumbprint.map(base64::encode_url_safe)
);
```

### Create Certificate

```rust
use azure_security_keyvault_certificates::models::{
    CreateCertificateParameters, CertificatePolicy,
    IssuerParameters, X509CertificateProperties,
};

let policy = CertificatePolicy {
    issuer_parameters: Some(IssuerParameters {
        name: Some("Self".into()),
        ..Default::default()
    }),
    x509_certificate_properties: Some(X509CertificateProperties {
        subject: Some("CN=example.com".into()),
        ..Default::default()
    }),
    ..Default::default()
};

let params = CreateCertificateParameters {
    certificate_policy: Some(policy),
    ..Default::default()
};

let operation = client
    .create_certificate("cert-name", params.try_into()?, None)
    .await?;
```

### Import Certificate

```rust
use azure_security_keyvault_certificates::models::ImportCertificateParameters;

let params = ImportCertificateParameters {
    base64_encoded_certificate: Some(base64_cert_data),
    password: Some("optional-password".into()),
    ..Default::default()
};

let certificate = client
    .import_certificate("cert-name", params.try_into()?, None)
    .await?
    .into_model()?;
```

### Delete Certificate

```rust
client.delete_certificate("certificate-name", None).await?;
```

### List Certificates

```rust
use azure_security_keyvault_certificates::ResourceExt;
use futures::TryStreamExt;

let mut pager = client.list_certificate_properties(None)?.into_stream();
while let Some(cert) = pager.try_next().await? {
    let name = cert.resource_id()?.name;
    println!("Certificate: {}", name);
}
```

### Get Certificate Policy

```rust
let policy = client
    .get_certificate_policy("certificate-name", None)
    .await?
    .into_model()?;
```

### Update Certificate Policy

```rust
use azure_security_keyvault_certificates::models::UpdateCertificatePolicyParameters;

let params = UpdateCertificatePolicyParameters {
    // Update policy properties
    ..Default::default()
};

client
    .update_certificate_policy("cert-name", params.try_into()?, None)
    .await?;
```

## Certificate Lifecycle

1. **Create** — generates new certificate with policy
2. **Import** — import existing PFX/PEM certificate
3. **Get** — retrieve certificate (public key only)
4. **Update** — modify certificate properties
5. **Delete** — soft delete (recoverable)
6. **Purge** — permanent deletion

## Best Practices

1. **Use Entra ID auth** — `DeveloperToolsCredential` for dev
2. **Use managed certificates** — auto-renewal with supported issuers
3. **Set proper validity period** — balance security and maintenance
4. **Use certificate policies** — define renewal and key properties
5. **Monitor expiration** — set up alerts for expiring certificates
6. **Enable soft delete** — required for production vaults

## RBAC Permissions

Assign these Key Vault roles:
- `Key Vault Certificates Officer` — full CRUD on certificates
- `Key Vault Reader` — read certificate metadata

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_security_keyvault_certificates |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/keyvault/azure_security_keyvault_certificates |
| crates.io | https://crates.io/crates/azure_security_keyvault_certificates |
skills/azure-keyvault-keys-rust/SKILL.md (new file)
@@ -0,0 +1,167 @@
---
name: azure-keyvault-keys-rust
description: |
  Azure Key Vault Keys SDK for Rust. Use for creating, managing, and using cryptographic keys.
  Triggers: "keyvault keys rust", "KeyClient rust", "create key rust", "encrypt rust", "sign rust".
package: azure_security_keyvault_keys
---

# Azure Key Vault Keys SDK for Rust

Client library for Azure Key Vault Keys — secure storage and management of cryptographic keys.

## Installation

```sh
cargo add azure_security_keyvault_keys azure_identity
```

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net/
```

## Authentication

```rust
use azure_identity::DeveloperToolsCredential;
use azure_security_keyvault_keys::KeyClient;

let credential = DeveloperToolsCredential::new(None)?;
let client = KeyClient::new(
    "https://<vault-name>.vault.azure.net/",
    credential.clone(),
    None,
)?;
```

## Key Types

| Type | Description |
|------|-------------|
| RSA | RSA keys (2048, 3072, 4096 bits) |
| EC | Elliptic curve keys (P-256, P-384, P-521) |
| RSA-HSM | HSM-protected RSA keys |
| EC-HSM | HSM-protected EC keys |

## Core Operations

### Get Key

```rust
let key = client
    .get_key("key-name", None)
    .await?
    .into_model()?;

println!("Key ID: {:?}", key.key.as_ref().map(|k| &k.kid));
```

### Create Key

```rust
use azure_security_keyvault_keys::models::{CreateKeyParameters, KeyType};

let params = CreateKeyParameters {
    kty: KeyType::Rsa,
    key_size: Some(2048),
    ..Default::default()
};

let key = client
    .create_key("key-name", params.try_into()?, None)
    .await?
    .into_model()?;
```

### Create EC Key

```rust
use azure_security_keyvault_keys::models::{CreateKeyParameters, KeyType, CurveName};

let params = CreateKeyParameters {
    kty: KeyType::Ec,
    curve: Some(CurveName::P256),
    ..Default::default()
};

let key = client
    .create_key("ec-key", params.try_into()?, None)
    .await?
    .into_model()?;
```

### Delete Key

```rust
client.delete_key("key-name", None).await?;
```

### List Keys

```rust
use azure_security_keyvault_keys::ResourceExt;
use futures::TryStreamExt;

let mut pager = client.list_key_properties(None)?.into_stream();
while let Some(key) = pager.try_next().await? {
    let name = key.resource_id()?.name;
    println!("Key: {}", name);
}
```

### Backup Key

```rust
let backup = client.backup_key("key-name", None).await?;
// Store backup.value safely
```

### Restore Key

```rust
use azure_security_keyvault_keys::models::RestoreKeyParameters;

let params = RestoreKeyParameters {
    key_bundle_backup: backup_bytes,
};

client.restore_key(params.try_into()?, None).await?;
```

## Cryptographic Operations

Key Vault can perform crypto operations without exposing the private key:

```rust
// For cryptographic operations, use the key's operations
// Available operations depend on key type and permissions:
// - encrypt/decrypt (RSA)
// - sign/verify (RSA, EC)
// - wrapKey/unwrapKey (RSA)
```

## Best Practices

1. **Use Entra ID auth** — `DeveloperToolsCredential` for dev, `ManagedIdentityCredential` for production
2. **Use HSM keys for sensitive workloads** — hardware-protected keys
3. **Use EC for signing** — more efficient than RSA
4. **Use RSA for encryption** — when encrypting data
5. **Backup keys** — for disaster recovery
6. **Enable soft delete** — required for production vaults
7. **Use key rotation** — create new versions periodically

## RBAC Permissions

Assign these Key Vault roles:
- `Key Vault Crypto User` — use keys for crypto operations
- `Key Vault Crypto Officer` — full CRUD on keys

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_security_keyvault_keys |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/keyvault/azure_security_keyvault_keys |
| crates.io | https://crates.io/crates/azure_security_keyvault_keys |
skills/azure-keyvault-keys-ts/SKILL.md (new file)
@@ -0,0 +1,269 @@
---
name: azure-keyvault-keys-ts
description: Manage cryptographic keys using Azure Key Vault Keys SDK for JavaScript (@azure/keyvault-keys). Use when creating, encrypting/decrypting, signing, or rotating keys.
package: "@azure/keyvault-keys"
---

# Azure Key Vault Keys SDK for TypeScript

Manage cryptographic keys with Azure Key Vault.

## Installation

```bash
# Keys and Secrets SDKs (the examples below use both)
npm install @azure/keyvault-keys @azure/keyvault-secrets @azure/identity
```

## Environment Variables

```bash
KEY_VAULT_URL=https://<vault-name>.vault.azure.net
# Or
AZURE_KEYVAULT_NAME=<vault-name>
```

## Authentication

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { KeyClient, CryptographyClient } from "@azure/keyvault-keys";
import { SecretClient } from "@azure/keyvault-secrets";

const credential = new DefaultAzureCredential();
const vaultUrl = `https://${process.env.AZURE_KEYVAULT_NAME}.vault.azure.net`;

const keyClient = new KeyClient(vaultUrl, credential);
const secretClient = new SecretClient(vaultUrl, credential);
```

## Secrets Operations

### Create/Set Secret

```typescript
const secret = await secretClient.setSecret("MySecret", "secret-value");

// With attributes
const secretWithAttrs = await secretClient.setSecret("MySecret", "value", {
  enabled: true,
  expiresOn: new Date("2025-12-31"),
  contentType: "application/json",
  tags: { environment: "production" }
});
```

### Get Secret

```typescript
// Get latest version
const secret = await secretClient.getSecret("MySecret");
console.log(secret.value);

// Get specific version
const specificSecret = await secretClient.getSecret("MySecret", {
  version: secret.properties.version
});
```

### List Secrets

```typescript
for await (const secretProperties of secretClient.listPropertiesOfSecrets()) {
  console.log(secretProperties.name);
}

// List versions
for await (const version of secretClient.listPropertiesOfSecretVersions("MySecret")) {
  console.log(version.version);
}
```

### Delete Secret

```typescript
// Soft delete
const deletePoller = await secretClient.beginDeleteSecret("MySecret");
await deletePoller.pollUntilDone();

// Purge (permanent)
await secretClient.purgeDeletedSecret("MySecret");

// Recover
const recoverPoller = await secretClient.beginRecoverDeletedSecret("MySecret");
await recoverPoller.pollUntilDone();
```

## Keys Operations

### Create Keys

```typescript
// Generic key
const key = await keyClient.createKey("MyKey", "RSA");

// RSA key with size
const rsaKey = await keyClient.createRsaKey("MyRsaKey", { keySize: 2048 });

// Elliptic Curve key
const ecKey = await keyClient.createEcKey("MyEcKey", { curve: "P-256" });

// With attributes
const keyWithAttrs = await keyClient.createKey("MyKey", "RSA", {
  enabled: true,
  expiresOn: new Date("2025-12-31"),
  tags: { purpose: "encryption" },
  keyOps: ["encrypt", "decrypt", "sign", "verify"]
});
```

### Get Key

```typescript
const key = await keyClient.getKey("MyKey");
console.log(key.name, key.keyType);
```

### List Keys

```typescript
for await (const keyProperties of keyClient.listPropertiesOfKeys()) {
  console.log(keyProperties.name);
}
```

### Rotate Key

```typescript
// Manual rotation
const rotatedKey = await keyClient.rotateKey("MyKey");

// Set rotation policy
await keyClient.updateKeyRotationPolicy("MyKey", {
  lifetimeActions: [{ action: "Rotate", timeBeforeExpiry: "P30D" }],
  expiresIn: "P90D"
});
```

### Delete Key

```typescript
const deletePoller = await keyClient.beginDeleteKey("MyKey");
await deletePoller.pollUntilDone();

// Purge
await keyClient.purgeDeletedKey("MyKey");
```

## Cryptographic Operations

### Create CryptographyClient

```typescript
import { CryptographyClient } from "@azure/keyvault-keys";

// From key object
const cryptoClient = new CryptographyClient(key, credential);

// From key ID
const cryptoClientFromId = new CryptographyClient(key.id!, credential);
```

### Encrypt/Decrypt

```typescript
// Encrypt
const encryptResult = await cryptoClient.encrypt({
  algorithm: "RSA-OAEP",
  plaintext: Buffer.from("My secret message")
});

// Decrypt
const decryptResult = await cryptoClient.decrypt({
  algorithm: "RSA-OAEP",
  ciphertext: encryptResult.result
});

console.log(decryptResult.result.toString());
```

### Sign/Verify

```typescript
import { createHash } from "node:crypto";

// Create digest
const hash = createHash("sha256").update("My message").digest();

// Sign
const signResult = await cryptoClient.sign("RS256", hash);

// Verify
const verifyResult = await cryptoClient.verify("RS256", hash, signResult.result);
console.log("Valid:", verifyResult.result);
```

### Wrap/Unwrap Keys

```typescript
// Wrap a key (encrypt it for storage)
const wrapResult = await cryptoClient.wrapKey("RSA-OAEP", Buffer.from("key-material"));

// Unwrap
const unwrapResult = await cryptoClient.unwrapKey("RSA-OAEP", wrapResult.result);
```
|
||||
|
||||
## Backup and Restore
|
||||
|
||||
```typescript
|
||||
// Backup
|
||||
const keyBackup = await keyClient.backupKey("MyKey");
|
||||
const secretBackup = await secretClient.backupSecret("MySecret");
|
||||
|
||||
// Restore (can restore to different vault)
|
||||
const restoredKey = await keyClient.restoreKeyBackup(keyBackup!);
|
||||
const restoredSecret = await secretClient.restoreSecretBackup(secretBackup!);
|
||||
```
|
||||
|
||||
## Key Types
|
||||
|
||||
```typescript
|
||||
import {
|
||||
KeyClient,
|
||||
KeyVaultKey,
|
||||
KeyProperties,
|
||||
DeletedKey,
|
||||
CryptographyClient,
|
||||
KnownEncryptionAlgorithms,
|
||||
KnownSignatureAlgorithms
|
||||
} from "@azure/keyvault-keys";
|
||||
|
||||
import {
|
||||
SecretClient,
|
||||
KeyVaultSecret,
|
||||
SecretProperties,
|
||||
DeletedSecret
|
||||
} from "@azure/keyvault-secrets";
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
```typescript
|
||||
try {
|
||||
const secret = await secretClient.getSecret("NonExistent");
|
||||
} catch (error: any) {
|
||||
if (error.code === "SecretNotFound") {
|
||||
console.log("Secret does not exist");
|
||||
} else {
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Best Practices
|
||||
|
||||
1. **Use DefaultAzureCredential** - Works across dev and production
|
||||
2. **Enable soft-delete** - Required for production vaults
|
||||
3. **Set expiration dates** - On both keys and secrets
|
||||
4. **Use key rotation policies** - Automate key rotation
|
||||
5. **Limit key operations** - Only grant needed operations (encrypt, sign, etc.)
|
||||
6. **Browser not supported** - These SDKs are Node.js only
|
||||
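The SDK's client pipeline already retries many transient failures for you. If you compose several vault calls and want application-level retry on top, the pattern can be sketched as below; `withRetry` and its status-code list are illustrative assumptions, not part of `@azure/keyvault-keys` or `@azure/keyvault-secrets`.

```typescript
// Hypothetical helper: retry an async operation on transient HTTP failures
// with exponential backoff. Non-transient errors (403, 404, ...) surface
// immediately.
type AsyncOp<T> = () => Promise<T>;

async function withRetry<T>(
  op: AsyncOp<T>,
  maxAttempts = 3,
  baseDelayMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await op();
    } catch (error: any) {
      lastError = error;
      // Only throttling and server errors are worth retrying.
      if (![429, 500, 502, 503].includes(error?.statusCode)) throw error;
      // Exponential backoff: baseDelayMs, 2x, 4x, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

You would wrap calls like `withRetry(() => secretClient.getSecret("MySecret"))`.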
247
skills/azure-keyvault-py/SKILL.md
Normal file
@@ -0,0 +1,247 @@
---
name: azure-keyvault-py
description: |
  Azure Key Vault SDK for Python. Use for secrets, keys, and certificates management with secure storage.
  Triggers: "key vault", "SecretClient", "KeyClient", "CertificateClient", "secrets", "encryption keys".
package: azure-keyvault-secrets, azure-keyvault-keys, azure-keyvault-certificates
---

# Azure Key Vault SDK for Python

Secure storage and management for secrets, cryptographic keys, and certificates.

## Installation

```bash
# Secrets
pip install azure-keyvault-secrets azure-identity

# Keys (cryptographic operations)
pip install azure-keyvault-keys azure-identity

# Certificates
pip install azure-keyvault-certificates azure-identity

# All
pip install azure-keyvault-secrets azure-keyvault-keys azure-keyvault-certificates azure-identity
```

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net/
```

## Secrets

### SecretClient Setup

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()
vault_url = "https://<vault-name>.vault.azure.net/"

client = SecretClient(vault_url=vault_url, credential=credential)
```

### Secret Operations

```python
# Set secret
secret = client.set_secret("database-password", "super-secret-value")
print(f"Created: {secret.name}, version: {secret.properties.version}")

# Get secret
secret = client.get_secret("database-password")
print(f"Value: {secret.value}")

# Get specific version
secret = client.get_secret("database-password", version="abc123")

# List secrets (names only, not values)
for secret_properties in client.list_properties_of_secrets():
    print(f"Secret: {secret_properties.name}")

# List versions
for version in client.list_properties_of_secret_versions("database-password"):
    print(f"Version: {version.version}, Created: {version.created_on}")

# Delete secret (soft delete)
poller = client.begin_delete_secret("database-password")
deleted_secret = poller.result()

# Purge (permanent delete, if soft-delete enabled)
client.purge_deleted_secret("database-password")

# Recover deleted secret
client.begin_recover_deleted_secret("database-password").result()
```

## Keys

### KeyClient Setup

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

credential = DefaultAzureCredential()
vault_url = "https://<vault-name>.vault.azure.net/"

client = KeyClient(vault_url=vault_url, credential=credential)
```

### Key Operations

```python
# Create RSA key
rsa_key = client.create_rsa_key("rsa-key", size=2048)

# Create EC key
ec_key = client.create_ec_key("ec-key", curve="P-256")

# Get key
key = client.get_key("rsa-key")
print(f"Key type: {key.key_type}")

# List keys
for key_properties in client.list_properties_of_keys():
    print(f"Key: {key_properties.name}")

# Delete key
poller = client.begin_delete_key("rsa-key")
deleted_key = poller.result()
```

### Cryptographic Operations

```python
from azure.keyvault.keys.crypto import CryptographyClient, EncryptionAlgorithm

# Get crypto client for a specific key
crypto_client = CryptographyClient(key, credential=credential)
# Or from key ID
crypto_client = CryptographyClient(
    "https://<vault>.vault.azure.net/keys/<key-name>/<version>",
    credential=credential
)

# Encrypt
plaintext = b"Hello, Key Vault!"
result = crypto_client.encrypt(EncryptionAlgorithm.rsa_oaep, plaintext)
ciphertext = result.ciphertext

# Decrypt
result = crypto_client.decrypt(EncryptionAlgorithm.rsa_oaep, ciphertext)
decrypted = result.plaintext

# Sign
from azure.keyvault.keys.crypto import SignatureAlgorithm
import hashlib

digest = hashlib.sha256(b"data to sign").digest()
result = crypto_client.sign(SignatureAlgorithm.rs256, digest)
signature = result.signature

# Verify
result = crypto_client.verify(SignatureAlgorithm.rs256, digest, signature)
print(f"Valid: {result.is_valid}")
```

## Certificates

### CertificateClient Setup

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.certificates import CertificateClient, CertificatePolicy

credential = DefaultAzureCredential()
vault_url = "https://<vault-name>.vault.azure.net/"

client = CertificateClient(vault_url=vault_url, credential=credential)
```

### Certificate Operations

```python
# Create self-signed certificate
policy = CertificatePolicy.get_default()
poller = client.begin_create_certificate("my-cert", policy=policy)
certificate = poller.result()

# Get certificate
certificate = client.get_certificate("my-cert")
print(f"Thumbprint: {certificate.properties.x509_thumbprint.hex()}")

# Get certificate with private key (as secret)
from azure.keyvault.secrets import SecretClient
secret_client = SecretClient(vault_url=vault_url, credential=credential)
cert_secret = secret_client.get_secret("my-cert")
# cert_secret.value contains PEM or PKCS12

# List certificates
for cert in client.list_properties_of_certificates():
    print(f"Certificate: {cert.name}")

# Delete certificate
poller = client.begin_delete_certificate("my-cert")
deleted = poller.result()
```

## Client Types Table

| Client | Package | Purpose |
|--------|---------|---------|
| `SecretClient` | `azure-keyvault-secrets` | Store/retrieve secrets |
| `KeyClient` | `azure-keyvault-keys` | Manage cryptographic keys |
| `CryptographyClient` | `azure-keyvault-keys` | Encrypt/decrypt/sign/verify |
| `CertificateClient` | `azure-keyvault-certificates` | Manage certificates |

## Async Clients

```python
from azure.identity.aio import DefaultAzureCredential
from azure.keyvault.secrets.aio import SecretClient

async def get_secret():
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=vault_url, credential=credential)

    async with client:
        secret = await client.get_secret("my-secret")
        print(secret.value)

import asyncio
asyncio.run(get_secret())
```

## Error Handling

```python
from azure.core.exceptions import ResourceNotFoundError, HttpResponseError

try:
    secret = client.get_secret("nonexistent")
except ResourceNotFoundError:
    print("Secret not found")
except HttpResponseError as e:
    if e.status_code == 403:
        print("Access denied - check RBAC permissions")
    raise
```

## Best Practices

1. **Use DefaultAzureCredential** for authentication
2. **Use managed identity** in Azure-hosted applications
3. **Enable soft-delete** for recovery (enabled by default)
4. **Use RBAC** over access policies for fine-grained control
5. **Rotate secrets** regularly using versioning
6. **Use Key Vault references** in App Service/Functions config
7. **Cache secrets** appropriately to reduce API calls
8. **Use async clients** for high-throughput scenarios
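Practice 7 above (cache secrets) can be sketched as a small TTL wrapper around any object with a `get_secret(name)` method. `CachedSecretReader` is a hypothetical helper, not part of `azure-keyvault-secrets`:

```python
import time

class CachedSecretReader:
    """Cache secret values in-process with a TTL to cut Key Vault round trips."""

    def __init__(self, client, ttl_seconds=300.0, clock=time.monotonic):
        self._client = client
        self._ttl = ttl_seconds
        self._clock = clock          # injectable for testing
        self._cache = {}             # name -> (value, fetched_at)

    def get(self, name):
        entry = self._cache.get(name)
        now = self._clock()
        if entry is not None and now - entry[1] < self._ttl:
            return entry[0]          # still fresh, serve from cache
        value = self._client.get_secret(name).value
        self._cache[name] = (value, now)
        return value
```

Usage would look like `reader = CachedSecretReader(client); reader.get("database-password")`. Keep the TTL short enough that rotated secrets are picked up promptly.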
142
skills/azure-keyvault-secrets-rust/SKILL.md
Normal file
@@ -0,0 +1,142 @@
---
name: azure-keyvault-secrets-rust
description: |
  Azure Key Vault Secrets SDK for Rust. Use for storing and retrieving secrets, passwords, and API keys.
  Triggers: "keyvault secrets rust", "SecretClient rust", "get secret rust", "set secret rust".
package: azure_security_keyvault_secrets
---

# Azure Key Vault Secrets SDK for Rust

Client library for Azure Key Vault Secrets — secure storage for passwords, API keys, and other secrets.

## Installation

```sh
cargo add azure_security_keyvault_secrets azure_identity
```

## Environment Variables

```bash
AZURE_KEYVAULT_URL=https://<vault-name>.vault.azure.net/
```

## Authentication

```rust
use azure_identity::DeveloperToolsCredential;
use azure_security_keyvault_secrets::SecretClient;

let credential = DeveloperToolsCredential::new(None)?;
let client = SecretClient::new(
    "https://<vault-name>.vault.azure.net/",
    credential.clone(),
    None,
)?;
```

## Core Operations

### Get Secret

```rust
let secret = client
    .get_secret("secret-name", None)
    .await?
    .into_model()?;

println!("Secret value: {:?}", secret.value);
```

### Set Secret

```rust
use azure_security_keyvault_secrets::models::SetSecretParameters;

let params = SetSecretParameters {
    value: Some("secret-value".into()),
    ..Default::default()
};

let secret = client
    .set_secret("secret-name", params.try_into()?, None)
    .await?
    .into_model()?;
```

### Update Secret Properties

```rust
use azure_security_keyvault_secrets::models::UpdateSecretPropertiesParameters;
use std::collections::HashMap;

let params = UpdateSecretPropertiesParameters {
    content_type: Some("text/plain".into()),
    tags: Some(HashMap::from([("env".into(), "prod".into())])),
    ..Default::default()
};

client
    .update_secret_properties("secret-name", params.try_into()?, None)
    .await?;
```

### Delete Secret

```rust
client.delete_secret("secret-name", None).await?;
```

### List Secrets

```rust
use azure_security_keyvault_secrets::ResourceExt;
use futures::TryStreamExt;

let mut pager = client.list_secret_properties(None)?.into_stream();
while let Some(secret) = pager.try_next().await? {
    let name = secret.resource_id()?.name;
    println!("Secret: {}", name);
}
```
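The `resource_id()?` call above (from `ResourceExt`) is what extracts the name from a secret's full ID. To illustrate the shape being parsed, here is a hand-rolled sketch; `parse_secret_id` is a hypothetical helper, not the SDK's implementation, and assumes IDs of the form `https://<vault>.vault.azure.net/secrets/<name>/<version>`:

```rust
/// Split a Key Vault secret ID into its name and optional version.
/// Returns None if the URL does not contain a `/secrets/` segment.
fn parse_secret_id(id: &str) -> Option<(String, Option<String>)> {
    // Everything after the first "/secrets/" is "<name>" or "<name>/<version>".
    let path = id.splitn(2, "/secrets/").nth(1)?;
    let mut parts = path.split('/');
    let name = parts.next()?.to_string();
    if name.is_empty() {
        return None;
    }
    // A trailing slash yields an empty version; treat that as absent.
    let version = parts.next().filter(|v| !v.is_empty()).map(str::to_string);
    Some((name, version))
}
```

In real code, prefer the SDK's `ResourceExt` trait, which also validates the URL.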
### Get Specific Version

```rust
use azure_security_keyvault_secrets::models::SecretClientGetSecretOptions;

let options = SecretClientGetSecretOptions {
    secret_version: Some("version-id".into()),
    ..Default::default()
};

let secret = client
    .get_secret("secret-name", Some(options))
    .await?
    .into_model()?;
```

## Best Practices

1. **Use Entra ID auth** — `DeveloperToolsCredential` for dev, `ManagedIdentityCredential` for production
2. **Use `into_model()?`** — to deserialize responses
3. **Use `ResourceExt` trait** — for extracting names from IDs
4. **Handle soft delete** — deleted secrets can be recovered within the retention period
5. **Set content type** — helps identify secret format
6. **Use tags** — for organizing and filtering secrets
7. **Version secrets** — new values create new versions automatically

## RBAC Permissions

Assign these Key Vault roles:

- `Key Vault Secrets User` — get and list
- `Key Vault Secrets Officer` — full CRUD

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_security_keyvault_secrets |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/keyvault/azure_security_keyvault_secrets |
| crates.io | https://crates.io/crates/azure_security_keyvault_secrets |
269
skills/azure-keyvault-secrets-ts/SKILL.md
Normal file
@@ -0,0 +1,269 @@
---
name: azure-keyvault-secrets-ts
description: Manage secrets using Azure Key Vault Secrets SDK for JavaScript (@azure/keyvault-secrets). Use when storing and retrieving application secrets or configuration values.
package: @azure/keyvault-secrets
---

# Azure Key Vault Secrets SDK for TypeScript

Manage secrets with Azure Key Vault.

## Installation

```bash
# Secrets SDK
npm install @azure/keyvault-secrets @azure/identity

# Keys SDK (used by the key and cryptography examples below)
npm install @azure/keyvault-keys
```

## Environment Variables

```bash
KEY_VAULT_URL=https://<vault-name>.vault.azure.net
# Or
AZURE_KEYVAULT_NAME=<vault-name>
```

## Authentication

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { KeyClient } from "@azure/keyvault-keys";
import { SecretClient } from "@azure/keyvault-secrets";

const credential = new DefaultAzureCredential();
const vaultUrl = `https://${process.env.AZURE_KEYVAULT_NAME}.vault.azure.net`;

const keyClient = new KeyClient(vaultUrl, credential);
const secretClient = new SecretClient(vaultUrl, credential);
```

## Secrets Operations

### Create/Set Secret

```typescript
const secret = await secretClient.setSecret("MySecret", "secret-value");

// With attributes
const secretWithAttrs = await secretClient.setSecret("MySecret", "value", {
  enabled: true,
  expiresOn: new Date("2025-12-31"),
  contentType: "application/json",
  tags: { environment: "production" }
});
```
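Key Vault object names must be 1-127 characters of alphanumerics and dashes (per the service's currently documented naming rule); checking client-side before `setSecret` turns a service-side 400 into an immediate, clearer error. `isValidSecretName` is a hypothetical helper, not part of the SDK:

```typescript
// Mirror of the Key Vault secret-name rule: 1-127 chars, [0-9a-zA-Z-] only.
function isValidSecretName(name: string): boolean {
  return /^[0-9a-zA-Z-]{1,127}$/.test(name);
}
```

You might guard writes with `if (!isValidSecretName(name)) throw new Error(...)` before calling `secretClient.setSecret(name, value)`.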
### Get Secret

```typescript
// Get latest version
const secret = await secretClient.getSecret("MySecret");
console.log(secret.value);

// Get specific version
const specificSecret = await secretClient.getSecret("MySecret", {
  version: secret.properties.version
});
```

### List Secrets

```typescript
for await (const secretProperties of secretClient.listPropertiesOfSecrets()) {
  console.log(secretProperties.name);
}

// List versions
for await (const version of secretClient.listPropertiesOfSecretVersions("MySecret")) {
  console.log(version.version);
}
```

### Delete Secret

```typescript
// Soft delete
const deletePoller = await secretClient.beginDeleteSecret("MySecret");
await deletePoller.pollUntilDone();

// Purge (permanent)
await secretClient.purgeDeletedSecret("MySecret");

// Recover
const recoverPoller = await secretClient.beginRecoverDeletedSecret("MySecret");
await recoverPoller.pollUntilDone();
```

## Keys Operations

### Create Keys

```typescript
// Generic key
const key = await keyClient.createKey("MyKey", "RSA");

// RSA key with size
const rsaKey = await keyClient.createRsaKey("MyRsaKey", { keySize: 2048 });

// Elliptic Curve key
const ecKey = await keyClient.createEcKey("MyEcKey", { curve: "P-256" });

// With attributes
const keyWithAttrs = await keyClient.createKey("MyKey", "RSA", {
  enabled: true,
  expiresOn: new Date("2025-12-31"),
  tags: { purpose: "encryption" },
  keyOps: ["encrypt", "decrypt", "sign", "verify"]
});
```

### Get Key

```typescript
const key = await keyClient.getKey("MyKey");
console.log(key.name, key.keyType);
```

### List Keys

```typescript
for await (const keyProperties of keyClient.listPropertiesOfKeys()) {
  console.log(keyProperties.name);
}
```

### Rotate Key

```typescript
// Manual rotation
const rotatedKey = await keyClient.rotateKey("MyKey");

// Set rotation policy
await keyClient.updateKeyRotationPolicy("MyKey", {
  lifetimeActions: [{ action: "Rotate", timeBeforeExpiry: "P30D" }],
  expiresIn: "P90D"
});
```

### Delete Key

```typescript
const deletePoller = await keyClient.beginDeleteKey("MyKey");
await deletePoller.pollUntilDone();

// Purge
await keyClient.purgeDeletedKey("MyKey");
```

## Cryptographic Operations

### Create CryptographyClient

```typescript
import { CryptographyClient } from "@azure/keyvault-keys";

// From key object
const cryptoClient = new CryptographyClient(key, credential);

// Or from key ID
const cryptoClientFromId = new CryptographyClient(key.id!, credential);
```

### Encrypt/Decrypt

```typescript
// Encrypt
const encryptResult = await cryptoClient.encrypt({
  algorithm: "RSA-OAEP",
  plaintext: Buffer.from("My secret message")
});

// Decrypt
const decryptResult = await cryptoClient.decrypt({
  algorithm: "RSA-OAEP",
  ciphertext: encryptResult.result
});

console.log(decryptResult.result.toString());
```

### Sign/Verify

```typescript
import { createHash } from "node:crypto";

// Create digest
const hash = createHash("sha256").update("My message").digest();

// Sign
const signResult = await cryptoClient.sign("RS256", hash);

// Verify
const verifyResult = await cryptoClient.verify("RS256", hash, signResult.result);
console.log("Valid:", verifyResult.result);
```

### Wrap/Unwrap Keys

```typescript
// Wrap a key (encrypt it for storage)
const wrapResult = await cryptoClient.wrapKey("RSA-OAEP", Buffer.from("key-material"));

// Unwrap
const unwrapResult = await cryptoClient.unwrapKey("RSA-OAEP", wrapResult.result);
```

## Backup and Restore

```typescript
// Backup
const keyBackup = await keyClient.backupKey("MyKey");
const secretBackup = await secretClient.backupSecret("MySecret");

// Restore (can restore to a different vault)
const restoredKey = await keyClient.restoreKeyBackup(keyBackup!);
const restoredSecret = await secretClient.restoreSecretBackup(secretBackup!);
```

## Key Types

```typescript
import {
  KeyClient,
  KeyVaultKey,
  KeyProperties,
  DeletedKey,
  CryptographyClient,
  KnownEncryptionAlgorithms,
  KnownSignatureAlgorithms
} from "@azure/keyvault-keys";

import {
  SecretClient,
  KeyVaultSecret,
  SecretProperties,
  DeletedSecret
} from "@azure/keyvault-secrets";
```

## Error Handling

```typescript
try {
  const secret = await secretClient.getSecret("NonExistent");
} catch (error: any) {
  if (error.code === "SecretNotFound") {
    console.log("Secret does not exist");
  } else {
    throw error;
  }
}
```

## Best Practices

1. **Use DefaultAzureCredential** - Works across dev and production
2. **Enable soft-delete** - Required for production vaults
3. **Set expiration dates** - On both keys and secrets
4. **Use key rotation policies** - Automate key rotation
5. **Limit key operations** - Only grant needed operations (encrypt, sign, etc.)
6. **Browser not supported** - These SDKs are Node.js only
494
skills/azure-maps-search-dotnet/SKILL.md
Normal file
@@ -0,0 +1,494 @@
|
||||
---
|
||||
name: azure-maps-search-dotnet
|
||||
description: |
|
||||
Azure Maps SDK for .NET. Location-based services including geocoding, routing, rendering, geolocation, and weather. Use for address search, directions, map tiles, IP geolocation, and weather data. Triggers: "Azure Maps", "MapsSearchClient", "MapsRoutingClient", "MapsRenderingClient", "geocoding .NET", "route directions", "map tiles", "geolocation".
|
||||
package: Azure.Maps.Search
|
||||
---
|
||||
|
||||
# Azure Maps (.NET)
|
||||
|
||||
Azure Maps SDK for .NET providing location-based services: geocoding, routing, rendering, geolocation, and weather.
|
||||
|
||||
## Installation
|
||||
|
||||
```bash
|
||||
# Search (geocoding, reverse geocoding)
|
||||
dotnet add package Azure.Maps.Search --prerelease
|
||||
|
||||
# Routing (directions, route matrix)
|
||||
dotnet add package Azure.Maps.Routing --prerelease
|
||||
|
||||
# Rendering (map tiles, static images)
|
||||
dotnet add package Azure.Maps.Rendering --prerelease
|
||||
|
||||
# Geolocation (IP to location)
|
||||
dotnet add package Azure.Maps.Geolocation --prerelease
|
||||
|
||||
# Weather
|
||||
dotnet add package Azure.Maps.Weather --prerelease
|
||||
|
||||
# Resource Management (account management, SAS tokens)
|
||||
dotnet add package Azure.ResourceManager.Maps --prerelease
|
||||
|
||||
# Required for authentication
|
||||
dotnet add package Azure.Identity
|
||||
```
|
||||
|
||||
**Current Versions**:
|
||||
- `Azure.Maps.Search`: v2.0.0-beta.5
|
||||
- `Azure.Maps.Routing`: v1.0.0-beta.4
|
||||
- `Azure.Maps.Rendering`: v2.0.0-beta.1
|
||||
- `Azure.Maps.Geolocation`: v1.0.0-beta.3
|
||||
- `Azure.ResourceManager.Maps`: v1.1.0-beta.2
|
||||
|
||||
## Environment Variables
|
||||
|
||||
```bash
|
||||
AZURE_MAPS_SUBSCRIPTION_KEY=<your-subscription-key>
|
||||
AZURE_MAPS_CLIENT_ID=<your-client-id> # For Entra ID auth
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### Subscription Key (Shared Key)
|
||||
|
||||
```csharp
|
||||
using Azure;
|
||||
using Azure.Maps.Search;
|
||||
|
||||
var subscriptionKey = Environment.GetEnvironmentVariable("AZURE_MAPS_SUBSCRIPTION_KEY");
|
||||
var credential = new AzureKeyCredential(subscriptionKey);
|
||||
|
||||
var client = new MapsSearchClient(credential);
|
||||
```
|
||||
|
||||
### Microsoft Entra ID (Recommended for Production)
|
||||
|
||||
```csharp
|
||||
using Azure.Identity;
|
||||
using Azure.Maps.Search;
|
||||
|
||||
var credential = new DefaultAzureCredential();
|
||||
var clientId = Environment.GetEnvironmentVariable("AZURE_MAPS_CLIENT_ID");
|
||||
|
||||
var client = new MapsSearchClient(credential, clientId);
|
||||
```
|
||||
|
||||
### Shared Access Signature (SAS)
|
||||
|
||||
```csharp
|
||||
using Azure;
|
||||
using Azure.Core;
|
||||
using Azure.Identity;
|
||||
using Azure.ResourceManager;
|
||||
using Azure.ResourceManager.Maps;
|
||||
using Azure.ResourceManager.Maps.Models;
|
||||
using Azure.Maps.Search;
|
||||
|
||||
// Authenticate with Azure Resource Manager
|
||||
ArmClient armClient = new ArmClient(new DefaultAzureCredential());
|
||||
|
||||
// Get Maps account resource
|
||||
ResourceIdentifier mapsAccountResourceId = MapsAccountResource.CreateResourceIdentifier(
|
||||
subscriptionId, resourceGroupName, accountName);
|
||||
MapsAccountResource mapsAccount = armClient.GetMapsAccountResource(mapsAccountResourceId);
|
||||
|
||||
// Generate SAS token
|
||||
MapsAccountSasContent sasContent = new MapsAccountSasContent(
|
||||
MapsSigningKey.PrimaryKey,
|
||||
principalId,
|
||||
maxRatePerSecond: 500,
|
||||
start: DateTime.UtcNow.ToString("O"),
|
||||
expiry: DateTime.UtcNow.AddDays(1).ToString("O"));
|
||||
|
||||
Response<MapsAccountSasToken> sas = mapsAccount.GetSas(sasContent);
|
||||
|
||||
// Create client with SAS token
|
||||
var sasCredential = new AzureSasCredential(sas.Value.AccountSasToken);
|
||||
var client = new MapsSearchClient(sasCredential);
|
||||
```
|
||||
|
||||
## Client Hierarchy
|
||||
|
||||
```
|
||||
Azure.Maps.Search
|
||||
└── MapsSearchClient
|
||||
├── GetGeocoding() → Geocode addresses
|
||||
├── GetGeocodingBatch() → Batch geocoding
|
||||
├── GetReverseGeocoding() → Coordinates to address
|
||||
├── GetReverseGeocodingBatch() → Batch reverse geocoding
|
||||
└── GetPolygon() → Get boundary polygons
|
||||
|
||||
Azure.Maps.Routing
|
||||
└── MapsRoutingClient
|
||||
├── GetDirections() → Route directions
|
||||
├── GetImmediateRouteMatrix() → Route matrix (sync, ≤100)
|
||||
├── GetRouteMatrix() → Route matrix (async, ≤700)
|
||||
└── GetRouteRange() → Isochrone/reachable range
|
||||
|
||||
Azure.Maps.Rendering
|
||||
└── MapsRenderingClient
|
||||
├── GetMapTile() → Map tiles
|
||||
├── GetMapStaticImage() → Static map images
|
||||
└── GetCopyrightCaption() → Copyright info
|
||||
|
||||
Azure.Maps.Geolocation
|
||||
└── MapsGeolocationClient
|
||||
└── GetCountryCode() → IP to country/region
|
||||
|
||||
Azure.Maps.Weather
|
||||
└── MapsWeatherClient
|
||||
├── GetCurrentWeatherConditions() → Current weather
|
||||
├── GetDailyForecast() → Daily forecast
|
||||
├── GetHourlyForecast() → Hourly forecast
|
||||
└── GetSevereWeatherAlerts() → Weather alerts
|
||||
```
|
||||
|
||||
## Core Workflows
|
||||
|
||||
### 1. Geocoding (Address to Coordinates)
|
||||
|
||||
```csharp
|
||||
using Azure;
|
||||
using Azure.Maps.Search;
|
||||
|
||||
var credential = new AzureKeyCredential(subscriptionKey);
|
||||
var client = new MapsSearchClient(credential);
|
||||
|
||||
Response<GeocodingResponse> result = client.GetGeocoding("1 Microsoft Way, Redmond, WA 98052");
|
||||
|
||||
foreach (var feature in result.Value.Features)
|
||||
{
|
||||
Console.WriteLine($"Coordinates: {string.Join(",", feature.Geometry.Coordinates)}");
|
||||
Console.WriteLine($"Address: {feature.Properties.Address.FormattedAddress}");
|
||||
Console.WriteLine($"Confidence: {feature.Properties.Confidence}");
|
||||
}
|
||||
```
|
||||
|
||||
### 2. Batch Geocoding
|
||||
|
||||
```csharp
|
||||
using Azure.Maps.Search.Models.Queries;
|
||||
|
||||
List<GeocodingQuery> queries = new List<GeocodingQuery>
|
||||
{
|
||||
new GeocodingQuery() { Query = "400 Broad St, Seattle, WA" },
|
||||
new GeocodingQuery() { Query = "1 Microsoft Way, Redmond, WA" },
|
||||
new GeocodingQuery() { AddressLine = "Space Needle", Top = 1 },
|
||||
};
|
||||
|
||||
Response<GeocodingBatchResponse> results = client.GetGeocodingBatch(queries);
|
||||
|
||||
foreach (var batchItem in results.Value.BatchItems)
|
||||
{
|
||||
foreach (var feature in batchItem.Features)
|
||||
{
|
||||
Console.WriteLine($"Coordinates: {string.Join(",", feature.Geometry.Coordinates)}");
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 3. Reverse Geocoding (Coordinates to Address)
|
||||
|
||||
```csharp
|
||||
using Azure.Core.GeoJson;
|
||||
|
||||
GeoPosition coordinates = new GeoPosition(-122.138685, 47.6305637);
|
||||
Response<GeocodingResponse> result = client.GetReverseGeocoding(coordinates);
|
||||
|
||||
foreach (var feature in result.Value.Features)
|
||||
{
|
||||
Console.WriteLine($"Address: {feature.Properties.Address.FormattedAddress}");
|
||||
Console.WriteLine($"Locality: {feature.Properties.Address.Locality}");
|
||||
}
|
||||
```
|
||||
|
||||
### 4. Get Boundary Polygon
|
||||
|
||||
```csharp
|
||||
using Azure.Maps.Search.Models;
|
||||
|
||||
GetPolygonOptions options = new GetPolygonOptions()
|
||||
{
|
||||
Coordinates = new GeoPosition(-122.204141, 47.61256),
|
||||
ResultType = BoundaryResultTypeEnum.Locality,
|
||||
Resolution = ResolutionEnum.Small,
|
||||
};
|
||||
|
||||
Response<Boundary> result = client.GetPolygon(options);
|
||||
|
||||
Console.WriteLine($"Boundary copyright: {result.Value.Properties?.Copyright}");
|
||||
Console.WriteLine($"Polygon count: {result.Value.Geometry.Count}");
|
||||
```
|
||||
|
||||
### 5. Route Directions

```csharp
using Azure;
using Azure.Core.GeoJson;
using Azure.Maps.Routing;
using Azure.Maps.Routing.Models;

var client = new MapsRoutingClient(new AzureKeyCredential(subscriptionKey));

List<GeoPosition> routePoints = new List<GeoPosition>()
{
    new GeoPosition(-122.34, 47.61), // Seattle
    new GeoPosition(-122.13, 47.64)  // Redmond
};

RouteDirectionQuery query = new RouteDirectionQuery(routePoints);
Response<RouteDirections> result = client.GetDirections(query);

foreach (var route in result.Value.Routes)
{
    Console.WriteLine($"Distance: {route.Summary.LengthInMeters} meters");
    Console.WriteLine($"Duration: {route.Summary.TravelTimeDuration}");

    foreach (RouteLeg leg in route.Legs)
    {
        Console.WriteLine($"Leg points: {leg.Points.Count}");
    }
}
```

### 6. Route Directions with Options

```csharp
RouteDirectionOptions options = new RouteDirectionOptions()
{
    RouteType = RouteType.Fastest,
    UseTrafficData = true,
    TravelMode = TravelMode.Bicycle,
    Language = RoutingLanguage.EnglishUsa,
    InstructionsType = RouteInstructionsType.Text,
};

RouteDirectionQuery query = new RouteDirectionQuery(routePoints)
{
    RouteDirectionOptions = options
};

Response<RouteDirections> result = client.GetDirections(query);
```

### 7. Route Matrix

```csharp
RouteMatrixQuery routeMatrixQuery = new RouteMatrixQuery
{
    Origins = new List<GeoPosition>()
    {
        new GeoPosition(-122.34, 47.61),
        new GeoPosition(-122.13, 47.64)
    },
    Destinations = new List<GeoPosition>()
    {
        new GeoPosition(-122.20, 47.62),
        new GeoPosition(-122.40, 47.65)
    },
};

// Synchronous (up to 100 route combinations)
Response<RouteMatrixResult> result = client.GetImmediateRouteMatrix(routeMatrixQuery);

foreach (var cell in result.Value.Matrix.SelectMany(row => row))
{
    Console.WriteLine($"Distance: {cell.Response?.RouteSummary?.LengthInMeters}");
    Console.WriteLine($"Duration: {cell.Response?.RouteSummary?.TravelTimeDuration}");
}

// Asynchronous (up to 700 route combinations)
RouteMatrixOptions routeMatrixOptions = new RouteMatrixOptions(routeMatrixQuery)
{
    TravelTimeType = TravelTimeType.All,
};
GetRouteMatrixOperation asyncResult = client.GetRouteMatrix(WaitUntil.Completed, routeMatrixOptions);
```

### 8. Route Range (Isochrone)

```csharp
RouteRangeOptions options = new RouteRangeOptions(-122.34, 47.61)
{
    TimeBudget = new TimeSpan(0, 20, 0) // 20 minutes
};

Response<RouteRangeResult> result = client.GetRouteRange(options);

// result.Value.ReachableRange contains the polygon
Console.WriteLine($"Boundary points: {result.Value.ReachableRange.Boundary.Count}");
```

### 9. Get Map Tiles

```csharp
using Azure;
using Azure.Maps.Rendering;

var client = new MapsRenderingClient(new AzureKeyCredential(subscriptionKey));

int zoom = 10;
int tileSize = 256;

// Convert coordinates to tile index
MapTileIndex tileIndex = MapsRenderingClient.PositionToTileXY(
    new GeoPosition(13.3854, 52.517), zoom, tileSize);

// Fetch map tile
GetMapTileOptions options = new GetMapTileOptions(
    MapTileSetId.MicrosoftImagery,
    new MapTileIndex(tileIndex.X, tileIndex.Y, zoom)
);

Response<Stream> mapTile = client.GetMapTile(options);

// Save to file
using (FileStream fileStream = File.Create("./MapTile.png"))
{
    mapTile.Value.CopyTo(fileStream);
}
```

### 10. IP Geolocation

```csharp
using System.Net;
using Azure;
using Azure.Maps.Geolocation;

var client = new MapsGeolocationClient(new AzureKeyCredential(subscriptionKey));

IPAddress ipAddress = IPAddress.Parse("2001:4898:80e8:b::189");
Response<CountryRegionResult> result = client.GetCountryCode(ipAddress);

Console.WriteLine($"Country ISO Code: {result.Value.IsoCode}");
```

### 11. Current Weather

```csharp
using Azure;
using Azure.Core.GeoJson;
using Azure.Maps.Weather;

var client = new MapsWeatherClient(new AzureKeyCredential(subscriptionKey));

var position = new GeoPosition(-122.13071, 47.64011);
var options = new GetCurrentWeatherConditionsOptions(position);

Response<CurrentConditionsResult> result = client.GetCurrentWeatherConditions(options);

foreach (var condition in result.Value.Results)
{
    Console.WriteLine($"Temperature: {condition.Temperature.Value} {condition.Temperature.Unit}");
    Console.WriteLine($"Weather: {condition.Phrase}");
    Console.WriteLine($"Humidity: {condition.RelativeHumidity}%");
}
```

## Key Types Reference

### Search Package

| Type | Purpose |
|------|---------|
| `MapsSearchClient` | Main client for search operations |
| `GeocodingResponse` | Geocoding result |
| `GeocodingBatchResponse` | Batch geocoding result |
| `GeocodingQuery` | Query for batch geocoding |
| `ReverseGeocodingQuery` | Query for batch reverse geocoding |
| `GetPolygonOptions` | Options for polygon retrieval |
| `Boundary` | Boundary polygon result |
| `BoundaryResultTypeEnum` | Boundary type (Locality, AdminDistrict, etc.) |
| `ResolutionEnum` | Polygon resolution (Small, Medium, Large) |

### Routing Package

| Type | Purpose |
|------|---------|
| `MapsRoutingClient` | Main client for routing operations |
| `RouteDirectionQuery` | Query for route directions |
| `RouteDirectionOptions` | Route calculation options |
| `RouteDirections` | Route directions result |
| `RouteLeg` | Segment of a route |
| `RouteMatrixQuery` | Query for route matrix |
| `RouteMatrixResult` | Route matrix result |
| `RouteRangeOptions` | Options for isochrone |
| `RouteRangeResult` | Isochrone result |
| `RouteType` | Route type (Fastest, Shortest, Eco, Thrilling) |
| `TravelMode` | Travel mode (Car, Truck, Bicycle, Pedestrian) |

### Rendering Package

| Type | Purpose |
|------|---------|
| `MapsRenderingClient` | Main client for rendering |
| `GetMapTileOptions` | Map tile options |
| `MapTileIndex` | Tile coordinates (X, Y, Zoom) |
| `MapTileSetId` | Tile set identifier |

### Common Types

| Type | Purpose |
|------|---------|
| `GeoPosition` | Geographic position (longitude, latitude) |
| `GeoBoundingBox` | Bounding box for geographic area |

## Best Practices

1. **Use Entra ID for production** — Prefer over subscription keys
2. **Batch operations** — Use batch geocoding for multiple addresses
3. **Cache results** — Geocoding results don't change frequently
4. **Use appropriate tile sizes** — 256 or 512 pixels based on display
5. **Handle rate limits** — Implement exponential backoff
6. **Use async route matrix** — For large matrix calculations (>100)
7. **Consider traffic data** — Set `UseTrafficData = true` for accurate ETAs

## Error Handling

```csharp
try
{
    Response<GeocodingResponse> result = client.GetGeocoding(address);
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Status: {ex.Status}");
    Console.WriteLine($"Error: {ex.Message}");

    switch (ex.Status)
    {
        case 400:
            // Invalid request parameters
            break;
        case 401:
            // Authentication failed
            break;
        case 429:
            // Rate limited - implement backoff
            break;
    }
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.Maps.Search` | Geocoding, search | `dotnet add package Azure.Maps.Search --prerelease` |
| `Azure.Maps.Routing` | Directions, matrix | `dotnet add package Azure.Maps.Routing --prerelease` |
| `Azure.Maps.Rendering` | Map tiles, images | `dotnet add package Azure.Maps.Rendering --prerelease` |
| `Azure.Maps.Geolocation` | IP geolocation | `dotnet add package Azure.Maps.Geolocation --prerelease` |
| `Azure.Maps.Weather` | Weather data | `dotnet add package Azure.Maps.Weather --prerelease` |
| `Azure.ResourceManager.Maps` | Account management | `dotnet add package Azure.ResourceManager.Maps --prerelease` |

## Reference Links

| Resource | URL |
|----------|-----|
| Azure Maps Documentation | https://learn.microsoft.com/azure/azure-maps/ |
| Search API Reference | https://learn.microsoft.com/dotnet/api/azure.maps.search |
| Routing API Reference | https://learn.microsoft.com/dotnet/api/azure.maps.routing |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/maps |
| Pricing | https://azure.microsoft.com/pricing/details/azure-maps/ |

302 skills/azure-messaging-webpubsub-java/SKILL.md Normal file
@@ -0,0 +1,302 @@

---
name: azure-messaging-webpubsub-java
description: Build real-time web applications with Azure Web PubSub SDK for Java. Use when implementing WebSocket-based messaging, live updates, chat applications, or server-to-client push notifications.
package: com.azure:azure-messaging-webpubsub
---

# Azure Web PubSub SDK for Java

Build real-time web applications using the Azure Web PubSub SDK for Java.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-messaging-webpubsub</artifactId>
    <version>1.5.0</version>
</dependency>
```

## Client Creation

### With Connection String

```java
import com.azure.messaging.webpubsub.WebPubSubServiceClient;
import com.azure.messaging.webpubsub.WebPubSubServiceClientBuilder;

WebPubSubServiceClient client = new WebPubSubServiceClientBuilder()
    .connectionString("<connection-string>")
    .hub("chat")
    .buildClient();
```

### With Access Key

```java
import com.azure.core.credential.AzureKeyCredential;

WebPubSubServiceClient client = new WebPubSubServiceClientBuilder()
    .credential(new AzureKeyCredential("<access-key>"))
    .endpoint("<endpoint>")
    .hub("chat")
    .buildClient();
```

### With DefaultAzureCredential

```java
import com.azure.identity.DefaultAzureCredentialBuilder;

WebPubSubServiceClient client = new WebPubSubServiceClientBuilder()
    .credential(new DefaultAzureCredentialBuilder().build())
    .endpoint("<endpoint>")
    .hub("chat")
    .buildClient();
```

### Async Client

```java
import com.azure.messaging.webpubsub.WebPubSubServiceAsyncClient;

WebPubSubServiceAsyncClient asyncClient = new WebPubSubServiceClientBuilder()
    .connectionString("<connection-string>")
    .hub("chat")
    .buildAsyncClient();
```

## Key Concepts

- **Hub**: Logical isolation unit for connections
- **Group**: Subset of connections within a hub
- **Connection**: Individual WebSocket client connection
- **User**: Entity that can have multiple connections

## Core Patterns

### Send to All Connections

```java
import com.azure.messaging.webpubsub.models.WebPubSubContentType;

// Send text message
client.sendToAll("Hello everyone!", WebPubSubContentType.TEXT_PLAIN);

// Send JSON
String jsonMessage = "{\"type\": \"notification\", \"message\": \"New update!\"}";
client.sendToAll(jsonMessage, WebPubSubContentType.APPLICATION_JSON);
```

### Send to All with Filter

```java
import com.azure.core.http.rest.RequestOptions;
import com.azure.core.util.BinaryData;

BinaryData message = BinaryData.fromString("Hello filtered users!");

// Filter by userId
client.sendToAllWithResponse(
    message,
    WebPubSubContentType.TEXT_PLAIN,
    message.getLength(),
    new RequestOptions().addQueryParam("filter", "userId ne 'user1'"));

// Filter by groups
client.sendToAllWithResponse(
    message,
    WebPubSubContentType.TEXT_PLAIN,
    message.getLength(),
    new RequestOptions().addQueryParam("filter", "'GroupA' in groups and not('GroupB' in groups)"));
```

### Send to Group

```java
// Send to all connections in a group
client.sendToGroup("java-developers", "Hello Java devs!", WebPubSubContentType.TEXT_PLAIN);

// Send JSON to group
String json = "{\"event\": \"update\", \"data\": {\"version\": \"2.0\"}}";
client.sendToGroup("subscribers", json, WebPubSubContentType.APPLICATION_JSON);
```

### Send to Specific Connection

```java
// Send to a specific connection by ID
client.sendToConnection("connectionId123", "Private message", WebPubSubContentType.TEXT_PLAIN);
```

### Send to User

```java
// Send to all connections for a specific user
client.sendToUser("andy", "Hello Andy!", WebPubSubContentType.TEXT_PLAIN);
```

### Manage Groups

```java
// Add connection to group
client.addConnectionToGroup("premium-users", "connectionId123");

// Remove connection from group
client.removeConnectionFromGroup("premium-users", "connectionId123");

// Add user to group (all their connections)
client.addUserToGroup("admin-group", "userId456");

// Remove user from group
client.removeUserFromGroup("admin-group", "userId456");

// Check if user is in group
boolean exists = client.userExistsInGroup("admin-group", "userId456");
```

### Manage Connections

```java
// Check if connection exists
boolean connected = client.connectionExists("connectionId123");

// Close a connection
client.closeConnection("connectionId123");

// Close with reason
client.closeConnection("connectionId123", "Session expired");

// Check if user exists (has any connections)
boolean userOnline = client.userExists("userId456");

// Close all connections for a user
client.closeUserConnections("userId456");

// Close all connections in a group
client.closeGroupConnections("inactive-group");
```

### Generate Client Access Token

```java
import java.time.Duration;

import com.azure.messaging.webpubsub.models.GetClientAccessTokenOptions;
import com.azure.messaging.webpubsub.models.WebPubSubClientAccessToken;

// Basic token
WebPubSubClientAccessToken token = client.getClientAccessToken(
    new GetClientAccessTokenOptions());
System.out.println("URL: " + token.getUrl());

// With user ID
WebPubSubClientAccessToken userToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions().setUserId("user123"));

// With roles (permissions)
WebPubSubClientAccessToken roleToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions()
        .setUserId("user123")
        .addRole("webpubsub.joinLeaveGroup")
        .addRole("webpubsub.sendToGroup"));

// With groups to join on connect
WebPubSubClientAccessToken groupToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions()
        .setUserId("user123")
        .addGroup("announcements")
        .addGroup("updates"));

// With custom expiration
WebPubSubClientAccessToken expToken = client.getClientAccessToken(
    new GetClientAccessTokenOptions()
        .setUserId("user123")
        .setExpiresAfter(Duration.ofHours(2)));
```

### Grant/Revoke Permissions

```java
import com.azure.core.http.rest.RequestOptions;
import com.azure.messaging.webpubsub.models.WebPubSubPermission;

// Grant permission to send to a group
client.grantPermission(
    WebPubSubPermission.SEND_TO_GROUP,
    "connectionId123",
    new RequestOptions().addQueryParam("targetName", "chat-room"));

// Revoke permission
client.revokePermission(
    WebPubSubPermission.SEND_TO_GROUP,
    "connectionId123",
    new RequestOptions().addQueryParam("targetName", "chat-room"));

// Check permission
boolean hasPermission = client.checkPermission(
    WebPubSubPermission.SEND_TO_GROUP,
    "connectionId123",
    new RequestOptions().addQueryParam("targetName", "chat-room"));
```

### Async Operations

```java
asyncClient.sendToAll("Async message!", WebPubSubContentType.TEXT_PLAIN)
    .subscribe(
        unused -> System.out.println("Message sent"),
        error -> System.err.println("Error: " + error.getMessage())
    );

asyncClient.sendToGroup("developers", "Group message", WebPubSubContentType.TEXT_PLAIN)
    .doOnSuccess(v -> System.out.println("Sent to group"))
    .doOnError(e -> System.err.println("Failed: " + e))
    .subscribe();
```

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.sendToConnection("invalid-id", "test", WebPubSubContentType.TEXT_PLAIN);
} catch (HttpResponseException e) {
    System.out.println("Status: " + e.getResponse().getStatusCode());
    System.out.println("Error: " + e.getMessage());
}
```

## Environment Variables

```bash
WEB_PUBSUB_CONNECTION_STRING=Endpoint=https://<resource>.webpubsub.azure.com;AccessKey=...
WEB_PUBSUB_ENDPOINT=https://<resource>.webpubsub.azure.com
WEB_PUBSUB_ACCESS_KEY=<your-access-key>
```

## Client Roles

| Role | Permission |
|------|------------|
| `webpubsub.joinLeaveGroup` | Join/leave any group |
| `webpubsub.sendToGroup` | Send to any group |
| `webpubsub.joinLeaveGroup.<group>` | Join/leave specific group |
| `webpubsub.sendToGroup.<group>` | Send to specific group |

## Best Practices

1. **Use Groups**: Organize connections into groups for targeted messaging
2. **User IDs**: Associate connections with user IDs for user-level messaging
3. **Token Expiration**: Set appropriate token expiration for security
4. **Roles**: Grant minimal required permissions via roles
5. **Hub Isolation**: Use separate hubs for different application features
6. **Connection Management**: Clean up inactive connections

## Trigger Phrases

- "Web PubSub Java"
- "WebSocket messaging Azure"
- "real-time push notifications"
- "server-sent events"
- "chat application backend"
- "live updates broadcasting"

245 skills/azure-messaging-webpubsubservice-py/SKILL.md Normal file
@@ -0,0 +1,245 @@

---
name: azure-messaging-webpubsubservice-py
description: |
  Azure Web PubSub Service SDK for Python. Use for real-time messaging, WebSocket connections, and pub/sub patterns.
  Triggers: "azure-messaging-webpubsubservice", "WebPubSubServiceClient", "real-time", "WebSocket", "pub/sub".
package: azure-messaging-webpubsubservice
---

# Azure Web PubSub Service SDK for Python

Real-time messaging with WebSocket connections at scale.

## Installation

```bash
# Service SDK (server-side)
pip install azure-messaging-webpubsubservice

# Client SDK (for Python WebSocket clients)
pip install azure-messaging-webpubsubclient
```

## Environment Variables

```bash
AZURE_WEBPUBSUB_CONNECTION_STRING=Endpoint=https://<name>.webpubsub.azure.com;AccessKey=...
AZURE_WEBPUBSUB_HUB=my-hub
```

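The connection string packs the endpoint and access key into one `Key=Value;Key=Value` string. When the pieces are needed separately (for example, to pass `endpoint=` explicitly to the client below), a minimal parser sketch — the format is assumed from the variable above, and `parse_connection_string` is an illustrative helper, not an SDK function:

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split 'Key=Value;Key=Value' pairs; values may themselves contain '='."""
    parts = {}
    for segment in conn_str.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")  # split on first '=' only
        parts[key] = value
    return parts

conn = parse_connection_string(
    "Endpoint=https://demo.webpubsub.azure.com;AccessKey=abc123"
)
print(conn["Endpoint"])   # https://demo.webpubsub.azure.com
print(conn["AccessKey"])  # abc123
```

Using `partition` rather than `split("=")` keeps base64 access keys (which often end in `=`) intact.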
## Service Client (Server-Side)

### Authentication

```python
import os

from azure.messaging.webpubsubservice import WebPubSubServiceClient

# Connection string
client = WebPubSubServiceClient.from_connection_string(
    connection_string=os.environ["AZURE_WEBPUBSUB_CONNECTION_STRING"],
    hub="my-hub"
)

# Entra ID
from azure.identity import DefaultAzureCredential

client = WebPubSubServiceClient(
    endpoint="https://<name>.webpubsub.azure.com",
    hub="my-hub",
    credential=DefaultAzureCredential()
)
```

### Generate Client Access Token

```python
# Token for anonymous user
token = client.get_client_access_token()
print(f"URL: {token['url']}")

# Token with user ID
token = client.get_client_access_token(
    user_id="user123",
    roles=["webpubsub.sendToGroup", "webpubsub.joinLeaveGroup"]
)

# Token with groups
token = client.get_client_access_token(
    user_id="user123",
    groups=["group1", "group2"]
)
```

### Send to All Clients

```python
# Send text
client.send_to_all(message="Hello everyone!", content_type="text/plain")

# Send JSON
client.send_to_all(
    message={"type": "notification", "data": "Hello"},
    content_type="application/json"
)
```

### Send to User

```python
client.send_to_user(
    user_id="user123",
    message="Hello user!",
    content_type="text/plain"
)
```

### Send to Group

```python
client.send_to_group(
    group="my-group",
    message="Hello group!",
    content_type="text/plain"
)
```

### Send to Connection

```python
client.send_to_connection(
    connection_id="abc123",
    message="Hello connection!",
    content_type="text/plain"
)
```

### Group Management

```python
# Add user to group
client.add_user_to_group(group="my-group", user_id="user123")

# Remove user from group
client.remove_user_from_group(group="my-group", user_id="user123")

# Add connection to group
client.add_connection_to_group(group="my-group", connection_id="abc123")

# Remove connection from group
client.remove_connection_from_group(group="my-group", connection_id="abc123")
```

### Connection Management

```python
# Check if connection exists
exists = client.connection_exists(connection_id="abc123")

# Check if user has connections
exists = client.user_exists(user_id="user123")

# Check if group has connections
exists = client.group_exists(group="my-group")

# Close connection
client.close_connection(connection_id="abc123", reason="Session ended")

# Close all connections for user
client.close_all_connections(user_id="user123")
```

### Grant/Revoke Permissions

```python
from azure.messaging.webpubsubservice import WebPubSubServiceClient

# Grant permission
client.grant_permission(
    permission="joinLeaveGroup",
    connection_id="abc123",
    target_name="my-group"
)

# Revoke permission
client.revoke_permission(
    permission="joinLeaveGroup",
    connection_id="abc123",
    target_name="my-group"
)

# Check permission
has_permission = client.check_permission(
    permission="joinLeaveGroup",
    connection_id="abc123",
    target_name="my-group"
)
```

## Client SDK (Python WebSocket Client)

```python
from azure.messaging.webpubsubclient import WebPubSubClient

# 'token' comes from the service client shown earlier:
# token = service_client.get_client_access_token()
client = WebPubSubClient(credential=token["url"])

# Event handlers
@client.on("connected")
def on_connected(e):
    print(f"Connected: {e.connection_id}")

@client.on("server-message")
def on_message(e):
    print(f"Message: {e.data}")

@client.on("group-message")
def on_group_message(e):
    print(f"Group {e.group}: {e.data}")

# Connect and send
client.open()
client.send_to_group("my-group", "Hello from Python!")
```

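`client.open()` can fail transiently (network blips, expired tokens), so a client that matters should retry with a capped exponential backoff before giving up. A minimal sketch — `open_with_retry` and `backoff_delays` are illustrative helpers, not part of the SDK:

```python
import time

def backoff_delays(attempts: int, base: float = 1.0, cap: float = 30.0):
    """Capped exponential delays: base * 2**n, never above cap."""
    return [min(base * (2 ** n), cap) for n in range(attempts)]

def open_with_retry(client, attempts: int = 5) -> bool:
    """Try client.open(), sleeping between failures; True on success."""
    for delay in backoff_delays(attempts):
        try:
            client.open()
            return True
        except Exception:
            time.sleep(delay)
    return False

print(backoff_delays(5))  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

In production you would also refresh the access token before reopening, since a rejected token will never succeed on retry alone.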
## Async Service Client

```python
import asyncio

from azure.messaging.webpubsubservice.aio import WebPubSubServiceClient
from azure.identity.aio import DefaultAzureCredential

async def broadcast():
    credential = DefaultAzureCredential()
    client = WebPubSubServiceClient(
        endpoint="https://<name>.webpubsub.azure.com",
        hub="my-hub",
        credential=credential
    )

    await client.send_to_all("Hello async!", content_type="text/plain")

    await client.close()
    await credential.close()

# Run the coroutine
asyncio.run(broadcast())
```

## Client Operations

| Operation | Description |
|-----------|-------------|
| `get_client_access_token` | Generate WebSocket connection URL |
| `send_to_all` | Broadcast to all connections |
| `send_to_user` | Send to specific user |
| `send_to_group` | Send to group members |
| `send_to_connection` | Send to specific connection |
| `add_user_to_group` | Add user to group |
| `remove_user_from_group` | Remove user from group |
| `close_connection` | Disconnect client |
| `connection_exists` | Check connection status |

## Best Practices

1. **Use roles** to limit client permissions
2. **Use groups** for targeted messaging
3. **Generate short-lived tokens** for security
4. **Use user IDs** to send to users across connections
5. **Handle reconnection** in client applications
6. **Use JSON** content type for structured data
7. **Close connections** gracefully with reasons

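Practice 6 works best when every message shares a consistent envelope so receivers can dispatch on a `type` field. A minimal sketch — the envelope shape here is a convention for illustration, not an SDK requirement:

```python
import json

def make_envelope(msg_type: str, data: dict) -> dict:
    """Wrap payloads so every client can dispatch on 'type'."""
    return {"type": msg_type, "data": data}

envelope = make_envelope("chat", {"user": "alice", "text": "hi"})
wire = json.dumps(envelope)

# Receiving side: parse, then dispatch on the type field
received = json.loads(wire)
assert received["type"] == "chat"
print(received["data"]["text"])  # hi
```

On the sending side, a dict like `envelope` can be passed directly with `content_type="application/json"`, as the `send_to_all` example above shows.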
411 skills/azure-mgmt-apicenter-dotnet/SKILL.md Normal file
@@ -0,0 +1,411 @@

---
name: azure-mgmt-apicenter-dotnet
description: |
  Azure API Center SDK for .NET. Centralized API inventory management with governance, versioning, and discovery. Use for creating API services, workspaces, APIs, versions, definitions, environments, deployments, and metadata schemas. Triggers: "API Center", "ApiCenterService", "ApiCenterWorkspace", "ApiCenterApi", "API inventory", "API governance", "API versioning", "API catalog", "API discovery".
package: Azure.ResourceManager.ApiCenter
---

# Azure.ResourceManager.ApiCenter (.NET)

Centralized API inventory and governance SDK for managing APIs across your organization.

## Installation

```bash
dotnet add package Azure.ResourceManager.ApiCenter
dotnet add package Azure.Identity
```

**Current Version**: v1.0.0 (GA)
**API Version**: 2024-03-01

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_APICENTER_SERVICE_NAME=<your-apicenter-service>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ApiCenter;

ArmClient client = new ArmClient(new DefaultAzureCredential());
```

## Resource Hierarchy

```
Subscription
└── ResourceGroup
    └── ApiCenterService              # API inventory service
        ├── Workspace                 # Logical grouping of APIs
        │   ├── Api                   # API definition
        │   │   └── ApiVersion        # Version of the API
        │   │       └── ApiDefinition # OpenAPI/GraphQL/etc specification
        │   ├── Environment           # Deployment target (dev/staging/prod)
        │   └── Deployment            # API deployed to environment
        └── MetadataSchema            # Custom metadata definitions
```

## Core Workflows

### 1. Create API Center Service

```csharp
using Azure;
using Azure.Core;
using Azure.ResourceManager;
using Azure.ResourceManager.ApiCenter;
using Azure.ResourceManager.ApiCenter.Models;
using Azure.ResourceManager.Models;
using Azure.ResourceManager.Resources;

SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

ApiCenterServiceCollection services = resourceGroup.GetApiCenterServices();

ApiCenterServiceData data = new ApiCenterServiceData(AzureLocation.EastUS)
{
    Identity = new ManagedServiceIdentity(ManagedServiceIdentityType.SystemAssigned)
};

ArmOperation<ApiCenterServiceResource> operation = await services
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-api-center", data);

ApiCenterServiceResource service = operation.Value;
```

### 2. Create Workspace

```csharp
ApiCenterWorkspaceCollection workspaces = service.GetApiCenterWorkspaces();

ApiCenterWorkspaceData workspaceData = new ApiCenterWorkspaceData
{
    Title = "Engineering APIs",
    Description = "APIs owned by the engineering team"
};

ArmOperation<ApiCenterWorkspaceResource> operation = await workspaces
    .CreateOrUpdateAsync(WaitUntil.Completed, "engineering", workspaceData);

ApiCenterWorkspaceResource workspace = operation.Value;
```

### 3. Create API

```csharp
ApiCenterApiCollection apis = workspace.GetApiCenterApis();

ApiCenterApiData apiData = new ApiCenterApiData
{
    Title = "Orders API",
    Description = "API for managing customer orders",
    Kind = ApiKind.Rest,
    LifecycleStage = ApiLifecycleStage.Production,
    TermsOfService = new ApiTermsOfService
    {
        Uri = new Uri("https://example.com/terms")
    },
    ExternalDocumentation =
    {
        new ApiExternalDocumentation
        {
            Title = "Documentation",
            Uri = new Uri("https://docs.example.com/orders")
        }
    },
    Contacts =
    {
        new ApiContact
        {
            Name = "API Support",
            Email = "api-support@example.com"
        }
    }
};

// Add custom metadata
apiData.CustomProperties = BinaryData.FromObjectAsJson(new
{
    team = "orders-team",
    costCenter = "CC-1234"
});

ArmOperation<ApiCenterApiResource> operation = await apis
    .CreateOrUpdateAsync(WaitUntil.Completed, "orders-api", apiData);

ApiCenterApiResource api = operation.Value;
```

### 4. Create API Version
|
||||
|
||||
```csharp
|
||||
ApiCenterApiVersionCollection versions = api.GetApiCenterApiVersions();
|
||||
|
||||
ApiCenterApiVersionData versionData = new ApiCenterApiVersionData
|
||||
{
|
||||
Title = "v1.0.0",
|
||||
LifecycleStage = ApiLifecycleStage.Production
|
||||
};
|
||||
|
||||
ArmOperation<ApiCenterApiVersionResource> operation = await versions
|
||||
.CreateOrUpdateAsync(WaitUntil.Completed, "v1-0-0", versionData);
|
||||
|
||||
ApiCenterApiVersionResource version = operation.Value;
|
||||
```

### 5. Create API Definition (Upload OpenAPI Spec)

```csharp
ApiCenterApiDefinitionCollection definitions = version.GetApiCenterApiDefinitions();

ApiCenterApiDefinitionData definitionData = new ApiCenterApiDefinitionData
{
    Title = "OpenAPI Specification",
    Description = "Orders API OpenAPI 3.0 definition"
};

ArmOperation<ApiCenterApiDefinitionResource> operation = await definitions
    .CreateOrUpdateAsync(WaitUntil.Completed, "openapi", definitionData);

ApiCenterApiDefinitionResource definition = operation.Value;

// Import specification
string openApiSpec = await File.ReadAllTextAsync("orders-api.yaml");

ApiSpecImportContent importContent = new ApiSpecImportContent
{
    Format = ApiSpecImportSourceFormat.Inline,
    Value = openApiSpec,
    Specification = new ApiSpecImportSpecification
    {
        Name = "openapi",
        Version = "3.0.1"
    }
};

await definition.ImportSpecificationAsync(WaitUntil.Completed, importContent);
```

### 6. Export API Specification

```csharp
ApiCenterApiDefinitionResource definition = await client
    .GetApiCenterApiDefinitionResource(definitionResourceId)
    .GetAsync();

ArmOperation<ApiSpecExportResult> operation = await definition
    .ExportSpecificationAsync(WaitUntil.Completed);

ApiSpecExportResult result = operation.Value;

// result.Format - e.g., "inline"
// result.Value - the specification content
```

### 7. Create Environment

```csharp
ApiCenterEnvironmentCollection environments = workspace.GetApiCenterEnvironments();

ApiCenterEnvironmentData envData = new ApiCenterEnvironmentData
{
    Title = "Production",
    Description = "Production environment",
    Kind = ApiCenterEnvironmentKind.Production,
    Server = new ApiCenterEnvironmentServer
    {
        ManagementPortalUris = { new Uri("https://portal.azure.com") }
    },
    Onboarding = new EnvironmentOnboardingModel
    {
        Instructions = "Contact platform team for access",
        DeveloperPortalUris = { new Uri("https://developer.example.com") }
    }
};

ArmOperation<ApiCenterEnvironmentResource> operation = await environments
    .CreateOrUpdateAsync(WaitUntil.Completed, "production", envData);
```

### 8. Create Deployment

```csharp
ApiCenterDeploymentCollection deployments = workspace.GetApiCenterDeployments();

// Get environment resource ID
ResourceIdentifier envResourceId = ApiCenterEnvironmentResource.CreateResourceIdentifier(
    subscriptionId, resourceGroupName, serviceName, workspaceName, "production");

// Get API definition resource ID
ResourceIdentifier definitionResourceId = ApiCenterApiDefinitionResource.CreateResourceIdentifier(
    subscriptionId, resourceGroupName, serviceName, workspaceName,
    "orders-api", "v1-0-0", "openapi");

ApiCenterDeploymentData deploymentData = new ApiCenterDeploymentData
{
    Title = "Orders API - Production",
    Description = "Production deployment of Orders API v1.0.0",
    EnvironmentId = envResourceId,
    DefinitionId = definitionResourceId,
    State = ApiCenterDeploymentState.Active,
    Server = new ApiCenterDeploymentServer
    {
        RuntimeUris = { new Uri("https://api.example.com/orders") }
    }
};

ArmOperation<ApiCenterDeploymentResource> operation = await deployments
    .CreateOrUpdateAsync(WaitUntil.Completed, "orders-api-prod", deploymentData);
```
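
`CreateResourceIdentifier` composes the full ARM resource ID from the hierarchy segments, one name/value pair per level. A sketch of the equivalent string for an environment (illustrative helper; `Microsoft.ApiCenter` is the provider namespace assumed here):

```python
def environment_resource_id(sub: str, rg: str, service: str,
                            workspace: str, env: str) -> str:
    # Each level of the resource hierarchy contributes one path segment pair.
    return (
        f"/subscriptions/{sub}/resourceGroups/{rg}"
        f"/providers/Microsoft.ApiCenter/services/{service}"
        f"/workspaces/{workspace}/environments/{env}"
    )

print(environment_resource_id("sub-id", "my-rg", "svc", "default", "production"))
```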

### 9. Create Metadata Schema

```csharp
ApiCenterMetadataSchemaCollection schemas = service.GetApiCenterMetadataSchemas();

string jsonSchema = """
{
    "type": "object",
    "properties": {
        "team": {
            "type": "string",
            "title": "Owning Team"
        },
        "costCenter": {
            "type": "string",
            "title": "Cost Center"
        },
        "dataClassification": {
            "type": "string",
            "enum": ["public", "internal", "confidential"],
            "title": "Data Classification"
        }
    },
    "required": ["team"]
}
""";

ApiCenterMetadataSchemaData schemaData = new ApiCenterMetadataSchemaData
{
    Schema = jsonSchema,
    AssignedTo =
    {
        new MetadataAssignment
        {
            Entity = MetadataAssignmentEntity.Api,
            Required = true
        }
    }
};

ArmOperation<ApiCenterMetadataSchemaResource> operation = await schemas
    .CreateOrUpdateAsync(WaitUntil.Completed, "api-metadata", schemaData);
```

### 10. List and Search APIs

```csharp
// List all APIs in a workspace
ApiCenterWorkspaceResource workspace = await client
    .GetApiCenterWorkspaceResource(workspaceResourceId)
    .GetAsync();

await foreach (ApiCenterApiResource api in workspace.GetApiCenterApis())
{
    Console.WriteLine($"API: {api.Data.Title}");
    Console.WriteLine($"  Kind: {api.Data.Kind}");
    Console.WriteLine($"  Stage: {api.Data.LifecycleStage}");

    // List versions
    await foreach (ApiCenterApiVersionResource version in api.GetApiCenterApiVersions())
    {
        Console.WriteLine($"  Version: {version.Data.Title}");
    }
}

// List environments
await foreach (ApiCenterEnvironmentResource env in workspace.GetApiCenterEnvironments())
{
    Console.WriteLine($"Environment: {env.Data.Title} ({env.Data.Kind})");
}

// List deployments
await foreach (ApiCenterDeploymentResource deployment in workspace.GetApiCenterDeployments())
{
    Console.WriteLine($"Deployment: {deployment.Data.Title}");
    Console.WriteLine($"  State: {deployment.Data.State}");
}
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ApiCenterServiceResource` | API Center service instance |
| `ApiCenterWorkspaceResource` | Logical grouping of APIs |
| `ApiCenterApiResource` | Individual API |
| `ApiCenterApiVersionResource` | Version of an API |
| `ApiCenterApiDefinitionResource` | API specification (OpenAPI, etc.) |
| `ApiCenterEnvironmentResource` | Deployment environment |
| `ApiCenterDeploymentResource` | API deployment to environment |
| `ApiCenterMetadataSchemaResource` | Custom metadata schema |
| `ApiKind` | rest, graphql, grpc, soap, webhook, websocket, mcp |
| `ApiLifecycleStage` | design, development, testing, preview, production, deprecated, retired |
| `ApiCenterEnvironmentKind` | development, testing, staging, production |
| `ApiCenterDeploymentState` | active, inactive |

## Best Practices

1. **Organize with workspaces** — Group APIs by team, domain, or product
2. **Use metadata schemas** — Define custom properties for governance
3. **Track lifecycle stages** — Keep API status current (design → production → deprecated)
4. **Document environments** — Include onboarding instructions and portal URIs
5. **Version consistently** — Use semantic versioning for API versions
6. **Import specifications** — Upload OpenAPI/GraphQL specs for discovery
7. **Link deployments** — Connect APIs to their runtime environments
8. **Use managed identity** — Enable SystemAssigned identity for secure integrations

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<ApiCenterApiResource> operation = await apis
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-api", apiData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("API already exists with conflicting configuration");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.ApiCenter` | API Center management (this SDK) | `dotnet add package Azure.ResourceManager.ApiCenter` |
| `Azure.ResourceManager.ApiManagement` | API gateway and policies | `dotnet add package Azure.ResourceManager.ApiManagement` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.ApiCenter |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.apicenter |
| Product Documentation | https://learn.microsoft.com/azure/api-center/ |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/apicenter/Azure.ResourceManager.ApiCenter |

---

**New file: `skills/azure-mgmt-apicenter-py/SKILL.md` (242 lines)**

---
name: azure-mgmt-apicenter-py
description: |
  Azure API Center Management SDK for Python. Use for managing API inventory, metadata, and governance across your organization.
  Triggers: "azure-mgmt-apicenter", "ApiCenterMgmtClient", "API Center", "API inventory", "API governance".
package: azure-mgmt-apicenter
---

# Azure API Center Management SDK for Python

Manage API inventory, metadata, and governance in Azure API Center.

## Installation

```bash
pip install azure-mgmt-apicenter
pip install azure-identity
```

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=your-subscription-id
```

## Authentication

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.apicenter import ApiCenterMgmtClient

client = ApiCenterMgmtClient(
    credential=DefaultAzureCredential(),
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"]
)
```

## Create API Center

```python
from azure.mgmt.apicenter.models import Service

api_center = client.services.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    resource=Service(
        location="eastus",
        tags={"environment": "production"}
    )
)

print(f"Created API Center: {api_center.name}")
```

## List API Centers

```python
api_centers = client.services.list_by_subscription()

for api_center in api_centers:
    print(f"{api_center.name} - {api_center.location}")
```

## Register an API

```python
from azure.mgmt.apicenter.models import Api, ApiKind, LifecycleStage

api = client.apis.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default",
    api_name="my-api",
    resource=Api(
        title="My API",
        description="A sample API for demonstration",
        kind=ApiKind.REST,
        lifecycle_stage=LifecycleStage.PRODUCTION,
        terms_of_service={"url": "https://example.com/terms"},
        contacts=[{"name": "API Team", "email": "api-team@example.com"}]
    )
)

print(f"Registered API: {api.title}")
```

## Create API Version

```python
from azure.mgmt.apicenter.models import ApiVersion, LifecycleStage

version = client.api_versions.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default",
    api_name="my-api",
    version_name="v1",
    resource=ApiVersion(
        title="Version 1.0",
        lifecycle_stage=LifecycleStage.PRODUCTION
    )
)

print(f"Created version: {version.title}")
```

## Add API Definition

```python
from azure.mgmt.apicenter.models import ApiDefinition

definition = client.api_definitions.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default",
    api_name="my-api",
    version_name="v1",
    definition_name="openapi",
    resource=ApiDefinition(
        title="OpenAPI Definition",
        description="OpenAPI 3.0 specification"
    )
)
```

## Import API Specification

```python
from azure.mgmt.apicenter.models import ApiSpecImportRequest, ApiSpecImportSourceFormat

# Import from inline content
client.api_definitions.import_specification(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default",
    api_name="my-api",
    version_name="v1",
    definition_name="openapi",
    body=ApiSpecImportRequest(
        format=ApiSpecImportSourceFormat.INLINE,
        value='{"openapi": "3.0.0", "info": {"title": "My API", "version": "1.0"}, "paths": {}}'
    )
)
```

## List APIs

```python
apis = client.apis.list(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default"
)

for api in apis:
    print(f"{api.name}: {api.title} ({api.kind})")
```

## Create Environment

```python
from azure.mgmt.apicenter.models import Environment, EnvironmentKind

environment = client.environments.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default",
    environment_name="production",
    resource=Environment(
        title="Production",
        description="Production environment",
        kind=EnvironmentKind.PRODUCTION,
        server={"type": "Azure API Management", "management_portal_uri": ["https://portal.azure.com"]}
    )
)
```

## Create Deployment

```python
from azure.mgmt.apicenter.models import Deployment, DeploymentState

deployment = client.deployments.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    workspace_name="default",
    api_name="my-api",
    deployment_name="prod-deployment",
    resource=Deployment(
        title="Production Deployment",
        description="Deployed to production APIM",
        environment_id="/workspaces/default/environments/production",
        definition_id="/workspaces/default/apis/my-api/versions/v1/definitions/openapi",
        state=DeploymentState.ACTIVE,
        server={"runtime_uri": ["https://api.example.com"]}
    )
)
```
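
Unlike the .NET management SDK, `environment_id` and `definition_id` here are service-relative paths rather than full ARM resource IDs. A small sketch composing them (hypothetical helper names, path shapes taken from the example above):

```python
def environment_ref(workspace: str, environment: str) -> str:
    # Service-relative path expected by Deployment.environment_id.
    return f"/workspaces/{workspace}/environments/{environment}"

def definition_ref(workspace: str, api: str, version: str, definition: str) -> str:
    # Service-relative path expected by Deployment.definition_id.
    return f"/workspaces/{workspace}/apis/{api}/versions/{version}/definitions/{definition}"

print(environment_ref("default", "production"))
print(definition_ref("default", "my-api", "v1", "openapi"))
```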

## Define Custom Metadata

```python
from azure.mgmt.apicenter.models import MetadataSchema

metadata = client.metadata_schemas.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-api-center",
    metadata_schema_name="data-classification",
    resource=MetadataSchema(
        schema='{"type": "string", "title": "Data Classification", "enum": ["public", "internal", "confidential"]}'
    )
)
```
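
The `schema` property is a JSON Schema document passed as a string, so malformed JSON only surfaces as a 400 from ARM. Parsing it locally first catches that earlier; a minimal pre-flight check (illustrative only, using only the standard library):

```python
import json

schema = '{"type": "string", "title": "Data Classification", "enum": ["public", "internal", "confidential"]}'

# Fail fast on malformed JSON before sending the schema to ARM.
parsed = json.loads(schema)
assert parsed["type"] == "string"
assert "confidential" in parsed["enum"]
print("schema OK")
```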

## Client Types

| Client | Purpose |
|--------|---------|
| `ApiCenterMgmtClient` | Main client for all operations |

## Operations

| Operation Group | Purpose |
|----------------|---------|
| `services` | API Center service management |
| `workspaces` | Workspace management |
| `apis` | API registration and management |
| `api_versions` | API version management |
| `api_definitions` | API definition management |
| `deployments` | Deployment tracking |
| `environments` | Environment management |
| `metadata_schemas` | Custom metadata definitions |

## Best Practices

1. **Use workspaces** to organize APIs by team or domain
2. **Define metadata schemas** for consistent governance
3. **Track deployments** to understand where APIs are running
4. **Import specifications** to enable API analysis and linting
5. **Use lifecycle stages** to track API maturity
6. **Add contacts** for API ownership and support

---

**New file: `skills/azure-mgmt-apimanagement-dotnet/SKILL.md` (310 lines)**

---
name: azure-mgmt-apimanagement-dotnet
description: |
  Azure Resource Manager SDK for API Management in .NET. Use for MANAGEMENT PLANE operations: creating/managing APIM services, APIs, products, subscriptions, policies, users, groups, gateways, and backends via Azure Resource Manager. Triggers: "API Management", "APIM service", "create APIM", "manage APIs", "ApiManagementServiceResource", "API policies", "APIM products", "APIM subscriptions".
package: Azure.ResourceManager.ApiManagement
---

# Azure.ResourceManager.ApiManagement (.NET)

Management plane SDK for provisioning and managing Azure API Management resources via Azure Resource Manager.

> **⚠️ Management vs Data Plane**
> - **This SDK (Azure.ResourceManager.ApiManagement)**: Create services, APIs, products, subscriptions, policies, users, groups
> - **Data Plane**: Direct API calls to your APIM gateway endpoints

## Installation

```bash
dotnet add package Azure.ResourceManager.ApiManagement
dotnet add package Azure.Identity
```

**Current Version**: v1.3.0

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
# For service principal auth (optional)
AZURE_TENANT_ID=<tenant-id>
AZURE_CLIENT_ID=<client-id>
AZURE_CLIENT_SECRET=<client-secret>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ApiManagement;

// Always use DefaultAzureCredential
var credential = new DefaultAzureCredential();
var armClient = new ArmClient(credential);

// Get subscription
var subscriptionId = Environment.GetEnvironmentVariable("AZURE_SUBSCRIPTION_ID");
var subscription = armClient.GetSubscriptionResource(
    new ResourceIdentifier($"/subscriptions/{subscriptionId}"));
```

## Resource Hierarchy

```
ArmClient
└── SubscriptionResource
    └── ResourceGroupResource
        └── ApiManagementServiceResource
            ├── ApiResource
            │   ├── ApiOperationResource
            │   │   └── ApiOperationPolicyResource
            │   ├── ApiPolicyResource
            │   ├── ApiSchemaResource
            │   └── ApiDiagnosticResource
            ├── ApiManagementProductResource
            │   ├── ProductApiResource
            │   ├── ProductGroupResource
            │   └── ProductPolicyResource
            ├── ApiManagementSubscriptionResource
            ├── ApiManagementPolicyResource
            ├── ApiManagementUserResource
            ├── ApiManagementGroupResource
            ├── ApiManagementBackendResource
            ├── ApiManagementGatewayResource
            ├── ApiManagementCertificateResource
            ├── ApiManagementNamedValueResource
            └── ApiManagementLoggerResource
```

## Core Workflow

### 1. Create API Management Service

```csharp
using Azure.ResourceManager.ApiManagement;
using Azure.ResourceManager.ApiManagement.Models;

// Get resource group
var resourceGroup = await subscription
    .GetResourceGroupAsync("my-resource-group");

// Define service
var serviceData = new ApiManagementServiceData(
    location: AzureLocation.EastUS,
    sku: new ApiManagementServiceSkuProperties(
        ApiManagementServiceSkuType.Developer,
        capacity: 1),
    publisherEmail: "admin@contoso.com",
    publisherName: "Contoso");

// Create service (long-running operation - can take 30+ minutes)
var serviceCollection = resourceGroup.Value.GetApiManagementServices();
var operation = await serviceCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-apim-service",
    serviceData);

ApiManagementServiceResource service = operation.Value;
```

### 2. Create an API

```csharp
var apiData = new ApiCreateOrUpdateContent
{
    DisplayName = "My API",
    Path = "myapi",
    Protocols = { ApiOperationInvokableProtocol.Https },
    ServiceUri = new Uri("https://backend.contoso.com/api")
};

var apiCollection = service.GetApis();
var apiOperation = await apiCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-api",
    apiData);

ApiResource api = apiOperation.Value;
```

### 3. Create a Product

```csharp
var productData = new ApiManagementProductData
{
    DisplayName = "Starter",
    Description = "Starter tier with limited access",
    IsSubscriptionRequired = true,
    IsApprovalRequired = false,
    SubscriptionsLimit = 1,
    State = ApiManagementProductState.Published
};

var productCollection = service.GetApiManagementProducts();
var productOperation = await productCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "starter",
    productData);

ApiManagementProductResource product = productOperation.Value;

// Add API to product
await product.GetProductApis().CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-api");
```

### 4. Create a Subscription

```csharp
var subscriptionData = new ApiManagementSubscriptionCreateOrUpdateContent
{
    DisplayName = "My Subscription",
    Scope = $"/products/{product.Data.Name}",
    State = ApiManagementSubscriptionState.Active
};

var subscriptionCollection = service.GetApiManagementSubscriptions();
var subOperation = await subscriptionCollection.CreateOrUpdateAsync(
    WaitUntil.Completed,
    "my-subscription",
    subscriptionData);

ApiManagementSubscriptionResource subscription = subOperation.Value;

// Get subscription keys
var keys = await subscription.GetSecretsAsync();
Console.WriteLine($"Primary Key: {keys.Value.PrimaryKey}");
```

### 5. Set API Policy

```csharp
var policyXml = @"
<policies>
  <inbound>
    <rate-limit calls=""100"" renewal-period=""60"" />
    <set-header name=""X-Custom-Header"" exists-action=""override"">
      <value>CustomValue</value>
    </set-header>
    <base />
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>";

var policyData = new PolicyContractData
{
    Value = policyXml,
    Format = PolicyContentFormat.Xml
};

await api.GetApiPolicy().CreateOrUpdateAsync(
    WaitUntil.Completed,
    policyData);
```

### 6. Backup and Restore

```csharp
// Backup
var backupParams = new ApiManagementServiceBackupRestoreContent(
    storageAccount: "mystorageaccount",
    containerName: "apim-backups",
    backupName: "backup-2024-01-15")
{
    AccessType = StorageAccountAccessType.SystemAssignedManagedIdentity
};

await service.BackupAsync(WaitUntil.Completed, backupParams);

// Restore
await service.RestoreAsync(WaitUntil.Completed, backupParams);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ArmClient` | Entry point for all ARM operations |
| `ApiManagementServiceResource` | Represents an APIM service instance |
| `ApiManagementServiceCollection` | Collection for service CRUD |
| `ApiResource` | Represents an API |
| `ApiManagementProductResource` | Represents a product |
| `ApiManagementSubscriptionResource` | Represents a subscription |
| `ApiManagementPolicyResource` | Service-level policy |
| `ApiPolicyResource` | API-level policy |
| `ApiManagementUserResource` | Represents a user |
| `ApiManagementGroupResource` | Represents a group |
| `ApiManagementBackendResource` | Represents a backend service |
| `ApiManagementGatewayResource` | Represents a self-hosted gateway |

## SKU Types

| SKU | Purpose | Capacity |
|-----|---------|----------|
| `Developer` | Development/testing (no SLA) | 1 |
| `Basic` | Entry-level production | 1-2 |
| `Standard` | Medium workloads | 1-4 |
| `Premium` | High availability, multi-region | 1-12 per region |
| `Consumption` | Serverless, pay-per-call | N/A |
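
The capacity column above can be encoded as a simple lookup when validating deployment parameters before a long-running service create. A sketch under those assumptions (hypothetical helper; limits copied from the table):

```python
# Max gateway units per SKU (per region for Premium); None = not configurable.
SKU_MAX_CAPACITY = {
    "Developer": 1,
    "Basic": 2,
    "Standard": 4,
    "Premium": 12,
    "Consumption": None,
}

def capacity_ok(sku: str, units: int) -> bool:
    # Consumption is serverless, so any explicit capacity is rejected.
    limit = SKU_MAX_CAPACITY[sku]
    return limit is not None and 1 <= units <= limit

print(capacity_ok("Standard", 4))   # → True
print(capacity_ok("Developer", 2))  # → False
```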

## Best Practices

1. **Use `WaitUntil.Completed`** for operations that must finish before proceeding
2. **Use `WaitUntil.Started`** for long operations like service creation (30+ min)
3. **Always use `DefaultAzureCredential`** — never hardcode keys
4. **Handle `RequestFailedException`** for ARM API errors
5. **Use `CreateOrUpdateAsync`** for idempotent operations
6. **Navigate hierarchy** via `Get*` methods (e.g., `service.GetApis()`)
7. **Policy format** — Use XML format for policies; JSON is also supported
8. **Service creation** — Developer SKU is fastest for testing (~15-30 min)

## Error Handling

```csharp
using Azure;

try
{
    var operation = await serviceCollection.CreateOrUpdateAsync(
        WaitUntil.Completed, serviceName, serviceData);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Service already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Bad request: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"ARM Error: {ex.Status} - {ex.ErrorCode}: {ex.Message}");
}
```

## Reference Files

| File | When to Read |
|------|--------------|
| [references/service-management.md](references/service-management.md) | Service CRUD, SKUs, networking, backup/restore |
| [references/apis-operations.md](references/apis-operations.md) | APIs, operations, schemas, versioning |
| [references/products-subscriptions.md](references/products-subscriptions.md) | Products, subscriptions, access control |
| [references/policies.md](references/policies.md) | Policy XML patterns, scopes, common policies |

## Related Resources

| Resource | Purpose |
|----------|---------|
| [API Management Documentation](https://learn.microsoft.com/en-us/azure/api-management/) | Official Azure docs |
| [Policy Reference](https://learn.microsoft.com/en-us/azure/api-management/api-management-policies) | Complete policy reference |
| [SDK Reference](https://learn.microsoft.com/en-us/dotnet/api/azure.resourcemanager.apimanagement) | .NET API reference |

---

**New file: `skills/azure-mgmt-apimanagement-py/SKILL.md` (278 lines)**

---
name: azure-mgmt-apimanagement-py
description: |
  Azure API Management SDK for Python. Use for managing APIM services, APIs, products, subscriptions, and policies.
  Triggers: "azure-mgmt-apimanagement", "ApiManagementClient", "APIM", "API gateway", "API Management".
package: azure-mgmt-apimanagement
---

# Azure API Management SDK for Python

Manage Azure API Management services, APIs, products, and policies.

## Installation

```bash
pip install azure-mgmt-apimanagement
pip install azure-identity
```

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=your-subscription-id
```

## Authentication

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.apimanagement import ApiManagementClient

client = ApiManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id=os.environ["AZURE_SUBSCRIPTION_ID"]
)
```

## Create APIM Service

```python
from azure.mgmt.apimanagement.models import (
    ApiManagementServiceResource,
    ApiManagementServiceSkuProperties,
    SkuType
)

service = client.api_management_service.begin_create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    parameters=ApiManagementServiceResource(
        location="eastus",
        publisher_email="admin@example.com",
        publisher_name="My Organization",
        sku=ApiManagementServiceSkuProperties(
            name=SkuType.DEVELOPER,
            capacity=1
        )
    )
).result()

print(f"Created APIM: {service.name}")
```

## Import API from OpenAPI

```python
from azure.mgmt.apimanagement.models import (
    ApiCreateOrUpdateParameter,
    ContentFormat,
    Protocol
)

api = client.api.begin_create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    api_id="my-api",
    parameters=ApiCreateOrUpdateParameter(
        display_name="My API",
        path="myapi",
        protocols=[Protocol.HTTPS],
        format=ContentFormat.OPENAPI_JSON,
        value='{"openapi": "3.0.0", "info": {"title": "My API", "version": "1.0"}, "paths": {"/health": {"get": {"responses": {"200": {"description": "OK"}}}}}}'
    )
).result()

print(f"Imported API: {api.display_name}")
```

## Import API from URL

```python
api = client.api.begin_create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    api_id="petstore",
    parameters=ApiCreateOrUpdateParameter(
        display_name="Petstore API",
        path="petstore",
        protocols=[Protocol.HTTPS],
        format=ContentFormat.OPENAPI_LINK,
        value="https://petstore.swagger.io/v2/swagger.json"
    )
).result()
```

## List APIs

```python
apis = client.api.list_by_service(
    resource_group_name="my-resource-group",
    service_name="my-apim"
)

for api in apis:
    print(f"{api.name}: {api.display_name} - {api.path}")
```

## Create Product

```python
from azure.mgmt.apimanagement.models import ProductContract

product = client.product.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    product_id="premium",
    parameters=ProductContract(
        display_name="Premium",
        description="Premium tier with unlimited access",
        subscription_required=True,
        approval_required=False,
        state="published"
    )
)

print(f"Created product: {product.display_name}")
```

## Add API to Product

```python
client.product_api.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    product_id="premium",
    api_id="my-api"
)
```
|
||||
|
||||
## Create Subscription
|
||||
|
||||
```python
|
||||
from azure.mgmt.apimanagement.models import SubscriptionCreateParameters
|
||||
|
||||
subscription = client.subscription.create_or_update(
|
||||
resource_group_name="my-resource-group",
|
||||
service_name="my-apim",
|
||||
sid="my-subscription",
|
||||
parameters=SubscriptionCreateParameters(
|
||||
display_name="My Subscription",
|
||||
scope=f"/products/premium",
|
||||
state="active"
|
||||
)
|
||||
)
|
||||
|
||||
print(f"Subscription key: {subscription.primary_key}")
|
||||
```
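The `scope` property is a relative resource path: it can target a whole product, a single API, or every API in the instance. The helper below is hypothetical (not part of the SDK) and only illustrates the pattern:

```python
# Hypothetical helper, not part of azure-mgmt-apimanagement.
# Subscription scope is a relative resource path.
def subscription_scope(product_id=None, api_id=None):
    if product_id:
        return f"/products/{product_id}"
    if api_id:
        return f"/apis/{api_id}"
    return "/apis"  # all APIs in the instance

print(subscription_scope(product_id="premium"))  # → /products/premium
```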

## Set API Policy

```python
from azure.mgmt.apimanagement.models import PolicyContract

policy_xml = """
<policies>
    <inbound>
        <rate-limit calls="100" renewal-period="60" />
        <set-header name="X-Custom-Header" exists-action="override">
            <value>CustomValue</value>
        </set-header>
    </inbound>
    <backend>
        <forward-request />
    </backend>
    <outbound />
    <on-error />
</policies>
"""

client.api_policy.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    api_id="my-api",
    policy_id="policy",
    parameters=PolicyContract(
        value=policy_xml,
        format="xml"
    )
)
```

## Create Named Value (Secret)

```python
from azure.mgmt.apimanagement.models import NamedValueCreateContract

named_value = client.named_value.begin_create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    named_value_id="backend-api-key",
    parameters=NamedValueCreateContract(
        display_name="Backend API Key",
        value="secret-key-value",
        secret=True
    )
).result()
```
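Named values are referenced from policy XML with `{{name}}` syntax. A minimal stdlib-only sketch that builds a policy referencing the named value above and checks it is well-formed XML before uploading:

```python
import xml.etree.ElementTree as ET

# {{backend-api-key}} refers to the named value created above;
# APIM substitutes the secret when the policy is evaluated.
policy_xml = """
<policies>
    <inbound>
        <set-header name="X-Api-Key" exists-action="override">
            <value>{{backend-api-key}}</value>
        </set-header>
    </inbound>
    <backend><forward-request /></backend>
    <outbound />
    <on-error />
</policies>
"""

# Sanity-check the document locally before calling create_or_update
root = ET.fromstring(policy_xml)
print(root.tag)  # → policies
```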

## Create Backend

```python
from azure.mgmt.apimanagement.models import BackendContract

backend = client.backend.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    backend_id="my-backend",
    parameters=BackendContract(
        url="https://api.backend.example.com",
        protocol="http",
        description="My backend service"
    )
)
```

## Create User

```python
from azure.mgmt.apimanagement.models import UserCreateParameters

user = client.user.create_or_update(
    resource_group_name="my-resource-group",
    service_name="my-apim",
    user_id="newuser",
    parameters=UserCreateParameters(
        email="user@example.com",
        first_name="John",
        last_name="Doe"
    )
)
```

## Operation Groups

| Group | Purpose |
|-------|---------|
| `api_management_service` | APIM instance management |
| `api` | API operations |
| `api_operation` | API operation details |
| `api_policy` | API-level policies |
| `product` | Product management |
| `product_api` | Product-API associations |
| `subscription` | Subscription management |
| `user` | User management |
| `named_value` | Named values/secrets |
| `backend` | Backend services |
| `certificate` | Certificates |
| `gateway` | Self-hosted gateways |

## Best Practices

1. **Use named values** for secrets and configuration
2. **Apply policies** at appropriate scopes (global, product, API, operation)
3. **Use products** to bundle APIs and manage access
4. **Enable Application Insights** for monitoring
5. **Use backends** to abstract backend services
6. **Version your APIs** using APIM's versioning features
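Best practice 2 above relies on APIM's scope model: policies can exist at global, product, API, and operation scope, and a child scope's `<base />` element inlines the parent's policy. A conceptual, SDK-independent sketch of that evaluation order:

```python
# Conceptual sketch only, not an SDK API. APIM applies policies from the
# outermost scope inward; <base /> in a child policy inlines the parent's.
SCOPE_ORDER = ["global", "product", "api", "operation"]

def effective_policy(policies):
    """Return policy fragments in the order APIM would evaluate them."""
    return [policies[scope] for scope in SCOPE_ORDER if scope in policies]

print(effective_policy({"global": "<cors />", "api": "<rate-limit />"}))
```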
`skills/azure-mgmt-applicationinsights-dotnet/SKILL.md` (new file, 486 lines added)
---
name: azure-mgmt-applicationinsights-dotnet
description: |
  Azure Application Insights SDK for .NET. Application performance monitoring and observability resource management. Use for creating Application Insights components, web tests, workbooks, analytics items, and API keys. Triggers: "Application Insights", "ApplicationInsights", "App Insights", "APM", "application monitoring", "web tests", "availability tests", "workbooks".
package: Azure.ResourceManager.ApplicationInsights
---

# Azure.ResourceManager.ApplicationInsights (.NET)

Azure Resource Manager SDK for managing Application Insights resources for application performance monitoring.

## Installation

```bash
dotnet add package Azure.ResourceManager.ApplicationInsights
dotnet add package Azure.Identity
```

**Current Version**: v1.0.0 (GA)
**API Version**: 2022-06-15

## Environment Variables

```bash
AZURE_SUBSCRIPTION_ID=<your-subscription-id>
AZURE_RESOURCE_GROUP=<your-resource-group>
AZURE_APPINSIGHTS_NAME=<your-appinsights-component>
```

## Authentication

```csharp
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.ApplicationInsights;

ArmClient client = new ArmClient(new DefaultAzureCredential());
```

## Resource Hierarchy

```
Subscription
└── ResourceGroup
    └── ApplicationInsightsComponent               # App Insights resource
        ├── ApplicationInsightsComponentApiKey     # API keys for programmatic access
        ├── ComponentLinkedStorageAccount          # Linked storage for data export
        └── (via component ID)
            ├── WebTest                            # Availability tests
            ├── Workbook                           # Workbooks for analysis
            ├── WorkbookTemplate                   # Workbook templates
            └── MyWorkbook                         # Private workbooks
```

## Core Workflows

### 1. Create Application Insights Component (Workspace-based)

```csharp
using Azure.ResourceManager.ApplicationInsights;
using Azure.ResourceManager.ApplicationInsights.Models;
using Azure.ResourceManager.Resources;

SubscriptionResource subscription = await client.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

ApplicationInsightsComponentCollection components = resourceGroup.GetApplicationInsightsComponents();

// Workspace-based Application Insights (recommended)
ApplicationInsightsComponentData data = new ApplicationInsightsComponentData(
    AzureLocation.EastUS,
    ApplicationInsightsApplicationType.Web)
{
    Kind = "web",
    WorkspaceResourceId = new ResourceIdentifier(
        "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"),
    IngestionMode = IngestionMode.LogAnalytics,
    PublicNetworkAccessForIngestion = PublicNetworkAccessType.Enabled,
    PublicNetworkAccessForQuery = PublicNetworkAccessType.Enabled,
    RetentionInDays = 90,
    SamplingPercentage = 100,
    DisableIPMasking = false,
    ImmediatePurgeDataOn30Days = false,
    Tags =
    {
        { "environment", "production" },
        { "application", "mywebapp" }
    }
};

ArmOperation<ApplicationInsightsComponentResource> operation = await components
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-appinsights", data);

ApplicationInsightsComponentResource component = operation.Value;

Console.WriteLine($"Component created: {component.Data.Name}");
Console.WriteLine($"Instrumentation Key: {component.Data.InstrumentationKey}");
Console.WriteLine($"Connection String: {component.Data.ConnectionString}");
```

### 2. Get Connection String and Keys

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

// Get connection string for SDK configuration
string connectionString = component.Data.ConnectionString;
string instrumentationKey = component.Data.InstrumentationKey;
string appId = component.Data.AppId;

Console.WriteLine($"Connection String: {connectionString}");
Console.WriteLine($"Instrumentation Key: {instrumentationKey}");
Console.WriteLine($"App ID: {appId}");
```

### 3. Create API Key

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

ApplicationInsightsComponentApiKeyCollection apiKeys = component.GetApplicationInsightsComponentApiKeys();

// API key for reading telemetry
ApplicationInsightsApiKeyContent keyContent = new ApplicationInsightsApiKeyContent
{
    Name = "ReadTelemetryKey",
    LinkedReadProperties =
    {
        $"/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/{component.Data.Name}/api",
        $"/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/{component.Data.Name}/agentconfig"
    }
};

ApplicationInsightsComponentApiKeyResource apiKey = await apiKeys
    .CreateOrUpdateAsync(WaitUntil.Completed, keyContent);

Console.WriteLine($"API Key Name: {apiKey.Data.Name}");
Console.WriteLine($"API Key: {apiKey.Data.ApiKey}"); // Only shown once!
```

### 4. Create Web Test (Availability Test)

```csharp
WebTestCollection webTests = resourceGroup.GetWebTests();

// URL Ping Test
WebTestData urlPingTest = new WebTestData(AzureLocation.EastUS)
{
    Kind = WebTestKind.Ping,
    SyntheticMonitorId = "webtest-ping-myapp",
    WebTestName = "Homepage Availability",
    Description = "Checks if homepage is available",
    IsEnabled = true,
    Frequency = 300, // 5 minutes
    Timeout = 120,   // 2 minutes
    WebTestKind = WebTestKind.Ping,
    IsRetryEnabled = true,
    Locations =
    {
        new WebTestGeolocation { WebTestLocationId = "us-ca-sjc-azr" },  // West US
        new WebTestGeolocation { WebTestLocationId = "us-tx-sn1-azr" },  // South Central US
        new WebTestGeolocation { WebTestLocationId = "us-il-ch1-azr" },  // North Central US
        new WebTestGeolocation { WebTestLocationId = "emea-gb-db3-azr" }, // UK South
        new WebTestGeolocation { WebTestLocationId = "apac-sg-sin-azr" }  // Southeast Asia
    },
    Configuration = new WebTestConfiguration
    {
        WebTest = """
            <WebTest Name="Homepage" Enabled="True" Timeout="120"
                     xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
              <Items>
                <Request Method="GET" Version="1.1" Url="https://myapp.example.com"
                         ThinkTime="0" Timeout="120" ParseDependentRequests="False"
                         FollowRedirects="True" RecordResult="True" Cache="False"
                         ResponseTimeGoal="0" Encoding="utf-8" ExpectedHttpStatusCode="200" />
              </Items>
            </WebTest>
            """
    },
    Tags =
    {
        { $"hidden-link:/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/my-appinsights", "Resource" }
    }
};

ArmOperation<WebTestResource> operation = await webTests
    .CreateOrUpdateAsync(WaitUntil.Completed, "webtest-homepage", urlPingTest);

WebTestResource webTest = operation.Value;
Console.WriteLine($"Web test created: {webTest.Data.Name}");
```

### 5. Create Multi-Step Web Test

```csharp
WebTestData multiStepTest = new WebTestData(AzureLocation.EastUS)
{
    Kind = WebTestKind.MultiStep,
    SyntheticMonitorId = "webtest-multistep-login",
    WebTestName = "Login Flow Test",
    Description = "Tests login functionality",
    IsEnabled = true,
    Frequency = 900, // 15 minutes
    Timeout = 300,   // 5 minutes
    WebTestKind = WebTestKind.MultiStep,
    IsRetryEnabled = true,
    Locations =
    {
        new WebTestGeolocation { WebTestLocationId = "us-ca-sjc-azr" }
    },
    Configuration = new WebTestConfiguration
    {
        WebTest = """
            <WebTest Name="LoginFlow" Enabled="True" Timeout="300"
                     xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
              <Items>
                <Request Method="GET" Version="1.1" Url="https://myapp.example.com/login"
                         ThinkTime="0" Timeout="60" />
                <Request Method="POST" Version="1.1" Url="https://myapp.example.com/api/auth"
                         ThinkTime="0" Timeout="60">
                  <Headers>
                    <Header Name="Content-Type" Value="application/json" />
                  </Headers>
                  <Body>{"username":"testuser","password":"{{TestPassword}}"}</Body>
                </Request>
              </Items>
            </WebTest>
            """
    },
    Tags =
    {
        { $"hidden-link:/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/microsoft.insights/components/my-appinsights", "Resource" }
    }
};

await webTests.CreateOrUpdateAsync(WaitUntil.Completed, "webtest-login-flow", multiStepTest);
```

### 6. Create Workbook

```csharp
WorkbookCollection workbooks = resourceGroup.GetWorkbooks();

WorkbookData workbookData = new WorkbookData(AzureLocation.EastUS)
{
    DisplayName = "Application Performance Dashboard",
    Category = "workbook",
    Kind = WorkbookSharedTypeKind.Shared,
    SerializedData = """
        {
          "version": "Notebook/1.0",
          "items": [
            {
              "type": 1,
              "content": {
                "json": "# Application Performance\n\nThis workbook shows application performance metrics."
              },
              "name": "header"
            },
            {
              "type": 3,
              "content": {
                "version": "KqlItem/1.0",
                "query": "requests\n| summarize count() by bin(timestamp, 1h)\n| render timechart",
                "size": 0,
                "title": "Requests per Hour",
                "timeContext": {
                  "durationMs": 86400000
                },
                "queryType": 0,
                "resourceType": "microsoft.insights/components"
              },
              "name": "requestsChart"
            }
          ],
          "isLocked": false
        }
        """,
    SourceId = component.Id,
    Tags =
    {
        { "environment", "production" }
    }
};

// Note: Workbook ID should be a new GUID
string workbookId = Guid.NewGuid().ToString();

ArmOperation<WorkbookResource> operation = await workbooks
    .CreateOrUpdateAsync(WaitUntil.Completed, workbookId, workbookData);

WorkbookResource workbook = operation.Value;
Console.WriteLine($"Workbook created: {workbook.Data.DisplayName}");
```

### 7. Link Storage Account

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

ComponentLinkedStorageAccountCollection linkedStorage = component.GetComponentLinkedStorageAccounts();

ComponentLinkedStorageAccountData storageData = new ComponentLinkedStorageAccountData
{
    LinkedStorageAccount = new ResourceIdentifier(
        "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>")
};

ArmOperation<ComponentLinkedStorageAccountResource> operation = await linkedStorage
    .CreateOrUpdateAsync(WaitUntil.Completed, StorageType.ServiceProfiler, storageData);
```

### 8. List and Manage Components

```csharp
// List all Application Insights components in resource group
await foreach (ApplicationInsightsComponentResource component in
    resourceGroup.GetApplicationInsightsComponents())
{
    Console.WriteLine($"Component: {component.Data.Name}");
    Console.WriteLine($"  App ID: {component.Data.AppId}");
    Console.WriteLine($"  Type: {component.Data.ApplicationType}");
    Console.WriteLine($"  Ingestion Mode: {component.Data.IngestionMode}");
    Console.WriteLine($"  Retention: {component.Data.RetentionInDays} days");
}

// List web tests
await foreach (WebTestResource webTest in resourceGroup.GetWebTests())
{
    Console.WriteLine($"Web Test: {webTest.Data.WebTestName}");
    Console.WriteLine($"  Enabled: {webTest.Data.IsEnabled}");
    Console.WriteLine($"  Frequency: {webTest.Data.Frequency}s");
}

// List workbooks
await foreach (WorkbookResource workbook in resourceGroup.GetWorkbooks())
{
    Console.WriteLine($"Workbook: {workbook.Data.DisplayName}");
}
```

### 9. Update Component

```csharp
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");

// Update using full data (PUT operation)
ApplicationInsightsComponentData updateData = component.Data;
updateData.RetentionInDays = 180;
updateData.SamplingPercentage = 50;
updateData.Tags["updated"] = "true";

ArmOperation<ApplicationInsightsComponentResource> operation = await resourceGroup
    .GetApplicationInsightsComponents()
    .CreateOrUpdateAsync(WaitUntil.Completed, "my-appinsights", updateData);
```

### 10. Delete Resources

```csharp
// Delete Application Insights component
ApplicationInsightsComponentResource component = await resourceGroup
    .GetApplicationInsightsComponentAsync("my-appinsights");
await component.DeleteAsync(WaitUntil.Completed);

// Delete web test
WebTestResource webTest = await resourceGroup.GetWebTestAsync("webtest-homepage");
await webTest.DeleteAsync(WaitUntil.Completed);
```

## Key Types Reference

| Type | Purpose |
|------|---------|
| `ApplicationInsightsComponentResource` | App Insights component |
| `ApplicationInsightsComponentData` | Component configuration |
| `ApplicationInsightsComponentCollection` | Collection of components |
| `ApplicationInsightsComponentApiKeyResource` | API key for programmatic access |
| `WebTestResource` | Availability/web test |
| `WebTestData` | Web test configuration |
| `WorkbookResource` | Analysis workbook |
| `WorkbookData` | Workbook configuration |
| `ComponentLinkedStorageAccountResource` | Linked storage for exports |

## Application Types

| Type | Enum Value |
|------|------------|
| Web Application | `Web` |
| iOS Application | `iOS` |
| Java Application | `Java` |
| Node.js Application | `NodeJS` |
| .NET Application | `MRT` |
| Other | `Other` |

## Web Test Locations

| Location ID | Region |
|-------------|--------|
| `us-ca-sjc-azr` | West US |
| `us-tx-sn1-azr` | South Central US |
| `us-il-ch1-azr` | North Central US |
| `us-va-ash-azr` | East US |
| `emea-gb-db3-azr` | UK South |
| `emea-nl-ams-azr` | West Europe |
| `emea-fr-pra-edge` | France Central |
| `apac-sg-sin-azr` | Southeast Asia |
| `apac-hk-hkn-azr` | East Asia |
| `apac-jp-kaw-edge` | Japan East |
| `latam-br-gru-edge` | Brazil South |
| `emea-au-syd-edge` | Australia East |

## Best Practices

1. **Use workspace-based** — Workspace-based App Insights is the current standard
2. **Link to Log Analytics** — Store data in Log Analytics for better querying
3. **Set appropriate retention** — Balance cost vs. data availability
4. **Use sampling** — Reduce costs for high-volume applications
5. **Store connection string securely** — Use Key Vault or managed identity
6. **Enable multiple test locations** — For accurate availability monitoring
7. **Use workbooks** — For custom dashboards and analysis
8. **Set up alerts** — Based on availability tests and metrics
9. **Tag resources** — For cost allocation and organization
10. **Use private endpoints** — For secure data ingestion

## Error Handling

```csharp
using Azure;

try
{
    ArmOperation<ApplicationInsightsComponentResource> operation = await components
        .CreateOrUpdateAsync(WaitUntil.Completed, "my-appinsights", data);
}
catch (RequestFailedException ex) when (ex.Status == 409)
{
    Console.WriteLine("Component already exists");
}
catch (RequestFailedException ex) when (ex.Status == 400)
{
    Console.WriteLine($"Invalid configuration: {ex.Message}");
}
catch (RequestFailedException ex)
{
    Console.WriteLine($"Azure error: {ex.Status} - {ex.Message}");
}
```

## SDK Integration

Use the connection string with the Application Insights SDK:

```csharp
// Program.cs in ASP.NET Core
builder.Services.AddApplicationInsightsTelemetry(options =>
{
    options.ConnectionString = builder.Configuration["ApplicationInsights:ConnectionString"];
});

// Or set via environment variable
// APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=...;IngestionEndpoint=...
```
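For deployments where code changes are undesirable, the same connection string can be supplied through the environment variable mentioned above. The value below is a placeholder, not a working connection string:

```shell
# Placeholder values: obtain the real connection string from the
# component's ConnectionString property or the Azure portal.
export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://eastus-8.in.applicationinsights.azure.com/"

# The SDK reads the variable automatically at startup; print the first
# semicolon-delimited field to verify the format.
echo "${APPLICATIONINSIGHTS_CONNECTION_STRING%%;*}"
```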

## Related SDKs

| SDK | Purpose | Install |
|-----|---------|---------|
| `Azure.ResourceManager.ApplicationInsights` | Resource management (this SDK) | `dotnet add package Azure.ResourceManager.ApplicationInsights` |
| `Microsoft.ApplicationInsights` | Telemetry SDK | `dotnet add package Microsoft.ApplicationInsights` |
| `Microsoft.ApplicationInsights.AspNetCore` | ASP.NET Core integration | `dotnet add package Microsoft.ApplicationInsights.AspNetCore` |
| `Azure.Monitor.OpenTelemetry.Exporter` | OpenTelemetry export | `dotnet add package Azure.Monitor.OpenTelemetry.Exporter` |

## Reference Links

| Resource | URL |
|----------|-----|
| NuGet Package | https://www.nuget.org/packages/Azure.ResourceManager.ApplicationInsights |
| API Reference | https://learn.microsoft.com/dotnet/api/azure.resourcemanager.applicationinsights |
| Product Documentation | https://learn.microsoft.com/azure/azure-monitor/app/app-insights-overview |
| GitHub Source | https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/applicationinsights/Azure.ResourceManager.ApplicationInsights |