A simple AI agent built with the GitHub Copilot SDK, running as an Azure Function.
- Python 3.11+ (via uv)
- Azure Functions Core Tools
- Azure Developer CLI (azd) (only needed for deploying Microsoft Foundry resources)
- Access to an AI model via one of:
  - GitHub Copilot subscription — models are available automatically
  - Bring Your Own Key (BYOK) — use an API key from Microsoft Foundry (see BYOK docs)
If you're using BYOK and don't already have a Microsoft Foundry project with a model deployed:
```bash
azd auth login
azd up
```

This provisions all resources and configures local development automatically:
- Microsoft Foundry project with GPT-5-mini model
- Azure Functions app (Python, Flex Consumption plan)
- Storage, monitoring, and all necessary RBAC role assignments
- Optional: Search for vector store (disabled by default)
- Optional: Cosmos DB for agent thread storage (disabled by default)
- Clone the repository

- Install dependencies:

  ```bash
  uv venv
  source .venv/bin/activate   # macOS/Linux
  # .venv\Scripts\activate    # Windows
  uv pip install -r requirements.txt
  ```

- Run the function locally:

  ```bash
  func start
  ```

- Test the agent (in a new terminal):

  ```bash
  # Interactive chat client
  uv run chat.py

  # Or use curl directly
  curl -X POST http://localhost:7071/api/ask -d "what are the laws"
  ```
Set `AGENT_URL` to point to a deployed instance:

```bash
AGENT_URL=https://<your-function-app>.azurewebsites.net uv run chat.py
```
The agent logic is in `function_app.py`. It creates a `CopilotClient`, configures a session with a system message (Asimov's Three Laws of Robotics), and exposes an HTTP endpoint (`/api/ask`) that accepts a prompt and returns the agent's response.
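The endpoint's request/response contract is simple: POST a plain-text prompt, receive plain text back. The sketch below illustrates that contract with a standard-library stand-in; it is not `function_app.py` itself, and `run_agent` is a placeholder for the actual Copilot session call, not the SDK's API.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Same idea as the session's system message in function_app.py.
SYSTEM_MESSAGE = "You must follow Asimov's Three Laws of Robotics."

def run_agent(prompt: str) -> str:
    # Placeholder for the real Copilot session call (assumed, not the SDK API).
    return f"[agent] received: {prompt}"

class AskHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/api/ask":
            self.send_error(404)
            return
        # The body is the raw prompt text, exactly as curl -d sends it.
        length = int(self.headers.get("Content-Length", 0))
        prompt = self.rfile.read(length).decode("utf-8")
        reply = run_agent(prompt).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):
        pass  # keep the demo quiet

if __name__ == "__main__":
    # Bind an ephemeral port, serve in the background, and exercise the endpoint once.
    server = HTTPServer(("127.0.0.1", 0), AskHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/api/ask"
    req = urllib.request.Request(url, data=b"what are the laws", method="POST")
    print(urllib.request.urlopen(req).read().decode("utf-8"))
    server.shutdown()
```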
`chat.py` is a lightweight console client that POSTs messages to the function in a loop, giving you an interactive chat experience. It defaults to `http://localhost:7071` but can be pointed at a deployed instance via the `AGENT_URL` environment variable.
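A minimal client of this shape can be written with the standard library alone. This is a sketch rather than `chat.py` itself; the helper names are illustrative, but the `AGENT_URL` fallback and the POST-to-`/api/ask` flow mirror the behavior described above.

```python
import os
import urllib.request

def agent_url() -> str:
    # AGENT_URL overrides the local default, as chat.py does.
    return os.environ.get("AGENT_URL", "http://localhost:7071")

def ask(prompt: str) -> str:
    # POST the raw prompt text to /api/ask and return the text response.
    req = urllib.request.Request(
        f"{agent_url()}/api/ask", data=prompt.encode("utf-8"), method="POST"
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    # Simple read-eval-print loop; Ctrl-C or EOF exits.
    try:
        while True:
            line = input("you> ").strip()
            if line:
                print(ask(line))
    except (EOFError, KeyboardInterrupt):
        pass
```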
By default the agent uses GitHub Copilot's models. To use your own model from Microsoft Foundry instead, set these environment variables:
```bash
export AZURE_OPENAI_ENDPOINT="https://<your-ai-services>.openai.azure.com/"
export AZURE_OPENAI_API_KEY="<your-api-key>"
export AZURE_OPENAI_MODEL="gpt-5-mini"   # optional, defaults to gpt-5-mini
```

Getting these values:
- If you ran `azd up`, the endpoint is already in your environment; run `azd env get-values | grep AZURE_OPENAI_ENDPOINT`
- For the API key, go to Azure Portal → your AI Services resource → Keys and Endpoint → select the Azure OpenAI tab
- Or find both in the Microsoft Foundry portal under your project settings
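One way this configuration could be consumed is sketched below. `byok_config` is a hypothetical helper, not code from this repo, but it mirrors the documented behavior: the model defaults to `gpt-5-mini`, and BYOK is skipped (falling back to GitHub Copilot's models) when the endpoint or key is absent.

```python
import os

def byok_config():
    # Read the three variables documented above (assumed helper, for illustration).
    endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT")
    key = os.environ.get("AZURE_OPENAI_API_KEY")
    if not (endpoint and key):
        # BYOK not configured: caller falls back to GitHub Copilot's models.
        return None
    return {
        "endpoint": endpoint,
        "api_key": key,
        # AZURE_OPENAI_MODEL is optional and defaults to gpt-5-mini.
        "model": os.environ.get("AZURE_OPENAI_MODEL", "gpt-5-mini"),
    }
```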
See the BYOK docs for details.