
paulyuk/simple-agent-af-python


Simple Agent QuickStart (Python Copilot SDK)

A simple AI agent built with the GitHub Copilot SDK, running as an Azure Function.

Prerequisites

Deploy Microsoft Foundry Resources (if needed)

If you're using BYOK (bring your own key) and don't already have a Microsoft Foundry project with a model deployed:

azd auth login
azd up

This provisions all resources and configures local development automatically.
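For local development, Azure Functions reads settings from local.settings.json. As a rough illustration of what "configures local development" means, a file for this project might look like the following; the AZURE_OPENAI_* names mirror the BYOK section below, but the exact keys the azd template writes may differ:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AZURE_OPENAI_ENDPOINT": "https://<your-ai-services>.openai.azure.com/",
    "AZURE_OPENAI_MODEL": "gpt-5-mini"
  }
}
```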

What Gets Deployed

  • Microsoft Foundry project with GPT-5-mini model
  • Azure Functions app (Python, Flex Consumption plan)
  • Storage, monitoring, and all necessary RBAC role assignments
  • Optional: Search for vector store (disabled by default)
  • Optional: Cosmos DB for agent thread storage (disabled by default)

Quickstart

  1. Clone the repository

  2. Install dependencies:

    uv venv
    source .venv/bin/activate  # macOS/Linux
    # .venv\Scripts\activate   # Windows
    uv pip install -r requirements.txt
  3. Run the function locally:

    func start
  4. Test the agent (in a new terminal):

    # Interactive chat client
    uv run chat.py
    
    # Or use curl directly
    curl -X POST http://localhost:7071/api/ask -d "what are the laws"

    Set AGENT_URL to point to a deployed instance:

    AGENT_URL=https://<your-function-app>.azurewebsites.net uv run chat.py

Source Code

The agent logic is in function_app.py. It creates a CopilotClient, configures a session with a system message (Asimov's Three Laws of Robotics), and exposes an HTTP endpoint (/api/ask) that accepts a prompt and returns the agent's response.

chat.py is a lightweight console client that POSTs messages to the function in a loop, giving you an interactive chat experience. It defaults to http://localhost:7071 but can be pointed at a deployed instance via the AGENT_URL environment variable.

Using Microsoft Foundry (BYOK)

By default the agent uses GitHub Copilot's models. To use your own model from Microsoft Foundry instead, set these environment variables:

export AZURE_OPENAI_ENDPOINT="https://<your-ai-services>.openai.azure.com/"
export AZURE_OPENAI_API_KEY="<your-api-key>"
export AZURE_OPENAI_MODEL="gpt-5-mini"  # optional, defaults to gpt-5-mini

Getting these values:

  • If you ran azd up, the endpoint is already in your environment — run azd env get-values | grep AZURE_OPENAI_ENDPOINT
  • For the API key, go to Azure Portal → your AI Services resource → Keys and Endpoint → select the Azure OpenAI tab
  • Or find both in the Microsoft Foundry portal under your project settings

See the BYOK docs for details.
