
02/04/2026

Authenticating Azure Foundry OpenAI Using Managed Identity

In an earlier post, we tested 30+ features of various Azure AI Foundry OpenAI services.

Post: https://pratapreddypilaka.blogspot.com/2026/04/azure-ai-services-complete-guide-to.html

Most of that code saved the OpenAI service keys in an environment file, following Python best practice. But that was done only for practice purposes, just to get familiar with the OpenAI services and how to access them from Python.

Saving access keys in code is one of the worst security blunders you can make. Access keys completely bypass all RBAC controls and give full access to anyone who obtains them.


What are the ways to authenticate to Azure OpenAI services without exposing keys?

  • System Assigned Managed Identity — Auto-created identity tied to an Azure resource (VM, App Service, Functions)
  • User Assigned Managed Identity — Standalone identity you create and attach to one or more Azure resources
  • Service Principal + Client Secret — App registration with a secret, works anywhere including on-prem
  • Service Principal + Certificate — Same as above but uses a certificate instead of secret, more secure
  • Azure CLI Credential — Uses the logged-in az login identity, ideal for local dev/testing
  • Interactive Browser Credential — Pops a browser login window, good for desktop tools
  • Device Code Credential — Prints a code to enter at a URL, useful for headless servers
  • DefaultAzureCredential — Tries multiple methods automatically in order, recommended for most cases
  • Workload Identity — Federated keyless auth for pods running in Azure Kubernetes Service (AKS)
  • Federated Identity Credential — Allows external IdPs like GitHub Actions or GitLab to authenticate without any secrets
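  Most of these methods are exposed through the azure-identity package, so application code rarely has to change between them. As a minimal sketch (assuming azure-identity is installed and at least one credential in the chain is available on the machine), the recommended DefaultAzureCredential looks like:

```python
from azure.identity import DefaultAzureCredential

# DefaultAzureCredential walks a chain of methods in order
# (environment variables, workload identity, managed identity,
# Azure CLI login, ...) and uses the first one that succeeds.
credential = DefaultAzureCredential()

# All Cognitive Services resources, including Azure OpenAI,
# share this token scope.
token = credential.get_token("https://cognitiveservices.azure.com/.default")
print("Token acquired, expires at:", token.expires_on)
```

  The same line of code then works unchanged on a laptop (via az login) and on a VM (via managed identity).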

  • Managed Identity is the preferred method for Azure workloads to access OpenAI services.

    In this article we will look at how to authenticate using Managed Identity.

    For this we need a VM on which we will enable a system-assigned managed identity.

    I created a Linux VM and enabled its system-assigned managed identity.
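    If you prefer the CLI, the same can be done with az (the resource group and VM names below are placeholders for this example):

```shell
# Create a Linux VM (names are illustrative)
az vm create \
  --resource-group my-rg \
  --name my-vm \
  --image Ubuntu2204 \
  --generate-ssh-keys

# Enable a system-assigned managed identity on the VM
az vm identity assign --resource-group my-rg --name my-vm

# Note the principalId in the output; it is needed for the role assignment
```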


    Now go to Foundry, open Access control (IAM), and add a new role assignment for the managed identity we created earlier, with the role "Cognitive Services OpenAI User".
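    The role assignment can also be scripted. A sketch with az (the resource names, subscription ID, and scope below are placeholders):

```shell
# Look up the VM's managed identity principal ID
PRINCIPAL_ID=$(az vm show --resource-group my-rg --name my-vm \
  --query identity.principalId -o tsv)

# Grant it the OpenAI user role on the OpenAI resource
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Cognitive Services OpenAI User" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<openai-account>"
```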




    Now create NSG rules to allow communication with the OpenAI service. This is necessary if you are using a private endpoint for the OpenAI instance.
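    An outbound rule along these lines would let the VM reach the private endpoint over HTTPS (the NSG name, priority, and destination prefix are placeholders; adjust them to your own network layout):

```shell
az network nsg rule create \
  --resource-group my-rg \
  --nsg-name my-vm-nsg \
  --name AllowOpenAIOutbound \
  --priority 200 \
  --direction Outbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 443 \
  --destination-address-prefixes 10.0.2.0/24
```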

    Log in to your virtual machine.

    Now, in order to test the connectivity, install Python and azure-identity.

    # Python & pip
    sudo apt update && sudo apt install python3-pip -y

    # Required packages
    pip3 install openai azure-identity
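    Before writing any Python, the managed identity can be sanity-checked from inside the VM by querying the Instance Metadata Service (IMDS) directly; this link-local endpoint is only reachable from the VM itself:

```shell
# IMDS issues tokens for the VM's managed identity.
# A JSON response containing "access_token" confirms the identity works.
curl -s -H "Metadata: true" \
  "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fcognitiveservices.azure.com%2F"
```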


    Once the dependencies are installed, create a file (chat.py) with the code below.


    from azure.identity import ManagedIdentityCredential, get_bearer_token_provider
    from openai import AzureOpenAI

    # ── Config ──────────────────────────────────────────────
    AZURE_OPENAI_ENDPOINT = "https://prata-mhl58p7n-eastus2.cognitiveservices.azure.com/"
    DEPLOYMENT_NAME       = "gpt-5-chat"   # e.g. gpt-4o
    API_VERSION           = "2024-02-01"
    # ────────────────────────────────────────────────────────

    # 1. Obtain tokens via Managed Identity (no keys!).
    #    get_bearer_token_provider refreshes the token automatically
    #    when it expires, unlike a one-off get_token() call.
    credential     = ManagedIdentityCredential()
    token_provider = get_bearer_token_provider(
        credential, "https://cognitiveservices.azure.com/.default"
    )

    # 2. Build the AzureOpenAI client using token-based auth
    client = AzureOpenAI(
        azure_endpoint          = AZURE_OPENAI_ENDPOINT,
        api_version             = API_VERSION,
        azure_ad_token_provider = token_provider,   # <-- token-based, not key-based
    )

    # 3. Call the model
    response = client.chat.completions.create(
        model    = DEPLOYMENT_NAME,
        messages = [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user",   "content": "Hello! Tell me more about Managed identity authentication without using access keys for Azure Open AI"}
        ]
    )

    print("✅ Response:", response.choices[0].message.content)



    Now run the Python file: python3 chat.py

    You will get the response from your gpt-5-chat model, without using access keys.








