
HTTP error while calling OpenAI models using Microsoft Foundry

Agarwal, Anjali 20 Reputation points
2026-04-22T11:17:19.6466667+00:00

Code.py

    %pip install azure-ai-projects==2.0.0b2 openai==1.109.1 python-dotenv azure-identity

    import os
    from dotenv import load_dotenv
    from azure.identity import DefaultAzureCredential
    from azure.ai.projects import AIProjectClient

    load_dotenv()
    foundry_project_endpoint = os.getenv("FOUNDRY_PROJECT_ENDPOINT")
    model_deployment_name = os.getenv("MODEL_DEPLOYMENT_NAME")

    project_client = AIProjectClient(
        endpoint=foundry_project_endpoint,
        credential=DefaultAzureCredential()
    )

    openai_client = project_client.get_openai_client()

    response = openai_client.responses.create(
        model=model_deployment_name,
        instructions="You are a helpful AI assistant.",
        input="Can you tell me about Microsoft Foundry?"
    )
    print(f"Response output: {response.output_text}")

And .env has:

    FOUNDRY_PROJECT_ENDPOINT="https://demoprojectv1.services.ai.azure.com/api/projects/udemy-demo-proj"
    MODEL_DEPLOYMENT_NAME="gpt-4o"

Error:

ConnectError                              Traceback (most recent call last)
File ~\AppData\Roaming\Python\Python313\site-packages\httpx\_transports\default.py:101, in map_httpcore_exceptions()
    100 try:
--> 101     yield
    102 except Exception as exc:

File ~\AppData\Roaming\Python\Python313\site-packages\httpx\_transports\default.py:250, in HTTPTransport.handle_request(self, request)
    249 with map_httpcore_exceptions():
--> 250     resp = self._pool.handle_request(req)
    252 assert isinstance(resp.stream, typing.Iterable)

File ~\AppData\Roaming\Python\Python313\site-packages\httpcore\_sync\connection_pool.py:256, in ConnectionPool.handle_request(self, request)
    255     self._close_connections(closing)
--> 256     raise exc from None
    258 # Return the response. Note that in this case we still have to manage
    259 # the point at which the response is closed.

File ~\AppData\Roaming\Python\Python313\site-packages\httpcore\_sync\connection_pool.py:236, in ConnectionPool.handle_request(self, request)
    234 try:
    235     # Send the request on the assigned connection.
--> 236     response = connection.handle_request(
    237         pool_request.request
    238     )
    239 except ConnectionNotAvailable:
...
   (...)
   1022     response.headers,
   1023 )
   1024 log.debug("request_id: %s", response.headers.get("x-request-id"))

APIConnectionError: Connection error.

Foundry Agent Service

A fully managed platform in Microsoft Foundry for hosting, scaling, and securing AI agents built with any supported framework or model


Answer accepted by question author

  1. SRILAKSHMI C 18,205 Reputation points Microsoft External Staff Moderator
    2026-04-22T13:21:19.8933333+00:00

    Hello @Agarwal, Anjali

    Thank you for sharing the code snippet and detailed error message.

    The APIConnectionError / ConnectError you’re encountering indicates that the request is failing at the network/connection layer, meaning it is not reaching the Azure AI Foundry (OpenAI) service. This is typically related to endpoint configuration, authentication, or network restrictions, rather than an issue with the model or SDK usage.

    Your code structure and SDK usage look correct. Based on the details provided, the issue is most likely caused by one or more of the following:

    • Endpoint not reachable or incorrectly formatted
    • Authentication not being resolved via DefaultAzureCredential
    • Network/firewall/proxy restrictions
    • Environment variable formatting issues

    Recommended Troubleshooting Steps

    1. Validate the Foundry Project Endpoint

    Ensure the endpoint format is correct:

    https://<your-resource>.services.ai.azure.com/api/projects/<your-project-id>
    

    In your case:

    https://demoprojectv1.services.ai.azure.com/api/projects/udemy-demo-proj
    

    Also, update your .env file to remove quotes:

    FOUNDRY_PROJECT_ENDPOINT=https://demoprojectv1.services.ai.azure.com/api/projects/udemy-demo-proj
    

    Then verify it is reachable:

    Open in browser:

    https://demoprojectv1.services.ai.azure.com
    

    Or test via curl:

    curl https://demoprojectv1.services.ai.azure.com
    

    If this fails, it indicates a DNS or network connectivity issue.
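    The format checks above can be automated with a small helper. This is a hypothetical sketch, not part of any SDK: `check_foundry_endpoint` is an illustrative name, and it flags the most common mistakes, including the surrounding-quotes issue from the .env file.

```python
from urllib.parse import urlparse

def check_foundry_endpoint(endpoint: str) -> list[str]:
    """Return a list of formatting problems found in a Foundry project endpoint."""
    problems = []
    cleaned = endpoint.strip()
    if cleaned != endpoint:
        problems.append("leading/trailing whitespace")
    if cleaned and (cleaned[0] in "\"'" or cleaned[-1] in "\"'"):
        problems.append("surrounding quotes (remove them in .env)")
        cleaned = cleaned.strip("\"'")
    parsed = urlparse(cleaned)
    if parsed.scheme != "https":
        problems.append("scheme must be https")
    if not parsed.netloc.endswith(".services.ai.azure.com"):
        problems.append("host should end in .services.ai.azure.com")
    if not parsed.path.startswith("/api/projects/"):
        problems.append("path should start with /api/projects/")
    return problems

# The endpoint from the question passes all checks (prints an empty list):
print(check_foundry_endpoint(
    "https://demoprojectv1.services.ai.azure.com/api/projects/udemy-demo-proj"))
```

    An empty result means only the network/auth explanations below remain as candidates.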

    2. Verify Environment Variable Loading

    Confirm values are being read correctly in your script:

    print(foundry_project_endpoint)
    

    Ensure there are no extra spaces or hidden characters.
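    One way to surface hidden characters is to print the repr() of each value rather than the value itself; stray quotes, carriage returns, and trailing spaces that are invisible in normal output show up clearly. A minimal sketch (the helper name `dump_env` is illustrative):

```python
import os

def dump_env(name: str) -> str:
    """Render an environment variable with repr() so hidden characters show up."""
    return f"{name} = {os.getenv(name)!r}"

for var in ("FOUNDRY_PROJECT_ENDPOINT", "MODEL_DEPLOYMENT_NAME"):
    # stray quotes, \r, or trailing spaces become visible in the repr output
    print(dump_env(var))
```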

    3. Confirm Authentication

    DefaultAzureCredential() requires a valid identity. Please ensure one of the following:

    Login via Azure CLI:

    az login
    

    OR set service principal credentials as environment variables:

    AZURE_CLIENT_ID
    AZURE_TENANT_ID
    AZURE_CLIENT_SECRET


    If authentication is not properly established, the SDK may fail during connection initialization.
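    When going the service-principal route, DefaultAzureCredential (through its EnvironmentCredential link) reads AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET. A quick stdlib-only check that all of them are set (the helper name `missing_sp_vars` is illustrative):

```python
import os

def missing_sp_vars() -> list[str]:
    """Return the service-principal variables that are still unset or empty."""
    required = ("AZURE_CLIENT_ID", "AZURE_TENANT_ID", "AZURE_CLIENT_SECRET")
    return [name for name in required if not os.getenv(name)]

print(missing_sp_vars() or "all service-principal variables are set")
```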

    4. Verify Azure Permissions

    Ensure the identity being used has:

    • Reader role on the Foundry project
    • Cognitive Services OpenAI User role on the underlying Azure OpenAI resource

    5. Check Network / Firewall / Proxy

    If you are on a corporate network:

    Ensure outbound HTTPS (port 443) is allowed

    Allow access to:

    *.services.ai.azure.com
    

    If using a proxy, configure:

    set HTTPS_PROXY=http://<proxy>:<port>
    

    You can also test connectivity:

    nslookup demoprojectv1.services.ai.azure.com
    
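    If nslookup is unavailable, the same DNS check can be done from Python using only the standard library; a sketch using the hostname from the question (the `resolve` helper is illustrative):

```python
import socket

def resolve(host: str) -> list[str]:
    """Return the IP addresses a hostname resolves to.

    Raises socket.gaierror when DNS resolution fails, which matches the
    failure mode behind many httpx ConnectError exceptions.
    """
    infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

try:
    print(resolve("demoprojectv1.services.ai.azure.com"))
except socket.gaierror as exc:
    print(f"DNS resolution failed: {exc}")
```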

    6. SDK and Python Version Compatibility

    You are using:

    • azure-ai-projects==2.0.0b2 (preview SDK)
    • Python 3.13

    Some Azure SDK versions have not yet been fully tested or validated on the newest Python releases.

    Recommendation:

    • Try with Python 3.10 or 3.11
    • Upgrade SDK if a newer version is available
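    The interpreter version can be confirmed from inside the notebook itself (the traceback paths already point at Python313):

```python
import sys

# Confirm which interpreter the notebook kernel is actually running
print(sys.version)
if sys.version_info >= (3, 13):
    print("Consider retrying on Python 3.10 or 3.11")
```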

    7. Isolate the Issue

    Test credential initialization and token acquisition on their own:

    from azure.identity import DefaultAzureCredential

    credential = DefaultAzureCredential()
    credential.get_token("https://cognitiveservices.azure.com/.default")

    If this fails, it confirms the issue occurs before the OpenAI API call (at the connection/authentication level).

    Please refer to these resources:

    Troubleshoot endpoint and model discovery (Cannot find model): https://learn.microsoft.com/azure/ai-foundry/reference/region-support

    HTTP 404 “Operation Not Found” (ensure correct endpoint & deployment name): https://learn.microsoft.com/azure/ai-foundry/openai/latest

    I hope this will help you. Please feel free to let me know if you have any other queries.

    Thank you!


2 additional answers

  1. Agarwal, Anjali 20 Reputation points
    2026-04-24T16:23:49.6133333+00:00

    Hi Anshika,

    I was working with Capgemini's internal support team and they helped me with a solution that worked.

    All I had to do was run:

    pip install pip_system_certs

    Somehow Python was unable to verify TLS/SSL connections to servers whose certificates are trusted by my system.
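    For anyone hitting the same TLS issue: Python's ssl module can show which certificate store it consults by default, which helps confirm whether a corporate root CA is missing from it (pip_system_certs, as the name suggests, makes Python's HTTP stack use the operating system's trust store instead):

```python
import ssl

# Default CA locations Python's ssl module uses; if a corporate proxy
# re-signs TLS traffic with its own root CA and that CA is not in this
# store, HTTPS requests fail with connection/verification errors.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)
print("capath:", paths.capath)
print("openssl cafile:", paths.openssl_cafile)
```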

    So, you may close this case now. Thank you so much for being patient with me and helping me all this while!


  2. Agarwal, Anjali 20 Reputation points
    2026-04-24T10:02:51.9533333+00:00

    Thanks for your reply, Anshika.

    I have tried multiple combinations over the last few days and none worked for me. Could you kindly rewrite the code below with the correct SDK, and then maybe I can try running it?

    Code.py

    %pip install azure-ai-projects==2.0.0b2 openai==1.109.1 python-dotenv azure-identity

    import os
    from dotenv import load_dotenv
    from azure.identity import DefaultAzureCredential
    from azure.ai.projects import AIProjectClient

    load_dotenv()
    foundry_project_endpoint = os.getenv("FOUNDRY_PROJECT_ENDPOINT")
    model_deployment_name = os.getenv("MODEL_DEPLOYMENT_NAME")

    project_client = AIProjectClient(
        endpoint=foundry_project_endpoint,
        credential=DefaultAzureCredential()
    )

    openai_client = project_client.get_openai_client()

    response = openai_client.responses.create(
        model=model_deployment_name,
        instructions="You are a helpful AI assistant.",
        input="Can you tell me about Microsoft Foundry?"
    )
    print(f"Response output: {response.output_text}")

    And .env has:

    FOUNDRY_PROJECT_ENDPOINT="https://demoprojectv1.services.ai.azure.com/api/projects/udemy-demo-proj"
    MODEL_DEPLOYMENT_NAME="gpt-4o"

