The Sudo developer platform maintains parity with the OpenAI Chat Completions API, so you can use the OpenAI SDK to access Sudo’s AI routing. By changing just two parameters, the API key and the base URL, you can build with Sudo today.
This compatibility layer is not intended for future-proof, production use cases; it is better suited for testing and validation.
Recall that each model provider supports OpenAI features to a different degree (e.g. Anthropic models do not support the response_format parameter). When in doubt, refer to the provider’s documentation on OpenAI compatibility.
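For example, you might guard a provider-specific parameter such as response_format and fall back to a plain request if the provider rejects it. This is only a sketch under that assumption; the helper name ask_json and the fallback strategy are illustrative, not part of Sudo’s API.

import openai

openai.api_key = "your-sudo-api-key"
openai.base_url = "https://sudoapp.dev/api/v1/"

def ask_json(model, prompt):
    # response_format is not supported by every provider behind the router.
    # Assumption: retry without it if the request is rejected.
    try:
        return openai.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
            response_format={"type": "json_object"},
        )
    except openai.BadRequestError:
        return openai.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )

print(ask_json("claude-sonnet-4-20250514", "List three EU capitals as JSON.").choices[0].message.content)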
Change the API key to your Sudo API key, which you can get from the dev dashboard, and point the SDK’s base URL at https://sudoapp.dev/api/v1/.
Now, you can call Sudo’s chat API with any supported AI model of your choice.
import openai

# Replace with your Sudo API key
openai.api_key = "your-sudo-api-key"

# Set the custom base URL
openai.base_url = "https://sudoapp.dev/api/v1/"

# Make a test call to the chat endpoint
response = openai.chat.completions.create(
    model="claude-sonnet-4-20250514",  # or "gpt-4", "gemini-2.0-flash", etc.
    messages=[
        {"role": "user", "content": "Hello! What is the capital of France?"}
    ],
    temperature=0.7,
)

print("Response:")
print(response.choices[0].message.content)
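Equivalently, you can instantiate an explicit client instead of setting module-level attributes, which is the standard pattern in the 1.x OpenAI SDK. A minimal sketch, assuming the same key and base URL as above:

from openai import OpenAI

# Assumption: same Sudo key and base URL as the example above.
client = OpenAI(
    api_key="your-sudo-api-key",
    base_url="https://sudoapp.dev/api/v1/",
)

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello! What is the capital of France?"}],
    temperature=0.7,
)

print(response.choices[0].message.content)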