# Gemini-Compatible OpenAI Model List

- the auth style can be Gemini-like
- the response shape is still OpenAI-style model-list JSON

Last production recheck: 2026-03-23.
## When to use it

- your client prefers `?key=`
- your middleware already uses `x-goog-api-key`
- you want an OpenAI-style `list` + `data[]` response rather than native Gemini model metadata
## Authentication

The 2026-03-23 production recheck confirmed that the following authentication styles work:

- `?key=<token>` as a query parameter
- `x-goog-api-key: <token>` as a request header

## Request examples
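Both authentication styles can be sketched as follows. This is a minimal illustration only: the base URL and token are placeholders, and the `/v1/models` path is an assumption based on the OpenAI-style response shape, not confirmed by this document.

```python
from urllib.parse import urlencode

# Placeholders -- substitute your deployment's host and token.
BASE = "https://api.example.com"
PATH = "/v1/models"  # assumed OpenAI-style path, not confirmed here
TOKEN = "YOUR_API_KEY"

# Style 1: Gemini-like query parameter (?key=)
query_url = f"{BASE}{PATH}?{urlencode({'key': TOKEN})}"

# Style 2: Gemini-like header (x-goog-api-key)
header_request = {
    "url": f"{BASE}{PATH}",
    "headers": {"x-goog-api-key": TOKEN},
}

print(query_url)
print(header_request["headers"])
```

Either request, when sent with a valid token, should yield the OpenAI-style model-list JSON described below.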
## Current production findings

In the 2026-03-23 production recheck:

- the `?key=` request returned `200`
- the `x-goog-api-key` request returned `200`
- the top-level `object` was `list`
- the top-level `success` was `true`
- the current model count was 541
- common fields inside `data[]` included: `id`, `object`, `created`, `owned_by`, `supported_endpoint_types`
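The findings above can be turned into a small client-side sanity check. A minimal sketch, assuming the response has already been parsed into a dict; the required field names come from the recheck, while the sample entry values are hypothetical:

```python
def check_model_list(payload: dict) -> list[str]:
    """Validate the OpenAI-style envelope and return the model ids."""
    assert payload.get("object") == "list"
    assert payload.get("success") is True
    # every entry should carry the common fields seen in the recheck
    required = {"id", "object", "created", "owned_by", "supported_endpoint_types"}
    for model in payload.get("data", []):
        missing = required - model.keys()
        if missing:
            raise ValueError(f"model entry missing fields: {missing}")
    return [model["id"] for model in payload.get("data", [])]

# Hypothetical single-entry payload for demonstration.
sample = {
    "object": "list",
    "success": True,
    "data": [
        {
            "id": "gemini-2.5-pro",  # illustrative id, not a confirmed entry
            "object": "model",
            "created": 1742688000,
            "owned_by": "google",
            "supported_endpoint_types": ["openai"],
        }
    ],
}
print(check_model_list(sample))
```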
## Response example
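A representative response body is sketched below. The envelope fields (`object`, `success`, `data[]`) and the per-model fields match the 2026-03-23 recheck; the specific model values shown are illustrative, not actual entries from the 541-model list:

```json
{
  "object": "list",
  "success": true,
  "data": [
    {
      "id": "gemini-2.5-pro",
      "object": "model",
      "created": 1742688000,
      "owned_by": "google",
      "supported_endpoint_types": ["openai"]
    }
  ]
}
```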
## Field notes

| Field | Type | Meaning |
|---|---|---|
| `data[]` | array | Models available to the current token |
| `id` | string | Model identifier |
| `owned_by` | string | Upstream source or owner label |
| `supported_endpoint_types` | array | Supported API styles for the model |
| `object` | string | Always `list` |
| `success` | boolean | Whether the request succeeded |
The concrete path behind each `supported_endpoint_types` value comes from the `supported_endpoint` field in `GET /api/pricing`. At the moment, `openai` maps to `POST /v1/chat/completions`, `anthropic` maps to `POST /v1/messages`, and only `openai-response` maps to `POST /v1/responses`. Claude currently exposes `openai` + `anthropic`, not `openai-response`.

Note that this endpoint does not return only Gemini models. It returns the OpenAI-style model list visible to the current token, but exposes it through a Gemini-friendly access path.
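The endpoint-type-to-path mapping above can be applied in code to resolve which API paths a given model supports. A minimal sketch; the mapping dict mirrors the documented values, while the Claude entry is a hypothetical example of the stated `openai` + `anthropic` exposure:

```python
# Path mapping as documented; keys are the endpoint-type labels.
ENDPOINT_PATHS = {
    "openai": "POST /v1/chat/completions",
    "anthropic": "POST /v1/messages",
    "openai-response": "POST /v1/responses",
}

def paths_for(model: dict) -> list[str]:
    """Resolve a model's supported_endpoint_types to concrete API paths,
    skipping any labels the mapping does not cover."""
    return [
        ENDPOINT_PATHS[t]
        for t in model.get("supported_endpoint_types", [])
        if t in ENDPOINT_PATHS
    ]

# Hypothetical Claude entry: per the docs it exposes openai + anthropic only.
claude = {"id": "claude-sonnet", "supported_endpoint_types": ["openai", "anthropic"]}
print(paths_for(claude))
```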