## Summary

Please support configuring model aliases that automatically map to a base model plus a predefined thinking/reasoning level.

Example:

- `gpt-5.4-low` -> upstream model `gpt-5.4` with `reasoning_effort=low`
- `gpt-5.4-medium` -> upstream model `gpt-5.4` with `reasoning_effort=medium`
- `gpt-5.4-high` -> upstream model `gpt-5.4` with `reasoning_effort=high`
## Current behavior

CLIProxyAPI already supports:

- dynamic thinking suffix parsing, such as `gpt-5.4(low)` and `gpt-5.4(medium)`
- model aliases in some provider configs (such as `openai-compatibility`, `vertex`, `claude`)

However, there is currently no native config mechanism to declare aliases like `gpt-5.4-low`, `gpt-5.4-medium`, and `gpt-5.4-high` and have them automatically inject the corresponding thinking/reasoning config.

For `codex-api-key`, there is also no `models`/`alias` section, so this is not currently possible in a clean built-in way.
## Why this is useful

Some clients only work well with plain model names and do not conveniently support the `model(low)` syntax.

Model aliases such as `gpt-5.4-low` are easier to:

- expose in `/v1/models`
- select in clients like Cherry Studio
- distinguish for different latency/cost profiles
- use as stable presets for end users
## Expected behavior

Allow configuration like this (example only):

```yaml
codex-api-key:
  - api-key: "..."
    base-url: "..."
    models:
      - name: "gpt-5.4"
        alias: "gpt-5.4-low"
        reasoning-effort: "low"
      - name: "gpt-5.4"
        alias: "gpt-5.4-medium"
        reasoning-effort: "medium"
      - name: "gpt-5.4"
        alias: "gpt-5.4-high"
        reasoning-effort: "high"
```
or a generic alias mapping mechanism such as:

```yaml
model-aliases:
  - alias: "gpt-5.4-low"
    target: "gpt-5.4"
    reasoning-effort: "low"
  - alias: "gpt-5.4-medium"
    target: "gpt-5.4"
    reasoning-effort: "medium"
  - alias: "gpt-5.4-high"
    target: "gpt-5.4"
    reasoning-effort: "high"
```
## Expected result
- the aliases should appear in `/v1/models`
- requests using an alias should route to the base model
- the configured thinking/reasoning setting should be injected automatically
- this should work without requiring the client to use the `gpt-5.4(low)` syntax
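Put together, the expected request handling could be sketched as follows (hypothetical; the alias table and the injected `reasoning_effort` field name are assumptions based on the OpenAI-style payload):

```python
# Hypothetical sketch: rewrite an incoming OpenAI-style payload so an
# alias routes to the base model with the configured effort injected.
ALIASES = {
    "gpt-5.4-low": ("gpt-5.4", "low"),
    "gpt-5.4-medium": ("gpt-5.4", "medium"),
    "gpt-5.4-high": ("gpt-5.4", "high"),
}

def rewrite_request(payload: dict) -> dict:
    target = ALIASES.get(payload.get("model"))
    if target is None:
        return payload  # not an alias; forward unchanged
    base_model, effort = target
    out = dict(payload)
    out["model"] = base_model
    # Keep an effort the client set explicitly; only fill it in when absent.
    out.setdefault("reasoning_effort", effort)
    return out
```

The client simply sends `"model": "gpt-5.4-low"` with no special syntax, and the proxy forwards `"model": "gpt-5.4"` plus `"reasoning_effort": "low"` upstream.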
## Notes
At the moment this can only be approximated by combining:
- alias registration in some compatibility providers
- payload override rules
But that is not a native first-class solution, especially for Codex/OpenAI model routing.
A built-in alias + reasoning mapping feature would make this much easier to use.