Releases: BerriAI/litellm
v1.83.8-nightly
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.8-nightly
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.8-nightly/cosign.pub \
ghcr.io/berriai/litellm:v1.83.8-nightly
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
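The pinned-commit verification above can also be scripted so that deployment fails closed when a signature does not check out. This is a sketch, not an official LiteLLM tool: the helper names (litellm_image_ref, verify_and_pull) are ours, and it assumes cosign and docker are on PATH.

```shell
#!/bin/sh
# Sketch: verify a LiteLLM image signature against the pinned key before pulling.
set -eu

# Commit hash from the release notes; the key URL is immutable because the
# hash is cryptographically bound to the repository state that introduced it.
COSIGN_COMMIT="0112e53046018d726492c814b3644b7d376029d0"
COSIGN_KEY_URL="https://raw.githubusercontent.com/BerriAI/litellm/${COSIGN_COMMIT}/cosign.pub"

# Build the GHCR image reference for a given release tag (hypothetical helper).
litellm_image_ref() {
  printf 'ghcr.io/berriai/litellm:%s' "$1"
}

# Fail closed: only pull the image if cosign verifies it against the pinned key.
verify_and_pull() {
  ref=$(litellm_image_ref "$1")
  cosign verify --key "$COSIGN_KEY_URL" "$ref" >/dev/null 2>&1 || {
    echo "signature verification failed for $ref" >&2
    return 1
  }
  docker pull "$ref"
}
```

Usage would be e.g. `verify_and_pull v1.83.8-nightly` in a deploy script, so an unsigned or tampered image never reaches `docker pull`.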
What's Changed
- fix(mcp): is_tool_name_prefixed validates against known server prefixes by @voidborne-d in #25085
- fix(s3_v2): use prepared URL for SigV4-signed S3 requests by @nehaaprasad in #25074
- fix(cache): Prevent 'multiple values' TypeError in get_cache_key by @hunterchris in #20261
- fix(presidio): use correct text positions in anonymize_text by @Dmitry-Kucher in #24998
- feat(prometheus): add 7m and 10m latency histogram buckets by @kulia26 in #25071
- fix(ui): resolve login redirect loop when reverse proxy adds HttpOnly to cookies by @jaxhend in #23532
- fix(proxy): set key_alias=user_id in JWT auth for Prometheus metrics by @michelligabriele in #25340
- fix(vertex_ai): normalize Gemini finish_reason enum through map_finis… by @abhyudayareddy in #25337
- fix: remove leading space from license public_key.pem by @milan-berri in #25339
- feat(dashscope): preserve cache_control for explicit prompt caching by @silencedoctor in #25331
- fix: expose reasoning effort fields in get_model_info + add together_ai/gpt-oss-120b by @avarga1 in #25263
- Add PromptGuard guardrail integration by @acebot712 in #24268
- Revert "fix(proxy): set key_alias=user_id in JWT auth for Prometheus metrics" by @krrish-berri-2 in #25438
- build: migrate packaging, CI, and Docker from Poetry to uv by @stuxf in #25007
- fix(security): bump vulnerable dependencies (22 of 25 dependabot alerts) by @stuxf in #25442
- blog: restyle docs.litellm.ai/blog to engineering blog aesthetic by @ishaan-berri in #25580
- blog: Ramp-style engineering blog restyle + Redis circuit breaker post by @ishaan-berri in #25583
- feat(advisor): advisor tool orchestration loop for non-Anthropic providers by @ishaan-berri in #25579
- Litellm dev 04 11 2026 p1 by @krrish-berri-2 in #25585
- blog: add back arrow to blog post pages by @ishaan-berri in #25587
- fix(proxy): cache invalidation double-hashes token in bulk update and key rotation by @dkindlund in #25552
- fix(proxy): model_max_budget silently broken for routed models by @dkindlund in #25549
- fix(embedding): omit null encoding_format for openai requests by @meutsabdahal in #25395
- fix(budget): align reset times for legacy entities (Team Members, End Users) with standardized calendar by @DmitriyAlergant in #25440
- feat(model): add wandb model offerings to include kimi-k2.5 and minimax-m2.5 by @csoni-cweave in #25409
- [Fix] Align field-level checks in user and key update endpoints by @yuneng-berri in #25541
- [Refactor] UI - Virtual Keys: migrate regenerate key modal to AntD by @yuneng-berri in #25406
- [Fix] tighten handling of environment references in request parameters by @yuneng-berri in #25592
- merge main by @Sameerlite in #25616
- Litellm oss staging 04 08 2026 by @krrish-berri-2 in #25397
- fix(auth): gate post-custom-auth DB lookups behind opt-in flag by @michelligabriele in #25634
- fix: blog dark mode - text invisible on dark background by @krrish-berri-2 in #25620
- [Fix] /spend/logs: align filter handling with user scoping by @yuneng-berri in #25594
- feat: add litellm.compress() — BM25-based prompt compression with retrieval tool by @krrish-berri-2 in #25637
- docs: week 2 checklist by @mubashir1osmani in #25452
- feat(guardrails): per-team opt-out for specific global guardrails by @ryan-crabbe-berri in #25575
- Litellm ishaan april11 by @ishaan-berri in #25586
- chore: remove deprecated tests/ui_e2e_tests/ suite by @ryan-crabbe-berri in #25657
- test(e2e): add edit team model TPM/RPM limits test by @ryan-crabbe-berri in #25658
- Litellm oss staging 04 09 2026 by @krrish-berri-2 in #25463
- [Feature] UI - Teams: Allow Editing Router Settings After Team Creation by @yuneng-berri in #25398
- [Infra] Merge dev branch with main by @yuneng-berri in #25647
- [Test] UI - Models: Add E2E tests for Add Model flow by @yuneng-berri in #25590
- Revert "fix(embedding): omit null encoding_format for openai requests" by @Sameerlite in #25698
- feat(bedrock): normalize custom tool JSON schema for Invoke and Converse by @Sameerlite in #25396
- feat(gemini): Veo Lite pricing, video resolution usage and tiered cost by @Sameerlite in #25348
- litellm_staging_04_04_2026 by @krrish-berri-2 in #25192
- Litellm oss staging 04 11 2026 by @krrish-berri-2 in #25589
- litellm oss staging 04/13/2026 by @krrish-berri-2 in #25665
- fix(cost-map): add us-south1 to vertex qwen3-235b-a22b-instruct-2507-maas by @ti3x in #25382
- feat: add litellm.compress() — BM25-based prompt compression with ret… by @krrish-berri-2 in #25650
- fix(ui): pre-select backend default for boolean guardrail provider fields by @ryan-crabbe-berri in #25700
- fix: isolate logs team filter dropdown from root teams state bleed by @ryan-crabbe-berri in #25716
- test(ui): add getCookie to cookieUtils mock in user_dashboard test by @ryan-crabbe-berri in #25719
- [Docs] Add release notes for v1.83.3-stable and v1.83.7.rc.1 by @yuneng-berri in #25723
- fix: default invite user modal global role to least-privilege by @ryan-crabbe-berri in #25721
- [Docs] Regenerate v1.83.3-stable release notes from previous stable by @yuneng-berri in #25726
- [Refactor] Remove Chat UI link from Swagger docs message by @yuneng-berri in #25727
- [Fix] Test - Together AI: replace deprecated Mixtral with serverless Qwen3.5-9B by @yuneng-berri in #25728
- fallbacks image by @shivamrawat1 in #25731
- [Test] Replace flaky bedrock gpt-oss tool-call live test with request-body mock by @yuneng-berri in #25739
- docs update by @shivamrawat1 in #25736
- fix: remove non-existent litellm_mcps_tests_coverage from coverage combine by @joereyna in #25737
- fix(ci): increase test-server-root-path timeout to 30m by @joereyna in #25741
- bump: version 1.83.7 → 1.83.8 by @yuneng-berri in #25730
New Contributors
- @hunterchris made their first contribution in #20261
- @Dmitry-Kucher made their first contribution in #24998
- @kulia26 made their first contribution in #25071
- @jaxhend made their first contribution in #23532
- @abhyudayareddy made their first contribution in #25337
- @avarga1 made their first contribution in https://...
v1.83.7.rc.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.7.rc.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.7.rc.1/cosign.pub \
ghcr.io/berriai/litellm:v1.83.7.rc.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
⚠️ Breaking Changes
Prometheus default latency histogram buckets reduced (#25527)
The default LATENCY_BUCKETS configuration has been reduced from 35 to 18 bucket boundaries. If you have existing Prometheus dashboards or PromQL SLO queries that reference specific le values (e.g. le="1.5", le="9.5"), those series will no longer exist after upgrading. Review and update any affected queries and dashboard panels before rolling out this version.
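Before upgrading, you can audit exported dashboard JSON or recording-rule files for hard-coded le values that may disappear. This is a small illustrative helper (the function name and file paths are our own, not part of LiteLLM):

```shell
#!/bin/sh
# Sketch: list every hard-coded le="<value>" occurrence in the given files,
# one per line, so panels referencing removed bucket boundaries are easy to find.
find_le_refs() {
  grep -ho 'le="[0-9.]*"' "$@" | sort -u
}
```

Running e.g. `find_le_refs dashboards/*.json rules/*.yml` and comparing the output against the 18 remaining boundaries flags every query that needs updating.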
What's Changed
- Fix node-gyp symlink path after npm upgrade in Dockerfile by @joereyna in #25048
- [Test] UI - E2E: Add Playwright tests with local PostgreSQL by @yuneng-berri in #25126
- feat: add POST /team/permissions_bulk_update endpoint by @ryan-crabbe-berri in #25239
- added applyguardrail to inline iam by @shivamrawat1 in #25241
- [Infra] Pin cosign.pub verification to initial commit hash by @yuneng-berri in #25273
- feat(containers): Azure routing, managed container IDs, delete response parsing by @Sameerlite in #25287
- [Fix] Update check_responses_cost tests for _expire_stale_rows by @yuneng-berri in #25299
- [Fix] Dockerfile.non_root: handle missing .npmrc gracefully by @yuneng-berri in #25307
- [Refactor] Align /v2/key/info response handling with v1 by @yuneng-berri in #25313
- [Infra] Bump version 1.83.4 → 1.83.5 by @yuneng-berri in #25316
- fix(mcp): block arbitrary command execution via stdio transport by @Sameerlite in #25343
- [Infra] Migrate Redis caching tests from GHA to CircleCI by @yuneng-berri in #25354
- [Feature] UI E2E Tests: Proxy Admin Team and Key Management by @yuneng-berri in #25365
- [Fix] UI: improve storage handling and Dockerfile consistency by @yuneng-berri in #25384
- feat(bedrock): skip dummy user continue for assistant prefix prefill by @Sameerlite in #25419
- fix(websearch_interception): ensure spend/cost logging runs when stream=True by @Sameerlite in #25424
- fix(responses-ws): append ?model= to backend WebSocket URL by @joereyna in #25437
- feat(mcp): add per-user OAuth token storage for interactive MCP flows by @csoni-cweave in #25441
- fix(test): mock headers in test_completion_fine_tuned_model by @joereyna in #25444
- fix(proxy): improve input validation on management endpoints by @jaydns in #25445
- fix(logging): preserve proxy key-auth metadata on /v1/messages Langfuse traces by @michelligabriele in #25448
- Add file content streaming support for OpenAI and related utilities by @harish876 in #25450
- Team member permission /spend/logs for team-wide spend logs (UI + RBAC) by @shivamrawat1 in #25458
- fix(proxy): pass-through multipart uploads and Bedrock JSON body by @shivamrawat1 in #25464
- fix(proxy): use parameterized query for combined_view token lookup by @jaydns in #25467
- [Test] UI - Unit tests: raise global vitest timeout and remove per-test overrides by @yuneng-berri in #25468
- [Docs] Add missing MCP per-user token env vars to config_settings by @yuneng-berri in #25471
- refactor: consolidate route auth for UI and API tokens by @ryan-crabbe-berri in #25473
- [Fix] Harden file path resolution in skill archive extraction by @yuneng-berri in #25475
- [Fix] Align v1 guardrail and agent list responses with v2 field handling by @yuneng-berri in #25478
- [Fix] Flush Tremor Tooltip timers in user_edit_view tests by @yuneng-berri in #25480
- feat(guardrails): optional skip system message in unified guardrail inputs by @Sameerlite in #25481
- fix(responses): map refusal stop_reason to incomplete status in streaming by @Sameerlite in #25498
- [Fix] Responses WebSocket Duplicate Keyword Argument Error by @yuneng-berri in #25513
- fix: a2a create a2a client default 60 second timeout by @milan-berri in #25514
- fix(bedrock): avoid double-counting cache tokens in Anthropic Messages streaming usage by @Sameerlite in #25517
- merge main by @Sameerlite in #25524
- feat(anthropic): support advisor_20260301 tool type by @ishaan-berri in #25525
- [Infra] Merge Dev Branch with Main by @yuneng-berri in #25526
- Reduce default latency histogram bucket cardinality by @J-Byron in #25527
- bump: version 1.83.5 → 1.83.6 by @yuneng-berri in #25528
- fix(s3): add retry with exponential backoff for transient S3 503/500 errors by @jimmychen-p72 in #25530
- docs: document april townhall announcements by @krrish-berri-2 in #25537
- fix(spend): session-TZ-independent date filtering for spend/error log queries by @ryan-crabbe-berri in #25542
- Litellm ishaan april10 by @ishaan-berri in #25545
- [Fix] Align Org and Team Endpoint Permission Checks by @yuneng-berri in #25554
- fix(proxy): preserve dict guardrail HTTPException.detail + bedrock context by @michelligabriele in #25558
- Litellm internal staging 04 11 2026 by @krrish-berri-2 in #25562
- Add "Screenshots / Proof of Fix" section to PR template by @krrish-berri-2 in #25564
- [Infra] Merge dev with main by @yuneng-berri in #25568
- Litellm harish april11 by @ishaan-berri in #25569
- [Infra] Build UI for release by @yuneng-berri in #25571
- [Infra] Rebuild UI by @yuneng-berri in #25573
- [Infra] Rebuild UI by @yuneng-berri in #25577
- bump: version 1.83.6 → 1.83.7 by @yuneng-berri in #25578
New Contributors
- @csoni-cweave made their first contribution in #25441
- @jimmychen-p72 made their first contribution in #25530
Full Changelog: v1.83.3.rc.1...v1.83.7.rc.1
v1.83.6-nightly
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.6-nightly
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.6-nightly/cosign.pub \
ghcr.io/berriai/litellm:v1.83.6-nightly
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- Litellm docs 1 83 3 by @ishaan-berri in #25166
- [Nit] Small docs fix, fixing img + folder name by @ishaan-berri in #25171
- docs: week 1 checklist by @mubashir1osmani in #25083
- [Docs] Add cosign Docker image verification steps to security blog posts by @yuneng-berri in #25122
- [Infra] Remove flaky proxy_e2e_azure_batches_tests CI workflow by @yuneng-berri in #25247
- [Docs] Enforce Black Formatting in Contributor Docs by @yuneng-berri in #25135
- [Infra] Remove Redundant Matrix Unit Test Workflow by @yuneng-berri in #25251
- feat: add POST /team/permissions_bulk_update endpoint by @ryan-crabbe-berri in #25239
- fix: batch-limit stale managed object cleanup to prevent 300K row UPD… by @ishaan-berri in #25258
- bump litellm-enterprise to 0.1.37 by @ishaan-berri in #25265
- bump litellm version to 1.83.4 by @ishaan-berri in #25266
- Litellm aws gov cloud mode support by @shivamrawat1 in #25254
- [Fix] Update check_responses_cost tests for _expire_stale_rows by @yuneng-berri in #25299
- [Test] UI - E2E: Add Playwright tests with local PostgreSQL by @yuneng-berri in #25126
- [Fix] Dockerfile.non_root: handle missing .npmrc gracefully by @yuneng-berri in #25307
- fix(auth): allow JWT override OAuth2 routing without global OAuth2 enablement by @milan-berri in #25252
- [Infra] Pin cosign.pub verification to initial commit hash by @yuneng-berri in #25273
- [Refactor] Align /v2/key/info response handling with v1 by @yuneng-berri in #25313
- Fix node-gyp symlink path after npm upgrade in Dockerfile by @joereyna in #25048
- [Infra] Bump version 1.83.4 → 1.83.5 by @yuneng-berri in #25316
- fix(mcp): block arbitrary command execution via stdio transport by @Sameerlite in #25343
- [Infra] Migrate Redis caching tests from GHA to CircleCI by @yuneng-berri in #25354
- [Feature] UI E2E Tests: Proxy Admin Team and Key Management by @yuneng-berri in #25365
- Add Ramp as a built-in success callback by @kedarthakkar in #23769
- fix(router): tag-based routing broken when encrypted_content_affinity is enabled by @Sameerlite in #25347
- feat(triton): add embedding usage estimation for self-hosted responses by @Sameerlite in #25345
- fix(router): pass custom_llm_provider to get_llm_provider for unprefixed model names by @Sameerlite in #25334
- Litellm oss staging 04 02 2026 p1 by @krrish-berri-2 in #25055
- feat(cost): add baseten model api pricing entries by @Sameerlite in #25358
- feat(proxy): add credential overrides per team/project via model_config metadata by @michelligabriele in #24438
- docs: add Docker Image Security Guide (cosign verification & deployment best practices) by @krrish-berri-2 in #25439
- fix(test): mock headers in test_completion_fine_tuned_model by @joereyna in #25444
- feat(mcp): add per-user OAuth token storage for interactive MCP flows by @csoni-cweave in #25441
- [Fix] UI: improve storage handling and Dockerfile consistency by @yuneng-berri in #25384
- fix(responses-ws): append ?model= to backend WebSocket URL by @joereyna in #25437
- [Docs] Add missing MCP per-user token env vars to config_settings by @yuneng-berri in #25471
- [Test] UI - Unit tests: raise global vitest timeout and remove per-test overrides by @yuneng-berri in #25468
- refactor: consolidate route auth for UI and API tokens by @ryan-crabbe-berri in #25473
- [Fix] Responses WebSocket Duplicate Keyword Argument Error by @yuneng-berri in #25513
- fix(bedrock): avoid double-counting cache tokens in Anthropic Messages streaming usage by @Sameerlite in #25517
- bump: version 1.83.5 → 1.83.6 by @yuneng-berri in #25528
New Contributors
- @kedarthakkar made their first contribution in #23769
- @csoni-cweave made their first contribution in #25441
Full Changelog: v1.83.3.rc.1...v1.83.6-nightly
v1.82.3.dev.9
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.82.3.dev.9
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.82.3.dev.9/cosign.pub \
ghcr.io/berriai/litellm:v1.82.3.dev.9
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
Full Changelog: v1.82.3.dev.6...v1.82.3.dev.9
v1.82.3-stable.patch.4
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.82.3-stable.patch.4
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.82.3-stable.patch.4/cosign.pub \
ghcr.io/berriai/litellm:v1.82.3-stable.patch.4
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
Full Changelog: v1.82.3-stable.patch.2...v1.82.3-stable.patch.4
v1.83.5-nightly
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.5-nightly
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.5-nightly/cosign.pub \
ghcr.io/berriai/litellm:v1.83.5-nightly
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- docs(blog): add security hardening April 2026 post (#25101) by @ishaan-berri in #25102
- Litellm ishaan april1 try2 by @ishaan-berri in #25110
- bump: version 1.83.1 → 1.83.2 by @ishaan-berri in #25112
- fix(ui): wire team_id filter to key alias dropdown on Virtual Keys tab by @ryan-crabbe-berri in #25114
- fix(ui): add paginated team search to usage page filter by @ryan-crabbe-berri in #25107
- fix(ui): allow changing team organization from team settings by @ryan-crabbe-berri in #25095
- docs: document default_team_params in config reference by @ryan-crabbe-berri in #25032
- feat(teams): resolve access group resources in team endpoints by @ryan-crabbe-berri in #25027
- bump litellm-proxy-extras to 0.4.64 by @ishaan-berri in #25121
- feat(proxy): add project-level guardrails support by @michelligabriele in #25087
- fix(a2a): preserve JSON-RPC envelope for AgentCore A2A-native agents by @michelligabriele in #25092
- feat(ui): add guardrails support to project create/edit forms by @michelligabriele in #25100
- Fix broken codeql-action SHA in scorecard workflow by @joereyna in #24815
- litellm ryan march 31 by @ryan-crabbe-berri in #25119
- [Infra] Building UI for Release by @yuneng-berri in #25136
- fix(ui): don't inject vector_store_ids: [] when editing a model by @ryan-crabbe-berri in #25133
- Litellm ishaan april2 by @ishaan-berri in #25113
- fix(docker): load enterprise hooks in non-root runtime image by @Sameerlite in #24917
- Litellm ishaan march30 (#24887) by @ishaan-berri in #25151
- [Fix] Team Model Update 500 Due to Unsupported Prisma JSON Path Filter by @yuneng-berri in #25152
- Litellm team model group name routing fix (#25148) by @ishaan-berri in #25154
- Litellm ishaan april4 2 by @ishaan-berri in #25150
- feat(ui): expose Azure Entra ID credential fields in provider form by @ryan-crabbe-berri in #25137
- feat(ui): add per-model rate limits to team edit/info views by @ryan-crabbe-berri in #25144
- fix(ui): use entity key for usage export display by @ryan-crabbe-berri in #25153
- Litellm ishaan march23 - MCP Toolsets + GCP Caching fix (#25146) by @ishaan-berri in #25155
- cherry-pick: tag query fix + MCP metadata support by @ishaan-berri in #25145
- feat: allow adding team guardrails from the UI by @ryan-crabbe-berri in #25038
- Litellm ryan apr 4 by @ryan-crabbe-berri in #25156
- [Infra] Rebuild UI for Release by @yuneng-berri in #25158
- bump: version 1.83.2 → 1.83.3 by @yuneng-berri in #25162
- bump litellm-proxy-extras to 0.4.65 by @ishaan-berri in #25163
- bump litellm-enterprise to 0.1.36 by @ishaan-berri in #25164
- fix: regenerate poetry.lock by @ishaan-berri in #25169
- Litellm docs 1 83 3 by @ishaan-berri in #25166
- [Nit] Small docs fix, fixing img + folder name by @ishaan-berri in #25171
- docs: week 1 checklist by @mubashir1osmani in #25083
- [Docs] Add cosign Docker image verification steps to security blog posts by @yuneng-berri in #25122
- [Infra] Remove flaky proxy_e2e_azure_batches_tests CI workflow by @yuneng-berri in #25247
- [Docs] Enforce Black Formatting in Contributor Docs by @yuneng-berri in #25135
- [Infra] Remove Redundant Matrix Unit Test Workflow by @yuneng-berri in #25251
- feat: add POST /team/permissions_bulk_update endpoint by @ryan-crabbe-berri in #25239
- fix: batch-limit stale managed object cleanup to prevent 300K row UPD… by @ishaan-berri in #25258
- bump litellm-enterprise to 0.1.37 by @ishaan-berri in #25265
- bump litellm version to 1.83.4 by @ishaan-berri in #25266
- Litellm aws gov cloud mode support by @shivamrawat1 in #25254
- [Fix] Update check_responses_cost tests for _expire_stale_rows by @yuneng-berri in #25299
- [Test] UI - E2E: Add Playwright tests with local PostgreSQL by @yuneng-berri in #25126
- [Fix] Dockerfile.non_root: handle missing .npmrc gracefully by @yuneng-berri in #25307
- fix(auth): allow JWT override OAuth2 routing without global OAuth2 enablement by @milan-berri in #25252
- [Infra] Pin cosign.pub verification to initial commit hash by @yuneng-berri in #25273
- [Refactor] Align /v2/key/info response handling with v1 by @yuneng-berri in #25313
- Fix node-gyp symlink path after npm upgrade in Dockerfile by @joereyna in #25048
- [Infra] Bump version 1.83.4 → 1.83.5 by @yuneng-berri in #25316
Full Changelog: v1.83.1-nightly...v1.83.5-nightly
v1.83.3.rc.1
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.3.rc.1
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.3.rc.1/cosign.pub \
ghcr.io/berriai/litellm:v1.83.3.rc.1
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- fix(ui): wire team_id filter to key alias dropdown on Virtual Keys tab by @ryan-crabbe-berri in #25114
- fix(ui): add paginated team search to usage page filter by @ryan-crabbe-berri in #25107
- fix(ui): allow changing team organization from team settings by @ryan-crabbe-berri in #25095
- docs: document default_team_params in config reference by @ryan-crabbe-berri in #25032
- feat(teams): resolve access group resources in team endpoints by @ryan-crabbe-berri in #25027
- feat(proxy): add project-level guardrails support by @michelligabriele in #25087
- fix(a2a): preserve JSON-RPC envelope for AgentCore A2A-native agents by @michelligabriele in #25092
- feat(ui): add guardrails support to project create/edit forms by @michelligabriele in #25100
- Fix broken codeql-action SHA in scorecard workflow by @joereyna in #24815
- litellm ryan march 31 by @ryan-crabbe-berri in #25119
- [Infra] Building UI for Release by @yuneng-berri in #25136
- fix(ui): don't inject vector_store_ids: [] when editing a model by @ryan-crabbe-berri in #25133
- Litellm ishaan april2 by @ishaan-berri in #25113
- fix(docker): load enterprise hooks in non-root runtime image by @Sameerlite in #24917
- Litellm ishaan march30 (#24887) by @ishaan-berri in #25151
- [Fix] Team Model Update 500 Due to Unsupported Prisma JSON Path Filter by @yuneng-berri in #25152
- Litellm team model group name routing fix (#25148) by @ishaan-berri in #25154
- Litellm ishaan april4 2 by @ishaan-berri in #25150
- feat(ui): expose Azure Entra ID credential fields in provider form by @ryan-crabbe-berri in #25137
- feat(ui): add per-model rate limits to team edit/info views by @ryan-crabbe-berri in #25144
- fix(ui): use entity key for usage export display by @ryan-crabbe-berri in #25153
- Litellm ishaan march23 - MCP Toolsets + GCP Caching fix (#25146) by @ishaan-berri in #25155
- cherry-pick: tag query fix + MCP metadata support by @ishaan-berri in #25145
- feat: allow adding team guardrails from the UI by @ryan-crabbe-berri in #25038
- Litellm ryan apr 4 by @ryan-crabbe-berri in #25156
- [Infra] Rebuild UI for Release by @yuneng-berri in #25158
- bump: version 1.83.2 → 1.83.3 by @yuneng-berri in #25162
- bump litellm-proxy-extras to 0.4.65 by @ishaan-berri in #25163
- bump litellm-enterprise to 0.1.36 by @ishaan-berri in #25164
- fix: regenerate poetry.lock by @ishaan-berri in #25169
Full Changelog: v1.83.2-nightly...v1.83.3.rc.1
v1.83.3-stable
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
ghcr.io/berriai/litellm:v1.83.3-stable
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.83.3-stable/cosign.pub \
ghcr.io/berriai/litellm:v1.83.3-stable
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
What's Changed
- fix(anthropic): preserve thinking.summary when routing to OpenAI Responses API by @Chesars in #21441
- docs: add thinking.summary field to /v1/messages and reasoning docs by @Chesars in #22823
- fix(gemini): resolve image token undercounting in usage metadata by @gustipardo in #22608
- feat(anthropic): add opt-out flag for default reasoning summary by @Chesars in #22904
- fix(anthropic): align translate_thinking_for_model with default summary injection + docs by @Chesars in #22909
- Fix: Vertex ai Batch Output File Download Fails with 500 by @Sameerlite in #23718
- docs(blog): add WebRTC blog post link by @Sameerlite in #23547
- Refactor: Filtering beta header after transformation by @Sameerlite in #23715
- fix(streaming): preserve custom attributes on final stream chunk by @Sameerlite in #23530
- [Feat] Add create character endpoints and other new videos Endpoints by @Sameerlite in #23737
- Litellm oss staging 03 14 2026 by @RheagalFire in #23686
- fix: align DefaultInternalUserParams Pydantic default with runtime fallback by @ryan-crabbe in #23666
- [Test] UI Dashboard - Add unit tests for 5 untested files by @yuneng-jiang in #23773
- [Infra] Merging RC Branch with Main by @yuneng-jiang in #23786
- [Fix] Privilege Escalation on /key/block, /key/unblock, and /key/update max_budget by @yuneng-jiang in #23781
- chore(ui): migrate DefaultUserSettings buttons from Tremor to antd by @ryan-crabbe in #23787
- fix: set oauth2_flow when building MCPServer in _execute_with_mcp_client by @joereyna in #23468
- [Fix] UI - Logs: Empty Filter Results Show Stale Data by @yuneng-jiang in #23792
- Litellm update blog posts rss by @ryan-crabbe in #23791
- [Fix] Prevent Internal Users from Creating Invalid Keys by @yuneng-jiang in #23795
- [Fix] Key Alias Re-validation on Update Blocks Legacy Aliases by @yuneng-jiang in #23798
- fix: Register DynamoAI guardrail initializer and enum entry by @Harshit28j in #23752
- docs: add v1.82.3 release notes by @joereyna in #23816
- Revert "docs: add v1.82.3 release notes" in #23817
- fix(fireworks): skip #transform=inline for base64 data URLs by @awais786 in #23729
- fix(langsmith): avoid no running event loop during sync init by @pandego in #23727
- [Feature] Disable Custom Virtual Key Values via UI Setting by @yuneng-jiang in #23812
- fix(gemini): support images in tool_results for /v1/messages routing by @awais786 in #23724
- fix(ui): CSV export empty on Global Usage page by @ryan-crabbe in #23819
- fix: langfuse trace leak key on model params by @Harshit28j in #22188
- [Infra] Merge personal dev branch with daily dev branch by @yuneng-jiang in #23826
- fix(model-prices): correct supported_regions for Vertex AI DeepSeek models by @Chesars in #23864
- fix(model-prices): restore gpt-4-0314 by @Chesars in #23753
- fix(cache): Fix Redis cluster caching by @cohml in #23480
- Revert "fix: langfuse trace leak key on model params" by @yuneng-jiang in #23868
- [Infra] Merge daily dev branch with main by @yuneng-jiang in #23827
- Litellm ryan march 16 by @ryan-crabbe in #23822
- fix(proxy): convert max_budget to float when set via environment variable by @rstar327 in #23855
- [Test] UI: Add unit tests for 10 untested components by @yuneng-jiang in #23891
- Add Akto Guardrails to LiteLLM by @rzeta-10 in #23250
- fix(core): map Anthropic 'refusal' finish reason to 'content_filter' by @Chesars in #23899
- fix(vertex): streaming finish_reason='stop' instead of 'tool_calls' for gemini-3.1-flash-lite-preview by @Chesars in #23895
- [Fix] Add contents:write permission to ghcr_deploy release job by @yuneng-jiang in #23917
- [Infra] bump: version 1.82.3 → 1.82.4 by @yuneng-jiang in #23919
- docs(mcp_zero_trust): add MCP zero trust auth guide by @ishaan-jaff in #23918
- Capture incomplete terminal error in background streaming by @xianzongxie-stripe in #23881
- fix: cache_control directive dropped anthropic document/file blocks by @kelvin-tran in #23911
- [Infra] Security and Proxy Extras for Nightly by @yuneng-jiang in #23921
- fix: map Chat Completion file type to Responses API input_file by @gambletan in #23618
- fix(vertex): respect vertex_count_tokens_location for Claude count_tokens by @Chesars in #23907
- fix(anthropic): preserve cache directive on file-type content blocks by @Chesars in #23906
- fix(mistral): preserve diarization segments in transcription response by @Chesars in #23925
- fix(gemini): pass model to context caching URL builder for custom api_base by @Chesars in #23928
- fix(azure): auto-route gpt-5.4+ tools+reasoning to Responses API by @Chesars in #23926
- fix: auto-recover shared aiohttp session when closed by @voidborne-d in #23808
- [Feature] /v2/team/list: Add org admin access control, members_count, and indexes by @yuneng-jiang in #23938
- [Refactor] UI - Playground: Extract FilePreviewCard from ChatUI by @yuneng-jiang in #23973
- docs: add v1.82.3 release notes by @joereyna in #23820
- fix(proxy): model-level guardrails not executing for non-streaming post_call by @michelligabriele in #23774
- fix(proxy): prevent duplicate callback logs for pass-through endpoint failures by @michelligabriele in #23509
- docs: Revamp documentation site with new navigation, landing pages, and styling by @Arindam200 in #24023
- Fix langfuse otel traceparent propagation by @jyeros in #24048
- [Test] UI: Add unit tests for 10 untested components by @yuneng-jiang in #24036
- [Fix] UI - Logs: Guardrail Mode Type Crash on Non-String Values by @yuneng-jiang in #24035
- [Staging] - Ishaan March 17th by @ishaan-jaff in #23903
- [Infra] Merge daily branch with main by @yuneng-jiang in #24055
- [Fix] UI - Default Team Settings: Add Missing Permission Options by @yuneng-jiang in #24039
- fix: /key/block and /key/unblock return 404 (not 401) for non-existent keys by @yuneng-jiang in #23977
- [Refactor] UI - Playground: Extract ChatMessageBubble from ChatUI by @yuneng-jiang in #24062
- [Fix] Key Update Endpoint Returns 401 Instead of 404 for Nonexistent Keys by @yuneng-jiang in #24063
- fix: surface Anthropic code execution results as cod...
v1.82.3.dev.7
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. To verify the integrity of an image before deploying:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.82.3.dev.7/cosign.pub \
ghcr.io/berriai/litellm:v1.82.3.dev.7
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
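The verify-then-deploy steps above can be wrapped in a small helper. This is a minimal sketch, not an official LiteLLM script: the image path and key URL follow the release notes above, while the function name `verify_and_pull` and the error handling are illustrative assumptions. It requires cosign (and docker, for the pull) on PATH.

```shell
# Sketch: verify a LiteLLM image signature before pulling it.
# Registry path and key URL are from the release notes above;
# the helper itself is illustrative, not part of the project.
verify_and_pull() {
  tag="$1"
  image="ghcr.io/berriai/litellm:${tag}"
  key_url="https://raw.githubusercontent.com/BerriAI/litellm/${tag}/cosign.pub"

  # cosign verify exits non-zero if the signature does not match the key
  if cosign verify --key "${key_url}" "${image}" >/dev/null 2>&1; then
    echo "signature OK: ${image}"
    docker pull "${image}"
  else
    echo "signature verification failed: ${image}" >&2
    return 1
  fi
}

# Example usage (requires cosign and docker installed):
# verify_and_pull v1.82.3.dev.7
```

Gating the pull on the verify step means an image whose signature cannot be validated never reaches the local daemon.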
Full Changelog: v1.82.3.dev.5...v1.82.3.dev.7
v1.82.3.dev.6
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. To verify the integrity of an image before deploying:
cosign verify \
--key https://raw.githubusercontent.com/BerriAI/litellm/v1.82.3.dev.6/cosign.pub \
ghcr.io/berriai/litellm:v1.82.3.dev.6
Expected output:
The following checks were performed on each of these signatures:
- The cosign claims were validated
- The signatures were verified against the specified public key
Full Changelog: v1.82.3.dev.5...v1.82.3.dev.6