[Feature] UI - Admin Settings: Add option for Authentication for public AI Hub #20444
yuneng-jiang merged 5 commits into main
Conversation
Greptile Overview

Greptile Summary

This PR adds a UI setting, `require_auth_for_public_ai_hub`, that lets administrators require authentication for the public AI Hub, and refactors the JWT utilities and the `useAuthorized` hook to share token-validation logic.

Issues Found:
- The frontend enforces the new setting, but the backend `/public/model_hub` endpoint still hardcodes authentication, so the setting does not actually control backend access.

Confidence Score: 3/5
| Filename | Overview |
|---|---|
| litellm/proxy/ui_crud_endpoints/proxy_setting_endpoints.py | Added require_auth_for_public_ai_hub boolean field to UISettings model and allowlist - backend definition is correct |
| ui/litellm-dashboard/src/components/AIHub/ModelHubTable.tsx | Added frontend auth check using require_auth_for_public_ai_hub setting - redirects to login when enabled and token invalid, but backend enforcement is missing |
| ui/litellm-dashboard/src/app/(dashboard)/hooks/useAuthorized.ts | Refactored to use new checkTokenValidity and decodeToken utilities, consolidated redirect logic into single useEffect - cleaner implementation |
| ui/litellm-dashboard/src/utils/jwtUtils.ts | Added decodeToken and checkTokenValidity utility functions with proper error handling - good refactor for code reuse |
| ui/litellm-dashboard/src/components/networking.tsx | Removed authentication header from getUiSettings call - correctly made endpoint public |
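The table above references the new `decodeToken` and `checkTokenValidity` utilities in `jwtUtils.ts`. The PR does not show their bodies, so the following is only a minimal sketch of what such utilities typically look like, assuming a standard JWT whose payload segment is base64url-encoded JSON with an optional `exp` claim:

```typescript
// Hypothetical sketch of the jwtUtils.ts helpers; the names mirror the PR,
// the implementations are assumptions.

interface DecodedToken {
  exp?: number; // expiry, seconds since epoch
  [claim: string]: unknown;
}

// Decode the payload segment of a JWT without verifying its signature.
function decodeToken(token: string | null | undefined): DecodedToken | null {
  if (!token) return null;
  try {
    const payload = token.split(".")[1];
    if (!payload) return null;
    const json = Buffer.from(payload, "base64url").toString("utf8");
    return JSON.parse(json) as DecodedToken;
  } catch {
    return null; // malformed token
  }
}

// A token is treated as valid if it decodes and is not expired.
function checkTokenValidity(token: string | null | undefined): boolean {
  const decoded = decodeToken(token);
  if (!decoded) return false;
  if (typeof decoded.exp === "number" && decoded.exp * 1000 < Date.now()) {
    return false; // expired
  }
  return true;
}
```

Note this kind of check only gates the UI; signature verification still has to happen server-side on every authenticated request.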
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant Browser
    participant ModelHubTable
    participant useUISettings
    participant Networking
    participant Backend
    participant Database
    User->>Browser: Access Public AI Hub
    Browser->>ModelHubTable: Render component
    ModelHubTable->>useUISettings: Fetch UI settings
    useUISettings->>Networking: getUiSettings()
    Networking->>Backend: GET /get/ui_settings (no auth)
    Backend->>Database: Query ui_settings
    Database-->>Backend: Return settings
    Backend-->>Networking: Return {require_auth_for_public_ai_hub: true/false}
    Networking-->>useUISettings: Settings data
    useUISettings-->>ModelHubTable: Settings with require_auth flag
    alt require_auth_for_public_ai_hub is true
        ModelHubTable->>ModelHubTable: getCookie("token")
        ModelHubTable->>ModelHubTable: checkTokenValidity(token)
        alt token is invalid
            ModelHubTable->>Browser: router.replace("/ui/login")
            Browser->>User: Redirect to login page
        else token is valid
            ModelHubTable->>Backend: Fetch model hub data (authenticated)
            Backend-->>ModelHubTable: Return data
            ModelHubTable->>User: Display AI Hub
        end
    else require_auth_for_public_ai_hub is false
        ModelHubTable->>Backend: Fetch model hub data
        Backend-->>ModelHubTable: Return data
        ModelHubTable->>User: Display AI Hub (public access)
    end
```
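The diagram's "GET /get/ui_settings (no auth)" step corresponds to the `networking.tsx` change that removed the authentication header from `getUiSettings`. A minimal sketch of building such an unauthenticated request; the endpoint path comes from the diagram, while the helper name and shape are assumptions:

```typescript
// Hypothetical request builder for the public UI-settings fetch.
interface UiSettingsRequest {
  url: string;
  headers: Record<string, string>;
}

function buildUiSettingsRequest(proxyBaseUrl: string): UiSettingsRequest {
  return {
    url: `${proxyBaseUrl}/get/ui_settings`,
    // Deliberately no Authorization header: this endpoint must be reachable
    // before login, since it tells the UI whether login is required at all.
    headers: { "Content-Type": "application/json" },
  };
}
```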
```typescript
// Check authentication requirement for public AI Hub
useEffect(() => {
  // Only check when UI settings are loaded and this is a public page
  if (isUISettingsLoading || !publicPage) {
    return;
  }

  const requireAuth = uiSettings?.values?.require_auth_for_public_ai_hub;

  // If require_auth_for_public_ai_hub is true, verify token
  if (requireAuth === true) {
    const token = getCookie("token");
    const isTokenValid = checkTokenValidity(token);

    // If token is invalid, redirect to login
    if (!isTokenValid) {
      router.replace(`${getProxyBaseUrl()}/ui/login`);
      return;
    }
  }
  // If require_auth_for_public_ai_hub is false, allow public access (no change)
}, [isUISettingsLoading, publicPage, uiSettings, router]);
```
Frontend implements the require_auth_for_public_ai_hub check, but the backend /public/model_hub endpoint at litellm/proxy/public_endpoints/public_endpoints.py:27 still has dependencies=[Depends(user_api_key_auth)] hardcoded. This means the backend always requires authentication regardless of this setting.
The setting should control the backend endpoint's authentication dependency, not just redirect on the frontend. Otherwise, unauthenticated users will be blocked by the API even if the setting is disabled.
```typescript
accessToken: decoded?.key ?? null,
userId: decoded?.user_id ?? null,
userEmail: decoded?.user_email ?? null,
userRole: formatUserRole(decoded?.user_role),
premiumUser: decoded?.premium_user ?? null,
disabledPersonalKeyCreation: decoded?.disabled_non_admin_personal_key_creation ?? null,
showSSOBanner: decoded?.login_method === "username_password",
```
Decoded user data is returned even when isAuthorized is false. This exposes data from expired/invalid tokens. Consider gating these fields by isAuthorized like the token field on line 65.
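One way to address this review comment is to derive the whole user-facing state from `isAuthorized` in a single place, so no decoded field can leak past an invalid token. The field names below come from the diff; the gating helper itself is hypothetical:

```typescript
// Sketch of gating decoded-token fields on isAuthorized, as the review
// comment suggests. Only a subset of the hook's fields is shown.
interface Decoded {
  key?: string;
  user_id?: string;
  user_email?: string;
}

interface AuthState {
  accessToken: string | null;
  userId: string | null;
  userEmail: string | null;
}

function gatedAuthState(decoded: Decoded | null, isAuthorized: boolean): AuthState {
  // When the token is expired or invalid, expose nothing derived from it.
  if (!isAuthorized || !decoded) {
    return { accessToken: null, userId: null, userEmail: null };
  }
  return {
    accessToken: decoded.key ?? null,
    userId: decoded.user_id ?? null,
    userEmail: decoded.user_email ?? null,
  };
}
```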
Relevant issues
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory; adding at least 1 test is a hard requirement (see details)
- My PR passes `make test-unit`

CI (LiteLLM team)
Branch creation CI run
Link:
CI run for the last commit
Link:
Merge / cherry-pick CI run
Links:
Type
🆕 New Feature
🧹 Refactoring
✅ Test
Changes
Adds a UI setting `require_auth_for_public_ai_hub` that allows administrators to require authentication for accessing the public AI Hub. When enabled, unauthenticated users are redirected to the login page. Refactors the `useAuthorized` hook to consolidate token validation logic using the `checkTokenValidity` and `decodeToken` utilities. Adds tests for the ModelHubTable authentication flow, the UISettings component, the `useAuthorized` hook, and the jwtUtils functions.

Screenshots