[rhoai-2.25] chore: cherry-pick developer tooling QoL from main#2056
jiridanek wants to merge 3 commits into rhoai-2.25 from
Conversation
[APPROVALNOTIFIER] This PR is NOT APPROVED. This pull-request has been approved by: (none listed). The full list of commands accepted by this bot can be found here. Details: needs approval from an approver in each of these files. Approvers can indicate their approval by writing
📝 Walkthrough

Multiple configuration and script updates across the project: Python version targets downgraded from 3.14 to 3.12 across workflows and tools; lockfile generation refactored from a bash script to a Python CLI with expanded build-arg configuration; a CVE constraint added for urllib3 with documentation; Makefile enhanced for conditional build arguments and prefetch logic.

Changes
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs
Suggested reviewers
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
@coderabbitai full review |
✅ Actions performed

Full review triggered.
Actionable comments posted: 6
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.github/workflows/piplock-renewal.yaml (1)
87-89: ⚠️ Potential issue | 🔴 Critical

Update workflow to use the correct Makefile target: `refresh-lock-files`

The Makefile no longer contains the `refresh-pipfilelock-files` target. The target was renamed to `refresh-lock-files`, which is defined at line 438 of the Makefile. Update the workflow to call the correct target name.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/piplock-renewal.yaml around lines 87 - 89, The workflow step currently invokes the removed Makefile target refresh-pipfilelock-files; update that CI step to call the renamed Makefile target refresh-lock-files instead. Specifically, change the job step labeled "Run make refresh-pipfilelock-files" so its run command uses make refresh-lock-files with the same PYTHON_VERSION and INCLUDE_OPT_DIRS environment arguments (i.e., replace refresh-pipfilelock-files with refresh-lock-files in the run line). Ensure no other references to refresh-pipfilelock-files remain.
🧹 Nitpick comments (2)
uv (1)
39-40: Slow path requires `uv` to be installed.

The slow path uses `uv tool run "uv@${version}"`, which requires a working `uv` installation. If no `uv` is installed at all on the system, this will fail with a "command not found" error rather than a helpful message guiding the user to install `uv` first. Consider adding a check or a clearer error message:

🔧 Proposed improvement

```diff
 # Slow path: run the pinned version via uvx (downloaded and cached on first use)
+if ! command -v uv &>/dev/null; then
+  echo "error: uv is not installed. Please install uv first: https://docs.astral.sh/uv/getting-started/installation/" >&2
+  exit 1
+fi
 exec uv tool run "uv@${version}" "$@"
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@uv` around lines 39 - 40, The slow-path call using exec uv tool run "uv@${version}" "$@" will fail with a "command not found" if uv isn't installed; add a pre-check that verifies the uv binary is present (e.g., check command -v uv / which uv) and if missing print a clear error guiding the user to install uv before attempting the slow path (or fall back to a different installer path), then only call exec uv tool run "uv@${version}" "$@" when the check succeeds.

docs/cves/python.md (1)
19-21: Add languages to the new fenced blocks.

`markdownlint` is already flagging MD040 on these fences. Adding `text`, `bash`, or `toml`/`ini` keeps the doc lint-clean.

Also applies to: 28-32, 41-45, 116-119, 176-179
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cves/python.md` around lines 19 - 21, Several fenced code blocks (e.g., the one containing "dependencies/cve-constraints.txt") are missing language specifiers and trigger markdownlint MD040; update each affected triple-backtick fence to include an appropriate language tag (for example use "text" for plain filenames, "bash" for shell snippets, or "toml"/"ini" for config blocks) so the fences at the indicated occurrences are language-tagged and lint-clean.
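As an aside, the MD040 check the comment describes can be audited mechanically. A minimal sketch, assuming a simple non-nested fence layout (the regex and helper name are illustrative, not part of the PR):

```python
import re

# Matches an opening or closing fence; group 1 captures the language tag, if any.
FENCE_RE = re.compile(r"^```(\S*)\s*$")

def untagged_fences(markdown: str) -> list[int]:
    """Return 1-based line numbers of opening fences that lack a language tag."""
    missing: list[int] = []
    inside = False
    for lineno, line in enumerate(markdown.splitlines(), start=1):
        m = FENCE_RE.match(line.strip())
        if not m:
            continue
        if inside:
            inside = False  # this fence closes the current block
        else:
            inside = True   # this fence opens a block
            if not m.group(1):
                missing.append(lineno)
    return missing

doc = "```\ndependencies/cve-constraints.txt\n```\n\n```text\nok\n```\n"
print(untagged_fences(doc))  # -> [1]
```

Running this over docs/cves/python.md would list exactly the fences markdownlint flags.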
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/cves/python.md`:
- Around line 34-35: Update all references to the removed
scripts/pylocks_generator.sh in this document (notably around the paragraph
currently numbered "2. Automatic application" and the sections at lines 47-52
and 205-216) to point to the new entrypoints: use "./uv run
scripts/pylocks_generator.py ..." or the Make target "make refresh-lock-files"
where appropriate; ensure examples, command snippets, and the related-files
section consistently reference scripts/pylocks_generator.py or the make target
instead of the deleted .sh script.
In `@Makefile`:
- Around line 93-100: The prefetch and cachi2 existence checks use caller CWD;
update all path checks and wildcards to resolve via ROOT_DIR so they mirror the
rest of the build logic: change uses of $(wildcard $(BUILD_DIR)prefetch-input),
[ -d '$(BUILD_DIR)prefetch-input' ], and [ ! -d cachi2/output ] to reference
$(ROOT_DIR) (e.g., $(ROOT_DIR)$(BUILD_DIR)prefetch-input and
$(ROOT_DIR)cachi2/output) and likewise update the CACHI2_VOLUME and
LOCAL_BUILD_ARG evaluations to base their wildcard tests on $(ROOT_DIR) to
ensure invocations outside repo root see the same cache/mount behavior.
In `@pyproject.toml`:
- Line 34: The pipeline fails because the new dependency "typer" was added to
pyproject.toml but the lock file (uv.lock) wasn't regenerated; run the project's
lock-refresh task to update the lockfile (execute make refresh-lock-files) so
uv.lock is regenerated and the missing `source`/matching-package error for
"typer" is resolved, then commit the updated lock file alongside the pyproject
change.
In `@README.md`:
- Around line 84-100: Update the version strings in the README examples to match
uv.toml (change 0.10.6 → 0.8.12) by replacing occurrences in the three examples:
the uvx invocation (uvx uv@0.10.6 sync --locked → uvx uv@0.8.12 sync --locked),
the uv tool run invocation (uv tool run uv@0.10.6 sync --locked → uv tool run
uv@0.8.12 sync --locked), and the installer/pip lines (curl
.../uv/0.10.6/install.sh → .../uv/0.8.12/install.sh and pip install uv==0.10.6 →
pip install uv==0.8.12) so the README and uv.toml versions are consistent.
In `@scripts/pylocks_generator.py`:
- Around line 345-354: The loop silently skips directories when
extract_python_version(tdir) returns None or detect_flavors(tdir) returns empty,
causing the script to succeed with no output; change these early-continue paths
to fail-fast by emitting a clear error (use warn or a new error log) and exiting
non-zero (e.g., sys.exit(1) or raise a RuntimeError) so CI/local runs detect the
bad DIR=; update the blocks around extract_python_version(tdir) and
detect_flavors(tdir) accordingly and add an import for sys if using sys.exit.
- Around line 240-296: The script currently unconditionally deletes the
destination pylock file on failure (output_path.unlink) which can remove a
previously good checked-in lockfile; change the flow to preserve/restore the
existing lockfile: before running subprocess.run, check if (project_dir /
output).exists() and if so atomically move it to a backup (e.g., backup_path =
project_dir / f"{output}.bak" or similar) and then run cmd; if result.returncode
!= 0 restore the backup to output_path (or leave it in place) instead of
unconditionally unlinking, and if the run succeeds remove the backup; use the
existing symbols output, project_dir, output_path and result to implement the
backup/restore logic.
---
Outside diff comments:
In @.github/workflows/piplock-renewal.yaml:
- Around line 87-89: The workflow step currently invokes the removed Makefile
target refresh-pipfilelock-files; update that CI step to call the renamed
Makefile target refresh-lock-files instead. Specifically, change the job step
labeled "Run make refresh-pipfilelock-files" so its run command uses make
refresh-lock-files with the same PYTHON_VERSION and INCLUDE_OPT_DIRS environment
arguments (i.e., replace refresh-pipfilelock-files with refresh-lock-files in
the run line). Ensure no other references to refresh-pipfilelock-files remain.
---
Nitpick comments:
In `@docs/cves/python.md`:
- Around line 19-21: Several fenced code blocks (e.g., the one containing
"dependencies/cve-constraints.txt") are missing language specifiers and trigger
markdownlint MD040; update each affected triple-backtick fence to include an
appropriate language tag (for example use "text" for plain filenames, "bash" for
shell snippets, or "toml"/"ini" for config blocks) so the fences at the
indicated occurrences are language-tagged and lint-clean.
In `@uv`:
- Around line 39-40: The slow-path call using exec uv tool run "uv@${version}"
"$@" will fail with a "command not found" if uv isn't installed; add a pre-check
that verifies the uv binary is present (e.g., check command -v uv / which uv)
and if missing print a clear error guiding the user to install uv before
attempting the slow path (or fall back to a different installer path), then only
call exec uv tool run "uv@${version}" "$@" when the check succeeds.
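The suggested pre-check has a direct Python analogue via `shutil.which`; a sketch for illustration (the helper name and message wording are hypothetical, not part of the PR):

```python
import shutil

INSTALL_HINT = "https://docs.astral.sh/uv/getting-started/installation/"

def require_binary(name: str) -> str:
    """Return the resolved path of `name`, or raise with an install hint."""
    path = shutil.which(name)
    if path is None:
        raise RuntimeError(f"error: {name} is not installed. See {INSTALL_HINT}")
    return path

# Fails fast with a helpful message when the binary is absent.
try:
    require_binary("uv")
except RuntimeError as exc:
    print(exc)
```

The point, as in the shell diff above, is to surface an actionable message instead of a bare "command not found".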
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: dd66e507-084f-45f0-997a-e92035f1bcac
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (20)
- .github/workflows/build-notebooks-TEMPLATE.yaml
- .github/workflows/code-quality.yaml
- .github/workflows/docs.yaml
- .github/workflows/piplock-renewal.yaml
- .github/workflows/security.yaml
- .pre-commit-config.yaml
- Makefile
- README.md
- ci/generate_code.sh
- dependencies/cve-constraints.txt
- docs/cves/python.md
- jupyter/datascience/ubi9-python-3.12/pyproject.toml
- jupyter/pytorch/ubi9-python-3.12/pyproject.toml
- jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
- jupyter/trustyai/ubi9-python-3.12/pyproject.toml
- pyproject.toml
- scripts/pylocks_generator.py
- scripts/pylocks_generator.sh
- uv
- uv.toml
💤 Files with no reviewable changes (1)
- scripts/pylocks_generator.sh
```make
$(eval LOCAL_BUILD_ARG := $(if $(wildcard $(BUILD_DIR)prefetch-input),--build-arg LOCAL_BUILD=true,))
$(eval CACHI2_VOLUME := $(if $(and $(wildcard cachi2/output),$(wildcard $(BUILD_DIR)prefetch-input)),--volume $(ROOT_DIR)cachi2/output:/cachi2/output:Z,))

$(info # Building $(IMAGE_NAME) using $(DOCKERFILE_NAME) with $(CONF_FILE) and $(BUILD_ARGS)...)

@if [ -d '$(BUILD_DIR)prefetch-input' ] && [ ! -d cachi2/output ]; then \
	echo "Prefetch required for hermetic build. Run: scripts/lockfile-generators/prefetch-all.sh --component-dir $(patsubst %/,%,$(BUILD_DIR)) -- see scripts/lockfile-generators/README.md"; \
	exit 1; \
```
Resolve prefetch and cachi2 paths from $(ROOT_DIR).
These checks are currently relative to the caller's CWD, while the rest of the build logic is rooted via ROOT_DIR. Invoking the Makefile outside the repo root can miss an existing cache and incorrectly fail or skip the hermetic mount.
💡 Suggested fix

```diff
-	$(eval LOCAL_BUILD_ARG := $(if $(wildcard $(BUILD_DIR)prefetch-input),--build-arg LOCAL_BUILD=true,))
-	$(eval CACHI2_VOLUME := $(if $(and $(wildcard cachi2/output),$(wildcard $(BUILD_DIR)prefetch-input)),--volume $(ROOT_DIR)cachi2/output:/cachi2/output:Z,))
+	$(eval LOCAL_BUILD_ARG := $(if $(wildcard $(ROOT_DIR)$(BUILD_DIR)prefetch-input),--build-arg LOCAL_BUILD=true,))
+	$(eval CACHI2_VOLUME := $(if $(and $(wildcard $(ROOT_DIR)cachi2/output),$(wildcard $(ROOT_DIR)$(BUILD_DIR)prefetch-input)),--volume $(ROOT_DIR)cachi2/output:/cachi2/output:Z,))
@@
-	@if [ -d '$(BUILD_DIR)prefetch-input' ] && [ ! -d cachi2/output ]; then \
+	@if [ -d '$(ROOT_DIR)$(BUILD_DIR)prefetch-input' ] && [ ! -d '$(ROOT_DIR)cachi2/output' ]; then \
 		echo "Prefetch required for hermetic build. Run: scripts/lockfile-generators/prefetch-all.sh --component-dir $(patsubst %/,%,$(BUILD_DIR)) -- see scripts/lockfile-generators/README.md"; \
 		exit 1; \
 	fi
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Makefile` around lines 93 - 100, The prefetch and cachi2 existence checks use
caller CWD; update all path checks and wildcards to resolve via ROOT_DIR so they
mirror the rest of the build logic: change uses of $(wildcard
$(BUILD_DIR)prefetch-input), [ -d '$(BUILD_DIR)prefetch-input' ], and [ ! -d
cachi2/output ] to reference $(ROOT_DIR) (e.g.,
$(ROOT_DIR)$(BUILD_DIR)prefetch-input and $(ROOT_DIR)cachi2/output) and likewise
update the CACHI2_VOLUME and LOCAL_BUILD_ARG evaluations to base their wildcard
tests on $(ROOT_DIR) to ensure invocations outside repo root see the same
cache/mount behavior.
this should not be cherrypicked?
@jiridanek, that's a fair point. If the LOCAL_BUILD_ARG / CACHI2_VOLUME / prefetch-input guard block was not intentionally included in this cherry-pick (it doesn't appear in the listed PR objectives for rhoai-2.25), then my path-resolution comment doesn't apply here.
Would you like to:
- Revert those specific lines from the `build_image` function to keep this PR scoped to the intended changes (uv pinning, cve-constraints, pylocks_generator rewrite, Makefile target rename)?
- Or keep the changes but address the ROOT_DIR path concern as a follow-up?
If the prefetch/hermetic-build support is indeed out of scope for rhoai-2.25, I can resolve this comment.
🧠 Learnings used
Learnt from: CR
Repo: red-hat-data-services/notebooks PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-03-26T19:41:58.522Z
Learning: Applies to Makefile : Set `KONFLUX` Makefile variable consistently across build and test steps; use `KONFLUX=no` for `Dockerfile.*` and `manifests/odh/base/` manifests, or `KONFLUX=yes` for `Dockerfile.konflux.*` and `manifests/rhoai/base/` manifests
Learnt from: CR
Repo: red-hat-data-services/notebooks PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-03-26T19:41:58.522Z
Learning: Applies to **/Dockerfile{,.konflux*} : Minimize Docker layers in Dockerfiles
Learnt from: CR
Repo: red-hat-data-services/notebooks PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-03-26T19:41:58.522Z
Learning: Applies to **/Dockerfile{,.konflux*} : Follow security best practices in Dockerfiles
```python
if mode == IndexMode.public_index:
    output = "pylock.toml"
    desc = "pylock.toml (public index)"
    print("➡️ Generating pylock.toml from public PyPI index...")
else:
    (project_dir / "uv.lock.d").mkdir(exist_ok=True)
    output = f"uv.lock.d/pylock.{flavor}.toml"
    desc = f"{flavor.upper()} lock file"
    print(f"➡️ Generating {flavor.upper()} lock file...")

# Tag filtering was added in uv 0.9.16 (https://github.com/astral-sh/uv/pull/16956)
# but bypassed in --universal mode. uv 0.10.5 (https://github.com/astral-sh/uv/pull/18081)
# now filters wheels by requires-python and marker disjointness even in --universal mode.
# Documentation at https://docs.astral.sh/uv/reference/cli/#uv-pip-compile--python-platform says that
# `--python-platform linux` is alias for `x86_64-unknown-linux-gnu`; we cannot use this to get a multiarch pylock
# Let's use --universal temporarily, and in the future we can switch to using uv.lock
# when https://github.com/astral-sh/uv/issues/6830 is resolved, or symlink `ln -s uv.lock.d/uv.${flavor}.lock uv.lock`
# Note: currently generating uv.lock.d/pylock.${flavor}.toml; future rename to uv.${flavor}.lock is planned
# See also --universal discussion with Gerard
# https://redhat-internal.slack.com/archives/C0961HQ858Q/p1757935641975969?thread_ts=1757542802.032519&cid=C0961HQ858Q
cmd: list[str] = [
    str(UV),
    "pip",
    "compile",
    "pyproject.toml",
    "--output-file",
    output,
    "--format",
    "pylock.toml",
    "--generate-hashes",
    "--emit-index-url",
    f"--python-version={python_version}",
    "--universal",
    "--no-annotate",
    "--quiet",
]

for pkg in NO_EMIT_PACKAGES:
    cmd.extend(["--no-emit-package", pkg])

if upgrade:
    cmd.append("--upgrade")

# Use relative path to avoid absolute paths in pylock.toml headers
if CVE_CONSTRAINTS_FILE.is_file():
    relative_constraints = os.path.relpath(CVE_CONSTRAINTS_FILE, project_dir)
    cmd.extend(["--constraints", relative_constraints])

cmd.extend(index_flags)

result = subprocess.run(cmd, cwd=project_dir, check=False)

if result.returncode != 0:
    warn(f"Failed to generate {desc} in {project_dir}")
    output_path = project_dir / output
    output_path.unlink(missing_ok=True)
    return False
```
Preserve the last good lockfile when compilation fails.
`unlink()` here removes the destination unconditionally. If `uv pip compile` fails because of a transient resolver or index problem, an existing checked-in `pylock.toml` / `uv.lock.d/pylock.*.toml` gets deleted and the repo is left in a worse state than before the run.
💡 Safer pattern

```diff
 if mode == IndexMode.public_index:
-    output = "pylock.toml"
+    output = Path("pylock.toml")
     desc = "pylock.toml (public index)"
     print("➡️ Generating pylock.toml from public PyPI index...")
 else:
     (project_dir / "uv.lock.d").mkdir(exist_ok=True)
-    output = f"uv.lock.d/pylock.{flavor}.toml"
+    output = Path("uv.lock.d") / f"pylock.{flavor}.toml"
     desc = f"{flavor.upper()} lock file"
     print(f"➡️ Generating {flavor.upper()} lock file...")
+tmp_output = output.with_suffix(output.suffix + ".tmp")
 cmd: list[str] = [
     str(UV),
     "pip",
     "compile",
     "pyproject.toml",
     "--output-file",
-    output,
+    str(tmp_output),
     "--format",
     "pylock.toml",
     "--generate-hashes",
     "--emit-index-url",
     f"--python-version={python_version}",
@@
 result = subprocess.run(cmd, cwd=project_dir, check=False)
 if result.returncode != 0:
     warn(f"Failed to generate {desc} in {project_dir}")
-    output_path = project_dir / output
-    output_path.unlink(missing_ok=True)
+    (project_dir / tmp_output).unlink(missing_ok=True)
     return False
+(project_dir / tmp_output).replace(project_dir / output)
 ok(f"{desc} generated successfully.")
```

🧰 Tools
🪛 Ruff (0.15.7)
[error] 290-290: subprocess call: check for execution of untrusted input
(S603)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@scripts/pylocks_generator.py` around lines 240 - 296, The script currently
unconditionally deletes the destination pylock file on failure
(output_path.unlink) which can remove a previously good checked-in lockfile;
change the flow to preserve/restore the existing lockfile: before running
subprocess.run, check if (project_dir / output).exists() and if so atomically
move it to a backup (e.g., backup_path = project_dir / f"{output}.bak" or
similar) and then run cmd; if result.returncode != 0 restore the backup to
output_path (or leave it in place) instead of unconditionally unlinking, and if
the run succeeds remove the backup; use the existing symbols output,
project_dir, output_path and result to implement the backup/restore logic.
```python
python_version = extract_python_version(tdir)
if python_version is None:
    warn(f"Could not extract valid Python version from directory name: {tdir}")
    warn("Expected directory format: .../ubi9-python-X.Y")
    continue

flavors = detect_flavors(tdir)
if not flavors:
    warn(f"No Dockerfiles found in {tdir} (cpu/cuda/rocm). Skipping.")
    continue
```
Make explicit DIR= mistakes fail the command.
When the Makefile's DIR= flow points at the wrong directory, these continues skip work but never record a failure. The command can exit 0 after generating nothing, which is a bad failure mode for both local usage and CI.
💡 Minimal fix

```diff
 python_version = extract_python_version(tdir)
 if python_version is None:
     warn(f"Could not extract valid Python version from directory name: {tdir}")
     warn("Expected directory format: .../ubi9-python-X.Y")
+    if target_dir is not None:
+        failed_dirs.append(tdir)
     continue
 flavors = detect_flavors(tdir)
 if not flavors:
     warn(f"No Dockerfiles found in {tdir} (cpu/cuda/rocm). Skipping.")
+    if target_dir is not None:
+        failed_dirs.append(tdir)
     continue
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@scripts/pylocks_generator.py` around lines 345 - 354, The loop silently skips
directories when extract_python_version(tdir) returns None or
detect_flavors(tdir) returns empty, causing the script to succeed with no
output; change these early-continue paths to fail-fast by emitting a clear error
(use warn or a new error log) and exiting non-zero (e.g., sys.exit(1) or raise a
RuntimeError) so CI/local runs detect the bad DIR=; update the blocks around
extract_python_version(tdir) and detect_flavors(tdir) accordingly and add an
import for sys if using sys.exit.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/cves/python.md`:
- Around line 91-96: Replace bare invocations of the uv wrapper with the
repo-pinned wrapper by changing commands like "uv tree | grep -A5 -B5 tornado"
and "uv tree --invert tornado" to "./uv tree | grep -A5 -B5 tornado" and "./uv
tree --invert tornado" (and update any other plain "uv" examples elsewhere in
the file, e.g., the second uv example block) so the docs always use the pinned
./uv wrapper.
- Around line 19-21: Several fenced code blocks in the docs/cves/python.md
examples (e.g., the block containing "dependencies/cve-constraints.txt" and the
blocks starting with "# CVE-ID: Description", "# RHAIENG-XXXX: CVE-YYYY-ZZZZZ
package_name vulnerability description", "# RHAIENG-2448: CVE-XXXX-YYYY tornado
quadratic DoS", and "# RHAIENG-2458: CVE-2025-66418 urllib3 decompression
vulnerability") are missing language identifiers; add an appropriate language
tag (e.g., ```text or ```ini) to each fenced code block to satisfy MD040 linting
so the blocks become ```text ... ``` (or another suitable tag) for all
occurrences noted (also at the other ranges mentioned).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 41ed7a2d-4352-4032-abf5-5b2779b63ee5
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (2)
- README.md
- docs/cves/python.md
✅ Files skipped from review due to trivial changes (1)
- README.md
```
dependencies/cve-constraints.txt
```
Add language identifiers to fenced code blocks to satisfy markdown linting.
Several fenced blocks are missing language tags (MD040).
Suggested doc patch

```diff
-```
+```text
 dependencies/cve-constraints.txt
```

and likewise for the other affected fences (the blocks containing `# CVE-ID: Description ...`, `# RHAIENG-XXXX: CVE-YYYY-ZZZZZ package_name vulnerability description ...`, `# RHAIENG-2448: CVE-XXXX-YYYY tornado quadratic DoS` / `tornado>=6.5.3`, and `# RHAIENG-2458: CVE-2025-66418 urllib3 decompression vulnerability` / `urllib3>=2.6.0`).

Also applies to: 28-32, 41-45, 116-119, 176-179
🧰 Tools

🪛 markdownlint-cli2 (0.22.0)

[warning] 19-19: Fenced code blocks should have a language specified (MD040, fenced-code-language)

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@docs/cves/python.md` around lines 19 - 21, Several fenced code blocks in the docs/cves/python.md examples (e.g., the block containing "dependencies/cve-constraints.txt" and the blocks starting with "# CVE-ID: Description", "# RHAIENG-XXXX: CVE-YYYY-ZZZZZ package_name vulnerability description", "# RHAIENG-2448: CVE-XXXX-YYYY tornado quadratic DoS", and "# RHAIENG-2458: CVE-2025-66418 urllib3 decompression vulnerability") are missing language identifiers; add an appropriate language tag (e.g., `text` or `ini`) to each fenced code block to satisfy MD040 linting so the blocks become tagged fences for all occurrences noted (also at the other ranges mentioned).
Force-pushed: 4d81430 to d26add0 (Compare)
Actionable comments posted: 1
♻️ Duplicate comments (4)
docs/cves/python.md (2)
90-96: ⚠️ Potential issue | 🟡 Minor

Use the pinned `./uv` wrapper in examples.

These command snippets still use bare `uv`, which can bypass the repo-pinned versioning model.

💡 Doc patch

```diff
 # Using uv (preferred)
-uv tree | grep -A5 -B5 tornado
+./uv tree | grep -A5 -B5 tornado
 # Or check the package's dependents
-uv tree --invert tornado
+./uv tree --invert tornado
@@
 # Check dependency tree
-uv tree
+./uv tree
 # Find what depends on a package
-uv tree --invert package-name
+./uv tree --invert package-name
```

Also applies to: 219-223

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cves/python.md` around lines 90 - 96, Replace bare `uv` usage in the example commands with the repository-pinned wrapper `./uv` so the docs use the pinned version; specifically update the examples that show "uv tree | grep -A5 -B5 tornado" and "uv tree --invert tornado" to use "./uv" and also make the same change for the other occurrence noted around the later example (the block referenced as lines 219-223) so all examples consistently call the local ./uv wrapper.
19-21: ⚠️ Potential issue | 🟡 Minor

Add language identifiers to fenced code blocks.

Several fenced blocks are still missing a language tag (MD040). Use `text`, `bash`, or `toml` as appropriate.

Also applies to: 28-32, 41-45, 116-119, 176-179
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cves/python.md` around lines 19 - 21, Multiple fenced code blocks in docs/cves/python.md are missing language identifiers (MD040). Edit the fenced blocks at the ranges called out (lines ~19, 28-32, 41-45, 116-119, 176-179) and add appropriate language tags (e.g., `text`, `bash`, or `toml`) to each opening fence so the markdown linter no longer flags MD040; pick `text` for plain file paths, `bash` for shell commands, and `toml` for constraint files or config snippets to match the content.

scripts/pylocks_generator.py (2)
240-247: ⚠️ Potential issue | 🟠 Major

Don't delete existing lockfiles when compile fails.

A transient `uv pip compile` failure currently removes the destination lockfile, which can erase a previously valid checked-in file.

💡 Safer write pattern (temp file + atomic replace)

```diff
 def run_lock(
@@
 ) -> bool:
@@
     if mode == IndexMode.public_index:
-        output = "pylock.toml"
+        output = Path("pylock.toml")
         desc = "pylock.toml (public index)"
         print("➡️ Generating pylock.toml from public PyPI index...")
     else:
         (project_dir / "uv.lock.d").mkdir(exist_ok=True)
-        output = f"uv.lock.d/pylock.{flavor}.toml"
+        output = Path("uv.lock.d") / f"pylock.{flavor}.toml"
         desc = f"{flavor.upper()} lock file"
         print(f"➡️ Generating {flavor.upper()} lock file...")
+    tmp_output = output.with_suffix(output.suffix + ".tmp")
@@
         "pyproject.toml",
         "--output-file",
-        output,
+        str(tmp_output),
@@
     result = subprocess.run(cmd, cwd=project_dir, check=False)
     if result.returncode != 0:
         warn(f"Failed to generate {desc} in {project_dir}")
-        output_path = project_dir / output
-        output_path.unlink(missing_ok=True)
+        (project_dir / tmp_output).unlink(missing_ok=True)
         return False
+    (project_dir / tmp_output).replace(project_dir / output)
     ok(f"{desc} generated successfully.")
```

Also applies to: 265-266, 292-296
345-354: ⚠️ Potential issue | 🟠 Major

Fail the run for invalid `DIR` inputs instead of silently skipping.

When a specific `target_dir` is passed and it fails Python-version extraction or flavor detection, the script currently `continue`s and may still exit 0.

💡 Minimal fix to propagate failure in `DIR=` flow

```diff
 python_version = extract_python_version(tdir)
 if python_version is None:
     warn(f"Could not extract valid Python version from directory name: {tdir}")
     warn("Expected directory format: .../ubi9-python-X.Y")
+    if target_dir is not None:
+        failed_dirs.append(tdir)
     continue
 flavors = detect_flavors(tdir)
 if not flavors:
     warn(f"No Dockerfiles found in {tdir} (cpu/cuda/rocm). Skipping.")
+    if target_dir is not None:
+        failed_dirs.append(tdir)
     continue
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/pylocks_generator.py` around lines 345 - 354, When processing a specific directory (tdir) the script currently warns and continues on failures from extract_python_version(tdir) and detect_flavors(tdir); update both failure branches so that if a target_dir filter is active (check the variable that holds the user-specified DIR, e.g. target_dir) the script terminates with a non-zero exit (sys.exit(1) or raise SystemExit) instead of continue(), otherwise retain the current warn+continue behavior for broader runs. Make the checks adjacent to the existing extract_python_version and detect_flavors calls so failures for the targeted DIR propagate as an error return.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/cves/python.md`:
- Around line 78-80: The grep example only searches pylock.toml and misses files
like uv.lock.d/pylock.<flavor>.toml; update the example command (the line
containing grep -r "tornado" --include="pylock.toml" .) to include the
uv.lock.d/pylock.*.toml pattern as another --include (or otherwise use an
include pattern that matches both) so the rh-index mode lockfile variants are
covered.
---
Duplicate comments:
In `@docs/cves/python.md`:
- Around line 90-96: Replace bare `uv` usage in the example commands with the
repository-pinned wrapper `./uv` so the docs use the pinned version;
specifically update the examples that show "uv tree | grep -A5 -B5 tornado" and
"uv tree --invert tornado" to use "./uv" and also make the same change for the
other occurrence noted around the later example (the block referenced as lines
219-223) so all examples consistently call the local ./uv wrapper.
- Around line 19-21: Multiple fenced code blocks in docs/cves/python.md are
missing language identifiers (MD040). Edit the fenced blocks at the ranges
called out (lines ~19, 28-32, 41-45, 116-119, 176-179) and add appropriate
language tags (e.g., ```text, ```bash, or ```toml) to each opening fence so the
markdown linter no longer flags MD040; pick `text` for plain file paths, `bash`
for shell commands, and `toml` for constraint files or config snippets to match
the content.
In `@scripts/pylocks_generator.py`:
- Around line 345-354: When processing a specific directory (tdir) the script
currently warns and continues on failures from extract_python_version(tdir) and
detect_flavors(tdir); update both failure branches so that if a target_dir
filter is active (check the variable that holds the user-specified DIR, e.g.
target_dir) the script terminates with a non-zero exit (sys.exit(1) or raise
SystemExit) instead of continue(), otherwise retain the current warn+continue
behavior for broader runs. Make the checks adjacent to the existing
extract_python_version and detect_flavors calls so failures for the targeted DIR
propagate as an error return.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 077fa609-30a4-4fbd-9733-bfa498027c13
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (20)
- .github/workflows/build-notebooks-TEMPLATE.yaml
- .github/workflows/code-quality.yaml
- .github/workflows/docs.yaml
- .github/workflows/piplock-renewal.yaml
- .github/workflows/security.yaml
- .pre-commit-config.yaml
- Makefile
- README.md
- ci/generate_code.sh
- dependencies/cve-constraints.txt
- docs/cves/python.md
- jupyter/datascience/ubi9-python-3.12/pyproject.toml
- jupyter/pytorch/ubi9-python-3.12/pyproject.toml
- jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
- jupyter/trustyai/ubi9-python-3.12/pyproject.toml
- pyproject.toml
- scripts/pylocks_generator.py
- scripts/pylocks_generator.sh
- uv
- uv.toml
💤 Files with no reviewable changes (1)
- scripts/pylocks_generator.sh
✅ Files skipped from review due to trivial changes (14)
- uv.toml
- .github/workflows/build-notebooks-TEMPLATE.yaml
- jupyter/datascience/ubi9-python-3.12/pyproject.toml
- .github/workflows/piplock-renewal.yaml
- .github/workflows/security.yaml
- jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
- .github/workflows/code-quality.yaml
- dependencies/cve-constraints.txt
- jupyter/pytorch/ubi9-python-3.12/pyproject.toml
- .github/workflows/docs.yaml
- jupyter/trustyai/ubi9-python-3.12/pyproject.toml
- pyproject.toml
- README.md
- ci/generate_code.sh
🚧 Files skipped from review as they are similar to previous changes (3)
- .pre-commit-config.yaml
- uv
- Makefile
```bash
# Search in pylock.toml files
grep -r "tornado" --include="pylock.toml" .
```
Broaden lockfile search example for rh-index mode.
The current grep only checks `pylock.toml` and misses `uv.lock.d/pylock.<flavor>.toml` files.
💡 Doc patch

```diff
 # Search in pylock.toml files
-grep -r "tornado" --include="pylock.toml" .
+grep -r "tornado" --include="pylock*.toml" .
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```bash
# Search in pylock.toml files
grep -r "tornado" --include="pylock*.toml" .
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/cves/python.md` around lines 78 - 80, The grep example only searches
pylock.toml and misses files like uv.lock.d/pylock.<flavor>.toml; update the
example command (the line containing grep -r "tornado" --include="pylock.toml"
.) to include the uv.lock.d/pylock.*.toml pattern as another --include (or
otherwise use an include pattern that matches both) so the rh-index mode
lockfile variants are covered.
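Before committing a doc change like the one above, it helps to confirm which lockfile variants a glob actually matches. A minimal sketch using `pathlib` (the `uv.lock.d/pylock.<flavor>.toml` layout is taken from the comment above; the directory names are illustrative):

```python
from pathlib import Path


def find_lockfiles_mentioning(root: Path, package: str) -> list[Path]:
    """Search pylock.toml and pylock.<flavor>.toml variants for a package name."""
    hits = []
    # rglob with a wildcard matches both pylock.toml and pylock.cuda.toml etc.
    for lockfile in sorted(root.rglob("pylock*.toml")):
        if package in lockfile.read_text(encoding="utf-8"):
            hits.append(lockfile)
    return hits


if __name__ == "__main__":
    root = Path("demo")
    (root / "uv.lock.d").mkdir(parents=True, exist_ok=True)
    (root / "pylock.toml").write_text('tornado = "6.4"\n')
    (root / "uv.lock.d" / "pylock.cuda.toml").write_text('tornado = "6.4"\n')
    print([p.name for p in find_lockfiles_mentioning(root, "tornado")])
```

The `pylock*.toml` wildcard mirrors the `--include="pylock*.toml"` grep pattern suggested in the review.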
Force-pushed from `756e2dd` to `c160f51`
…txt (opendatahub-io#2886)

RHAIENG-2458: CVE-2025-66418 urllib3 decompression vulnerability

- Create `dependencies/cve-constraints.txt` as a single source of truth for CVE-induced minimum version constraints
- Update `pylocks_generator.sh` to use the `--constraints` flag with the CVE file
- Update comments on urllib3 overrides in jupyter images to explain that the override is needed because odh-elyra's appengine-python-standard has an obnoxious `urllib3<2` constraint
- Add `docs/cves/python.md` documenting the CVE resolution workflow

This approach:

- Centralizes CVE fixes in one file
- Applies constraints at resolution time via `uv pip compile --constraints`
- Uses override-dependencies only where needed (odh-elyra conflict)
- Prevents CVEs from returning through transitive dependencies

Thanks to Adriana Theodorakopoulou for contributing the CVE resolution workflow documentation.

(cherry picked from commit 2e8d387)
…generator.sh` to Python (opendatahub-io#3057)

**Files created:**

- `scripts/pylocks_generator.py` -- Python rewrite of the bash script using typer for the CLI

**Files modified:**

- `pyproject.toml` -- added `typer` to dev dependencies
- `uv.lock` -- updated by `uv lock` to include typer as an explicit dep
- `Makefile` -- `refresh-lock-files` target now calls `python3 scripts/pylocks_generator.py`
- `ci/generate_code.sh` -- uses `"${REPO_ROOT}/uv" run scripts/pylocks_generator.py` (consistent with adjacent lines)
- `scripts/lockfile-generators/create-requirements-lockfile.sh` -- all references updated from `.sh` to `.py`

**Files deleted:**

- `scripts/pylocks_generator.sh` -- the original bash script

**Verification results:**

- `--help` shows the correct typer CLI with `INDEX_MODE` and `TARGET_DIR` arguments
- Running against `jupyter/minimal/ubi9-python-3.12` produced byte-identical lock files to the bash script (zero diff after fixing arg formatting)
- Correctly detects Python version, flavors, effective mode, and generates all three flavor locks (CPU, CUDA, ROCM)

Minimal, focused PR: rewrite the script, update call sites, delete the old script. No tests, no ruff include changes (follow-up).

- **1 deleted file** (`pylocks_generator.sh`, 376 lines)
- **1 added file** (`pylocks_generator.py`, ~200-250 lines), structured to mirror the bash script's section layout so reviewers can compare side by side
- **4 small edits** (pyproject.toml, Makefile, ci/generate_code.sh, create-requirements-lockfile.sh)

Add `typer` to `[dependency-groups] dev` in [pyproject.toml](pyproject.toml) (it's already in `uv.lock` as a transitive dep, but needs to be explicit). Run `uv lock` to sync.
Use typer for the CLI with the same interface as the bash script:

```python
"""Generate Python dependency lock files (pylock.toml) using uv pip compile."""

from __future__ import annotations

import os
import re
import subprocess
import sys
from pathlib import Path

import typer

ROOT_DIR = Path(__file__).resolve().parent.parent
UV = ROOT_DIR / "uv"
CVE_CONSTRAINTS_FILE = ROOT_DIR / "dependencies" / "cve-constraints.txt"
PUBLIC_INDEX = "--default-index=https://pypi.org/simple"
MAIN_DIRS = ["jupyter", "runtimes", "rstudio", "codeserver"]
UV_MIN_VERSION = (0, 4, 0)

app = typer.Typer()


@app.command()
def main(
    index_mode: str = typer.Argument("auto", help="..."),
    target_dir: Path | None = typer.Argument(None, help="..."),
) -> None: ...
```

Key functions (same logical sections as the bash script):

- `read_conf_value(conf_file: Path, key: str) -> str | None` -- simple line parser replacing awk
- `check_uv()` -- verify `./uv` exists, parse version, compare as `tuple(int, ...)` >= `(0, 4, 0)`
- `find_target_dirs(target_dir: Path | None) -> list[Path]` -- `pathlib.glob("**/pyproject.toml")`
- `detect_flavors(project_dir: Path) -> set[str]` -- check `Dockerfile.{cpu,cuda,rocm}` existence
- `get_index_flags(project_dir: Path, flavor: str) -> list[str] | None` -- read `build-args/<flavor>.conf`
- `run_lock(project_dir, flavor, index_flags, mode, python_version, upgrade) -> bool` -- `subprocess.run(cwd=project_dir)` with list args

Implementation notes:

- Build the `uv pip compile` command as a list, conditionally appending `--upgrade`
- Use `cwd=target_dir` in `subprocess.run()` and `os.path.relpath()` for the CVE constraints path
- `FORCE_LOCKFILES_UPGRADE` env var read via `os.environ.get()`

Call-site updates:

- [Makefile](Makefile) line 443: `bash scripts/pylocks_generator.sh $(INDEX_MODE) $(DIR)` -> `python3 scripts/pylocks_generator.py $(INDEX_MODE) $(DIR)`
- [ci/generate_code.sh](ci/generate_code.sh) line 11: `bash scripts/pylocks_generator.sh` -> `"${REPO_ROOT}/uv" run scripts/pylocks_generator.py` (matches the two lines above it)
- [scripts/lockfile-generators/create-requirements-lockfile.sh](scripts/lockfile-generators/create-requirements-lockfile.sh) -- all references:
  - Line 31: `PYLOCKS_GENERATOR="scripts/pylocks_generator.sh"` -> `.py`
  - Line 43: help text reference
  - Line 57: help text reference
  - Lines 73-74: guard check and error message
  - Line 115: `bash "$PYLOCKS_GENERATOR"` -> `python3 "$PYLOCKS_GENERATOR"`

Clean deletion. It remains in git history.

Run `python3 scripts/pylocks_generator.py --help` and a targeted invocation like `python3 scripts/pylocks_generator.py auto jupyter/minimal/ubi9-python-3.12`.

Here's what I addressed from the CodeRabbit review:

**Fixed (3 items):**

- **Makefile**: Changed `python3` to `./uv run` so typer is available via the dev environment
- **`mode: str` -> `mode: IndexMode`**: `run_lock` now takes the enum directly, and the caller passes `IndexMode.rh_index` / `IndexMode.public_index` instead of raw strings
- **`find_target_dirs` sort/dedup**: Uses a `set` for dedup and returns `sorted(dirs)` for stable CI output

**Rejected (4 items):**

- **`@app.callback()`**: Tested it -- produces a misleading `COMMAND [ARGS]...` in help. `@app.command()` works correctly with enum arguments; `auto` is parsed as the enum value, not a subcommand name
- **`read_conf_value` hardening (export/quotes)**: The actual `.conf` files don't use these patterns, and the original bash awk parser didn't handle them either. Unnecessary scope creep
- **Skipped dirs tracking**: The bash script didn't track these either. Behavior parity
- **`source "$CONF_FILE"` in create-requirements-lockfile.sh**: Pre-existing, not introduced by this PR

(cherry picked from commit 8da26d3)
fix: downgrade to Python 3.12 and fix CI failures

- Pin Python to 3.12 (uv 0.8.12 only has 3.14.0rc2 which breaks pydantic)
- Update pyright/ruff target versions to 3.12
- Fix `except` syntax for 3.12 compatibility (parenthesized form)
- Remove `generate_kustomization.py` call (not present on rhoai-2.25)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

fix: resolve pre-commit failures (ruff, pyright)

Cherry-pick fixes from main (50c960f):

- RUF067: move PROJECT_ROOT from `__init__.py` to `_common.py`
- pyright: remove return from finally block in check-software-versions.py
- ruff format: reformat 2 test files

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

fix: update stale references and regenerate uv.lock

- docs/cves/python.md: replace pylocks_generator.sh with .py
- README.md: fix uv version examples (0.10.6 → 0.8.12)
- uv.lock: regenerate after typer dependency addition

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Force-pushed from `c160f51` to `98dfacd`
Actionable comments posted: 1
🧹 Nitpick comments (1)
ci/check-software-versions.py (1)
**254-262: LGTM — cleanup failure no longer short-circuits the return.**

Setting `ret_code = 1` in the `finally` branch instead of `return 1` preserves any prior failure signal and avoids returning from a `finally` (which would mask exceptions). Behavior is otherwise equivalent since `print_delimiter()` is now invoked on both paths.

Optional nit: `print_delimiter()` is called in both branches and could be hoisted out of the if/else for a small readability win:

♻️ Optional simplification

```diff
     finally:
         if stop_and_remove_container(container_id) != 0:
             log.error(f"Failed to stop/remove the container '{container_id}' for the '{image_ref}' tag!")
-            print_delimiter()
             ret_code = 1
-        else:
-            print_delimiter()
+        print_delimiter()
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@ci/check-software-versions.py` around lines 254 - 262, The finally block currently sets ret_code = 1 when stop_and_remove_container(container_id) returns non-zero and calls print_delimiter() in both branches; simplify by keeping the failure behavior (set ret_code = 1 on non-zero from stop_and_remove_container) but hoist print_delimiter() out of the if/else to avoid duplication; locate the finally handling around stop_and_remove_container(container_id), ret_code, container_id, image_ref and move the single print_delimiter() call to execute unconditionally after the if/else while preserving the ret_code assignment and the log.error(...) call on failure.
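The hazard the comment alludes to — a `return` inside `finally` silently discarding an in-flight exception, which is why the fix sets `ret_code` instead — is easy to demonstrate in isolation:

```python
def masked() -> int:
    """A return inside finally swallows the exception entirely."""
    try:
        raise RuntimeError("real failure")
    finally:
        return 1  # noqa: B012 — the RuntimeError never propagates


def preserved() -> int:
    """Record cleanup failure via a variable; earlier failure codes survive."""
    ret_code = 0
    try:
        ret_code = 2  # pretend earlier work signalled a failure code
    finally:
        ret_code = max(ret_code, 1)  # cleanup failure noted without returning
    return ret_code


print(masked())
print(preserved())
```

`masked()` returns normally even though a `RuntimeError` was raised, which is exactly why linters flag `return` in `finally` and why the reviewed change is the safer pattern.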
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@scripts/pylocks_generator.py`:
- Around line 136-152: The check_uv() logic currently assumes uv --version
succeeds and that all dot-separated tokens are pure integers; update it to first
check subprocess.run(...).returncode and treat a non-zero return as a failure
(log the actual stderr/stdout and exit), then robustly parse version_str
obtained from result.stdout by stripping non-numeric suffixes or by extracting
only the leading numeric components (ignore tokens with non-digit characters or
stop parsing at the first non-numeric token) and guard the int() conversion with
try/except to catch ValueError; ensure any parsing failure falls back to a safe
"0.0.0" (or logs the raw version and exits) before comparing the resulting tuple
to UV_MIN_VERSION, referencing UV, version_str, version_tuple and UV_MIN_VERSION
in your changes.
---
Nitpick comments:
In `@ci/check-software-versions.py`:
- Around line 254-262: The finally block currently sets ret_code = 1 when
stop_and_remove_container(container_id) returns non-zero and calls
print_delimiter() in both branches; simplify by keeping the failure behavior
(set ret_code = 1 on non-zero from stop_and_remove_container) but hoist
print_delimiter() out of the if/else to avoid duplication; locate the finally
handling around stop_and_remove_container(container_id), ret_code, container_id,
image_ref and move the single print_delimiter() call to execute unconditionally
after the if/else while preserving the ret_code assignment and the
log.error(...) call on failure.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro Plus
Run ID: 9e550c71-704c-4049-9076-b8b8db1e94aa
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (21)
- .github/workflows/build-notebooks-TEMPLATE.yaml
- .github/workflows/code-quality.yaml
- .github/workflows/docs.yaml
- .pre-commit-config.yaml
- .python-version
- Makefile
- README.md
- ci/check-software-versions.py
- ci/generate_code.sh
- dependencies/cve-constraints.txt
- docs/cves/python.md
- jupyter/datascience/ubi9-python-3.12/pyproject.toml
- jupyter/pytorch/ubi9-python-3.12/pyproject.toml
- jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
- jupyter/trustyai/ubi9-python-3.12/pyproject.toml
- pyproject.toml
- scripts/pylocks_generator.py
- scripts/pylocks_generator.sh
- tests/__init__.py
- tests/_common.py
- tests/containers/workbenches/workbench_image_test.py
💤 Files with no reviewable changes (1)
- scripts/pylocks_generator.sh
✅ Files skipped from review due to trivial changes (10)
- .python-version
- tests/containers/workbenches/workbench_image_test.py
- tests/__init__.py
- tests/_common.py
- jupyter/trustyai/ubi9-python-3.12/pyproject.toml
- jupyter/datascience/ubi9-python-3.12/pyproject.toml
- pyproject.toml
- jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
- README.md
- jupyter/pytorch/ubi9-python-3.12/pyproject.toml
🚧 Files skipped from review as they are similar to previous changes (6)
- ci/generate_code.sh
- .github/workflows/docs.yaml
- .github/workflows/code-quality.yaml
- .pre-commit-config.yaml
- dependencies/cve-constraints.txt
- Makefile
```python
    try:
        result = subprocess.run(
            [str(UV), "--version"],
            capture_output=True,
            text=True,
            check=False,
        )
        version_str = result.stdout.strip().split()[1] if result.stdout.strip() else "0.0.0"
    except (IndexError, FileNotFoundError):
        version_str = "0.0.0"

    version_tuple = tuple(int(x) for x in version_str.split("."))
    if version_tuple < UV_MIN_VERSION:
        min_ver = ".".join(str(x) for x in UV_MIN_VERSION)
        error(f"uv version {version_str} found, but >= {min_ver} is required.")
        error("Please upgrade uv: https://github.com/astral-sh/uv")
        raise SystemExit(1)
```
**`check_uv()` can crash on non-numeric uv version tokens.**

`version_str.split(".")` followed by `int(x)` will raise `ValueError` on any non-numeric component (e.g. `uv 0.8.12+abcdef`, `0.9.0rc1`, or a dev build), and that exception isn't covered by the `except (IndexError, FileNotFoundError)` above — the whole script aborts with a traceback instead of a clean error. Also, `result.returncode` isn't checked, so a failing `uv --version` silently falls back to `"0.0.0"` and the user sees a misleading "version >= 0.4.0 required" message rather than the real problem.
🛡️ Safer parsing

```diff
-    try:
-        result = subprocess.run(
-            [str(UV), "--version"],
-            capture_output=True,
-            text=True,
-            check=False,
-        )
-        version_str = result.stdout.strip().split()[1] if result.stdout.strip() else "0.0.0"
-    except (IndexError, FileNotFoundError):
-        version_str = "0.0.0"
-
-    version_tuple = tuple(int(x) for x in version_str.split("."))
+    try:
+        result = subprocess.run(
+            [str(UV), "--version"],
+            capture_output=True, text=True, check=False,
+        )
+    except FileNotFoundError:
+        error(f"Unable to execute uv wrapper at '{UV}'.")
+        raise SystemExit(1)
+    if result.returncode != 0:
+        error(f"'{UV} --version' failed: {result.stderr.strip() or result.stdout.strip()}")
+        raise SystemExit(1)
+    m = re.match(r"uv\s+(\d+)\.(\d+)\.(\d+)", result.stdout.strip())
+    if not m:
+        error(f"Could not parse uv version from: {result.stdout.strip()!r}")
+        raise SystemExit(1)
+    version_tuple = tuple(int(x) for x in m.groups())
+    version_str = ".".join(str(x) for x in version_tuple)
```

🧰 Tools
🪛 Ruff (0.15.10)
[error] 137-137: `subprocess` call: check for execution of untrusted input (S603)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@scripts/pylocks_generator.py` around lines 136 - 152, The check_uv() logic
currently assumes uv --version succeeds and that all dot-separated tokens are
pure integers; update it to first check subprocess.run(...).returncode and treat
a non-zero return as a failure (log the actual stderr/stdout and exit), then
robustly parse version_str obtained from result.stdout by stripping non-numeric
suffixes or by extracting only the leading numeric components (ignore tokens
with non-digit characters or stop parsing at the first non-numeric token) and
guard the int() conversion with try/except to catch ValueError; ensure any
parsing failure falls back to a safe "0.0.0" (or logs the raw version and exits)
before comparing the resulting tuple to UV_MIN_VERSION, referencing UV,
version_str, version_tuple and UV_MIN_VERSION in your changes.
Summary
Cherry-picks developer tooling improvements from `main` to `rhoai-2.25`:

- `dependencies/cve-constraints.txt` — centralized CVE version constraints applied during lock generation (fix(CVE): centralize CVE constraints and document resolution workflow, opendatahub-io/notebooks#2886)
- `uv.toml` + `./uv` wrapper — pins uv version to `==0.8.12` per Slack guidance (newer versions break wheel handling on 2.25) (ISSUE #3032: chore(uv): pin uv version to fix CI check-generated-code failure, opendatahub-io/notebooks#3034)
- `scripts/pylocks_generator.py` — Python rewrite of the lock generator, replacing the shell script (ISSUE #3036: chore(scripts/): rewrite `scripts/pylocks_generator.sh` to Python, opendatahub-io/notebooks#3057)
- `refresh-pipfilelock-files` → `refresh-lock-files` (consistent with main) (NO-JIRA: chore: fix Makefile `test` target to use correct `./uv` script, opendatahub-io/notebooks#3077)

All commits cherry-picked with `-x` for traceability, plus one adjustment commit to pin uv to 0.8.12.

Note: the keras migration commit was intentionally skipped — not applicable to rhoai-2.25.
Related: opendatahub-io#3197
Test plan

- `./uv --version` shows `0.8.12`
- `cat dependencies/cve-constraints.txt` shows the urllib3 constraint
- `make refresh-lock-files` target works (old `refresh-pipfilelock-files` target removed)
- `make test` passes

🤖 Generated with Claude Code
Summary by CodeRabbit
New Features
Documentation
Chores