
[rhoai-2.25] chore: cherry-pick developer tooling QoL from main#2056

Open
jiridanek wants to merge 3 commits into rhoai-2.25 from cherry-pick-qol-rhoai-2.25

Conversation

@jiridanek
Member

@jiridanek jiridanek commented Mar 26, 2026

Summary

Cherry-picks developer tooling improvements from main to rhoai-2.25:

All commits cherry-picked with -x for traceability, plus one adjustment commit to pin uv to 0.8.12.

Note: keras migration commit was intentionally skipped — not applicable to rhoai-2.25.

Related: opendatahub-io#3197

Test plan

  • ./uv --version shows 0.8.12
  • cat dependencies/cve-constraints.txt shows urllib3 constraint
  • make refresh-lock-files target works (old refresh-pipfilelock-files target removed)
  • make test passes

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • Enhanced lock file generation with support for multiple dependency indexes and flavor variants.
  • Documentation

    • Added comprehensive guide for addressing Python CVEs in image dependencies.
  • Chores

    • Updated Python version from 3.14 to 3.12.
    • Streamlined build system configuration and CI/CD workflows.
    • Migrated lock file generation to Python-based implementation.

@openshift-ci openshift-ci Bot requested review from daniellutz and dibryant March 26, 2026 18:35
@openshift-ci

openshift-ci Bot commented Mar 26, 2026

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign jstourac for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@coderabbitai

coderabbitai Bot commented Mar 26, 2026

📝 Walkthrough

Walkthrough

Multiple configuration and script updates across the project: Python version targets downgraded from 3.14 to 3.12 across workflows and tools; lockfile generation refactored from bash script to Python CLI with expanded build-arg configuration; CVE constraint added for urllib3 with documentation; Makefile enhanced for conditional build arguments and prefetch logic.

Changes

Cohort / File(s) Summary
CI/CD Workflows
.github/workflows/build-notebooks-TEMPLATE.yaml, .github/workflows/code-quality.yaml, .github/workflows/docs.yaml
Updated astral-sh/setup-uv@v7 and actions/setup-python@v6 steps to derive Python version from .python-version file instead of hardcoding 3.14.
Python Target Version
.python-version, pyproject.toml, .pre-commit-config.yaml
Downgraded Python target from 3.14 to 3.12 in version file, project configuration (dependencies and tool versions), and pre-commit pyright check.
Lockfile Generation Refactoring
Makefile, scripts/pylocks_generator.py, scripts/pylocks_generator.sh, ci/generate_code.sh
Replaced bash script with new Python Typer CLI; added refresh-lock-files make target; enhanced build-args configuration with conditional mounts and LOCAL\_BUILD flag logic; updated generate\_code.sh to invoke Python script.
CVE Remediation Infrastructure
dependencies/cve-constraints.txt, docs/cves/python.md
Added urllib3>=2.6.0 CVE-2025-66418 constraint with documentation; introduced comprehensive CVE remediation guide covering direct/transitive dependencies, constraint mechanisms, and resolution workflows.
Image-Specific CVE Overrides
jupyter/datascience/ubi9-python-3.12/pyproject.toml, jupyter/pytorch/ubi9-python-3.12/pyproject.toml, jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml, jupyter/trustyai/ubi9-python-3.12/pyproject.toml
Updated comments for urllib3 override entries to clarify origin (odh-elyra → appengine-python-standard) instead of upstream Elyra issue reference; override values unchanged.
Documentation
README.md
Updated pinned uv version examples from 0.10.6 to 0.8.12 in alternative installation instructions.
Test Infrastructure
tests/__init__.py, tests/_common.py, tests/containers/workbenches/workbench_image_test.py
Extracted PROJECT_ROOT computation into new tests/_common.py shared module; adjusted lambda formatting in SSH tunnel predicate expression.
CI Utilities
ci/check-software-versions.py
Modified error handling in container cleanup logic to set return code instead of early return, allowing proper control flow to completion.
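
The workflow change in the CI/CD cohort above can be sketched as follows (a hedged illustration; the actual step names and surrounding workflow content may differ from this):

```yaml
# Derive the Python version from the repo's .python-version file
# instead of hardcoding it in each workflow.
- name: Set up Python
  uses: actions/setup-python@v6
  with:
    python-version-file: .python-version
```

With this pattern, bumping `.python-version` updates every workflow at once instead of requiring edits in each YAML file.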

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Suggested reviewers

  • daniellutz
  • dibryant
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning — Docstring coverage is 64.29%, below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them.

✅ Passed checks (2 passed)

  • Title check ✅ Passed — The title clearly identifies this as a cherry-pick of developer tooling improvements to the rhoai-2.25 branch, accurately reflecting the main purpose of the changeset.
  • Description check ✅ Passed — The PR description provides a detailed summary of the cherry-picked changes, a test plan checklist, and related issues, though the self-checklist items (make test) and the merge-criteria boxes remain unchecked.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@jiridanek
Member Author

@coderabbitai full review

@coderabbitai

coderabbitai Bot commented Mar 27, 2026

✅ Actions performed

Full review triggered.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 6

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
.github/workflows/piplock-renewal.yaml (1)

87-89: ⚠️ Potential issue | 🔴 Critical

Update workflow to use correct Makefile target: refresh-lock-files

The Makefile no longer contains the refresh-pipfilelock-files target. The target was renamed to refresh-lock-files, which is defined at line 438 of the Makefile. Update the workflow to call the correct target name.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/piplock-renewal.yaml around lines 87 - 89, The workflow
step currently invokes the removed Makefile target refresh-pipfilelock-files;
update that CI step to call the renamed Makefile target refresh-lock-files
instead. Specifically, change the job step labeled "Run make
refresh-pipfilelock-files" so its run command uses make refresh-lock-files with
the same PYTHON_VERSION and INCLUDE_OPT_DIRS environment arguments (i.e.,
replace refresh-pipfilelock-files with refresh-lock-files in the run line).
Ensure no other references to refresh-pipfilelock-files remain.
🧹 Nitpick comments (2)
uv (1)

39-40: Slow path requires uv to be installed.

The slow path uses uv tool run "uv@${version}" which requires a working uv installation. If no uv is installed at all on the system, this will fail with a "command not found" error rather than a helpful message guiding the user to install uv first.

Consider adding a check or a clearer error message:

🔧 Proposed improvement
 # Slow path: run the pinned version via uvx (downloaded and cached on first use)
+if ! command -v uv &>/dev/null; then
+    echo "error: uv is not installed. Please install uv first: https://docs.astral.sh/uv/getting-started/installation/" >&2
+    exit 1
+fi
 exec uv tool run "uv@${version}" "$@"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@uv` around lines 39 - 40, The slow-path call using exec uv tool run
"uv@${version}" "$@" will fail with a "command not found" if uv isn't installed;
add a pre-check that verifies the uv binary is present (e.g., check command -v
uv / which uv) and if missing print a clear error guiding the user to install uv
before attempting the slow path (or fall back to a different installer path),
then only call exec uv tool run "uv@${version}" "$@" when the check succeeds.
docs/cves/python.md (1)

19-21: Add languages to the new fenced blocks.

markdownlint is already flagging MD040 on these fences. Adding text, bash, or toml/ini keeps the doc lint-clean.

Also applies to: 28-32, 41-45, 116-119, 176-179

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/cves/python.md` around lines 19 - 21, Several fenced code blocks (e.g.,
the one containing "dependencies/cve-constraints.txt") are missing language
specifiers and trigger markdownlint MD040; update each affected triple-backtick
fence to include an appropriate language tag (for example use "text" for plain
filenames, "bash" for shell snippets, or "toml"/"ini" for config blocks) so the
fences at the indicated occurrences are language-tagged and lint-clean.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/cves/python.md`:
- Around line 34-35: Update all references to the removed
scripts/pylocks_generator.sh in this document (notably around the paragraph
currently numbered "2. Automatic application" and the sections at lines 47-52
and 205-216) to point to the new entrypoints: use "./uv run
scripts/pylocks_generator.py ..." or the Make target "make refresh-lock-files"
where appropriate; ensure examples, command snippets, and the related-files
section consistently reference scripts/pylocks_generator.py or the make target
instead of the deleted .sh script.

In `@Makefile`:
- Around line 93-100: The prefetch and cachi2 existence checks use caller CWD;
update all path checks and wildcards to resolve via ROOT_DIR so they mirror the
rest of the build logic: change uses of $(wildcard $(BUILD_DIR)prefetch-input),
[ -d '$(BUILD_DIR)prefetch-input' ], and [ ! -d cachi2/output ] to reference
$(ROOT_DIR) (e.g., $(ROOT_DIR)$(BUILD_DIR)prefetch-input and
$(ROOT_DIR)cachi2/output) and likewise update the CACHI2_VOLUME and
LOCAL_BUILD_ARG evaluations to base their wildcard tests on $(ROOT_DIR) to
ensure invocations outside repo root see the same cache/mount behavior.

In `@pyproject.toml`:
- Line 34: The pipeline fails because the new dependency "typer" was added to
pyproject.toml but the lock file (uv.lock) wasn't regenerated; run the project's
lock-refresh task to update the lockfile (execute make refresh-lock-files) so
uv.lock is regenerated and the missing `source`/matching-package error for
"typer" is resolved, then commit the updated lock file alongside the pyproject
change.

In `@README.md`:
- Around line 84-100: Update the version strings in the README examples to match
uv.toml (change 0.10.6 → 0.8.12) by replacing occurrences in the three examples:
the uvx invocation (uvx uv@0.10.6 sync --locked → uvx uv@0.8.12 sync --locked),
the uv tool run invocation (uv tool run uv@0.10.6 sync --locked → uv tool run
uv@0.8.12 sync --locked), and the installer/pip lines (curl
.../uv/0.10.6/install.sh → .../uv/0.8.12/install.sh and pip install uv==0.10.6 →
pip install uv==0.8.12) so the README and uv.toml versions are consistent.

In `@scripts/pylocks_generator.py`:
- Around line 345-354: The loop silently skips directories when
extract_python_version(tdir) returns None or detect_flavors(tdir) returns empty,
causing the script to succeed with no output; change these early-continue paths
to fail-fast by emitting a clear error (use warn or a new error log) and exiting
non-zero (e.g., sys.exit(1) or raise a RuntimeError) so CI/local runs detect the
bad DIR=; update the blocks around extract_python_version(tdir) and
detect_flavors(tdir) accordingly and add an import for sys if using sys.exit.
- Around line 240-296: The script currently unconditionally deletes the
destination pylock file on failure (output_path.unlink) which can remove a
previously good checked-in lockfile; change the flow to preserve/restore the
existing lockfile: before running subprocess.run, check if (project_dir /
output).exists() and if so atomically move it to a backup (e.g., backup_path =
project_dir / f"{output}.bak" or similar) and then run cmd; if result.returncode
!= 0 restore the backup to output_path (or leave it in place) instead of
unconditionally unlinking, and if the run succeeds remove the backup; use the
existing symbols output, project_dir, output_path and result to implement the
backup/restore logic.

---

Outside diff comments:
In @.github/workflows/piplock-renewal.yaml:
- Around line 87-89: The workflow step currently invokes the removed Makefile
target refresh-pipfilelock-files; update that CI step to call the renamed
Makefile target refresh-lock-files instead. Specifically, change the job step
labeled "Run make refresh-pipfilelock-files" so its run command uses make
refresh-lock-files with the same PYTHON_VERSION and INCLUDE_OPT_DIRS environment
arguments (i.e., replace refresh-pipfilelock-files with refresh-lock-files in
the run line). Ensure no other references to refresh-pipfilelock-files remain.

---

Nitpick comments:
In `@docs/cves/python.md`:
- Around line 19-21: Several fenced code blocks (e.g., the one containing
"dependencies/cve-constraints.txt") are missing language specifiers and trigger
markdownlint MD040; update each affected triple-backtick fence to include an
appropriate language tag (for example use "text" for plain filenames, "bash" for
shell snippets, or "toml"/"ini" for config blocks) so the fences at the
indicated occurrences are language-tagged and lint-clean.

In `@uv`:
- Around line 39-40: The slow-path call using exec uv tool run "uv@${version}"
"$@" will fail with a "command not found" if uv isn't installed; add a pre-check
that verifies the uv binary is present (e.g., check command -v uv / which uv)
and if missing print a clear error guiding the user to install uv before
attempting the slow path (or fall back to a different installer path), then only
call exec uv tool run "uv@${version}" "$@" when the check succeeds.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: dd66e507-084f-45f0-997a-e92035f1bcac

📥 Commits

Reviewing files that changed from the base of the PR and between 3005b0f and 1955290.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (20)
  • .github/workflows/build-notebooks-TEMPLATE.yaml
  • .github/workflows/code-quality.yaml
  • .github/workflows/docs.yaml
  • .github/workflows/piplock-renewal.yaml
  • .github/workflows/security.yaml
  • .pre-commit-config.yaml
  • Makefile
  • README.md
  • ci/generate_code.sh
  • dependencies/cve-constraints.txt
  • docs/cves/python.md
  • jupyter/datascience/ubi9-python-3.12/pyproject.toml
  • jupyter/pytorch/ubi9-python-3.12/pyproject.toml
  • jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
  • jupyter/trustyai/ubi9-python-3.12/pyproject.toml
  • pyproject.toml
  • scripts/pylocks_generator.py
  • scripts/pylocks_generator.sh
  • uv
  • uv.toml
💤 Files with no reviewable changes (1)
  • scripts/pylocks_generator.sh

Comment thread docs/cves/python.md
Comment thread Makefile
Comment on lines +93 to +100
	$(eval LOCAL_BUILD_ARG := $(if $(wildcard $(BUILD_DIR)prefetch-input),--build-arg LOCAL_BUILD=true,))
	$(eval CACHI2_VOLUME := $(if $(and $(wildcard cachi2/output),$(wildcard $(BUILD_DIR)prefetch-input)),--volume $(ROOT_DIR)cachi2/output:/cachi2/output:Z,))

	$(info # Building $(IMAGE_NAME) using $(DOCKERFILE_NAME) with $(CONF_FILE) and $(BUILD_ARGS)...)

	@if [ -d '$(BUILD_DIR)prefetch-input' ] && [ ! -d cachi2/output ]; then \
	  echo "Prefetch required for hermetic build. Run: scripts/lockfile-generators/prefetch-all.sh --component-dir $(patsubst %/,%,$(BUILD_DIR)) -- see scripts/lockfile-generators/README.md"; \
	  exit 1; \

@coderabbitai coderabbitai Bot Mar 27, 2026


⚠️ Potential issue | 🟡 Minor

Resolve prefetch and cachi2 paths from $(ROOT_DIR).

These checks are currently relative to the caller's CWD, while the rest of the build logic is rooted via ROOT_DIR. Invoking the Makefile outside the repo root can miss an existing cache and incorrectly fail or skip the hermetic mount.

💡 Suggested fix
-	$(eval LOCAL_BUILD_ARG := $(if $(wildcard $(BUILD_DIR)prefetch-input),--build-arg LOCAL_BUILD=true,))
-	$(eval CACHI2_VOLUME := $(if $(and $(wildcard cachi2/output),$(wildcard $(BUILD_DIR)prefetch-input)),--volume $(ROOT_DIR)cachi2/output:/cachi2/output:Z,))
+	$(eval LOCAL_BUILD_ARG := $(if $(wildcard $(ROOT_DIR)$(BUILD_DIR)prefetch-input),--build-arg LOCAL_BUILD=true,))
+	$(eval CACHI2_VOLUME := $(if $(and $(wildcard $(ROOT_DIR)cachi2/output),$(wildcard $(ROOT_DIR)$(BUILD_DIR)prefetch-input)),--volume $(ROOT_DIR)cachi2/output:/cachi2/output:Z,))
@@
-	`@if` [ -d '$(BUILD_DIR)prefetch-input' ] && [ ! -d cachi2/output ]; then \
+	`@if` [ -d '$(ROOT_DIR)$(BUILD_DIR)prefetch-input' ] && [ ! -d '$(ROOT_DIR)cachi2/output' ]; then \
 	  echo "Prefetch required for hermetic build. Run: scripts/lockfile-generators/prefetch-all.sh --component-dir $(patsubst %/,%,$(BUILD_DIR)) -- see scripts/lockfile-generators/README.md"; \
 	  exit 1; \
 	fi
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@Makefile` around lines 93 - 100, The prefetch and cachi2 existence checks use
caller CWD; update all path checks and wildcards to resolve via ROOT_DIR so they
mirror the rest of the build logic: change uses of $(wildcard
$(BUILD_DIR)prefetch-input), [ -d '$(BUILD_DIR)prefetch-input' ], and [ ! -d
cachi2/output ] to reference $(ROOT_DIR) (e.g.,
$(ROOT_DIR)$(BUILD_DIR)prefetch-input and $(ROOT_DIR)cachi2/output) and likewise
update the CACHI2_VOLUME and LOCAL_BUILD_ARG evaluations to base their wildcard
tests on $(ROOT_DIR) to ensure invocations outside repo root see the same
cache/mount behavior.

Member Author


this should not be cherry-picked?



@jiridanek, that's a fair point. If the LOCAL_BUILD_ARG / CACHI2_VOLUME / prefetch-input guard block was not intentionally included in this cherry-pick (it doesn't appear in the listed PR objectives for rhoai-2.25), then my path-resolution comment doesn't apply here.

Would you like to:

  1. Revert those specific lines from the build_image function to keep this PR scoped to the intended changes (uv pinning, cve-constraints, pylocks_generator rewrite, Makefile target rename)?
  2. Or keep the changes but address the ROOT_DIR path concern as a follow-up?

If the prefetch/hermetic-build support is indeed out of scope for rhoai-2.25, I can resolve this comment.


🧠 Learnings used
Learnt from: CR
Repo: red-hat-data-services/notebooks PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-03-26T19:41:58.522Z
Learning: Applies to Makefile : Set `KONFLUX` Makefile variable consistently across build and test steps; use `KONFLUX=no` for `Dockerfile.*` and `manifests/odh/base/` manifests, or `KONFLUX=yes` for `Dockerfile.konflux.*` and `manifests/rhoai/base/` manifests

Learnt from: CR
Repo: red-hat-data-services/notebooks PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-03-26T19:41:58.522Z
Learning: Applies to **/Dockerfile{,.konflux*} : Minimize Docker layers in Dockerfiles

Learnt from: CR
Repo: red-hat-data-services/notebooks PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-03-26T19:41:58.522Z
Learning: Applies to **/Dockerfile{,.konflux*} : Follow security best practices in Dockerfiles

Comment thread pyproject.toml
Comment thread README.md Outdated
Comment on lines +240 to +296
    if mode == IndexMode.public_index:
        output = "pylock.toml"
        desc = "pylock.toml (public index)"
        print("➡️ Generating pylock.toml from public PyPI index...")
    else:
        (project_dir / "uv.lock.d").mkdir(exist_ok=True)
        output = f"uv.lock.d/pylock.{flavor}.toml"
        desc = f"{flavor.upper()} lock file"
        print(f"➡️ Generating {flavor.upper()} lock file...")

    # Tag filtering was added in uv 0.9.16 (https://github.com/astral-sh/uv/pull/16956)
    # but bypassed in --universal mode. uv 0.10.5 (https://github.com/astral-sh/uv/pull/18081)
    # now filters wheels by requires-python and marker disjointness even in --universal mode.
    # Documentation at https://docs.astral.sh/uv/reference/cli/#uv-pip-compile--python-platform says that
    # `--python-platform linux` is alias for `x86_64-unknown-linux-gnu`; we cannot use this to get a multiarch pylock
    # Let's use --universal temporarily, and in the future we can switch to using uv.lock
    # when https://github.com/astral-sh/uv/issues/6830 is resolved, or symlink `ln -s uv.lock.d/uv.${flavor}.lock uv.lock`
    # Note: currently generating uv.lock.d/pylock.${flavor}.toml; future rename to uv.${flavor}.lock is planned
    # See also --universal discussion with Gerard
    # https://redhat-internal.slack.com/archives/C0961HQ858Q/p1757935641975969?thread_ts=1757542802.032519&cid=C0961HQ858Q
    cmd: list[str] = [
        str(UV),
        "pip",
        "compile",
        "pyproject.toml",
        "--output-file",
        output,
        "--format",
        "pylock.toml",
        "--generate-hashes",
        "--emit-index-url",
        f"--python-version={python_version}",
        "--universal",
        "--no-annotate",
        "--quiet",
    ]

    for pkg in NO_EMIT_PACKAGES:
        cmd.extend(["--no-emit-package", pkg])

    if upgrade:
        cmd.append("--upgrade")

    # Use relative path to avoid absolute paths in pylock.toml headers
    if CVE_CONSTRAINTS_FILE.is_file():
        relative_constraints = os.path.relpath(CVE_CONSTRAINTS_FILE, project_dir)
        cmd.extend(["--constraints", relative_constraints])

    cmd.extend(index_flags)

    result = subprocess.run(cmd, cwd=project_dir, check=False)

    if result.returncode != 0:
        warn(f"Failed to generate {desc} in {project_dir}")
        output_path = project_dir / output
        output_path.unlink(missing_ok=True)
        return False


⚠️ Potential issue | 🟠 Major

Preserve the last good lockfile when compilation fails.

unlink() here removes the destination unconditionally. If uv pip compile fails because of a transient resolver or index problem, an existing checked-in pylock.toml / uv.lock.d/pylock.*.toml gets deleted and the repo is left in a worse state than before the run.

💡 Safer pattern
     if mode == IndexMode.public_index:
-        output = "pylock.toml"
+        output = Path("pylock.toml")
         desc = "pylock.toml (public index)"
         print("➡️ Generating pylock.toml from public PyPI index...")
     else:
         (project_dir / "uv.lock.d").mkdir(exist_ok=True)
-        output = f"uv.lock.d/pylock.{flavor}.toml"
+        output = Path("uv.lock.d") / f"pylock.{flavor}.toml"
         desc = f"{flavor.upper()} lock file"
         print(f"➡️ Generating {flavor.upper()} lock file...")
+    tmp_output = output.with_suffix(output.suffix + ".tmp")

     cmd: list[str] = [
         str(UV),
         "pip",
         "compile",
         "pyproject.toml",
         "--output-file",
-        output,
+        str(tmp_output),
         "--format",
         "pylock.toml",
         "--generate-hashes",
         "--emit-index-url",
         f"--python-version={python_version}",
@@
     result = subprocess.run(cmd, cwd=project_dir, check=False)

     if result.returncode != 0:
         warn(f"Failed to generate {desc} in {project_dir}")
-        output_path = project_dir / output
-        output_path.unlink(missing_ok=True)
+        (project_dir / tmp_output).unlink(missing_ok=True)
         return False

+    (project_dir / tmp_output).replace(project_dir / output)
     ok(f"{desc} generated successfully.")
🧰 Tools
🪛 Ruff (0.15.7)

[error] 290-290: subprocess call: check for execution of untrusted input

(S603)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/pylocks_generator.py` around lines 240 - 296, The script currently
unconditionally deletes the destination pylock file on failure
(output_path.unlink) which can remove a previously good checked-in lockfile;
change the flow to preserve/restore the existing lockfile: before running
subprocess.run, check if (project_dir / output).exists() and if so atomically
move it to a backup (e.g., backup_path = project_dir / f"{output}.bak" or
similar) and then run cmd; if result.returncode != 0 restore the backup to
output_path (or leave it in place) instead of unconditionally unlinking, and if
the run succeeds remove the backup; use the existing symbols output,
project_dir, output_path and result to implement the backup/restore logic.

Comment on lines +345 to +354
python_version = extract_python_version(tdir)
if python_version is None:
warn(f"Could not extract valid Python version from directory name: {tdir}")
warn("Expected directory format: .../ubi9-python-X.Y")
continue

flavors = detect_flavors(tdir)
if not flavors:
warn(f"No Dockerfiles found in {tdir} (cpu/cuda/rocm). Skipping.")
continue


⚠️ Potential issue | 🟠 Major

Make explicit DIR= mistakes fail the command.

When the Makefile's DIR= flow points at the wrong directory, these continues skip work but never record a failure. The command can exit 0 after generating nothing, which is a bad failure mode for both local usage and CI.

💡 Minimal fix
         python_version = extract_python_version(tdir)
         if python_version is None:
             warn(f"Could not extract valid Python version from directory name: {tdir}")
             warn("Expected directory format: .../ubi9-python-X.Y")
+            if target_dir is not None:
+                failed_dirs.append(tdir)
             continue

         flavors = detect_flavors(tdir)
         if not flavors:
             warn(f"No Dockerfiles found in {tdir} (cpu/cuda/rocm). Skipping.")
+            if target_dir is not None:
+                failed_dirs.append(tdir)
             continue
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/pylocks_generator.py` around lines 345 - 354, The loop silently skips
directories when extract_python_version(tdir) returns None or
detect_flavors(tdir) returns empty, causing the script to succeed with no
output; change these early-continue paths to fail-fast by emitting a clear error
(use warn or a new error log) and exiting non-zero (e.g., sys.exit(1) or raise a
RuntimeError) so CI/local runs detect the bad DIR=; update the blocks around
extract_python_version(tdir) and detect_flavors(tdir) accordingly and add an
import for sys if using sys.exit.
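
The fail-fast behaviour requested above can be sketched standalone (function and parameter names are hypothetical; the real script's loop and logging differ). Instead of `continue`-ing past bad directories and exiting 0, the loop records each failure and the command's exit code reflects it:

```python
import sys


def process_dirs(dirs: list[str], extract_version) -> int:
    """Process each target directory; return non-zero if any directory
    could not be handled, instead of silently skipping it."""
    failed: list[str] = []
    for d in dirs:
        version = extract_version(d)
        if version is None:
            # Record the failure rather than silently skipping the directory.
            print(f"error: could not extract a Python version from {d}", file=sys.stderr)
            failed.append(d)
            continue
        print(f"generating locks for {d} (python {version})")
    return 1 if failed else 0
```

A CLI wrapper would then call `sys.exit(process_dirs(...))`, so a mistyped `DIR=` fails the Make target in both local runs and CI.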


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/cves/python.md`:
- Around line 91-96: Replace bare invocations of the uv wrapper with the
repo-pinned wrapper by changing commands like "uv tree | grep -A5 -B5 tornado"
and "uv tree --invert tornado" to "./uv tree | grep -A5 -B5 tornado" and "./uv
tree --invert tornado" (and update any other plain "uv" examples elsewhere in
the file, e.g., the second uv example block) so the docs always use the pinned
./uv wrapper.
- Around line 19-21: Several fenced code blocks in the docs/cves/python.md
examples (e.g., the block containing "dependencies/cve-constraints.txt" and the
blocks starting with "# CVE-ID: Description", "# RHAIENG-XXXX: CVE-YYYY-ZZZZZ
package_name vulnerability description", "# RHAIENG-2448: CVE-XXXX-YYYY tornado
quadratic DoS", and "# RHAIENG-2458: CVE-2025-66418 urllib3 decompression
vulnerability") are missing language identifiers; add an appropriate language
tag (e.g., ```text or ```ini) to each fenced code block to satisfy MD040 linting
so the blocks become ```text ... ``` (or another suitable tag) for all
occurrences noted (also at the other ranges mentioned).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 41ed7a2d-4352-4032-abf5-5b2779b63ee5

📥 Commits

Reviewing files that changed from the base of the PR and between 1955290 and 4d81430.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (2)
  • README.md
  • docs/cves/python.md
✅ Files skipped from review due to trivial changes (1)
  • README.md

Comment thread docs/cves/python.md
Comment on lines +19 to +21
```
dependencies/cve-constraints.txt
```


⚠️ Potential issue | 🟡 Minor

Add language identifiers to fenced code blocks to satisfy markdown linting.

Several fenced blocks are missing language tags (MD040).

Suggested doc patch
-```
+```text
 dependencies/cve-constraints.txt
  • # CVE-ID: Description
    # Reference: https://...
    package>=fixed_version
    
  • # RHAIENG-XXXX: CVE-YYYY-ZZZZZ package_name vulnerability description
    # Upstream: https://github.com/...
    package_name>=fixed_version
    
  • # RHAIENG-2448: CVE-XXXX-YYYY tornado quadratic DoS
    tornado>=6.5.3
    
  • # RHAIENG-2458: CVE-2025-66418 urllib3 decompression vulnerability
    urllib3>=2.6.0
    


Also applies to: 28-32, 41-45, 116-119, 176-179

🧰 Tools
🪛 markdownlint-cli2 (0.22.0)

[warning] 19-19: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed.

In `@docs/cves/python.md` around lines 19 - 21, Several fenced code blocks in the
docs/cves/python.md examples (e.g., the block containing
"dependencies/cve-constraints.txt" and the blocks starting with "# CVE-ID:
Description", "# RHAIENG-XXXX: CVE-YYYY-ZZZZZ package_name vulnerability
description", "# RHAIENG-2448: CVE-XXXX-YYYY tornado quadratic DoS", and "#
RHAIENG-2458: CVE-2025-66418 urllib3 decompression vulnerability") are missing
language identifiers; add an appropriate language tag (e.g., `text` or `ini`) to
each fenced code block to satisfy MD040 linting, for all occurrences noted (also
at the other ranges mentioned).


</details>


<!-- This is an auto-generated comment by CodeRabbit -->

Comment thread docs/cves/python.md
@jiridanek jiridanek force-pushed the cherry-pick-qol-rhoai-2.25 branch from 4d81430 to d26add0 Compare March 31, 2026 09:43

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (4)
docs/cves/python.md (2)

90-96: ⚠️ Potential issue | 🟡 Minor

Use the pinned ./uv wrapper in examples.

These command snippets still use bare uv, which can bypass the repo-pinned versioning model.

💡 Doc patch
 # Using uv (preferred)
-uv tree | grep -A5 -B5 tornado
+./uv tree | grep -A5 -B5 tornado

 # Or check the package's dependents
-uv tree --invert tornado
+./uv tree --invert tornado
@@
 # Check dependency tree
-uv tree
+./uv tree

 # Find what depends on a package
-uv tree --invert package-name
+./uv tree --invert package-name

Also applies to: 219-223

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/cves/python.md` around lines 90 - 96, Replace bare `uv` usage in the
example commands with the repository-pinned wrapper `./uv` so the docs use the
pinned version; specifically update the examples that show "uv tree | grep -A5
-B5 tornado" and "uv tree --invert tornado" to use "./uv" and also make the same
change for the other occurrence noted around the later example (the block
referenced as lines 219-223) so all examples consistently call the local ./uv
wrapper.

19-21: ⚠️ Potential issue | 🟡 Minor

Add language identifiers to fenced code blocks.

Several fenced blocks are still missing a language tag (MD040). Use text, bash, or toml as appropriate.

Also applies to: 28-32, 41-45, 116-119, 176-179

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/cves/python.md` around lines 19 - 21, Multiple fenced code blocks in
docs/cves/python.md are missing language identifiers (MD040). Edit the fenced
blocks at the ranges called out (lines ~19, 28-32, 41-45, 116-119, 176-179) and
add appropriate language tags (e.g., ```text, ```bash, or ```toml) to each
opening fence so the markdown linter no longer flags MD040; pick `text` for
plain file paths, `bash` for shell commands, and `toml` for constraint files or
config snippets to match the content.
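The MD040 check above can be reproduced locally without a linter; the following is a rough sketch (the helper and regex are illustrative, not part of this PR) that reports opening fences lacking a language tag:

```python
# Hypothetical helper: flag fenced code blocks that open without a
# language tag, mirroring markdownlint rule MD040.
import re
from pathlib import Path

FENCE_RE = re.compile(r"^(`{3,})(\S*)\s*$")

def fences_missing_language(path: Path) -> list[int]:
    """Return 1-based line numbers of opening fences without a language."""
    missing: list[int] = []
    open_fence: str | None = None  # delimiter of the currently open block
    for lineno, line in enumerate(path.read_text().splitlines(), start=1):
        m = FENCE_RE.match(line.strip())
        if not m:
            continue
        if open_fence is None:
            if not m.group(2):  # opening fence with no language tag
                missing.append(lineno)
            open_fence = m.group(1)
        elif m.group(1).startswith(open_fence) and not m.group(2):
            open_fence = None  # closing fence
    return missing
```

Running it over `docs/cves/python.md` would list the fences the review flagged.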
scripts/pylocks_generator.py (2)

240-247: ⚠️ Potential issue | 🟠 Major

Don’t delete existing lockfiles when compile fails.

A transient uv pip compile failure currently removes the destination lockfile, which can erase a previously valid checked-in file.

💡 Safer write pattern (temp file + atomic replace)
 def run_lock(
@@
 ) -> bool:
@@
     if mode == IndexMode.public_index:
-        output = "pylock.toml"
+        output = Path("pylock.toml")
         desc = "pylock.toml (public index)"
         print("➡️ Generating pylock.toml from public PyPI index...")
     else:
         (project_dir / "uv.lock.d").mkdir(exist_ok=True)
-        output = f"uv.lock.d/pylock.{flavor}.toml"
+        output = Path("uv.lock.d") / f"pylock.{flavor}.toml"
         desc = f"{flavor.upper()} lock file"
         print(f"➡️ Generating {flavor.upper()} lock file...")
+    tmp_output = output.with_suffix(output.suffix + ".tmp")
@@
         "pyproject.toml",
         "--output-file",
-        output,
+        str(tmp_output),
@@
     if result.returncode != 0:
         warn(f"Failed to generate {desc} in {project_dir}")
-        output_path = project_dir / output
-        output_path.unlink(missing_ok=True)
+        (project_dir / tmp_output).unlink(missing_ok=True)
         return False

+    (project_dir / tmp_output).replace(project_dir / output)
     ok(f"{desc} generated successfully.")

Also applies to: 265-266, 292-296
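The temp-file-plus-atomic-replace pattern the patch proposes can be sketched in isolation; names such as `compile_lockfile` are illustrative, not the PR's actual code:

```python
# Illustrative sketch of write-to-temp, atomic-replace: the existing
# lockfile is only overwritten once the compile step has succeeded.
import subprocess
from pathlib import Path

def compile_lockfile(project_dir: Path, output: Path, cmd: list[str]) -> bool:
    """Run a lock command in project_dir, replacing `output` only on success."""
    tmp = output.with_suffix(output.suffix + ".tmp")
    result = subprocess.run([*cmd, "--output-file", str(tmp)], cwd=project_dir)
    if result.returncode != 0:
        (project_dir / tmp).unlink(missing_ok=True)  # drop partial output
        return False
    # Path.replace uses os.replace, which is atomic on POSIX, so the old
    # lockfile survives any failure up to this point.
    (project_dir / tmp).replace(project_dir / output)
    return True
```

A transient `uv pip compile` failure then leaves the checked-in lockfile untouched instead of deleting it.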


345-354: ⚠️ Potential issue | 🟠 Major

Fail the run for invalid DIR inputs instead of silently skipping.

When a specific target_dir is passed and it fails Python-version extraction or flavor detection, the script currently continues and may still exit 0.

💡 Minimal fix to propagate failure in `DIR=` flow
         python_version = extract_python_version(tdir)
         if python_version is None:
             warn(f"Could not extract valid Python version from directory name: {tdir}")
             warn("Expected directory format: .../ubi9-python-X.Y")
+            if target_dir is not None:
+                failed_dirs.append(tdir)
             continue

         flavors = detect_flavors(tdir)
         if not flavors:
             warn(f"No Dockerfiles found in {tdir} (cpu/cuda/rocm). Skipping.")
+            if target_dir is not None:
+                failed_dirs.append(tdir)
             continue
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/pylocks_generator.py` around lines 345 - 354, When processing a
specific directory (tdir) the script currently warns and continues on failures
from extract_python_version(tdir) and detect_flavors(tdir); update both failure
branches so that if a target_dir filter is active (check the variable that holds
the user-specified DIR, e.g. target_dir) the script terminates with a non-zero
exit (sys.exit(1) or raise SystemExit) instead of continue(), otherwise retain
the current warn+continue behavior for broader runs. Make the checks adjacent to
the existing extract_python_version and detect_flavors calls so failures for the
targeted DIR propagate as an error return.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/cves/python.md`:
- Around line 78-80: The grep example only searches pylock.toml and misses files
like uv.lock.d/pylock.<flavor>.toml; update the example command (the line
containing grep -r "tornado" --include="pylock.toml" .) to include the
uv.lock.d/pylock.*.toml pattern as another --include (or otherwise use an
include pattern that matches both) so the rh-index mode lockfile variants are
covered.

---

Duplicate comments:
In `@docs/cves/python.md`:
- Around line 90-96: Replace bare `uv` usage in the example commands with the
repository-pinned wrapper `./uv` so the docs use the pinned version;
specifically update the examples that show "uv tree | grep -A5 -B5 tornado" and
"uv tree --invert tornado" to use "./uv" and also make the same change for the
other occurrence noted around the later example (the block referenced as lines
219-223) so all examples consistently call the local ./uv wrapper.
- Around line 19-21: Multiple fenced code blocks in docs/cves/python.md are
missing language identifiers (MD040). Edit the fenced blocks at the ranges
called out (lines ~19, 28-32, 41-45, 116-119, 176-179) and add appropriate
language tags (e.g., ```text, ```bash, or ```toml) to each opening fence so the
markdown linter no longer flags MD040; pick `text` for plain file paths, `bash`
for shell commands, and `toml` for constraint files or config snippets to match
the content.

In `@scripts/pylocks_generator.py`:
- Around line 345-354: When processing a specific directory (tdir) the script
currently warns and continues on failures from extract_python_version(tdir) and
detect_flavors(tdir); update both failure branches so that if a target_dir
filter is active (check the variable that holds the user-specified DIR, e.g.
target_dir) the script terminates with a non-zero exit (sys.exit(1) or raise
SystemExit) instead of continue(), otherwise retain the current warn+continue
behavior for broader runs. Make the checks adjacent to the existing
extract_python_version and detect_flavors calls so failures for the targeted DIR
propagate as an error return.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 077fa609-30a4-4fbd-9733-bfa498027c13

📥 Commits

Reviewing files that changed from the base of the PR and between 1955290 and d26add0.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (20)
  • .github/workflows/build-notebooks-TEMPLATE.yaml
  • .github/workflows/code-quality.yaml
  • .github/workflows/docs.yaml
  • .github/workflows/piplock-renewal.yaml
  • .github/workflows/security.yaml
  • .pre-commit-config.yaml
  • Makefile
  • README.md
  • ci/generate_code.sh
  • dependencies/cve-constraints.txt
  • docs/cves/python.md
  • jupyter/datascience/ubi9-python-3.12/pyproject.toml
  • jupyter/pytorch/ubi9-python-3.12/pyproject.toml
  • jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
  • jupyter/trustyai/ubi9-python-3.12/pyproject.toml
  • pyproject.toml
  • scripts/pylocks_generator.py
  • scripts/pylocks_generator.sh
  • uv
  • uv.toml
💤 Files with no reviewable changes (1)
  • scripts/pylocks_generator.sh
✅ Files skipped from review due to trivial changes (14)
  • uv.toml
  • .github/workflows/build-notebooks-TEMPLATE.yaml
  • jupyter/datascience/ubi9-python-3.12/pyproject.toml
  • .github/workflows/piplock-renewal.yaml
  • .github/workflows/security.yaml
  • jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
  • .github/workflows/code-quality.yaml
  • dependencies/cve-constraints.txt
  • jupyter/pytorch/ubi9-python-3.12/pyproject.toml
  • .github/workflows/docs.yaml
  • jupyter/trustyai/ubi9-python-3.12/pyproject.toml
  • pyproject.toml
  • README.md
  • ci/generate_code.sh
🚧 Files skipped from review as they are similar to previous changes (3)
  • .pre-commit-config.yaml
  • uv
  • Makefile

Comment thread docs/cves/python.md
Comment on lines +78 to +80
# Search in pylock.toml files
grep -r "tornado" --include="pylock.toml" .
```


⚠️ Potential issue | 🟡 Minor

Broaden lockfile search example for rh-index mode.

The current grep only checks pylock.toml and misses uv.lock.d/pylock.<flavor>.toml files.

💡 Doc patch
 # Search in pylock.toml files
-grep -r "tornado" --include="pylock.toml" .
+grep -r "tornado" --include="pylock*.toml" .
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
 # Search in pylock.toml files
-grep -r "tornado" --include="pylock.toml" .
+grep -r "tornado" --include="pylock*.toml" .
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/cves/python.md` around lines 78 - 80, The grep example only searches
pylock.toml and misses files like uv.lock.d/pylock.<flavor>.toml; update the
example command (the line containing grep -r "tornado" --include="pylock.toml"
.) to include the uv.lock.d/pylock.*.toml pattern as another --include (or
otherwise use an include pattern that matches both) so the rh-index mode
lockfile variants are covered.
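The same search can be done with pathlib instead of grep; this is a sketch (the helper name is hypothetical) that covers both the top-level `pylock.toml` and the `uv.lock.d/pylock.<flavor>.toml` variants:

```python
# Sketch: find which lockfiles mention a given package, covering both
# the public-index layout (pylock.toml) and the rh-index flavor layout
# (uv.lock.d/pylock.<flavor>.toml).
from pathlib import Path

def lockfiles_mentioning(root: Path, package: str) -> list[Path]:
    hits: list[Path] = []
    for pattern in ("**/pylock.toml", "**/uv.lock.d/pylock.*.toml"):
        for lockfile in sorted(root.glob(pattern)):
            if package in lockfile.read_text():
                hits.append(lockfile)
    return hits
```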

@jiridanek jiridanek force-pushed the cherry-pick-qol-rhoai-2.25 branch 2 times, most recently from 756e2dd to c160f51 Compare March 31, 2026 10:07
@jiridanek jiridanek changed the title chore(rhoai-2.25): cherry-pick developer tooling QoL from main [rhoai-2.25] chore: cherry-pick developer tooling QoL from main Apr 16, 2026
jiridanek and others added 3 commits April 16, 2026 21:47
…txt (opendatahub-io#2886)

RHAIENG-2458: CVE-2025-66418 urllib3 decompression vulnerability

- Create dependencies/cve-constraints.txt as single source of truth for
  CVE-induced minimum version constraints
- Update pylocks_generator.sh to use --constraints flag with CVE file
- Update comments on urllib3 overrides in jupyter images to explain that
  the override is needed because odh-elyra's appengine-python-standard has
  an obnoxious urllib3<2 constraint
- Add docs/cves/python.md documenting the CVE resolution workflow

This approach:
- Centralizes CVE fixes in one file
- Applies constraints at resolution time via uv pip compile --constraints
- Uses override-dependencies only where needed (odh-elyra conflict)
- Prevents CVEs from returning through transitive dependencies

Thanks to Adriana Theodorakopoulou for contributing the CVE resolution
workflow documentation.

(cherry picked from commit 2e8d387)
…generator.sh` to Python (opendatahub-io#3057)

**Files created:**
- `scripts/pylocks_generator.py` -- Python rewrite of the bash script using typer for CLI

**Files modified:**
- `pyproject.toml` -- added `typer` to dev dependencies
- `uv.lock` -- updated by `uv lock` to include typer as explicit dep
- `Makefile` -- `refresh-lock-files` target now calls `python3 scripts/pylocks_generator.py`
- `ci/generate_code.sh` -- uses `"${REPO_ROOT}/uv" run scripts/pylocks_generator.py` (consistent with adjacent lines)
- `scripts/lockfile-generators/create-requirements-lockfile.sh` -- all references updated from `.sh` to `.py`

**Files deleted:**
- `scripts/pylocks_generator.sh` -- the original bash script

**Verification results:**
- `--help` shows the correct typer CLI with `INDEX_MODE` and `TARGET_DIR` arguments
- Running against `jupyter/minimal/ubi9-python-3.12` produced byte-identical lock files to the bash script (zero diff after fixing arg formatting)
- Correctly detects Python version, flavors, effective mode, and generates all three flavor locks (CPU, CUDA, ROCM)

Minimal, focused PR: rewrite the script, update call sites, delete old script. No tests, no ruff include changes (follow-up).

- **1 deleted file** (`pylocks_generator.sh`, 376 lines)
- **1 added file** (`pylocks_generator.py`, ~200-250 lines), structured to mirror the bash script's section layout so reviewers can compare side by side
- **4 small edits** (pyproject.toml, Makefile, ci/generate_code.sh, create-requirements-lockfile.sh)

Add `typer` to `[dependency-groups] dev` in [pyproject.toml](pyproject.toml) (it's already in `uv.lock` as a transitive dep, but needs to be explicit). Run `uv lock` to sync.

Use typer for CLI with the same interface as the bash script:

```python
"""Generate Python dependency lock files (pylock.toml) using uv pip compile."""

from __future__ import annotations

import os
import re
import subprocess
import sys
from pathlib import Path

import typer

ROOT_DIR = Path(__file__).resolve().parent.parent
UV = ROOT_DIR / "uv"
CVE_CONSTRAINTS_FILE = ROOT_DIR / "dependencies" / "cve-constraints.txt"
PUBLIC_INDEX = "--default-index=https://pypi.org/simple"
MAIN_DIRS = ["jupyter", "runtimes", "rstudio", "codeserver"]
UV_MIN_VERSION = (0, 4, 0)

app = typer.Typer()

@app.command()
def main(
    index_mode: str = typer.Argument("auto", help="..."),
    target_dir: Path | None = typer.Argument(None, help="..."),
) -> None:
    ...
```

Key functions (same logical sections as the bash script):

- `read_conf_value(conf_file: Path, key: str) -> str | None` -- simple line parser replacing awk
- `check_uv()` -- verify `./uv` exists, parse version, compare as `tuple(int, ...)` >= `(0, 4, 0)`
- `find_target_dirs(target_dir: Path | None) -> list[Path]` -- `pathlib.glob("**/pyproject.toml")`
- `detect_flavors(project_dir: Path) -> set[str]` -- check `Dockerfile.{cpu,cuda,rocm}` existence
- `get_index_flags(project_dir: Path, flavor: str) -> list[str] | None` -- read `build-args/<flavor>.conf`
- `run_lock(project_dir, flavor, index_flags, mode, python_version, upgrade) -> bool` -- `subprocess.run(cwd=project_dir)` with list args

Implementation notes:

- Build the `uv pip compile` command as a list, conditionally appending `--upgrade`
- Use `cwd=target_dir` in `subprocess.run()` and `os.path.relpath()` for CVE constraints path
- `FORCE_LOCKFILES_UPGRADE` env var read via `os.environ.get()`

- [Makefile](Makefile) line 443: `bash scripts/pylocks_generator.sh $(INDEX_MODE) $(DIR)` -> `python3 scripts/pylocks_generator.py $(INDEX_MODE) $(DIR)`
- [ci/generate_code.sh](ci/generate_code.sh) line 11: `bash scripts/pylocks_generator.sh` -> `"${REPO_ROOT}/uv" run scripts/pylocks_generator.py` (matches the two lines above it)
- [scripts/lockfile-generators/create-requirements-lockfile.sh](scripts/lockfile-generators/create-requirements-lockfile.sh) -- all references:
  - Line 31: `PYLOCKS_GENERATOR="scripts/pylocks_generator.sh"` -> `.py`
  - Line 43: help text reference
  - Line 57: help text reference
  - Lines 73-74: guard check and error message
  - Line 115: `bash "$PYLOCKS_GENERATOR"` -> `python3 "$PYLOCKS_GENERATOR"`

Clean deletion. It remains in git history.

Run `python3 scripts/pylocks_generator.py --help` and a targeted invocation like `python3 scripts/pylocks_generator.py auto jupyter/minimal/ubi9-python-3.12`.

* Here's what I addressed from the CodeRabbit review:

**Fixed (3 items):**
- **Makefile**: Changed `python3` to `./uv run` so typer is available via the dev environment
- **`mode: str` -> `mode: IndexMode`**: `run_lock` now takes the enum directly, and the caller passes `IndexMode.rh_index` / `IndexMode.public_index` instead of raw strings
- **`find_target_dirs` sort/dedup**: Uses a `set` for dedup and returns `sorted(dirs)` for stable CI output

**Rejected (4 items):**
- **`@app.callback()`**: Tested it -- produces a misleading `COMMAND [ARGS]...` in help. `@app.command()` works correctly with enum arguments; `auto` is parsed as the enum value, not a subcommand name
- **`read_conf_value` hardening (export/quotes)**: The actual `.conf` files don't use these patterns, and the original bash awk parser didn't handle them either. Unnecessary scope creep
- **Skipped dirs tracking**: The bash script didn't track these either. Behavior parity
- **`source "$CONF_FILE"` in create-requirements-lockfile.sh**: Pre-existing, not introduced by this PR

(cherry picked from commit 8da26d3)
fix: downgrade to Python 3.12 and fix CI failures

- Pin Python to 3.12 (uv 0.8.12 only has 3.14.0rc2, which breaks pydantic)
- Update pyright/ruff target versions to 3.12
- Fix except syntax for 3.12 compatibility (parenthesized form)
- Remove generate_kustomization.py call (not present on rhoai-2.25)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

fix: resolve pre-commit failures (ruff, pyright)

Cherry-pick fixes from main (50c960f):
- RUF067: move PROJECT_ROOT from __init__.py to _common.py
- pyright: remove return from finally block in check-software-versions.py
- ruff format: reformat 2 test files

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

fix: update stale references and regenerate uv.lock

- docs/cves/python.md: replace pylocks_generator.sh with .py
- README.md: fix uv version examples (0.10.6 → 0.8.12)
- uv.lock: regenerate after typer dependency addition

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@jiridanek jiridanek force-pushed the cherry-pick-qol-rhoai-2.25 branch from c160f51 to 98dfacd Compare April 17, 2026 05:16

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
ci/check-software-versions.py (1)

254-262: LGTM — cleanup failure no longer short-circuits the return.

Setting ret_code = 1 in the finally branch instead of return 1 preserves any prior failure signal and avoids returning from a finally (which would mask exceptions). Behavior is otherwise equivalent since print_delimiter() is now invoked on both paths.

Optional nit: print_delimiter() is called in both branches and could be hoisted out of the if/else for a small readability win:

♻️ Optional simplification
     finally:
         if stop_and_remove_container(container_id) != 0:
             log.error(f"Failed to stop/remove the container '{container_id}' for the '{image_ref}' tag!")
-            print_delimiter()
             ret_code = 1
-        else:
-            print_delimiter()
+        print_delimiter()
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ci/check-software-versions.py` around lines 254 - 262, The finally block
currently sets ret_code = 1 when stop_and_remove_container(container_id) returns
non-zero and calls print_delimiter() in both branches; simplify by keeping the
failure behavior (set ret_code = 1 on non-zero from stop_and_remove_container)
but hoist print_delimiter() out of the if/else to avoid duplication; locate the
finally handling around stop_and_remove_container(container_id), ret_code,
container_id, image_ref and move the single print_delimiter() call to execute
unconditionally after the if/else while preserving the ret_code assignment and
the log.error(...) call on failure.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@scripts/pylocks_generator.py`:
- Around line 136-152: The check_uv() logic currently assumes uv --version
succeeds and that all dot-separated tokens are pure integers; update it to first
check subprocess.run(...).returncode and treat a non-zero return as a failure
(log the actual stderr/stdout and exit), then robustly parse version_str
obtained from result.stdout by stripping non-numeric suffixes or by extracting
only the leading numeric components (ignore tokens with non-digit characters or
stop parsing at the first non-numeric token) and guard the int() conversion with
try/except to catch ValueError; ensure any parsing failure falls back to a safe
"0.0.0" (or logs the raw version and exits) before comparing the resulting tuple
to UV_MIN_VERSION, referencing UV, version_str, version_tuple and UV_MIN_VERSION
in your changes.

---

Nitpick comments:
In `@ci/check-software-versions.py`:
- Around line 254-262: The finally block currently sets ret_code = 1 when
stop_and_remove_container(container_id) returns non-zero and calls
print_delimiter() in both branches; simplify by keeping the failure behavior
(set ret_code = 1 on non-zero from stop_and_remove_container) but hoist
print_delimiter() out of the if/else to avoid duplication; locate the finally
handling around stop_and_remove_container(container_id), ret_code, container_id,
image_ref and move the single print_delimiter() call to execute unconditionally
after the if/else while preserving the ret_code assignment and the
log.error(...) call on failure.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro Plus

Run ID: 9e550c71-704c-4049-9076-b8b8db1e94aa

📥 Commits

Reviewing files that changed from the base of the PR and between d26add0 and 98dfacd.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (21)
  • .github/workflows/build-notebooks-TEMPLATE.yaml
  • .github/workflows/code-quality.yaml
  • .github/workflows/docs.yaml
  • .pre-commit-config.yaml
  • .python-version
  • Makefile
  • README.md
  • ci/check-software-versions.py
  • ci/generate_code.sh
  • dependencies/cve-constraints.txt
  • docs/cves/python.md
  • jupyter/datascience/ubi9-python-3.12/pyproject.toml
  • jupyter/pytorch/ubi9-python-3.12/pyproject.toml
  • jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
  • jupyter/trustyai/ubi9-python-3.12/pyproject.toml
  • pyproject.toml
  • scripts/pylocks_generator.py
  • scripts/pylocks_generator.sh
  • tests/__init__.py
  • tests/_common.py
  • tests/containers/workbenches/workbench_image_test.py
💤 Files with no reviewable changes (1)
  • scripts/pylocks_generator.sh
✅ Files skipped from review due to trivial changes (10)
  • .python-version
  • tests/containers/workbenches/workbench_image_test.py
  • tests/__init__.py
  • tests/_common.py
  • jupyter/trustyai/ubi9-python-3.12/pyproject.toml
  • jupyter/datascience/ubi9-python-3.12/pyproject.toml
  • pyproject.toml
  • jupyter/rocm/pytorch/ubi9-python-3.12/pyproject.toml
  • README.md
  • jupyter/pytorch/ubi9-python-3.12/pyproject.toml
🚧 Files skipped from review as they are similar to previous changes (6)
  • ci/generate_code.sh
  • .github/workflows/docs.yaml
  • .github/workflows/code-quality.yaml
  • .pre-commit-config.yaml
  • dependencies/cve-constraints.txt
  • Makefile

Comment on lines +136 to +152
try:
result = subprocess.run(
[str(UV), "--version"],
capture_output=True,
text=True,
check=False,
)
version_str = result.stdout.strip().split()[1] if result.stdout.strip() else "0.0.0"
except (IndexError, FileNotFoundError):
version_str = "0.0.0"

version_tuple = tuple(int(x) for x in version_str.split("."))
if version_tuple < UV_MIN_VERSION:
min_ver = ".".join(str(x) for x in UV_MIN_VERSION)
error(f"uv version {version_str} found, but >= {min_ver} is required.")
error("Please upgrade uv: https://github.com/astral-sh/uv")
raise SystemExit(1)


⚠️ Potential issue | 🟡 Minor

check_uv() can crash on non-numeric uv version tokens.

version_str.split(".") followed by int(x) will raise ValueError on any non-numeric component (e.g. uv 0.8.12+abcdef, 0.9.0rc1, or a dev build), and that exception isn’t covered by the except (IndexError, FileNotFoundError) above — the whole script aborts with a traceback instead of a clean error. Also, result.returncode isn’t checked, so a failing uv --version silently falls back to "0.0.0" and the user sees a misleading "version >= 0.4.0 required" message rather than the real problem.

🛡️ Safer parsing
-    try:
-        result = subprocess.run(
-            [str(UV), "--version"],
-            capture_output=True,
-            text=True,
-            check=False,
-        )
-        version_str = result.stdout.strip().split()[1] if result.stdout.strip() else "0.0.0"
-    except (IndexError, FileNotFoundError):
-        version_str = "0.0.0"
-
-    version_tuple = tuple(int(x) for x in version_str.split("."))
+    try:
+        result = subprocess.run(
+            [str(UV), "--version"],
+            capture_output=True, text=True, check=False,
+        )
+    except FileNotFoundError:
+        error(f"Unable to execute uv wrapper at '{UV}'.")
+        raise SystemExit(1)
+    if result.returncode != 0:
+        error(f"'{UV} --version' failed: {result.stderr.strip() or result.stdout.strip()}")
+        raise SystemExit(1)
+    m = re.match(r"uv\s+(\d+)\.(\d+)\.(\d+)", result.stdout.strip())
+    if not m:
+        error(f"Could not parse uv version from: {result.stdout.strip()!r}")
+        raise SystemExit(1)
+    version_tuple = tuple(int(x) for x in m.groups())
+    version_str = ".".join(str(x) for x in version_tuple)
🧰 Tools
🪛 Ruff (0.15.10)

[error] 137-137: subprocess call: check for execution of untrusted input

(S603)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/pylocks_generator.py` around lines 136 - 152, The check_uv() logic
currently assumes uv --version succeeds and that all dot-separated tokens are
pure integers; update it to first check subprocess.run(...).returncode and treat
a non-zero return as a failure (log the actual stderr/stdout and exit), then
robustly parse version_str obtained from result.stdout by stripping non-numeric
suffixes or by extracting only the leading numeric components (ignore tokens
with non-digit characters or stop parsing at the first non-numeric token) and
guard the int() conversion with try/except to catch ValueError; ensure any
parsing failure falls back to a safe "0.0.0" (or logs the raw version and exits)
before comparing the resulting tuple to UV_MIN_VERSION, referencing UV,
version_str, version_tuple and UV_MIN_VERSION in your changes.
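The tolerant-parsing idea in the suggested patch generalizes to a small standalone helper; this is a sketch (the function name is hypothetical) that takes only the leading numeric components and ignores pre-release or build suffixes:

```python
# Sketch: parse the leading MAJOR.MINOR.PATCH of a version string,
# tolerating suffixes such as "rc1" or "+abcdef"; unparseable input
# falls back to (0, 0, 0) so the caller can report the raw string.
import re

def parse_version(version_str: str) -> tuple[int, ...]:
    m = re.match(r"(\d+)\.(\d+)\.(\d+)", version_str)
    if m is None:
        return (0, 0, 0)
    return tuple(int(x) for x in m.groups())
```

Comparing the resulting tuple against a minimum like `(0, 4, 0)` then works for release, rc, and local builds alike.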
