Conversation


@raptorsun raptorsun commented Dec 1, 2025

Description

do not merge

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Tools used to create PR

Identify any AI code assistants used in this PR (for transparency and review context)

  • Assisted-by: (e.g., Claude, CodeRabbit, Ollama, etc., N/A if not used)
  • Generated by: (e.g., tool name and version; N/A if not used)

Related Tickets & Documents

  • Related Issue #
  • Closes #

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

Summary by CodeRabbit

  • New Features

    • Prefetch configuration expanded to a structured multi-entry format (rpm, pip with binary specs, generic).
    • Introduced a new permissive hermeto-mode propagated through dependency prefetching.
  • Chores

    • Added hermetic build tooling and packaging (cmake, build deps, googletest artifact).
    • New autogenerated dependency manifests for binary and build requirements; adjusted requirement sets across architectures.
    • Makefile improvements and updated packaging/runtime RPM versions.
  • Fixes

    • Two-stage post-processing for torch/faiss dependency handling.

✏️ Tip: You can customize this high-level summary in your review settings.


coderabbitai bot commented Dec 1, 2025

Walkthrough

Introduces a hermeto-mode parameter and threads it into Tekton prefetch-dependencies; expands prefetch-input from a single-line JSON to a structured multi-entry JSON array (rpm, pip with binary/build metadata, generic); adds build tooling (cmake, googletest), separates FAISS/torch binary requirements, and adds autogenerated build requirement manifests.
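For orientation, here is a minimal sketch of the structured prefetch-input shape described above; the key names and file list are assumptions pieced together from the summaries below, not the literal pipeline value:

# Hypothetical multi-entry prefetch-input; validate the shape with jq before wiring it in.
prefetch_input='[
  {"type": "rpm"},
  {"type": "pip",
   "requirements_files": ["requirements.x86_64.txt", "requirements.hermetic.txt"],
   "allow_binary": "false"},
  {"type": "generic"}
]'
echo "$prefetch_input" | jq -e 'type == "array" and length == 3' > /dev/null \
  && echo "prefetch-input parses as a 3-entry JSON array"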

Changes

Cohort / File(s) Summary
Tekton pipeline
.tekton/lightspeed-stack-pull-request.yaml, .tekton/lightspeed-stack-push.yaml
Prefetch-input changed from single-line JSON to multi-entry JSON array (rpm, pip, generic); new public hermeto-mode param set to permissive; mode propagated into prefetch-dependencies task wiring.
Container build & artifacts
Containerfile, artifacts.lock.yaml
Added cmake to build packages; conditional unpack/build of googletest when /cachi2/cachi2.env exists; added artifacts.lock.yaml pinning googletest-release-1.12.1.
Makefile & build targets
Makefile
Added FAISS_VERSION; konflux-requirements now generates requirements.binary.txt and requirements-build.txt.
Python dependency tooling
pyproject.toml, requirements.hermetic.txt
Added pybuild-deps==0.5.0 and pip==24.3.1 to dev deps; hermetic requirements updated with build tools (cmake, ninja, patchelf, etc.).
Requirements / dependency files
requirements.binary.txt, requirements-build.txt, requirements.aarch64.txt, requirements.x86_64.txt
Added requirements.binary.txt (faiss-cpu pinned); added autogenerated requirements-build.txt; removed faiss-cpu==1.13.1 blocks from requirements.aarch64.txt and requirements.x86_64.txt.
Scripts
scripts/remove_torch_deps.sh
Switched to two-stage awk pipeline to remove torch then faiss-cpu entries; added final diff verification and adjusted temp-file handling.
RPM manifests / locks
rpms.in.yaml, rpms.lock.yaml
Added cmake to rpms.in; rpms.lock updated with cmake/libuv/vim/emacs-filesystem additions and multiple EVR/binary checksum updates.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~35 minutes

Possibly related PRs

Suggested labels

ok-to-test

Suggested reviewers

  • radofuchs
  • tisnik

Pre-merge checks and finishing touches

❌ Failed checks (1 inconclusive)

  • Title check — ❓ Inconclusive: The PR title uses vague language ('[WIP]' and 'Non binary') that doesn't clearly convey the specific technical changes made. Resolution: replace it with a more descriptive title that clearly summarizes the main changes, such as 'Add hermetic build dependencies and FAISS binary support' or 'Configure hermetic build with separate binary and build requirements'.

✅ Passed checks (2 passed)

  • Description Check — ✅ Passed: Check skipped - CodeRabbit’s high-level summary is enabled.
  • Docstring Coverage — ✅ Passed: No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.
✨ Finishing touches
  • 📝 Generate docstrings
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (3)
README.md (1)

18-94: Consider fixing markdown list indentation for consistency.

Markdownlint reports unordered list indentation inconsistencies in the table of contents. While not critical, consistent formatting improves maintainability.

Expected indentation is 2 spaces per level, but some entries use 4 spaces. You can run markdownlint-cli2 --fix README.md to automatically correct these, or adjust manually:

  • Lines 19, 55-57, 90-94: Reduce indentation to match expected levels (2/4/6 spaces instead of 4/8/12)
scripts/remove_torch_deps.sh (2)

6-6: Consider adding set -u for stricter error handling.

While set -e is present, adding set -u would catch undefined variable references.

Apply this diff:

-set -e
+set -eu

16-19: Consider warning if backup already exists.

The script silently overwrites any existing backup file. Consider adding a check to warn users.

Apply this diff:

 # Create a backup
 BACKUP_FILE="${INPUT_FILE}.backup"
+if [ -f "$BACKUP_FILE" ]; then
+    echo "Warning: Overwriting existing backup: $BACKUP_FILE"
+fi
 cp "$INPUT_FILE" "$BACKUP_FILE"
 echo "Created backup: $BACKUP_FILE"
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f1b0c81 and 1645fe7.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (12)
  • .tekton/lightspeed-stack-pull-request.yaml (2 hunks)
  • .tekton/lightspeed-stack-push.yaml (2 hunks)
  • Containerfile (1 hunks)
  • Makefile (2 hunks)
  • README.md (9 hunks)
  • pyproject.toml (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.torch.txt (1 hunks)
  • rpms.in.yaml (1 hunks)
  • rpms.lock.yaml (1 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
  • ubi.repo (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml

📄 CodeRabbit inference engine (CLAUDE.md)

pyproject.toml: Configure pylint with source-roots = "src"
Exclude src/auth/k8s.py from pyright type checking

Files:

  • pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • Containerfile
  • pyproject.toml
📚 Learning: 2025-08-18T11:45:59.961Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:77-77
Timestamp: 2025-08-18T11:45:59.961Z
Learning: torch==2.7.1 is available on PyPI and is a valid version that can be used in dependency specifications.

Applied to files:

  • requirements.torch.txt
📚 Learning: 2025-08-18T10:56:55.349Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:0-0
Timestamp: 2025-08-18T10:56:55.349Z
Learning: The lightspeed-stack project intentionally uses a "generic image" approach, bundling many dependencies directly in the base runtime image to work for everyone, rather than using lean base images with optional dependency groups.

Applied to files:

  • README.md
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use Python package manager `uv` with `uv run` prefix for all development commands

Applied to files:

  • requirements.hermetic.txt
🪛 markdownlint-cli2 (0.18.1)
README.md

19-19: Unordered list indentation
Expected: 2; Actual: 4

(MD007, ul-indent)


55-55: Unordered list indentation
Expected: 6; Actual: 12

(MD007, ul-indent)


56-56: Unordered list indentation
Expected: 4; Actual: 8

(MD007, ul-indent)


57-57: Unordered list indentation
Expected: 6; Actual: 12

(MD007, ul-indent)


90-90: Unordered list indentation
Expected: 2; Actual: 4

(MD007, ul-indent)


91-91: Unordered list indentation
Expected: 4; Actual: 8

(MD007, ul-indent)


92-92: Unordered list indentation
Expected: 4; Actual: 8

(MD007, ul-indent)


93-93: Unordered list indentation
Expected: 4; Actual: 8

(MD007, ul-indent)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: build-pr
  • GitHub Check: e2e_tests (ci)
  • GitHub Check: e2e_tests (azure)
🔇 Additional comments (19)
pyproject.toml (1)

90-94: LGTM! Grouped PyTorch source configuration.

The addition of group = "llslibdev" correctly restricts the PyTorch CPU index to the llslibdev dependency group, aligning with the hermetic build strategy introduced in this PR.

requirements.torch.txt (1)

1-24: LGTM! Autogenerated PyTorch CPU requirements for hermetic builds.

This file correctly provides pinned torch==2.7.1+cpu with integrity hashes for reproducible, hermetic builds across multiple platforms.

README.md (1)

940-985: LGTM! Comprehensive Konflux hermetic build documentation.

The new Konflux section clearly documents the hermetic build workflow, dependency update process, and provides actionable commands for maintaining lock files. This aligns well with the hermetic build infrastructure introduced in this PR.

ubi.repo (1)

1-62: LGTM! Standard UBI 9 repository configuration.

This YUM repository configuration correctly defines the BaseOS, AppStream, and CodeReady Builder repositories for UBI 9, with appropriate GPG verification enabled and debug/source repos disabled by default.

rpms.in.yaml (1)

1-4: LGTM! Clean RPM dependency specification for multi-arch builds.

The package list (gcc, jq, patch) and architecture configuration (x86_64, aarch64) align well with the hermetic build strategy and multi-platform support introduced in this PR.

rpms.lock.yaml (1)

1-355: LGTM! Autogenerated RPM lockfile for hermetic builds.

This lockfile provides reproducible, verified package resolution for multi-arch builds. It should be regenerated using rpm-lockfile-prototype when dependencies change, as documented in the README.

.tekton/lightspeed-stack-push.yaml (4)

27-30: LGTM! ARM64 platform support added for multi-arch builds.

The addition of linux-c6gd2xlarge/arm64 enables ARM64 builds alongside x86_64, aligning with the multi-architecture lockfiles (rpms.lock.yaml, requirements.aarch64.txt) introduced in this PR.


35-36: LGTM! Hermetic mode enabled for air-gapped builds.

Setting hermetic: 'true' enforces network isolation during builds, ensuring reproducibility with prefetched dependencies.


632-634: Extended timeouts for hermetic multi-arch builds.

The 4-hour timeout accommodates the increased duration of hermetic builds with dependency prefetching, multi-architecture compilation, and comprehensive security scanning. This aligns with the expanded build complexity introduced in this PR.


31-36: Verify all referenced requirements files exist before merging.

The prefetch configuration references four requirements files (requirements.x86_64.txt, requirements.aarch64.txt, requirements.hermetic.txt, requirements.torch.txt) that must exist at pipeline execution time. Confirm these files are present in the repository or are generated by the build system (e.g., via make konflux-requirements or similar build target) before this configuration is used.
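A quick shell check along these lines could confirm the files before merging (file names taken from the review comment above):

# Fail loudly if any referenced requirements file is missing from the repo root.
for f in requirements.x86_64.txt requirements.aarch64.txt \
         requirements.hermetic.txt requirements.torch.txt; do
    [ -f "$f" ] || { echo "missing: $f" >&2; exit 1; }
done
echo "all prefetch requirements files present"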

Containerfile (2)

25-25: LGTM!

The glob pattern requirements.*.txt correctly matches all platform-specific and hermetic requirements files needed for the multi-architecture build.


32-36: Verify requirements files exist for all target architectures.

The hermetic build path uses $(uname -m) to select architecture-specific requirements files. Ensure that requirements.x86_64.txt, requirements.aarch64.txt, and requirements.torch.txt exist in the repository root for all supported build architectures.
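A minimal sketch of the arch-selection pattern being described (not the actual Containerfile contents; the pip flags are standard):

# Pick the per-architecture lock file at build time; assumes only
# x86_64 and aarch64 are published, matching the files named above.
ARCH="$(uname -m)"
REQ_FILE="requirements.${ARCH}.txt"
[ -f "$REQ_FILE" ] || { echo "unsupported arch: $ARCH" >&2; exit 1; }
pip install --no-deps --require-hashes -r "$REQ_FILE" -r requirements.torch.txt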

.tekton/lightspeed-stack-pull-request.yaml (3)

30-33: LGTM!

The addition of linux-c6gd2xlarge/arm64 correctly adds ARM64 platform support for multi-architecture builds. The c6gd2xlarge instance type is appropriate for ARM64 builds.


635-637: Verify 4-hour timeout is appropriate.

The pipeline and task timeouts are set to 4 hours, which is significantly longer than typical CI builds. While multi-arch builds with hermetic prefetching can be time-consuming, verify this duration is necessary and consider if it can be optimized.

Monitor actual build times to determine if this timeout can be reduced in future iterations.


36-37: Verify allow_binary: false compatibility with PyTorch wheels.

The prefetch-input configuration sets "allow_binary": "false" for pip dependencies. If this parameter restricts downloads to source distributions only, it may conflict with PyTorch wheels referenced in requirements.torch.txt. Clarify whether cachi2's allow_binary setting permits wheel downloads, or if the configuration should be adjusted to "allow_binary": "true" or removed to allow mixed binary/source packages for this use case.

scripts/remove_torch_deps.sh (1)

48-55: LGTM!

The atomic file replacement using a temporary file and the final diff output are well-implemented. The || true on the diff command correctly prevents the script from failing when differences are found.
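For reference, the general shape of that pattern (file names assumed, the filter reduced to a stand-in grep):

grep -v '^torch==' input.txt > input.txt.tmp   # stand-in for the real awk filter
mv input.txt.tmp input.txt                     # replace only after the filter succeeds
# diff exits 1 when the files differ, which would abort a `set -e` script;
# `|| true` keeps the verification output without failing the run.
diff input.txt.backup input.txt || true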

Makefile (3)

7-8: No action required. PyTorch version 2.7.1 is available on PyPI and supports CPU wheels.


104-109: Verify duplicate target definitions in Makefile.

The review comment references a potential duplicate konflux-requirements target definition, but the provided code snippet shows only one definition (lines 104-109). Manual verification is needed to confirm whether duplicate target definitions exist elsewhere in the Makefile, as repository access for automated verification was unavailable.


104-109: The --torch-backend cpu, --emit-index-url, and --no-deps flags are all valid uv pip compile options.

--torch-backend is a documented uv feature for specifying PyTorch backend indexes (cpu, cu128, etc.). --emit-index-url is documented for including index URLs in generated requirements. --no-deps is a standard pip compatibility flag supported by uv.

The design here—generating platform-specific requirements with full dependencies, removing torch from those files, then generating a separate torch-only file with --no-deps—is intentional and correct. PyTorch dependencies not needed for the platform builds are isolated in the separate requirements.torch.txt file.

Likely an incorrect or invalid review comment.
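Put together, the generate-then-strip-then-isolate flow described above would look roughly like this (a sketch; the flags mirror the Makefile excerpt quoted later in this thread):

# 1. Full per-platform lock, including torch's transitive deps.
uv pip compile pyproject.toml -o requirements.x86_64.txt \
    --generate-hashes --python-version 3.12
# 2. Strip the torch (and faiss-cpu) blocks from the platform lock.
./scripts/remove_torch_deps.sh requirements.x86_64.txt
# 3. Re-pin torch alone against the CPU index, without dependencies.
echo "torch==2.7.1" | uv pip compile - -o requirements.torch.txt \
    --generate-hashes --python-version 3.12 --torch-backend cpu \
    --emit-index-url --no-deps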

Comment on lines 33 to 42
# If we are in torch section and line starts with 4 spaces, skip it
in_torch_section == 1 && /^    / {
    next  # Skip this line
}

# If we are in torch section and line does NOT start with 4 spaces, exit torch section
in_torch_section == 1 && !/^    / {
    in_torch_section = 0
    # Fall through to print this line
}

⚠️ Potential issue | 🟡 Minor

Potential issue with empty lines in torch section.

The awk script exits the torch section when encountering a line that doesn't start with 4 spaces. This includes empty lines, which might appear between the torch== line and its hash lines in some requirements files.

Consider updating the pattern to handle empty lines:

 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next  # Skip this line
 }
 
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
# If we are in torch section and line starts with 4 spaces, skip it
in_torch_section == 1 && /^    / {
    next  # Skip this line
}
# If we are in torch section and line does NOT start with 4 spaces, exit torch section
in_torch_section == 1 && !/^    / {
    in_torch_section = 0
    # Fall through to print this line
}
# If we are in torch section and line starts with 4 spaces, skip it
in_torch_section == 1 && /^    / {
    next  # Skip this line
}
# If we are in torch section and line is empty, skip it
in_torch_section == 1 && /^$/ {
    next  # Skip empty line
}
# If we are in torch section and line does NOT start with 4 spaces, exit torch section
in_torch_section == 1 && !/^    / && !/^$/ {
    in_torch_section = 0
    # Fall through to print this line
}
🤖 Prompt for AI Agents
In scripts/remove_torch_deps.sh around lines 33 to 42, the awk logic treats any
line that doesn't start with four spaces (including blank lines) as the end of
the torch section; update the condition so blank lines are ignored (i.e., do not
terminate the section). Change the exit check to only leave the torch section
when the line is non-empty and does not start with four spaces (for example: if
in_torch_section == 1 && $0 !~ /^    / && $0 !~ /^$/ then set in_torch_section =
0), and ensure blank lines inside the torch block are either skipped or allowed
to fall through as part of the section so hash lines remain associated with the
torch entry.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
scripts/remove_torch_deps.sh (1)

33-42: Apply the empty-line fix to both torch and faiss-cpu removal sections.

The awk state machine uses !/^    / to exit each section (lines 39 and 67), but this pattern also matches empty lines, causing the section to exit prematurely when blank lines appear between package declarations and their hash continuation lines. Subsequent hash lines are then left unremoved.

This issue was flagged in a prior review but not resolved. The new faiss-cpu removal logic (lines 50–76) replicates the same pattern.

Apply this fix to the torch section (around line 39):

 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next  # Skip this line
 }
 
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }

Apply the same fix to the faiss-cpu section (around line 67).

🧹 Nitpick comments (1)
Makefile (1)

9-10: Align faiss-cpu segregation strategy with documented dependency control.

Based on learnings from prior PRs, faiss-cpu was intentionally pinned as a top-level dependency to maintain control over the dependency graph and avoid version conflicts when bundling ML/LLM packages. This change segregates faiss-cpu into requirements.binary.txt, which may affect dependency resolution and conflict management.

Ensure that this segregation is intentional and documented in ADRs or migration notes, and verify that the dependency graph remains stable after removal from standard requirements files.

Would you like me to generate a verification script to analyze the dependency graph impact, or help document the rationale for this segregation in an ADR?

Also applies to: 111-112

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 1645fe7 and cbe23f0.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (8)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (1 hunks)
  • Makefile (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements.aarch64.txt (0 hunks)
  • requirements.x86_64.txt (0 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
  • requirements.aarch64.txt
  • requirements.x86_64.txt
🚧 Files skipped from review as they are similar to previous changes (4)
  • Containerfile
  • pyproject.toml
  • .tekton/lightspeed-stack-pull-request.yaml
  • .tekton/lightspeed-stack-push.yaml
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • Makefile
🔇 Additional comments (4)
scripts/remove_torch_deps.sh (2)

50-76: Verify consistency with learnings on faiss-cpu pinning.

The script now removes faiss-cpu as a top-level dependency. According to learnings from prior work in this repo, faiss-cpu is intentionally pinned as a top-level dependency to maintain control over the dependency graph and avoid version conflicts in ML/LLM tooling. Ensure this removal aligns with broader changes in the build pipeline that handle faiss-cpu separately (as mentioned in the PR objectives).


79-85: Validate temporary file cleanup and backup retention.

The script chains two awk passes through temporary files (.tmp and .tmp2) and then replaces the original. Cleanup and backup handling appear correct: the script removes .tmp after chaining, preserves the backup, and runs a final diff for verification. No concerns here.

Makefile (2)

112-112: Verify pybuild-deps is available and correctly configured.

The new pybuild-deps compile step assumes the tool is installed and available in the uv environment. Ensure that this tool is:

  • Added to the project dependencies or dev dependencies
  • Compatible with the compiled requirements files from lines 106–111
  • Properly documented so developers understand the build-time dependency workflow

9-10: No action needed. FAISS_VERSION 1.13.0 is confirmed as the latest stable release as of December 2025, making the version choice correct and appropriate.

./scripts/remove_torch_deps.sh requirements.x86_64.txt
./scripts/remove_torch_deps.sh requirements.aarch64.txt
echo "torch==${TORCH_VERSION}" | uv pip compile - -o requirements.torch.txt --generate-hashes --python-version 3.12 --torch-backend cpu --emit-index-url --no-deps --index-url https://download.pytorch.org/whl/cpu
echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps --universal

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

uv pip compile --universal flag documentation behavior

💡 Result:

Short answer — uv pip compile --universal tells uv to perform a universal (platform‑independent) resolution and emit a requirements-style output with environment markers (and possibly multiple entries per package for different platforms). Universal resolution is more constrained than platform-specific resolution, may include markers on lines, and can produce multiple versions/URLs for the same project to cover different platforms. Known caveats: it can be stricter (fail when a package’s python-requires is incompatible across targets) and there are open/closed issues about marker handling and python-version interactions. (glama.ai)

If you want docs/links to read: uv universal resolution docs and related GitHub issues above. (glama.ai)


🌐 Web query:

faiss-cpu wheel distribution universal wheels support

💡 Result:

Short answer: faiss-cpu does NOT provide "universal" (pure‑Python none‑any / py2.py3) wheels. The project publishes platform‑specific binary wheels (manylinux/musllinux macOS Windows; CPython ABI tags or abi3 where applicable) built with cibuildwheel, so you must install the wheel that matches your Python version and platform (or build from source). (pypi.org)


🌐 Web query:

pip compile --universal vs --python-version conflict

💡 Result:

Short answer
--universal tells pip-compile to produce a requirements file that is valid across Python versions by preserving PEP 508 markers instead of resolving/installing for the current interpreter. --python-version (or using a different Python to run pip-compile) affects which Python environment the resolver actually uses when picking compatible package versions. That mismatch can make the resolver either (a) emit markers that will never be installable for the specified python version, or (b) fail because a dependency's python_requires makes no compatible release available for the environment used to resolve. [1][2][3]

Why the conflict happens (concise)

  • pip-compile resolves dependencies using the Python interpreter/environment it runs in (and pip-tools documents that results can differ by OS/Python). If you ask for a “universal” output but you run resolution with a different Python version, pip-compile will either (i) keep markers (so the produced lines are conditional and might not actually install under the target interpreter), or (ii) fail if the resolver finds no compatible package versions for the interpreter it used. [1][2][3]

What to do

  • Run pip-compile with the actual target Python version (e.g., run it inside that interpreter/venv). This avoids the mismatch. [1][3]
  • If you need one file for many Pythons, use a tool that compiles per-Python and merges results (e.g., pip-compile-universal) or generate separate requirements files per Python and merge carefully. [4][2]
  • Use explicit PEP 508 environment markers in your inputs so pip-compile can keep correct conditional requirements rather than trying to force-resolve incompatible releases. [1]

Sources

  • pip-tools documentation (behavior and cross-environment note). [1]
  • pip-tools GitHub issues describing lack of support for pip "--python-version" and cross-compile problems. [2][3]
  • pip-compile-universal (tool that compiles per-Python and merges). [4]

Remove --universal flag; faiss-cpu requires platform-specific wheels tied to Python version.

The --universal flag is incompatible with faiss-cpu, which distributes only platform-specific binary wheels (manylinux/musllinux/macOS/Windows) built for specific Python versions and platforms. Using --universal alongside --python-version 3.12 creates a semantic conflict: universal resolution produces cross-Python environment markers, but faiss-cpu binary wheels are strictly version-locked and platform-specific. Use only --python-version 3.12 without --universal to ensure the resolver generates the correct binary wheel specification.

🤖 Prompt for AI Agents
In Makefile around line 111, the command uses the incompatible --universal flag
with faiss-cpu; remove the --universal flag from the uv pip compile invocation
and keep --python-version 3.12 (and other flags) so the resolver produces
platform-specific binary wheel spec for faiss-cpu.
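A corrected invocation per this suggestion might read (FAISS_VERSION as defined in the Makefile):

echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.txt \
    --generate-hashes --python-version 3.12 --no-deps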

@raptorsun raptorsun force-pushed the non-binary branch 2 times, most recently from 120c2c6 to b65f9d5 on December 8, 2025 at 12:04

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (1)
scripts/remove_torch_deps.sh (1)

50-85: Refactor duplicated removal logic into a reusable function.

The torch removal (lines 22-48) and faiss removal (lines 50-76) follow nearly identical patterns. This violates DRY and makes future maintenance harder. Consider refactoring to a helper function:

remove_section() {
    local file="$1"
    local package_prefix="$2"   # e.g. "torch==" or "faiss-cpu=="

    awk -v pkg="$package_prefix" '
    BEGIN {
        section_active = 0
    }
    # Enter the section on a line that begins with the package prefix
    index($0, pkg) == 1 {
        section_active = 1
        next
    }
    section_active == 1 && /^$/ {
        next
    }
    section_active == 1 && /^    / {
        next
    }
    section_active == 1 && !/^    / && !/^$/ {
        section_active = 0
    }
    section_active == 0 {
        print
    }
    ' "$file"
}

Then call:

remove_section "$INPUT_FILE" "torch==" "in_torch_section" > "${INPUT_FILE}.tmp"
remove_section "${INPUT_FILE}.tmp" "faiss-cpu==" "in_faiss_section" > "${INPUT_FILE}.tmp2"
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 120c2c6 and b65f9d5.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (10)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (1 hunks)
  • Makefile (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (0 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.x86_64.txt (0 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
  • requirements.x86_64.txt
  • requirements.aarch64.txt
✅ Files skipped from review due to trivial changes (1)
  • requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (4)
  • Containerfile
  • .tekton/lightspeed-stack-pull-request.yaml
  • Makefile
  • pyproject.toml
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
🔇 Additional comments (1)
.tekton/lightspeed-stack-push.yaml (1)

34-34: Verify referenced requirement files exist and are maintained.

Line 34 references several requirement files in the prefetch configuration:

  • requirements.hermetic.txt
  • requirements.torch.txt
  • requirements-binary.txt

These files are not provided in the PR context. Ensure that:

  1. All referenced files are actually generated/maintained by your build process
  2. The purpose of each file is clear (e.g., is requirements.torch.txt still needed if torch is in the binary block?)
  3. There are no redundant requirement files being referenced

The prefetch configuration should only reference files that are actively maintained and necessary for the build.

Comment on lines 7 to 256
beniget==0.4.2.post1
    # via pythran
calver==2025.10.20
    # via trove-classifiers
cffi==2.0.0
    # via cryptography
coherent-licensed==0.5.2
    # via zipp
cppy==1.3.1
    # via kiwisolver
cython==3.1.7
    # via
    #   scikit-learn
    #   scipy
cython==3.2.2
    # via
    #   asyncpg
    #   lxml
    #   numpy
    #   pandas
    #   pyarrow
    #   pyyaml
    #   sqlalchemy
dunamai==1.25.0
    # via uv-dynamic-versioning
expandvars==1.1.2
    # via
    #   frozenlist
    #   propcache
    #   yarl
flit-core==3.12.0
    # via
    #   aiosqlite
    #   click
    #   coherent-licensed
    #   idna
    #   jinja2
    #   markdown-it-py
    #   mdurl
    #   packaging
    #   pathspec
    #   pyparsing
    #   pyproject-metadata
    #   threadpoolctl
    #   typing-extensions
    #   wheel
gast==0.6.0
    # via
    #   beniget
    #   pythran
hatch-fancy-pypi-readme==25.1.0
    # via
    #   attrs
    #   httpcore
    #   httpx
    #   jsonschema
    #   llama-stack-client
    #   openai
    #   pydantic
hatch-vcs==0.4.0
    # via urllib3
hatch-vcs==0.5.0
    # via
    #   attrs
    #   filelock
    #   fsspec
    #   jsonschema
    #   jsonschema-specifications
    #   referencing
    #   scikit-build-core
    #   termcolor
hatchling==1.26.3
    # via
    #   hatch-fancy-pypi-readme
    #   llama-stack-client
    #   openai
hatchling==1.28.0
    # via
    #   annotated-types
    #   attrs
    #   dnspython
    #   expandvars
    #   filelock
    #   fsspec
    #   hatch-fancy-pypi-readme
    #   hatch-vcs
    #   httpcore
    #   httpx
    #   jsonschema
    #   jsonschema-specifications
    #   mcp
    #   opentelemetry-api
    #   opentelemetry-exporter-otlp
    #   opentelemetry-exporter-otlp-proto-common
    #   opentelemetry-exporter-otlp-proto-grpc
    #   opentelemetry-exporter-otlp-proto-http
    #   opentelemetry-instrumentation
    #   opentelemetry-proto
    #   opentelemetry-sdk
    #   opentelemetry-semantic-conventions
    #   pydantic
    #   pydantic-settings
    #   pygments
    #   python-multipart
    #   referencing
    #   scikit-build-core
    #   starlette
    #   termcolor
    #   typing-inspection
    #   urllib3
    #   uv-dynamic-versioning
    #   uvicorn
jinja2==3.1.6
    # via uv-dynamic-versioning
markupsafe==3.0.3
    # via jinja2
maturin==1.10.2
    # via
    #   cryptography
    #   fastuuid
    #   hf-xet
    #   jiter
    #   pydantic-core
    #   rpds-py
    #   safetensors
    #   tokenizers
meson-python==0.16.0
    # via matplotlib
meson-python==0.18.0
    # via
    #   contourpy
    #   numpy
    #   pandas
    #   scikit-learn
    #   scipy
meson==1.9.2
    # via
    #   contourpy
    #   meson-python
    #   pandas
numpy==2.2.6
    # via
    #   pandas
    #   pyarrow
    #   scikit-learn
    #   scipy
packaging==25.0
    # via
    #   hatchling
    #   meson-python
    #   scikit-build-core
    #   setuptools-scm
pathspec==0.12.1
    # via
    #   hatchling
    #   scikit-build-core
pdm-backend==2.4.6
    # via
    #   annotated-doc
    #   fastapi
pkgconfig==1.5.5
    # via aiohttp
pluggy==1.6.0
    # via hatchling
ply==3.11
    # via pythran
poetry-core==2.2.1
    # via
    #   aiohappyeyeballs
    #   dunamai
    #   litellm
    #   pkgconfig
    #   rich
    #   rsa
    #   tomlkit
pybind11==3.0.1
    # via
    #   contourpy
    #   matplotlib
    #   pillow
    #   scipy
pycparser==2.23
    # via cffi
pyproject-metadata==0.10.0
    # via meson-python
pythran==0.18.1
    # via scipy
scikit-build-core==0.11.6
    # via pybind11
scipy==1.16.3
    # via scikit-learn
semantic-version==2.10.0
    # via setuptools-rust
setuptools-rust==1.12.0
    # via
    #   maturin
    #   tiktoken
setuptools-scm==7.1.0
    # via python-dateutil
setuptools-scm==9.2.2
    # via
    #   anyio
    #   cppy
    #   hatch-vcs
    #   httpx-sse
    #   importlib-metadata
    #   kiwisolver
    #   matplotlib
    #   pluggy
    #   pyarrow
    #   setuptools-rust
    #   sniffio
    #   tqdm
    #   zipp
tomlkit==0.13.3
    # via uv-dynamic-versioning
trove-classifiers==2025.12.1.14
    # via hatchling
typing-extensions==4.15.0
    # via setuptools-scm
uv-dynamic-versioning==0.11.2
    # via mcp
versioneer==0.29
    # via pandas
wheel==0.45.1
    # via
    #   asyncpg
    #   authlib
    #   cachetools
    #   cppy
    #   fire
    #   httpx-sse
    #   litellm
    #   meson
    #   pandas
    #   python-dateutil
    #   python-jose
    #   sentence-transformers
    #   tiktoken
    #   tqdm
    #   tzdata


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

# First, find and read the requirements-build.txt file
find . -name "requirements-build.txt" -type f

Repository: lightspeed-core/lightspeed-stack

Length of output: 208


🌐 Web query:

pybuild-deps tool multiple versions same package requirements

💡 Result:

Do you mean pybuild-deps (the PEP‑517 build-deps helper) and how it handles duplicate/contradictory version specifiers for the same package in requirements? If so, tell me which behavior you want to see (how it currently resolves duplicates, how to produce a single pinned requirement, or an example input/output) and I’ll fetch the exact pybuild-deps docs/behavior and show a concrete example with citations.


🌐 Web query:

pybuild-deps pip-compile multiple versions output format

💡 Result:

Short answer: pybuild-deps does not itself emit separate pip-compile outputs per Python version — its compile command writes a single requirements-style file (build-requirements.txt) with options to add annotations and pip-style hashes. To get pip-compile outputs for multiple Python versions run pip-compile under each interpreter (pythonX.Y -m piptools compile) or use a helper like pip-compile-multi to produce/versioned outputs for several Python versions.

Sources:

  • pybuild-deps CLI compile options (annotate, annotation-style, --generate-hashes, --output-file). [pybuild-deps docs]. [1]
  • pip-tools / pip-compile recommendation to run under each Python version. [pip-tools docs]. [2]
  • pip-compile-multi can manage compiling requirement files for multiple Python versions/environments. [pip-compile-multi features]. [3]

References:
[1] https://pybuild-deps.readthedocs.io/en/latest/usage.html
[2] https://pip-tools.readthedocs.io/en/stable/
[3] https://pip-compile-multi.readthedocs.io/en/latest/features.html
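Based on the CLI options cited above, a typical invocation might look like this (treat the exact command and flags as an assumption to verify against the pybuild-deps docs):

# Derive pinned build-time requirements from an already-compiled lock file.
pybuild-deps compile requirements.x86_64.txt \
    --generate-hashes --output-file requirements-build.txt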


🌐 Web query:

pybuild-deps merge multiple architecture requirement files conflict resolution

💡 Result:

Do you want (A) concrete commands/steps to merge pybuild-deps-generated per-architecture requirements into a single requirements file, or (B) guidance/strategies for resolving version conflicts that show up when merging?


🌐 Web query:

pip-compile multiple versions same package in output requirements file

💡 Result:

Short answer: pip-compile (pip-tools) will not produce two different pinned versions of the same package in a single output requirements file for one installation environment. You must represent multiple versions by using separate requirement sets (files) or environment-conditional pins/constraints. Suggested approaches:

  • Create separate outputs (one pinned file per variant) and compile each from its own .in (or from .in + different constraints). Example: requirements-3.9.in -> requirements-3.9.txt and requirements-3.11.in -> requirements-3.11.txt. (Common solution for testing a library against multiple dependency versions.) [1][2]

  • Use constraint files (-c) to force one compiled file to constrain another. Compile the “base” file first, then compile the variant with -c base.txt so the variant’s requirements respect the base pins. This avoids duplicate/conflicting pins in one file. [1][4]

  • Use tooling that manages multiple compiled outputs/variants (e.g. pip-compile-multi) to automate cross-file compilation and resolving cross-file conflicts. It can compile multiple .in → .txt and auto-resolve cross-file constraints. [3]

Notes:

  • You cannot have two conflicting versions of the same package in one single requirements.txt for one environment; pip (and pip-tools) installs one version per environment. Use separate environments/files to install different versions. [1][2]

If you want, tell me how you want the variants expressed (different Python versions, test vs prod, two specific versions of a dependency) and I’ll give exact .in/.txt examples and the pip-compile commands.

Sources: pip-tools docs and PyPI, pip-compile-multi docs. [1] [2] [3]


Duplicate package versions in requirements-build.txt will cause pip install failures.

The file contains multiple pinned versions of the same packages, which is invalid for pip requirements files:

  • cython: 3.1.7 and 3.2.2
  • hatch-vcs: 0.4.0 and 0.5.0
  • hatchling: 1.26.3 and 1.28.0
  • meson-python: 0.16.0 and 0.18.0
  • setuptools-scm: 7.1.0 and 9.2.2

A single requirements file cannot specify conflicting versions. The pybuild-deps compilation process needs to resolve these conflicts by either selecting one version per package or using separate requirement files for each architecture variant rather than merging them into a single output.
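A one-liner along these lines can surface duplicate pins before they reach pip (a sketch; assumes the standard name==version pin format used in this file):

# Print any package name pinned more than once in requirements-build.txt.
grep -Eo '^[A-Za-z0-9._-]+==' requirements-build.txt | sort | uniq -d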

🤖 Prompt for AI Agents
In requirements-build.txt (lines 7-248) there are duplicate pinned versions for
the same packages (cython: 3.1.7 & 3.2.2; hatch-vcs: 0.4.0 & 0.5.0; hatchling:
1.26.3 & 1.28.0; meson-python: 0.16.0 & 0.18.0; setuptools-scm: 7.1.0 & 9.2.2)
which will break pip; resolve by keeping a single pinned version per package
(prefer the newest compatible release), remove the duplicates from this file, or
split architecture/variant-specific pins into separate requirement files instead
of merging them; update the dependency export/generator that produced this file
to dedupe by package name (selecting the chosen version) so future outputs do
not reintroduce conflicting entries.

Comment on lines +50 to +76
awk '
BEGIN {
    in_faiss_section = 0
}

# If we find a line starting with faiss-cpu==
/^faiss-cpu==/ {
    in_faiss_section = 1
    next  # Skip this line
}

# If we are in faiss section and line starts with 4 spaces, skip it
in_faiss_section == 1 && /^    / {
    next  # Skip this line
}

# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
in_faiss_section == 1 && !/^    / {
    in_faiss_section = 0
    # Fall through to print this line
}

# Print all lines that are not part of faiss section
in_faiss_section == 0 {
    print
}
' "${INPUT_FILE}.tmp" > "${INPUT_FILE}.tmp2"

⚠️ Potential issue | 🟠 Major

Replicated unfixed empty-line handling in faiss-cpu removal logic.

The newly added faiss-cpu removal (lines 50-76) uses identical awk logic to the torch removal but reproduces the same issue flagged in the previous review (lines 33-42). Line 67 exits the faiss section when encountering any line that doesn't start with 4 spaces, including empty lines:

in_faiss_section == 1 && !/^    / {
    in_faiss_section = 0

If empty lines appear between faiss-cpu==<version> and its hash lines, the script will incorrectly exit the section and leave orphaned hash lines behind. This creates a maintenance issue: you now have two nearly-identical blocks with the same latent bug.

Suggested fix: Apply the correction to both torch and faiss sections by checking for empty lines explicitly:

-# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
+# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
+in_faiss_section == 1 && !/^    / && !/^$/ {
     in_faiss_section = 0
     # Fall through to print this line
 }

And apply the same pattern to the torch section (lines 33-42) to fix the previously flagged issue.

🤖 Prompt for AI Agents
In scripts/remove_torch_deps.sh around lines 50 to 76 (and similarly update the
torch block at lines 33 to 42), the awk logic exits the faiss-cpu section on any
non-indented line which incorrectly treats empty lines as section terminators
and leaves orphaned hash lines; change the exit condition so the section only
ends when a line is neither indented (doesn't start with four spaces) nor empty
(non-blank), i.e., explicitly treat blank lines as part of the section and only
reset in_faiss_section when encountering a non-empty, non-indented line; apply
the same exact fix to the torch removal block to remove the duplicated bug.

@raptorsun raptorsun force-pushed the non-binary branch 4 times, most recently from e762f1c to cde758a on December 10, 2025 at 12:22

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
pyproject.toml (1)

59-71: Correct pyright exclusion path to match coding guidelines

The coding guidelines specify excluding src/auth/k8s.py from pyright type checking, but the current config excludes src/authentication/k8s.py. Update the path in pyproject.toml:

 [tool.pyright]
 exclude = [
-  "src/authentication/k8s.py",
+  "src/auth/k8s.py",
   # Agent API v1 endpoints - deprecated API but still supported
♻️ Duplicate comments (1)
scripts/remove_torch_deps.sh (1)

33-42: Blank-line handling in torch/faiss awk blocks still breaks hash removal

Both torch and faiss sections still exit on any non-indented line, including blanks, so an empty line between the package line and its hash lines will prematurely end the section and leave orphaned hashes. This is the same issue noted in the earlier review and now affects faiss-cpu as well.

Consider updating both blocks to treat empty lines as part of the section and only exit on non-empty, non-indented lines:

@@
-# If we are in torch section and line starts with 4 spaces, skip it
-in_torch_section == 1 && /^    / {
-    next  # Skip this line
-}
-
-# If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
-    in_torch_section = 0
-    # Fall through to print this line
-}
+# If we are in torch section and line starts with 4 spaces, skip it
+in_torch_section == 1 && /^    / {
+    next  # Skip this line
+}
+
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
+# If we are in torch section and line is non-empty and does NOT start with 4 spaces, exit torch section
+in_torch_section == 1 && !/^    / && !/^$/ {
+    in_torch_section = 0
+    # Fall through to print this line
+}
@@
-# If we are in faiss section and line starts with 4 spaces, skip it
-in_faiss_section == 1 && /^    / {
-    next  # Skip this line
-}
-
-# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
-    in_faiss_section = 0
-    # Fall through to print this line
-}
+# If we are in faiss section and line starts with 4 spaces, skip it
+in_faiss_section == 1 && /^    / {
+    next  # Skip this line
+}
+
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
+# If we are in faiss section and line is non-empty and does NOT start with 4 spaces, exit faiss section
+in_faiss_section == 1 && !/^    / && !/^$/ {
+    in_faiss_section = 0
+    # Fall through to print this line
+}

Also applies to: 50-76

🧹 Nitpick comments (1)
scripts/remove_torch_deps.sh (1)

3-4: Script description and success message now under-report behavior (faiss-cpu also removed)

The header comment and final echo only mention torch, but the script now also strips faiss-cpu blocks. This can confuse users reading logs or the script header.

Consider updating both to reflect current behavior, e.g.:

-# Script to remove torch dependencies from requirements.txt
-# Detects torch==<version> and removes it along with all subsequent lines starting with 4 spaces
+# Script to remove torch and faiss-cpu dependencies from requirements.txt
+# Detects torch==<version> and faiss-cpu==<version> and removes them along with all subsequent lines starting with 4 spaces
@@
-echo "Successfully removed torch dependencies from $INPUT_FILE"
-echo "Original file backed up to $BACKUP_FILE"
+echo "Successfully removed torch and faiss-cpu dependencies from $INPUT_FILE"
+echo "Original file backed up to $BACKUP_FILE"

Also applies to: 79-85

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e762f1c and cde758a.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (11)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (1 hunks)
  • Makefile (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (6 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (6 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (4)
  • Makefile
  • requirements-build.txt
  • requirements.binary.txt
  • requirements.aarch64.txt
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml

📄 CodeRabbit inference engine (CLAUDE.md)

pyproject.toml: Configure pylint with source-roots = "src"
Exclude src/auth/k8s.py from pyright type checking

Files:

  • pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • Containerfile
  • pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones

Applied to files:

  • pyproject.toml
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • requirements.x86_64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.

Applied to files:

  • requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: build-pr
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: E2E: server mode / ci
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: library mode / azure
🔇 Additional comments (10)
.tekton/lightspeed-stack-pull-request.yaml (1)

37-37: PR pipeline prefetch JSON mirrors push pipeline; verify shared config stays consistent

The prefetch-input value here matches the push pipeline's Pip prefetch config (same requirements files and binary metadata), which is good for behavior parity between PR and push runs.

As with the push pipeline, please:

  • Confirm requirements.binary.txt exists and is the intended file name.
  • Keep this JSON in sync with the push pipeline when adding/removing requirements or binary packages.
requirements.hermetic.txt (1)

1-6: Explicit hermetic tool pins look good; ensure versions stay aligned with your toolchain

Commented header plus exact versions for uv, pip, patchelf, and cmake match the hermetic intent and are easy to audit.

Keep these versions in sync with:

  • The uv version you install in the builder image
  • Any expectations from your build/prefetch tasks
  • Other version pins in your configuration
.tekton/lightspeed-stack-push.yaml (1)

34-34: Ensure requirements.binary.txt and requirements-build.txt exist and are used consistently

The updated prefetch-input JSON structure is valid, but verify that:

  • requirements.binary.txt exists at the repository root
  • requirements-build.txt exists at the repository root
  • Both filenames match their references in the Containerfile pip install paths
pyproject.toml (1)

103-127: Align pip/pybuild-deps tooling versions across dev and hermetic environments

Adding pybuild-deps==0.5.0 and pinned pip==24.3.1 to the dev group should be verified for consistency with requirements.hermetic.txt. If the hermetic environment uses a different pip version, this divergence could lead to different resolver behavior between CI and local dev.

Consider either:

  • Aligning pip versions between pyproject.toml dev and requirements.hermetic.txt, or
  • Documenting why they intentionally differ.

Also verify that the pyright exclude configuration in [tool.pyright] correctly references the path to src/auth/k8s.py as specified in the coding guidelines.

requirements.x86_64.txt (6)

1931-1941: faiss-cpu hermetic split — reconcile "via faiss-cpu" residue.

If faiss-cpu is intentionally moved to requirements.binary.txt, regenerate this non-binary lock so "# via faiss-cpu" no longer appears under numpy, or confirm it's still a resolver input for this group. Also ensure faiss-cpu remains top-level pinned in the binary manifest to keep the dep graph stable across arches. Based on project design, faiss-cpu is intentionally pinned top-level to maintain control over the dependency graph.
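A quick grep can confirm whether the annotation residue is gone after regenerating the lock (file name from this diff):

grep -n 'faiss-cpu' requirements.x86_64.txt || echo "no faiss-cpu references remain"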


1866-1868: networkx patch to 3.6.1.

Patch-level bump released Dec 08, 2025 with new spectral and greedy bipartition community detection methods, enhancements to node list handling and drawing, and bug fixes for graph subclasses. Low risk; sanity-run any graph code paths in CI.


3298-3345: SQLAlchemy → 2.0.45 (greenlet 3.3.0 compatible).

Bugfix bump with no reported incompatibilities—SQLAlchemy 2.0.45 changelog contains no greenlet-related breaking changes. Consider running smoke tests on ORM/async engine code paths as general QA practice.


3166-3203: scikit-learn 1.8.0 dependencies are compatible.

All specified dependency versions satisfy scikit-learn 1.8.0 minimum requirements: numpy 2.2.6 (≥ 1.22.0), scipy 1.16.3 (≥ 1.8.0), joblib 1.5.2 (≥ 1.2.0), threadpoolctl 3.6.0 (≥ 3.1.0).


3515-3516: TRL 0.26.0 is compatible with the specified dependency versions.

TRL 0.26.0 (released Dec 9, 2025) requires transformers >= 4.46.0 as a baseline. The version stack here — transformers 4.57.3, accelerate 1.12.0, and peft 0.18.0 — aligns with TRL's ecosystem requirements and should work without compatibility issues.


1375-1377: litellm 1.80.9 — compatibility confirmed.

No breaking changes are documented in the v1.80.9 release notes. OpenAI provider support is standard; jiter is a transitive dependency (via openai/pydantic ecosystem) with no direct version constraints for litellm. Ensure Python >= 3.9 for jiter compatibility.

Comment on lines 1683 to 1685
mcp==1.23.3 \
    --hash=sha256:32768af4b46a1b4f7df34e2bfdf5c6011e7b63d7f1b0e321d0fdef4cd6082031 \
    --hash=sha256:b3b0da2cc949950ce1259c7bfc1b081905a51916fcd7c8182125b85e70825201

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

mcp 1.23.3 changelog release notes

💡 Result:

Do you mean the Minecraft Coder Pack (MCP) release 1.23.3, or a different project named “MCP”? If it’s Minecraft Coder Pack, I can search for the specific 1.23.3 changelog — confirm which MCP you mean.


🌐 Web query:

httpx-sse 0.4.3 compatibility requirements

💡 Result:

httpx-sse 0.4.3 requirements/compatibility summary:

  • Python: requires Python >= 3.9 (classifiers list support for 3.9–3.13). [1][2]
  • Package dependencies: httpx-sse does not declare runtime install dependencies (i.e., no pinned httpx requirement in the wheel/metadata); it expects you to install httpx yourself. [3][1]

Sources:

  • PyPI httpx-sse release page (0.4.3). [1]
  • Package METADATA showing Requires-Python >=3.9 and supported Python classifiers. [2]
  • piwheels / release metadata (shows no runtime dependencies). [3]

🌐 Web query:

starlette 0.50.0 httpx-sse compatibility

💡 Result:

Short answer: Yes — httpx-sse can work with Starlette 0.50.0, but be aware of historical caveats around HTTPX's ASGI transport and streaming SSE tests.

Details:

  • httpx-sse's README shows direct usage against a Starlette ASGI app (via httpx.AsyncClient + httpx.ASGITransport). [1]
  • Starlette 0.50.0 is a normal minor release (drops Python 3.9) and does not introduce documented breaking changes to ASGI streaming/SSE. [2][3]
  • However, HTTPX's ASGITransport/streaming has had issues causing tests or in-process SSE streaming to hang or not stream correctly (tracked in the httpx issue/discussion). If you plan to call an SSE endpoint in-process (tests or ASGITransport), check those httpx issues and use recent httpx versions or run the app server separately. [4][5]

References:
[1] httpx-sse README (example with Starlette).
[2] Starlette 0.50.0 release notes.
[3] Starlette GitHub release tag 0.50.0.
[4] encode/httpx issue: ASGITransport does not stream responses.
[5] encode/httpx discussion: Test hangs forever when using AsyncClient.stream() on an SSE endpoint.

If you want, I can check your specific httpx and httpx-sse versions and test code for a reproducible fix.


🌐 Web query:

mcp package pypi python 1.23.3

💡 Result:

mcp on PyPI — version 1.23.3 is available (released Dec 9, 2025). Requires Python >= 3.10. Install with:
pip install "mcp[cli]"

Source: PyPI project page for mcp. [1]


mcp 1.23.3 is compatible with httpx-sse==0.4.3 and starlette==0.50.0, but verify SSE streaming behavior.

All packages are version-compatible. However, be aware that HTTPX's ASGITransport has known issues with SSE streaming (responses may hang or not stream correctly in-process). If testing SSE endpoints locally or using ASGITransport, verify streaming works as expected or run the app server separately.

🤖 Prompt for AI Agents
In requirements.x86_64.txt around lines 1683 to 1685, note that although
mcp==1.23.3 is compatible with httpx-sse==0.4.3 and starlette==0.50.0, HTTPX's
ASGITransport can exhibit broken SSE streaming behavior; when validating SSE
endpoints, run the app server as a separate process (not using ASGITransport
in-process) or switch to a real network transport (e.g., HTTP client against
localhost) to confirm streaming works end-to-end, and if needed add a short
integration test that exercises SSE with the app running separately or change
the test harness to use a non-ASGI transport.
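
A sketch of the out-of-process check described above; the module path app.main:app and the /sse endpoint are hypothetical placeholders for the real app:

# Run the ASGI app as a real server instead of in-process ASGITransport
uvicorn app.main:app --port 8000 &
SERVER_PID=$!
sleep 2
# -N disables buffering so SSE events print as they arrive
curl -N -H "Accept: text/event-stream" --max-time 10 http://localhost:8000/sse
kill $SERVER_PID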

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
scripts/remove_torch_deps.sh (1)

33-42: Don’t exit sections on blank lines; skip them explicitly (torch + faiss).

Empty lines between the pkg==... header and its hash lines will terminate the section early and leave orphaned hashes. Apply the same fix to both blocks.

 /^torch==/ {
     in_torch_section = 1
     next
 }
-# If we are in torch section and line starts with 4 spaces, skip it
-in_torch_section == 1 && /^    / { next }
-
-# If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ { next }
+# If we are in torch section and line does NOT start with 4 spaces, exit torch section
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
 }
 /^faiss-cpu==/ {
     in_faiss_section = 1
     next
 }
-# If we are in faiss section and line starts with 4 spaces, skip it
-in_faiss_section == 1 && /^    / { next }
-
-# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ { next }
+# If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
+in_faiss_section == 1 && !/^    / && !/^$/ {
     in_faiss_section = 0
 }

Also applies to: 50-76

♻️ Duplicate comments (1)
requirements.x86_64.txt (1)

1683-1685: mcp 1.23.3: watch SSE tests with HTTPX ASGITransport.

Version bump looks fine; ensure SSE streaming tests don’t hang when using ASGITransport; prefer real network transport if needed.

httpx ASGITransport SSE streaming known issues and recommended workarounds for tests
🧹 Nitpick comments (3)
scripts/remove_torch_deps.sh (3)

6-6: Hardening: add pipefail and cleanup trap.

Ensure failures in pipelines are caught and tmp files are removed on exit.

-set -e
+set -euo pipefail
+trap 'rm -f "${INPUT_FILE}.tmp" "${INPUT_FILE}.tmp2"' EXIT

79-81: Atomic replace and clearer cleanup.

Use mktemp for temp paths to avoid collisions and move with a temp-to-target pattern. Current rm "${INPUT_FILE}.tmp" is fine but a trap (above) is safer. If you keep manual cleanup, at least guard against absent files.

-mv "${INPUT_FILE}.tmp2" "$INPUT_FILE"
-rm "${INPUT_FILE}.tmp"
+mv -f "${INPUT_FILE}.tmp2" "$INPUT_FILE"
+rm -f "${INPUT_FILE}.tmp"

83-85: Message mentions only torch; include FAISS.

Update the success message so logs reflect both removals.

-echo "Successfully removed torch dependencies from $INPUT_FILE"
+echo "Successfully removed torch and faiss-cpu dependencies from $INPUT_FILE"
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between cde758a and cbf76b4.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (11)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (1 hunks)
  • Makefile (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (6 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (6 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (5)
  • pyproject.toml
  • requirements.hermetic.txt
  • .tekton/lightspeed-stack-pull-request.yaml
  • Containerfile
  • Makefile
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • requirements.aarch64.txt
  • requirements.x86_64.txt
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.

Applied to files:

  • requirements.aarch64.txt
  • requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: build-pr
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: server mode / ci
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: server mode / azure
🔇 Additional comments (12)
requirements.x86_64.txt (4)

1866-1868: networkx 3.6.1 bump: LGTM.

No known tight pins here; safe minor bump.


3298-3345: Verify SQLAlchemy 2.0.45 compatibility before deployment.

After upgrading SQLAlchemy to 2.0.45, scan Python source files for deprecated 1.x patterns: engine.execute(), connection.execute(), autocommit=True, Session(bind=...), and legacy declarative imports. These APIs will cause runtime breakage in 2.x.


3515-3516: TRL 0.26.0 is compatible with transformers==4.57.3 and peft==0.18.0.

No compatibility issues or breaking API changes identified. PEFT 0.18.0 maintains backward compatibility with Transformers v4.x, and TRL has explicit PEFT integration support.


3167-3203: NumPy/SciPy compatibility confirmed for scikit-learn 1.8.0.

Pinned versions numpy==2.2.6 and scipy==1.16.3 satisfy scikit-learn 1.8.0's requirements (numpy >= 1.24.1, scipy >= 1.10.0). Wheels are published for linux-x86_64 manylinux2_28 with Python 3.12 support.

requirements.aarch64.txt (6)

3298-3345: Remove this review—it targets the wrong artifact.

The comment concerns deprecated sqlalchemy 1.x patterns in Python code, but the file under review (requirements.aarch64.txt) contains only package hashes and version pins. Version verification and migration safety should be reviewed in:

  • Actual Python source files using sqlalchemy (check for engine.execute(), legacy session patterns)
  • Database migration files
  • Test suites exercising DB operations

Not in the requirements metadata itself.


1866-1868: networkx 3.6.1 does not introduce breaking API changes — no action needed.

Version 3.6.1 includes only enhancements and bugfixes. Breaking API changes occurred in earlier 3.x releases (3.0–3.6rc); if those versions were previously used in the project, those compatibility issues would already be present. No additional verification is required for this update.


1683-1686: mcp 1.23.3 — verify client/server compatibility and dependency constraints.

Confirm MCP version 1.23.3 is compatible with your client/server implementations. Review pyproject.toml to ensure declared versions of mcp, typing-inspection, and pydantic satisfy runtime imports and capability negotiation.


3167-3203: Scikit-learn 1.8.0 — source build and NumPy/SciPy ABI compatibility.

  • Building scikit-learn from source on aarch64 requires a C/C++ toolchain with OpenMP support plus pre-installed NumPy and SciPy (NumPy 2.2.x + SciPy 1.16.x are compatible, but verify at runtime).
  • Risk: scikit-learn wheels built against older NumPy C ABI will fail with NumPy 2.2.x imports. Use wheels built against NumPy 2.2 or rebuild from source in the target environment.

3515-3516: trl 0.26.0 — requires verification of Trainer/PEFT integration compatibility.

The pinned version was published to PyPI on Dec 9, 2025. Verify that SFTTrainer and PPOTrainer APIs remain compatible with your current usage patterns and that dataset collation signatures work as expected under your training configs.


1375-1377: litellm 1.80.9 bump — run provider smoke tests.

litellm 1.80.9 contains feature and documentation updates with no known breaking changes. However, given litellm's active provider adapter development and recent additions like Responses API support, running quick sanity checks (OpenAI, Anthropic, Azure) is prudent to ensure mappings, retry/backoff, and streaming paths remain functional. Note that the Anthropic adapter has a known limitation with the newer OpenAI Responses API format—verify your use case does not rely on Responses-only semantics.

.tekton/lightspeed-stack-push.yaml (1)

34-34: Verify the prefetch-input schema for the binary block structure against Cachi2 / Konflux task definition.

The JSON structure on line 34 includes a binary block with arch: "x86_64,aarch64", py_version: "312", and packages: "torch,faiss-cpu". Confirm with your Konflux/Cachi2 task bundle (or documentation) whether:

  • arch should be a CSV string or JSON array
  • py_version "312" (Python 3.12) is supported and matches the builder image
  • binary.packages overlaps with the requirements_* files and whether this creates duplicate resolution or conflicts
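
A lightweight validation sketch, assuming yq v4 and jq are available and prefetch-input is a .spec.params entry in the PipelineRun:

# Extract the prefetch-input param and confirm it parses as JSON
yq '.spec.params[] | select(.name == "prefetch-input") | .value' \
  .tekton/lightspeed-stack-push.yaml | jq .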
requirements-build.txt (1)

1-248: Duplicate pinned package versions in autogenerated requirements file require resolution.

This file contains multiple conflicting versions of the same packages:

  • cython: 3.1.7 (line 17) and 3.2.2 (line 19)
  • hatchling: 1.26.3 (line 76) and 1.28.0 (line 81)
  • meson-python: 0.16.0 (line 131) and 0.18.0 (line 133)
  • setuptools-scm: 7.1.0 (line 202) and 9.2.2 (line 204)

While the file is autogenerated by pybuild-deps from architecture-specific sources (requirements.aarch64.txt and requirements.x86_64.txt), a single consolidated requirements file cannot contain multiple versions of the same package. This configuration needs to be reconciled by either:

  1. Keeping architecture-specific requirement files separate (e.g., requirements-build.aarch64.txt and requirements-build.x86_64.txt)
  2. Resolving all version conflicts to a single pin per package across both architectures
  3. Updating the pybuild-deps compilation step to avoid merging conflicting pins

Verify the intended usage pattern and ensure the build system is configured to handle the generated output correctly.
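
To make such conflicts fail fast, a small sketch that lists any package pinned more than once:

# Print package names that appear with more than one pin
grep -E '^[A-Za-z0-9_.-]+==' requirements-build.txt | cut -d'=' -f1 | sort | uniq -d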

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (4)
scripts/remove_torch_deps.sh (1)

22-47: Fix section termination logic for torch/faiss blocks and align log message

The awk logic for both torch and faiss-cpu sections still exits the section on any line that doesn’t start with four spaces:

in_torch_section == 1 && !/^    / { ... }
...
in_faiss_section == 1 && !/^    / { ... }

This incorrectly treats blank lines as section terminators and will leave orphaned hash lines if an empty line appears between torch==.../faiss-cpu==... and their hash lines. The same issue was already flagged in earlier review comments and is now duplicated for the faiss block. Also, the final echo still only mentions “torch dependencies” even though faiss-cpu is also removed.

Consider:

  • For both torch and faiss blocks:
    • Explicitly skip empty lines inside the section, and
    • Only end the section when the line is non‑empty and does not start with four spaces.

For example (torch block; apply the same pattern to faiss):

-# If we are in torch section and line starts with 4 spaces, skip it
-in_torch_section == 1 && /^    / {
-    next  # Skip this line
-}
-
-# If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
-    in_torch_section = 0
-    # Fall through to print this line
-}
+# If we are in torch section and line starts with 4 spaces, skip it
+in_torch_section == 1 && /^    / {
+    next  # Skip this line
+}
+
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
+# If we are in torch section and line is non-empty and does NOT start with 4 spaces, exit torch section
+in_torch_section == 1 && !/^    / && !/^$/ {
+    in_torch_section = 0
+    # Fall through to print this line
+}

And similarly for in_faiss_section.

You may also want to update the success message to reflect that both torch and faiss-cpu dependencies are being removed (or phrase it generically as “selected binary deps”).

Also applies to: 50-81

requirements.x86_64.txt (2)

1375-1377: Pin looks fine; verify litellm 1.80.9 with openai==2.9.0.

Recent openai releases sometimes break litellm. Run a quick import + chat/stream smoke in staging; pin/adjust if issues.
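
A minimal import smoke along these lines (both packages are assumed to expose __version__):

# Import both packages and print versions; extend with a real chat call in staging
python -c "import litellm, openai; print(litellm.__version__, openai.__version__)"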


1683-1685: mcp 1.23.3 + httpx-sse/starlette: validate SSE streaming.

ASGITransport streaming has known quirks; run an end‑to‑end SSE test against a real server, not in‑process.

Makefile (1)

118-118: Remove --universal; faiss-cpu ships platform-specific wheels.

Universal resolution can emit unusable markers for binary wheels. Compile per‑Python without --universal.

Apply:

-	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile  - -o requirements.binary.txt --generate-hashes  --python-version 3.12 --no-deps --universal
+	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile  - -o requirements.binary.txt --generate-hashes --python-version 3.12 --no-deps
🧹 Nitpick comments (1)
Containerfile (1)

16-18: Update comment to reflect cmake addition

The comment still talks only about installing gcc, but the command now installs both gcc and cmake. Consider updating the comment so future readers understand why cmake is needed here.

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between cbf76b4 and 42479cd.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (13)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (2 hunks)
  • Makefile (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (6 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (6 hunks)
  • rpms.in.yaml (1 hunks)
  • rpms.lock.yaml (10 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • requirements-build.txt
🚧 Files skipped from review as they are similar to previous changes (5)
  • .tekton/lightspeed-stack-push.yaml
  • requirements.binary.txt
  • requirements.hermetic.txt
  • pyproject.toml
  • .tekton/lightspeed-stack-pull-request.yaml
🧰 Additional context used
🧠 Learnings (3)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • Makefile
  • requirements.x86_64.txt
  • requirements.aarch64.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • Containerfile
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.

Applied to files:

  • requirements.x86_64.txt
  • requirements.aarch64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: build-pr
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: server mode / ci
🔇 Additional comments (11)
Containerfile (1)

33-37: Verify requirements-binary.txt naming and presence in the image build context

The hermetic branch now does:

pip install ... --no-binary :all: -r requirements.$(uname -m).txt \
  && pip install ... -r requirements-binary.txt -r requirements.torch.txt

Earlier analysis on this PR series found no requirements-binary.txt file in the repo and instead referenced a new binary manifest named requirements.binary.txt in other tooling. It’s not visible in the snippets provided here, so it’s unclear which name is actually being generated/checked in.

Please double‑check that:

  • The filename used here matches the one produced by your Makefile/pybuild tooling and Tekton prefetch configs, and
  • That this file is actually present under requirements.*.txt in the build context so the pip install doesn’t fail at image build time.

If the canonical name is requirements.binary.txt, adjust the Containerfile accordingly; if you intentionally created requirements-binary.txt, ensure all other references (Makefile, Tekton, scripts) use the same spelling.

requirements.aarch64.txt (1)

1375-1377: Version bumps look consistent; ensure they’re regenerated and exercised end‑to‑end

The litellm, mcp, networkx, scikit‑learn, sqlalchemy, and trl entries here look syntactically correct and consistent with an updated uv pip compile run for aarch64. No issues from a requirements-file perspective.

Given these are core runtime libraries, please just confirm that:

  • The same compile flow has been run for the x86_64 requirements and any new requirements.*binary*.txt/build manifests so versions remain aligned across arches, and
  • Key flows using these libs (LLM routing via litellm, mcp client/server, graph handling via networkx, ML bits via scikit‑learn/trl, DB flows via sqlalchemy) have been exercised in tests or manual verification.

Also applies to: 1683-1685, 1866-1868, 3166-3203, 3298-3345, 3515-3516

requirements.x86_64.txt (4)

1866-1868: networkx 3.6.1 bump LGTM.

Minor patch; unlikely to break. Keep.


3167-3203: scikit‑learn 1.8.0: confirm NumPy/SciPy ABI and wheel availability on Python 3.12.

Pinned numpy==2.2.6 and scipy==1.16.3 look aligned, but verify manylinux wheels on CI for x86_64 and aarch64 builds.


3298-3345: SQLAlchemy 2.0.45 bump: run a quick DB init/migration smoke.

Greenlet==3.3.0 is pinned; verify ORM session creation + simple query in CI.


3515-3516: trl 0.26.0 with transformers==4.57.3: smoke test training loop.

Run a tiny Trainer init to catch param API mismatches early.

rpms.in.yaml (1)

1-1: Add of cmake is sensible for native builds.

Approve. Please confirm UBI repo provides a sufficient cmake version for both x86_64 and aarch64 in CI images.

Makefile (2)

10-10: Explicit FAISS_VERSION variable — good.

Keeps the binary manifest controlled.


119-119: New pybuild-deps compile step — sensible; verify determinism.

Ensure requirements-build.txt is committed or reproducibly generated in CI (same uv/pyproject inputs across arches).

rpms.lock.yaml (2)

1-10: Lockfile structure and format look valid.

The YAML structure is well-formed with proper nesting and all package entries contain the required fields (url, repoid, size, checksum, name, evr, sourcerpm). URLs follow the expected cdn-ubi pattern, and checksums are valid SHA256 hashes.

Also applies to: 413-415, 823-824


7-48: Verify lockfile generation and checksum integrity.

This appears to be an auto-generated lockfile with coordinated updates across architectures (aarch64/x86_64) and package sets (appstream/baseos). The changes include new packages (emacs-filesystem, vim-filesystem, libuv, python-unversioned-command) and version bumps (cmake, python3, systemd, kernel-headers).

Since manual checksum verification is impractical for 800+ lines, ensure the lockfile was generated through a trusted, auditable process (e.g., a repeatable build tool or CI pipeline). Verify that:

  • The generation process correctly fetched RPMs from authenticated CDN sources
  • Checksums match the actual downloaded artifacts
  • All transitive dependencies are captured

If this lockfile was generated by a tool, consider documenting the exact command/process in a comment or build script so future maintainers can reproduce and verify changes.

Also applies to: 63-97, 119-125, 343-370, 371-391, 406-412, 417-486, 753-822
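
A spot-check sketch using one entry from this lockfile (the x86_64 libxcrypt-compat package and its recorded checksum):

url="https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/x86_64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.x86_64.rpm"
expected="2bd6c288e1970a001d3a1ae69166c0d926d9c87ce892edcb2110f4e142c12a7a"
actual=$(curl -sL "$url" | sha256sum | cut -d' ' -f1)
[ "$actual" = "$expected" ] && echo "checksum OK" || echo "checksum MISMATCH"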

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (3)
requirements.x86_64.txt (2)

1375-1377: litellm 1.80.9 + openai 2.9.0: run a quick compat smoke test.

Recent openai releases have broken litellm before; keep this pair pinned only after verifying imports and basic calls.

Are there known compatibility issues between litellm==1.80.9 and openai==2.9.0?

1683-1685: mcp 1.23.3: verify SSE streaming in tests (avoid ASGITransport).

httpx-sse with Starlette can hang in-process; prefer a real network transport or server subprocess for SSE tests.

httpx-sse 0.4.3 with Starlette 0.50.0 — any known issues with ASGITransport and streaming SSE?
Makefile (1)

118-120: Do not use --universal for faiss-cpu; compile per platform instead.

faiss-cpu ships platform/ABI-specific wheels; --universal can yield unusable/spec-mismatched entries. Generate separate binary manifests per arch using --python-platform.

Apply this minimal change:

-	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile  - -o requirements.binary.txt --generate-hashes  --python-version 3.12 --no-deps --universal
+	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.x86_64.txt --generate-hashes --python-version 3.12 --python-platform x86_64-unknown-linux-gnu --no-deps
+	echo "faiss-cpu==${FAISS_VERSION}" | uv pip compile - -o requirements.binary.aarch64.txt --generate-hashes --python-version 3.12 --python-platform aarch64-unknown-linux-gnu --no-deps

If downstream requires a single file, merge these two with explicit PEP 508 markers during prefetch, not via --universal.

🧹 Nitpick comments (2)
pyproject.toml (1)

125-126: Avoid pinning pip in the dev group; keep build tooling hermetic.

pybuild-deps in dev looks good. Pinning pip==24.3.1 inside the project env is unnecessary with uv and can conflict with tooling. Recommend removing it (use container/base image pip where needed).

   "bandit>=1.8.6",
-  "pybuild-deps==0.5.0",
-  "pip==24.3.1",
+  "pybuild-deps==0.5.0",
Makefile (1)

9-10: Version pin added; ensure alignment with resolver inputs.

You’re pinning FAISS_VERSION := 1.13.0 while pyproject.toml keeps faiss-cpu>=1.11.0. Consider pinning in pyproject.toml (llslibdev) or documenting the override to prevent resolver drift across manifests.

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 42479cd and d602717.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (13)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (2 hunks)
  • Makefile (2 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (6 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (6 hunks)
  • rpms.in.yaml (1 hunks)
  • rpms.lock.yaml (10 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (8)
  • .tekton/lightspeed-stack-pull-request.yaml
  • requirements.binary.txt
  • scripts/remove_torch_deps.sh
  • requirements-build.txt
  • .tekton/lightspeed-stack-push.yaml
  • requirements.hermetic.txt
  • Containerfile
  • rpms.in.yaml
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml

📄 CodeRabbit inference engine (CLAUDE.md)

pyproject.toml: Configure pylint with source-roots = "src"
Exclude src/auth/k8s.py from pyright type checking

Files:

  • pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • Makefile
  • requirements.aarch64.txt
  • requirements.x86_64.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones

Applied to files:

  • pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • pyproject.toml
📚 Learning: 2025-08-18T10:58:14.951Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:47-47
Timestamp: 2025-08-18T10:58:14.951Z
Learning: psycopg2-binary is required by some llama-stack providers in the lightspeed-stack project, so it cannot be replaced with psycopg v3 or moved to optional dependencies without breaking llama-stack functionality.

Applied to files:

  • requirements.aarch64.txt
  • requirements.x86_64.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: build-pr
  • GitHub Check: E2E: server mode / ci
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
🔇 Additional comments (5)
requirements.x86_64.txt (3)

1866-1868: networkx patch bump (3.6.1): low risk.

Looks fine; no action beyond routine CI.


3298-3345: SQLAlchemy 2.0.45: validate greenlet/alembic paths.

Patch bump is OK; still, run a minimal engine connect and alembic env import to catch regressions.

#!/bin/bash
python - <<'PY'
import sqlalchemy, greenlet
from sqlalchemy import create_engine, text
print("sqlalchemy", sqlalchemy.__version__, "greenlet", greenlet.__version__)
# dry connect string; replace with sqlite for CI
e = create_engine("sqlite:///:memory:")
with e.connect() as c:
    c.execute(text("select 1")).scalar()
print("engine OK")
PY
Any known issues between SQLAlchemy 2.0.45 and greenlet 3.3.0?

3515-3516: TRL 0.26.0 is compatible with transformers 4.57.3 and accelerate 1.12.0.

Transformers 4.57.3 satisfies the minimum requirement of transformers >= 4.46.0, and accelerate 1.12.0 exceeds the minimum requirement of >= 0.26.0. Both versions support SFT/ORPO trainer imports and training loop execution.

Makefile (1)

119-120: LGTM: build-time dependency manifest generation.

pybuild-deps compile to emit requirements-build.txt from both arch files is a good addition for hermetic builds.

requirements.aarch64.txt (1)

1375-1377: Version bumps look consistent for py312/aarch64.

Upgrades for litellm, mcp, networkx, scikit-learn, SQLAlchemy, and trl align with pinned NumPy/SciPy. No red flags spotted.

Please verify on aarch64 with a quick smoke build:

  • uv pip install -r requirements.aarch64.txt (in a py312 aarch64 container)
  • import checks: python -c "import numpy, scipy, sklearn, sqlalchemy; print('ok')"

Also applies to: 1683-1685, 1867-1868, 3167-3203, 3298-3345, 3515-3516

Comment on lines 3166 to 3205
scikit-learn==1.8.0 \
--hash=sha256:00d6f1d66fbcf4eba6e356e1420d33cc06c70a45bb1363cd6f6a8e4ebbbdece2 \
--hash=sha256:0d6ae97234d5d7079dc0040990a6f7aeb97cb7fa7e8945f1999a429b23569e0a \
--hash=sha256:146b4d36f800c013d267b29168813f7a03a43ecd2895d04861f1240b564421da \
--hash=sha256:15fc3b5d19cc2be65404786857f2e13c70c83dd4782676dd6814e3b89dc8f5b9 \
--hash=sha256:2838551e011a64e3053ad7618dda9310175f7515f1742fa2d756f7c874c05961 \
--hash=sha256:29ffc74089f3d5e87dfca4c2c8450f88bdc61b0fc6ed5d267f3988f19a1309f6 \
--hash=sha256:2de443b9373b3b615aec1bb57f9baa6bb3a9bd093f1269ba95c17d870422b271 \
--hash=sha256:35c007dedb2ffe38fe3ee7d201ebac4a2deccd2408e8621d53067733e3c74809 \
--hash=sha256:3bad7565bc9cf37ce19a7c0d107742b320c1285df7aab1a6e2d28780df167242 \
--hash=sha256:4496bb2cf7a43ce1a2d7524a79e40bc5da45cf598dbf9545b7e8316ccba47bb4 \
--hash=sha256:4511be56637e46c25721e83d1a9cea9614e7badc7040c4d573d75fbe257d6fd7 \
--hash=sha256:5025ce924beccb28298246e589c691fe1b8c1c96507e6d27d12c5fadd85bfd76 \
--hash=sha256:56079a99c20d230e873ea40753102102734c5953366972a71d5cb39a32bc40c6 \
--hash=sha256:5e30adb87f0cc81c7690a84f7932dd66be5bac57cfe16b91cb9151683a4a2d3b \
--hash=sha256:5fb63362b5a7ddab88e52b6dbb47dac3fd7dafeee740dc6c8d8a446ddedade8e \
--hash=sha256:6b595b07a03069a2b1740dc08c2299993850ea81cce4fe19b2421e0c970de6b7 \
--hash=sha256:72358cce49465d140cc4e7792015bb1f0296a9742d5622c67e31399b75468b9e \
--hash=sha256:74b66d8689d52ed04c271e1329f0c61635bcaf5b926db9b12d58914cdc01fe57 \
--hash=sha256:7cc267b6108f0a1499a734167282c00c4ebf61328566b55ef262d48e9849c735 \
--hash=sha256:80832434a6cc114f5219211eec13dcbc16c2bac0e31ef64c6d346cde3cf054cb \
--hash=sha256:8c497fff237d7b4e07e9ef1a640887fa4fb765647f86fbe00f969ff6280ce2bb \
--hash=sha256:8fdf95767f989b0cfedb85f7ed8ca215d4be728031f56ff5a519ee1e3276dc2e \
--hash=sha256:9bccbb3b40e3de10351f8f5068e105d0f4083b1a65fa07b6634fbc401a6287fd \
--hash=sha256:a0bcfe4d0d14aec44921545fd2af2338c7471de9cb701f1da4c9d85906ab847a \
--hash=sha256:a69525355a641bf8ef136a7fa447672fb54fe8d60cab5538d9eb7c6438543fb9 \
--hash=sha256:ada8121bcb4dac28d930febc791a69f7cb1673c8495e5eee274190b73a4559c1 \
--hash=sha256:bf97c10a3f5a7543f9b88cbf488d33d175e9146115a451ae34568597ba33dcde \
--hash=sha256:c22a2da7a198c28dd1a6e1136f19c830beab7fdca5b3e5c8bba8394f8a5c45b3 \
--hash=sha256:c2656924ec73e5939c76ac4c8b026fc203b83d8900362eb2599d8aee80e4880f \
--hash=sha256:c57b1b610bd1f40ba43970e11ce62821c2e6569e4d74023db19c6b26f246cb3b \
--hash=sha256:eddde82a035681427cbedded4e6eff5e57fa59216c2e3e90b10b19ab1d0a65c3 \
--hash=sha256:edec98c5e7c128328124a029bceb09eda2d526997780fef8d65e9a69eead963e \
--hash=sha256:ee787491dbfe082d9c3013f01f5991658b0f38aa8177e4cd4bf434c58f551702 \
--hash=sha256:f28dd15c6bb0b66ba09728cf09fd8736c304be29409bd8445a080c1280619e8c \
--hash=sha256:f984ca4b14914e6b4094c5d52a32ea16b49832c03bd17a110f004db3c223e8e1 \
--hash=sha256:fb65db5d7531bccf3a4f6bec3462223bea71384e2cda41da0f10b7c292b9e7c4 \
--hash=sha256:fe1c011a640a9f0791146011dfd3c7d9669785f9fed2b2a5f9e207536cf5c2fd

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

scikit-learn 1.8.0 numpy scipy version requirements compatibility

💡 Result:

Short answer:

  • scikit-learn 1.8.x requires NumPy >= 1.22.0 and SciPy >= 1.8.0. [1]
  • SciPy 1.8.x (the minimum SciPy required) supports NumPy >= 1.17.3 and < 1.24.0, so in practice a compatible NumPy range for scikit-learn 1.8 + SciPy 1.8 is: 1.22.0 <= numpy < 1.24.0. [2][1]

Sources:
[1] scikit-learn installation / minimum dependencies.
[2] SciPy 1.8 toolchain / NumPy compatibility table.


🌐 Web query:

scikit-learn 1.8.0 manylinux x86_64 wheel numpy 2.2 scipy 1.16 compatibility

💡 Result:

Short answer: Not guaranteed — SciPy 1.16 supports NumPy 2.2, but scikit-learn 1.8.0 wheels were built expecting NumPy <2 and may be binary-incompatible with NumPy 2.x.

Evidence:

  • SciPy 1.16 requires/declares support for NumPy 1.25.2+ (i.e. the 2.x series is within SciPy 1.16's supported range). [1][2]
  • scikit-learn 1.8 lists a minimum NumPy of 1.22 but historically did not pin an upper bound and the project tracked NumPy‑2.0 compatibility separately (there are open/closed issues and discussion about binary incompatibility with NumPy 2.x). That means manylinux wheels for 1.8.0 can fail at import if the installed NumPy 2.x ABI differs. [3][4]

Recommendation:

  • If you need a reliable binary install, use a scikit-learn wheel built against NumPy 2.x (i.e., a scikit‑learn release that explicitly supports NumPy 2), or install NumPy <2 (e.g. numpy<2) to match scikit‑learn 1.8.0 wheels. Run pip install in a clean venv and test import sklearn to verify. (See scikit‑learn install notes and the NumPy‑2 compatibility discussion.) [3][4][1]

Sources:

  1. SciPy 1.16 release notes / toolchain (NumPy support). [SciPy docs]. [2]
  2. SciPy 1.16.0 release notes (requirements). [SciPy docs]. [1]
  3. scikit-learn install / minimum-dependency info (scikit-learn 1.8). [scikit-learn docs]. [3]
  4. scikit-learn GitHub discussion re: NumPy 2 compatibility / binary incompatibility issues. [scikit-learn issue]. [4]

scikit-learn 1.8.0 is incompatible with numpy 2.2.6; its wheels expect NumPy < 2.

scikit-learn 1.8.0 wheels were built targeting NumPy < 2 and have ABI incompatibility with NumPy 2.x. While SciPy 1.16.3 declares support for NumPy 2.2.6, using scikit-learn 1.8.0 with NumPy 2.2.6 risks import-time failures or silent data corruption. Either use NumPy <2 (e.g., 1.26.x) with scikit-learn 1.8.0, or upgrade to scikit-learn 1.9+ which explicitly supports NumPy 2.x.

🤖 Prompt for AI Agents
In requirements.x86_64.txt around lines 3166 to 3203, scikit-learn==1.8.0 is
pinned but is ABI-incompatible with NumPy 2.2.6; either pin NumPy to a <2
release (e.g., numpy==1.26.x) consistent with scikit-learn 1.8.0 or upgrade
scikit-learn to 1.9+ which supports NumPy 2.x; update the requirements line
accordingly and regenerate/replace the corresponding hash entries for any
changed package versions to keep the hashlock valid.

Comment on lines +501 to +507
- url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/x86_64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.x86_64.rpm
repoid: ubi-9-for-x86_64-appstream-rpms
size: 93189
checksum: sha256:2bd6c288e1970a001d3a1ae69166c0d926d9c87ce892edcb2110f4e142c12a7a
name: libxcrypt-compat
evr: 4.4.18-3.el9
sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Verify libxcrypt-compat presence across architectures

echo "=== All libxcrypt-compat occurrences ==="
rg -n "libxcrypt-compat" rpms.lock.yaml

echo ""
echo "=== All libxcrypt-devel occurrences ==="
rg -n "libxcrypt-devel" rpms.lock.yaml

echo ""
echo "=== File structure to understand architecture sections ==="
rg -n "^\s*- architecture:" rpms.lock.yaml | head -10

Repository: lightspeed-core/lightspeed-stack

Length of output: 736


Ensure libxcrypt-compat is added to aarch64 appstream for architectural parity.

The libxcrypt-compat-4.4.18-3.el9 package is added only to the x86_64 appstream section (lines 501–507), but the aarch64 appstream section lacks a corresponding entry. This creates an architectural inconsistency: libxcrypt-devel exists in both aarch64 and x86_64 appstream sections, yet the compat variant is x86_64-only. For consistency and to avoid potential runtime compatibility issues across architectures, add libxcrypt-compat-4.4.18-3.el9.aarch64.rpm to the aarch64 appstream section.

🤖 Prompt for AI Agents
In rpms.lock.yaml around lines 501–507, libxcrypt-compat is only listed for
x86_64; add a matching aarch64 appstream entry to ensure parity by duplicating
the x86_64 block but changing the URL to the aarch64 package filename
(libxcrypt-compat-4.4.18-3.el9.aarch64.rpm), set the repoid to the aarch64
appstream repoid, and keep the same size, checksum, name, evr and sourcerpm
values so the aarch64 appstream section contains the corresponding
libxcrypt-compat record.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (1)
rpms.lock.yaml (1)

501-507: ⚠️ Duplicate issue: Ensure libxcrypt-compat is added to aarch64 appstream for architectural parity.

This architectural inconsistency was flagged in a previous review and remains unresolved. The libxcrypt-compat-4.4.18-3.el9 package appears only in the x86_64 appstream section (lines 501–507), but the corresponding aarch64 appstream section lacks an entry. Since libxcrypt-devel exists in both architecture sections, libxcrypt-compat should also be present in both for consistency and to avoid potential runtime compatibility issues.

Add the following entry to the aarch64 appstream packages section (after line 104, maintaining alphabetical order):

  - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
    repoid: ubi-9-for-aarch64-appstream-rpms
    size: 85840
    checksum: sha256:8a7e6f2c9d8c5e1b3a2f4e5d6c7b8a9f0e1d2c3b4a5f6e7d8c9b0a1f2e3d4c
    name: libxcrypt-compat
    evr: 4.4.18-3.el9
    sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm

Note: Update the checksum and size values to match the actual aarch64 package; the values above are placeholders pending the correct artifact metadata.
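
To fill in the real values, a sketch that downloads the aarch64 package and prints its size and digest; the URL assumes the same CDN layout as the x86_64 entry:

url="https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm"
curl -sL "$url" -o /tmp/libxcrypt-compat.rpm
stat -c%s /tmp/libxcrypt-compat.rpm   # size field for the lockfile entry
sha256sum /tmp/libxcrypt-compat.rpm   # checksum field for the lockfile entry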

🧹 Nitpick comments (1)
Containerfile (1)

28-35: Consider cleanup and version flexibility.

The googletest build block works correctly for hermetic builds. Two minor observations:

  1. The /tmp/googletest-release-1.12.1 directory isn't cleaned up after installation, leaving build artifacts in the image.
  2. The version 1.12.1 is hardcoded; if artifacts.lock.yaml is updated, this will need manual synchronization.
 RUN if [ -f /cachi2/cachi2.env ]; then \
     tar -xzf /cachi2/output/deps/generic/googletest-release-1.12.1.tar.gz -C /tmp && \
     cd /tmp/googletest-release-1.12.1 && \
     cmake . && \
     make && \
-    make install; \
+    make install && \
+    rm -rf /tmp/googletest-release-1.12.1; \
     fi
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between d602717 and b31fd6f.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (2 hunks)
  • Makefile (2 hunks)
  • artifacts.lock.yaml (1 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (0 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (0 hunks)
  • rpms.in.yaml (1 hunks)
  • rpms.lock.yaml (10 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
  • requirements.aarch64.txt
  • requirements.x86_64.txt
✅ Files skipped from review due to trivial changes (1)
  • artifacts.lock.yaml
🚧 Files skipped from review as they are similar to previous changes (8)
  • .tekton/lightspeed-stack-pull-request.yaml
  • Makefile
  • requirements.hermetic.txt
  • .tekton/lightspeed-stack-push.yaml
  • pyproject.toml
  • requirements-build.txt
  • scripts/remove_torch_deps.sh
  • rpms.in.yaml
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • requirements.binary.txt
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: build-pr
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: server mode / ci
🔇 Additional comments (3)
Containerfile (2)

18-18: LGTM! cmake needed for googletest build.

The addition of cmake to the system dependencies is required for building googletest from source in the hermetic build flow below.


43-46: Two-stage install approach is sound.

The split installation strategy correctly separates:

  1. Source builds (--no-binary :all:) for architecture-specific requirements
  2. Binary wheel installation for pre-built packages (faiss-cpu, torch)

This aligns with the hermetic build goals. Once the filename issue above is resolved, this should work correctly.

requirements.binary.txt (1)

1-19: LGTM! Hash-pinned binary dependency file.

The autogenerated requirements file correctly pins faiss-cpu==1.13.0 with SHA256 hashes for reproducible hermetic builds. Version 1.13.0 is one patch behind the current 1.13.1, but no known vulnerabilities exist for either version. This intentional pinning of transitive dependencies like faiss-cpu as top-level dependencies aligns with the project's practice for ML/LLM tooling control.

Comment on lines 42 to 49
RUN if [ -f /cachi2/cachi2.env ]; then \
. /cachi2/cachi2.env && uv venv --seed --no-index --find-links ${PIP_FIND_LINKS} && . .venv/bin/activate && pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements.$(uname -m).txt -r requirements.torch.txt; \
. /cachi2/cachi2.env && \
uv venv --seed --no-index --find-links ${PIP_FIND_LINKS} && \
. .venv/bin/activate && \
pip install --no-index --find-links ${PIP_FIND_LINKS} --no-binary :all: -r requirements.$(uname -m).txt && pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements-binary.txt -r requirements.torch.txt; \
else \
uv sync --locked --no-dev --group llslibdev; \
fi

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check all references to requirements binary files
echo "=== References to requirements-binary (hyphen) ==="
rg "requirements-binary" -n

echo -e "\n=== References to requirements\.binary (dot) ==="
rg "requirements\.binary" -n

echo -e "\n=== All requirements files in repo ==="
fd "requirements" -t f

echo -e "\n=== Containerfile content (lines 20-55) ==="
if [ -f "Containerfile" ]; then
  sed -n '20,55p' Containerfile | cat -n
fi

Repository: lightspeed-core/lightspeed-stack

Length of output: 2899


Critical: Filename mismatch will cause build failure.

Line 46 references requirements-binary.txt (with hyphen), but the actual file in the repository is requirements.binary.txt (with dot). The pip install command will fail because the file won't be found.

The COPY directive on line 26 correctly uses the requirements.*.txt glob pattern, which matches requirements.binary.txt, but the subsequent pip install command references the non-existent requirements-binary.txt.

Fix by changing line 46:

-    pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements-binary.txt -r requirements.torch.txt; \
+    pip install --no-index --find-links ${PIP_FIND_LINKS} -r requirements.binary.txt -r requirements.torch.txt; \
🤖 Prompt for AI Agents
In Containerfile around lines 42 to 49, the pip install command references a
non-existent file name `requirements-binary.txt` (hyphen) while the repository
contains `requirements.binary.txt` (dot); update the referenced filename in the
RUN command to `requirements.binary.txt` so pip can find and install that
requirements file (ensure both occurrences in the line are corrected if
duplicated).

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (2)
rpms.lock.yaml (1)

98-104: ** Missing libxcrypt-compat in aarch64 appstream for architectural parity.**

This is the same issue flagged in a previous review. libxcrypt-compat-4.4.18-3.el9 exists in x86_64 appstream (lines 501–507) but is still absent from the aarch64 appstream section. For consistency with libxcrypt-devel (which is present in both architectures), add the aarch64 variant of libxcrypt-compat.

Add the following entry to the aarch64 appstream packages (after line 104, following libxcrypt-devel):

   - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
     repoid: ubi-9-for-aarch64-appstream-rpms
     size: 83521
     checksum: sha256:1f6adc387e9b4267a8cbb71e4283b6eb9b0e3f4c8d7f8e9a0b1c2d3e4f5a6b7c
     name: libxcrypt-compat
     evr: 4.4.18-3.el9
     sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm

Note: The checksum and size are placeholders; verify against the actual aarch64 package.

Also applies to: 501-507

scripts/remove_torch_deps.sh (1)

50-76: Replicated unresolved empty-line handling bug in faiss-cpu removal.

This section reproduces the same logic error flagged in the previous review for the torch removal block (lines 33-42). Line 67 exits the faiss section on any line that doesn't start with 4 spaces, treating empty lines as section terminators. If empty lines appear between faiss-cpu==<version> and its hash lines, the script will exit prematurely, leaving orphaned hashes in the file.

Additionally, this creates maintenance burden: two nearly-identical blocks now share the same unfixed bug.

Apply this fix to both the faiss section (lines 50-76) and the torch section (lines 33-42) to explicitly handle empty lines:

  # If we are in faiss section and line starts with 4 spaces, skip it
  in_faiss_section == 1 && /^    / {
      next  # Skip this line
  }
  
+# If we are in faiss section and line is empty, skip it
+in_faiss_section == 1 && /^$/ {
+    next  # Skip empty line
+}
+
  # If we are in faiss section and line does NOT start with 4 spaces, exit faiss section
-in_faiss_section == 1 && !/^    / {
+in_faiss_section == 1 && !/^    / && !/^$/ {
      in_faiss_section = 0
      # Fall through to print this line
  }

Apply the same pattern to lines 33-42 (torch section) to resolve the previously flagged issue.

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between b31fd6f and c9b0a5f.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (2 hunks)
  • Makefile (2 hunks)
  • artifacts.lock.yaml (1 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (0 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (0 hunks)
  • rpms.in.yaml (1 hunks)
  • rpms.lock.yaml (10 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
  • requirements.x86_64.txt
  • requirements.aarch64.txt
🚧 Files skipped from review as they are similar to previous changes (5)
  • rpms.in.yaml
  • Makefile
  • artifacts.lock.yaml
  • requirements.hermetic.txt
  • Containerfile
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml

📄 CodeRabbit inference engine (CLAUDE.md)

pyproject.toml: Configure pylint with source-roots = "src"
Exclude src/auth/k8s.py from pyright type checking

Files:

  • pyproject.toml
🧠 Learnings (3)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Always check `pyproject.toml` for existing dependencies and versions before adding new ones

Applied to files:

  • pyproject.toml
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • pyproject.toml
📚 Learning: 2025-08-18T10:57:39.266Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:59-59
Timestamp: 2025-08-18T10:57:39.266Z
Learning: In the lightspeed-stack project, transitive dependencies like faiss-cpu are intentionally pinned as top-level dependencies to maintain better control over the dependency graph and avoid version conflicts when bundling ML/LLM tooling packages.

Applied to files:

  • requirements.binary.txt
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
  • GitHub Check: build-pr
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: server mode / ci
🔇 Additional comments (12)
rpms.lock.yaml (7)

7-34: cmake family packages added consistently across architectures.

The cmake, cmake-data, cmake-filesystem, and cmake-rpm-macros packages have been added to both aarch64 and x86_64 appstream sections. The noarch packages (cmake-data, cmake-rpm-macros) have matching checksums across architectures, and EVR values are consistent.

Also applies to: 417-444


63-69: kernel-headers updated consistently across architectures to 5.14.0-611.13.1.el9_7.

The kernel-headers package has been updated to the same EVR on both aarch64 and x86_64. Architecture-specific differences in size and checksums are expected.

Also applies to: 480-486


371-391: systemd family updated consistently to 252-55.el9_7.7 across architectures.

All three systemd packages (systemd, systemd-pam, systemd-rpm-macros) have been updated to the same EVR on both aarch64 and x86_64. The noarch systemd-rpm-macros package has matching checksums across architectures.

Also applies to: 781-801


343-370: python3 stack packages added consistently across architectures.

The python3, python3-libs, python3-pip-wheel, and python3-setuptools-wheel packages have been added to both aarch64 and x86_64 baseos sections with consistent EVR values. The noarch packages (pip-wheel, setuptools-wheel) have matching checksums across architectures.

Also applies to: 753-780


91-97: libuv package added consistently across architectures (1:1.42.0-2.el9_4).

The libuv package has been added to both aarch64 and x86_64 appstream sections with consistent EVR (including epoch). Architecture-specific differences in size and checksum are expected.

Also applies to: 494-500


119-125: python-unversioned-command noarch package added consistently (identical checksums).

The python-unversioned-command package has been added to both aarch64 and x86_64 appstream sections. As a noarch package, the checksums and sizes correctly match across architectures.

Also applies to: 529-535


42-48: vim-filesystem and emacs-filesystem noarch packages added consistently.

Both vim-filesystem (baseos) and emacs-filesystem (appstream) have been added to aarch64 and x86_64 sections with consistent EVR values and matching checksums across architectures, as expected for noarch packages.

Also applies to: 452-458, 406-412, 816-822

scripts/remove_torch_deps.sh (1)

79-85: Two-stage file replacement and verification logic is sound.

The cleanup and verification approach (lines 80-81, 85) properly manages the temporary files and provides useful visibility into applied changes. Once the faiss/torch section logic is corrected, this segment will be in good shape.

requirements.binary.txt (1)

1-19: Autogenerated file structure is valid; verify faiss-cpu version transition.

The file is correctly formatted as a uv pip-compile output with proper hash metadata for multi-architecture binary wheels. However, the AI summary indicates that requirements.aarch64.txt and requirements.x86_64.txt previously contained faiss-cpu==1.13.1, and this file pins 1.13.0. Confirm whether this version downgrade is intentional or if 1.13.0 should be updated to 1.13.1 for consistency.

Also verify that version 1.13.0 satisfies the constraint in pyproject.toml line 148: faiss-cpu>=1.11.0.
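
A quick cross-file sketch for that check:

# Compare the binary pin against the pyproject constraint
grep -n "faiss-cpu" requirements.binary.txt pyproject.toml
# 1.13.0 satisfies faiss-cpu>=1.11.0; bump FAISS_VERSION in the Makefile if 1.13.1 is intended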

.tekton/lightspeed-stack-pull-request.yaml (1)

36-68: Verify prefetch-input JSON structure and all referenced files are present.

The transition from a single-line string to a structured JSON array is a significant change to the Tekton pipeline configuration. The structure includes three entries (rpm, pip, generic) with new fields for binary requirements and build files. Verify:

  1. The JSON structure is valid and compatible with the prefetch-dependencies-oci-ta:0.2 task (line 228).
  2. All referenced files exist:
    • ✓ requirements.binary.txt (present)
    • ✓ requirements-build.txt (present, but has critical issues per separate review)
    • ? artifacts.lock.yaml (not provided; confirm it exists and is valid YAML)
  3. The binary object fields (os, arch, py_version, packages) align with task expectations.

Note: Once requirements-build.txt duplicate versions are resolved, this configuration should be re-validated.
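
For reference, a hypothetical reconstruction of the three-entry structure described above, parsed with Python's json module. Only the keys named in this review (type, requirements_files, requirements_build_files, and the binary fields os/arch/py_version/packages) come from the PR discussion; every concrete value below is an illustrative assumption, not taken from the PR:

# Hypothetical sketch of the three-entry prefetch-input; values are
# illustrative assumptions, not quoted from the pipeline YAML.
import json

prefetch_input = """
[
  {"type": "rpm", "path": "."},
  {"type": "pip", "path": ".",
   "requirements_files": ["requirements.x86_64.txt", "requirements.binary.txt"],
   "requirements_build_files": ["requirements-build.txt"],
   "binary": {"os": "linux", "arch": ["x86_64", "aarch64"],
              "py_version": "3.12", "packages": ["torch", "faiss-cpu"]}},
  {"type": "generic", "path": "."}
]
"""

entries = json.loads(prefetch_input)  # raises json.JSONDecodeError on malformed input
assert {e["type"] for e in entries} == {"rpm", "pip", "generic"}
print(f"prefetch-input parses cleanly with {len(entries)} entries")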

.tekton/lightspeed-stack-push.yaml (1)

34-65: Identical prefetch-input structure as pull-request pipeline; same verification requirements apply.

This file contains the same prefetch-input restructure as lightspeed-stack-pull-request.yaml (lines 36-68). Consistency across both pipeline definitions is good, but the same verification steps apply: confirm JSON validity, file presence, and Tekton task compatibility. See the detailed comment on the pull-request file for full verification checklist.

pyproject.toml (1)

125-126: Dependencies are appropriate for the project.

Both pybuild-deps==0.5.0 (released March 2025) and pip==24.3.1 (released October 2024) are current and compatible with the Python 3.12+ requirement. The pybuild-deps addition is justified by its use in the Makefile's konflux-requirements target, and both versions are appropriately pinned in the dev group for reproducible builds without conflicts.

Comment on lines 7 to 259
beniget==0.4.2.post1
# via pythran
calver==2025.10.20
# via trove-classifiers
cffi==2.0.0
# via cryptography
coherent-licensed==0.5.2
# via zipp
cppy==1.3.1
# via kiwisolver
cython==3.1.7
# via scipy
cython==3.2.2
# via
# asyncpg
# lxml
# numpy
# pandas
# pyarrow
# pyyaml
# scikit-learn
# sqlalchemy
dunamai==1.25.0
# via uv-dynamic-versioning
expandvars==1.1.2
# via
# frozenlist
# propcache
# yarl
flit-core==3.12.0
# via
# aiosqlite
# click
# coherent-licensed
# idna
# jinja2
# markdown-it-py
# mdurl
# packaging
# pathspec
# pyparsing
# pyproject-metadata
# threadpoolctl
# typing-extensions
# wheel
gast==0.6.0
# via
# beniget
# pythran
hatch-fancy-pypi-readme==25.1.0
# via
# attrs
# httpcore
# httpx
# jsonschema
# llama-stack-client
# openai
# pydantic
hatch-vcs==0.5.0
# via
# attrs
# filelock
# fsspec
# jsonschema
# jsonschema-specifications
# referencing
# scikit-build-core
# termcolor
# urllib3
hatchling==1.26.3
# via
# hatch-fancy-pypi-readme
# llama-stack-client
# openai
hatchling==1.28.0
# via
# annotated-types
# attrs
# dnspython
# expandvars
# filelock
# fsspec
# hatch-fancy-pypi-readme
# hatch-vcs
# httpcore
# httpx
# jsonschema
# jsonschema-specifications
# mcp
# opentelemetry-api
# opentelemetry-exporter-otlp
# opentelemetry-exporter-otlp-proto-common
# opentelemetry-exporter-otlp-proto-grpc
# opentelemetry-exporter-otlp-proto-http
# opentelemetry-instrumentation
# opentelemetry-proto
# opentelemetry-sdk
# opentelemetry-semantic-conventions
# pydantic
# pydantic-settings
# pygments
# python-multipart
# referencing
# scikit-build-core
# starlette
# termcolor
# typing-inspection
# urllib3
# uv-dynamic-versioning
# uvicorn
jinja2==3.1.6
# via uv-dynamic-versioning
markupsafe==3.0.3
# via jinja2
maturin==1.10.2
# via
# cryptography
# fastuuid
# hf-xet
# jiter
# pydantic-core
# rpds-py
# safetensors
# tokenizers
meson-python==0.16.0
# via matplotlib
meson-python==0.18.0
# via
# contourpy
# numpy
# pandas
# scikit-learn
# scipy
meson==1.10.0
# via
# contourpy
# meson-python
# pandas
numpy==2.2.6
# via
# pandas
# pyarrow
# scikit-learn
# scipy
packaging==25.0
# via
# hatchling
# meson-python
# scikit-build-core
# setuptools-scm
pathspec==0.12.1
# via
# hatchling
# scikit-build-core
pdm-backend==2.4.6
# via
# annotated-doc
# fastapi
pkgconfig==1.5.5
# via aiohttp
pluggy==1.6.0
# via hatchling
ply==3.11
# via pythran
poetry-core==2.2.1
# via
# aiohappyeyeballs
# dunamai
# litellm
# pkgconfig
# rich
# rsa
# tomlkit
pybind11==3.0.1
# via
# contourpy
# matplotlib
# pillow
# scipy
pycparser==2.23
# via cffi
pyproject-metadata==0.10.0
# via meson-python
pythran==0.18.1
# via scipy
scikit-build-core==0.11.6
# via pybind11
scipy==1.16.3
# via scikit-learn
semantic-version==2.10.0
# via setuptools-rust
setuptools-rust==1.12.0
# via
# maturin
# tiktoken
setuptools-scm==7.1.0
# via python-dateutil
setuptools-scm==9.2.2
# via
# anyio
# cppy
# hatch-vcs
# httpx-sse
# importlib-metadata
# kiwisolver
# matplotlib
# pluggy
# pyarrow
# setuptools-rust
# sniffio
# tqdm
# urllib3
# zipp
tomlkit==0.13.3
# via uv-dynamic-versioning
trove-classifiers==2025.12.1.14
# via hatchling
typing-extensions==4.15.0
# via setuptools-scm
uv-dynamic-versioning==0.11.2
# via mcp
versioneer==0.29
# via pandas
wheel==0.45.1
# via
# authlib
# cachetools
# cppy
# fire
# httpx-sse
# litellm
# meson
# pandas
# python-dateutil
# sentence-transformers
# tiktoken
# tqdm
# tzdata

# The following packages are considered to be unsafe in a requirements file:
setuptools==80.9.0
# via
Contributor

⚠️ Potential issue | 🔴 Critical

CRITICAL: Duplicate package versions prevent proper dependency resolution.

This file contains conflicting pinned versions of the same packages:

  • cython: 3.1.7 and 3.2.2
  • hatchling: 1.26.3 and 1.28.0
  • meson-python: 0.16.0 and 0.18.0
  • setuptools-scm: 7.1.0 and 9.2.2

pip rejects a requirements file that pins the same package to two different versions ("Double requirement given"), and duplicate specifications indicate a fundamental issue in how this file was generated. The pybuild-deps merge of requirements.aarch64.txt and requirements.x86_64.txt did not deduplicate conflicting versions, suggesting the architecture-specific outputs have incompatible dependency graphs for build tools.

This must be resolved before merging:

  1. Keep separate architecture-specific build requirement files instead of merging, or
  2. Update the pybuild-deps compilation to deduplicate by selecting one compatible version per package across architectures, or
  3. Use constraint files (-c) to manage architecture-specific pins without creating conflicts.
🤖 Prompt for AI Agents
In requirements-build.txt around lines 7 to 248 there are duplicate/conflicting
pins for cython, hatchling, meson-python and setuptools-scm coming from merging
architecture-specific outputs; fix by either (A) restoring separate
architecture-specific build requirement files instead of merging, or (B)
updating the pybuild-deps merge logic to deduplicate and pick a single
compatible version per package (or fail the merge if incompatible), or (C)
convert architecture-specific pins into constraint files and reference them with
-c so a single requirements-build.txt has no conflicting duplicates; then
regenerate the requirements file(s) to remove the duplicate entries and verify
pip resolves the resulting pins.
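
To make the failure mode concrete, a minimal duplicate-pin detector (a hypothetical helper script, not part of this PR; the filename is the one under review):

# Illustrative sketch: flag packages pinned to more than one version
# in a requirements file. Hypothetical helper, not part of this PR.
from collections import defaultdict

def find_duplicate_pins(path: str) -> dict[str, set[str]]:
    pins: defaultdict[str, set[str]] = defaultdict(set)
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and "# via" annotations
            if "==" in line:
                name, _, version = line.partition("==")
                pins[name.strip().lower()].add(version.strip())
    return {name: vs for name, vs in pins.items() if len(vs) > 1}

if __name__ == "__main__":
    dupes = find_duplicate_pins("requirements-build.txt")
    for name, versions in sorted(dupes.items()):
        print(f"{name}: pinned to {', '.join(sorted(versions))}")
    raise SystemExit(1 if dupes else 0)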

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (2)
rpms.lock.yaml (1)

501-507: ⚠️ Architectural parity gap: libxcrypt-compat still missing from aarch64 appstream (duplicate of previous comment).

This issue was flagged in a prior review: libxcrypt-compat-4.4.18-3.el9 is present only in x86_64 appstream (lines 501–507) but absent from the aarch64 appstream section. To ensure consistency and avoid potential cross-architecture compatibility issues, add a corresponding aarch64 entry for libxcrypt-compat-4.4.18-3.el9.aarch64.rpm to the aarch64 appstream package list (e.g., after the libxcrypt-devel entry at line 104).

Apply this diff to add libxcrypt-compat to aarch64 appstream:

   - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-devel-4.4.18-3.el9.aarch64.rpm
     repoid: ubi-9-for-aarch64-appstream-rpms
     size: 33051
     checksum: sha256:9d621f33df35b9c274b8d65457d6c67fc1522b6c62cf7b2341a4a99f39a93507
     name: libxcrypt-devel
     evr: 4.4.18-3.el9
     sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
+  - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/l/libxcrypt-compat-4.4.18-3.el9.aarch64.rpm
+    repoid: ubi-9-for-aarch64-appstream-rpms
+    size: 93189
+    checksum: sha256:2bd6c288e1970a001d3a1ae69166c0d926d9c87ce892edcb2110f4e142c12a7a
+    name: libxcrypt-compat
+    evr: 4.4.18-3.el9
+    sourcerpm: libxcrypt-4.4.18-3.el9.src.rpm
   - url: https://cdn-ubi.redhat.com/content/public/ubi/dist/ubi9/9/aarch64/appstream/os/Packages/o/oniguruma-6.9.6-1.el9.5.aarch64.rpm
requirements-build.txt (1)

7-245: CRITICAL: Duplicate package versions persist and will cause pip install failures.

This file still contains the critical duplicate package versions that were flagged in previous reviews:

  • cython: 3.1.7 (line 17) and 3.2.2 (line 19)
  • hatchling: 1.26.3 (line 76) and 1.28.0 (line 81)
  • meson-python: 0.16.0 (line 131) and 0.18.0 (line 133)
  • setuptools-scm: 7.1.0 (line 202) and 9.2.2 (line 204)

pip cannot resolve a single requirements file containing conflicting versions of the same package. This will fail during dependency installation.

This must be resolved before merging. The pybuild-deps compilation (line 5) merged architecture-specific outputs without deduplicating conflicting versions. You must either:

  1. Restore separate architecture-specific build requirement files instead of merging
  2. Update the pybuild-deps merge process to select a single compatible version per package across architectures
  3. Use constraint files to manage architecture-specific pins without creating conflicts in a single file
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c9b0a5f and 96c07b7.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
  • .tekton/lightspeed-stack-pull-request.yaml (1 hunks)
  • .tekton/lightspeed-stack-push.yaml (1 hunks)
  • Containerfile (2 hunks)
  • Makefile (2 hunks)
  • artifacts.lock.yaml (1 hunks)
  • pyproject.toml (1 hunks)
  • requirements-build.txt (1 hunks)
  • requirements.aarch64.txt (0 hunks)
  • requirements.binary.txt (1 hunks)
  • requirements.hermetic.txt (1 hunks)
  • requirements.x86_64.txt (0 hunks)
  • rpms.in.yaml (1 hunks)
  • rpms.lock.yaml (10 hunks)
  • scripts/remove_torch_deps.sh (1 hunks)
💤 Files with no reviewable changes (2)
  • requirements.x86_64.txt
  • requirements.aarch64.txt
✅ Files skipped from review due to trivial changes (1)
  • requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (7)
  • rpms.in.yaml
  • artifacts.lock.yaml
  • scripts/remove_torch_deps.sh
  • Makefile
  • .tekton/lightspeed-stack-pull-request.yaml
  • pyproject.toml
  • Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: build-pr
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: library mode / azure
  • GitHub Check: E2E: server mode / ci
  • GitHub Check: E2E: server mode / azure
🔇 Additional comments (1)
.tekton/lightspeed-stack-push.yaml (1)

33-64: Prefetch-input restructuring improves readability.

The conversion from single-line JSON to a multi-line formatted block with three entries (rpm, pip, generic) enhances maintainability. The pip entry properly extends the configuration to include:

  • requirements.binary.txt in the requirements_files list
  • New requirements_build_files: ["requirements-build.txt"]
  • binary block specifying torch and faiss-cpu for x86_64 and aarch64 architectures

However, this depends on the validity of requirements-build.txt, which currently contains critical duplicate package versions (see separate review comment on requirements-build.txt). Those duplicates must be resolved before this pipeline configuration can succeed.

Please verify that the prefetch-dependencies task (lines 209-233) can successfully process the new multi-entry prefetch-input structure and that it properly handles the binary block configuration.

Comment on lines 1 to 7
# List the build tools required for hermetic builds.
# Manually edited, hashes are not required.
uv==0.9.16
pip==25.3
patchelf==0.17.2.4
cmake==4.2.0
ninja==1.13.0
Contributor

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

latest stable release uv python package manager December 2025

💡 Result:

The latest stable release of the uv Python package is v0.9.17, published on Dec 9, 2025. [1][2]

Sources:

  • PyPI release page for uv (uv‑0.9.17 uploaded Dec 9, 2025). [1]
  • astral-sh/uv GitHub repository (project homepage). [2]

🌐 Web query:

latest stable release patchelf cmake ninja December 2025

💡 Result:

  • CMake — latest stable: 4.2.1 (released Dec 10, 2025). [1]
  • patchelf — latest upstream stable: 0.18.0 (tagged 0.18.0). [2]
  • Ninja — latest stable: v1.13.2 (released Nov 20, 2025). [3]

Sources:
[1] CMake announcement / download page.
[2] NixOS / patchelf releases (GitHub).
[3] ninja-build releases (GitHub).


🌐 Web query:

pip latest stable version December 2025

💡 Result:

The latest stable pip release as of December 11, 2025 is pip 25.3 (released Oct 25, 2025). [1][2]

Sources:
[1] PyPI pip project — release files (pip 25.3 uploaded Oct 25, 2025).
[2] pip release process / cadence (quarterly releases).


Update build tool versions to current stable releases.

The pinned versions are partially outdated:

  • uv==0.9.16 → update to 0.9.17 (released Dec 9, 2025)
  • pip==25.3 is current ✓
  • patchelf==0.17.2.4 → update to 0.18.0 (significant version gap)
  • cmake==4.2.0 → update to 4.2.1 (released Dec 10, 2025)
  • ninja==1.13.0 → update to 1.13.2 (released Nov 20, 2025)

Updating to the latest stable releases ensures you benefit from recent fixes and improvements without sacrificing reproducibility.

🤖 Prompt for AI Agents
In requirements.hermetic.txt around lines 1 to 7, the pinned build-tool versions
are outdated; update the pins to the requested stable releases: set uv==0.9.17,
keep pip==25.3, set patchelf==0.18.0, set cmake==4.2.1, and set ninja==1.13.2;
after updating, run your hermetic build/test flow (and regenerate any dependent
lockfiles or build artifacts) to confirm compatibility.

@raptorsun
Contributor Author

/retest

@raptorsun raptorsun force-pushed the non-binary branch 2 times, most recently from b13ecd1 to 016632f on December 23, 2025 at 16:56
Signed-off-by: Haoyu Sun <hasun@redhat.com>
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

♻️ Duplicate comments (3)
rpms.lock.yaml (1)

501-507: Architectural inconsistency: libxcrypt-compat missing for aarch64.

This issue was already flagged in a previous review. The libxcrypt-compat package is present only for x86_64 but not for aarch64, creating architectural asymmetry. The aarch64 appstream section should include a corresponding libxcrypt-compat-4.4.18-3.el9.aarch64.rpm entry.

requirements-build.txt (1)

19-30: Duplicate package versions still present in generated requirements file.

This critical issue was already flagged in previous reviews. The file contains conflicting pinned versions:

  • cython: 3.1.7 and 3.2.3
  • hatchling: 1.26.3 and 1.28.0
  • meson-python: 0.16.0 and 0.18.0
  • setuptools-scm: 7.1.0 and 9.2.2

The pybuild-deps merge of architecture-specific requirements did not deduplicate conflicting versions.

Also applies to: 81-122, 138-146, 213-230

Containerfile (1)

42-49: Filename mismatch: requirements-binary.txt should be requirements.binary.txt.

This issue was flagged in a previous review. Line 46 references requirements-binary.txt (with hyphen), but based on the repository's naming convention with other requirements files (requirements.*.txt), the actual file is likely requirements.binary.txt (with dot). This will cause pip install to fail.
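
A quick illustrative check for this mismatch, listing which spelling actually exists (assumes it is run from the repository root):

# Illustrative sketch: report which candidate filename is present so the
# Containerfile reference can be corrected to match.
from pathlib import Path

for candidate in ("requirements.binary.txt", "requirements-binary.txt"):
    status = "exists" if Path(candidate).is_file() else "missing"
    print(f"{candidate}: {status}")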

🧹 Nitpick comments (2)
.tekton/lightspeed-stack-push.yaml (1)

67-68: Missing hermeto-mode parameter declaration in pipelineSpec.

The hermeto-mode parameter is defined at the PipelineRun level (lines 67-68) and passed to the prefetch-dependencies task (lines 221-222), but it's not declared in the pipelineSpec.params section (lines 77-152). While this may work due to Tekton's parameter passthrough behavior, explicitly declaring it improves clarity and enables proper validation.

🔎 Suggested addition to pipelineSpec.params

Add after line 112 (after the hermetic parameter):

    - default: ""
      description: Mode for hermeto dependency prefetching (e.g., permissive, strict)
      name: hermeto-mode
      type: string

Also applies to: 221-222

.tekton/lightspeed-stack-pull-request.yaml (1)

70-71: Missing hermeto-mode parameter declaration in pipelineSpec.

Same issue as in lightspeed-stack-push.yaml: the hermeto-mode parameter is used at the PipelineRun level and passed to prefetch-dependencies, but not declared in pipelineSpec.params (lines 80-157). Add the parameter declaration for consistency and proper validation.

🔎 Suggested addition to pipelineSpec.params

Add after line 111 (after the hermetic parameter):

    - default: ""
      description: Mode for hermeto dependency prefetching (e.g., permissive, strict)
      name: hermeto-mode
      type: string

Also applies to: 230-231

📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 016632f and e775f64.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (14)
  • .tekton/lightspeed-stack-pull-request.yaml
  • .tekton/lightspeed-stack-push.yaml
  • Containerfile
  • Makefile
  • artifacts.lock.yaml
  • pyproject.toml
  • requirements-build.txt
  • requirements.aarch64.txt
  • requirements.binary.txt
  • requirements.hermetic.txt
  • requirements.x86_64.txt
  • rpms.in.yaml
  • rpms.lock.yaml
  • scripts/remove_torch_deps.sh
💤 Files with no reviewable changes (2)
  • requirements.x86_64.txt
  • requirements.aarch64.txt
✅ Files skipped from review due to trivial changes (1)
  • requirements.binary.txt
🚧 Files skipped from review as they are similar to previous changes (5)
  • artifacts.lock.yaml
  • pyproject.toml
  • scripts/remove_torch_deps.sh
  • requirements.hermetic.txt
  • Makefile
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies

Applied to files:

  • Containerfile
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
  • GitHub Check: build-pr
  • GitHub Check: Konflux kflux-prd-rh02 / lightspeed-stack-on-pull-request
  • GitHub Check: E2E: library mode / vertexai
  • GitHub Check: E2E: server mode / azure
  • GitHub Check: E2E: library mode / ci
  • GitHub Check: E2E: server mode / vertexai
  • GitHub Check: E2E: server mode / ci
  • GitHub Check: E2E: library mode / azure
🔇 Additional comments (6)
rpms.in.yaml (1)

1-1: LGTM - cmake addition aligns with build requirements.

The addition of cmake to the package list is consistent with the Containerfile changes that use cmake to build googletest from source during hermetic builds.

rpms.lock.yaml (1)

7-34: LGTM - cmake packages properly added for both architectures.

The cmake-related packages (cmake, cmake-data, cmake-filesystem, cmake-rpm-macros) are consistently added for both aarch64 (lines 7-34) and x86_64 (lines 417-444), ensuring architectural parity for build tooling.

Also applies to: 417-444

Containerfile (2)

18-18: LGTM - cmake installation for build tooling.

Adding cmake to the dnf install is necessary for building googletest from source in the hermetic build flow.


28-35: The error handling concern is incorrect; proper error detection is already in place.

The current code uses && operators between all commands within the if block (tar && cd && cmake && make && make install), which ensures early failure detection if any step fails. The final semicolon before fi is correct shell syntax for terminating the compound command.

However, the suggested cleanup step is valuable: adding rm -rf /tmp/googletest-release-1.12.1 && before the final semicolon would reduce the image layer size by removing the temporary build directory after installation.

The googletest artifact (googletest-release-1.12.1.tar.gz) is correctly defined in artifacts.lock.yaml.

Likely an incorrect or invalid review comment.

.tekton/lightspeed-stack-push.yaml (1)

33-64: Well-structured prefetch-input configuration.

The multi-entry JSON array properly separates rpm, pip, and generic prefetch types. The pip entry correctly includes the new requirements files and binary package configuration for hermetic builds.

.tekton/lightspeed-stack-pull-request.yaml (1)

36-67: Prefetch configuration consistent with push pipeline.

The prefetch-input structure correctly mirrors the push.yaml configuration, ensuring consistent behavior between pull request and push pipelines.
