
Enable Cirrus on feedstocks that had access to open-gpu-server#1982

Merged
jaimergp merged 5 commits into conda-forge:main from jaimergp:enable-cirrus-cpu
Apr 4, 2026

Conversation

@jaimergp
Member

@jaimergp jaimergp commented Apr 3, 2026

Checklist:

  • I want to request (or revoke) access to an opt-in CI resource:
    • Pinged the relevant feedstock team(s)
    • Added a small description explaining why access is needed

Comes from conda-forge/.cirun#174

cc @conda-forge/core, @conda-forge/astra-toolbox, @conda-forge/cutlass, @conda-forge/flash-attn, @conda-forge/jaxlib, @conda-forge/libmagma, @conda-forge/mongodb, @conda-forge/muscat-split, @conda-forge/nodejs, @conda-forge/onnxruntime, @conda-forge/pytorch-cpu, @conda-forge/pytorch_scatter, @conda-forge/qt-webengine, @conda-forge/ray-packages, @conda-forge/tensorflow, @conda-forge/tinygrad, @conda-forge/torchao, @conda-forge/viskores, @conda-forge/vllm, @conda-forge/vtk-m, @conda-forge/webkit2gtk4-1, @conda-forge/xformers.

Cirrus Runners will replace the open-gpu-server runners for CPU-only jobs. We have access to three concurrency units; the cost of each runner size is as follows:

[Image: table of Cirrus runner sizes and their concurrency-unit costs]

Once this is merged, you can use github_actions_labels in recipe/conda_build_config.yaml like this:

  • cirun-openstack-cpu-medium users should use ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-sm
  • cirun-openstack-cpu-large users should use ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-sm
  • cirun-openstack-cpu-xlarge users should use ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-md
  • cirun-openstack-cpu-2xlarge users should use ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg
  • cirun-openstack-cpu-4xlarge users should use ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg
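For reference, the mapping above can be captured in a small lookup table; a minimal sketch (the dict and function names are illustrative, not part of any conda-forge tooling):

```python
# Map the retired open-gpu-server CPU labels to their Cirrus replacements,
# following the list above. Note that both "medium" and "large" map to the
# small Cirrus runner, and both "2xlarge" and "4xlarge" map to the large one.
OPENSTACK_TO_CIRRUS = {
    "cirun-openstack-cpu-medium": "ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-sm",
    "cirun-openstack-cpu-large": "ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-sm",
    "cirun-openstack-cpu-xlarge": "ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-md",
    "cirun-openstack-cpu-2xlarge": "ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg",
    "cirun-openstack-cpu-4xlarge": "ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg",
}


def cirrus_label(openstack_label: str) -> str:
    """Return the Cirrus runner image replacing a retired OpenStack label."""
    return OPENSTACK_TO_CIRRUS[openstack_label]
```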

For example:

github_actions_labels:
- ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-md  # [linux64]

@jaimergp jaimergp requested a review from a team as a code owner April 3, 2026 15:40
@jschueller
Contributor

great! which one would you select for qt-webengine?

@zklaus
Contributor

zklaus commented Apr 3, 2026

> great! which one would you select for qt-webengine?

Well, currently qt-webengine is on cirun-openstack-cpu-xlarge, so we should start with ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-md (Medium).
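Concretely, that would look like this in the feedstock's recipe/conda_build_config.yaml (a sketch following the example above; the feedstock can move to a larger runner later if Medium proves too small):

```yaml
# recipe/conda_build_config.yaml
github_actions_labels:
# previously: cirun-openstack-cpu-xlarge
- ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-md  # [linux64]
```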

@jaimergp jaimergp merged commit b6c2195 into conda-forge:main Apr 4, 2026
2 checks passed
@jaimergp
Member Author

jaimergp commented Apr 4, 2026

Argh there's a bug in smithy, fixing at conda-forge/conda-smithy#2508

mharradon added a commit to mharradon/tensorflow-feedstock that referenced this pull request Apr 5, 2026
Migrate off decommissioned runner. See e.g.:
conda-forge/admin-requests#1982
jviehhauser added a commit to jviehhauser/onnxruntime-feedstock that referenced this pull request Apr 7, 2026
The Quansight Open GPU Server (OpenStack) was decommissioned on
2026-03-13 (conda-forge/.cirun#174), so cirun-openstack-cpu-4xlarge
runners no longer exist. All Linux GHA jobs were timing out after 24h
waiting for runners that will never come.

Cirrus runners access was granted via conda-forge/admin-requests#1982.
This switches Linux builds to ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg
(16 CPUs, 48GB RAM). Windows builds remain on cirun-azure-windows-4xlarge.

Re-rendered with conda-smithy 3.59.0.
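The change described in the commit message above corresponds to a conda_build_config.yaml along these lines (a sketch; the bracketed comments are conda-build platform selectors, and the Windows label is taken from the commit message):

```yaml
github_actions_labels:
- ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg  # [linux64]
- cirun-azure-windows-4xlarge                      # [win64]
```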
andrew-anyscale added a commit to andrew-anyscale/ray-packages-feedstock that referenced this pull request Apr 7, 2026
Per conda-forge/admin-requests#1982, replace cirun-openstack-cpu-4xlarge with ghcr.io/cirruslabs/ubuntu-runner-amd64:24.04-lg.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: andrew <andrew@anyscale.com>