
Rust 1.72.1 and delete rust-for-linux_rust build config #2068

Merged — 6 commits into kernelci:main from rust-1.72, Oct 9, 2023

Conversation

@ojeda (Contributor) commented Aug 25, 2023

Rust 1.72.1 will be the next version supported by the kernel [1].

In addition, this PR deletes the rust-for-linux_rust build config. The old rust branch is going away and is unlikely to be updated anymore. Keeping it around would mean keeping an extra build environment in KernelCI. Even if we ended up upgrading the compiler in that branch (so the extra build environment might not be needed), it is not worth spending KernelCI's resources testing the branch (e.g. testing one of the stable trees would be better).

Link: https://lore.kernel.org/rust-for-linux/[email protected]/ [1]
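
For context, the build configurations in question are entries in KernelCI's YAML configuration (build-configs.yaml in kernelci-core). The sketch below shows roughly the shape of the entry being deleted; the field names and values are illustrative assumptions, not copied from the repository:

```yaml
# Illustrative sketch only; the real schema lives in kernelci-core's
# build-configs.yaml, and the field names here are assumptions.
build_configs:
  rust-for-linux_rust:          # the entry this PR deletes
    tree: rust-for-linux
    branch: 'rust'              # the old branch that is going away
    variants:
      rustc-1.71:               # the extra build environment it kept alive
        build_environment: rustc-1.71
        architectures: [x86_64]
```

Deleting the config is what then "enables deleting the build environment too", since nothing else references it.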

@ojeda ojeda changed the title Rust 1.72 Rust 1.72.0 and delete rust-for-linux_rust build config Aug 25, 2023
@nuclearcat nuclearcat added the staging-skip Don't test automatically on staging.kernelci.org label Aug 27, 2023
The old `rust` branch is going away and is unlikely to be updated
anymore. Keeping it around would mean keeping an extra build
environment in KernelCI. Even if we ended up upgrading the compiler
in that branch (and thus the extra build environment may not be
needed), it is not worth spending KernelCI's resources testing
the branch (e.g. testing one of the stable trees would be better).

Thus delete the build config, which will enable deleting the build
environment too.

Signed-off-by: Miguel Ojeda <[email protected]>
@ojeda ojeda changed the title Rust 1.72.0 and delete rust-for-linux_rust build config Rust 1.72.1 and delete rust-for-linux_rust build config Sep 26, 2023
@ojeda ojeda marked this pull request as ready for review September 26, 2023 00:23
Rust 1.72.1 will be the next version supported by the kernel [1].

Therefore, add it as a new build environment.

Link: https://lore.kernel.org/rust-for-linux/[email protected]/ [1]
Signed-off-by: Miguel Ojeda <[email protected]>
All Rust branches have been moved to Rust 1.72.0, thus the build
environment for the previous version is no longer needed.

Signed-off-by: Miguel Ojeda <[email protected]>
@ojeda (Contributor, Author) commented Oct 5, 2023

The next upgrade is at #2124.

@gctucker (Contributor) commented Oct 6, 2023

Thanks, it's been a busy past few weeks but let's see if we can wrap up all the Rust-related changes today ahead of Monday's production update.

@ojeda (Contributor, Author) commented Oct 6, 2023

No worries & thanks a lot @gctucker et al.!

@nuclearcat (Member) commented

staging-skip was set because it was breaking staging builds; it requires some more updates in other repositories (probably at least the deploy scripts).

@gctucker (Contributor) commented Oct 6, 2023

staging-skip was set because it was breaking staging builds; it requires some more updates in other repositories (probably at least the deploy scripts).

Right, and that's also what I was going to investigate today. Last time I looked there was apparently a problem with a rustc Docker image or something related to that, and builds were failing in a weird way.

@gctucker (Contributor) commented Oct 6, 2023

OK, I probably won't have time to complete everything today, but I'll make sure some jobs get run over the weekend and see if we can wrap things up on Monday morning, as the production update happens later in the day. If it "misses the boat", it should definitely all be ready for the following Monday, or maybe we can do a small incremental update halfway through the week if that really helps. I'll keep you posted in any case.

@gctucker gctucker self-assigned this Oct 6, 2023
@gctucker gctucker added staging-skip Don't test automatically on staging.kernelci.org and removed staging-skip Don't test automatically on staging.kernelci.org labels Oct 6, 2023
@gctucker (Contributor) commented Oct 6, 2023

The kernelci/staging-rustc-1.72:x86 Docker image has been built and pushed. Next step: get a kernel build out of it (that's where it got interesting last time).

@gctucker gctucker removed the staging-skip Don't test automatically on staging.kernelci.org label Oct 6, 2023
@ojeda (Contributor, Author) commented Oct 6, 2023

Thanks! Let's see then... :)

@gctucker (Contributor) commented Oct 6, 2023

Barf, and the staging VM is having some difficulties, but we've got a sysadmin on the case. Such a build-up for a few builds 😅 I'll poke the pipeline again tomorrow and let the machine work over the weekend. Have a good one!

@nuclearcat (Member) commented

Staging crashed due to a missing template; I guess the deploy scripts need to be updated. Adding the skip label, as we have a lot of other pending PRs.

jinja2.exceptions.TemplateNotFound: rustc-1.71-x86.jinja2

@nuclearcat nuclearcat added the staging-skip Don't test automatically on staging.kernelci.org label Oct 9, 2023
@gctucker gctucker removed the staging-skip Don't test automatically on staging.kernelci.org label Oct 9, 2023
@gctucker (Contributor) commented Oct 9, 2023

That's not a sysadmin issue ;) Resuming work on this now; I still think we can have it merged today before the prod update. Got some results from the weekend jobs.

@gctucker (Contributor) commented Oct 9, 2023

Fixed here: kernelci/kernelci-deploy@f26278b

@gctucker (Contributor) commented Oct 9, 2023

I'll make a summary of all the things that need to be done to upgrade the rustc toolchain version on the PR for 1.73 so next time it should hopefully work without failing jobs on staging.

@gctucker (Contributor) commented Oct 9, 2023

Got a build running with rustc 1.72; it worked nearly all the way to the end but then seemed to get stuck during this kselftest build step:

make[4]: Entering directory '/tmp/kci/linux/tools/testing/selftests/dt'
/tmp/kci/linux/tools/testing/selftests/../../../scripts/dtc/dt-extract-compatibles -d /tmp/kci/linux/tools/testing/selftests/../../.. > /tmp/kci/linux/build/kselftest/dt/compatible_list

@gctucker (Contributor) commented Oct 9, 2023

It's busy with a Python script working things out; that may be unrelated to rustc, though:

root       53977 99.7  0.0  21204 17232 ?        R    07:27   7:08              |                           \_ python3 /tmp/kci/linux/tools/testing/selftests/../../../scripts/dtc/dt-extract-compatibles -d /tmp/kci/linux/tools/testing/selftests/../../..

@gctucker (Contributor) commented Oct 9, 2023

Alright, I'll give this another spin with mainline rather than linux-next in the meantime.

@gctucker (Contributor) commented Oct 9, 2023

It's busy with a Python script working things out; that may be unrelated to rustc, though:

root       53977 99.7  0.0  21204 17232 ?        R    07:27   7:08              |                           \_ python3 /tmp/kci/linux/tools/testing/selftests/../../../scripts/dtc/dt-extract-compatibles -d /tmp/kci/linux/tools/testing/selftests/../../..

@nfraprado ^ FYI - have you seen cases where the script might get stuck in an infinite loop?

@gctucker (Contributor) commented Oct 9, 2023

Alright, with mainline I get a kernel build failure:

error[E0522]: definition of an unknown language item: `box_free`
   --> ../rust/alloc/alloc.rs:342:23
    |
342 | #[cfg_attr(not(test), lang = "box_free")]
    |                       ^^^^^^^^^^^^^^^^^ definition of unknown language item `box_free`

error: aborting due to previous error

but I think this is expected as a rust kernel build error. So as far as I can tell the KernelCI config is all working and this can be merged now.

Full details for the mainline build can be found here:
https://staging.kernelci.org/build/id/6523b072ec3bd74049e19380/

I had to forcibly abort the linux-next kernel build and the output was lost.

@gctucker (Contributor) left a comment

Tested OK on staging.

@gctucker gctucker added this pull request to the merge queue Oct 9, 2023
Merged via the queue into kernelci:main with commit 92d9411 Oct 9, 2023
4 checks passed
@ojeda ojeda deleted the rust-1.72 branch October 9, 2023 09:34
@ojeda (Contributor, Author) commented Oct 9, 2023

I think this is expected as a rust kernel build error.

Yeah, that appears because mainline still has 1.71's standard library, but we are using 1.72 as the compiler in that build.

If we want to get rid of those, we would need to keep at least two Rust compilers around: one for mainline, and one for the periods where we have a new version in rust-next (which, ideally, would last at least two weeks, so that the compiler upgrade gets tested in linux-next).

If the Rust builds start to work well, then I think we should try to do it, so that other maintainers can easily take advantage of this for their subsystems.

Thanks again for fixing the Rust support!
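
The mismatch described above can be sketched as a tiny shell check. At the time, the kernel pinned an exact rustc version (mainline's scripts/min-tool-version.sh rustc prints it), so building with any other toolchain fails, e.g. with the box_free lang-item error seen earlier. The version strings below are stand-ins for those two queries, not authoritative values:

```shell
# Sketch of the version check behind the `box_free` failure. In a real CI
# step, "wanted" would come from `scripts/min-tool-version.sh rustc` in the
# kernel tree and "have" from `rustc --version`; both values here are
# illustrative stand-ins.
wanted="1.71.1"   # what mainline's vendored rust/ code was written against
have="1.72.0"     # what the new kernelci/staging-rustc-1.72 image ships

if [ "$wanted" = "$have" ]; then
    echo "rustc $have matches the tree"
else
    echo "rustc mismatch: tree expects $wanted, toolchain is $have"
fi
```

Keeping two compilers around, as suggested above, amounts to making "wanted" per-branch instead of global.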

@nfraprado (Contributor) commented

@gctucker Can't say I have. Though did it really get stuck in an infinite loop? It does take a bit of time for that step to run, as it's parsing the whole kernel source; on my machine it takes about 15 seconds. Since you merged this PR, I assume it worked out in the end then?

@gctucker (Contributor) commented Oct 9, 2023

I cancelled it after about 10 minutes, I believe; it's hard to tell exactly how long it stayed there, but it was definitely more than 15s. I could try to reproduce it locally with the same Docker image and source code version.
