llvmPackages_14.{mlir,flang}: init #163878
Conversation
Put these in 'tools': while MLIR is largely a library here, it's not a runtime library like the others.
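In llvmPackages terms that means registering it under the `tools` scope. A rough, simplified sketch of what that might look like in the version set's default.nix (the `llvm_meta` argument and the scope shape follow existing nixpkgs convention; treat the details as assumptions):

```nix
# Sketch: mlir lives in `tools` (alongside clang, lld, ...) rather than in
# `libraries`, since it ships executables and isn't a runtime library.
tools = lib.makeExtensible (tools: {
  # ... existing tools ...
  mlir = callPackage ./mlir { inherit llvm_meta; };
});
```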
As for installing MLIR tools:

- Idea 1 (current approach): manually enumerate, explicitly build, and install the various utilities (and possibly have issues in other MLIR projects or in the future). A sketch follows this list.
- Idea 2: Patch llvm's cmake to let users of …
- Idea 3: Patch (and attempt to upstream) … in a new …

EDIT: The manual installation we're doing works fine (if not my favorite) for MLIR, and is enough for flang (hence this PR). However, as we pull in more projects using MLIR, these issues with how they try to install themselves will become more problematic (I've worked on two besides flang so far).
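For concreteness, a minimal sketch of Idea 1 — the tool names below are illustrative, not the PR's exact list (the real list is the `bins` variable that also shows up in the postBuild hook later in this thread):

```nix
# Sketch of Idea 1: name the MLIR utilities explicitly, build only those
# targets, and install them by hand (tool names here are examples).
{ lib, ... }:
let
  bins = [ "mlir-opt" "mlir-translate" "mlir-lsp-server" ];
in {
  # Only build the targets we actually ship...
  buildFlags = bins;
  # ...and copy them into $out ourselves rather than relying on the
  # subproject's install() rules.
  installPhase = ''
    runHook preInstall
    install -Dt $out/bin ${lib.concatMapStringsSep " " (b: "bin/${b}") bins}
    runHook postInstall
  '';
}
```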
Force-pushed from d5fe40e to 74f1fa7.
(rebasing + force-pushing for the LLVM 14 release)
Force-pushed from 8d14fb5 to 84c7d13.
```bash
mkdir -p $out/share/vim-plugins/
cp -r ../utils/vim $out/share/vim-plugins/mlir
install -Dt $out/share/emacs/site-lisp ../utils/emacs/mlir-mode.el
```
I think the MLIRConfig.cmake might also need to be fixed up here or somewhere else. Since the output is split, the $lib/lib/cmake folder will be moved to $dev/lib/cmake, which might invalidate some variables like MLIR_CMAKE_DIR.
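One possible shape of that fixup, assuming the cmake files end up under `$dev/lib/cmake/mlir` after the outputs are split (an untested sketch, not what this PR does):

```nix
# Sketch: after lib/cmake moves from $lib to $dev, rewrite any stale $lib
# paths so variables like MLIR_CMAKE_DIR point at files that actually exist.
postFixup = ''
  for f in "$dev"/lib/cmake/mlir/*.cmake; do
    substituteInPlace "$f" --replace "$lib/lib/cmake" "$dev/lib/cmake"
  done
'';
```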
```nix
# Patch around check for being built native (maybe because not built w/LLVM?)
postPatch = lib.optionalString enableRunners ''
  for x in **/CMakeLists.txt; do
    substituteInPlace "$x" --replace 'if(TARGET ''${LLVM_NATIVE_ARCH})' 'if (1)'
```
In LLVM git / v15 there is an inverse check too, so we'd need to add this as well to avoid build breakage:

```nix
substituteInPlace "$x" --replace 'if(NOT TARGET ''${LLVM_NATIVE_ARCH})' 'if (0)'
```
globstar doesn't seem to be enabled in stdenv's bash, so the for loop doesn't actually process files recursively. We can use `find -name CMakeLists.txt | while read f; do ...; done` instead, as sketched below.
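Something like the following, also folding in the v15 inverse check from the comment above (a sketch, untested):

```nix
# Sketch: recurse with find instead of globstar (not enabled in stdenv's
# bash), and neutralize both the positive and the inverse native-arch checks.
postPatch = lib.optionalString enableRunners ''
  find . -name CMakeLists.txt | while read -r f; do
    substituteInPlace "$f" \
      --replace 'if(TARGET ''${LLVM_NATIVE_ARCH})' 'if (1)' \
      --replace 'if(NOT TARGET ''${LLVM_NATIVE_ARCH})' 'if (0)'
  done
'';
```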
I tried to use MLIR at $dayjob, but found that the way llvmPackages does multi-derivation and multi-output caused too much trouble. I ended up making a simple LLVM "monobuild" expression which has working MLIR cmake files:

```nix
# Build a full LLVM in one derivation with one output -- in contrast to nixpkgs
# llvmPackages which splits the package into multiple derivations (and multiple
# outputs).
#
# Pros:
# * Easier to match upstream builds / make MLIR CMake files work.
# * No patching required.
#
# Cons:
# * Slower to build / iterate on.
# * Larger closure sizes.
# * Not usable as a stdenv component, as $CC etc. doesn't get wrapped.
{ lib, stdenv, fetchFromGitHub, cmake, python3, ncurses, zlib, libffi, libxml2 }:

stdenv.mkDerivation rec {
  pname = "llvm";
  version = "15.0.0";

  src = fetchFromGitHub {
    owner = "llvm";
    repo = "llvm-project";
    rev = "llvmorg-${version}";
    sha256 = "sha256-4yviNtiJJLE6JPNqwPwRBk9fZ9vXqsVIaNOucrePmy8=";
  };

  nativeBuildInputs = [
    cmake python3
  ];

  buildInputs = [
    libffi
    libxml2
  ];

  propagatedBuildInputs = [
    ncurses
    zlib
  ];

  sourceRoot = "source/llvm";

  cmakeFlags = [
    "-DLLVM_ENABLE_PROJECTS=mlir"
    "-DLLVM_ENABLE_RTTI=ON"
  ];

  requiredSystemFeatures = [ "big-parallel" ];

  meta = {
    description = "A collection of modular and reusable compiler and toolchain technologies";
    homepage = "https://llvm.org/";
    license = lib.licenses.ncsa;
  };
}
```
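For what it's worth, consuming the monobuild is then straightforward, e.g. as a dev shell (the file name `llvm-monobuild.nix` is my assumption):

```nix
# Assuming the expression above is saved as llvm-monobuild.nix:
let
  pkgs = import <nixpkgs> { };
  llvm-mono = pkgs.callPackage ./llvm-monobuild.nix { };
in
pkgs.mkShell {
  packages = [ llvm-mono pkgs.cmake pkgs.ninja ];
  # Point downstream CMake builds at the (unsplit) MLIR package config.
  MLIR_DIR = "${llvm-mono}/lib/cmake/mlir";
}
```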
Given that Triton (https://github.com/openai/triton) needs MLIR and Torch 2.0 will make use of Triton for its …: @rrbutani, any thoughts from your recent updates to …?
@ConnorBaker Thanks for the heads-up about …. I've got an … re: Triton: do you happen to know what their plan is for the version of LLVM/MLIR they will require? I'm not familiar with Triton (and I couldn't find an answer after quickly glancing at their repo; they make reference to LLVM 11, and their CMake scripts seem to accept any LLVM version 6.0+), but other projects I've worked with that use MLIR tend to track upstream very closely and generally don't seem to line up their releases with LLVM/MLIR releases.

EDIT: I think we're fine; they seem to use LLVM/MLIR releases (currently 14.0.6) rather than HEAD, albeit via their own distributions.
@rrbutani They're working their way towards using HEAD; they have a PR open to use LLVM 15 here: triton-lang/triton#1070. Both their internal team and the team at Meta want to track HEAD more closely. I hadn't noticed that they switched to using their own distribution; that's new! I've been wallowing in Dockerfiles for the past month or two, building Magma, PyTorch, TorchVision, and Triton from source. Eventually I broke and have been trying to package it with Nix since. Please let me know if there's anything I can do to help get MLIR with CUDA support into Nixpkgs. I don't want to go back to Docker :(
Hi! Is there a reason to only target one LLVM version? I'm currently using ….

EDIT 2023-04-06: In fact, we're going to need ….
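One way this could go, rather than hard-coding a single version (a sketch; the `./mlir` interface taking an `llvmPackages` argument is hypothetical):

```nix
# Sketch: instantiate mlir against several llvmPackages release sets.
{ pkgs }:
builtins.listToAttrs (map (v: {
  name = "mlir_${v}";
  value = pkgs.callPackage ./mlir { llvmPackages = pkgs."llvmPackages_${v}"; };
}) [ "14" "15" "16" ])
```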
```nix
postBuild = ''
  make ${lib.concatStringsSep " " bins} -j$NIX_BUILD_CORES -l$NIX_BUILD_CORES
```
I don't think we set -l in stdenv anymore, as of a few months ago.
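So presumably this just becomes:

```nix
postBuild = ''
  make ${lib.concatStringsSep " " bins} -j$NIX_BUILD_CORES
'';
```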
Could someone please update this for LLVM 17? Flang has become a lot more useful in the meantime and can now build executables without passing experimental flags.
Following up on the previous ping: @dtzWill, what are your plans w.r.t. this PR? Do you plan to look into this again, do you need any help to push this further and get it merged, or should we mark this as closed for now? CC @NixOS/rocm-maintainers: can you help assess the situation? How does this interact with …?
Related: #280572
Description of changes
Add packages for MLIR and Flang.
MLIR: https://mlir.llvm.org/
Flang: https://flang.llvm.org/docs/
Mostly came for packaging MLIR, but flang is nice to have and helps
check that the produced MLIR package is functional.
Note that (for now) using flang to produce binaries requires
gfortran to be available as well.
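A quick smoke test of that combination might look like the following (hypothetical; `llvmPackages_14.flang` is the attribute this PR adds):

```nix
# Sketch: a shell where flang can link executables, with gfortran on hand
# (for now) to supply the runtime/linking bits.
with import <nixpkgs> { };
mkShell {
  packages = [ llvmPackages_14.flang gfortran ];
  # Inside the shell: flang hello.f90 -o hello && ./hello
}
```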
Editor bits for MLIR are installed as well as mlir-lsp-server.
Includes some of the runners for MLIR, which are at least neat; while they
don't significantly increase closure size, they might be a liability should
this end up a dependency of something lighter (in size or portability).
Things done
- Built with `sandbox = true` set in `nix.conf`? (See Nix manual)
- Tested via `nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD"`. Note: all changes have to be committed; also see nixpkgs-review usage.
- Tested basic functionality of all binary files (usually in `./result/bin/`)
- Ran `nixos/doc/manual/md-to-db.sh` to update generated release notes