
common : add getpwuid fallback for HF cache when HOME is not set #21035

Merged

angt merged 1 commit into ggml-org:master from angt:common-add-getpwuid-fallback-for-hf-cache-when-home-is-not-set on Mar 26, 2026

Conversation

@angt angt (Member) commented Mar 26, 2026

Overview

Like llama's model cache, the HF cache needs a fallback when HOME is not defined, which mostly happens under systemd services.

Additional information

We don't need a special case for __EMSCRIPTEN__, since Emscripten's virtual filesystem already defines HOME:

$ cat test.cpp
#include <iostream>
#include <fstream>
#include <cstdlib>
#include <filesystem>

#include <unistd.h>
#include <pwd.h>

namespace fs = std::filesystem;

int main() {
    const char * home = std::getenv("HOME");
    if (home == nullptr) {
        std::cerr << "HOME is not set" << std::endl;
        return 1;
    }
    std::cout << "HOME: " << home << std::endl;

    fs::path file = fs::path(home) / "test.txt";

    std::ofstream out(file);
    out << "something" << std::endl;
    out.close();

    std::ifstream in(file);
    std::string line;

    while (std::getline(in, line)) {
        std::cout << line << std::endl;
    }

    fs::remove(file);
    return 0;
}

$ emcc -std=c++17 test.cpp

$ node a.out.js
HOME: /home/web_user
something
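For contrast, a hypothetical native reproduction (file name and path are mine, not from the PR) shows that outside Emscripten, getenv("HOME") really can return NULL when the variable is stripped from the environment, as systemd does:

```shell
cat > /tmp/home_check.cpp <<'EOF'
#include <cstdio>
#include <cstdlib>
int main() {
    const char * home = std::getenv("HOME");
    std::printf("HOME: %s\n", home ? home : "(unset)");
    return home ? 0 : 1;
}
EOF
c++ -std=c++17 /tmp/home_check.cpp -o /tmp/home_check
env -u HOME /tmp/home_check; echo "exit: $?"
# prints "HOME: (unset)" then "exit: 1"
```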

If it's OK, I can update fs_get_cache_directory afterwards.

Signed-off-by: Adrien Gallouët <angt@huggingface.co>
@angt angt merged commit 287b5b1 into ggml-org:master Mar 26, 2026
44 of 45 checks passed
slartibardfast pushed a commit to slartibardfast/llama.cpp that referenced this pull request Apr 12, 2026
Seunghhon pushed a commit to Seunghhon/llama.cpp that referenced this pull request Apr 26, 2026
rsenthilkumar6 pushed a commit to rsenthilkumar6/llama.cpp that referenced this pull request May 1, 2026
ljubomirj pushed a commit to ljubomirj/llama.cpp that referenced this pull request May 6, 2026


Development

Successfully merging this pull request may close these issues.

Misc. bug: Unable to start llama-server as systemd service without specifying huggingface cache options

3 participants