
[Issue]: Unable to Set Cache Directory in Read-Only File System Environments #1074

Closed
alexpwrd opened this issue Dec 27, 2023 · 6 comments

alexpwrd commented Dec 27, 2023

Describe the issue

I am using the Autogen library in an AWS Lambda environment and I am encountering an issue related to the cache directory.

In AWS Lambda, the file system is read-only except for the /tmp directory. However, when I use the GPTAssistantAgent, UserProxyAgent, GroupChat, or GroupChatManager classes in the Autogen library, it seems that they are trying to create a cache directory at .cache/41, which is not writable in AWS Lambda. This results in an OSError: [Errno 30] Cache directory ".cache/41" does not exist and could not be created.

I tried to set the cache directory to /tmp using the Completion.set_cache(cache_path_root="/tmp") method, but it seems that the GPTAssistantAgent, UserProxyAgent, GroupChat, or GroupChatManager classes are creating their own instances of the Completion class, which use the default cache directory .cache.
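
For reference, a minimal sketch of the workaround I attempted:

from autogen import Completion

# Attempted workaround: point the legacy completion cache at Lambda's
# writable /tmp directory before constructing any agents.
Completion.set_cache(cache_path_root="/tmp")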

Unfortunately, the Autogen library does not seem to provide a way to change the cache directory for these classes directly. Is there a way to set the cache directory for these classes to /tmp, or is there a workaround for this issue?

I would appreciate any assistance you could provide.

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response

@rickyloynd-microsoft
Contributor

@sonichi

Contributor

sonichi commented Dec 29, 2023

Try: OpenAIWrapper.cache_path_root = "/tmp"

Completion is deprecated for openai>=1.
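
For example, a minimal sketch assuming cache_path_root is read as a class attribute when the wrapper builds its cache:

from autogen import OpenAIWrapper

# Redirect the cache root to Lambda's writable /tmp directory before
# any agents or wrappers are constructed.
OpenAIWrapper.cache_path_root = "/tmp"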

Collaborator

ekzhu commented Dec 29, 2023

This should be part of the llm_config, for which we should use a typed class rather than a dict whose schema the user must guess. Trying to address this in #1095.

@alexpwrd
Author

Thanks @sonichi, that resolved it.


jonmadison-amzn commented Aug 3, 2024

This is still broken for Lambda deployments; currently investigating. cache_path_root is ignored. Setting cache_seed does change the .cache subdirectory, but that still fails because the Docker file system is read-only.

It seems to be something to do with the hardcoded LEGACY_CACHE_DIR in the OpenAIWrapper that create uses. If anyone knows offhand what else to look for, let me know, as I'm still relatively new to Python.

I ended up monkey-patching Cache.disk in the handler to force its hand:

from typing import Union

from autogen import Cache


def my_disk(cache_seed: Union[str, int] = 99, cache_path_root: str = "/tmp") -> "Cache":
    """
    Create a Disk cache instance rooted in a writable directory.

    Args:
        cache_seed (Union[str, int], optional): A seed for the cache. Defaults to 99.
        cache_path_root (str, optional): The root path for the disk cache. Defaults to "/tmp".

    Returns:
        Cache: A Cache instance configured for Disk caching.
    """
    print(f"my_disk, cache_path_root: {cache_path_root}")
    # Pass the root through instead of hardcoding it, so callers can override.
    return Cache({"cache_seed": cache_seed, "cache_path_root": cache_path_root})


# Replace the original static method so every internal caller gets the writable path.
Cache.disk = staticmethod(my_disk)
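
If your autogen version exposes Cache.disk and initiate_chat accepts a cache argument, a patch-free alternative is to build the disk cache explicitly with a writable root and pass it in (a sketch; user_proxy and assistant stand in for agents created elsewhere):

from autogen import Cache

# Build the disk cache explicitly under /tmp instead of relying on the
# default .cache directory, then pass it to the chat call.
with Cache.disk(cache_seed=42, cache_path_root="/tmp") as cache:
    user_proxy.initiate_chat(assistant, message="...", cache=cache)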

@rritam94

I'm having the same issue even with the provided solutions. Has anyone been able to fix it?
