bump version to 0.2.8 #1349

Merged · 4 commits · Jan 22, 2024
5 changes: 2 additions & 3 deletions .github/workflows/build.yml
@@ -39,7 +39,7 @@ jobs:
           python -m pip install --upgrade pip wheel
           pip install -e .
           python -c "import autogen"
-          pip install -e. pytest mock
+          pip install pytest mock
       - name: Set AUTOGEN_USE_DOCKER based on OS
        shell: bash
        run: |
@@ -53,8 +53,7 @@ jobs:
       - name: Coverage
        if: matrix.python-version == '3.10'
        run: |
-          pip install -e .[test]
-          pip install -e .[redis]
+          pip install -e .[test,redis]
           coverage run -a -m pytest test --ignore=test/agentchat/contrib --skip-openai
           coverage xml
       - name: Upload coverage to Codecov
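A note on the Coverage change above: `pip install -e .[test,redis]` performs one editable install and resolves both extras in a single pass, instead of re-running the editable install per extra. A minimal sketch of the kind of `extras_require` declaration this syntax relies on — the dependency lists below are illustrative assumptions; only the extra names `test` and `redis` come from the workflow:

```python
# setup.py (sketch, not the project's actual file):
# `pip install -e .[test,redis]` installs the union of these two lists
# in one editable install.
from setuptools import setup

setup(
    name="pyautogen",
    extras_require={
        "test": ["pytest", "coverage"],  # assumed contents
        "redis": ["redis"],              # assumed contents
    },
)
```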
3 changes: 3 additions & 0 deletions autogen/cache/__init__.py
@@ -0,0 +1,3 @@
+from .cache import Cache
+
+__all__ = ["Cache"]
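This new re-export is what lets the test and docs changes further down shorten their imports; both paths resolve to the same class:

```python
# Both imports resolve to the same class after this change.
from autogen.cache import Cache                  # new, shorter path
from autogen.cache.cache import Cache as _Cache  # old module path, still works

assert Cache is _Cache
```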
1 change: 0 additions & 1 deletion autogen/cache/cache.py
@@ -1,4 +1,3 @@
-import os
 from typing import Dict, Any

 from autogen.cache.cache_factory import CacheFactory
2 changes: 1 addition & 1 deletion autogen/version.py
@@ -1 +1 @@
-__version__ = "0.2.7"
+__version__ = "0.2.8"
2 changes: 1 addition & 1 deletion test/agentchat/test_cache_agent.py
@@ -5,7 +5,7 @@
 import pytest
 import autogen
 from autogen.agentchat import AssistantAgent, UserProxyAgent
-from autogen.cache.cache import Cache
+from autogen.cache import Cache

 sys.path.append(os.path.join(os.path.dirname(__file__), ".."))
 from conftest import skip_openai, skip_redis  # noqa: E402
4 changes: 3 additions & 1 deletion test/conftest.py
@@ -1,12 +1,14 @@
 import pytest

 skip_openai = False
+skip_redis = False


-# Registers command-line option '--skip-openai' via pytest hook.
+# Registers command-line option '--skip-openai' and '--skip-redis' via pytest hook.
 # When this flag is set, it indicates that tests requiring OpenAI should be skipped.
 def pytest_addoption(parser):
     parser.addoption("--skip-openai", action="store_true", help="Skip all tests that require openai")
+    parser.addoption("--skip-redis", action="store_true", help="Skip all tests that require redis")


 # pytest hook implementation extracting the '--skip-openai' command line arg and exposing it globally
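The hook body that copies these flags into the module-level globals is collapsed above. A minimal sketch of what it plausibly looks like, based on the comment's description — the function body is an assumption; only the option names appear in the diff:

```python
# Sketch (assumed): copy the CLI flags into the module-level globals that
# tests import from conftest. The real collapsed body may differ.
def pytest_configure(config):
    global skip_openai, skip_redis
    skip_openai = config.getoption("--skip-openai", False)
    skip_redis = config.getoption("--skip-redis", False)
```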
2 changes: 1 addition & 1 deletion website/docs/Use-Cases/agent_chat.md
@@ -290,7 +290,7 @@ By adopting the conversation-driven control with both programming language and n
 Since version 0.2.8, a configurable context manager allows you to easily configure LLM cache, using either DiskCache or Redis. All agents inside the context manager will use the same cache.

 ```python
-from autogen.cache.cache import Cache
+from autogen.cache import Cache

 with Cache.redis(cache_seed=42, redis_url="redis://localhost:6379/0") as cache:
     user.initiate_chat(assistant, message=coding_task, cache=cache)
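The prose mentions DiskCache as the other backend; for comparison, a sketch of the same pattern with the disk variant — assuming `Cache.disk` accepts the seed the same way the Redis constructor shown in the diff does:

```python
from autogen.cache import Cache

# Assumed disk-backed variant of the context manager shown above; `user`,
# `assistant`, and `coding_task` are the objects from the documented example.
with Cache.disk(cache_seed=42) as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```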
2 changes: 1 addition & 1 deletion website/docs/Use-Cases/enhanced_inference.md
@@ -175,7 +175,7 @@ the cache, using either DiskCache or Redis.
 All `OpenAIWrapper` created inside the context manager can use the same cache through the constructor.

 ```python
-from autogen.cache.cache import Cache
+from autogen.cache import Cache

 with Cache.redis(cache_seed=42, redis_url="redis://localhost:6379/0") as cache:
     client = OpenAIWrapper(..., cache=cache)