PEP 621 #1154

Closed: wants to merge 6 commits

34 changes: 26 additions & 8 deletions autogen/oai/client.py
```diff
@@ -287,22 +287,42 @@ def yes_or_no_filter(context, response):

     def _completions_create(self, client, params):
         completions = client.chat.completions if "messages" in params else client.completions
-        # If streaming is enabled, has messages, and does not have functions, then
-        # iterate over the chunks of the response
-        if params.get("stream", False) and "messages" in params and "functions" not in params:
+        # If streaming is enabled and has messages, then iterate over the chunks of the response.
+        if params.get("stream", False) and "messages" in params:
             response_contents = [""] * params.get("n", 1)
             finish_reasons = [""] * params.get("n", 1)
             completion_tokens = 0

             # Set the terminal text color to green
             print("\033[32m", end="")

+            # Prepare for a potential function call
+            full_function_call = None
             # Send the chat completion request to OpenAI's API and process the response in chunks
             for chunk in completions.create(**params):
                 if chunk.choices:
                     for choice in chunk.choices:
                         content = choice.delta.content
+                        function_call_chunk = choice.delta.function_call
                         finish_reasons[choice.index] = choice.finish_reason

+                        # Handle a streamed function call: accumulate the name and argument fragments
+                        if function_call_chunk:
+                            if hasattr(function_call_chunk, "name") and function_call_chunk.name:
+                                if full_function_call is None:
+                                    full_function_call = {"name": "", "arguments": ""}
+                                full_function_call["name"] += function_call_chunk.name
+                                completion_tokens += 1
+                            if hasattr(function_call_chunk, "arguments") and function_call_chunk.arguments:
+                                full_function_call["arguments"] += function_call_chunk.arguments
+                                completion_tokens += 1
+                            if not content:
+                                continue

                         # If content is present, print it to the terminal and update response variables
                         if content is not None:
                             print(content, end="", flush=True)
@@ -336,7 +356,7 @@ def _completions_create(self, client, params):
                         index=i,
                         finish_reason=finish_reasons[i],
                         message=ChatCompletionMessage(
-                            role="assistant", content=response_contents[i], function_call=None
+                            role="assistant", content=response_contents[i], function_call=full_function_call
                         ),
                         logprobs=None,
                     )
@@ -346,16 +366,14 @@ def _completions_create(self, client, params):
                         index=i,
                         finish_reason=finish_reasons[i],
                         message=ChatCompletionMessage(
-                            role="assistant", content=response_contents[i], function_call=None
+                            role="assistant", content=response_contents[i], function_call=full_function_call
                         ),
                     )

             response.choices.append(choice)
         else:
-            # If streaming is not enabled or using functions, send a regular chat completion request
-            # Functions are not supported, so ensure streaming is disabled
+            # If streaming is not enabled, send a regular chat completion request
             params = params.copy()
             params["stream"] = False
             response = completions.create(**params)
         return response
```
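The accumulation logic above can be exercised without calling the API. The sketch below is illustrative, not autogen's actual code: fake chunks built with `SimpleNamespace` stand in for OpenAI streaming chunks, and the helper mirrors how the diff merges `function_call` deltas into one `{"name", "arguments"}` dict.

```python
from types import SimpleNamespace

def accumulate_function_call(chunks):
    """Merge streamed function_call fragments into one {"name", "arguments"} dict."""
    full_function_call = None
    for chunk in chunks:
        for choice in chunk.choices:
            fragment = choice.delta.function_call
            if fragment is None:
                continue
            if full_function_call is None:
                full_function_call = {"name": "", "arguments": ""}
            if getattr(fragment, "name", None):
                full_function_call["name"] += fragment.name
            if getattr(fragment, "arguments", None):
                full_function_call["arguments"] += fragment.arguments
    return full_function_call

def make_chunk(name=None, arguments=None):
    # One simulated streaming chunk carrying a function_call delta
    delta = SimpleNamespace(content=None,
                            function_call=SimpleNamespace(name=name, arguments=arguments))
    return SimpleNamespace(choices=[SimpleNamespace(index=0, delta=delta, finish_reason=None)])

# The name typically arrives in the first delta, the arguments in later pieces.
chunks = [make_chunk(name="get_weather"),
          make_chunk(arguments='{"city": '),
          make_chunk(arguments='"Paris"}')]
print(accumulate_function_call(chunks))
# → {'name': 'get_weather', 'arguments': '{"city": "Paris"}'}
```

The merged dict is what the PR passes as `function_call` when it rebuilds the final `ChatCompletionMessage`.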
68 changes: 68 additions & 0 deletions pyproject.toml
```diff
@@ -49,3 +49,71 @@ unfixable = ["F401"]
 [tool.ruff.mccabe]
 # Unlike Flake8, default to a complexity level of 10.
 max-complexity = 10
+
+[project]
+name = "pyautogen"
+version = "0.2.2"
+description = "Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework"
+readme = "README.md"
+authors = [
+    {name = "AutoGen", email = "[email protected]"},
+]
+classifiers = [
+    "License :: OSI Approved :: MIT License",
+    "Operating System :: OS Independent",
+    "Programming Language :: Python :: 3",
+]
+requires-python = ">=3.9, <3.12"
+dependencies = [
+    "diskcache",
+    "flaml",
+    "openai~=1.3",
+    "pydantic>=1.10,<3",
+    "python-dotenv",
+    "termcolor",
+    "tiktoken",
+]
+
+[project.urls]
+Homepage = "https://github.com/microsoft/autogen"
+
+[project.optional-dependencies]
+test = [
+    "coverage>=5.3",
+    "ipykernel",
+    "nbconvert",
+    "nbformat",
+    "pre-commit",
+    "pytest-asyncio",
+    "pytest>=6.1.1",
+]
+blendsearch = [
+    "flaml[blendsearch]",
+]
+mathchat = [
+    "pydantic==1.10.9",
+    "sympy",
+    "wolframalpha",
+]
+retrievechat = [
+    "chromadb",
+    "ipython",
+    "pypdf",
+    "sentence_transformers",
+]
+teachable = [
+    "chromadb",
+]
+lmm = [
+    "pillow",
+    "replicate",
+]
+graphs = [
+    "matplotlib~=3.8.1",
+    "networkx~=3.2.1",
+]
+
+[build-system]
+requires = ["pdm-backend"]
+build-backend = "pdm.backend"
```
14 changes: 7 additions & 7 deletions website/docs/Installation.md
````diff
@@ -31,19 +31,19 @@ The following command will deactivate the current `conda` environment:
 conda deactivate
 ```

-### Option 3: poetry
+### Option 3: pdm

-Another option is with `poetry`, which is a dependency manager for Python.
+Another option is with `pdm`, which is a dependency manager for Python.

-[Poetry](https://python-poetry.org/docs/) is a tool for dependency management and packaging in Python. It allows you to declare the libraries your project depends on and it will manage (install/update) them for you. Poetry offers a lockfile to ensure repeatable installs, and can build your project for distribution.
+[PDM](https://pdm-project.org/) is a tool for dependency management and packaging in Python. It allows you to declare the libraries your project depends on and it will manage (install/update) them for you. PDM offers a lockfile to ensure repeatable installs, and can build your project for distribution.

-You can install it by following [this doc](https://python-poetry.org/docs/#installation),
+You can install it by following [this doc](https://pdm-project.org/latest/#installation),
 and then create a virtual environment as below:
 ```bash
-poetry init
-poetry shell
+pdm init
+pdm venv activate

-poetry add pyautogen
+pdm add pyautogen
 ```

 The following command will deactivate the current `poetry` environment:
````
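Whichever environment manager is chosen, the interpreter must fall inside the `requires-python = ">=3.9, <3.12"` range the PR declares in pyproject.toml. A small check (the function name is illustrative, not part of autogen):

```python
import sys

def version_supported(version_info, low=(3, 9), high=(3, 12)):
    """True when the interpreter satisfies >=3.9,<3.12, the range pyautogen declares."""
    return low <= tuple(version_info[:2]) < high

print(version_supported(sys.version_info))  # True on Python 3.9 through 3.11
print(version_supported((3, 12, 0)))        # → False
```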