[Bug]: ValidationError occurs when running branch gemini #1139
Comments
Thanks for the details! It seems you are using a different pydantic version. Can you try:
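(The exact command suggested here was collapsed out of this capture. As supporting context, a quick stdlib check of which pydantic major version is installed can confirm a mismatch; which major version the gemini branch expects is an assumption, not stated in this thread.)

```python
# Check the installed pydantic major version without importing pydantic itself;
# a v1-vs-v2 mismatch is a common source of pydantic_core ValidationErrors.
from importlib.metadata import PackageNotFoundError, version

def pydantic_major():
    """Return the installed pydantic major version, or None if not installed."""
    try:
        return int(version("pydantic").split(".")[0])
    except PackageNotFoundError:
        return None

print(pydantic_major())
```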
Looks like it helps with the exception in the pydantic_core package, but here are two more bugs. First case: exceeding MAX_TOKENS raises
Looking at the block below, it seems like you have the "human_input_mode" parameter wrong.
Also, the code and the log you provided do not match. For instance, the agents' names in the code do not match those in the output log. One possibility is that the code was changed after the same question was cached, and you can try to run it again with a fresh cache.
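For reference, AutoGen 0.2 caches LLM replies on disk keyed by the seed, so a reply cached before a code change can make logs disagree with the current code. A minimal sketch of clearing such a cache (the `.cache` directory name is an assumed default, not confirmed by this thread):

```python
# Sketch: remove a local on-disk LLM reply cache so the next run gets fresh
# responses. The ".cache" directory name is an assumption, not verified here.
import shutil
from pathlib import Path

def clear_llm_cache(cache_dir=".cache"):
    """Delete the cache directory if present; return True once it is gone."""
    path = Path(cache_dir)
    if path.exists():
        shutil.rmtree(path)
    return not path.exists()
```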
Has anyone resolved the error?
Please fix the code. I am still getting the error. Installation (this time running in Google Colab):

```shell
!pip install https://github.com/microsoft/autogen/archive/gemini.zip
!pip install "google-generativeai" "pydash" "pillow"
!pip install git+https://github.com/microsoft/autogen.git@gemini
```
Source code:

```python
from google.colab import userdata

from autogen import UserProxyAgent, ConversableAgent, config_list_from_json, AssistantAgent
from autogen.code_utils import content_str  # needed by is_termination_msg below

config_list_gemini = [{
    "model": "gemini-pro",
    "api_key": userdata.get('GOOGLE_AI_API_KEY'),
    "api_type": "google"
}]

def first():
    assistant = AssistantAgent(
        "assistant",
        llm_config={"config_list": config_list_gemini, "seed": 42},
        max_consecutive_auto_reply=13)
    user_proxy = UserProxyAgent(
        "user_proxy",
        code_execution_config={"work_dir": "coding", "use_docker": False},
        human_input_mode="NEVER",
        is_termination_msg=lambda x: content_str(x.get("content")).find("TERMINATE") >= 0)
    user_proxy.initiate_chat(assistant, message="Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]")

if __name__ == "__main__":
    first()
```

Traceback:

```
user_proxy (to assistant):

Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]

--------------------------------------------------------------------------------
---------------------------------------------------------------------------
StopCandidateException                    Traceback (most recent call last)
<ipython-input-2-fd2ce89ef12a> in <cell line: 37>()
     36
     37 if __name__ == "__main__":
---> 38 first()
     39 second()

8 frames
/usr/local/lib/python3.10/dist-packages/google/generativeai/generative_models.py in send_message(self, content, generation_config, safety_settings, stream, **kwargs)
    382                 glm.Candidate.FinishReason.MAX_TOKENS,
    383             ):
--> 384                 raise generation_types.StopCandidateException(response.candidates[0])
    385
    386         self._last_sent = content

StopCandidateException: finish_reason: RECITATION
index: 0
```

With the same settings I caught a different exception.
@OTR Thanks for contributing these examples and logs! Code updated. Please try again.
You hit the RECITATION stop reason. The problem is triggered by the fact that "Write a program in python that Sort the array with Bubble Sort: [4, 1, 3, 2]" is a really common problem: whatever answer it generates, there is similar data in its training data, and Gemini will then refuse to give it to you. Gemini does not want to plagiarize.
Is there any workaround for the "RECITATION" issue?
@OTR @rakotomandimby @naourass Thanks for all your interest in using AutoGen for Gemini. Do you have any suggestions regarding the "RECITATION" problem for Gemini? As pointed out by @rakotomandimby, there are already lots of complaints about the Gemini API (not from AutoGen, but from other usages). If a try/except cannot resolve this issue elegantly, how do we proceed and nudge Gemini into giving us a response (from a prompt-engineering perspective)?
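One pragmatic pattern, sketched below under the assumption that the RECITATION failure surfaces as a catchable exception: retry the request while rewording the prompt so the model is less likely to reproduce training data verbatim. The `StopCandidateException` here is a local stand-in class, not an import from google.generativeai, and the rephrasing suffix is only an illustration.

```python
# Prompt-engineering workaround sketch for RECITATION refusals: retry with a
# reworded prompt. StopCandidateException is a stand-in for the real
# google.generativeai exception so the sketch stays self-contained.
class StopCandidateException(Exception):
    pass

def ask_with_rephrase(send, prompt, max_attempts=3):
    """Call send(prompt); on a recitation-style refusal, nudge the prompt
    toward an original answer and retry up to max_attempts times."""
    last_error = None
    for _ in range(max_attempts):
        try:
            return send(prompt)
        except StopCandidateException as exc:
            last_error = exc
            # Assumed nudge: asking for original names and comments makes a
            # verbatim training-data match (and thus RECITATION) less likely.
            prompt += " Use original variable names and add your own comments."
    raise RuntimeError("model kept refusing with RECITATION") from last_error
```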
@BeibinLi Could you please provide a working example of using autogen? Your last commit 9bc4d4e points to a non-existing page; such a notebook doesn't exist. When I attempt to launch my sample code, which previously launched without errors, I encounter an installation error:

```shell
!pip install https://github.com/microsoft/autogen/archive/gemini.zip
!pip install "google-generativeai" "pydash" "pillow"
!pip install git+https://github.com/microsoft/autogen.git@gemini
```

Traceback: https://gist.github.com/OTR/2c175ef404955dfca68bce57d5727e0e
@OTR Thanks for pointing that out. The notebook is at: https://github.com/microsoft/autogen/blob/gemini/notebook/agentchat_gemini.ipynb For the installation bug, it seems like a `rm -rf /tmp/pip-req-build-spmrq54h` should clear the stale pip build directory. Thanks!!!
(Referenced by PR microsoft#1142, "simplify the initiation of chat", whose commit log includes "close microsoft#1139".)
Closing due to inactivity.
Describe the bug

First case scenario

Given:
- Environment: my local Windows 10 machine
- Python version: Python 3.11.7 (tags/v3.11.7:fa7a6f2, Dec 4 2023, 19:24:49) [MSC v.1937 64 bit (AMD64)] on win32
- autogen: version 0.2.2
- branch: gemini

When:
I tried to run the first function from the samples listed below on my local Windows machine.

Then:
I got a StopCandidateException. (I rarely get that exception; it is a flaky bug.) When I then tried to reproduce the exception above, I got another exception. (See step 5 of the steps to reproduce.)

Second case scenario

I tried to reproduce the mentioned bug on the GitHub Codespaces platform.

Given:
- Environment: running on GitHub Codespaces
- Python version:
- autogen: version 0.2.2
- branch: gemini
- commit #: c6792a8

When:
I tried to run the second function from the samples with the gemini branch and my Google Gen AI API key.

Then:
I got a ValidationError in the pydantic_core package.

Steps to reproduce
Step 1
At first I tried to install the autogen package from the gemini branch with the following commands (python3.10 by default):

Step 2
And I got the exception from the second case scenario.

Step 3
Then I tried to create an isolated environment with poetry and python3.11 and installed autogen with the following commands:

Step 4
And I got the exception from the second case scenario in the pydantic_core package.

Step 5
Then I thought maybe to get rid of the hard-coded version for the last installed package (0.2.0b4) and tried the following commands on my local machine within the isolated poetry environment:

Step 6
Run just the first function within the main block; comment out the second function call.

Step 7
And I got a StopCandidateException (first case scenario). But it is a flaky bug, and I managed to trigger it only once.

Expected Behavior
Agents should start communicating.
Screenshots and logs
Full traceback for the first case scenario:
Full traceback for the second case scenario:
Additional Information
Code samples I tried to run:
Packages installed for the first case ($ pip freeze):
Packages installed for the second case ($ pip freeze):