
Fix response parsing to make gemini tool use work #1591

Merged (9 commits) on Dec 18, 2024

Conversation

@dirkbrnd (Contributor) commented Dec 17, 2024

Description

The playground's run-response parsing could not handle messages whose tool_calls contained non-JSON-serializable objects returned by Gemini. This PR resolves that.

Fixes #1509
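The shape of such a fix can be sketched as a JSON fallback encoder. This is a hypothetical illustration, not the actual patch in this PR; `RepeatedLike` is a stand-in for protobuf's `RepeatedComposite` container:

```python
import json

class RepeatedLike:
    """Stand-in for protobuf's RepeatedComposite: iterable, but rejected
    by json.dumps out of the box."""
    def __init__(self, items):
        self._items = list(items)

    def __iter__(self):
        return iter(self._items)

def to_jsonable(obj):
    """Coerce unknown iterable containers to plain lists; stringify the rest."""
    if obj is None or isinstance(obj, (str, int, float, bool)):
        return obj
    try:
        return [to_jsonable(item) for item in obj]
    except TypeError:
        return str(obj)

tool_call = {"function": {"name": "get_stock_price", "arguments": RepeatedLike(["NVDA"])}}
print(json.dumps(tool_call, default=to_jsonable))
# → {"function": {"name": "get_stock_price", "arguments": ["NVDA"]}}
```

Passing the fallback via `default=` means only objects `json` cannot already encode are touched, so ordinary strings and numbers pass through unchanged.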

Type of change

Please check the options that are relevant:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Model update
  • Infrastructure change

Checklist

  • My code follows Phidata's style guidelines and best practices
  • I have performed a self-review of my code
  • I have added docstrings and comments for complex logic
  • My changes generate no new warnings or errors
  • I have added cookbook examples for my new addition (if needed)
  • I have updated requirements.txt/pyproject.toml (if needed)
  • I have verified my changes in a clean environment

@samyogdhital left a comment

Please check these comments before merging the branch.

Comment on lines +6 to +11
finance_agent = Agent(
    name="Finance Agent",
    model=Gemini(id="gemini-2.0-flash-exp"),
    tools=[YFinanceTools(stock_price=True)],
    debug_mode=True,
)


Add these two flags:

  • add_chat_history_to_messages=True
  • num_history_responses=3

and see if you get an error.

Persistent memory is not working.


Please test with this sample config:

master = Agent(
    name="Master",
    model=model,
    agent_id="master",
    team=[web_agent],
    storage=SqlAgentStorage(table_name="master_sessions", db_file="tmp/agents.db"),
    markdown=True,
    add_chat_history_to_messages=True,
    num_history_responses=3
)


Check this error, please. (screenshot attached)


@samyogdhital @manthanguptaa It seems that persistence still cannot save messages related to tool use in Gemini.
Is a fix on the way?
If not, I can try to fix it after the New Year holidays.

DEBUG    ============== user ==============                                                                                                                                
DEBUG    how many messages did i send?                                                                                                                                     
DEBUG    ============== model ==============                                                                                                                               
DEBUG    Tool Calls: [                                                                                                                                                     
           {                                                                                                                                                               
             "type": "function",                                                                                                                                           
             "function": {                                                                                                                                                 
               "name": "get_chat_history",                                                                                                                                 
               "arguments": "{}"                                                                                                                                           
             }                                                                                                                                                             
           }                                                                                                                                                               
         ]                                                                                                                                                                 
DEBUG    ============== tool ==============                                                                                                                                
DEBUG    ['']                                                                                                                                                              
DEBUG    ============== model ==============                                                                                                                               
DEBUG    I'm sorry, I cannot fulfill this request. There is no information about how many messages you sent.                                                               
                                                                                                                                                                           
DEBUG    **************** METRICS START ****************                                                                                                                   
DEBUG    * Time to first token:         1.4130s                                                                                                                            
DEBUG    * Time to generate response:   1.5110s                                                                                                                            
DEBUG    * Tokens per second:           15.2219 tokens/s                                                                                                                   
DEBUG    * Input tokens:                1065                                                                                                                               
DEBUG    * Output tokens:               23                                                                                                                                 
DEBUG    * Total tokens:                1088                                                                                                                               
DEBUG    **************** METRICS END ******************                                                                                                                   
DEBUG    ---------- Gemini Response End ----------                                                                                                                         
DEBUG    ---------- Gemini Response End ----------                                                                                                                         
DEBUG    Added 4 Messages to AgentMemory                                                                                                                                   
DEBUG    Added AgentRun to AgentMemory                                                                                                                                     
DEBUG    Exception upserting into table: (builtins.TypeError) Object of type RepeatedComposite is not JSON serializable                                                    
         [SQL: INSERT INTO agent_sessions (session_id, agent_id, user_id, memory, agent_data, user_data, session_data, created_at) VALUES (?, ?, ?, ?, ?, ?, ?, ?) ON      
         CONFLICT (session_id) DO UPDATE SET agent_id = ?, user_id = ?, memory = ?, agent_data = ?, user_data = ?, session_data = ?, updated_at = ?]                       
         [parameters: [{}]]                                                                                                                                                
DEBUG    Checking if table exists: agent_sessions                                                                                                                          
DEBUG    *********** Agent Run End: df9761df-1868-4b4d-afc4-b7305132d7b8 ***********                                                                                       

Exception upserting into table: (builtins.TypeError) Object of type RepeatedComposite is not JSON serializable
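One way to avoid this class of failure is to sanitize the session payload recursively before the storage upsert. A minimal sketch, assuming a stand-in `Repeated` class for the protobuf container (this is not phidata's actual code):

```python
import json

class Repeated:
    """Stand-in for a protobuf repeated field nested inside AgentMemory."""
    def __init__(self, items):
        self._items = list(items)

    def __iter__(self):
        return iter(self._items)

def sanitize(value):
    """Recursively convert a session payload into JSON-safe primitives."""
    if value is None or isinstance(value, (str, int, float, bool)):
        return value
    if isinstance(value, dict):
        return {key: sanitize(val) for key, val in value.items()}
    try:
        return [sanitize(item) for item in value]  # lists, tuples, Repeated, ...
    except TypeError:
        return str(value)  # last resort: store a readable representation

memory = {"runs": [{"tool_args": Repeated([1, 2])}]}
print(json.dumps(sanitize(memory)))
# → {"runs": [{"tool_args": [1, 2]}]}
```

Sanitizing once, at the storage boundary, means every backend (SQLite, Postgres) receives plain dicts and lists regardless of which model produced the messages.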

debug_mode=True,
)

app = Playground(agents=[finance_agent]).get_app(use_async=False)
@samyogdhital commented Dec 18, 2024

Can we set use_async=True and still have Gemini work correctly?
OpenAI, Groq, and other models work without turning off use_async.

Contributor

@samyogdhital Gemini doesn't support async right now. It's on the roadmap, but until then you have to use use_async=False.

@manthanguptaa (Contributor) commented:

@dirkbrnd I tried the PR against what @samyogdhital raised with the following agent config and it threw an error

from phi.agent import Agent
from phi.storage.agent.postgres import PgAgentStorage
from phi.tools.yfinance import YFinanceTools
from phi.playground import Playground, serve_playground_app
from phi.model.google import Gemini

finance_agent = Agent(
    name="Finance Agent",
    model=Gemini(id="gemini-2.0-flash-exp"),
    tools=[YFinanceTools(stock_price=True)],
    storage=PgAgentStorage(table_name="agent_sessions", db_url="postgresql+psycopg://ai:ai@localhost:5532/ai"),
    add_history_to_messages=True,
    debug_mode=True,
)

app = Playground(agents=[finance_agent]).get_app(use_async=False)

if __name__ == "__main__":
    serve_playground_app("gemini_agents:app", reload=True)

Error

Traceback (most recent call last):
  File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/responses.py", line 259, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/responses.py", line 255, in wrap
    await func()
  File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/responses.py", line 232, in listen_for_disconnect
    message = await receive()
  File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/uvicorn/protocols/http/httptools_impl.py", line 555, in receive
    await self.message_event.wait()
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/asyncio/locks.py", line 226, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 16c64d400

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/applications.py", line 113, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/middleware/errors.py", line 187, in __call__
  |     raise exc
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/middleware/errors.py", line 165, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/middleware/cors.py", line 93, in __call__
  |     await self.simple_response(scope, receive, send, request_headers=headers)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/middleware/cors.py", line 144, in simple_response
  |     await self.app(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     raise exc
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/routing.py", line 715, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/routing.py", line 735, in app
  |     await route.handle(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/routing.py", line 288, in handle
  |     await self.app(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/routing.py", line 76, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     raise exc
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/routing.py", line 74, in app
  |     await response(scope, receive, send)
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/responses.py", line 259, in __call__
  |     await wrap(partial(self.listen_for_disconnect, receive))
  |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 763, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/responses.py", line 255, in wrap
    |     await func()
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/responses.py", line 244, in stream_response
    |     async for chunk in self.body_iterator:
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/concurrency.py", line 62, in iterate_in_threadpool
    |     yield await anyio.to_thread.run_sync(_next, as_iterator)
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/anyio/to_thread.py", line 56, in run_sync
    |     return await get_async_backend().run_sync_in_worker_thread(
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
    |     return await future
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 943, in run
    |     result = context.run(func, *args)
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/starlette/concurrency.py", line 51, in _next
    |     return next(iterator)
    |   File "/Users/manthangupta/Desktop/lab/phidata/phi/playground/router.py", line 93, in chat_response_streamer
    |     for run_response_chunk in run_response:
    |   File "/Users/manthangupta/Desktop/lab/phidata/phi/agent/agent.py", line 1794, in _run
    |     for model_response_chunk in self.model.response_stream(messages=messages_for_model):
    |   File "/Users/manthangupta/Desktop/lab/phidata/phi/model/google/gemini.py", line 716, in response_stream
    |     for response in self.invoke_stream(messages=messages):
    |   File "/Users/manthangupta/Desktop/lab/phidata/phi/model/google/gemini.py", line 437, in invoke_stream
    |     contents=self.format_messages(messages),
    |   File "/Users/manthangupta/Desktop/lab/phidata/phi/model/google/gemini.py", line 159, in format_messages
    |     message_parts = message.parts  # type: ignore
    |   File "/Users/manthangupta/Desktop/lab/phidata/phienv/lib/python3.9/site-packages/pydantic/main.py", line 892, in __getattr__
    |     raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
    | AttributeError: 'Message' object has no attribute 'parts'
    +------------------------------------
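The traceback points at `format_messages` assuming every `Message` exposes a `parts` attribute, which history messages rehydrated from storage don't carry. A defensive lookup along these lines avoids the crash (a hypothetical sketch with a simplified `Message` class, not the phi internals):

```python
from dataclasses import dataclass
from typing import Any, List, Optional

@dataclass
class Message:
    role: str
    content: Optional[str] = None
    parts: Optional[List[Any]] = None  # only set on Gemini-native messages

def extract_parts(message: Message) -> List[Any]:
    """Prefer Gemini's parts; fall back to wrapping plain text content."""
    parts = getattr(message, "parts", None)
    if parts:
        return list(parts)
    # History messages loaded from storage have no parts structure,
    # so rebuild one from the plain-text content instead of crashing.
    return [message.content] if message.content is not None else []

print(extract_parts(Message(role="user", content="how many messages did i send?")))
# → ['how many messages did i send?']
```

The same guard works for pydantic models, where accessing an undeclared attribute raises `AttributeError` exactly as in the traceback above.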

@manthanguptaa manthanguptaa merged commit a3c4bd6 into main Dec 18, 2024
1 check passed
Successfully merging this pull request may close these issues.

Error while using Gemini model
4 participants