
Langchain-Ollama: ChatOllama stopped responding and AgentExecutor only performs actions #28281

Open · 5 tasks done
miguelg719 opened this issue Nov 22, 2024 · 19 comments

Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@miguelg719
Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Main Example on GitHub

from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.2")
llm.invoke("Sing a ballad of LangChain.") 

Error Message and Stack Trace (if applicable)

TypeError: 'NoneType' object is not iterable

Description

Tested with multiple people: the new version of the Ollama package appears to have changed its output format, and ChatOllama can no longer return any text result.
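The failure mode can be reproduced in isolation. The sketch below is illustrative only (the dict shapes mirror the behavior reported in this thread, not code copied from the ollama or langchain_ollama sources): before ollama 0.4.0 a response with no tool calls simply omitted the "tool_calls" key, while 0.4.0 includes the key with the value None, so a membership test passes and iterating the value raises the TypeError shown above.

```python
# Hypothetical response shapes -- illustrative, not from the ollama source.
old_response = {"message": {"content": "Hello!"}}                      # pre-0.4.0: key absent
new_response = {"message": {"content": "Hello!", "tool_calls": None}}  # 0.4.0: key present, None


def get_tool_calls(response):
    """Membership-only check -- the pattern that breaks on ollama 0.4.0."""
    tool_calls = []
    if "tool_calls" in response["message"]:
        # Passes for new_response, then iterating None raises TypeError.
        for tc in response["message"]["tool_calls"]:
            tool_calls.append(tc)
    return tool_calls


print(get_tool_calls(old_response))  # [] -- key absent, loop skipped
try:
    get_tool_calls(new_response)
except TypeError as e:
    print(e)  # 'NoneType' object is not iterable
```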

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.5.0: Wed May 1 20:14:38 PDT 2024; root:xnu-10063.121.3~5/RELEASE_ARM64_T6020
Python Version: 3.11.3 (main, Apr 19 2023, 18:49:55) [Clang 14.0.6 ]

Package Information

langchain_core: 0.3.19
langchain: 0.1.17
langchain_community: 0.0.37
langsmith: 0.1.144
langchain_experimental: 0.0.57
langchain_ollama: 0.2.0
langchain_openai: 0.1.6
langchain_text_splitters: 0.0.1

Optional packages not installed

langgraph
langserve

Other Dependencies

aiohttp: 3.9.3
aiosqlite: 0.18.0
aleph-alpha-client: Installed. No version info available.
anthropic: Installed. No version info available.
arxiv: Installed. No version info available.
assemblyai: Installed. No version info available.
async-timeout: 4.0.2
atlassian-python-api: Installed. No version info available.
azure-ai-documentintelligence: Installed. No version info available.
azure-ai-formrecognizer: Installed. No version info available.
azure-ai-textanalytics: Installed. No version info available.
azure-cognitiveservices-speech: Installed. No version info available.
azure-core: Installed. No version info available.
azure-cosmos: Installed. No version info available.
azure-identity: Installed. No version info available.
azure-search-documents: Installed. No version info available.
beautifulsoup4: 4.12.2
bibtexparser: Installed. No version info available.
cassio: Installed. No version info available.
chardet: 4.0.0
clarifai: Installed. No version info available.
cloudpickle: 2.2.1
cohere: Installed. No version info available.
couchbase: Installed. No version info available.
dashvector: Installed. No version info available.
databricks-vectorsearch: Installed. No version info available.
dataclasses-json: 0.6.5
datasets: 2.14.6
dgml-utils: Installed. No version info available.
docarray[hnswlib]: Installed. No version info available.
elasticsearch: Installed. No version info available.
esprima: Installed. No version info available.
faiss-cpu: 1.8.0
faker: Installed. No version info available.
feedparser: Installed. No version info available.
fireworks-ai: Installed. No version info available.
friendli-client: Installed. No version info available.
geopandas: Installed. No version info available.
gitpython: 3.1.43
google-cloud-documentai: Installed. No version info available.
gql: Installed. No version info available.
gradientai: Installed. No version info available.
hdbcli: Installed. No version info available.
hologres-vector: Installed. No version info available.
html2text: Installed. No version info available.
httpx: 0.27.0
httpx-sse: Installed. No version info available.
huggingface_hub: 0.26.2
javelin-sdk: Installed. No version info available.
jinja2: 3.1.3
jq: Installed. No version info available.
jsonpatch: 1.33
jsonschema: 4.17.3
lxml: 4.9.2
manifest-ml: Installed. No version info available.
markdownify: Installed. No version info available.
motor: Installed. No version info available.
msal: Installed. No version info available.
mwparserfromhell: Installed. No version info available.
mwxml: Installed. No version info available.
newspaper3k: Installed. No version info available.
nlpcloud: Installed. No version info available.
numexpr: 2.8.4
numpy: 1.24.3
nvidia-riva-client: Installed. No version info available.
oci: Installed. No version info available.
ollama: 0.4.0
openai: 1.26.0
openapi-pydantic: Installed. No version info available.
openlm: Installed. No version info available.
oracle-ads: Installed. No version info available.
oracledb: Installed. No version info available.
orjson: 3.9.15
packaging: 23.2
pandas: 1.5.3
pdfminer-six: 20231228
pgvector: Installed. No version info available.
praw: Installed. No version info available.
premai: Installed. No version info available.
presidio-analyzer: Installed. No version info available.
presidio-anonymizer: Installed. No version info available.
psychicapi: Installed. No version info available.
py-trello: Installed. No version info available.
pydantic: 2.10.1
pyjwt: 2.8.0
pymupdf: Installed. No version info available.
pypdf: 4.2.0
pypdfium2: 4.30.0
pyspark: Installed. No version info available.
PyYAML: 6.0.1
qdrant-client: Installed. No version info available.
rank-bm25: Installed. No version info available.
rapidfuzz: 3.9.0
rapidocr-onnxruntime: Installed. No version info available.
rdflib: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
rspace_client: Installed. No version info available.
scikit-learn: 1.2.2
sentence-transformers: 3.2.1
SQLAlchemy: 1.4.39
sqlite-vss: Installed. No version info available.
streamlit: Installed. No version info available.
sympy: 1.13.1
tabulate: 0.9.0
telethon: Installed. No version info available.
tenacity: 8.2.2
tidb-vector: Installed. No version info available.
tiktoken: 0.6.0
timescale-vector: Installed. No version info available.
torch: 2.5.1
tqdm: 4.67.0
transformers: 4.46.2
tree-sitter: Installed. No version info available.
tree-sitter-languages: Installed. No version info available.
typer: 0.13.0
typing-extensions: 4.12.2
upstash-redis: Installed. No version info available.
vdms: Installed. No version info available.
vowpal-wabbit-next: Installed. No version info available.
xata: Installed. No version info available.
xmltodict: Installed. No version info available.

@miguelg719 miguelg719 changed the title Langchain Ollama Langchain-Ollama: ChatOllama stopped responding and AgentExecutor only performs actions Nov 22, 2024
@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Nov 22, 2024
rtuin commented Nov 22, 2024

I'm also getting this error using qwen2.5-coder:7b model. Perhaps it helps to share the stack trace:

Traceback (most recent call last):
  [...redacted stack trace...]
    response = llm.invoke(messages)
               ^^^^^^^^^^^^^^^^^^^^
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 286, in invoke
    self.generate_prompt(
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 786, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 643, in generate
    raise e
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 633, in generate
    self._generate_with_cache(
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 644, in _generate
    final_chunk = self._chat_stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 558, in _chat_stream_with_aggregation
    tool_calls=_get_tool_calls_from_response(stream_resp),
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my_anonimized_project_dir/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 70, in _get_tool_calls_from_response
    for tc in response["message"]["tool_calls"]:
TypeError: 'NoneType' object is not iterable

Edit: mentioned that I use a different model.

@Ruslando

Same thing happening with Llama3.1
@Fernando7181

I was having this problem using llama3, but once I switched to llama3.1 everything worked fine using the base model.

@AlbertoFormaggio1

@Fernando7181 can it be the case that llama3.1 was already downloaded in your system, while llama3 was freshly downloaded after the update?
I am having this problem with every model I am using (all of them pulled today from ollama)

@pythongirl325

I believe the Ollama 0.4.0 update changed how the tool call API works, it now returns None instead of having no tool_call key on the response message (https://github.com/ollama/ollama-python/blob/main/ollama/_types.py#L220)

https://github.com/langchain-ai/langchain/blob/master/libs/partners/ollama/langchain_ollama/chat_models.py#L69

Here the condition should be response["message"]["tool_calls"] is not None instead of "tool_calls" in response["message"]

@miguelg719
Author

@pythongirl325 great lead, attaching log to support this issue. Nonetheless, it's interesting that in my case it's able to select a tool but not generate a text output response, even when taking out tools and only using a simple ChatOllama call

ERROR:backend.main:Error testing Ollama: 'NoneType' object is not iterable
Traceback (most recent call last):
  File "/app/backend/main.py", line 58, in test_ollama
    response = await ollama_chat_completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/agent/services.py", line 29, in ollama_chat_completion
    response = await llm.ainvoke(messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 307, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 796, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 756, in agenerate
    raise exceptions[0]
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 924, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 731, in _agenerate
    final_chunk = await self._achat_stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 601, in _achat_stream_with_aggregation
    tool_calls=_get_tool_calls_from_response(stream_resp),
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 70, in _get_tool_calls_from_response
    for tc in response["message"]["tool_calls"]:
TypeError: 'NoneType' object is not iterable

@Fernando7181

@Fernando7181 can it be the case that llama3.1 was already downloaded in your system, while llama3 was freshly downloaded after the update? I am having this problem with every model I am using (all of them pulled today from ollama)

I don't think so, because I downloaded it not that long ago. I'm using it for my vector database and RAG system and it seems to be working just fine. I know that when I was using llama3 it wasn't working.

@rrajakaec

How can this issue be resolved? Do we need to re-download the llama3.2 model, switch to ChatOpenAI, or wait until the issue is fixed by the maintainers?

From
llm = ChatOllama(model='llama3.2', temperature=0)
to
llm = ChatOpenAI(model="llama3.2", api_key="ollama", base_url="http://localhost:11434", temperature=0)

pythongirl325 commented Nov 22, 2024

@pythongirl325 great lead, attaching log to support this issue. Nonetheless, it's interesting that in my case it's able to select a tool but not generate a text output response, even when taking out tools and only using a simple ChatOllama call

I experienced this without any tools as well. I wanted to try and switch from using the ollama api directly to using the langchain library.

Here's the code I ran to get the issue:

import langchain
import langchain_ollama
from langchain_core.messages import HumanMessage, SystemMessage


model = langchain_ollama.ChatOllama(
    model="hermes3:8b"
)

messages = [
    SystemMessage(content="Translate the following from English to Italian."),
    HumanMessage(content="How are you?")
]

model.invoke(messages)

My stack trace looks pretty much like yours.

I have not used langchain before, so I might be doing something wrong here.

rtuin commented Nov 22, 2024

@edmcman made a fix for this here: #28291

edmcman commented Nov 22, 2024

As a work-around, you can pip install 'ollama<0.4.0'

rrajakaec commented Nov 22, 2024

As a work-around, you can pip install 'ollama<0.4.0'

After downgrading the ollama-0.4.0 to ollama-0.3.3, the issue got resolved.

@lsukharn

pip install 'ollama<0.4.0' works for me. Thanks @edmcman

Fernando7181 commented Nov 22, 2024

I experienced this without any tools as well. I wanted to try and switch from using the ollama api directly to using the langchain library. [...]

This is how I'm doing mine:

def ask(query: str):
    chain = rag_chain()
    result = chain["run"]({"input": query})
    print(result)

ask("What is 2 + 2?")

@hgudella

I believe the Ollama 0.4.0 update changed how the tool call API works, it now returns None instead of having no tool_call key on the response message (https://github.com/ollama/ollama-python/blob/main/ollama/_types.py#L220)

https://github.com/langchain-ai/langchain/blob/master/libs/partners/ollama/langchain_ollama/chat_models.py#L69

Here the condition should be response["message"]["tool_calls"] is not None instead of "tool_calls" in response["message"]

This is the actual fix. Can we please create new version and publish with fix? Thank You!

@Fernando7181

I agree. I think response["message"]["tool_calls"] is not None should be the right fix, and we could close this issue.

edmcman commented Nov 23, 2024

That is the fix here: #28291

jmorganca (Contributor)

Hi all, this is also fixed in the ollama package from version 0.4.1 onwards – so sorry about that: https://github.com/ollama/ollama-python/releases/tag/v0.4.1

pip install -U ollama
