Langchain-Ollama: ChatOllama stopped responding and AgentExecutor only performs actions #28281
Comments
I'm also getting this error using
Edit: mention that I use a different model. |
Same thing happening with Llama3.1 |
I was having this problem using llama3, but once I switched to llama3.1 everything is working fine, and I'm using base_model. |
@Fernando7181 can it be the case that llama3.1 was already downloaded in your system, while llama3 was freshly downloaded after the update? |
I believe the Ollama 0.4.0 update changed how the tool call API works: it now returns `None` for `tool_calls` when no tool is called. Here the condition should check for a `None` value rather than iterating over it directly. |
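A minimal sketch of that kind of guard (the field names and message shape here are assumptions based on the `'NoneType' object is not iterable` error in this thread, not the actual langchain-ollama code):

```python
def extract_tool_calls(message: dict) -> list[str]:
    """Return tool-call names, tolerating a missing OR explicitly-None field."""
    tool_calls = message.get("tool_calls")  # newer clients may set this to None
    if not tool_calls:  # covers None, [], and a missing key
        return []
    return [call["function"]["name"] for call in tool_calls]

# Both shapes should be handled without raising TypeError:
print(extract_tool_calls({"role": "assistant", "content": "ciao"}))                  # []
print(extract_tool_calls({"role": "assistant", "content": "", "tool_calls": None}))  # []
```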
@pythongirl325 great lead; attaching a log to support this issue. Nonetheless, it's interesting that in my case it's able to select a tool but not generate a text output response, even when taking out tools and only using a simple ChatOllama call:
ERROR:backend.main:Error testing Ollama: 'NoneType' object is not iterable |
I don't think so, because I downloaded it not that long ago. I'm using it for my vector database and RAG system, and it seems to be working just fine. I know that when I was using llama3 it wasn't working. |
How do we resolve this issue? Do we need to re-download the llama3.2 model, or do we need to switch to ChatOpenAI? From |
I experienced this without any tools as well. I wanted to try to switch from using the ollama API directly to using the langchain library. Here's the code I ran to get the issue:

```python
import langchain
import langchain_ollama
from langchain_core.messages import HumanMessage, SystemMessage

model = langchain_ollama.ChatOllama(
    model="hermes3:8b"
)

messages = [
    SystemMessage(content="Translate the following from English to Italian."),
    HumanMessage(content="How are you?")
]

model.invoke(messages)
```

My stack trace looks pretty much like yours. I have not used langchain before, so I might be doing something wrong here. |
As a work-around, you can `pip install 'ollama<0.4.0'`. |
After downgrading from ollama-0.4.0 to ollama-0.3.3, the issue was resolved. |
`pip install 'ollama<0.4.0'` works for me. Thanks @edmcman |
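If the pin doesn't seem to take effect (for example because of multiple virtualenvs), a quick sanity check of the installed client version, as a sketch:

```python
# Confirms which ollama client version is active in the current environment.
from importlib.metadata import version

installed = version("ollama")
print(f"ollama client version: {installed}")
if not installed.startswith("0.3."):
    print("Still on a 0.4.x client; re-run the pip pin inside this environment.")
```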
This is how I'm doing mine:

```python
def ask(query: str):
    chain = rag_chain()
    result = chain["run"]({"input": query})
    print(result)

ask("What is 2 + 2?")
```
|
This is the actual fix. Can we please create a new version and publish it with the fix? Thank you! |
I agree, I think |
That is the fix here: #28291 |
Hi all, this is also fixed in the latest langchain-ollama release.
|
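Once a release containing that fix is installed, a quick way to verify both plain chat and tool calling is a sketch along these lines (it assumes a local Ollama server with a tool-capable model such as llama3.1 already pulled; the `add` tool is just a placeholder):

```python
from langchain_core.messages import HumanMessage
from langchain_core.tools import tool
from langchain_ollama import ChatOllama

@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

llm = ChatOllama(model="llama3.1")

# Plain chat: should return non-empty text instead of raising TypeError.
print(llm.invoke([HumanMessage(content="Say hello in Italian.")]).content)

# Tool calling: should populate tool_calls without breaking text-only replies.
bound = llm.bind_tools([add])
response = bound.invoke([HumanMessage(content="What is 2 + 2? Use the add tool.")])
print(response.tool_calls)
```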
Checked other resources
Example Code
Main Example on GitHub
Error Message and Stack Trace (if applicable)
TypeError: 'NoneType' object is not iterable
Description
Tested with multiple people: the new version of Ollama must have changed its output format, and ChatOllama now cannot produce any text result.
System Info
System Information
Package Information
Optional packages not installed
Other Dependencies