Describe the bug
When I run the basic test script at https://docs.gptr.dev/docs/gpt-researcher/llms/running-with-ollama#run-llm-test-script-for-gptr, it produces the following error:
Traceback (most recent call last):
  File "D:\Users\Chickens\Documents\code_projects\gpt-researcher\tests\test-ollama.py", line 14, in <module>
    llm_provider = get_llm(
        "ollama",
        ...<4 lines>...
        verify_ssl=False  # Add this line
    )
  File "D:\ProgramData\.conda_envs\gpt_researcher\Lib\site-packages\gpt_researcher\utils\llm.py", line 19, in get_llm
    return GenericLLMProvider.from_provider(llm_provider, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ProgramData\.conda_envs\gpt_researcher\Lib\site-packages\gpt_researcher\llm_provider\generic\base.py", line 77, in from_provider
    llm = ChatOllama(base_url=os.environ["OLLAMA_BASE_URL"], **kwargs)
                              ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
  File "<frozen os>", line 716, in __getitem__
KeyError: 'OLLAMA_BASE_URL'
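The KeyError arises because `from_provider` looks up the variable with `os.environ[...]`, which raises when the variable is unset. A minimal sketch of the failure mode (the fallback URL below is a hypothetical default, not something the library defines):

```python
import os

# Make sure the variable is unset, mirroring the reporter's environment.
os.environ.pop("OLLAMA_BASE_URL", None)

try:
    url = os.environ["OLLAMA_BASE_URL"]  # raises KeyError, as in the traceback
except KeyError as exc:
    print(f"KeyError: {exc}")

# A defensive lookup with a fallback would avoid the crash
# (the default URL here is an assumption for illustration):
url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
print(url)
```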
I'm running this on Windows 11, with a 3.13.1 Python interpreter in a Conda virtual environment.
To Reproduce
Steps to reproduce the behavior:
conda create -p D:\ProgramData\.conda_envs\gpt_researcher
conda activate D:\ProgramData\.conda_envs\gpt_researcher
conda install conda-forge::pip
cd D:\Users\Chickens\Documents\code_projects
git clone https://github.com/assafelovic/gpt-researcher.git
cd gpt-researcher
pip install -r .\requirements.txt
./tests/test-ollama.py
python ./tests/test-ollama.py
Expected behavior
The base_url property defined in the test script should be used to initialise the ChatOllama object.
Additional context
When I specify os.environ["OLLAMA_BASE_URL"] = OLLAMA_BASE_URL in the test script, it produces another error:
Traceback (most recent call last):
  File "D:\Users\Chickens\Documents\code_projects\gpt-researcher\tests\test-ollama.py", line 14, in <module>
    llm_provider = get_llm(
        "ollama",
        ...<4 lines>...
        verify_ssl=False  # Add this line
    )
  File "D:\ProgramData\.conda_envs\gpt_researcher\Lib\site-packages\gpt_researcher\utils\llm.py", line 19, in get_llm
    return GenericLLMProvider.from_provider(llm_provider, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ProgramData\.conda_envs\gpt_researcher\Lib\site-packages\gpt_researcher\llm_provider\generic\base.py", line 77, in from_provider
    llm = ChatOllama(base_url=os.environ["OLLAMA_BASE_URL"], **kwargs)
TypeError: langchain_ollama.chat_models.ChatOllama() got multiple values for keyword argument 'base_url'
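This second error is the classic duplicate-keyword clash: `from_provider` passes `base_url=` explicitly while the caller's kwargs also contain `base_url`. A minimal sketch with a hypothetical stand-in function (not the real ChatOllama):

```python
# Stand-in for ChatOllama, used only to reproduce the TypeError shape.
def chat_ollama(base_url, **extra):
    return base_url

# kwargs as forwarded by get_llm; it still carries base_url from the script.
kwargs = {"base_url": "http://localhost:11434", "model": "llama3"}

try:
    # Mirrors: ChatOllama(base_url=os.environ["OLLAMA_BASE_URL"], **kwargs)
    chat_ollama(base_url="http://localhost:11434", **kwargs)
except TypeError as exc:
    print(exc)  # ... got multiple values for keyword argument 'base_url'

# Dropping the duplicate key before forwarding avoids the clash:
kwargs.pop("base_url", None)
result = chat_ollama(base_url="http://localhost:11434", **kwargs)
print(result)
```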
@hayjohnny2000 does this still occur? @ElishaKay perhaps you can help with this?
Welcome @hayjohnny2000
It's possible there were breaking changes during a recent refactoring.
I tested the Ollama script in the Docs as well recently and didn't succeed in getting it working.
Would love to see a PR if you crack it
The solution is to remove the OLLAMA_BASE_URL variable from the script and from the function arguments, and use the one from the .env file instead.
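That fix can be sketched with a stand-in for `from_provider` (its behavior is inferred from the traceback: it always injects base_url from the environment, so the caller must not pass it again):

```python
import os

# Hypothetical stand-in for GenericLLMProvider.from_provider, modeled on the
# traceback: base_url always comes from the OLLAMA_BASE_URL environment variable.
def from_provider(provider, **kwargs):
    return {
        "provider": provider,
        "base_url": os.environ["OLLAMA_BASE_URL"],  # env var, never a kwarg
        **kwargs,
    }

# Set the URL only via the environment (e.g. loaded from a .env file);
# the value below is an assumed local Ollama endpoint.
os.environ["OLLAMA_BASE_URL"] = "http://localhost:11434"

# Fixed call: no base_url in kwargs, so no duplicate-keyword TypeError.
llm = from_provider("ollama", model="llama3", temperature=0)
print(llm["base_url"])
```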