
Internal Server Error when modifying or deleting knowledge base #12041

Open
5 tasks done
XM-LEES opened this issue Dec 24, 2024 · 2 comments
Labels
🐞 bug Something isn't working good first issue Good first issue for newcomers

Comments

@XM-LEES

XM-LEES commented Dec 24, 2024

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.14.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

  1. Create a knowledge base: upload documents, select Automatic chunking and the economical index mode, then save & process.
  2. Delete the knowledge base: select delete and confirm.

✔️ Expected Behavior

The knowledge base is removed successfully.

❌ Actual Behavior

All the documents are stuck in the processing queue, and deleting the knowledge base raises an Internal Server Error.


Here's a section of the log that recurs after the container starts:

worker-1      | None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
worker-1      | 2024-12-24 07:05:21,258.258 INFO [MainThread] [utils.py:148] - Note: NumExpr detected 24 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
worker-1      | 2024-12-24 07:05:21,259.259 INFO [MainThread] [utils.py:160] - NumExpr defaulting to 8 threads.
worker-1      | /app/api/.venv/lib/python3.12/site-packages/celery/platforms.py:829: SecurityWarning: You're running the worker with superuser privileges: this is
worker-1      | absolutely not recommended!
worker-1      | 
worker-1      | Please specify a different user using the --uid option.
worker-1      | 
worker-1      | User information: uid=0 euid=0 gid=0 egid=0
worker-1      | 
worker-1      |   warnings.warn(SecurityWarning(ROOT_DISCOURAGED.format(
worker-1      | Traceback (most recent call last):
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/kombu/transport/__init__.py", line 63, in resolve_transport
worker-1      |     transport = TRANSPORT_ALIASES[transport]
worker-1      |                 ~~~~~~~~~~~~~~~~~^^^^^^^^^^^
worker-1      | KeyError: ''
worker-1      | 
worker-1      | During handling of the above exception, another exception occurred:
worker-1      | 
worker-1      | Traceback (most recent call last):
worker-1      |   File "/app/api/.venv/bin/celery", line 8, in <module>
worker-1      |     sys.exit(main())
worker-1      |              ^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/__main__.py", line 15, in main
worker-1      |     sys.exit(_main())
worker-1      |              ^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/bin/celery.py", line 236, in main
worker-1      |     return celery(auto_envvar_prefix="CELERY")
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
worker-1      |     return self.main(*args, **kwargs)
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/click/core.py", line 1078, in main
worker-1      |     rv = self.invoke(ctx)
worker-1      |          ^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
worker-1      |     return _process_result(sub_ctx.command.invoke(sub_ctx))
worker-1      |                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
worker-1      |     return ctx.invoke(self.callback, **ctx.params)
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/click/core.py", line 783, in invoke
worker-1      |     return __callback(*args, **kwargs)
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/click/decorators.py", line 33, in new_func
worker-1      |     return f(get_current_context(), *args, **kwargs)
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/bin/base.py", line 135, in caller
worker-1      |     return f(ctx, *args, **kwargs)
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/bin/worker.py", line 348, in worker
worker-1      |     worker = app.Worker(
worker-1      |              ^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/worker/worker.py", line 98, in __init__
worker-1      |     self.setup_instance(**self.prepare_args(**kwargs))
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/worker/worker.py", line 119, in setup_instance
worker-1      |     self._conninfo = self.app.connection_for_read()
worker-1      |                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/app/base.py", line 821, in connection_for_read
worker-1      |     return self._connection(url or self.conf.broker_read_url, **kwargs)
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/celery/app/base.py", line 880, in _connection
worker-1      |     return self.amqp.Connection(
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/kombu/connection.py", line 201, in __init__
worker-1      |     if not get_transport_cls(transport).can_parse_url:
worker-1      |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/kombu/transport/__init__.py", line 91, in get_transport_cls
worker-1      |     _transport_cache[transport] = resolve_transport(transport)
worker-1      |                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
worker-1      |   File "/app/api/.venv/lib/python3.12/site-packages/kombu/transport/__init__.py", line 72, in resolve_transport
worker-1      |     raise KeyError(f'No such transport: {transport}')
worker-1      | KeyError: 'No such transport: '
worker-1      | Running migrations
@dosubot dosubot bot added the 🐞 bug Something isn't working label Dec 24, 2024

dosubot bot commented Dec 24, 2024

Hey, @XM-LEES! I'm here to help you with any bugs or questions you have. Let's work together to solve the issue you're facing.

I found a similar unsolved discussion regarding a document being stuck in the queue with an error message indicating that none of PyTorch, TensorFlow >= 2.0, or Flax have been found [1]. This might be related to the issue you're experiencing.

For the "Internal Server Error" when modifying or deleting a knowledge base in Dify, here are some potential causes and solutions:

  1. Missing ML Frameworks: The log indicates that none of PyTorch, TensorFlow, or Flax are found. Ensure these frameworks are installed in your Docker environment. You can add them to your Dockerfile or requirements.txt and rebuild your Docker images [2].

  2. Celery Configuration: The error log shows a KeyError related to "No such transport." This suggests an issue with the Celery broker URL configuration. Verify that the CELERY_BROKER_URL is correctly set in your environment configuration. The format should be appropriate for your setup, whether using a direct connection or Sentinel mode [3].

  3. Celery Worker: Ensure that the Celery worker service is running properly, as it handles tasks related to the knowledge base. Check the worker logs for any errors and confirm that all necessary services are up and running [4].

By addressing these areas, you may resolve the internal server error and document processing issues.
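To make point 2 concrete: the `KeyError: 'No such transport: '` at the bottom of the log means the broker URL's scheme was empty, i.e. `CELERY_BROKER_URL` resolved to an empty string when the worker started. A minimal sketch of an explicitly set value in the docker `.env` (host, port, password, and database number here are assumptions based on a typical Redis-backed docker compose setup; check your own deployment's defaults):

```shell
# Assumed example values; verify against docker/.env.example in your Dify checkout.
# An empty CELERY_BROKER_URL is what produces KeyError: 'No such transport: '.
CELERY_BROKER_URL=redis://:difyai123456@redis:6379/1
```

After editing `.env`, recreate the containers so the worker picks up the new value.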

To continue talking to Dosu, mention @dosu.



@crazywoola crazywoola added the good first issue Good first issue for newcomers label Dec 24, 2024
@XM-LEES

XM-LEES commented Dec 24, 2024

I rolled back to commit 967eb81 (tag: 0.14.0), and it works well.
So if you are in a hurry, maybe you can try 0.14.0.
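For anyone wanting to try the same workaround, a hedged sketch of the rollback (the checkout path and compose invocation are assumptions; adapt them to how you deployed Dify):

```shell
# In your Dify git checkout: pin to the 0.14.0 tag (commit 967eb81).
git fetch --tags
git checkout 0.14.0

# Recreate the stack so the 0.14.0 code and config take effect.
cd docker
docker compose down
docker compose up -d
```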

Projects
None yet
Development

No branches or pull requests

2 participants