No matter which model I use (e.g. gpt-4-32k-0613 or the new gpt-4-1106-preview with its 128k context window), I always get:
Mon, 06 Nov 2023 21:44:24 GMT [ERROR] [OpenAI-API Error: Error: Prompt is too long. Max token count is 3071, but prompt is 3539 tokens long.]
TypeError: Cannot read properties of undefined (reading 'response')
at CommandHandler.onMessage (file:///opt/matrix_gpt_bot/3.1.4/dist/handlers.js:132:86)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
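
For what it's worth, 3071 looks like the prompt budget of a ~4k-context model (4096 minus the tokens reserved for the completion), which would suggest the underlying OpenAI client is still sized for a 4k window no matter which model name is configured. The follow-up TypeError at handlers.js:132 looks like the error handler reading `err.response` on an error object that has no `response` property. Below is a minimal sketch of both ideas; the `CONTEXT_WINDOW` table, `RESERVED_FOR_COMPLETION` value, and `countTokens` helper are assumptions for illustration, not the bot's actual code:

```typescript
// Hypothetical sketch: derive the prompt budget from the configured model
// instead of a hard-coded ~3k limit, and guard err.response before reading it.

// Context windows for the models mentioned in the report (assumed values).
const CONTEXT_WINDOW: Record<string, number> = {
  "gpt-4-32k-0613": 32768,
  "gpt-4-1106-preview": 128000,
};

// Tokens kept free for the model's reply (assumption).
const RESERVED_FOR_COMPLETION = 1024;

// Crude stand-in for whatever tokenizer the bot actually uses.
const countTokens = (text: string): number => Math.ceil(text.length / 4);

function checkPromptFits(model: string, prompt: string): void {
  const window = CONTEXT_WINDOW[model] ?? 4096; // fall back to a 4k window
  const budget = window - RESERVED_FOR_COMPLETION;
  const used = countTokens(prompt);
  if (used > budget) {
    throw new Error(
      `Prompt is too long. Max token count is ${budget}, but prompt is ${used} tokens long.`
    );
  }
}

// Not every thrown error is an HTTP error carrying a `response` property,
// so read it defensively instead of doing err.response.status directly.
function describeError(err: unknown): string {
  const response = (err as { response?: { status?: number; statusText?: string } })?.response;
  if (response) {
    return `OpenAI-API Error: HTTP ${response.status} ${response.statusText ?? ""}`;
  }
  return `OpenAI-API Error: ${err instanceof Error ? err.message : String(err)}`;
}
```

If the 3071 limit is indeed a default inside the OpenAI client the bot wraps, exposing the model's context size as a per-model setting (rather than only the model name) should make the first error go away, and the guarded error handler would at least surface the real error instead of the secondary TypeError.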