Support Org-style workflow where properties can be specified at the org file level or org heading level #99

Merged 3 commits on Jan 3, 2024
README.md: 36 additions, 1 deletion
@@ -146,7 +146,7 @@ To apply syntax highlighting to your `#+begin_ai ...` blocks just add a language
The `#+begin_ai...#+end_ai` block can take the following options.

##### For ChatGPT
-By default, the content of ai blocks are interpreted as messages for ChatGPT. Text following `[ME]:` is associated with the user, text following `[AI]:` is associated as the model's response. Optionally you can start the block with a `[SYS]: <behahvior>` input to prime the model (see `org-ai-default-chat-system-prompt` below).
+By default, the content of ai blocks are interpreted as messages for ChatGPT. Text following `[ME]:` is associated with the user, text following `[AI]:` is associated as the model's response. Optionally you can start the block with a `[SYS]: <behavior>` input to prime the model (see `org-ai-default-chat-system-prompt` below).

- `:max-tokens number` - maximum number of tokens to generate (default: nil, use OpenAI's default)
- `:temperature number` - temperature of the model (default: 1)
@@ -155,6 +155,41 @@
- `:presence-penalty` - presence penalty of the model (default: 0)
- `:sys-everywhere` - repeat the system prompt for every user message (default: nil)
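
Putting these together, a minimal sketch of a block that overrides two of the options inline (the prompt text is only illustrative):

```org
#+begin_ai :temperature 0.2 :max-tokens 200
[SYS]: You are a concise assistant.
[ME]: What does lowering the temperature do?
#+end_ai
```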

If you have many different threads of conversation on the same topic with the same settings (system prompt, temperature, etc.) and you don't want to repeat all the options each time, you can set file-scope org properties or create an org heading with a property drawer; all `#+begin_ai...#+end_ai` blocks under that heading will then inherit the settings.

Examples:
```org
* Emacs (multiple conversations re emacs continue in this subtree)
:PROPERTIES:
:SYS: You are an emacs expert. You can help me by answering my questions. You can also ask me questions to clarify my intention.
:temperature: 0.5
:model: gpt-3.5-turbo
:END:

** Web programming via elisp
#+begin_ai
How to call a REST API and parse its JSON response?
#+end_ai

** Other emacs tasks
#+begin_ai...#+end_ai

* Python (multiple conversations re python continue in this subtree)
:PROPERTIES:
:SYS: You are a python programmer. Respond to the task with detailed step-by-step instructions and code.
:temperature: 0.1
:model: gpt-4
:END:

** Learning QUIC
#+begin_ai
How to set up a webserver with HTTP/3 support?
#+end_ai

** Other python tasks
#+begin_ai...#+end_ai
```
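
File-scope defaults should work the same way; a minimal sketch, assuming org's standard `#+PROPERTY:` keyword is what property inheritance picks up here (org property values are strings, and org-ai converts the numeric ones):

```org
# File-wide defaults at the top of the .org file; a heading's
# property drawer can still override them.
#+PROPERTY: SYS You are a helpful assistant. Keep answers short.
#+PROPERTY: temperature 0.3
```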

The following custom variables can be used to configure the chat:

- `org-ai-default-chat-model` (default: `"gpt-3.5-turbo"`)
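
These are ordinary Emacs customization variables; a minimal sketch of overriding one in an init file:

```elisp
;; e.g. in init.el: use gpt-4 as the default chat model
(setq org-ai-default-chat-model "gpt-4")
```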
org-ai-openai.el: 37 additions, 25 deletions
@@ -224,31 +224,43 @@
number of tokens to generate. `TEMPERATURE' is the temperature of
the distribution. `TOP-P' is the top-p value. `FREQUENCY-PENALTY'
is the frequency penalty. `PRESENCE-PENALTY' is the presence
penalty. `CONTEXT' is the context of the special block."
-  (let ((context (or context (org-ai-special-block)))
-        (buffer (current-buffer)))
-    (let* ((info (org-ai-get-block-info context))
-           (model (or model (alist-get :model info) (if messages org-ai-default-chat-model org-ai-default-completion-model)))
-           (max-tokens (or max-tokens (alist-get :max-tokens info) org-ai-default-max-tokens))
-           (top-p (or top-p (alist-get :top-p info)))
-           (temperature (or temperature (alist-get :temperature info)))
-           (frequency-penalty (or frequency-penalty (alist-get :frequency-penalty info)))
-           (presence-penalty (or presence-penalty (alist-get :presence-penalty info)))
-           (callback (if messages
-                         (lambda (result) (org-ai--insert-chat-completion-response context buffer result))
-                       (lambda (result) (org-ai--insert-stream-completion-response context buffer result)))))
-      (setq org-ai--current-insert-position-marker nil)
-      (setq org-ai--chat-got-first-response nil)
-      (setq org-ai--debug-data nil)
-      (setq org-ai--debug-data-raw nil)
-      (org-ai-stream-request :prompt prompt
-                             :messages messages
-                             :model model
-                             :max-tokens max-tokens
-                             :temperature temperature
-                             :top-p top-p
-                             :frequency-penalty frequency-penalty
-                             :presence-penalty presence-penalty
-                             :callback callback))))
+  (let* ((context (or context (org-ai-special-block)))
+         (buffer (current-buffer))
+         (info (org-ai-get-block-info context))
+         (callback (if messages
+                       (lambda (result) (org-ai--insert-chat-completion-response context buffer result))
+                     (lambda (result) (org-ai--insert-stream-completion-response context buffer result)))))
+    (cl-macrolet ((let-with-captured-arg-or-header-or-inherited-property
    [inline review comment, @rksm (owner), Jan 3, 2024: "Sweeet lisp power."]
+                    (definitions &rest body)
+                    `(let ,(cl-loop for (sym . default-form) in definitions collect
+                                    `(,sym (or ,sym
+                                               (alist-get ,(intern (format ":%s" (symbol-name sym))) info)
+                                               (when-let ((prop (org-entry-get-with-inheritance ,(symbol-name sym))))
+                                                 (if (eq (quote ,sym) 'model)
+                                                     prop
+                                                   (if (stringp prop) (string-to-number prop) prop)))
+                                               ,@default-form)))
+                       ,@body)))
+      (let-with-captured-arg-or-header-or-inherited-property
+          ((model (if messages org-ai-default-chat-model org-ai-default-completion-model))
+           (max-tokens org-ai-default-max-tokens)
+           (top-p)
+           (temperature)
+           (frequency-penalty)
+           (presence-penalty))
+        (setq org-ai--current-insert-position-marker nil)
+        (setq org-ai--chat-got-first-response nil)
+        (setq org-ai--debug-data nil)
+        (setq org-ai--debug-data-raw nil)
+        (org-ai-stream-request :prompt prompt
+                               :messages messages
+                               :model model
+                               :max-tokens max-tokens
+                               :temperature temperature
+                               :top-p top-p
+                               :frequency-penalty frequency-penalty
+                               :presence-penalty presence-penalty
+                               :callback callback)))))
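
To make the macro concrete: hand-expanding the `(temperature)` entry yields roughly the binding below (an editorial sketch, not part of the patch). Since `(temperature)` declares no default form, nothing is spliced in as a final fallback; only `model` skips the `string-to-number` conversion so it stays a string:

```elisp
;; Roughly what the macrolet emits for the (temperature) entry:
(temperature (or temperature                   ; 1. explicit function argument wins
                 (alist-get :temperature info) ; 2. then the block's :temperature header
                 ;; 3. then an org property inherited from a heading or the file;
                 ;;    property values are strings, so numeric options are converted
                 (when-let ((prop (org-entry-get-with-inheritance "temperature")))
                   (if (stringp prop) (string-to-number prop) prop))))
```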

(defun org-ai--insert-stream-completion-response (context buffer &optional response)
"Insert the response from the OpenAI API into the buffer.
org-ai.el: 5 additions, 3 deletions
@@ -113,20 +113,22 @@
result."
         (content (org-ai-get-block-content context))
         (req-type (org-ai--request-type info))
         (sys-prompt-for-all-messages (or (not (eql 'x (alist-get :sys-everywhere info 'x)))
-                                         org-ai-default-inject-sys-prompt-for-all-messages)))
+                                         (org-entry-get-with-inheritance "SYS-EVERYWHERE")
+                                         org-ai-default-inject-sys-prompt-for-all-messages))
+        (default-system-prompt (or (org-entry-get-with-inheritance "SYS") org-ai-default-chat-system-prompt)))
    (cl-case req-type
      (completion (org-ai-stream-completion :prompt content
                                            :context context))
      (image (org-ai-create-and-embed-image context))
      (sd-image (org-ai-create-and-embed-sd context))
      (local-chat (org-ai-oobabooga-stream :messages (org-ai--collect-chat-messages
                                                      content
-                                                     org-ai-default-chat-system-prompt
+                                                     default-system-prompt
                                                      sys-prompt-for-all-messages)
                                           :context context))
      (t (org-ai-stream-completion :messages (org-ai--collect-chat-messages
                                              content
-                                             org-ai-default-chat-system-prompt
+                                             default-system-prompt
                                              sys-prompt-for-all-messages)
                                   :context context)))))
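
On the org side, this means the system prompt itself can now come from an inherited property. A minimal sketch (assuming, per the `or` above, that `SYS-EVERYWHERE` merely needs to be present with any value):

```org
* Support answers
:PROPERTIES:
:SYS: You are a terse support bot.
:SYS-EVERYWHERE: t
:END:

#+begin_ai
[ME]: How do I check the installed version?
#+end_ai
```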
