## Features

- New in-buffer mode: `phantom`
- `stream` toggle for responses brought back
- Image handling UX improved
- Advertisement logic improved

## Deprecated

- The `append`, `replace`, and `insert` prompt modes are deprecated and will be removed in the 5.0 release.

## Detailed description

### Phantom mode

Phantom is an overlay UI placed inline in the editor view (see the picture below). It doesn't affect the content of the view.

1. Set `"prompt_mode": "phantom"` for the AI assistant in its settings.
2. [Optional] Select some text to pass as context to work with.
3. Hit `OpenAI: New Message` or `OpenAI: Chat Model Select` and ask whatever you'd like in the popup input pane.
4. The phantom appears below the cursor position, or at the beginning of the selection, while the LLM answer streams in.
5. You can apply actions to the LLM response; they're quite self-descriptive and follow the behavior of the deprecated in-buffer commands.
6. Hit `ctrl+c` to stop prompting, the same as in `panel` mode.
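As an illustration of step 1, a per-assistant settings entry might look like the following. This is a sketch only: the `assistants` key and the `name` field are assumptions about the settings layout; the `"prompt_mode": "phantom"` value is what this release introduces.

```json
{
    "assistants": [
        {
            "name": "Helpful assistant",
            "prompt_mode": "phantom"
        }
    ]
}
```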

### Stream toggle

You can toggle the streaming behavior of a model's response with the `"stream": false` setting on a per-assistant basis. That's pretty much it; the default value is `true`.
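For example, disabling streaming for one assistant might look like this (a sketch; every key besides `stream` is an assumption about your settings layout):

```json
{
    "name": "Non-streaming assistant",
    "stream": false
}
```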

### Image handling UX improved

Image paths can now be fetched from the clipboard in addition to being extracted from the selection in a given view. The clipboard content should be either a single image path [and nothing more than that] or a list of such paths separated by newlines, e.g. `/Users/username/Documents/Project/image0.png\n/Users/username/Documents/Project/image1.png`.

Please note that the parser which tries to deduce whether your clipboard content is an image path [or a list of them] was made by AI and is quite fragile, so don't expect too much from it.
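As a rough illustration of the kind of check described above (a hypothetical helper, not the plugin's actual parser):

```python
def looks_like_image_paths(clipboard: str) -> bool:
    """Heuristically decide whether clipboard text is a single image path
    or a newline-separated list of image paths (and nothing else)."""
    image_exts = (".png", ".jpg", ".jpeg", ".gif", ".webp")
    # Ignore blank lines; every remaining line must end in an image extension.
    lines = [line.strip() for line in clipboard.splitlines() if line.strip()]
    return bool(lines) and all(line.lower().endswith(image_exts) for line in lines)

print(looks_like_image_paths("/Users/u/img0.png\n/Users/u/img1.png"))  # True
print(looks_like_image_paths("just some text"))  # False
```

A real parser would also need to handle quoting, relative paths, and files that don't exist, which is exactly where such heuristics get fragile.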

### Advertisement logic improvement

Advertisements now appear only when users use the plugin heavily, such as by processing too many tokens or sending/receiving an excessive number of messages.