
[do not land][teaching] Proof of concept diff to show how streaming events get mapped from oboe helper for us to process #910

Closed

@rossdanlm commented Jan 13, 2024

[do not land][teaching] Proof of concept diff to show how streaming events get mapped from the oboe helper for us to process. This goes into more detail than #806.

It explains in more depth how the streaming concepts are connected. I don't know exactly what `oboeInstance.node(on, fn);` does behind the scenes, but you can treat it as a black box and trust that it handles the streaming for us. A sketch of the idea is below.
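
For intuition, here is a minimal sketch, not the actual aiconfig helper, of how an oboe-based wrapper can map streaming JSON into discrete callbacks. The endpoint, event names, and payload shapes are assumptions for illustration:

```typescript
import oboe from "oboe";

// Hypothetical event-name -> handler map; the real helper's shape may differ.
type StreamCallbacks = Record<string, (data: unknown) => void>;

function streamRunPrompt(url: string, body: unknown, callbacks: StreamCallbacks) {
  const oboeInstance = oboe({
    url,
    method: "POST",
    body: JSON.stringify(body),
    headers: { "Content-Type": "application/json" },
  });

  // oboe's `.node(pattern, fn)` calls `fn` every time a JSON node matching
  // `pattern` is parsed out of the chunked HTTP response. This is the part
  // we can black-box: it turns a byte stream into discrete events for us.
  for (const [eventName, handler] of Object.entries(callbacks)) {
    oboeInstance.node(eventName, handler);
  }
  return oboeInstance;
}

// Hypothetical usage: react to partial output chunks and the final config.
streamRunPrompt("/api/run_prompt", { prompt_name: "my_prompt" }, {
  output_chunk: (chunk) => console.log("partial output:", chunk),
  aiconfig: (config) => console.log("final aiconfig:", config),
});
```

The key point is that `.node()` fires once per matched JSON node as bytes arrive, so our handlers run incrementally instead of waiting for the full response body.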


Stack created with Sapling. Best reviewed with ReviewStack.

Rossdan Craig [email protected] added 2 commits January 13, 2024 01:19
… button should be disabled

I did a few things:

1. Created a new state id `runningPromptId` to track which prompt is running, making sure to set it correctly on run prompt actions
2. Read this state in the `AIConfigEditor` and passed it as a prop down to the `RunPromptButton`

Question: Is there a better way of doing step 2 so that we don't need to keep piping the prop down? (A sketch of the current approach is below.)
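
Here is a minimal sketch, with assumed names (`runPromptOnServer` is hypothetical, and the real components take more props), of the `runningPromptId` state and prop piping described above:

```tsx
import { useCallback, useState } from "react";

// Hypothetical server call standing in for the real run-prompt request.
declare function runPromptOnServer(promptId: string): Promise<void>;

type RunPromptButtonProps = {
  promptId: string;
  runningPromptId: string | null;
  onRun: (promptId: string) => void;
};

function RunPromptButton({ promptId, runningPromptId, onRun }: RunPromptButtonProps) {
  const isThisPromptRunning = runningPromptId === promptId;
  // Disable only while a *different* prompt is running, so the running
  // prompt's own button stays enabled (e.g. to cancel it).
  const disabled = runningPromptId !== null && !isThisPromptRunning;
  return (
    <button disabled={disabled} onClick={() => onRun(promptId)}>
      {isThisPromptRunning ? "Cancel" : "Run"}
    </button>
  );
}

function AIConfigEditor({ promptIds }: { promptIds: string[] }) {
  // New state: the id of the prompt currently running, or null if none.
  const [runningPromptId, setRunningPromptId] = useState<string | null>(null);

  const onRun = useCallback(async (promptId: string) => {
    setRunningPromptId(promptId);
    try {
      await runPromptOnServer(promptId);
    } finally {
      // Reset whether the run completed, was cancelled, or errored.
      setRunningPromptId(null);
    }
  }, []);

  return (
    <>
      {promptIds.map((id) => (
        <RunPromptButton
          key={id}
          promptId={id}
          runningPromptId={runningPromptId}
          onRun={onRun}
        />
      ))}
    </>
  );
}
```

One common alternative to piping the prop through each layer is a React context that `RunPromptButton` reads directly, at the cost of a less explicit data flow.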

## Test Plan
Make sure that:
1. When you are running a prompt, you can still cancel it
2. While a prompt is running, you can't run any other prompt
3. When the prompt is complete (or cancelled, or errored), all prompts can be run again

https://github.com/lastmile-ai/aiconfig/assets/151060367/6e9a9cbf-6469-4ecb-ba74-1ede2ed1a292
…vents get mapped from oboe helper for us to process

This goes into a bit more detail on how the streaming concepts are connected. I don't know exactly what `oboeInstance.node(on, fn);` does behind the scenes, but you can treat it as a black box and trust that it handles the streaming for us.
@rossdanlm closed this Jan 13, 2024
rossdanlm added a commit that referenced this pull request Jan 14, 2024
Delete `aiconfig_complete` stream response, replace with `aiconfig` (#911)

Previously we did not support streaming, so when we returned `aiconfig` it came from a blocking operation. This meant we needed to set the `isRunning` prompt state to true while we waited. We no longer need to do that now that all run events are returned in a streaming response format, even for non-streaming models: #806
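
To make that contract concrete, here is a self-contained sketch. The event names (`output_chunk`, `aiconfig`) and payload shapes are illustrative assumptions, not the server's exact protocol, though the terminal `aiconfig` event replacing `aiconfig_complete` is the change described above:

```typescript
// Every run, streaming or not, resolves through the same event stream, so
// the client no longer needs a separate blocking `isRunning` code path.
type RunEvent =
  | { type: "output_chunk"; data: string } // incremental model output
  | { type: "aiconfig"; data: object };    // terminal event, replaces `aiconfig_complete`

async function handleRunEvents(events: AsyncIterable<RunEvent>): Promise<void> {
  for await (const event of events) {
    switch (event.type) {
      case "output_chunk":
        console.log("partial output:", event.data);
        break;
      case "aiconfig":
        console.log("final aiconfig:", event.data);
        break;
    }
  }
}

// A non-streaming model reuses the same shape by emitting its entire output
// as a single chunk, followed by the final `aiconfig` event.
async function* nonStreamingAsEvents(
  fullOutput: string,
  config: object
): AsyncGenerator<RunEvent> {
  yield { type: "output_chunk", data: fullOutput };
  yield { type: "aiconfig", data: config };
}

// Usage sketch:
handleRunEvents(nonStreamingAsEvents("entire model response", { prompts: [] }));
```

In other words, a blocking result is just a degenerate stream: one big chunk plus the terminal event.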

We are also no longer using the `streamApi` helper; we now use `streamingApiChain`, which was added in #789.

Finally, if you want more resources on how streaming is connected, check out #910, a teaching guide I built that explains how the code is connected.

## Test Plan
Both streaming and non-streaming models work as before


https://github.com/lastmile-ai/aiconfig/assets/151060367/b62e7887-20af-4c0c-ab85-eeaacaab64e0

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/911).

* #912
* __->__ #911