
Add streaming session and ability to use streaming #57

Merged · 19 commits · May 15, 2023

Conversation

@Krivoblotsky (Collaborator) commented Apr 18, 2023

What

  1. Chats and Completions have been updated with streaming functionality (see the usage sketch below)
  2. ChatStreamingResult has been created
  3. Tests have been updated

Why

#14

Affected Areas

Completions, Chats
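
A rough usage sketch of the two call paths this adds. The function names (chats(query:), chatStream(query:)) and the ChatStreamingResult type come from the discussion below; the query construction and exact signatures are assumptions, not the merged API.

import OpenAI

// Hypothetical sketch: names taken from this thread, other details assumed.
let openAI = OpenAI(apiToken: "YOUR_TOKEN")
let query = ChatQuery(model: .gpt3_5Turbo,
                      messages: [Chat(role: .user, content: "Hello!")])

// Non-streaming: one complete ChatResult per request.
openAI.chats(query: query) { result in
    print(result)
}

// Streaming: partial ChatStreamingResult values delivered as tokens arrive.
openAI.chatStream(query: query) { partialResult in
    print(partialResult)
}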

@Krivoblotsky mentioned this pull request on Apr 18, 2023
Review thread on Sources/OpenAI/OpenAI.swift (outdated):

  public let index: Int
- public let message: Chat
  public let finishReason: String
+ public let message: Chat?
@DJBen (Contributor) commented Apr 19, 2023
If you want to mix the complete and partial results, can you comment on the conditions under which these properties are nil?

I have a moderate preference for separating a complete Choice from a PartialChoice, provided that they have major differences, even though the official OpenAI API mixes them together.

// Choice stays the same

public struct PartialChoice: Codable, Equatable {
  // Public so it can be exposed by the public `delta` property below.
  public struct Delta: Codable, Equatable {
    public let content: String?
    public let role: Chat.Role?
  }
  public let index: Int
  public let delta: Delta
}

As a result, the result type can contain an enum of either Choice or PartialChoice.
What do you think? Appreciated!
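
A minimal sketch of the wrapper this suggests, assuming Choice keeps its existing Codable definition; the enum name and decoding fallback are hypothetical, not part of the PR:

// Hypothetical wrapper: a response choice is either a complete Choice
// or a streamed PartialChoice.
public enum AnyChoice: Equatable {
    case complete(Choice)
    case partial(PartialChoice)
}

extension AnyChoice: Decodable {
    public init(from decoder: Decoder) throws {
        // Try the complete shape first, then fall back to the partial (delta) shape.
        if let choice = try? Choice(from: decoder) {
            self = .complete(choice)
        } else {
            self = .partial(try PartialChoice(from: decoder))
        }
    }
}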

@Krivoblotsky (Collaborator, Author) replied:

I've decided to keep ChatResult to not break backward compatibility, and created ChatStreamingResult along with a separate function for streaming. I found this solution more straightforward and clean. What do you think about this?
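
For context, a rough sketch of what such a ChatStreamingResult could look like, mirroring OpenAI's chat.completion.chunk payload; the field list here is an assumption rather than the exact merged definition:

import Foundation

// Assumed shape: each streamed chunk carries a partial `delta`
// instead of a complete `message`.
public struct ChatStreamingResult: Codable, Equatable {
    public struct Choice: Codable, Equatable {
        public struct Delta: Codable, Equatable {
            public let content: String?
            public let role: String?
        }
        public let index: Int
        public let delta: Delta
        public let finishReason: String?
    }
    public let id: String
    public let object: String
    public let created: TimeInterval
    public let model: String
    public let choices: [Choice]
}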

@DJBen (Contributor) commented Apr 19, 2023

Since we are using two separate functions, chats(query:) and chatStream(query:), to represent the non-streaming and streaming versions of the chat API, we could potentially get rid of the stream: Bool? property inside ChatQuery and modify the encoder to serialize stream: false and stream: true respectively when calling these two functions. But it's probably a large model change.
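
A sketch of how that could look without a stream property on ChatQuery; the wrapper type below is hypothetical, not something in this PR:

// Hypothetical wrapper: the two entry points inject `stream` at encoding time,
// so ChatQuery itself no longer needs the flag.
struct StreamableQuery<Query: Encodable>: Encodable {
    let query: Query
    let stream: Bool

    private enum CodingKeys: String, CodingKey {
        case stream
    }

    func encode(to encoder: Encoder) throws {
        // Encode the wrapped query's fields into the same top-level object...
        try query.encode(to: encoder)
        // ...then add the `stream` flag alongside them.
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(stream, forKey: .stream)
    }
}

// chats(query:) would send StreamableQuery(query: query, stream: false);
// chatStream(query:) would send StreamableQuery(query: query, stream: true).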

@DJBen (Contributor) commented Apr 19, 2023

I added streaming to the demo app in #59.

Adopt streaming in Demo app
@Krivoblotsky (Collaborator, Author) commented:

It looks like everything is working as expected 🤞
Going to run more tests over the weekend and proceed.
https://user-images.githubusercontent.com/1411778/235212448-0aa844d5-7ffb-487e-a0bb-37eed5666f48.mov

@DJBen (Contributor) commented May 9, 2023

Hi @Krivoblotsky, do you know when this can be merged?

@Krivoblotsky (Collaborator, Author) commented:

Hi @DJBen,
I plan to update and merge this within the week.

@Krivoblotsky marked this pull request as ready for review on May 15, 2023, 11:12
Update documentation on streaming
Update documentation
@Krivoblotsky (Collaborator, Author) commented:

Hey, @DJBen.
I'm ready to merge it, but wanted to ask you for a review first 🙏

@DJBen (Contributor) left a comment
LG!

@Krivoblotsky merged commit 12ba7f4 into main on May 15, 2023. 3 checks passed.