Streaming is broken #14

Closed
randomsnowflake opened this issue Mar 8, 2023 · 11 comments
Labels: bug (Something isn't working), help wanted (Extra attention is needed)

@randomsnowflake

Code

import Foundation
import OpenAI

let openAI = OpenAI(apiToken: "sk-...")

let query = OpenAI.ChatQuery(model: .gpt3_5Turbo, messages: [.init(role: "user", content: "hi!")], stream: true)
let result = try await openAI.chats(query: query) // crashes here with the DecodingError below when stream is true

Exception

Swift/ErrorType.swift:200: Fatal error: Error raised at top level: Swift.DecodingError.dataCorrupted(Swift.DecodingError.Context(codingPath: [], debugDescription: "The given data was not valid JSON.", underlyingError: Optional(Error Domain=NSCocoaErrorDomain Code=3840 "Invalid value around line 1, column 0." UserInfo={NSDebugDescription=Invalid value around line 1, column 0., NSJSONSerializationErrorIndex=0})))

It works with stream: false

@Krivoblotsky added the bug (Something isn't working) and help wanted (Extra attention is needed) labels on Mar 8, 2023
@Krivoblotsky
Collaborator

Hey, @randomsnowflake.
Thanks for bringing this to our attention.
Looks like streaming is unsupported for now and we need to implement server-sent events for this.
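
For context: with stream: true the chat completions endpoint replies with server-sent events rather than a single JSON body, so decoding the raw response as one JSON object fails exactly like the report above. A minimal sketch of what consuming that format could look like with URLSession's async bytes API (ChatChunk and streamChat are illustrative names, not this library's types, and the chunk fields are abbreviated):

import Foundation

// Illustrative shape of one streamed chunk; real payloads carry more fields.
struct ChatChunk: Decodable {
    struct Choice: Decodable {
        struct Delta: Decodable { let content: String? }
        let delta: Delta
    }
    let choices: [Choice]
}

func streamChat(request: URLRequest) async throws {
    // Each SSE frame arrives as a "data: {json}" line; the stream ends with "data: [DONE]".
    let (bytes, _) = try await URLSession.shared.bytes(for: request)
    let decoder = JSONDecoder()
    for try await line in bytes.lines {
        guard line.hasPrefix("data: ") else { continue }
        let payload = String(line.dropFirst("data: ".count))
        if payload == "[DONE]" { break }
        let chunk = try decoder.decode(ChatChunk.self, from: Data(payload.utf8))
        if let delta = chunk.choices.first?.delta.content {
            print(delta, terminator: "")
        }
    }
}

The request itself would still need the usual POST body with "stream": true and the Authorization header.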

@rshimoda self-assigned this on Mar 27, 2023
@longseespace
Contributor

I hope this will be implemented soon. Meanwhile, you can use this alternative (only for streaming - I'm using both clients in my app)

https://github.com/nate-parrott/openai-streaming-completions-swift

@randomsnowflake
Author

I'm already using it. As far as I know, it's also the only Swift OpenAI client with streaming support.

However, have a look at my issue report here: Missing Model Parameters and No Error Message ... it's not as mature as I would like. Looking forward to MacPaw's implementation of streaming.

@Krivoblotsky
Collaborator

I know that @rshimoda is working on adding streaming support here, so I believe it will be implemented shortly 👌

@rshimoda
Collaborator

Hey there, I'll do my best to implement this soon 😉

@DJBen
Contributor

DJBen commented Apr 13, 2023

@rshimoda Hi Sergi, do you have a timeline for the implementation? Let me know if you need any help or have a WIP branch.

@Krivoblotsky
Collaborator

@rshimoda 🙏

@DJBen
Contributor

DJBen commented Apr 17, 2023

I am exploring a solution right now. It looks like more work than I expected, and it is more akin to a separate project than simply 'fixing' streaming.

We might need to rely on an existing EventSource library like https://github.com/launchdarkly/swift-eventsource. Let me know if you think it's okay to add a dependency on another library.
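
If we did go that route, a rough sketch of the wiring could look like the following; the EventHandler and Config names are taken from my reading of launchdarkly/swift-eventsource and should be treated as assumptions rather than a verified integration:

import Foundation
import LDSwiftEventSource

// Sketch only: prints each raw SSE "data:" payload from the streaming endpoint.
final class ChatStreamHandler: EventHandler {
    func onOpened() {}
    func onClosed() {}
    func onMessage(eventType: String, messageEvent: MessageEvent) {
        // messageEvent.data is one JSON chunk, or "[DONE]" at the end of the stream.
        print(messageEvent.data)
    }
    func onComment(comment: String) {}
    func onError(error: Error) {}
}

var config = EventSource.Config(handler: ChatStreamHandler(),
                                url: URL(string: "https://api.openai.com/v1/chat/completions")!)
config.method = "POST"
config.headers = ["Authorization": "Bearer sk-...",
                  "Content-Type": "application/json"]
config.body = try? JSONSerialization.data(withJSONObject: [
    "model": "gpt-3.5-turbo",
    "messages": [["role": "user", "content": "hi!"]],
    "stream": true
] as [String: Any])
let eventSource = EventSource(config: config)
eventSource.start()

The main trade-off is the extra dependency versus writing and maintaining our own SSE parsing, which is exactly the question above.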

@Krivoblotsky
Collaborator

Hey, @DJBen and everyone.
I have an implementation in mind that doesn't require any 3rd-party libraries. I have implemented a PoC that supports streaming for the Chats API with minimal changes required.
Could you take a look at #57?

It has several weak points, such as the optional delta and message in ChatResult, but overall it looks good to me.
The public interface is untouched, but the completion could be called multiple times with a delta:

openAI.chats(query: .init(model: .gpt3_5Turbo, messages: [.init(role: .user, content: "Who is Taras Shevchenko?")], stream: true)) { result in
    switch result {
    case .success(let res):
        // called multiple times, once per streamed delta
        if let messageDelta = res.choices.first?.delta?.content {
            print(messageDelta)
        }
    case .failure(let error):
        print(error)
    }
}

Any feedback is very appreciated 🙏

@Krivoblotsky
Collaborator

Almost there: #57 🤞

@Krivoblotsky
Collaborator

Fixed by #57
