Compatibility Issues with SpeakGPT and Groq llama3 Endpoints #330

Open
WiegerWolf opened this issue Apr 27, 2024 · 3 comments
Comments

@WiegerWolf
Description

There are compatibility issues when using the openai-kotlin library with the SpeakGPT app, specifically with the Groq endpoint. The app functions correctly, but it consistently shows error messages that appear to be related to how exceptions are handled within the library.

See the original bug report for more context.

Steps to Reproduce

  1. Integrate the openai-kotlin library with the SpeakGPT app.
  2. Configure the app to use the Groq endpoint.
  3. Observe that the app correctly processes responses but incorrectly triggers error messages.
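For context, the setup in steps 1–2 can be sketched as follows. This is a minimal, hedged example of pointing the openai-kotlin client at Groq's OpenAI-compatible base URL; the API key and prompt are placeholders, and the class and function names (`OpenAI`, `OpenAIHost`, `chatCompletion`) follow the library's public API as I understand it:

```kotlin
import com.aallam.openai.api.chat.ChatCompletionRequest
import com.aallam.openai.api.chat.ChatMessage
import com.aallam.openai.api.chat.ChatRole
import com.aallam.openai.api.model.ModelId
import com.aallam.openai.client.OpenAI
import com.aallam.openai.client.OpenAIHost

suspend fun main() {
    // Point the client at Groq's OpenAI-compatible base URL instead of api.openai.com.
    val client = OpenAI(
        token = "gsk_placeholder", // placeholder Groq API key
        host = OpenAIHost(baseUrl = "https://api.groq.com/openai/v1/"),
    )

    // Same request shape as against OpenAI, but with a Groq-hosted model ID.
    val completion = client.chatCompletion(
        ChatCompletionRequest(
            model = ModelId("llama3-70b-8192"),
            messages = listOf(ChatMessage(role = ChatRole.User, content = "Hello")),
        )
    )
    println(completion.choices.first().message.content)
}
```

Running this requires a valid Groq key and network access, so it is a sketch of the configuration rather than a self-verifying repro.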

Expected Behavior

The library should handle endpoint interactions without triggering unnecessary error messages, ensuring smooth operation across different configurations.

Actual Behavior

Error messages are triggered after every API response, despite the responses being correct and fully processed. This suggests an issue with the exception handling mechanism in the library when used with endpoints other than the default.

[Screenshots attached: the error appearing in the same chat on the next message, the configured API URL, the configured model, and the event log (not sure if it's relevant).]

It looks like the SpeakGPT app adds this error text itself, so this is not an issue with the configuration or the API; it is an issue with the error-handling logic inside the SpeakGPT app, which fires when it shouldn't.

Additional Information

  • SpeakGPT app version: 3.22
  • Library version: [library version is not known to me]

Possible Solution

It would help to review the exception-handling logic within the library to ensure it is robust across different endpoints. Alternatively, more detailed documentation or configuration options for handling such cases would be useful.


Thank you for looking into this matter. Your assistance will help improve the usability of the library in diverse applications.

@aallam
Owner

aallam commented Apr 29, 2024

The library throws typed exceptions when known errors are returned by the OpenAI APIs; when an unexpected error occurs, a generic exception is thrown.
This varies on a case-by-case basis, so to resolve an issue of this nature it is best to know which endpoint is failing and why.
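From the caller's side, that distinction looks roughly like the sketch below. The exception types (`OpenAIException` as the base, `OpenAIAPIException` for known API error shapes) are from the library's `com.aallam.openai.api.exception` package as I understand it; `safeChat` is a hypothetical helper, not part of the library:

```kotlin
import com.aallam.openai.api.chat.ChatCompletionRequest
import com.aallam.openai.api.exception.OpenAIAPIException
import com.aallam.openai.api.exception.OpenAIException
import com.aallam.openai.client.OpenAI

// Hypothetical wrapper showing where each kind of failure surfaces.
suspend fun safeChat(client: OpenAI, request: ChatCompletionRequest) {
    try {
        val completion = client.chatCompletion(request)
        println(completion.choices.first().message.content)
    } catch (e: OpenAIAPIException) {
        // A known error shape returned by the API (rate limit, auth, invalid request, ...).
        println("API error: ${e.message}")
    } catch (e: OpenAIException) {
        // Everything else, including error bodies the library could not parse.
        println("Unexpected error: ${e.message}")
    }
}
```

The report in this issue suggests the second branch: a response the library fails to interpret, rather than a genuine API error.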

@WiegerWolf
Author

Thank you for your prompt response, @aallam.

Here are the specifics related to the endpoint and model that are causing the issue:

  • API URL: https://api.groq.com/openai/v1/
  • Model Used: llama3-70b-8192
  • API Key: API keys can be obtained for free at Groq API Console.

An error message is triggered after every API response, even though the responses themselves are correct and fully processed. This seems to be related to how exceptions are handled when this particular endpoint and model are used with the openai-kotlin library.

Could we possibly look into the exception handling logic for responses from this specific API? It might help to understand why the library perceives these correct responses as errors and throws exceptions.

Thank you for assisting in improving the library's compatibility with various endpoints.

@sanity

sanity commented May 1, 2024

This may be related: I'm getting the following exception while trying to use the completion endpoint with Groq:

Exception in thread "main" io.ktor.serialization.JsonConvertException: Illegal input: Field 'param' is required for type with serial name 'com.aallam.openai.api.exception.OpenAIErrorDetails', but it was missing at path: $.error
	at io.ktor.serialization.kotlinx.KotlinxSerializationConverter.deserialize(KotlinxSerializationConverter.kt:90)
	at io.ktor.serialization.ContentConverterKt$deserialize$$inlined$map$1$2.emit(Emitters.kt:224)
	at kotlinx.coroutines.flow.FlowKt__BuildersKt$asFlow$$inlined$unsafeFlow$3.collect(SafeCollector.common.kt:116)
	at io.ktor.serialization.ContentConverterKt$deserialize$$inlined$map$1.collect(SafeCollector.common.kt:113)
	at kotlinx.coroutines.flow.FlowKt__ReduceKt.firstOrNull(Reduce.kt:243)
	at kotlinx.coroutines.flow.FlowKt.firstOrNull(Unknown Source)
	at io.ktor.serialization.ContentConverterKt.deserialize(ContentConverter.kt:123)
	at io.ktor.client.plugins.contentnegotiation.ContentNegotiation.convertResponse$ktor_client_content_negotiation(ContentNegotiation.kt:230)
	at io.ktor.client.plugins.contentnegotiation.ContentNegotiation$Plugin$install$2.invokeSuspend(ContentNegotiation.kt:262)
	at io.ktor.client.plugins.contentnegotiation.ContentNegotiation$Plugin$install$2.invoke(ContentNegotiation.kt)
	at io.ktor.client.plugins.contentnegotiation.ContentNegotiation$Plugin$install$2.invoke(ContentNegotiation.kt)
	at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:120)
	at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:78)
	at io.ktor.client.HttpClient$4.invokeSuspend(HttpClient.kt:177)
	at io.ktor.client.HttpClient$4.invoke(HttpClient.kt)
	at io.ktor.client.HttpClient$4.invoke(HttpClient.kt)
	at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:120)
	at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:78)
	at io.ktor.util.pipeline.SuspendFunctionGun.proceedWith(SuspendFunctionGun.kt:88)
	at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invokeSuspend(HttpCallValidator.kt:142)
	at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invoke(HttpCallValidator.kt)
	at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invoke(HttpCallValidator.kt)
	at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:120)
	at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:78)
	at io.ktor.util.pipeline.SuspendFunctionGun.execute$ktor_utils(SuspendFunctionGun.kt:98)
	at io.ktor.util.pipeline.Pipeline.execute(Pipeline.kt:77)
	at io.ktor.client.call.HttpClientCall.bodyNullable(HttpClientCall.kt:89)
	at com.aallam.openai.client.internal.http.HttpTransport.openAIAPIException(HttpTransport.kt:73)
	at com.aallam.openai.client.internal.http.HttpTransport.handleException(HttpTransport.kt:48)
	at com.aallam.openai.client.internal.http.HttpTransport.perform(HttpTransport.kt:23)
	at com.aallam.openai.client.internal.http.HttpTransport$perform$1.invokeSuspend(HttpTransport.kt)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
	at kotlinx.coroutines.internal.LimitedDispatcher$Worker.run(LimitedDispatcher.kt:115)
	at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:103)
	at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
Caused by: kotlinx.serialization.MissingFieldException: Field 'param' is required for type with serial name 'com.aallam.openai.api.exception.OpenAIErrorDetails', but it was missing at path: $.error
	at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:95)
	at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:43)
	at kotlinx.serialization.encoding.AbstractDecoder.decodeNullableSerializableElement(AbstractDecoder.kt:78)
	at com.aallam.openai.api.exception.OpenAIError$$serializer.deserialize(OpenAIErrorDetails.kt:11)
	at com.aallam.openai.api.exception.OpenAIError$$serializer.deserialize(OpenAIErrorDetails.kt:11)
	at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:69)
	at kotlinx.serialization.json.Json.decodeFromString(Json.kt:107)
	at io.ktor.serialization.kotlinx.KotlinxSerializationConverter.deserialize(KotlinxSerializationConverter.kt:82)
	... 38 more
Caused by: kotlinx.serialization.MissingFieldException: Field 'param' is required for type with serial name 'com.aallam.openai.api.exception.OpenAIErrorDetails', but it was missing
	at kotlinx.serialization.internal.PluginExceptionsKt.throwMissingFieldException(PluginExceptions.kt:20)
	at com.aallam.openai.api.exception.OpenAIErrorDetails.<init>(OpenAIErrorDetails.kt:24)
	at com.aallam.openai.api.exception.OpenAIErrorDetails$$serializer.deserialize(OpenAIErrorDetails.kt:24)
	at com.aallam.openai.api.exception.OpenAIErrorDetails$$serializer.deserialize(OpenAIErrorDetails.kt:24)
	at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:69)
	... 45 more
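The failure above can be reproduced in isolation: kotlinx.serialization treats any property without a default value as required, even a nullable one, so decoding a Groq-style error body that omits `param` throws `MissingFieldException` (a subtype of `SerializationException`). Giving the field a default makes the same payload decode cleanly. Below is a minimal sketch under that assumption; the two data classes are illustrative stand-ins, not the library's actual `OpenAIErrorDetails` declaration (this needs the kotlinx.serialization compiler plugin and `kotlinx-serialization-json` on the classpath):

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.SerializationException
import kotlinx.serialization.json.Json

// Strict variant: 'param' is nullable but has no default, so it is still required.
@Serializable
data class StrictErrorDetails(val message: String, val type: String, val param: String?)

// Lenient variant: the default makes 'param' optional, matching Groq-style payloads.
@Serializable
data class LenientErrorDetails(val message: String, val type: String, val param: String? = null)

fun main() {
    // A Groq-style error body with no 'param' field.
    val payload = """{"message":"model not found","type":"invalid_request_error"}"""

    try {
        Json.decodeFromString<StrictErrorDetails>(payload)
        println("strict decode unexpectedly succeeded")
    } catch (e: SerializationException) {
        // MissingFieldException, as in the stack trace above.
        println("strict decode failed: ${e.message}")
    }

    val details = Json.decodeFromString<LenientErrorDetails>(payload)
    println("lenient decode ok, param=${details.param}")
}
```

If this diagnosis is right, the fix on the library side would be to give optional error-detail fields defaults (or mark them nullable with defaults) so that OpenAI-compatible endpoints that omit them still parse.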
