DATA-3467: Cloud Inference CLI #4748
base: main
Conversation
// printInferenceResponse prints a neat representation of the GetInferenceResponse.
func (c *viamClient) printInferenceResponse(resp *mlinferencepb.GetInferenceResponse) {
	printf(c.c.App.Writer, "Inference Response:")
You would know better than I would, but is this the kind of format you would want? Normally from a CLI I would expect JSON or something configurable that I can consume programmatically. But I also know absolutely nothing about ML, and this could be the way ML people want the response.
Honestly, this format isn't very good for most use cases, but it's readable. I think the primary use case of this command will be to debug / easily check if the service is functioning.
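For reference, if a machine-readable mode ever becomes worth adding, it could be a thin wrapper around protojson. This is only a sketch, assuming the CLI's printf helper behaves like fmt.Fprintf; the printInferenceResponseJSON method name is hypothetical:

import "google.golang.org/protobuf/encoding/protojson"

// printInferenceResponseJSON is a hypothetical alternative to the readable printer:
// it marshals the raw proto response so the output can be piped into jq or scripts.
func (c *viamClient) printInferenceResponseJSON(resp *mlinferencepb.GetInferenceResponse) error {
	out, err := protojson.MarshalOptions{Multiline: true, Indent: "  "}.Marshal(resp)
	if err != nil {
		return err
	}
	printf(c.c.App.Writer, "%s", string(out))
	return nil
}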
This LGTM! I'm going to approve, but I would recommend waiting for @tahiyasalam to give a review, as she almost certainly has more insight than I do about the response type, and I just want to make sure we're matching what is expected.
good from sdk-land
Wooh! Thanks so much for doing this! 2 Qs/comments from me.
cli/app.go (Outdated)
	Required: true,
},
&cli.StringFlag{
	Name: inferenceFlagModelID,
I think we should try to have parity with the training CLI, where we specify model-org-ID and model-name to uniquely identify the registry ID: https://github.com/viamrobotics/rdk/blob/main/cli/ml_training.go#L22
I think we should have both training and inference require either:
- model org ID + name + version, OR
- registry ID + version
I'll leave it up to you to decide what makes the most sense.
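To make the suggestion above concrete, a sketch of the flag set it implies could look like the following; the inferenceFlagModelOrgID/Name/Version constants and the urfave/cli-style flag literals are illustrative assumptions, not what the PR actually defines:

const (
	inferenceFlagModelOrgID   = "model-org-id"
	inferenceFlagModelName    = "model-name"
	inferenceFlagModelVersion = "model-version"
)

// Hypothetical flag set mirroring the training CLI's model identification.
var inferenceModelFlags = []cli.Flag{
	&cli.StringFlag{
		Name:     inferenceFlagModelOrgID,
		Usage:    "organization ID that owns the model",
		Required: true,
	},
	&cli.StringFlag{
		Name:     inferenceFlagModelName,
		Usage:    "name of the model in the registry",
		Required: true,
	},
	&cli.StringFlag{
		Name:     inferenceFlagModelVersion,
		Usage:    "version of the model to run inference against",
		Required: true,
	},
}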
printf(c.c.App.Writer, " No output tensors.") | ||
} | ||
|
||
printf(c.c.App.Writer, "Annotations:") |
Can we just include here the format [x_min, y_min, x_max, y_max]?
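Something along these lines would make the ordering explicit; the bounding-box getters here are assumptions about the annotation proto, and the explicit coordinate labelling is the only point:

// Assumed accessors; what matters is spelling out the [x_min, y_min, x_max, y_max] order.
printf(c.c.App.Writer, "  %s [x_min, y_min, x_max, y_max]: [%.4f, %.4f, %.4f, %.4f]",
	bbox.GetLabel(),
	bbox.GetXMinNormalized(), bbox.GetYMinNormalized(),
	bbox.GetXMaxNormalized(), bbox.GetYMaxNormalized())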
One more q! But otherwise looks great, thank you.
Looks great! Thank you <3
Adds a CLI command for running cloud inference.
Use with commands of the following form:
Tested manually with sample data: