feat: Add ramalama client command with basic implementation #1151
Conversation
Reviewer's Guide by Sourcery

This pull request introduces a new `client` subcommand for interacting with an OpenAI endpoint. A sketch of the flow follows the diagram below.

Sequence diagram for the client command execution:

    sequenceDiagram
        participant CLI as Command Line Interface
        participant client_cli as client_cli function
        participant exec_cmd as exec_cmd function
        CLI->>client_cli: ramalama client HOST ARGS
        client_cli->>client_cli: Construct client_args
        client_cli->>exec_cmd: exec_cmd(client_args)
        exec_cmd->>CLI: Execute ramalama-client-core with arguments
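As a rough illustration of that flow, here is a minimal sketch of what `client_cli` could look like. The attribute names (`args.HOST`, `args.ARGS`) and the `exec_cmd` import location are assumptions based on the diagram, not the PR's exact code.

```python
# Minimal sketch of the flow in the sequence diagram above; the names and the
# exec_cmd import path are assumptions, not the PR's actual implementation.
from ramalama.common import exec_cmd


def client_cli(args):
    """Forward HOST and any trailing ARGS to the ramalama-client-core helper."""
    client_args = ["ramalama-client-core", args.HOST] + args.ARGS
    exec_cmd(client_args)  # hands control over to the client process
```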
Updated class diagram showing the location of the `get_cmd_with_wrapper` function after its move to the `common` module.
| Change | Details | Files |
|---|---|---|
| Introduces a new `client` subcommand to interact with an OpenAI endpoint. | | `ramalama/cli.py` |
| Moves the `get_cmd_with_wrapper` function from the `Model` class to the `common` module (sketched after the table). | | `ramalama/model.py`<br>`ramalama/common.py` |
| Adds a man page for the `ramalama-client` command. | | `docs/ramalama-client.1.md` |
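The conversation does not show the body of `get_cmd_with_wrapper`, so the following is only a hypothetical sketch of the refactor pattern: the method leaves the `Model` class and becomes a module-level helper in `ramalama/common.py`. The signature and the wrapper lookup logic shown here are assumptions, not the project's actual code.

```python
# Hypothetical sketch of get_cmd_with_wrapper as a module-level helper in
# ramalama/common.py; the real signature and lookup logic may differ.
import os


def get_cmd_with_wrapper(cmd_args):
    """Return cmd_args with its executable swapped for a wrapper script, if one exists."""
    # Assumed behavior: check a couple of conventional libexec locations for a
    # wrapper named after the command and prefer it when it is executable.
    for directory in ("/usr/local/libexec/ramalama", "/usr/libexec/ramalama"):
        candidate = os.path.join(directory, cmd_args[0])
        if os.access(candidate, os.X_OK):
            return [candidate] + cmd_args[1:]
    return cmd_args
```

Callers that previously used `self.get_cmd_with_wrapper(...)` on a `Model` instance would then import the helper from `ramalama.common` instead.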
Possibly linked issues
- Resolve undefined errors in `huggingface.py` and `model.py` #123: The PR implements the client requested in the issue.
Review comment on the following diff hunk:

    *.volume
    __pycache__/
    .aider*
I actually used this aider tool to rewrite a lot of the boilerplate associated with introducing a new command.
Hey @ericcurtin - I've reviewed your changes - here's some feedback:
Overall Comments:
- Consider using `shlex.join` to construct the command line in `client_cli` for better handling of arguments with spaces (see the sketch after this list).
- It would be helpful to add a brief description of the `HOST` argument in the `client_parser` function.
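A minimal sketch of the `shlex.join` suggestion, assuming a helper that builds the client command line; the function name here is hypothetical, and only the `ramalama-client-core` invocation is taken from the discussion above.

```python
import shlex


def build_client_cmdline(host: str, extra_args: list[str]) -> str:
    """Return a single, safely quoted command-line string for display or shell use."""
    client_args = ["ramalama-client-core", host] + extra_args
    # shlex.join (Python 3.8+) quotes each element, so arguments containing
    # spaces or shell metacharacters survive as single tokens.
    return shlex.join(client_args)


# Arguments with spaces stay intact when quoted:
print(build_client_cmdline("http://localhost:8080", ["tell me a story"]))
# ramalama-client-core http://localhost:8080 'tell me a story'
```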
Here's what I looked at during the review
- 🟡 General issues: 1 issue found
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
What kind of commands is the user going to send to the server? Is this replacing `ramalama run`?

"ramalama run" runs client + server and terminates both on completion. This runs only the client; it's useful for testing servers.

Might be useful for testing, but difficult for us to support. Perhaps make it hidden for now, or we need to document that this is only for experimenting at this point.

Added "(experimental)". I could possibly leave it as an undocumented option; I'd have to see whether argparse allows that.
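For what it's worth, argparse does support this kind of hiding. The sketch below is a generic illustration and does not reflect ramalama's actual `cli.py`: an individual option can be hidden with `help=argparse.SUPPRESS`, and a subcommand stays out of the help text if `add_subparsers` is given a `metavar` and the subcommand is added without a `help` string.

```python
# Generic argparse illustration; not ramalama's actual parser setup.
import argparse

parser = argparse.ArgumentParser(prog="ramalama")

# help=argparse.SUPPRESS hides an individual option from --help output.
parser.add_argument("--secret-flag", help=argparse.SUPPRESS)

# A metavar on add_subparsers replaces the {run,client} choices list in the
# usage line, and a subparser added without help= gets no row in the help
# table, so the command is undocumented but still usable.
subparsers = parser.add_subparsers(dest="subcommand", metavar="COMMAND")
subparsers.add_parser("run", help="run a model")
subparsers.add_parser("client")  # hidden: no help text, not listed individually

args = parser.parse_args(["client"])  # parses even though it is not advertised
print(args.subcommand)  # client
```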
Signed-off-by: Eric Curtin <[email protected]>
Summary by Sourcery
Add a new 'client' subcommand to the Ramalama CLI for interacting with AI Model servers
New Features:
- Add a `client` subcommand (experimental) for interacting with an OpenAI endpoint.

Enhancements:
- Move the `get_cmd_with_wrapper` function from the `Model` class to the `common` module.

Documentation:
- Add a man page for the `ramalama-client` command.