
Conversation

@ericcurtin (Member) commented Apr 8, 2025

Summary by Sourcery

Add a new 'client' subcommand to the Ramalama CLI for interacting with AI Model servers

New Features:

  • Introduce a new 'client' CLI command that allows users to interact with AI Model servers by specifying a host and optional arguments

Enhancements:

  • Refactor the get_cmd_with_wrapper method to be a standalone function in the common module
  • Modify the CLI to support a new client subcommand with flexible argument handling

Documentation:

  • Add man page documentation for the new ramalama-client command, explaining its usage and providing an example

@sourcery-ai sourcery-ai bot (Contributor) commented Apr 8, 2025

Reviewer's Guide by Sourcery

This pull request introduces a new client subcommand for interacting with OpenAI endpoints. It includes the command definition, argument parsing, and execution logic. Additionally, it refactors the get_cmd_with_wrapper function and adds a man page for the new client command.

Sequence diagram for the client command execution

sequenceDiagram
    participant CLI as Command Line Interface
    participant client_cli as client_cli function
    participant exec_cmd as exec_cmd function

    CLI->>client_cli: ramalama client HOST ARGS
    client_cli->>client_cli: Construct client_args
    client_cli->>exec_cmd: exec_cmd(client_args)
    exec_cmd->>CLI: Execute ramalama-client-core with arguments
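The flow above can be sketched in Python. The function names (`client_parser`, `client_cli`, `exec_cmd`) and argument names follow the reviewer's guide, but the bodies are an illustrative assumption, not the PR's exact code:

```python
import argparse
import os


def exec_cmd(cmd_args):
    # Replace the current process with the given command, as the guide
    # describes for the exec_cmd function (sketch, not the PR's code).
    os.execvp(cmd_args[0], cmd_args)


def build_client_args(host, extra_args):
    # Assemble the ramalama-client-core invocation: host first, then any
    # optional arguments that override the default prompt.
    return ["ramalama-client-core", host] + list(extra_args)


def client_parser(subparsers):
    # Register the 'client' subcommand with a HOST positional argument and
    # optional trailing ARGS, as described in the summary.
    parser = subparsers.add_parser(
        "client", help="interact with an AI Model server (experimental)"
    )
    parser.add_argument("HOST", help="host of the AI Model server to connect to")
    parser.add_argument("ARGS", nargs="*", help="optional arguments overriding the default prompt")
    parser.set_defaults(func=client_cli)


def client_cli(args):
    # Handle 'ramalama client HOST [ARGS...]' by exec'ing ramalama-client-core.
    exec_cmd(build_client_args(args.HOST, args.ARGS))
```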

Updated class diagram showing the location of get_cmd_with_wrapper

classDiagram
    class Model {
        -gpu_args(self, args, runner=False)
        -exec_model_in_container(self, model_path, cmd_args, args)
    }
    note for Model "get_cmd_with_wrapper was moved to common.py"

    class Common {
        +get_cmd_with_wrapper(cmd_args)
    }

    Model -- Common: uses
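A plausible shape for the relocated helper, now a standalone function in common.py. The wrapper path and lookup logic here are assumptions for illustration, not the PR's implementation:

```python
import os


def get_cmd_with_wrapper(cmd_args):
    # Hypothetical sketch: prefer an installed wrapper script for the command
    # when one exists, otherwise return the command unchanged.
    wrapper = os.path.join("/usr/libexec/ramalama", cmd_args[0])
    if os.access(wrapper, os.X_OK):
        return [wrapper] + cmd_args[1:]
    return cmd_args
```

Because it no longer touches instance state, the function needs no `self` and can be shared by any caller in the common module.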

File-Level Changes

  • Introduces a new client subcommand to interact with an OpenAI endpoint (ramalama/cli.py):
      – Adds a client_parser function to define the client subcommand, including a positional argument for the host and optional arguments to override the default prompt.
      – Implements the client_cli function to handle execution of the client command, constructing the command-line arguments for ramalama-client-core and executing it.
  • Moves the get_cmd_with_wrapper function from the Model class to the common module (ramalama/model.py, ramalama/common.py):
      – Removes the get_cmd_with_wrapper method from the Model class.
      – Adds the get_cmd_with_wrapper function to the common module.
  • Adds a man page for the ramalama-client command (docs/ramalama-client.1.md):
      – Creates a new man page, docs/ramalama-client.1.md, with a description, synopsis, options, examples, and a see-also section.


@ericcurtin (Member Author) commented on .gitignore:

    *.volume
    __pycache__/
    .aider*

I actually used this aider tool to rewrite a lot of the boilerplate associated with introducing a new command.

@sourcery-ai sourcery-ai bot (Contributor) left a comment

Hey @ericcurtin - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Consider using shlex.join to construct the command line in client_cli for better handling of arguments with spaces.
  • It would be helpful to add a brief description of the HOST argument in the client_parser function.
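The shlex.join suggestion from the first bullet, for reference: it quotes any argument containing spaces or shell metacharacters, so the resulting string round-trips safely through a shell:

```python
import shlex

# Arguments with spaces are single-quoted; plain tokens are left as-is.
args = ["ramalama-client-core", "http://localhost:8080", "tell me a joke"]
print(shlex.join(args))
# ramalama-client-core http://localhost:8080 'tell me a joke'
```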
Here's what I looked at during the review
  • 🟡 General issues: 1 issue found
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


@rhatdan (Member) commented Apr 8, 2025

What kind of commands is the user going to send to the server? Is this replacing ramalama run?

@ericcurtin (Member Author) replied:

> What kind of commands is the user going to send to the server? Is this replacing ramalama run?

"ramalama run" runs both the client and the server and terminates both on completion. This runs only the client, which is useful for testing servers.

@rhatdan (Member) commented Apr 8, 2025

Might be useful for testing, but difficult for us to support. Perhaps make it hidden for now, or document that it is only for experimenting at this point.

@ericcurtin (Member Author) commented Apr 8, 2025

Added "(experimental)". I could possibly leave it as an undocumented option; I would have to see if argparse allows that.
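For reference, argparse does allow undocumented options: passing argparse.SUPPRESS as the help value omits the argument from the help output entirely. A minimal sketch (the option names here are illustrative, not the PR's):

```python
import argparse

parser = argparse.ArgumentParser(prog="ramalama")
# A documented option appears in --help output.
parser.add_argument("--debug", action="store_true", help="enable debug output")
# A suppressed option still parses, but is hidden from --help.
parser.add_argument("--experimental-client", action="store_true", help=argparse.SUPPRESS)

help_text = parser.format_help()
print("--debug" in help_text)                # True
print("--experimental-client" in help_text)  # False
```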

@rhatdan rhatdan merged commit 81faa3e into main Apr 8, 2025
17 checks passed
@ericcurtin ericcurtin deleted the client branch April 9, 2025 00:22