
Conversation

@drewpayment
Contributor

Added a new AI provider and enabled Google Gemini Pro support. Only the Gemini Pro 1.0 model is included for now, but additional models can simply be added to the models list if this is adopted (see the sketch below).

Added this issue: #308
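For context, a minimal sketch of what "adding to the models list" could look like; the constant and helper names here are illustrative assumptions, not the PR's actual code:

```ts
// Illustrative sketch only (names are assumptions, not the PR's actual code):
// supporting more Gemini models would mostly mean extending this list and the
// config validation that reads from it.
export const SUPPORTED_GEMINI_MODELS = [
  'gemini-pro' // Gemini Pro 1.0, the only model wired up in this PR
  // future entries, e.g. 'gemini-1.5-pro-latest', would go here
];

export const isSupportedGeminiModel = (model: string): boolean =>
  SUPPORTED_GEMINI_MODELS.includes(model);
```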

@mitchross

Consider adding a small README note on how to generate a Gemini API key.

@drewpayment
Contributor Author

> Consider adding a small README note on how to generate a Gemini API key.

Added notes to README.md.

@di-sukharev
Owner

@drewpayment great job, could you run npm run build to fix the /out conflicts and look into other conflicts pls

@drewpayment
Contributor Author

Will do. I need to make a couple of modifications to the prompt too, because Gemini is hallucinating once in a while.

@di-sukharev
Owner

@drewpayment let me know when this is ready <3

@drewpayment
Contributor Author

I worked on it a little last night, but I'm a bit perplexed: it keeps using the initial default prompt and writing commit details based on it. I'm trying to find the best way to undo that without impacting anything else.

@matt-degraffenreid
Contributor

matt-degraffenreid commented Mar 20, 2024

Hey @drewpayment,

I actually work on Google's Agent Assist, so I have some insight into your issue. That's happening because you need to give the model context, examples, and a message.

[Four screenshots of the prompt structure, showing the context, example input/output pair, and message fields]

The implementation I used is also different: I had to use @google-cloud/aiplatform and application credentials for auth. My result was this:

[Screenshot of the resulting commit message output]

Try using this as your prompt and see if the issue resolves itself. If it does, structure the prompt like this going forward.

The freeform paragraph is the context, the first pair of input/output is the example, the second one is the message.

"""You are to act as the author of a commit message in git. Your mission is to create clean and comprehensive commit messages as per the conventional commit convention and explain WHAT were the changes and mainly WHY the changes were done. I\'ll send you an output of \'git diff --staged\' command, and you are to convert it into a commit message.
  You should only include the changes that are relevant to the commit. Do not include any changes that are not relevant to the commit.
  Do not preface the commit with anything. Conventional commit keywords:fix, feat, build, chore, ci, docs, style, refactor, perf, test.  
    Don\'t add any descriptions to the commit, only commit message.

input: diff --git a/src/server.ts b/src/server.ts
index ad4db42..f3b18a9 100644
--- a/src/server.ts
+++ b/src/server.ts
@@ -10,7 +10,7 @@
import {
    initWinstonLogger();

    const app = express();
-const port = 7799;
+const PORT = 7799;

    app.use(express.json());

@@ -34,6 +34,6 @@
    app.use((_, res, next) => {
        // ROUTES
        app.use(PROTECTED_ROUTER_URL, protectedRouter);

-app.listen(port, () => {
-  console.log(`Server listening on port ${port}`);
+app.listen(process.env.PORT || PORT, () => {
+  console.log(`Server listening on port ${PORT}`);
});
output: fix(server.ts): change port variable case from lowercase port to uppercase PORT to improve semantics
feat(server.ts): add support for process.env.PORT environment variable to be able to run app on a configurable port

input: index b6490b6..d50c82d 100755
--- a/out/cli.cjs
+++ b/out/cli.cjs
@@ -6,8 +6,8 @@ var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
 var __getOwnPropNames = Object.getOwnPropertyNames;
 var __getProtoOf = Object.getPrototypeOf;
 var __hasOwnProp = Object.prototype.hasOwnProperty;
-var __commonJS = (cb, mod) => function __require() {
-  return mod || (0, cb[__getOwnPropNames(cb)[0]])((mod = { exports: {} }).exports, mod), mod.exports;
output:
"""

The code I wrote was quick and dirty; it works for me, but it's not in a state to open a PR. Let me know if this doesn't fix your issue and I'll take a day or so to clean it up and open a PR on your repo.

Edit: I only had to make changes in gemini.ts. Also, it looks like you're using pnpm; did you mean to merge the pnpm lockfile?

@drewpayment
Contributor Author

Thanks for the notes, @matt-degraffenreid! I haven't been able to work on cleaning this up yet this week, but I was hoping I could just make the prompt changes directly in gemini.ts. I did notice I mistakenly committed my pnpm-lock.yaml as well; I'm planning to clean that up along with the model fixes.

@di-sukharev
Owner

Let me know when I should take a look, merge, and test it.

drewpayment force-pushed the feat/add-gemini-support branch from eceb170 to 0ac7211 on May 1, 2024 18:45
@di-sukharev
Owner

@drewpayment please run npm run build so the /out dir is not conflicting, hope it helps :)

@drewpayment
Contributor Author

This is still on my list! I will try to get it cleaned up and ready tonight.

drewpayment and others added 10 commits May 4, 2024 14:21
✨ (utils/engine.ts): add support for Gemini engine

♻️ (openAi.ts & utils/engine.ts): add support for OCO_API_KEY env variable to configure apiKey

📝 (README.md): update documentation to reflect the new changes and configuration options

⬆️ (package.json): add @google/generative-ai dependency

🔧 (src/commands/config.ts): add configuration options for the Gemini engine and update validation logic to support both OpenAI and Gemini API keys

🔧 (src/commands/prepare-commit-msg-hook.ts, src/server.ts): update to use OCO_API_KEY instead of OCO_OPENAI_API_KEY

✨ (src/engine/gemini.ts): implement the Gemini engine for generating commit messages

✨ Add support for Gemini and Ollama AI engines

This commit introduces support for two new AI engine options: Gemini and Ollama. These engines can be used as alternatives to the existing OpenAI engine for generating commit messages.

Additionally, this commit includes several improvements and bug fixes:

- **(utils/engine.ts):** Adds support for Gemini and Ollama engines.
- **(openAi.ts & utils/engine.ts):** Adds support for `OCO_API_KEY` environment variable to configure the API key for all engines.
- **(generateCommitMessageFromGitDiff.ts):** Improves the logic for handling multiple commit messages by using `Promise.all`.
- **(engine/index.ts):** Exports all available AI engines for easier access.
- **(engine/ollama.ts):** Removes unnecessary export of `ollamaAi` instance.
- **(engine/openAi.ts):**  Makes the `OpenAi` class exportable and removes unnecessary exports.
…EY env variable

Adds support for the `OCO_OPENAI_API_KEY` environment variable as an alternative to `OCO_API_KEY` to configure the API key for OpenAI and Gemini AI providers. Updates the default Gemini model to `gemini-1.5-pro-latest` and includes it in the list of valid models. Also, improves the type safety and structure of messages and chat history in the Gemini engine. Additionally, enhances the handling of diffs and commit message generation, including splitting large diffs and merging file diffs to optimize token usage.
- Update TypeScript target and libraries in tsconfig.json to ensure compatibility.
- Remove unused TypeScript Node configuration and add missing dependencies for project setup.
…ev#304)

* 3.0.11

* build

* docs: update ollama usage readme (di-sukharev#301)

Signed-off-by: Albert Simon <albert.simon.sge@mango.com>
Co-authored-by: Albert Simon <albert.simon.sge@mango.com>

* 🚨 BREAKING CHANGES 🚨

- feat(engine/ollama): add support for local models and change prompt format to improve AI performance
+ fix(engine/ollama): fix issue with local model not responding correctly to requests

The commit message is now more concise, clear, and informative. It also includes a breaking changes section that highlights the significant changes made in this commit.

---------

Signed-off-by: Albert Simon <albert.simon.sge@mango.com>
Co-authored-by: di-sukharev <dim.sukharev@gmail.com>
Co-authored-by: Albert Simon <47634918+willyw0nka@users.noreply.github.com>
Co-authored-by: Albert Simon <albert.simon.sge@mango.com>
Co-authored-by: Константин Шуткин <shutkin-kn@mosmetro.ru>
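A rough sketch of the configuration behaviour the commits above describe (a generic OCO_API_KEY with OCO_OPENAI_API_KEY kept as a fallback, and engine selection between OpenAI and Gemini); the function and variable names are assumptions, not the PR's actual code:

```ts
// Sketch only (names are assumptions): resolve the API key and pick an engine
// the way the commit messages above describe.
type Engine = 'openai' | 'gemini';

// OCO_API_KEY is the new generic key; OCO_OPENAI_API_KEY stays as a fallback.
const resolveApiKey = (): string | undefined =>
  process.env.OCO_API_KEY ?? process.env.OCO_OPENAI_API_KEY;

// OpenAI remains the default engine unless the provider is explicitly set to Gemini.
const pickEngine = (provider?: string): Engine =>
  provider === 'gemini' ? 'gemini' : 'openai';

const engine = pickEngine(process.env.OCO_AI_PROVIDER);
const apiKey = resolveApiKey();
console.log(`engine=${engine}, api key configured=${Boolean(apiKey)}`);
```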
drewpayment marked this pull request as draft on May 4, 2024 19:19
@drewpayment
Contributor Author

Opened new PR #332

drewpayment closed this on May 4, 2024
drewpayment deleted the feat/add-gemini-support branch on June 9, 2024 03:43
