
fix: vertexai authentication via service account #2863

Merged — 3 commits merged into langflow-ai:main on Jul 22, 2024

Conversation

nicoloboschi
Contributor

Currently, with the VertexAI embeddings and LLM components you can upload the JSON key file, but the auth is not properly initialized, which makes those components unusable without setting the global environment variable.
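
For context, the general shape of the fix is to build credentials from the uploaded key file and pass them to the VertexAI classes explicitly, instead of relying on GOOGLE_APPLICATION_CREDENTIALS. A minimal sketch (not the actual diff from this PR; the model names and key path are placeholders):

from google.oauth2 import service_account
from langchain_google_vertexai import ChatVertexAI, VertexAIEmbeddings

# Build credentials from the uploaded JSON key instead of the
# GOOGLE_APPLICATION_CREDENTIALS environment variable.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

llm = ChatVertexAI(
    model_name="gemini-1.5-pro",  # placeholder model name
    project=credentials.project_id,
    credentials=credentials,
)
embeddings = VertexAIEmbeddings(
    model_name="textembedding-gecko",  # placeholder model name
    project=credentials.project_id,
    credentials=credentials,
)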

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. bug Something isn't working labels Jul 22, 2024

This pull request is automatically being deployed by Amplify Hosting.

Access this pull request here: https://pr-2863.dmtpw4p5recq1.amplifyapp.com

Contributor

@ogabrielluiz ogabrielluiz left a comment


LGTM

@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Jul 22, 2024
@ogabrielluiz ogabrielluiz merged commit ff592d7 into langflow-ai:main Jul 22, 2024
11 checks passed
anovazzi1 added a commit that referenced this pull request Jul 22, 2024
* fix: vertexai authentication via service account

* [autofix.ci] apply automated fixes

* fix: remove debugging print

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: anovazzi1 <[email protected]>

rodgermoore commented Jul 25, 2024

Great, I was waiting for this one!

🥇

Edit:
Hmm, I was hoping to use the Anthropic Claude 3.5 model via VertexAI, but I can't get it to work. Did you @nicoloboschi or @ogabrielluiz manage to use the Anthropic models via VertexAI? Ref: https://console.cloud.google.com/vertex-ai/publishers/anthropic/model-garden/claude-3-5-sonnet

@rodgermoore

@ogabrielluiz @nicoloboschi I created a custom component that enables Anthropic models on Vertex AI. Maybe you can add it to the regular VertexAI component or turn it into a new model component? Here is the code:

from langflow.custom import Component
from langflow.inputs import FileInput, FloatInput, IntInput, MessageTextInput, StrInput
from langflow.template import Output
from langflow.schema.message import Message
from anthropic import AnthropicVertex
from google.oauth2 import service_account
from google.auth.transport.requests import Request

class AnthropicVertexComponent(Component):
    display_name = "Anthropic Vertex Component"
    description = "A component that interacts with the Anthropic Vertex API using Google Cloud authentication."
    icon = "VertexAI"
    name = "AnthropicVertexComponent"

    inputs = [
        MessageTextInput(
            name="message",
            display_name="Message",
            info="The message to send to the model.",
            value="Describe the number 42 in a few words."
        ),
        FileInput(
            name="service_account_key",
            display_name="Service Account Key File",
            info="JSON service account credentials file.",
            file_types=["json"],
        ),
        StrInput(
            name="model",
            display_name="Model",
            info="The Anthropic model to use.",
            value="claude-3-5-sonnet@20240620"
        ),        
        StrInput(
            name="region",
            display_name="Region",
            info="The Google Cloud region to use.",
            value="europe-west1"
        ),
        StrInput(
            name="project_id",
            display_name="Project ID",
            info="Your Google Cloud Project ID.",
            value="Project ID is mandatory"
        ),
        StrInput(
            name="system_message",
            display_name="System message",
            info="The system message to send to the model.",
            value="You are a helpful AI assistant.",
        ),
        IntInput(
            name="max_tokens",
            display_name="Max Tokens",
            info="The maximum number of tokens to generate.",
            value=256,
        ),
        FloatInput(
            name="temperature", 
            value=0.1, 
            display_name="Temperature",
            info="Controls randomness in the output. Lower values make the output more deterministic."
        ),
        FloatInput(
            name="top_p", 
            display_name="Top P", 
            value=0.95, 
            advanced=True,
            info="Controls diversity of the output. Lower values make the output more focused."
        ),
    ]

    outputs = [
        Output(display_name="Response", name="response", method="get_response"),
    ]

    def get_response(self) -> Message:
        try:
            # Load the service account key with the required scope
            credentials = service_account.Credentials.from_service_account_file(
                self.service_account_key,
                scopes=['https://www.googleapis.com/auth/cloud-platform']
            )

            # Ensure the credentials are valid and refreshed
            if credentials.expired:
                credentials.refresh(Request())

            # Initialize the AnthropicVertex client
            client = AnthropicVertex(
                project_id=self.project_id,
                region=self.region,
                credentials=credentials
            )

            # Create the message
            message = client.messages.create(
                model=self.model,
                max_tokens=self.max_tokens,
                temperature=self.temperature,
                top_p=self.top_p,
                system=self.system_message,
                messages=[
                    {
                        "role": "user",
                        "content": self.message,
                    }
                ],
            )
            
            # Extract the text content from the message object
            response_text = message.content[0].text if message.content else "No response generated"
            # response_text = message.model_dump_json(indent=2)
            self.status = "Message processed successfully"
            return Message(text=response_text)
        
        except Exception as e:
            error_message = f"Error processing message: {str(e)}"
            self.status = error_message
            return Message(text=error_message)

Langflow should already ship with all of these imports; I didn't need to install any extra requirements.
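
As a quick sanity check outside Langflow, the same auth path can be exercised directly with the calls used in the component above (the key path, region, and model values are placeholders):

from anthropic import AnthropicVertex
from google.oauth2 import service_account

# Load the service account key with the cloud-platform scope.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

# Point the Anthropic Vertex client at your project and region.
client = AnthropicVertex(
    project_id=credentials.project_id,
    region="europe-west1",  # placeholder region
    credentials=credentials,
)

message = client.messages.create(
    model="claude-3-5-sonnet@20240620",  # placeholder model version
    max_tokens=64,
    messages=[{"role": "user", "content": "Say hello."}],
)
print(message.content[0].text)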

nicoloboschi added a commit to datastax/ragstack-ai-langflow that referenced this pull request Jul 30, 2024
* fix: vertexai authentication via service account

* [autofix.ci] apply automated fixes

* fix: remove debugging print

---------

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: anovazzi1 <[email protected]>
(cherry picked from commit ff592d7)