Problems with Vertex AI credentials on Windows 11 Pro #2735
Comments
Hey @severfire! 👋 I'm here to help you with bug fixes, answer your questions, and guide you to become a contributor. Let's squash those bugs together!

To resolve the "default credentials not found" error when using Vertex AI credentials on Windows 11 Pro with Langflow 1.0.9 and Python 3.12, ensure that the `credentials` parameter points to your service account JSON file. Here is an example of how you can set the `credentials` parameter:

```python
params = {
    "credentials": "path/to/your/service_account.json",
    # other parameters
}
class_object = YourVertexAIClass  # Replace with your actual class
initialized_object = initialize_vertexai(class_object, params)
```

Make sure that the path to your service account JSON file is correct and accessible from your Windows 11 Pro environment. Additionally, ensure that the `ChatVertexAIComponent` class is configured to accept the credentials file:

```python
class ChatVertexAIComponent(LCModelComponent):
    display_name = "Vertex AI"
    description = "Generate text using Vertex AI LLMs."
    icon = "VertexAI"

    def build_config(self):
        return {
            "credentials": {
                "display_name": "Credentials",
                "field_type": "file",
                "file_types": [".json"],
                "file_path": None,
            },
            # other parameters
        }

    def build(
        self,
        input_value: Text,
        credentials: Optional[str],
        project: str,
        examples: Optional[List[BaseMessage]] = [],
        location: str = "us-central1",
        max_output_tokens: int = 128,
        model_name: str = "chat-bison",
        temperature: float = 0.0,
        top_k: int = 40,
        top_p: float = 0.95,
        verbose: bool = False,
        stream: bool = False,
        system_message: Optional[str] = None,
    ) -> Text:
        try:
            from langchain_google_vertexai import ChatVertexAI  # type: ignore
        except ImportError:
            raise ImportError(
                "To use the ChatVertexAI model, you need to install the langchain-google-vertexai package."
            )
        output = ChatVertexAI(
            credentials=credentials,
            examples=examples,
            location=location,
            max_output_tokens=max_output_tokens,
            model_name=model_name,
            project=project,
            temperature=temperature,
            top_k=top_k,
            top_p=top_p,
            verbose=verbose,
        )
        return self.get_chat_result(output, stream, input_value, system_message)
```

Ensure that the `credentials` field points to a valid service account JSON file.
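As a quick sanity check, the sketch below (the path is a placeholder, not a file from this issue) confirms from Python that the key file actually exists at the path you give Langflow:

```python
from pathlib import Path

# Placeholder path -- replace with the actual location of your key file.
key_path = Path(r"C:\path\to\your\service_account.json")

# Should print True; if it prints False, Langflow cannot read the file either.
print(key_path.exists())
```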
@dosu Where should I edit this code you gave? Where are those files, or where does the code reside in the app?
Dear @dosu, I think you're wrong; the file looks like it should not be edited in this manner. Also, I am not looking for a hack but for a more elegant solution.
To resolve the issue with Vertex AI credentials in Langflow, you should set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable to the path of your service account JSON file (on Windows, for example, via System Properties → Environment Variables or with `setx` in a command prompt). By setting this variable, the Google client libraries can locate your credentials automatically.

Additionally, Langflow provides a way to configure credentials directly within the application. You can set the `credentials` parameter, which the `initialize_vertexai` function converts into a credentials object:

```python
def initialize_vertexai(class_object, params):
    if credentials_path := params.get("credentials"):
        from google.oauth2 import service_account  # type: ignore

        credentials_object = service_account.Credentials.from_service_account_file(filename=credentials_path)
        params["credentials"] = credentials_object
    return class_object(**params)
```

You can use this function to initialize your Vertex AI component with the credentials directly:

```python
from langflow.interface.initialize.llm import initialize_vertexai
from langflow.components.models.VertexAiModel import ChatVertexAIComponent

params = {
    "input_value": "Your input text here",
    "credentials": "C:\\path\\to\\your\\service-account-file.json",
    "project": "your-project-id",
    "location": "us-central1",
    "max_output_tokens": 128,
    "model_name": "chat-bison",
    "temperature": 0.0,
    "top_k": 40,
    "top_p": 0.95,
    "verbose": False,
    "stream": False,
    "system_message": None,
}

vertex_ai_component = initialize_vertexai(ChatVertexAIComponent, params)
```

This approach allows you to specify the credentials directly in your code, providing an alternative to setting the environment variable. [1][2]
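If the error persists, it can help to confirm that the key file itself is valid before involving Langflow at all. A minimal sketch (the path is a placeholder; `google-auth` is assumed to be installed, as it ships with the Vertex AI dependencies):

```python
from google.oauth2 import service_account

# Placeholder path -- replace with your actual key file.
credentials = service_account.Credentials.from_service_account_file(
    r"C:\path\to\your\service-account-file.json"
)

# If the file is valid, this prints the service account's email address;
# otherwise from_service_account_file raises an error explaining what's wrong.
print(credentials.service_account_email)
```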
Dear @dosu, I got an 'Error Building Component' error; the fix with the variable did not work. The full error reports: "503 Getting metadata from plugin failed with error: 'str' object has no attribute 'before_request'".
To resolve the error "503 Getting metadata from plugin failed with error: 'str' object has no attribute 'before_request'" when using Vertex AI credentials on Windows 11 Pro with Langflow 1.0.9 and Python 3.12, you need to ensure that the model receives a credentials object rather than a plain string path. The `initialize_vertexai` function performs this conversion. Here is the relevant part:

```python
def initialize_vertexai(class_object, params):
    if credentials_path := params.get("credentials"):
        from google.oauth2 import service_account  # type: ignore

        credentials_object = service_account.Credentials.from_service_account_file(filename=credentials_path)
        params["credentials"] = credentials_object
    return class_object(**params)
```

Make sure that the credentials file path you pass in is valid so it can be converted, for example:

```python
vertex_ai_component = VertexAIComponent()
llm = vertex_ai_component.build(
    credentials="path/to/your/credentials.json",
    location="us-central1",
    max_output_tokens=128,
    max_retries=6,
    metadata={},
    model_name="text-bison",
    n=1,
    name=None,
    project="your-project-id",
    request_parallelism=5,
    streaming=False,
    temperature=0.0,
    top_k=40,
    top_p=0.95,
    tuned_model_name=None,
    verbose=False,
)
```

Ensure that the path to the credentials file is correct and accessible from your environment. The `initialize_vertexai` function will then convert the path into a credentials object before the model is built.
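For reference, the "'str' object has no attribute 'before_request'" failure typically means a raw path string reached the Google client. Below is a minimal standalone sketch, not taken from Langflow's code: it assumes the `langchain-google-vertexai` and `google-auth` packages are installed, and the path, project ID, and prompt are placeholders.

```python
from google.oauth2 import service_account
from langchain_google_vertexai import ChatVertexAI

# Placeholder path -- replace with your actual key file.
credentials = service_account.Credentials.from_service_account_file(
    r"C:\path\to\your\service-account-file.json"
)

# Passing a Credentials object (not a string) is what avoids the
# "'str' object has no attribute 'before_request'" error.
llm = ChatVertexAI(
    credentials=credentials,
    project="your-project-id",
    location="us-central1",
    model_name="chat-bison",
)
print(llm.invoke("Hello from Windows 11").content)
```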
Regarding the earlier post: where should I put it? How can I use it in the Python code snippet of a Vertex AI node?
To use the provided code snippet in the Langflow application, you should place it in the file that defines the `ChatVertexAIComponent`, i.e. the `VertexAiModel` module under `langflow/components/models/` (matching the import `langflow.components.models.VertexAiModel` used above). Here is the complete code snippet for the `ChatVertexAIComponent` class:

```python
from typing import List, Optional

from langchain_core.messages.base import BaseMessage

from langflow.base.constants import STREAM_INFO_TEXT
from langflow.base.models.model import LCModelComponent
from langflow.field_typing import Text


class ChatVertexAIComponent(LCModelComponent):
    display_name = "Vertex AI"
    description = "Generate text using Vertex AI LLMs."
    icon = "VertexAI"

    field_order = [
        "credentials",
        "project",
        "examples",
        "location",
        "max_output_tokens",
        "model_name",
        "temperature",
        "top_k",
        "top_p",
        "verbose",
        "input_value",
        "system_message",
        "stream",
    ]

    def build_config(self):
        return {
            "credentials": {
                "display_name": "Credentials",
                "field_type": "file",
                "file_types": [".json"],
                "file_path": None,
            },
            "examples": {
                "display_name": "Examples",
                "multiline": True,
            },
            "location": {
                "display_name": "Location",
                "value": "us-central1",
            },
            "max_output_tokens": {
                "display_name": "Max Output Tokens",
                "value": 128,
                "advanced": True,
            },
            "model_name": {
                "display_name": "Model Name",
                "value": "chat-bison",
            },
            "project": {
                "display_name": "Project",
            },
            "temperature": {
                "display_name": "Temperature",
                "value": 0.0,
            },
            "top_k": {
                "display_name": "Top K",
                "value": 40,
                "advanced": True,
            },
            "top_p": {
                "display_name": "Top P",
                "value": 0.95,
                "advanced": True,
            },
            "verbose": {
                "display_name": "Verbose",
                "value": False,
                "advanced": True,
            },
            "input_value": {"display_name": "Input"},
            "stream": {
                "display_name": "Stream",
                "info": STREAM_INFO_TEXT,
                "advanced": True,
            },
            "system_message": {
                "display_name": "System Message",
                "info": "System message to pass to the model.",
                "advanced": True,
            },
        }

    def build(
        self,
        input_value: Text,
        credentials: Optional[str],
        project: str,
        examples: Optional[List[BaseMessage]] = [],
        location: str = "us-central1",
        max_output_tokens: int = 128,
        model_name: str = "chat-bison",
        temperature: float = 0.0,
        top_k: int = 40,
        top_p: float = 0.95,
        verbose: bool = False,
        stream: bool = False,
        system_message: Optional[str] = None,
    ) -> Text:
        try:
            from langchain_google_vertexai import ChatVertexAI  # type: ignore
        except ImportError:
            raise ImportError(
                "To use the ChatVertexAI model, you need to install the langchain-google-vertexai package."
            )
        output = ChatVertexAI(
            credentials=credentials,
            examples=examples,
            location=location,
            max_output_tokens=max_output_tokens,
            model_name=model_name,
            project=project,
            temperature=temperature,
            top_k=top_k,
            top_p=top_p,
            verbose=verbose,
        )
        return self.get_chat_result(output, stream, input_value, system_message)
```

To use this component in the Python code snippet of a Vertex AI node, you can instantiate and configure the `ChatVertexAIComponent` directly:

```python
vertex_ai_component = ChatVertexAIComponent()
config = vertex_ai_component.build_config()

vertex_ai_instance = vertex_ai_component.build(
    input_value="Your input text here",
    credentials="C:\\path\\to\\your\\service-account-file.json",
    project="your-project-id",
    location="us-central1",
    max_output_tokens=128,
    model_name="chat-bison",
    temperature=0.0,
    top_k=40,
    top_p=0.95,
    verbose=False,
    stream=False,
    system_message="Your system message here",
)
```

This code snippet demonstrates how to configure and instantiate the `ChatVertexAIComponent` with your credentials and parameters.
Could you try 1.0.12?
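A quick way to check which Langflow version is actually installed in the active environment, before and after upgrading (a minimal sketch using only the standard library):

```python
from importlib.metadata import version

# Prints the installed Langflow version (the bug was reported against 1.0.9
# and went away after upgrading).
print(version("langflow"))
```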
Thanks! I hope to test tomorrow!
Worked out, thanks! |
@severfire @nicoloboschi I created a custom component to use the Anthropic models on VertexAI: #2863 (comment) |
Bug Description
Hello,
I have a strange error. I am using Windows 11 Pro, and while trying to use a credentials file with Vertex AI, I always get
'Error Building Component
Error building Component Vertex AI: Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.'
The credentials have been tested with Flowise and other apps, and they work. Maybe something is up with Windows, this app, and paths? Maybe there should be something like a credentials storage system, like in Flowise?
Thanks for help!
a.
Reproduction
Use the Basic prompt template. Replace OpenAI with the Vertex AI model. Put in the file with credentials.
Expected behavior
Error Building Component
Error building Component Vertex AI: Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.
Who can help?
No response
Operating System
Windows 11 pro
Langflow Version
1.0.9
Python Version
3.12