
Conversation

@kasanatte kasanatte commented Aug 7, 2025

What type of PR is this?
/kind feature

What this PR does / why we need it:
This integration provides users with out-of-the-box large language model capabilities through a ChatUI, significantly improving cluster management efficiency.

Which issue(s) this PR fixes:
Fixes #234

Special notes for your reviewer:

Does this PR introduce a user-facing change?:

Adds AI-powered assistant integration with support for local MCP server, basic chatbot UI, and natural language interaction for Karmada cluster operations.

This PR currently accomplishes the following:

  • MCP Integration (Local Only): Integrated karmada-mcp-server as an external local service via environment variables.
  • Chatbot Example Page: Added a demo page showcasing the assistant/chatbot interface.
  • Local Development Only: The current setup only supports local development.

Upcoming Plans:

  • Container Deployment: Support deployment via container or standalone binary as a new service.
  • Chat UI Enhancements.
  • Multi-Model Provider Support: Add support for additional model providers like DeepSeek, Qwen, and Gemini.
  • Karmada Website Embedding for RAG.
  • Multi-Cluster Diagnosis: Still needs more discussion; related issue: Ability to view ResourceBindings on Karmada Dashboard #227.

@karmada-bot karmada-bot added do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. kind/feature Categorizes issue or PR as related to a new feature. labels Aug 7, 2025
@karmada-bot
Collaborator

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign samzong for approval. For more information see the Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@karmada-bot karmada-bot added the size/XXL Denotes a PR that changes 1000+ lines, ignoring generated files. label Aug 7, 2025

@gemini-code-assist gemini-code-assist bot left a comment


Summary of Changes

Hello @kasanatte, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a significant new feature to the Karmada Dashboard: an AI-powered assistant. This assistant is designed to enhance cluster management efficiency by providing natural language interaction capabilities. It integrates with a new Model Context Protocol (MCP) server, allowing it to perform real-time queries and operations on Karmada clusters through function calls, alongside a basic chatbot UI for user interaction.

Highlights

  • AI Assistant Integration: Adds a new AI-powered assistant feature to the Karmada Dashboard, enabling natural language interaction for cluster management.
  • Model Context Protocol (MCP) Support: Integrates with a local karmada-mcp-server to provide advanced capabilities, allowing the AI model to execute tools and retrieve real-time cluster information.
  • Chatbot User Interface: Introduces a new AssistantPage in the UI, providing a basic chat interface where users can interact with the AI assistant, including a toggle to enable/disable MCP integration.
  • Backend API Endpoints: Implements new /assistant, /chat, and /chat/tools API endpoints in the Go backend to handle chat requests, stream responses, and manage MCP tool interactions.
  • Tool Calling Mechanism: The backend now supports OpenAI's function calling feature, allowing the AI model to determine when to call specific MCP tools, execute them, and then incorporate the results into its natural language response.
  • Environment Variable Configuration: The integration is configurable via environment variables for OpenAI API key, model, MCP server path, transport mode (stdio/SSE), and Karmada kubeconfig details.
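
As a rough illustration of the environment-variable configuration described in the last bullet, the backend could load its settings roughly as follows. This is only a sketch: the variable names and the default transport below are assumptions for the example, not necessarily the ones used in the PR.

package config

import (
	"fmt"
	"os"
)

// AssistantConfig mirrors the kind of settings listed above.
// All environment variable names here are illustrative.
type AssistantConfig struct {
	OpenAIAPIKey   string // API key for the model provider
	OpenAIModel    string // model name used by the assistant
	MCPServerPath  string // path to the karmada-mcp-server binary
	MCPTransport   string // "stdio" or "sse"
	KubeconfigPath string // kubeconfig handed to the MCP server
	KarmadaContext string // Karmada context handed to the MCP server
}

// LoadFromEnv reads the assistant settings from environment variables
// and fails fast when a required value is missing.
func LoadFromEnv() (*AssistantConfig, error) {
	cfg := &AssistantConfig{
		OpenAIAPIKey:   os.Getenv("OPENAI_API_KEY"),
		OpenAIModel:    os.Getenv("OPENAI_MODEL"),
		MCPServerPath:  os.Getenv("MCP_SERVER_PATH"),
		MCPTransport:   os.Getenv("MCP_TRANSPORT"),
		KubeconfigPath: os.Getenv("KARMADA_KUBECONFIG"),
		KarmadaContext: os.Getenv("KARMADA_CONTEXT"),
	}
	if cfg.MCPTransport == "" {
		cfg.MCPTransport = "stdio" // assumed default for local development
	}
	if cfg.OpenAIAPIKey == "" {
		return nil, fmt.Errorf("OPENAI_API_KEY is required")
	}
	if cfg.MCPServerPath == "" {
		return nil, fmt.Errorf("MCP_SERVER_PATH is required")
	}
	return cfg, nil
}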


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces an AI-powered assistant to the Karmada dashboard, integrating MCP for enhanced cluster management. The review focuses on improving context handling in the Go backend to prevent potential resource leaks, as well as enhancing documentation clarity and frontend type safety for better maintainability.

Comment on lines 282 to 335
func (c *MCPClient) initializeStdioClient() error {
	klog.Infof("Initializing MCP stdio client with server: %s", c.config.ServerPath)

	// Create context with timeout for initialization
	ctx, cancel := context.WithTimeout(context.Background(), c.config.ConnectTimeout)
	c.ctx = ctx
	c.cancel = cancel

	// Create stdio transport with proper environment and args
	stdioTransport := transport.NewStdio(
		c.config.ServerPath,
		nil,
		"stdio",
		"--karmada-kubeconfig="+c.config.KubeconfigPath,
		"--karmada-context="+c.config.KarmadaContext,
	)

	// Create client with the transport
	mcpClient := client.NewClient(stdioTransport)

	// Start the client
	if err := mcpClient.Start(ctx); err != nil {
		cancel()
		return fmt.Errorf("failed to start MCP client: %w", err)
	}

	c.client = mcpClient
	klog.Infof("MCP stdio client started successfully")

	// Initialize the client with proper handshake
	initRequest := mcp.InitializeRequest{}
	initRequest.Params.ProtocolVersion = mcp.LATEST_PROTOCOL_VERSION
	initRequest.Params.ClientInfo = mcp.Implementation{
		Name:    "Karmada-Dashboard-MCP-Client",
		Version: "0.0.0-dev",
	}
	initRequest.Params.Capabilities = mcp.ClientCapabilities{}

	serverInfo, err := c.client.Initialize(ctx, initRequest)
	if err != nil {
		cancel()
		return fmt.Errorf("failed to initialize MCP client: %w", err)
	}

	// Store server info for later use
	c.serverInfo = serverInfo

	klog.Infof("Connected to MCP server: %s (version %s)",
		serverInfo.ServerInfo.Name, serverInfo.ServerInfo.Version)

	klog.Infof("MCP stdio client connection established successfully")
	return nil
}


high

The context management in initializeStdioClient is incorrect and will lead to issues. You are creating a context with a timeout and storing it as the main context for the client (c.ctx). This context will expire after c.config.ConnectTimeout, causing any subsequent operations using it (like the mcpClient.Start if it's long-running) to fail or misbehave.

The Initialize call also uses this same timeout context, but it should have its own short-lived context for the handshake.

Please follow the pattern used in initializeSSEClient, which correctly separates the long-lived client context from the short-lived initialization context.

func (c *MCPClient) initializeStdioClient() error {
	klog.Infof("Initializing MCP stdio client with server: %s", c.config.ServerPath)

	// Create a long-lived context for the client's lifecycle
	c.ctx, c.cancel = context.WithCancel(context.Background())

	// Create stdio transport with proper environment and args
	stdioTransport := transport.NewStdio(
		c.config.ServerPath,
		nil,
		"stdio",
		"--karmada-kubeconfig="+c.config.KubeconfigPath,
		"--karmada-context="+c.config.KarmadaContext,
	)

	// Create client with the transport
	mcpClient := client.NewClient(stdioTransport)

	// Start the client with the long-lived context
	if err := mcpClient.Start(c.ctx); err != nil {
		c.cancel()
		return fmt.Errorf("failed to start MCP client: %w", err)
	}

	c.client = mcpClient
	klog.Infof("MCP stdio client started successfully")

	// Initialize the client with a separate, short-lived context for the handshake
	initCtx, initCancel := context.WithTimeout(context.Background(), c.config.ConnectTimeout)
	defer initCancel()

	initRequest := mcp.InitializeRequest{}
	initRequest.Params.ProtocolVersion = mcp.LATEST_PROTOCOL_VERSION
	initRequest.Params.ClientInfo = mcp.Implementation{
		Name:    "Karmada-Dashboard-MCP-Client",
		Version: "0.0.0-dev",
	}
	initRequest.Params.Capabilities = mcp.ClientCapabilities{}

	serverInfo, err := c.client.Initialize(initCtx, initRequest)
	if err != nil {
		c.cancel() // Cancel the main context if handshake fails
		return fmt.Errorf("failed to initialize MCP client: %w", err)
	}

	// Store server info for later use
	c.serverInfo = serverInfo

	klog.Infof("Connected to MCP server: %s (version %s)",
		serverInfo.ServerInfo.Name, serverInfo.ServerInfo.Version)

	klog.Infof("MCP stdio client connection established successfully")
	return nil
}

warjiang commented Aug 7, 2025

Good job @kasanatte, I'll check it later ~

warjiang commented Aug 7, 2025

/assigne

Co-authored-by: warjiang <[email protected]>
Signed-off-by: kasanatte <[email protected]>
@RainbowMango
Member

feat(mcp-chatui): 优化聊天组件样式和功能 ("optimize chat component styling and functionality")

Just a kind reminder, we don't usually use Chinese in commit messages.

warjiang commented Sep 3, 2025

/assign

@warjiang
Contributor

@kasanatte I think we can split this PR into multiple smaller PRs; one possible way is one PR for the backend API and another for the web-UI implementation.

@@ -0,0 +1,665 @@
/*
Copyright 2024 The Karmada Authors.
Contributor

mcp_client looks more like a common module; could we separate mcp_client into a standalone pkg like mcp? Any ideas?

Contributor

Moving mcp_client into a standalone pkg would separate the implementation from the complex business logic and make it easier to write tests.
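
For example, once the package stands alone, a unit test for the config loader becomes a one-file affair. The sketch below assumes the pkg is named mcp and that loadMCPConfig (shown in the next hunk) returns an error when required settings are missing; the environment variable name is hypothetical.

package mcp

import "testing"

// TestLoadMCPConfigMissingValues sketches the kind of test that is easy to
// write once mcp_client lives in its own pkg. The env var name is illustrative.
func TestLoadMCPConfigMissingValues(t *testing.T) {
	// t.Setenv restores the previous value when the test finishes.
	t.Setenv("MCP_SERVER_PATH", "")

	if _, err := loadMCPConfig(); err == nil {
		t.Fatalf("expected an error when required MCP settings are missing")
	}
}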

}

// loadMCPConfig loads configuration from environment variables and validates them.
func loadMCPConfig() (*MCPConfig, error) {
Contributor

I'd prefer not to use environment variables first; although they are more convenient, we would end up with multiple sources of input options.

In the karmada-dashboard api module, we already have an Options struct to carry module-scoped config:

// Options contains everything necessary to create and run api.
type Options struct {
	BindAddress                   net.IP
	Port                          int
	InsecureBindAddress           net.IP
	InsecurePort                  int
	KubeConfig                    string
	KubeContext                   string
	SkipKubeApiserverTLSVerify    bool
	KarmadaKubeConfig             string
	KarmadaContext                string
	SkipKarmadaApiserverTLSVerify bool
	Namespace                     string
	DisableCSRFProtection         bool
	OpenAPIEnabled                bool
}

You can just move the MCP config into the Options struct and register the corresponding flags (see the sketch below).
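
For instance, a minimal sketch of what that could look like; the field names, flag names, and the spf13/pflag-based registration below are illustrative assumptions, not taken from the dashboard code:

package options

import "github.com/spf13/pflag"

// AssistantOptions is an illustrative extension; in practice these fields
// would live on the existing api module Options struct.
type AssistantOptions struct {
	OpenAIAPIKey      string
	OpenAIModel       string
	MCPServerPath     string
	MCPTransport      string
	MCPKubeconfig     string
	MCPKarmadaContext string
}

// AddFlags registers the assistant flags on the given FlagSet.
func (o *AssistantOptions) AddFlags(fs *pflag.FlagSet) {
	fs.StringVar(&o.OpenAIAPIKey, "openai-api-key", "", "API key for the model provider.")
	fs.StringVar(&o.OpenAIModel, "openai-model", "", "Model name used by the assistant.")
	fs.StringVar(&o.MCPServerPath, "mcp-server-path", "", "Path to the karmada-mcp-server binary.")
	fs.StringVar(&o.MCPTransport, "mcp-transport", "stdio", "MCP transport mode: stdio or sse.")
	fs.StringVar(&o.MCPKubeconfig, "mcp-karmada-kubeconfig", "", "Kubeconfig used by the MCP server.")
	fs.StringVar(&o.MCPKarmadaContext, "mcp-karmada-context", "", "Karmada context used by the MCP server.")
}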


// GetMCPClient returns a singleton MCP client instance
func GetMCPClient() (*MCPClient, error) {
	mcpClientMutex.Lock()
Contributor

👍


// GetTools returns the available MCP tools.
func (c *MCPClient) GetTools() []MCPTool {
	c.mu.RLock()
Contributor

Why not use defer? It would make this simpler ~
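
For reference, a sketch of the defer-based version; the c.tools field name and the early-return behavior are assumed for illustration:

// GetTools returns the available MCP tools.
func (c *MCPClient) GetTools() []MCPTool {
	c.mu.RLock()
	defer c.mu.RUnlock() // unlocks on every return path, including early returns

	if c.closed {
		return nil
	}
	return c.tools
}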

	if c.closed {
		c.mu.RUnlock()
		return "", errors.New("MCP client is closed")
	}
Contributor

ditto

width: calc(100vw - 20px);
right: -10px;
}
} No newline at end of file
Contributor

Format problem: missing newline at end of file ~

Successfully merging this pull request may close these issues: [OSPP-2025] Integrate karmada-mcp-server into dashboard