
fix pa fp8 on mi35x#1173

Closed
fsx950223 wants to merge 3 commits into main from pa_mi35x

Conversation

@fsx950223
Contributor

Motivation

Technical Details

Test Plan

Test Result

Submission Checklist

Copilot AI review requested due to automatic review settings October 13, 2025 04:06
Contributor

Copilot AI left a comment


Pull Request Overview

This PR fixes support for FP8 data types on MI35x hardware by adding the torch.float8_e4m3fn data type mapping to the paged attention ROCm implementation.

  • Adds support for torch.float8_e4m3fn data type in the type mapping dictionary
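The change above can be sketched as follows. This is a hypothetical stand-in for the mapping in the paged attention ROCm implementation: string keys are used in place of the actual torch dtype objects so the sketch is self-contained, and the dictionary name is an assumption, not the real identifier in the repository.

```python
# Hypothetical sketch of the dtype -> HIP C type mapping this PR extends.
# String keys stand in for torch dtype objects (e.g. torch.bfloat16).
DTYPE_TO_HIP = {
    "torch.bfloat16": "__hip_bfloat16",
    "torch.float16": "_Float16",
    "torch.float8_e4m3fnuz": "uint8_t",  # FP8 variant already supported
    "torch.float8_e4m3fn": "uint8_t",    # OCP FP8 variant added for MI35x
}

# FP8 values are passed to the kernel as raw bytes, so both FP8
# variants map to uint8_t on the C side.
print(DTYPE_TO_HIP["torch.float8_e4m3fn"])
```

Both FP8 dtypes map to `uint8_t` because the kernel receives the 8-bit payload unchanged and interprets the FP8 encoding itself.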


torch.bfloat16: "__hip_bfloat16",
torch.float16: "_Float16",
torch.float8_e4m3fnuz: "uint8_t",
torch.float8_e4m3fn: "uint8_t"

Copilot AI Oct 13, 2025


Missing trailing comma after the last dictionary entry. While this is valid Python, adding the comma keeps future diffs minimal and avoids syntax errors when additional entries are appended later.

Suggested change:
- torch.float8_e4m3fn: "uint8_t"
+ torch.float8_e4m3fn: "uint8_t",

@valarLip
Collaborator

Fixed in #1159.

valarLip closed this Oct 13, 2025

3 participants