
Ensure gradient of tf.math.fidelity remains float32 when autographed. #596

Merged (1 commit) Jun 18, 2021

Conversation

MichaelBroughton (Collaborator) commented Jun 17, 2021

When calling:

    with tf.GradientTape() as tape:
        fid = some_fid_op(...)
    tape.gradient(fid, ...)

the fidelity op would backprop through to the inner_product op, whose gradients are complex64 (as they should be). However, this can be unintuitive to a user who might not expect a complex gradient from a real-valued op. This PR ensures that the gradient of the fidelity op remains float32 at all times. It does not change the types of the inner_product grad, which still remains complex64.
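For context, here is a minimal plain-TensorFlow sketch (the state construction and names are illustrative, not TFQ's actual fidelity op) of the desired behavior: even though the inner product and its gradients are complex64 internally, a gradient taken through a real-valued fidelity with respect to a float32 parameter comes out as float32.

```python
import tensorflow as tf

# A real float32 parameter, standing in for a circuit symbol value.
theta = tf.Variable(0.3, dtype=tf.float32)

with tf.GradientTape() as tape:
    # Complex64 amplitudes parameterized by the real angle.
    psi = tf.complex(tf.cos(theta), tf.sin(theta))
    phi = tf.complex(1.0, 0.0)
    # Inner product <psi|phi>: complex64, as are its gradients.
    ip = tf.math.conj(psi) * phi
    # Fidelity |<psi|phi>|^2 is real-valued, hence float32.
    fid = tf.math.real(ip * tf.math.conj(ip))

grad = tape.gradient(fid, theta)
print(fid.dtype, grad.dtype)  # both float32
```

The key point is the `tf.math.real` at the fidelity boundary: once the op's output is float32, the gradient with respect to the float32 parameter stays float32, while the intermediate inner_product gradients remain complex64.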

@MichaelBroughton MichaelBroughton requested a review from jaeyoo June 17, 2021 23:00
@MichaelBroughton MichaelBroughton merged commit 68e0ec3 into master Jun 18, 2021
@MichaelBroughton MichaelBroughton deleted the fid_f32_fix branch June 18, 2021 00:02