Conversation

@ricardocardn

🛠️ Fix: Avoid in-place modification in `conv1d_step` to preserve autograd graph
This PR addresses a critical issue in the `conv1d_step` function: the in-place updates of `conv_state` (via `.copy_()` and the slice assignment `[..., -1:, :] = x`) interfered with PyTorch's autograd system, breaking gradient computation during backpropagation. The change instead updates `conv_state` by creating a new tensor, preserving the computational graph and enabling correct gradient flow through time.
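
For illustration, a minimal sketch of the pattern described above. The function name `conv1d_step` and the use of `.copy_()` plus slice assignment come from the PR description; the tensor shapes, the signature, and the `conv1d_step_inplace` name are assumptions for illustration, not the repository's actual code.

```python
import torch


def conv1d_step_inplace(x, conv_state):
    # Problematic pattern (before the fix): in-place ops mutate a tensor
    # that the autograd graph still references, so backpropagation through
    # earlier steps fails or yields incorrect gradients.
    conv_state[..., :-1, :].copy_(conv_state[..., 1:, :].clone())
    conv_state[..., -1:, :] = x
    return conv_state


def conv1d_step(x, conv_state):
    # Fixed pattern (after the change): torch.cat allocates a fresh tensor,
    # leaving the previous conv_state and its graph nodes intact, so
    # gradients flow correctly through time.
    return torch.cat([conv_state[..., 1:, :], x], dim=-2)


# Quick check: gradients reach inputs from both steps with the
# out-of-place update.
state = torch.zeros(1, 4, 8)                   # (batch, window, channels)
x1 = torch.randn(1, 1, 8, requires_grad=True)
x2 = torch.randn(1, 1, 8, requires_grad=True)
state = conv1d_step(x1, state)
state = conv1d_step(x2, state)
state.sum().backward()
assert x1.grad is not None and x2.grad is not None
```

Running the same check through the in-place variant typically fails with an "inplace operation" `RuntimeError` during `backward()`, which is the failure mode this PR targets.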
