single smaps for batches #48

Merged: 2 commits, merged into mmuckley:main on Jan 11, 2022
Conversation

@ckolbPTB (Contributor) commented on Jan 3, 2022

Running this code

import torch
import torchkbnufft as tkbn

# 11 batches, 1 coil, 8 x 8 images
image = torch.randn(11, 1, 8, 8) + 1j * torch.randn(11, 1, 8, 8)
# one set of coil sensitivity maps per batch
smaps = torch.randn(11, 1, 8, 8) + 1j * torch.randn(11, 1, 8, 8)
# k-space trajectory: 11 batches, 2 dimensions, 12 samples
omega = torch.rand(11, 2, 12)

toep_ob = tkbn.ToepNufft()
kernel = tkbn.calc_toeplitz_kernel(omega, im_size=[8, 8])
image = toep_ob(image, kernel, smaps=smaps)

will carry out the forward/backward NUFFT with Toeplitz embedding for all 11 batches. However, the batches often correspond to different dynamics, and in that case the sensitivity maps are the same for every batch (they depend on the receiver hardware rather than on the object being imaged). To avoid having to make nbatch copies of smaps, this PR allows the following:

image = torch.randn(11, 1, 8, 8) + 1j * torch.randn(11, 1, 8, 8)
# a single set of sensitivity maps, shared across all 11 batches
smaps = torch.randn(1, 1, 8, 8) + 1j * torch.randn(1, 1, 8, 8)
omega = torch.rand(11, 2, 12)

toep_ob = tkbn.ToepNufft()
kernel = tkbn.calc_toeplitz_kernel(omega, im_size=[8, 8])
image = toep_ob(image, kernel, smaps=smaps)

Remark: the above code with smaps.shape = [1, 1, 8, 8] also runs on the current main branch, but it yields an image of shape [1, 1, 8, 8] rather than [11, 1, 8, 8].
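
For reference, a minimal sketch of the nbatch-copies workaround that this PR makes unnecessary, using the tensors from the second snippet just before its final toep_ob call (smaps_batched and output are names used here for illustration only):

# make nbatch explicit copies of the shared maps: [1, 1, 8, 8] -> [11, 1, 8, 8]
smaps_batched = smaps.repeat(11, 1, 1, 1)
# output then has the full batch dimension, i.e. shape [11, 1, 8, 8]
output = toep_ob(image, kernel, smaps=smaps_batched)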

@mmuckley (Owner) commented:

Hello @ckolbPTB, this looks like a great contribution! Could you make sure that your code passes the formatting lints (in particular, black) so that I can merge it?

@ckolbPTB (Contributor, Author) commented:

Hi, sure, no problem. I have never used this automatic format checking before. Can I see somewhere which lines of code cause the problem, or is it simply a matter of pip install black and letting it fix the problem automatically? Thanks for your help!

@mmuckley (Owner) commented:
Hello @ckolbPTB,

For torchkbnufft, you can see the exact black version that CI uses in dev-requirements.txt.

The following two lines should fix it if run from the root directory:

pip install black==21.10b0
black torchkbnufft

You should get a message saying that black reformatted the offending file.
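
As a side note, if you just want to see what would change without rewriting any files, black can also be run in check mode from the same directory:

black --check --diff torchkbnufft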

@ckolbPTB (Contributor, Author) commented:
I think it passed all the checks now. Thanks for reviewing the PR so quickly!

@mmuckley (Owner) left a review comment:

LGTM.

@mmuckley merged commit d801e62 into mmuckley:main on Jan 11, 2022.
@mmuckley (Owner) commented on Jan 11, 2022:

Thanks very much for these contributions @ckolbPTB!

mmuckley pushed a commit that referenced this pull request on Nov 22, 2022:
* single smaps for batches

* black formatting

Co-authored-by: Christoph Kolbitsch <[email protected]>