
Toy demo problem outputs black screen #182

Closed · wkeithvan opened this issue May 29, 2018 · 4 comments

@wkeithvan (Contributor)
The toy demo problem is currently outputting a black image.

[toy_problem: screenshot of an all-black prediction image]

I did a fresh git clone of the repository and ran the Jupyter notebook from demos (I had to copy it into the root folder to get the from tf_unet import ... lines working). Having made no other changes to any of the code, the image above is what I currently get at the last cell of the notebook. Rerunning the last cell a few more times generates new images, but the prediction is still blank.

@jakeret (Owner) commented May 31, 2018

I also just ran a fresh copy and I'm getting the expected results. Are the intermediate results in the prediction folder also all black? Have you checked TensorBoard?

BTW: You don't need to copy the notebooks around if you install the package with:
python setup.py develop

@wkeithvan (Contributor, Author)

I just recloned it and ran it, and there are no problems now. I was having the same trouble on two separate computers that day, so it was possibly a dependency issue that I've since fixed on both machines.

One other question: in tf_unet/layers.py, what is the purpose of def weight_variable_devonc? As far as I can tell, it is identical to def weight_variable...

@wkeithvan (Contributor, Author)

Also, I noticed that there is one fewer up-convolution layer than down-convolution layers (3 down, 2 up in the default configuration). According to the original U-Net paper, shouldn't the number of up-convolutions match the number of down-convolutions (3 each for the default configuration)?

@jakeret (Owner) commented Jun 4, 2018

Ok, good to know; glad it's resolved.
This is mainly for historical reasons, but in principle it gives you the option to use different initializations for the down and up convolutions.
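To make that concrete, here is a minimal sketch in plain NumPy (not the actual TensorFlow code in tf_unet/layers.py) of why two currently-identical factory functions can still be worth keeping separate: the up-convolution initializer can later be swapped out without touching the down path.

```python
import numpy as np

def weight_variable(shape, stddev=0.1, seed=0):
    # Initializer for the down-convolution weights (illustrative only).
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, stddev, size=shape)

def weight_variable_devonc(shape, stddev=0.1, seed=0):
    # Currently identical to weight_variable, but kept as a separate
    # function so the up-convolutions could later switch to a different
    # scheme (e.g. a bilinear-upsampling initialization) independently.
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, stddev, size=shape)

w_down = weight_variable((3, 3, 16, 32))
w_up = weight_variable_devonc((2, 2, 32, 16))
print(w_down.shape, w_up.shape)  # (3, 3, 16, 32) (2, 2, 32, 16)
```

Today both functions draw from the same distribution, so they produce the same tensors for the same shape and seed; the indirection only pays off if one of them changes.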

There are more (down) convolution layers than up-convolutions, and this is fine. However, there is the same number of max-pooling (down-sampling) operations as up-convolutions (up-sampling operations).
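A small sketch of that accounting, in pure Python with a hypothetical helper name, assuming 'same'-padded convolutions so that only pooling and up-sampling change the spatial size:

```python
def unet_spatial_sizes(in_size, layers=3, pool=2):
    # With `layers` resolution levels there are (layers - 1) max-pooling
    # steps on the way down and, symmetrically, (layers - 1)
    # up-convolutions on the way back up -- e.g. 3 conv blocks down but
    # only 2 up-convolutions in the default configuration.
    sizes = [in_size]
    s = in_size
    for _ in range(layers - 1):  # down path: one max pool between blocks
        s //= pool
        sizes.append(s)
    for _ in range(layers - 1):  # up path: one up-convolution per pool
        s *= pool
        sizes.append(s)
    return sizes

print(unet_spatial_sizes(64))  # [64, 32, 16, 32, 64]
```

The input resolution is recovered exactly because each up-sampling step undoes one pooling step; the extra convolution block at the bottom of the "U" adds layers but no resolution change.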
