In the paper, you mention that each convolutional block in the DHAN is followed by an Instance Normalization (IN) and LeakyReLU layer; however, Batch Normalization is used here instead:
ghost-free-shadow-removal/networks.py, Line 25 in 93cc1d6
Am I missing something here, or is this the representation for IN? I do know that BN is used for shadow removal, but `nm` is what `build_aggasatt_joint` refers to, so it confuses me a little.
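For context, here is a minimal NumPy sketch (not the repository's code, just an illustration of my understanding) of how the two normalizations differ: BN computes per-channel statistics shared across the whole batch, while IN computes them per sample and per channel.

```python
import numpy as np

# Illustrative only: which axes each normalization reduces over for an NHWC feature map.
x = np.random.randn(4, 8, 8, 16)  # (batch, height, width, channels)

# Batch Normalization: statistics shared across the batch,
# computed per channel over (N, H, W).
bn_mean = x.mean(axis=(0, 1, 2), keepdims=True)
bn_var = x.var(axis=(0, 1, 2), keepdims=True)
x_bn = (x - bn_mean) / np.sqrt(bn_var + 1e-5)

# Instance Normalization: statistics per sample and per channel,
# computed over (H, W) only.
in_mean = x.mean(axis=(1, 2), keepdims=True)
in_var = x.var(axis=(1, 2), keepdims=True)
x_in = (x - in_mean) / np.sqrt(in_var + 1e-5)

print(bn_mean.shape, in_mean.shape)  # (1, 1, 1, 16) vs (4, 1, 1, 16)
```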