
Why reshaping the top blob in RPN ? #20

Open
LucasMahieu opened this issue Mar 8, 2018 · 7 comments
@LucasMahieu

I don't understand why this line is in this file:
top[0].reshape(1, 5)

From what I understand, and from the comment just before this line:

     # rois blob: holds R regions of interest, each is a 5-tuple
     # (n, x1, y1, x2, y2) specifying an image batch index n and a
     # rectangle (x1, y1, x2, y2)

Are you sure this reshape is correct?

@Microos

Microos commented Apr 4, 2018

The first dim is a dummy index, which is set to 0 and expected by the loss layer.

@LucasMahieu
Author

I agree with that; the problem is that the reshape should be:
top[0].reshape(N, 5)

with N = cfg.POST_NMS_KEEP.

Isn't that right?

@Microos

Microos commented Apr 4, 2018

Where is this top[0].reshape(1, 5)? Is it in setup() or forward()?

@LucasMahieu
Author

In the setup() part.

@LucasMahieu
Author

And then a reshape is done in the forward() function...

But to me, it would be more logical to give top[0] the right shape in the setup() step.

@Microos

Microos commented Apr 4, 2018

OK, I just saw the line you referred to.
This top, like the other tops defined in setup(), is used by Caffe to perform checks at the network initialization stage. Caffe checks whether the dimensions of all the blobs match: imagine that Caffe creates dummy data according to your top's shape and lets it flow through the whole network, verifying that each layer's output has a valid dimension in all subsequent layers. Therefore, you don't need to set the top to (N, 5), because it's the shape that matters, not the data inside. And of course, you cannot set the top to just any other shape, or Caffe will fail while performing the initialization. Hope this helps :D

@Microos

Microos commented Apr 4, 2018

Then in the forward() function, which runs once Caffe has your real data flowing around, you can emit a top with a dynamic data shape, say (2000, 5), as this proposal layer will normally do.
