
[Relay][Tensorflow] Allow an op as loop var. #3056

Merged · 1 commit into apache:master on May 1, 2019

Conversation

lixiaoquan
Contributor

@lixiaoquan lixiaoquan commented Apr 20, 2019

Allow binding a CallNode to a var to support op as loop var.

@zhiics @srkreddy1238 @jroesch Could you please review?
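
For context, a minimal sketch (not taken from this PR's test case; names are illustrative) of the TensorFlow 1.x pattern this enables: a tf.while_loop whose initial loop variable is the output of an op rather than a plain placeholder or constant, so the converted Relay loop variable is a CallNode instead of a Var.

    # Hypothetical reproduction sketch, assuming the TF 1.x API.
    import tensorflow as tf

    x = tf.placeholder(tf.int32, shape=(), name="x")
    init = tf.add(x, 1)  # the initial loop variable is an op output

    cond = lambda i: tf.less(i, 10)
    body = lambda i: tf.add(i, 1)
    result = tf.while_loop(cond, body, [init])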

@srkreddy1238
Contributor

Looks good to me. @jroesch, can you also confirm the IR part?

@zhiics
Member

zhiics commented Apr 23, 2019

Thanks for the contribution. But I am not sure we really want to do this, because it looks to me like ANF conversion should be able to avoid the mentioned problem. @jroesch please confirm.

@jroesch
Member

jroesch commented Apr 24, 2019

@zhiics I think this is fine, but we shouldn't need to make any changes to bind; the original code was just generating a single loop variable for each sub-graph used as a loop variable. We can also convert the program to ANF later.

    @@ -383,11 +383,21 @@ class ExprBinder : public ExprMutator {
        }
      }

      Expr VisitExpr_(const CallNode* op) final {
Member

We shouldn't need these changes; we don't really want to do arbitrary expression-to-expression replacement.

    else:
        var_type = var.type_annotation

    v = tvm.relay.var("loop_var" + str(i), type_annotation=var_type)
    loop_vars.append(v)
    bind_map[var] = v
Member

@jroesch jroesch Apr 24, 2019

I think instead of calling bind, which has specific semantics, we should just add code like this:

    # Assumes tvm.relay's Python ExprMutator; super().__init__() is needed to
    # initialize the mutator's memo map.
    from tvm.relay.expr_functor import ExprMutator

    class RewriteSubgraph(ExprMutator):
        def __init__(self, rewrite_map):
            super().__init__()
            self.rewrite_map = rewrite_map

        def visit(self, expr):
            # If this exact sub-expression has a replacement, use it;
            # otherwise recurse as usual.
            if expr in self.rewrite_map:
                return self.rewrite_map[expr]
            else:
                return super().visit(expr)

    def rewrite_subgraph(expr, rewrites):
        return RewriteSubgraph(rewrites).visit(expr)

Then replace the call to bind with a call to this; it will handle the general case, not just calls.
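
A hedged usage sketch of the helper above (illustrative names, assuming tvm.relay's Python API; sub-expressions are matched by identity here):

    # Replace an op sub-expression with a fresh loop variable using the
    # hypothetical rewrite_subgraph helper defined above.
    from tvm import relay

    x = relay.var("x", shape=(), dtype="int32")
    call = relay.add(x, relay.const(1, "int32"))  # sub-graph used as a loop var
    loop_var = relay.var("loop_var0", shape=(), dtype="int32")

    body = relay.multiply(call, relay.const(2, "int32"))
    new_body = rewrite_subgraph(body, {call: loop_var})  # body now references loop_var0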

Contributor Author

Thanks. Do you think it should be a common function, or just kept in the TF frontend?

Member

I think it is reasonable to just put it in the TF pass for now. We could also put it in the frontend/common file where utilities are kept.

Contributor Author

I've modified it as requested.

@jroesch
Member

jroesch commented Apr 27, 2019

Let me know when it's ready for review again.

@jroesch jroesch added the status: need update label on Apr 27, 2019
@lixiaoquan
Contributor Author

@jroesch Could you please review the latest change?

@jroesch
Member

jroesch commented May 1, 2019

Great, looks good, thanks!

@jroesch jroesch merged commit e6ca91e into apache:master May 1, 2019
wweic pushed a commit to wweic/tvm that referenced this pull request May 13, 2019
wweic pushed a commit to neo-ai/tvm that referenced this pull request May 13, 2019