
[Relay] [TOPI] {relay,topi}.nn.sparse_transpose for **Square** CSR matrices #3707

Merged Aug 6, 2019 (11 commits)

Conversation

@yy665 (Contributor) commented Aug 5, 2019

Thanks for contributing to TVM! Please refer to the contributor guidelines at https://docs.tvm.ai/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from Reviewers.

Implementation of a fast transpose for CSR matrices with time complexity O(nnz(X) + N), where X is the input CSR matrix, nnz(X) is the number of nonzeros in X, and N is the number of rows (equivalently, columns) of X.

Please refer to https://github.com/scipy/scipy/blob/v0.14.0/scipy/sparse/sparsetools/csr.h#L380 for the algorithm implemented in Scipy.

Note that only square CSR matrices are supported: sparse shape support has not yet been introduced to Relay, so only the first dimension of the input matrix can be inferred, and the second is assumed equal to it.
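For reference, the counting-sort approach behind the linked scipy routine can be sketched in plain NumPy. This is only an illustration of the algorithm, not the TOPI implementation; the function name `csr_transpose` and its signature are made up for this sketch:

```python
import numpy as np

def csr_transpose(data, indices, indptr, n):
    """Transpose a square n x n CSR matrix in O(nnz + n) time,
    using a counting sort over column indices (as in scipy's csr_tocsc)."""
    nnz = len(data)
    t_data = np.empty(nnz, dtype=data.dtype)
    t_indices = np.empty(nnz, dtype=indices.dtype)
    t_indptr = np.zeros(n + 1, dtype=indptr.dtype)

    # Count nonzeros per column of the input (= per row of the output).
    for j in indices:
        t_indptr[j + 1] += 1
    t_indptr = np.cumsum(t_indptr)  # prefix sums give the output row pointers

    # Scatter each entry (i, j, v) of the input into output row j.
    fill = t_indptr[:-1].copy()  # next free slot in each output row
    for i in range(n):
        for k in range(indptr[i], indptr[i + 1]):
            j = indices[k]
            dst = fill[j]
            t_indices[dst] = i
            t_data[dst] = data[k]
            fill[j] += 1
    return t_data, t_indices, t_indptr
```

Because rows are scanned in order, column indices within each output row come out sorted, matching scipy's canonical CSR form.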

Future roadmap:

  1. Support non-square matrices.
  2. Add schedules for different devices.
  3. Support more sparse types.

@ajtulloch @tmoreau89 I would really appreciate it if you could review this PR.

Relay Test:

import time
from collections import namedtuple

import numpy as np
import scipy.sparse as sp

import topi
import tvm
from tvm import relay
from tvm.contrib import graph_runtime

target = 'llvm'

# Build a Relay function that transposes a CSR matrix given as three vars.
data = relay.var("data")
indices = relay.var("indices")
indptr = relay.var("indptr")
CSR = namedtuple("CSR", ["data", "indices", "indptr"])
output = relay.nn.sparse_transpose(CSR(data, indices, indptr))

# output is a Relay tuple type: (data, indices, indptr) of the transpose
output_data, output_indices, output_indptr = output

N = 1000
density = 0.5
x = sp.random(N, N, density=density, format='csr', dtype='float32')
params = {"data": x.data,
          "indices": x.indices,
          "indptr": x.indptr}

func = relay.Function(relay.analysis.free_vars(output.astuple()), output.astuple())

print(func)

with relay.build_config(opt_level=0):
    graph, lib, params = relay.build(func, target, params=params)
    lib.save("lib.o")
ctx = tvm.context(target, 0)
m = graph_runtime.create(graph, lib, ctx)
m.set_input(**params)

print("finished compiling, testing one transpose time cost")
totaltime = 0
for i in range(30):
    st = time.time()
    m.run()  # one transpose operation
    end = time.time()
    totaltime += (end - st)
    # Retrieve the output tensors as numpy arrays
    outdata = m.get_output(0).asnumpy()
    outindices = m.get_output(1).asnumpy()
    outindptr = m.get_output(2).asnumpy()

result = sp.csr_matrix((outdata, outindices, outindptr), shape=(N, N)).todense()
print("One transpose operation time %f" % (totaltime / 30))
assert np.allclose(result, x.todense().T)

@tmoreau89 (Contributor) left a comment:

Great work @Yulun-Yao, thank you for adding this new operator to topi/relay !

@tmoreau89 (Contributor) commented:
Thanks for resolving my comments. Let's wait for one more review, perhaps by @ajtulloch . Also now that there is traction on sparse operator support, we can perhaps discuss potential tensor types for sparse operators so we can correctly typecheck programs? Also solve the issue of enforcing the square matrix shape. This could be the object of an RFC.

@yy665 (Contributor, Author) commented Aug 5, 2019

> Thanks for resolving my comments. Let's wait for one more review, perhaps by @ajtulloch . Also now that there is traction on sparse operator support, we can perhaps discuss potential tensor types for sparse operators so we can correctly typecheck programs? Also solve the issue of enforcing the square matrix shape. This could be the object of an RFC.

I don't think we can enforce the matrix shape yet. If we could enforce it, that would mean we could infer the shape directly on the Relay side. It might be possible to feed an integer tuple from the Relay end down to the TOPI level, but I think that would be a really bad design. Eventually, we would like sparse operators to act like other operators.

@ajtulloch (Contributor) commented:
Looks good to me. Excited to see what you folks have planned :)

@tmoreau89 (Contributor) left a comment:

good work Yulun!

@tmoreau89 tmoreau89 merged commit 3b287c4 into apache:master Aug 6, 2019
wweic pushed a commit to wweic/tvm that referenced this pull request Aug 9, 2019
…matrices (apache#3707)

* add build gcn tutorial

* add transpose operator for square sparse matrices

* remove extra files

* change loop tag

* comply with lint

* comply with lint -- line too long

* comply with lint

* lint check

* lint check

* lint check

* apply marisa and theirry's reviews
wweic pushed a commit to neo-ai/tvm that referenced this pull request Sep 6, 2019
…matrices (apache#3707)