New sample op #42

Open · wants to merge 11 commits into base branch dgl_graph
Conversation

@aksnzhy commented Nov 19, 2018

No description provided.

heap[i] += w;
i = i >> 1;
}
}
Owner: This method is probably not necessary.
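For context, the quoted lines look like the tail of a weight update that pushes a change from a leaf up to the root, so each internal node keeps the weight sum of its subtree. A minimal self-contained sketch of that loop; the 1-indexed layout and the `limit` leaf offset are assumptions inferred from the heap[1] usage later in the diff:

#include <vector>

// Sketch: add weight w at leaf `index`; every ancestor's subtree
// sum is updated on the way up, ending at the root heap[1].
void Add(std::vector<float>& heap, size_t limit, size_t index, float w) {
  for (size_t i = index + limit; i >= 1; i = i >> 1) {
    heap[i] += w;
  }
}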


// Sample a vector by given the size n
void SampleWithoutReplacement(size_t n,
std::vector<size_t>& samples) {
Owner: indent.
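For reference, the usual shape of this routine with such a heap is draw-then-delete, repeated n times. A minimal sketch, assuming the class exposes a Sample() returning one weighted index and a Delete(index) that zeroes that index's weight (Delete's exact name is an assumption, not taken from the diff):

// Sketch: draw n distinct indices; deleting each drawn index's
// weight guarantees it cannot be drawn again.
void SampleWithoutReplacement(size_t n, std::vector<size_t>& samples) {
  samples.resize(n);
  for (size_t i = 0; i < n; ++i) {
    samples[i] = Sample();   // weighted draw over remaining items
    Delete(samples[i]);      // remove it from future draws
  }
}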

}
size_t i = 0;
for (auto it = sampled_idxs.begin(); it != sampled_idxs.end(); it++, i++)
sorted_idxs[i] = *it;
Owner: this isn't sampling with shuffle.
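If a randomly ordered sample is what the surrounding function promises, a std::shuffle over the collected indices would provide it. A minimal sketch using the diff's own variable names; the mt19937 seeding is illustrative, not from the PR:

#include <algorithm>
#include <random>
#include <vector>

// Sketch: copy the drawn indices out, then shuffle so the output
// order is random instead of the container's iteration order.
std::vector<size_t> sorted_idxs(sampled_idxs.begin(), sampled_idxs.end());
std::mt19937 rng(std::random_device{}());
std::shuffle(sorted_idxs.begin(), sorted_idxs.end(), rng);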

@@ -163,96 +493,266 @@ static void GetSample(std::vector<dgl_id_t>& ver_list,
}
mp[rand_num] = true;
out.push_back(ver_list[rand_num]);
out_edge.push_back(edge_list[rand_num]);
sample_count++;
if (sample_count == max_num_neighbor) {
break;
}
}
}
Owner: it doesn't seem we need this function.


// sample from arrayHeap
size_t Sample() {
float xi = heap[1] * (rand()%100/(float)101);
Owner: so heap[1] is the root of the tree?
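To answer that concretely: yes, in this layout heap[1] holds the total weight and the leaves sit at offset `limit` (name assumed). A sketch of the full descent; note that the diff's rand() % 100 / (float)101 only yields 100 distinct values, so a finer-grained draw over [0, total) is preferable:

// Sketch: walk from the root to a leaf, steering by the left
// child's subtree sum at each level.
size_t Sample() {
  // xi is uniform in [0, heap[1]), i.e. over the total weight.
  double xi = heap[1] * (static_cast<double>(rand()) / (RAND_MAX + 1.0));
  size_t i = 1;                // start at the root
  while (i < limit) {          // internal node: descend one level
    i = i * 2;                 // left child
    if (xi >= heap[i]) {       // the draw falls in the right subtree
      xi -= heap[i];
      i += 1;                  // right sibling
    }
  }
  return i - limit;            // leaf slot back to item index
}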

sp_prob[i] = probability[ver_list[i]];
}
ArrayHeap arrayHeap(sp_prob);
arrayHeap.SampleWithoutReplacement(max_num_neighbor, sp_index);
Owner: you need to sort the index.
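Concretely, that is one extra line after the draw; std::sort restores the selected neighbors' original adjacency order:

#include <algorithm>

arrayHeap.SampleWithoutReplacement(max_num_neighbor, sp_index);
std::sort(sp_index.begin(), sp_index.end());  // keep indices in ascending order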

tmp_edge_list,
num_neighbor,
tmp_sampled_src_list,
tmp_sampled_edge_list);
Owner: you didn't use GetUniformSampleShuffle.

num_edges += tmp_sampled_src_list.size();

// TODO The code doesn't limit the maximal number of vertices correctly.
sub_vertices_count++;
Owner: sub_ver_mp contains the sampled vertices. Do we need to count the number of sampled vertices?

It's a bit tricky when limiting the number of sampled vertices. Actually, we need to make sure sub_ver_mp.size() <= max_num_vertices.
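A minimal sketch of that invariant, assuming sub_ver_mp maps a vertex id to a visited flag as elsewhere in the diff; dst_id and the surrounding loop are illustrative, not from the PR:

// Sketch: admit a new vertex only while the budget allows it, and
// stop once sub_ver_mp.size() reaches max_num_vertices.
if (sub_ver_mp.find(dst_id) == sub_ver_mp.end()) {
  if (sub_ver_mp.size() >= max_num_vertices) {
    break;                      // vertex budget exhausted
  }
  sub_ver_mp[dst_id] = true;    // record the newly sampled vertex
}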

// set seed for random sampling
srand(time(nullptr));

//#pragma omp parallel for
Owner: enable OpenMP.
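One caveat before uncommenting the pragma: srand()/rand() share global state and are not thread-safe under OpenMP. A per-thread engine is the usual fix; a minimal sketch where seed, num_seeds, and the loop body are placeholders, not from this PR:

#include <omp.h>
#include <random>

#pragma omp parallel
{
  // One independent generator per thread; offsetting the seed by
  // the thread id keeps the streams distinct (a simple scheme,
  // not the only one).
  std::mt19937 rng(seed + omp_get_thread_num());
  #pragma omp for
  for (int i = 0; i < num_seeds; ++i) {
    // ... sample neighbors for seed vertex i using rng ...
  }
}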

srand(time(nullptr));
const float* probability = inputs[1].data().dptr<float>();

//#pragma omp parallel for
Owner: enable OpenMP here as well; the same thread-safety caveat about rand() applies.

const int dev_mask,
DispatchMode* dispatch_mode,
std::vector<int> *in_attrs,
std::vector<int> *out_attrs) {
Owner: can we merge these functions for uniform and non-uniform?
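On the merge question: since the signature shown is MXNet's storage-type inference hook, one option is a single function templated on the input count. Everything below is a hypothetical sketch, not code from this PR; it assumes the uniform op takes only the graph while the non-uniform op also takes the probability array:

// Hypothetical merged storage-type inference for both samplers;
// only the expected number of inputs differs between the two ops.
template<int kNumInputs>
static bool CSRNeighborSampleStorageType(const nnvm::NodeAttrs& attrs,
                                         const int dev_mask,
                                         DispatchMode* dispatch_mode,
                                         std::vector<int>* in_attrs,
                                         std::vector<int>* out_attrs) {
  CHECK_EQ(in_attrs->size(), static_cast<size_t>(kNumInputs));
  // ... shared body: assign input/output storage types and the
  // dispatch mode exactly as the two existing functions do ...
  return true;
}

The two ops would then register CSRNeighborSampleStorageType<1> and CSRNeighborSampleStorageType<2> respectively.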
