Hybrid Multifrontal #1323
Conversation
auto discrete_clique = (*updatedBayesTree)[discrete_ordering.at(0)];
std::set<HybridBayesTreeClique::shared_ptr> clique_set;
I particularly want a review on this part. Am I creating the Bayes Tree with the discrete root correctly? Unit tests seem to say yes, but if there is a better way, I am all ears.
Without documentation (e.g., with math) I don't even know what is happening. In general this is true for this entire piece of code. Now that you have the math, might as well add it?
OK, I did a review :-)
Lots of small comments, some deeper comments, but my biggest beef: why do we need to overload something to do the correct thing? What is the underlying function doing wrong? I think I need more explanation of that before believing this is the correct solution. If the code that you are overriding is incorrect, should that not be fixed to properly support hybrid?
Also, highly suspicious of mutating operations in that code.
DiscreteKeys discrete_keys = {{M(0), 2}, {M(1), 2}, {M(2), 2}};
vector<double> probs = {0.012519475, 0.041280228, 0.075018647, 0.081663656,
                        0.037152205, 0.12248971,  0.07349729,  0.08};
AlgebraicDecisionTree<Key> potentials(discrete_keys, probs);
this is added but not used?
std::vector<double> measurements = {0, 1, 2, 2};

// This is the correct sequence
// std::vector<size_t> discrete_seq = {1, 1, 0};
delete commented out code
discreteGraph->add(DecisionTreeFactor(discrete_keys, probPrimeTree));

// Ordering discrete(graph.discreteKeys());
delete the commented-out code
double between_sigma = 1.0, measurement_sigma = 0.1;

std::vector<double> expected_errors, expected_prob_primes;
comment
for (size_t i = 0; i < pow(2, K - 1); i++) {
  std::vector<size_t> discrete_seq = getDiscreteSequence<K>(i);

  GaussianFactorGraph::shared_ptr linear_graph = specificProblem(
specificProblem: what does that do? Badly named, add a comment
Renamed to specificModesFactorGraph. This is a separate function with its own docstring (which is hopefully okay).
auto discrete_clique = (*updatedBayesTree)[discrete_ordering.at(0)];

std::set<HybridBayesTreeClique::shared_ptr> clique_set;
  clique_set.insert(node.second);
}

// Set the root of the bayes tree as the discrete clique
more mutating? Very suspicious.
I wasn't sure how else to update the BayesTree, so I set the discrete clique to be the root. That is a requirement for the structure of the HybridBayesTree.
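For readers following the thread, here is a rough sketch of the re-rooting step being discussed, pieced together from the snippets quoted above; the clear()/insertRoot() calls are my assumption about how the BayesTree API could be used here, not the exact PR diff:

// Sketch only: re-root the updated Bayes tree at the discrete clique.
// discrete_ordering, updatedBayesTree and clique_set are from the
// snippets above; clear()/insertRoot() usage is assumed.
auto discrete_clique = (*updatedBayesTree)[discrete_ordering.at(0)];

// Collect every clique currently in the tree.
std::set<HybridBayesTreeClique::shared_ptr> clique_set;
for (auto&& node : updatedBayesTree->nodes()) {
  clique_set.insert(node.second);
}

// Rebuild the tree with the discrete clique as the root, which the
// HybridBayesTree structure requires; the remaining cliques in
// clique_set would then be re-attached below it (not shown).
updatedBayesTree->clear();
updatedBayesTree->insertRoot(discrete_clique);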
@@ -220,34 +220,89 @@ class GTSAM_EXPORT HybridGaussianFactorGraph
* @brief Compute the VectorValues solution for the continuous variables for
What does that mean? Add math
Updated
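For reference, my reading of the math the docstring is getting at (not copied from the PR): fixing a discrete assignment $m$ turns the hybrid graph into a Gaussian problem, and the method returns its minimizer for every $m$,

$$x^*(m) = \arg\min_x E(x; m), \qquad E(x; m) = \tfrac{1}{2}\sum_i \|A_i(m)\,x - b_i(m)\|^2,$$

with the unnormalized probability used later being $P'(m) \propto \exp(-E(x^*(m); m))$.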
* @param discrete_keys The discrete keys which form all the modes.
* @param continuousBayesNet The Bayes Net representing the continuous
* eliminated variables.
* @return AlgebraicDecisionTree<Key>
*/
template <typename BAYES>
AlgebraicDecisionTree<Key> continuousProbPrimes(
This is not a great name. It refers to math that is not in the documentation. Suggest unnormalizedProbabilities, documenting the fact that it is exp(-E). On a side note: I question the wisdom of exponentiating small numbers before they will somehow be normalized. Typically we avoid doing this until we know the minimum error and then subtract the minimum error so the corresponding maximum prob' == 1.0.
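A small self-contained illustration of that trick (plain C++, not GTSAM code): subtract the minimum error before exponentiating, so the best assignment gets probability exactly 1.0 and nothing underflows.

#include <algorithm>
#include <cmath>
#include <vector>

// errors[i] is the error E_i of the i-th discrete assignment; the result is
// exp(-(E_i - E_min)), i.e. unnormalized probabilities with max value 1.0.
std::vector<double> unnormalizedProbabilities(const std::vector<double>& errors) {
  const double min_error = *std::min_element(errors.begin(), errors.end());
  std::vector<double> probs;
  probs.reserve(errors.size());
  for (const double e : errors) {
    probs.push_back(std::exp(-(e - min_error)));  // shifted to avoid underflow
  }
  return probs;
}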
These are unnormalized probabilities only on the continuous variables, and probPrime is the method name for computing the unnormalized probabilities, so it seemed to make sense. If you still think I should rename it to unnormalizedProbabilities, I will gladly do so.
Regarding the exponentiation, the minimum error term will always be 0 due to pruning.
Update: this function will get deleted.
Assuming this is WIP?
Ordering hybridOrdering = s.linearizedFactorGraph.getHybridOrdering();
HybridBayesTree::shared_ptr hybridBayesTree =
    s.linearizedFactorGraph.eliminateMultifrontal(hybridOrdering);
HybridValues delta = hybridBayesTree->optimize();
Why would I only get continuous values here?
You should be getting both. Probably should test for delta.discrete() as well.
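Something along these lines, assuming HybridValues exposes continuous() and discrete() accessors and that expectedContinuous / expectedDiscrete are computed elsewhere in the test (placeholder names):

// Check both parts of the solution, not just the continuous one.
HybridValues delta = hybridBayesTree->optimize();
EXPECT(assert_equal(expectedContinuous, delta.continuous(), 1e-6));
EXPECT(expectedDiscrete == delta.discrete());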
HybridBayesNet::shared_ptr bayesNet =
    graph.eliminateSequential(hybridOrdering);

EXPECT_LONGS_EQUAL(5, bayesNet->size());
Not a very strong test. If you leave it like this, add a comment pointing to where the individual elimination steps are unit tested.
True. I added this because I was checking the number of bayes net nodes to verify if elimination is generating the correct conditional for continuous-discrete. This is redundant now though.
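One possible stronger check (a sketch only; it assumes graph is the same HybridGaussianFactorGraph and that both optimize() calls return HybridValues): cross-check the sequential result against the multifrontal one instead of only counting nodes.

// Hypothetical cross-check between sequential and multifrontal elimination.
HybridValues seqResult = bayesNet->optimize();
HybridBayesTree::shared_ptr bayesTree =
    graph.eliminateMultifrontal(hybridOrdering);
HybridValues mfResult = bayesTree->optimize();
EXPECT(assert_equal(seqResult.continuous(), mfResult.continuous(), 1e-6));
EXPECT(seqResult.discrete() == mfResult.discrete());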
I'm merging this so we get a holistic view in #1319.
Perform correct multifrontal elimination for hybrid factor graphs.
This follows the same general scheme as #1319 but updates the HybridBayesTree to take care of the needed operations. I overloaded the eliminateMultifrontal methods so that the elimination would be done correctly and without the need to change any major code in the unit tests.
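Usage should then look the same as in the test snippet quoted earlier in the conversation (a sketch; the continuous()/discrete() accessors on HybridValues are assumed):

// Eliminate a hybrid factor graph multifrontally and read back both parts.
Ordering hybridOrdering = s.linearizedFactorGraph.getHybridOrdering();
HybridBayesTree::shared_ptr hybridBayesTree =
    s.linearizedFactorGraph.eliminateMultifrontal(hybridOrdering);
HybridValues delta = hybridBayesTree->optimize();
VectorValues continuous = delta.continuous();  // continuous solution
DiscreteValues modes = delta.discrete();       // discrete mode assignment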