
Converting between representations of high-dimensional polytopes #34

Closed
tomerarnon opened this issue Jan 5, 2019 · 5 comments

@tomerarnon

In my use case (related to verification of neural networks), I am working with high-dimensional polytopes/polyhedra, and have found that these polytopes fail to convert from h-rep to v-rep.

The particular case I run into is 750+ dimensional, corresponding to an image, but the problem shows up in much smaller cases too, starting in the mid-20s.

I am using LazySets to represent the polytope, since that is what I use in my application. I generate a random example by sampling constraints until I hit a valid set. Obviously this is crude, but it works:

using LazySets, Polyhedra, CDDLib

# Sample random constraints Ax <= b until the resulting set is a
# nonempty, bounded polyhedron, then wrap it as an HPolytope.
function valid_hrep(nconst, dim)
    H = HPolyhedron(rand(nconst, dim) .- 0.5, rand(nconst) .- 0.5)
    while isempty(H) || !isbounded(H)
        H = HPolyhedron(rand(nconst, dim) .- 0.5, rand(nconst) .- 0.5)
    end
    return HPolytope(H)
end

20-D tends to work every time:

julia> H = valid_hrep(30, 20);

julia> @time tovrep(H, backend = CDDLib.Library());
  0.994134 seconds (33.76 k allocations: 3.300 MiB)

25-D is hit or miss. In this case, I interrupted after a few minutes; in other cases I have let it run much longer to confirm that it indeed never finishes.

julia> H = valid_hrep(40, 25);

julia> @time tovrep(H, backend = CDDLib.Library()); #...
^C^C^C^C^CWARNING: Force throwing a SIGINT
ERROR: InterruptException:
Stacktrace:
 [1] macro expansion at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/CDDLib.jl:24 [inlined]
 [2] dd_matrix2poly at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedra.jl:52 [inlined]
 [3] CDDPolyhedra{Float64,Float64}(::CDDInequalityMatrix{Float64,Float64}) at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedra.jl:68
 [4] CDDPolyhedra(::CDDInequalityMatrix{Float64,Float64}) at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedra.jl:83
 [5] getpoly(::CDDLib.Polyhedron{Float64}, ::Bool) at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedron.jl:60
 [6] getpoly at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedron.jl:56 [inlined]
 [7] getext(::CDDLib.Polyhedron{Float64}) at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedron.jl:51
 [8] vrep at /Users/tomer/.julia/packages/CDDLib/bKo5p/src/polyhedron.jl:156 [inlined]
 [9] pointtype(::CDDLib.Polyhedron{Float64}) at /Users/tomer/.julia/packages/Polyhedra/T1zRo/src/iterators.jl:175
...

Note that tovrep is a LazySets function that calls Polyhedra.polyhedron, which in turn dispatches to the specified backend library.

I was wondering whether this is a bug or an inherent scaling limitation of the algorithms, but having just tested with LRSLib, it appears that LRS succeeds in this case.

julia> @time tovrep(H, backend = LRSLib.Library());
 16.479215 seconds (17.26 M allocations: 3.084 GiB, 8.54% gc time)
@blegat
Member

blegat commented Jan 5, 2019

LRS does not normally work in floating-point arithmetic, so it should have given you an error; the result returned in 16 s is most probably incorrect.
You can find a comparison here. You can see that they do not try dimensions higher than 50.
Representation conversion is limited to low dimensions, so it would take ages on polyhedra of dimension 750.
If you think about it, a hypercube in dimension 750 has 1500 facets and 2^750 extreme points, so if you give the facets and ask for the extreme points, even the output size is far too large.
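To make that blow-up concrete, here is a small sketch using the standard combinatorial counts for the hypercube (these are textbook formulas, not output from Polyhedra.jl or any backend):

```julia
# Facet and vertex counts of the d-dimensional hypercube [0, 1]^d.
# Shown only to make the output-size blow-up concrete.
nfacets(d) = 2d          # two facets per coordinate: x_i = 0 and x_i = 1
nvertices(d) = big(2)^d  # each coordinate is independently 0 or 1

for d in (3, 25, 750)
    println("d = $d: ", nfacets(d), " facets, ", nvertices(d), " vertices")
end
```

Even at d = 25 there are already over 33 million vertices to enumerate.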

@tomerarnon
Author

LRS does not normally work in floating-point arithmetic, so it should have given you an error; the result returned in 16 s is most probably incorrect.

I didn't know that, thanks. This was my first time using it, as a point of comparison for this issue, so I was unfamiliar with its limitations.
I considered opening an issue on LRSLib about your point that it should have errored, but it looks like you're the maintainer of that package as well! No need to spam you 😄

If you think about it, a hypercube in dimension 750 has 1500 facets and 2^750 extreme points, so if you give the facets and ask for the extreme points, even the output size is far too large.

Fair enough; I expected the method not to scale. So do you think this is an inherent limitation of the algorithm(s) that starts to show around 20 dimensions?

@blegat
Member

blegat commented Jan 5, 2019

You can open an issue in LRSLib as a reminder that we should fix that :)

So do you think this is an inherent limitation of the algorithm(s) that starts to show around 20 dimensions?

It is more a limitation of the problem than of the algorithm. Note that an algorithm's time complexity is at least linear in the size of the input plus the output (it has to read the input and write the output).
Since the output here can be exponential in the size of the input, the complexity needs to be defined with care; see https://www.cs.mcgill.ca/~fukuda/soft/polyfaq/node18.html.
As you can see from e.g. https://www.cs.mcgill.ca/~fukuda/soft/polyfaq/node12.html and https://www.cs.mcgill.ca/~fukuda/soft/polyfaq/node19.html, the size of the input/output grows rapidly with the dimension.
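The growth is just as bad in the opposite direction. A hedged sketch using the d-dimensional cross-polytope (the unit ball of the 1-norm), which is the dual of the hypercube, so its counts swap: only 2d vertices, but 2^d facets, making v-rep → h-rep conversion equally hopeless. The counts below are the standard combinatorial formulas, not library output:

```julia
# Vertex and facet counts of the d-dimensional cross-polytope
# conv(±e_1, ..., ±e_d) — the dual of the hypercube, so the counts swap.
xp_vertices(d) = 2d
xp_facets(d) = big(2)^d  # one facet ±x_1 ± x_2 ... ± x_d ≤ 1 per sign pattern

for d in (3, 25, 50)
    println("d = $d: ", xp_vertices(d), " vertices, ", xp_facets(d), " facets")
end
```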

@odow
Contributor

odow commented Feb 21, 2024

This can probably be closed. There doesn't seem to be anything actionable left to do.

@odow
Contributor

odow commented Feb 22, 2024

Closing because this seems resolved. Please comment if that's not the case and you'd like to re-open.

@odow odow closed this as completed Feb 22, 2024