
Translate ziggurat algorithms for normal and exponential variates to Julia. #6501

Merged · 1 commit merged into master from anj/randn on Apr 18, 2014

Conversation

andreasnoack
Member

This pull request removes the C file that computes normal variates in favor of a translation to Julia. On julia.mit.edu, the Julia implementation was 25 percent faster, which I can't really explain. On my MacBook, they are approximately equally fast.

The tables necessary for calculating the normal variates are computed in BigFloat and only converted to Uint64 and Float64 at the end, making them identical on 32- and 64-bit systems without storing hardcoded tables, as we have done for about a week.
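For concreteness, the construction described above might look roughly like the sketch below. This is not the PR's code: the layer count N, the literals for r and v, and all names are illustrative stand-ins taken from the standard 256-layer normal ziggurat.

f(x)    = exp(-x^2/2)          # unnormalized standard normal density
finv(y) = sqrt(-2*log(y))      # inverse of f on (0, 1]

N = 256                                # number of ziggurat layers (illustrative)
r = BigFloat("3.6541528853610088")     # tail start; the real code finds it by root finding
v = BigFloat("0.00492867323399")       # common area of each region (base strip plus tail)

x = zeros(BigFloat, N)
x[1] = r                               # widest layer boundary
for i in 1:N-2
    # each layer has area v: x[i]*(f(x[i+1]) - f(x[i])) == v
    x[i+1] = finv(f(x[i]) + v/x[i])
end
# x[N] stays 0; with the right r the topmost cap also has area v

# convert only at the very end, so the tables come out bit-identical on 32- and 64-bit systems
wi = map(Float64, x)                   # boundary abscissae
fi = map(Float64, map(f, x))           # density values at the boundaries
# (the Uint64 acceptance table would be built similarly from ratios of neighbouring x values)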

The server is busy with other computations, so I haven't run BigCrush on the normal and exponential variates yet, but I have tried the test that rand fails, see #6464. Surprisingly, the normal variates don't fail the test.

@JeffBezanson
Member

I still think I'd rather hard-code the tables; if nothing else, this will make it easier to build more of Julia without BigFloat support.

@andreasnoack
Member Author

Okay. Is there a point in keeping the generating function so that we can see how the tables are created?

I can see that Travis doesn't like this. I have removed the random directory because it only included dSFMT. Can someone with Travis knowledge help me here?

@StefanKarpinski
Member

How about checking the tables against the generating code as a BigFloat test? That way we keep the code around, make sure it keeps working, and don't make the build process depend on BigFloat.
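The kind of test being suggested might look something like this sketch; make_ziggurat_tables and the table names ki, wi, fi are hypothetical stand-ins for whatever the generator and the hard-coded constants end up being called.

using Base.Test   # the test macros available in Base

# regenerate the tables at BigFloat precision and compare with the hard-coded values,
# so the generator stays exercised without the normal build depending on BigFloat
gen_ki, gen_wi, gen_fi = make_ziggurat_tables()
@test gen_ki == ki
@test gen_wi == wi
@test gen_fi == fi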


@staticfloat
Member

Travis is sad because make is trying to resolve the dsfmt-$(DSFMT_VER)/dSFMT.c dependency before it resolves the dsfmt-$(DSFMT_VER)/config.status dependency. This diff should fix your problems:

diff --git a/deps/Makefile b/deps/Makefile
index f9b5278..6f17fd5 100644
--- a/deps/Makefile
+++ b/deps/Makefile
@@ -645,7 +645,7 @@ dsfmt-$(DSFMT_VER)/config.status: dsfmt-$(DSFMT_VER).tar.gz
        $(TAR) -C dsfmt-$(DSFMT_VER) --strip-components 1 -xf dsfmt-$(DSFMT_VER).tar.gz && \
        cd dsfmt-$(DSFMT_VER) && patch < ../dSFMT.h.patch && patch < ../dSFMT.c.patch
        echo 1 > $@
-$(LIBRANDOM_OBJ_SOURCE): dsfmt-$(DSFMT_VER)/dSFMT.c dsfmt-$(DSFMT_VER)/config.status
+$(LIBRANDOM_OBJ_SOURCE): dsfmt-$(DSFMT_VER)/config.status dsfmt-$(DSFMT_VER)/dSFMT.c
        $(CC) $(CPPFLAGS) $(LIBRANDOM_CFLAGS) $(LDFLAGS) dsfmt-$(DSFMT_VER)/dSFMT.c -o librandom.$(SHLIB_EXT) && \
        $(INSTALL_NAME_CMD)librandom.$(SHLIB_EXT) librandom.$(SHLIB_EXT)
 $(LIBRANDOM_OBJ_TARGET): $(LIBRANDOM_OBJ_SOURCE)

@ViralBShah
Member

Certainly good to have the generation code in there as well.

@andreasnoack
Member Author

I have a problem with defining pointers to the arrays storing the ziggurat tables. First of all, I doubt whether this ought to be faster than indexing the array with @inbounds, but in practice it made the exponential code significantly faster and the normal code slightly/insignificantly faster. The problem is that after moving the code from an external file into Base, something goes wrong with the pointers. When printed, they don't show a hexadecimal value, and Julia crashes when I try to load a value from them.
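For concreteness, the two access patterns being compared look roughly like this; wi is just a stand-in for one of the tables.

const wi = rand(256)            # stand-in for a ziggurat table
const wi_ptr = pointer(wi)      # Ptr{Float64}; this is what comes back as NULL from Base

function lookup_inbounds(i)
    @inbounds v = wi[i]         # bounds check elided, array object still rooted and checked
    return v
end

lookup_ptr(i) = unsafe_load(wi_ptr, i)   # raw load, no checks; crashes if wi_ptr is NULL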

When using normal array indexing, the lights are green (thank you @staticfloat) and both the normal and exponential variates pass the BigCrush tests. I have included the table creation as a test of the hardcoded tables.

@StefanKarpinski
Member

@inbounds doesn't avoid the null pointer checks when array objects could be undefined. Not sure if that's the issue, but that's a problem that I've encountered with this sort of thing.

@andreasnoack
Member Author

Correction: the pointers do show a hexadecimal number, namely zero, so for some reason they become null pointers when the code is loaded from Base, but not when it is loaded from an external file.

@JeffBezanson
Member

A Ptr is serialized as NULL, since saving the address would not do any good. On each startup, the exact addresses of objects can be different.

@JeffBezanson
Member

When the code is compiled and run in the same session, the JIT will be able to inline the pointer constants into the generated code, which should explain the speedup. Unfortunately there is no way to obtain this optimization in pre-compiled code. We could add the equivalent of @ccallable for data, allowing the pointers to be read via cglobal.
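If such a facility existed, reading a table exported from C would look roughly like this; the symbol and library names are hypothetical.

# cglobal resolves a global symbol in a shared library to a typed pointer at load time,
# so no stale address gets baked into precompiled code
p = cglobal((:ziggurat_nor_table, "librandom"), Float64)
first_entry = unsafe_load(p, 1)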

@andreasnoack
Member Author

@JeffBezanson, thank you for the explanation. It makes sense.

I have removed the pointers, and the normals are still much faster in the Julia implementation on julia.mit.edu, but the exponentials are slightly slower without the pointers.

The definitions with and without an RNG argument are almost identical. Is it possible to write a loop with @eval where I define functions with and without an argument? I have a loop like

for t in (:nothing, :x)
    @eval begin
        f($t) = 2
    end
end

but it doesn't define a version without an argument. Is it possible to make a definition like that?

@toivoh
Contributor

toivoh commented Apr 15, 2014

I believe that something like

for t in ([], [:x])
    @eval begin
        f($(t...)) = 2
    end
end

should do the trick.
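Applied to the case above, the same splatting trick defines the with- and without-RNG methods from a single body; the names below are illustrative, not the PR's.

for args in ([], [:(rng::MersenneTwister)])
    @eval function zrandn($(args...))
        # ... shared ziggurat sampling body ...
        return 0.0   # placeholder
    end
end
# methods(zrandn) now lists zrandn() and zrandn(rng::MersenneTwister)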

@andreasnoack
Member Author

It works. Thanks @toivoh.

@ViralBShah
Member

Is this good to merge now?

@andreasnoack
Member Author

Yes, I think so.

ViralBShah added a commit that referenced this pull request on Apr 18, 2014:
Translate ziggurat algorithms for normal and exponential variates to Julia.
@ViralBShah merged commit 16c4168 into master on Apr 18, 2014
@andreasnoack deleted the anj/randn branch on August 13, 2014