Translate ziggurat algorithms for normal and exponential variates to Julia. #6501
Conversation
I still think I'd rather hard-code the tables; if nothing else this will make it easier to build more of julia without bigfloat support.
Okay. Is there a point in letting the generating function stay so that we can see how the tables are created? I can see that Travis doesn't like this. I have removed the
How about checking the tables against the generating code as a BigFloat test? That way we keep the code around, make sure it keeps working, and don't cause the build process to depend on BigFloats.
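A minimal sketch of what such a check could look like (the generator name `make_ziggurat_tables` is hypothetical, and the table names `ki`, `wi`, `fi` are assumptions, not necessarily the identifiers used in the PR):

```julia
# Hypothetical sketch: regenerate the tables with the BigFloat-based generator
# kept in the test suite and compare them against the hard-coded arrays in Base.
# `make_ziggurat_tables` and the table names are placeholders.
ki_big, wi_big, fi_big = make_ziggurat_tables()  # BigFloat arithmetic inside, final values as UInt64/Float64
@assert ki_big == Base.Random.ki
@assert wi_big == Base.Random.wi
@assert fi_big == Base.Random.fi
```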
Travis is sad because

```diff
diff --git a/deps/Makefile b/deps/Makefile
index f9b5278..6f17fd5 100644
--- a/deps/Makefile
+++ b/deps/Makefile
@@ -645,7 +645,7 @@ dsfmt-$(DSFMT_VER)/config.status: dsfmt-$(DSFMT_VER).tar.gz
 	$(TAR) -C dsfmt-$(DSFMT_VER) --strip-components 1 -xf dsfmt-$(DSFMT_VER).tar.gz && \
 	cd dsfmt-$(DSFMT_VER) && patch < ../dSFMT.h.patch && patch < ../dSFMT.c.patch
 	echo 1 > $@
-$(LIBRANDOM_OBJ_SOURCE): dsfmt-$(DSFMT_VER)/dSFMT.c dsfmt-$(DSFMT_VER)/config.status
+$(LIBRANDOM_OBJ_SOURCE): dsfmt-$(DSFMT_VER)/config.status dsfmt-$(DSFMT_VER)/dSFMT.c
 	$(CC) $(CPPFLAGS) $(LIBRANDOM_CFLAGS) $(LDFLAGS) dsfmt-$(DSFMT_VER)/dSFMT.c -o librandom.$(SHLIB_EXT) && \
 	$(INSTALL_NAME_CMD)librandom.$(SHLIB_EXT) librandom.$(SHLIB_EXT)
 $(LIBRANDOM_OBJ_TARGET): $(LIBRANDOM_OBJ_SOURCE)
```
Certainly good to have the generation code in there as well.
I have a problem with defining pointers to the arrays storing the ziggurat tables. First of all, I am in doubt whether this ought to be faster than normal array indexing. When using normal array indexing, lights are green (thank you @staticfloat) and both the normal and exponential variates pass the BigCrush tests. I have included the table creation as a test for the hardcoded tables.
@inbounds doesn't avoid the null pointer checks when array objects could be undefined. Not sure if that's the issue, but that's a problem that I've encountered with this sort of thing.
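For illustration only (not the PR's code), a sketch of the two table-lookup styles under discussion, using a stand-in table:

```julia
const tab = rand(256)                  # stand-in for one of the ziggurat tables

# Pointer-based lookup: the pointer is captured once at load time and then
# dereferenced with unsafe_load, skipping bounds and null checks entirely.
# If the array is not yet defined when the pointer is taken (e.g. during the
# Base bootstrap), the stored pointer can end up null.
const tab_ptr = pointer(tab)
lookup_ptr(i) = unsafe_load(tab_ptr, i)

# Plain indexing: @inbounds drops the bounds check in the hot loop, while the
# array reference itself stays valid however the code is loaded.
function lookup_idx(i)
    @inbounds x = tab[i]
    return x
end
```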
Correction: the pointers do show a hexadecimal number, but it is zero, so for some reason they become null pointers when the code is loaded from Base, though not when it is loaded from an external file.
A
When the code is compiled and run in the same session, the JIT will be able to inline the pointer constants into the generated code, which should explain the speedup. Unfortunately there is no way to obtain this optimization in pre-compiled code. We could add the equivalent of
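As a hedged illustration of the point above (not something from the PR): interpolating a pointer value into a generated method bakes the raw address into that method, which is the kind of constant the JIT can exploit in the current session but which would be stale in precompiled code.

```julia
const tab2 = rand(256)

# Splice the numeric pointer value into a generated method: the address becomes
# a literal in the code. This is only meaningful in the session where `tab2`
# was allocated; a precompiled copy of the method would carry a dead address.
let p = pointer(tab2)
    @eval lookup_inlined(i) = unsafe_load($p, i)
end
```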
@JeffBezanson, thank you for the explanation. It makes sense. I have removed the pointers and the normals are still much faster in the Julia implementation on julia.mit.edu, but the exponentials are slightly slower without the pointers. The definitions with and without an RNG are almost identical. Is it possible to write a loop with

but it doesn't define a version without an argument. Is it possible to make a definition like that?
I believe that something like
should do the trick.
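The code spans in the two comments above were lost in extraction. Purely as a guess at the shape of the solution (hypothetical function names and a stand-in default RNG, not the PR's actual code), an @eval loop can emit both the RNG-taking method and the zero-argument fallback together:

```julia
using Random  # MersenneTwister and AbstractRNG in current Julia

# Hypothetical sketch: keep the RNG method and the no-argument method in sync
# by generating both in one loop. `DEFAULT_RNG` stands in for whatever global
# RNG state the real code uses; the bodies are placeholders, not the samplers.
const DEFAULT_RNG = MersenneTwister(1234)

for f in (:randn_sketch, :randexp_sketch)
    @eval begin
        $f(rng::AbstractRNG) = rand(rng)   # placeholder body
        $f() = $f(DEFAULT_RNG)             # the "version without an argument"
    end
end
```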
It works. Thanks @toivoh.
Is this good to merge now?
Yes, I think so.
Translate ziggurat algorithms for normal and exponential variates to Julia.
This pull request removes the C file that computes normal variates in favor of a translation to Julia. On julia.mit.edu the Julia implementation was 25 pct. faster, which I can't really explain. On my MacBook, they are approximately equally fast.

The tables necessary for calculating the normal variates are made in `BigFloat` and finally translated to `Uint64` and `Float64`, thereby making them identical on 32 and 64 bit systems without storing the hardcoded tables as we have done for about a week.

The server is busy with other computations, and therefore I haven't run BigCrush on the normal and exponential variates yet, but I have tried the test that `rand` fails, see #6464. Surprisingly, the normal variates don't fail the test.
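As a rough sketch of the table-generation idea described above (the constant and the arithmetic are illustrative, not the PR's actual formulas): do the intermediate work in `BigFloat` and convert only the final values, so the stored table entries come out bit-identical on 32- and 64-bit builds.

```julia
# Illustrative only: compute in BigFloat, store the results as Float64/UInt64.
setprecision(BigFloat, 256) do
    r = big"3.65415288536100879635"      # example constant (tail cutoff style value)
    v = exp(-r^2 / 2) / r                # some BigFloat intermediate quantity
    w_entry = Float64(r * 2.0^-51)       # what would be hard-coded as a Float64 entry
    k_entry = trunc(UInt64, v * 2.0^51)  # what would be hard-coded as a UInt64 entry
    (w_entry, k_entry)
end
```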