diff --git a/posts/2014-07-Conv-Nets-Modular/index.html b/posts/2014-07-Conv-Nets-Modular/index.html
index 212075f..29eed17 100644
--- a/posts/2014-07-Conv-Nets-Modular/index.html
+++ b/posts/2014-07-Conv-Nets-Modular/index.html
@@ -288,7 +288,7 @@

Formalizing Convolutional Neural Networks

Next Posts in this Series

Read the next post!

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.

diff --git a/posts/2014-07-Understanding-Convolutions/index.html b/posts/2014-07-Understanding-Convolutions/index.html
index b00cf6e..f82b6ae 100644
--- a/posts/2014-07-Understanding-Convolutions/index.html
+++ b/posts/2014-07-Understanding-Convolutions/index.html
@@ -313,7 +313,7 @@

Conclusion

In fact, the use of highly efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
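Concretely, the operation these GPU kernels implement is the discrete convolution. The following is my own minimal sketch of it in Python with numpy (an illustration of the definition, not the parallel implementation the post refers to):

```python
import numpy as np

def conv1d(a, w):
    """Discrete 1-D convolution: (a * w)[c] = sum over x + y = c of a[x] * w[y]."""
    out = np.zeros(len(a) + len(w) - 1)
    for x, ax in enumerate(a):
        for y, wy in enumerate(w):
            out[x + y] += ax * wy  # each pair (x, y) contributes to position x + y
    return out

signal = np.array([1.0, 2.0, 3.0])
kernel = np.array([0.5, 0.5])   # a two-tap averaging kernel
print(conv1d(signal, kernel))   # -> [0.5 1.5 2.5 1.5]
```

The nested loop makes the definition explicit; in practice one would call `np.convolve`, which computes the same thing far more efficiently.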

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.

I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.

diff --git a/posts/2014-12-Groups-Convolution/index.html b/posts/2014-12-Groups-Convolution/index.html
index fec661c..688b84a 100644
--- a/posts/2014-12-Groups-Convolution/index.html
+++ b/posts/2014-12-Groups-Convolution/index.html
@@ -297,7 +297,7 @@

Conclusion

Group convolutions provide an elegant language for talking about lots of situations involving probability. But, since this is a series of blog posts on convolutional neural networks, you may suspect that I have other interests in them. Well, you guessed correctly. Group convolutions naturally extend convolutional neural networks, with everything fitting together extremely nicely. Since convolutional neural networks are one of the most powerful tools in machine learning right now, that’s pretty interesting. In our next post, we will explore these networks.
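As a small illustration of the idea (my own sketch, not code from the post): over the cyclic group Z/n, the ordinary convolution constraint x + y = z becomes x + y ≡ z (mod n).

```python
import numpy as np

def cyclic_conv(f, g):
    """Group convolution over Z/n: (f * g)[z] = sum over x + y = z (mod n) of f[x] * g[y]."""
    n = len(f)
    out = np.zeros(n)
    for x in range(n):
        for y in range(n):
            out[(x + y) % n] += f[x] * g[y]  # group operation: addition mod n
    return out

# Probability sanity check: the sum of two independent uniform random
# variables on Z/4 is again uniform on Z/4.
u = np.full(4, 0.25)
print(cyclic_conv(u, u))  # -> [0.25 0.25 0.25 0.25]
```

Taking the group to be Z recovers ordinary 1-D convolution, and Z² gives the 2-D convolution used on images; other groups give the generalizations the next post explores.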

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgements

I’m grateful to Yomna Nasser, Harry de Valence, Sam Eisenstat, and Sebastian Zany for taking the time to read and comment on a draft version of this post – their feedback improved it a lot!

I’m also grateful to Guillaume Alain, Eliana Lorch, Dario Amodei, Aaron Courville, Yoshua Bengio, and Michael Nielsen for discussion of group convolution and its potential applications to neural networks.

diff --git a/posts/tags/convolution.xml b/posts/tags/convolution.xml
index b48f9ac..f6dcae8 100644
--- a/posts/tags/convolution.xml
+++ b/posts/tags/convolution.xml
@@ -229,7 +229,7 @@ w_0 & w_1 & 0 & 0 & ...\\

In fact, the use of highly efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.

I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.

diff --git a/posts/tags/convolutional neural networks.xml b/posts/tags/convolutional neural networks.xml
index dded6f6..72a7537 100644
--- a/posts/tags/convolutional neural networks.xml
+++ b/posts/tags/convolutional neural networks.xml
@@ -229,7 +229,7 @@ w_0 & w_1 & 0 & 0 & ...\\

In fact, the use of highly efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.

I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.

@@ -444,7 +444,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th

Next Posts in this Series

Read the next post!

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.

diff --git a/posts/tags/deep learning.xml b/posts/tags/deep learning.xml
index bb76d44..3acee0d 100644
--- a/posts/tags/deep learning.xml
+++ b/posts/tags/deep learning.xml
@@ -968,7 +968,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th

Next Posts in this Series

Read the next post!

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.

diff --git a/posts/tags/math.xml b/posts/tags/math.xml
index 903bcb3..b9fb167 100644
--- a/posts/tags/math.xml
+++ b/posts/tags/math.xml
@@ -229,7 +229,7 @@ w_0 & w_1 & 0 & 0 & ...\\

In fact, the use of highly efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.

I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.

diff --git a/posts/tags/modular neural networks.xml b/posts/tags/modular neural networks.xml
index baaf228..9252d84 100644
--- a/posts/tags/modular neural networks.xml
+++ b/posts/tags/modular neural networks.xml
@@ -203,7 +203,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th

Next Posts in this Series

Read the next post!

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.

diff --git a/posts/tags/neural networks.xml b/posts/tags/neural networks.xml
index b4be67f..8121303 100644
--- a/posts/tags/neural networks.xml
+++ b/posts/tags/neural networks.xml
@@ -994,7 +994,7 @@ w_0 & w_1 & 0 & 0 & ...\\

In fact, the use of highly efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.

I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.

@@ -1209,7 +1209,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th

Next Posts in this Series

Read the next post!

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.

diff --git a/posts/tags/probability.xml b/posts/tags/probability.xml
index 83af7bd..902ed01 100644
--- a/posts/tags/probability.xml
+++ b/posts/tags/probability.xml
@@ -229,7 +229,7 @@ w_0 & w_1 & 0 & 0 & ...\\

In fact, the use of highly efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.

Next Posts in this Series

This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!

-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.

Acknowledgments

I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.

I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.