diff --git a/posts/2014-07-Conv-Nets-Modular/index.html b/posts/2014-07-Conv-Nets-Modular/index.html
index 212075f..29eed17 100644
--- a/posts/2014-07-Conv-Nets-Modular/index.html
+++ b/posts/2014-07-Conv-Nets-Modular/index.html
@@ -288,7 +288,7 @@
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.
In fact, the use of highly-efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.
I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.
diff --git a/posts/2014-12-Groups-Convolution/index.html b/posts/2014-12-Groups-Convolution/index.html
index fec661c..688b84a 100644
--- a/posts/2014-12-Groups-Convolution/index.html
+++ b/posts/2014-12-Groups-Convolution/index.html
@@ -297,7 +297,7 @@
Group convolutions provide elegant language for talking about lots of situations involving probability. But, since this is a series of blog posts on convolutional neural networks, you may suspect that I have other interests in them. Well, you guessed correctly. Group convolutions naturally extend convolutional neural networks, with everything fitting together extremely nicely. Since convolutional neural networks are one of the most powerful tools in machine learning right now, that’s pretty interesting. In our next post, we will explore these networks.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m grateful to Yomna Nasser, Harry de Valence, Sam Eisenstat, and Sebastian Zany for taking the time to read and comment on a draft version of this post – their feedback improved it a lot!
I’m also grateful to Guillaume Alain, Eliana Lorch, Dario Amodei, Aaron Courville, Yoshua Bengio, and Michael Nielsen for discussion of group convolution and its potential applications to neural networks.
diff --git a/posts/tags/convolution.xml b/posts/tags/convolution.xml
index b48f9ac..f6dcae8 100644
--- a/posts/tags/convolution.xml
+++ b/posts/tags/convolution.xml
@@ -229,7 +229,7 @@ w_0 & w_1 & 0 & 0 & ...\\
In fact, the use of highly-efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.
I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.
diff --git a/posts/tags/convolutional neural networks.xml b/posts/tags/convolutional neural networks.xml
index dded6f6..72a7537 100644
--- a/posts/tags/convolutional neural networks.xml
+++ b/posts/tags/convolutional neural networks.xml
@@ -229,7 +229,7 @@ w_0 & w_1 & 0 & 0 & ...\\
In fact, the use of highly-efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.
I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.
@@ -444,7 +444,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.
In fact, the use of highly-efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.
I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.
diff --git a/posts/tags/modular neural networks.xml b/posts/tags/modular neural networks.xml
index baaf228..9252d84 100644
--- a/posts/tags/modular neural networks.xml
+++ b/posts/tags/modular neural networks.xml
@@ -203,7 +203,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.
In fact, the use of highly-efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.
I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.
@@ -1209,7 +1209,7 @@ Filters learned by the first convolutional layer. The top half corresponds to th
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m grateful to Eliana Lorch, Aaron Courville, and Sebastian Zany for their comments and support.
In fact, the use of highly-efficient parallel convolution implementations on GPUs has been essential to recent progress in computer vision.
This post is part of a series on convolutional neural networks and their generalizations. The first two posts will be review for those familiar with deep learning, while later ones should be of interest to everyone. To get updates, subscribe to my RSS feed!
-Please comment below or on the side. Pull requests can be made on github.
+Please comment below or on the side. Pull requests can be made on github.
I’m extremely grateful to Eliana Lorch, for extensive discussion of convolutions and help writing this post.
I’m also grateful to Michael Nielsen and Dario Amodei for their comments and support.