This repository was archived by the owner on Jul 10, 2021. It is now read-only.
Adding more tests for grid search and randomized search, using both classifiers and regressors. Removed prefix from output layers. Adding example to advanced section of documentation.
docs/guide_advanced.rst (+22 −4)

@@ -44,16 +44,15 @@ Here's how to setup such a pipeline with a multi-layer perceptron as a classifie
You can then use the pipeline as you would the neural network, or any other standard API from scikit-learn.
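The pipeline construction itself sits above this hunk, so as a hedged illustration of "use the pipeline like any standard scikit-learn API", here is a minimal sketch in which ``LogisticRegression`` stands in for the sknn ``Classifier`` built earlier in the guide (the step names and data are placeholders):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

# ``LogisticRegression`` is a stand-in for the sknn ``Classifier``;
# the fit/predict calling convention is identical.
pipeline = Pipeline([
    ('min/max scaler', MinMaxScaler(feature_range=(0.0, 1.0))),
    ('estimator', LogisticRegression())])

X, y = make_classification(n_samples=60, n_features=5, random_state=0)
pipeline.fit(X, y)
predictions = pipeline.predict(X)
print(predictions.shape)
```

Scaling inputs to ``[0, 1]`` before the estimator is the pattern the guide recommends for neural networks, since unscaled features slow or destabilize training.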

Grid Search
-----------

In scikit-learn, you can use a ``GridSearchCV`` to optimize your neural network's hyper-parameters automatically, both the top-level parameters and the parameters within the layers. For example, assuming you have your MLP constructed as in the :ref:`Regression` example in the local variable called ``nn``, the layers are named automatically so you can refer to them as follows:

* ``hidden0``
* ``hidden1``
* ...
* ``output``

Keep in mind you can manually specify the ``name`` of any ``Layer`` in the constructor if you don't want the automatically assigned name. Then, you can use sklearn's hierarchical parameters to perform a grid search over those nested parameters too:
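The grid-search example itself is elided from this hunk. As a sketch of sklearn's double-underscore convention for nested parameters, a plain ``Pipeline`` with ``Ridge`` stands in for the sknn network (``nn`` is not shown here), and the modern ``sklearn.model_selection`` import path is used in place of the older ``sklearn.grid_search``:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

X, y = make_regression(n_samples=50, n_features=4, random_state=0)

# A stand-in pipeline; with sknn, keys such as ``hidden0__units`` or
# ``hidden0__type`` would address the layer named ``hidden0`` the same way.
pipeline = Pipeline([('scaler', MinMaxScaler()), ('model', Ridge())])
gs = GridSearchCV(pipeline,
                  param_grid={'model__alpha': [0.1, 1.0, 10.0]},
                  cv=3)
gs.fit(X, y)
print(gs.best_params_)
```

The ``<name>__<parameter>`` keys are resolved recursively, which is what lets a single ``param_grid`` cover both top-level parameters and parameters nested inside a named layer or pipeline step.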
@@ -67,4 +66,23 @@ Keep in mind you can manually specify the ``name`` of any ``Layer`` in the const
This will search through the listed ``learning_rate`` values, the number of hidden units and the activation type for that layer too, and find the best combination of parameters.

Randomized Search
-----------------

When you have a large number of hyper-parameters and want to try combinations automatically to find a good one, you can use a randomized search as follows:

.. code:: python

    from scipy import stats
    from sklearn.grid_search import RandomizedSearchCV
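The diff is truncated at this point. A plausible completion, mirroring the grid-search pattern but drawing parameter values from distributions instead of explicit lists (``Ridge`` again stands in for the sknn network, and the modern ``sklearn.model_selection`` import path is used so the snippet runs on current scikit-learn):

```python
from scipy import stats
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import RandomizedSearchCV

X, y = make_regression(n_samples=50, n_features=4, random_state=0)

# Distributions replace the explicit value lists used by GridSearchCV;
# with sknn you would use nested keys like ``hidden0__units`` here too.
rs = RandomizedSearchCV(
    Ridge(),
    param_distributions={'alpha': stats.uniform(0.01, 10.0)},
    n_iter=5, cv=3, random_state=0)
rs.fit(X, y)
print(rs.best_params_)
```

Because only ``n_iter`` combinations are sampled, randomized search scales to parameter spaces far too large to enumerate exhaustively, at the cost of not guaranteeing the single best combination.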