
Parameters


The Parameters section is shown below:

[Screenshot: the Parameters panel and its drop-down menus]

This section lets you customize how the neural network is trained. It contains several drop-down menus, each described below:

Learning Rate

This stems from Stochastic Gradient Descent: the learning rate is the step size taken on each update, moving in the direction of the negative gradient of the cost function. This drop-down menu lets you choose between several learning rates. Setting the value too large may cause training to oscillate or even diverge, while setting it too small makes training slow to converge. Experiment with this value for different datasets. The default is 0.03.
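As a minimal sketch of the idea (the function name `sgd_step` and the toy cost f(w) = w² are illustrative assumptions, not the Playground's code), a single gradient-descent step shows how the learning rate scales each update:

```python
# Toy sketch: gradient descent on f(w) = w**2, whose gradient is 2*w.
# The learning rate scales each step toward the minimum at w = 0.
def sgd_step(w, grad, learning_rate=0.03):
    return w - learning_rate * grad  # move against the gradient

w = 2.0
for _ in range(5):
    w = sgd_step(w, grad=2 * w)
print(w)  # w shrinks toward 0; too large a rate overshoots and oscillates instead
```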

Activation Function

The hidden layers in the neural network all share the same activation function, and you can choose one of the following:

  • ReLU
  • Tanh
  • Sigmoid
  • Linear

The first three are used primarily for classification, while the last one (Linear) is usually used in regression problems.
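The following is an illustrative sketch of these four activations (not the Playground's source), each applied element-wise to a layer's pre-activation values:

```python
import numpy as np

# Sketch definitions of the four activation functions in the drop-down menu.
def relu(x):    return np.maximum(0.0, x)
def tanh(x):    return np.tanh(x)
def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))
def linear(x):  return x  # identity, hence its use for regression outputs

x = np.array([-2.0, 0.0, 2.0])
for f in (relu, tanh, sigmoid, linear):
    print(f.__name__, f(x))
```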

Regularization

This is a mechanism to prevent overfitting and to help your neural network generalize to new inputs. The available options are listed below, with a small sketch of both penalties after the list:

  • None: No regularization - the default
  • L1: L1 regularization using the technique outlined in Tsuruoka et al.'s work.
  • L2: L2 regularization
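As a hedged sketch (illustrative formulas only, not Tsuruoka et al.'s exact update rule or the Playground's implementation), the two penalty terms added to the training loss look like this, where `lam` stands in for the regularization rate chosen in the next drop-down menu:

```python
import numpy as np

# Sketch of the two penalty terms added to the training loss.
def l1_penalty(weights, lam):
    return lam * np.sum(np.abs(weights))     # drives weights to exactly 0 (sparse)

def l2_penalty(weights, lam):
    return 0.5 * lam * np.sum(weights ** 2)  # shrinks weights smoothly toward 0

w = np.array([0.5, -1.5, 3.0])
print(l1_penalty(w, lam=0.1))  # 0.5
print(l2_penalty(w, lam=0.1))  # 0.575
```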

Regularization Rate

The strength of the regularization to apply. This drop-down menu lets you choose between several regularization rates. Setting the value too large may produce a neural network with high bias that underfits the data; setting it too small, or to 0, risks overfitting the data (high variance).

Tip: Start with a small value, then increase the value slowly to see what the effects are.
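To see this bias/variance trade-off concretely, here is a toy ridge-regression sweep (an illustration under assumed data, not the Playground's model): as the regularization rate grows, the fitted weights shrink toward zero.

```python
import numpy as np

# Toy demo: larger regularization rates shrink the weights toward 0,
# trading variance (overfitting) for bias (underfitting).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=20)

for lam in (0.0, 0.1, 1.0, 10.0):
    # Closed-form ridge solution: w = (X^T X + lam * I)^(-1) X^T y
    w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
    print(lam, np.round(w, 3))
```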

Problem Type

This drop-down menu lets you choose whether the problem to solve is classification or regression.
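The main practical difference is the training objective. As a sketch (an illustrative pairing, not the Playground's exact losses), regression typically minimizes squared error while binary classification typically minimizes logistic (cross-entropy) loss:

```python
import numpy as np

# Sketch of how the problem type changes the training objective.
def regression_loss(pred, target):
    return 0.5 * np.mean((pred - target) ** 2)

def classification_loss(score, label):  # label is 0 or 1
    p = 1.0 / (1.0 + np.exp(-score))    # sigmoid maps the score to (0, 1)
    return -np.mean(label * np.log(p) + (1 - label) * np.log(1 - p))

print(regression_loss(np.array([0.8]), np.array([1.0])))      # 0.02
print(classification_loss(np.array([0.8]), np.array([1.0])))  # ~0.371
```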
