This repository was archived by the owner on Nov 16, 2023. It is now read-only.
Merged
Original file line number Diff line number Diff line change
@@ -32,7 +32,4 @@

.. index:: models, ensemble, classification

-Example:
-.. literalinclude:: /../nimbusml/examples/EnsembleClassifier.py
-:language: python
"""
2 changes: 1 addition & 1 deletion src/python/docs/docstrings/EnsembleClassifier.txt
@@ -51,7 +51,7 @@
or ``"LogLossReduction"``.


-:output_combiner: indicates how to combine the predictions of the different
+:param output_combiner: indicates how to combine the predictions of the different
models into a single prediction. There are five available output
combiners for classification:

4 changes: 2 additions & 2 deletions src/python/docs/docstrings/EnsembleRegressor.txt
@@ -51,7 +51,7 @@
``"RSquared"``.


-:output_combiner: indicates how to combine the predictions of the different
+:param output_combiner: indicates how to combine the predictions of the different
models into a single prediction. There are five available output
combiners for regression:

@@ -126,7 +126,7 @@
<nimbusml.ensemble.output_combiner.RegressorStacking>`


-.. index:: models, ensemble, classification
+.. index:: models, ensemble, regression

Example:
.. literalinclude:: /../nimbusml/examples/EnsembleRegressor.py
6 changes: 3 additions & 3 deletions src/python/docs/docstrings/LinearSvmBinaryClassifier.txt
@@ -5,11 +5,11 @@
.. remarks::
Linear SVM implements an algorithm that finds a hyperplane in the
feature space for binary classification, by solving an SVM problem.
-For instance, with feature values *f_0, f_1,..., f_{D-1}*, the
+For instance, with feature values $f_0, f_1,..., f_{D-1}$, the
prediction is given by determining what side of the hyperplane the
point falls into. That is the same as the sign of the features'
-weighted sum, i.e. *\sum_{i = 0}^{D-1} \left(w_i * f_i \right) + b*,
-where *w_0, w_1,..., w_{D-1}* are the weights computed by the
+weighted sum, i.e. $\sum_{i = 0}^{D-1} \left(w_i * f_i \right) + b$,
+where $w_0, w_1,..., w_{D-1}$ are the weights computed by the
algorithm, and *b* is the bias computed by the algorithm.
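The decision rule described here can be sketched in a few lines of plain Python. The weights and bias below are hypothetical stand-ins, not values from the nimbusml trainer, which learns them from data:

```python
# A minimal sketch of the linear SVM decision rule described above.
# The weights w_0..w_{D-1} and bias b here are made-up illustrative values.
w = [0.5, -1.0, 0.25]   # weights computed by the algorithm
b = 0.1                 # bias computed by the algorithm

def predict(features):
    """Return +1 or -1 from the sign of the features' weighted sum plus bias."""
    score = sum(w_i * f_i for w_i, f_i in zip(w, features)) + b
    return 1 if score >= 0 else -1
```

A point with a positive score lies on one side of the hyperplane, a negative score on the other.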

The algorithm implemented is the PEGASOS method, which alternates
@@ -28,9 +28,6 @@
<nimbusml.ensemble.output_combiner.RegressorStacking>`


-.. index:: models, ensemble, classification
+.. index:: models, ensemble, regression

-Example:
-.. literalinclude:: /../nimbusml/examples/EnsembleClassifier.py
-:language: python
"""
5 changes: 1 addition & 4 deletions src/python/nimbusml/ensemble/ensembleclassifier.py
@@ -77,8 +77,7 @@ class EnsembleClassifier(core, BasePredictor, ClassifierMixin):
``"AccuracyMicro"``, ``"AccuracyMacro"``, ``"LogLoss"``,
or ``"LogLossReduction"``.


-:output_combiner: indicates how to combine the predictions of the different
+:param output_combiner: indicates how to combine the predictions of the different
models into a single prediction. There are five available output
combiners for classification:

Expand All @@ -97,8 +96,6 @@ class EnsembleClassifier(core, BasePredictor, ClassifierMixin):
outputs of the trained models, weighted by the specified metric. The
metric can be ``"AccuracyMicroAvg"`` or ``"AccuracyMacroAvg"``.
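The metric-weighted average described here can be sketched as follows. The probabilities and per-model accuracies are toy values for illustration, not output of nimbusml's internal API:

```python
# Toy sketch of a metric-weighted average combiner: each model's class
# probabilities are weighted by that model's (hypothetical) validation accuracy.
model_probs = [
    [0.7, 0.3],  # model 1: P(class 0), P(class 1)
    [0.6, 0.4],  # model 2
    [0.2, 0.8],  # model 3
]
accuracies = [0.9, 0.8, 0.5]  # e.g. per-model AccuracyMicroAvg

total = sum(accuracies)
combined = [
    sum(acc * probs[k] for acc, probs in zip(accuracies, model_probs)) / total
    for k in range(2)
]
prediction = combined.index(max(combined))  # argmax over classes
```

Because the weights are normalized, the combined vector is still a probability distribution over the classes.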

-:param output_combiner: Output combiner.

:param normalize: Specifies the type of automatic normalization used:

* ``"Auto"``: if normalization is needed, it is performed
7 changes: 2 additions & 5 deletions src/python/nimbusml/ensemble/ensembleregressor.py
@@ -77,8 +77,7 @@ class EnsembleRegressor(core, BasePredictor, RegressorMixin):
can be ``"L1"``, ``"L2"``, ``"Rms"``, ``"Loss"``, or
``"RSquared"``.


-:output_combiner: indicates how to combine the predictions of the different
+:param output_combiner: indicates how to combine the predictions of the different
models into a single prediction. There are five available output
combiners for regression:

Expand All @@ -91,8 +90,6 @@ class EnsembleRegressor(core, BasePredictor, RegressorMixin):
of the different models on a training instance, and the instance's
label.
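The stacking idea can be sketched in plain Python. Here a single blend weight found by grid search stands in for the real meta-learner, and all names and data are illustrative only:

```python
# Toy stacking sketch: the combiner is itself fit on the base models'
# predictions (meta-features) against the true labels. The "meta-model"
# is a single blend weight chosen by grid search, purely for illustration.
base_preds = [
    (1.0, 2.0),  # (model A, model B) predictions for instance 1
    (2.0, 2.5),  # instance 2
    (3.0, 3.5),  # instance 3
]
labels = [1.4, 2.2, 3.2]

def mse(alpha):
    # Combine as alpha * A + (1 - alpha) * B; measure squared error vs labels.
    return sum((alpha * a + (1 - alpha) * b - y) ** 2
               for (a, b), y in zip(base_preds, labels))

# The "trained" combiner: the blend weight that best fits the training labels.
alpha = min((i / 100 for i in range(101)), key=mse)
```

The learned blend can do no worse on the training data than either base model alone, which is the point of fitting the combiner rather than fixing it in advance.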

-:param output_combiner: Output combiner.

:param normalize: Specifies the type of automatic normalization used:

* ``"Auto"``: if normalization is needed, it is performed
@@ -166,7 +163,7 @@ class EnsembleRegressor(core, BasePredictor, RegressorMixin):
<nimbusml.ensemble.output_combiner.RegressorStacking>`


-.. index:: models, ensemble, classification
+.. index:: models, ensemble, regression

Example:
.. literalinclude:: /../nimbusml/examples/EnsembleRegressor.py
6 changes: 3 additions & 3 deletions src/python/nimbusml/ensemble/output_combiner/__init__.py
@@ -13,7 +13,7 @@
'ClassifierStacking',
'ClassifierVoting',
'ClassifierWeightedAverage',
-'ClassifierAverage',
+'ClassifierMedian',
'ClassifierStacking',
'RegressorAverage',
'RegressorMedian',
'RegressorStacking'
]
@@ -58,9 +58,6 @@ class ClassifierBestPerformanceSelector(core):

.. index:: models, ensemble, classification

-Example:
-.. literalinclude:: /../nimbusml/examples/EnsembleClassifier.py
-:language: python
"""

@trace
@@ -54,11 +54,8 @@ class RegressorBestPerformanceSelector(core):
<nimbusml.ensemble.output_combiner.RegressorStacking>`


-.. index:: models, ensemble, classification
+.. index:: models, ensemble, regression

-Example:
-.. literalinclude:: /../nimbusml/examples/EnsembleClassifier.py
-:language: python
"""

@trace
@@ -77,8 +77,7 @@ class EnsembleClassifier(
``"AccuracyMicro"``, ``"AccuracyMacro"``, ``"LogLoss"``,
or ``"LogLossReduction"``.


-:output_combiner: indicates how to combine the predictions of the different
+:param output_combiner: indicates how to combine the predictions of the different
models into a single prediction. There are five available output
combiners for classification:

Expand All @@ -97,8 +96,6 @@ class EnsembleClassifier(
outputs of the trained models, weighted by the specified metric. The
metric can be ``"AccuracyMicroAvg"`` or ``"AccuracyMacroAvg"``.

-:param output_combiner: Output combiner.

:param normalize: Specifies the type of automatic normalization used:

* ``"Auto"``: if normalization is needed, it is performed
@@ -75,8 +75,7 @@ class EnsembleRegressor(
can be ``"L1"``, ``"L2"``, ``"Rms"``, ``"Loss"``, or
``"RSquared"``.


-:output_combiner: indicates how to combine the predictions of the different
+:param output_combiner: indicates how to combine the predictions of the different
models into a single prediction. There are five available output
combiners for regression:

Expand All @@ -89,8 +88,6 @@ class EnsembleRegressor(
of the different models on a training instance, and the instance's
label.

-:param output_combiner: Output combiner.

:param normalize: Specifies the type of automatic normalization used:

* ``"Auto"``: if normalization is needed, it is performed
@@ -164,7 +161,7 @@ class EnsembleRegressor(
<nimbusml.ensemble.output_combiner.RegressorStacking>`


-.. index:: models, ensemble, classification
+.. index:: models, ensemble, regression

Example:
.. literalinclude:: /../nimbusml/examples/EnsembleRegressor.py
@@ -58,9 +58,6 @@ class ClassifierBestPerformanceSelector(Component):

.. index:: models, ensemble, classification

-Example:
-.. literalinclude:: /../nimbusml/examples/EnsembleClassifier.py
-:language: python
"""

@trace
@@ -54,11 +54,8 @@ class RegressorBestPerformanceSelector(Component):
<nimbusml.ensemble.output_combiner.RegressorStacking>`


-.. index:: models, ensemble, classification
+.. index:: models, ensemble, regression

-Example:
-.. literalinclude:: /../nimbusml/examples/EnsembleClassifier.py
-:language: python
"""

@trace
@@ -26,11 +26,11 @@ class LinearSvmBinaryClassifier(
.. remarks::
Linear SVM implements an algorithm that finds a hyperplane in the
feature space for binary classification, by solving an SVM problem.
-For instance, with feature values *f_0, f_1,..., f_{D-1}*, the
+For instance, with feature values $f_0, f_1,..., f_{D-1}$, the
prediction is given by determining what side of the hyperplane the
point falls into. That is the same as the sign of the features'
-weighted sum, i.e. *\sum_{i = 0}^{D-1} \left(w_i * f_i \right) + b*,
-where *w_0, w_1,..., w_{D-1}* are the weights computed by the
+weighted sum, i.e. $\sum_{i = 0}^{D-1} \left(w_i * f_i \right) + b$,
+where $w_0, w_1,..., w_{D-1}$ are the weights computed by the
algorithm, and *b* is the bias computed by the algorithm.

The algorithm implemented is the PEGASOS method, which alternates
6 changes: 3 additions & 3 deletions src/python/nimbusml/linear_model/linearsvmbinaryclassifier.py
@@ -29,11 +29,11 @@ class LinearSvmBinaryClassifier(
.. remarks::
Linear SVM implements an algorithm that finds a hyperplane in the
feature space for binary classification, by solving an SVM problem.
-For instance, with feature values *f_0, f_1,..., f_{D-1}*, the
+For instance, with feature values $f_0, f_1,..., f_{D-1}$, the
prediction is given by determining what side of the hyperplane the
point falls into. That is the same as the sign of the features'
-weighted sum, i.e. *\sum_{i = 0}^{D-1} \left(w_i * f_i \right) + b*,
-where *w_0, w_1,..., w_{D-1}* are the weights computed by the
+weighted sum, i.e. $\sum_{i = 0}^{D-1} \left(w_i * f_i \right) + b$,
+where $w_0, w_1,..., w_{D-1}$ are the weights computed by the
algorithm, and *b* is the bias computed by the algorithm.

The algorithm implemented is the PEGASOS method, which alternates