Commit ac731db

Merge pull request #100 from zeya30/docsite-update

Docsite update

dylanbouchard authored Jan 13, 2025
2 parents 74966e6 + a6be5ef

Showing 125 changed files with 222 additions and 55 deletions.
2 changes: 1 addition & 1 deletion docs/latest/.buildinfo
@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 259f0daeeb0c99778d9a9bb1e10c103a
+config: d0962c459949a91fb59466a018c3575f
 tags: 645f666f9bcd5a90fca523b33c5a78b7
Binary file modified docs/latest/.doctrees/_autosummary/langfair.auto.auto.doctree
Binary file modified docs/latest/.doctrees/_autosummary/langfair.auto.doctree
Binary file modified docs/latest/.doctrees/_autosummary/langfair.constants.doctree
Binary file modified docs/latest/.doctrees/_autosummary/langfair.generator.doctree
Binary file modified docs/latest/.doctrees/_autosummary/langfair.metrics.doctree
Binary file modified docs/latest/.doctrees/api.doctree
Binary file modified docs/latest/.doctrees/auto_examples/auto_eval_demo.doctree
Binary file modified docs/latest/.doctrees/environment.pickle
Binary file modified docs/latest/.doctrees/guide.doctree
Binary file modified docs/latest/.doctrees/index.doctree
Binary file modified docs/latest/.doctrees/usage.doctree

@@ -704,18 +704,20 @@ <h1>langfair.generator.counterfactual.CounterfactualGenerator<a class="headerlin
</ul>
</dd>
<dt class="field-even">Returns<span class="colon">:</span></dt>
<dd class="field-even"><p><p>A dictionary with two keys: ‘data’ and ‘metadata’.
‘data’ : dict</p>
<blockquote>
<div><p>A dictionary containing the prompts and responses.</p>
</div></blockquote>
<dd class="field-even"><p><p>A dictionary with two keys: ‘data’ and ‘metadata’.</p>
<dl>
<dt>’metadata’<span class="classifier">dict</span></dt><dd><p>A dictionary containing metadata about the generation process.
‘non_completion_rate’ : float</p>
<blockquote>
<div><p>The rate at which the generation process did not complete.</p>
</div></blockquote>
<dt>’data’<span class="classifier">dict</span></dt><dd><p>A dictionary containing the prompts and responses.</p>
<dl class="simple">
<dt>’prompt’<span class="classifier">list</span></dt><dd><p>A list of prompts.</p>
</dd>
<dt>’response’<span class="classifier">list</span></dt><dd><p>A list of responses corresponding to the prompts.</p>
</dd>
</dl>
</dd>
<dt>’metadata’<span class="classifier">dict</span></dt><dd><p>A dictionary containing metadata about the generation process.</p>
<dl class="simple">
<dt>’non_completion_rate’<span class="classifier">float</span></dt><dd><p>The rate at which the generation process did not complete.</p>
</dd>
<dt>’temperature’<span class="classifier">float</span></dt><dd><p>The temperature parameter used in the generation process.</p>
</dd>
<dt>’count’<span class="classifier">int</span></dt><dd><p>The count of prompts used in the generation process.</p>
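The reformatted docstring in this hunk documents a nested return value with 'data' and 'metadata' keys. As a minimal sketch of that shape (the sample values below are placeholders, not taken from the diff):

# Hypothetical shape of the documented return value (illustrative values only).
result = {
    "data": {
        "prompt": ["Example prompt 1", "Example prompt 2"],        # list of prompts
        "response": ["Example response 1", "Example response 2"],  # responses aligned with the prompts
    },
    "metadata": {
        "non_completion_rate": 0.0,  # rate at which the generation process did not complete
        "temperature": 1.0,          # temperature parameter used in generation
        "count": 2,                  # count reported for the generation process
    },
}
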
@@ -622,26 +622,20 @@ <h1>langfair.generator.generator.ResponseGenerator<a class="headerlink" href="#l
</ul>
</dd>
<dt class="field-even">Returns<span class="colon">:</span></dt>
<dd class="field-even"><p><p>A dictionary with two keys: ‘data’ and ‘metadata’.
‘data’ : dict</p>
<blockquote>
<div><p>A dictionary containing the prompts and responses.
‘prompt’ : list</p>
<blockquote>
<div><p>A list of prompts.</p>
</div></blockquote>
<dd class="field-even"><p><p>A dictionary with two keys: ‘data’ and ‘metadata’.</p>
<dl>
<dt>’data’<span class="classifier">dict</span></dt><dd><p>A dictionary containing the prompts and responses.</p>
<dl class="simple">
<dt>’prompt’<span class="classifier">list</span></dt><dd><p>A list of prompts.</p>
</dd>
<dt>’response’<span class="classifier">list</span></dt><dd><p>A list of responses corresponding to the prompts.</p>
</dd>
</dl>
</div></blockquote>
<dl>
<dt>’metadata’<span class="classifier">dict</span></dt><dd><p>A dictionary containing metadata about the generation process.
‘non_completion_rate’ : float</p>
<blockquote>
<div><p>The rate at which the generation process did not complete.</p>
</div></blockquote>
</dd>
<dt>’metadata’<span class="classifier">dict</span></dt><dd><p>A dictionary containing metadata about the generation process.</p>
<dl class="simple">
<dt>’non_completion_rate’<span class="classifier">float</span></dt><dd><p>The rate at which the generation process did not complete.</p>
</dd>
<dt>’temperature’<span class="classifier">float</span></dt><dd><p>The temperature parameter used in the generation process.</p>
</dd>
<dt>’count’<span class="classifier">int</span></dt><dd><p>The count of prompts used in the generation process.</p>
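The ResponseGenerator hunk documents the same return structure. A small, hedged consumption sketch that relies only on the keys listed above (the helper name and printed format are illustrative, not part of langfair):

from typing import Any, Dict

def summarize_generation(result: Dict[str, Any]) -> None:
    # Pull out the documented 'data' and 'metadata' keys.
    prompts = result["data"]["prompt"]
    responses = result["data"]["response"]
    meta = result["metadata"]
    print(f"count={meta['count']}, temperature={meta['temperature']}, "
          f"non_completion_rate={meta['non_completion_rate']:.1%}")
    for prompt, response in zip(prompts, responses):
        print(f"Q: {prompt}\nA: {response}")

# Usage with a stand-in result of the documented shape.
summarize_generation({
    "data": {"prompt": ["Hello?"], "response": ["Hi there."]},
    "metadata": {"non_completion_rate": 0.0, "temperature": 1.0, "count": 1},
})
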
@@ -569,15 +569,35 @@ <h1>langfair.metrics.classification.metrics.baseclass.metrics.Metric<a class="he
<tr class="row-odd"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.__init__" title="langfair.metrics.classification.metrics.baseclass.metrics.Metric.__init__"><code class="xref py py-obj docutils literal notranslate"><span class="pre">__init__</span></code></a>()</p></td>
<td><p></p></td>
</tr>
<tr class="row-even"><td><p><code class="xref py py-obj docutils literal notranslate"><span class="pre">binary_confusion_matrix</span></code>(y_true, y_pred)</p></td>
<td><p></p></td>
<tr class="row-even"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.binary_confusion_matrix" title="langfair.metrics.classification.metrics.baseclass.metrics.Metric.binary_confusion_matrix"><code class="xref py py-obj docutils literal notranslate"><span class="pre">binary_confusion_matrix</span></code></a>(y_true, y_pred)</p></td>
<td><p>Method for computing binary confusion matrix</p></td>
</tr>
<tr class="row-odd"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.evaluate" title="langfair.metrics.classification.metrics.baseclass.metrics.Metric.evaluate"><code class="xref py py-obj docutils literal notranslate"><span class="pre">evaluate</span></code></a>(groups, y_pred[, y_true, ratio])</p></td>
<td><p>Abstract method that needs to be implemented by the user when creating a new metric function.</p></td>
</tr>
</tbody>
</table>
</div>
<dl class="py method">
<dt class="sig sig-object py" id="langfair.metrics.classification.metrics.baseclass.metrics.Metric.binary_confusion_matrix">
<em class="property"><span class="pre">static</span><span class="w"> </span></em><span class="sig-name descname"><span class="pre">binary_confusion_matrix</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">y_true</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_pred</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.binary_confusion_matrix" title="Link to this definition">#</a></dt>
<dd><p>Method for computing binary confusion matrix</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters<span class="colon">:</span></dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>y_true</strong> (<em>Array-like</em>) – Binary labels (ground truth values)</p></li>
<li><p><strong>y_pred</strong> (<em>Array-like</em>) – Binary model predictions</p></li>
</ul>
</dd>
<dt class="field-even">Returns<span class="colon">:</span></dt>
<dd class="field-even"><p>2x2 confusion matrix</p>
</dd>
<dt class="field-odd">Return type<span class="colon">:</span></dt>
<dd class="field-odd"><p>List[List[float]]</p>
</dd>
</dl>
</dd></dl>

<dl class="py method">
<dt class="sig sig-object py" id="langfair.metrics.classification.metrics.baseclass.metrics.Metric.evaluate">
<em class="property"><span class="pre">abstract</span><span class="w"> </span></em><span class="sig-name descname"><span class="pre">evaluate</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">groups</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_pred</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_true</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">ratio</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.evaluate" title="Link to this definition">#</a></dt>
@@ -642,6 +662,7 @@ <h1>langfair.metrics.classification.metrics.baseclass.metrics.Metric<a class="he
<ul class="visible nav section-nav flex-column">
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric"><code class="docutils literal notranslate"><span class="pre">Metric</span></code></a><ul class="nav section-nav flex-column">
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.__init__"><code class="docutils literal notranslate"><span class="pre">Metric.__init__()</span></code></a></li>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.binary_confusion_matrix"><code class="docutils literal notranslate"><span class="pre">Metric.binary_confusion_matrix()</span></code></a></li>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.baseclass.metrics.Metric.evaluate"><code class="docutils literal notranslate"><span class="pre">Metric.evaluate()</span></code></a></li>
</ul>
</li>
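The newly documented binary_confusion_matrix(y_true, y_pred) static method accepts binary labels and predictions and returns a 2x2 matrix typed as List[List[float]]. A hedged re-implementation of that contract (not langfair's actual code; the [[TN, FP], [FN, TP]] ordering is an assumption the diff does not confirm):

from typing import List, Sequence

def binary_confusion_matrix(y_true: Sequence[int], y_pred: Sequence[int]) -> List[List[float]]:
    # Tally the four cells of a binary confusion matrix.
    tn = fp = fn = tp = 0
    for truth, pred in zip(y_true, y_pred):
        if truth == 1 and pred == 1:
            tp += 1
        elif truth == 1 and pred == 0:
            fn += 1
        elif truth == 0 and pred == 1:
            fp += 1
        else:
            tn += 1
    # Assumed layout: [[TN, FP], [FN, TP]], returned as floats per the documented type.
    return [[float(tn), float(fp)], [float(fn), float(tp)]]

# Example call with toy labels and predictions.
print(binary_confusion_matrix([0, 1, 1, 0], [0, 1, 0, 1]))  # [[1.0, 1.0], [1.0, 1.0]]
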
@@ -567,15 +567,35 @@ <h1>langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRatePa
<tr class="row-odd"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.__init__" title="langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.__init__"><code class="xref py py-obj docutils literal notranslate"><span class="pre">__init__</span></code></a>()</p></td>
<td><p>This class computes false negative rate parity.</p></td>
</tr>
<tr class="row-even"><td><p><code class="xref py py-obj docutils literal notranslate"><span class="pre">binary_confusion_matrix</span></code>(y_true, y_pred)</p></td>
<td><p></p></td>
<tr class="row-even"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.binary_confusion_matrix" title="langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.binary_confusion_matrix"><code class="xref py py-obj docutils literal notranslate"><span class="pre">binary_confusion_matrix</span></code></a>(y_true, y_pred)</p></td>
<td><p>Method for computing binary confusion matrix</p></td>
</tr>
<tr class="row-odd"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.evaluate" title="langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.evaluate"><code class="xref py py-obj docutils literal notranslate"><span class="pre">evaluate</span></code></a>(groups, y_pred, y_true[, ratio])</p></td>
<td><p>This method computes disparity in false negative rates between two groups.</p></td>
</tr>
</tbody>
</table>
</div>
<dl class="py method">
<dt class="sig sig-object py" id="langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.binary_confusion_matrix">
<em class="property"><span class="pre">static</span><span class="w"> </span></em><span class="sig-name descname"><span class="pre">binary_confusion_matrix</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">y_true</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_pred</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.binary_confusion_matrix" title="Link to this definition">#</a></dt>
<dd><p>Method for computing binary confusion matrix</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters<span class="colon">:</span></dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>y_true</strong> (<em>Array-like</em>) – Binary labels (ground truth values)</p></li>
<li><p><strong>y_pred</strong> (<em>Array-like</em>) – Binary model predictions</p></li>
</ul>
</dd>
<dt class="field-even">Returns<span class="colon">:</span></dt>
<dd class="field-even"><p>2x2 confusion matrix</p>
</dd>
<dt class="field-odd">Return type<span class="colon">:</span></dt>
<dd class="field-odd"><p>List[List[float]]</p>
</dd>
</dl>
</dd></dl>

<dl class="py method">
<dt class="sig sig-object py" id="langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.evaluate">
<span class="sig-name descname"><span class="pre">evaluate</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">groups</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_pred</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_true</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">ratio</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.evaluate" title="Link to this definition">#</a></dt>
@@ -664,6 +684,7 @@ <h1>langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRatePa
<ul class="visible nav section-nav flex-column">
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity"><code class="docutils literal notranslate"><span class="pre">FalseDiscoveryRateParity</span></code></a><ul class="nav section-nav flex-column">
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.__init__"><code class="docutils literal notranslate"><span class="pre">FalseDiscoveryRateParity.__init__()</span></code></a></li>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.binary_confusion_matrix"><code class="docutils literal notranslate"><span class="pre">FalseDiscoveryRateParity.binary_confusion_matrix()</span></code></a></li>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_discovery.FalseDiscoveryRateParity.evaluate"><code class="docutils literal notranslate"><span class="pre">FalseDiscoveryRateParity.evaluate()</span></code></a></li>
</ul>
</li>
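Taken at face value, false discovery rate is the share of positive predictions that are wrong, FP / (FP + TP), and a parity metric compares that rate between two groups, as a difference by default or a ratio when ratio=True. A sketch under those assumptions (not langfair's implementation; the binary group encoding and the difference/ratio convention are guesses):

from typing import Sequence

def false_discovery_rate(y_true: Sequence[int], y_pred: Sequence[int]) -> float:
    # FDR = FP / (FP + TP): fraction of positive predictions that are incorrect.
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    return fp / (fp + tp) if (fp + tp) else 0.0

def fdr_parity(groups: Sequence[int], y_pred: Sequence[int],
               y_true: Sequence[int], ratio: bool = False) -> float:
    # Split observations by an assumed binary group label, then compare per-group rates.
    rates = []
    for g in (0, 1):
        yt = [t for grp, t in zip(groups, y_true) if grp == g]
        yp = [p for grp, p in zip(groups, y_pred) if grp == g]
        rates.append(false_discovery_rate(yt, yp))
    # Difference by default, ratio if requested (no guard here for a zero denominator).
    return rates[0] / rates[1] if ratio else abs(rates[0] - rates[1])
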
@@ -567,15 +567,35 @@ <h1>langfair.metrics.classification.metrics.false_negative.FalseNegativeRatePari
<tr class="row-odd"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.__init__" title="langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.__init__"><code class="xref py py-obj docutils literal notranslate"><span class="pre">__init__</span></code></a>()</p></td>
<td><p>This class computes false negative rate parity.</p></td>
</tr>
<tr class="row-even"><td><p><code class="xref py py-obj docutils literal notranslate"><span class="pre">binary_confusion_matrix</span></code>(y_true, y_pred)</p></td>
<td><p></p></td>
<tr class="row-even"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.binary_confusion_matrix" title="langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.binary_confusion_matrix"><code class="xref py py-obj docutils literal notranslate"><span class="pre">binary_confusion_matrix</span></code></a>(y_true, y_pred)</p></td>
<td><p>Method for computing binary confusion matrix</p></td>
</tr>
<tr class="row-odd"><td><p><a class="reference internal" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.evaluate" title="langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.evaluate"><code class="xref py py-obj docutils literal notranslate"><span class="pre">evaluate</span></code></a>(groups, y_pred, y_true[, ratio])</p></td>
<td><p>This method computes disparity in false negative rates between two groups.</p></td>
</tr>
</tbody>
</table>
</div>
<dl class="py method">
<dt class="sig sig-object py" id="langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.binary_confusion_matrix">
<em class="property"><span class="pre">static</span><span class="w"> </span></em><span class="sig-name descname"><span class="pre">binary_confusion_matrix</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">y_true</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_pred</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.binary_confusion_matrix" title="Link to this definition">#</a></dt>
<dd><p>Method for computing binary confusion matrix</p>
<dl class="field-list simple">
<dt class="field-odd">Parameters<span class="colon">:</span></dt>
<dd class="field-odd"><ul class="simple">
<li><p><strong>y_true</strong> (<em>Array-like</em>) – Binary labels (ground truth values)</p></li>
<li><p><strong>y_pred</strong> (<em>Array-like</em>) – Binary model predictions</p></li>
</ul>
</dd>
<dt class="field-even">Returns<span class="colon">:</span></dt>
<dd class="field-even"><p>2x2 confusion matrix</p>
</dd>
<dt class="field-odd">Return type<span class="colon">:</span></dt>
<dd class="field-odd"><p>List[List[float]]</p>
</dd>
</dl>
</dd></dl>

<dl class="py method">
<dt class="sig sig-object py" id="langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.evaluate">
<span class="sig-name descname"><span class="pre">evaluate</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">groups</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_pred</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">y_true</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">ratio</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.evaluate" title="Link to this definition">#</a></dt>
@@ -664,6 +684,7 @@ <h1>langfair.metrics.classification.metrics.false_negative.FalseNegativeRatePari
<ul class="visible nav section-nav flex-column">
<li class="toc-h2 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity"><code class="docutils literal notranslate"><span class="pre">FalseNegativeRateParity</span></code></a><ul class="nav section-nav flex-column">
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.__init__"><code class="docutils literal notranslate"><span class="pre">FalseNegativeRateParity.__init__()</span></code></a></li>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.binary_confusion_matrix"><code class="docutils literal notranslate"><span class="pre">FalseNegativeRateParity.binary_confusion_matrix()</span></code></a></li>
<li class="toc-h3 nav-item toc-entry"><a class="reference internal nav-link" href="#langfair.metrics.classification.metrics.false_negative.FalseNegativeRateParity.evaluate"><code class="docutils literal notranslate"><span class="pre">FalseNegativeRateParity.evaluate()</span></code></a></li>
</ul>
</li>
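FalseNegativeRateParity follows the same pattern with the false negative rate, FN / (FN + TP), substituted for the per-group rate; disparity would then be taken as a difference or ratio exactly as in the sketch above. A minimal sketch of the rate itself:

from typing import Sequence

def false_negative_rate(y_true: Sequence[int], y_pred: Sequence[int]) -> float:
    # FNR = FN / (FN + TP): fraction of actual positives the model missed.
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    return fn / (fn + tp) if (fn + tp) else 0.0

# Group-level disparity would be computed as in fdr_parity above:
# per-group rate, then a difference (default) or a ratio (ratio=True).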