ModelBuilder docs out of date #246
Those methods changed names because they are now static methods instead of properties (PR #235):

```diff
- @property
- def default_sampler_config(self) -> Dict:
+ @staticmethod
+ def get_default_sampler_config() -> Dict:

- @property
- def default_model_config(self) -> Dict:
+ @staticmethod
+ def get_default_model_config() -> Dict:
```
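For illustration, a minimal sketch of what the renamed API looks like from the caller's side. This assumes `LinearModel` has adopted the new static-method names from the PR; the exact defaults it returns are not shown here.

```python
# Sketch only: defaults are now fetched via static methods on the class
# rather than instance properties. Assumes LinearModel implements the
# renamed methods get_default_model_config / get_default_sampler_config.
from pymc_experimental.linearmodel import LinearModel

model_config = LinearModel.get_default_model_config()      # was: instance.default_model_config
sampler_config = LinearModel.get_default_sampler_config()  # was: instance.default_sampler_config
model = LinearModel(model_config=model_config, sampler_config=sampler_config)
```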
There's a test for the experimental LinearModel class that uses a scikit-learn pipeline. Adapting it to be a standalone script:

```python
import numpy as np
import pandas as pd
from sklearn import set_config
from sklearn.compose import TransformedTargetRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from pymc_experimental.linearmodel import LinearModel
set_config(transform_output="pandas")

# Toy data: a noisy linear relationship, y = 5 * x + 3 + noise
toy_X = pd.DataFrame({"input": np.linspace(start=0, stop=1, num=100)})
y = 5 * toy_X["input"] + 3
y = y + np.random.normal(0, 1, size=len(toy_X))
toy_y = pd.Series(y, name="output")

# Priors and output settings for the LinearModel
model_config = {
"intercept": {"loc": 0, "scale": 2},
"slope": {"loc": 0, "scale": 2},
"obs_error": 1,
"default_output_var": "y_hat",
}

# Scale the inputs, and scale the target around the wrapped PyMC model
model = Pipeline(
    [
        ("input_scaling", StandardScaler()),
        (
            "linear_model",
            TransformedTargetRegressor(LinearModel(model_config), transformer=StandardScaler()),
        ),
    ]
)

model.fit(toy_X, toy_y)

# Predict at new inputs drawn uniformly from [0, 1]
X_pred = pd.DataFrame({"input": np.random.uniform(low=0, high=1, size=100)})
model.predict(X_pred)
```

If you want to get the posterior predictive samples transformed, rather than just the expected value of the posterior prediction, then you need to extend `TransformedTargetRegressor`. I use this:

```python
import sklearn.compose


class TransformedTargetPYMCRegressor(sklearn.compose.TransformedTargetRegressor):
    """Add predict_posterior to sklearn.compose.TransformedTargetRegressor."""

    def predict_posterior(self, X, **predict_params):
        """Predict using the base regressor, applying inverse.

        The regressor is used to predict and the `inverse_func` or
        `inverse_transform` is applied before returning the prediction.

        Parameters
        ----------
        X : {array-like, sparse matrix} of shape (n_samples, n_features)
            Samples.
        **predict_params : dict of str -> object
            Parameters passed to the `predict_posterior` method of the underlying
            regressor.

        Returns
        -------
        y_hat : ndarray of shape (n_samples,)
            Predicted values.
        """
        # check_is_fitted(self)
        pred = self.regressor_.predict_posterior(X, **predict_params)
        # TODO: This only works if the output is reshaped to 2D. If draws & chains
        # are separate dimensions, this will fail.
        if pred.ndim == 1:
            pred_trans = self.transformer_.inverse_transform(pred.reshape(-1, 1))
        else:
            pred_trans = self.transformer_.inverse_transform(pred)
        if self._training_dim == 1 and pred_trans.ndim == 2 and pred_trans.shape[1] == 1:
            pred_trans = pred_trans.squeeze(axis=1)
        return pred_trans

    def predict_proba(self, X, **predict_params):
        # Alias so the posterior samples are reachable through a sklearn Pipeline
        return self.predict_posterior(X, **predict_params)
```

Then you can use `TransformedTargetPYMCRegressor` in the pipeline in place of `TransformedTargetRegressor`.
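To make that concrete, a hedged sketch of how the subclass might slot into the earlier pipeline. It assumes `toy_X`, `toy_y`, `X_pred`, and `model_config` from the script above are still in scope, and that `LinearModel` exposes a `predict_posterior` method:

```python
# Sketch: swap TransformedTargetPYMCRegressor in for TransformedTargetRegressor.
# Assumes toy_X, toy_y, X_pred and model_config from the script above.
model = Pipeline(
    [
        ("input_scaling", StandardScaler()),
        (
            "linear_model",
            TransformedTargetPYMCRegressor(LinearModel(model_config), transformer=StandardScaler()),
        ),
    ]
)
model.fit(toy_X, toy_y)

# Point predictions still come from predict(); the predict_proba alias is
# delegated by Pipeline to the final estimator, which routes it to
# predict_posterior and returns back-transformed posterior samples.
y_point = model.predict(X_pred)
y_samples = model.predict_proba(X_pred)
```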
Can someone do a PR that fixes the docs?
@twiecki I can prepare a PR. This will be my first contribution to the docs. I guess I just need to follow the instructions in Contributing.md.
@pdb5627 Yes, exactly. Let me know if you have any questions.
It would be good to see either:
Technically, you don't have to provide a
That's the idea of
Yeah, we want to write one, but progress on this front has been slow. Do let me know if this is something you'd like to collaborate on; we're definitely looking for partners who can help with this.
True but what if
I'm going to try and use |
Perhaps we should still keep the original API and have the sklearn one in an inherited class, like originally proposed...
Oh yeah, if there was originally a class which just converted pymc models into classes with all the configs attached and a generalised .fit(), that would be great. I'm sure loads of users would like that.
@theorashid Yeah, that's what we had, and you'd input a dict, so no shape problems.
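To make the design being discussed concrete, here is a rough, hypothetical sketch of a dict-in, generalised-`fit()` wrapper; none of these names come from pymc-experimental:

```python
# Hypothetical sketch only; class and method names are invented for illustration.
from typing import Dict

import pymc as pm


class DictModelWrapper:
    """Wrap a PyMC model together with its config and a generalised fit()."""

    def __init__(self, model_config: Dict):
        self.model_config = model_config
        self.idata = None

    def build_model(self, data: Dict) -> pm.Model:
        # Subclasses build the PyMC model from the config and a dict of arrays,
        # so no DataFrame shape handling is forced on the user.
        raise NotImplementedError

    def fit(self, data: Dict, **sample_kwargs):
        with self.build_model(data):
            self.idata = pm.sample(**sample_kwargs)
        return self.idata
```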
I don't know the original, but it would be good to allow |
Copying straight from the example (maybe it's just out-of-date docs; if so, how do I fix it?)
Then running
Runs fine if I pass some dictionaries
I tried to edit the class with
but that just changed the error to
Any ideas? Happy to do a fix PR if it's quick and easy
Also, if the docs are old (looking at the PR history, the `BayesianEstimator` and `ModelBuilder` classes have been merged), it would be great to have an example of a pymc model in a pipeline. For the project I'm working on, I currently have a pipeline in `_generate_and_preprocess_model_data()`, but it would be cool to have that outside the model.