
Add two step classifier #431

Merged: 28 commits, merged Jan 24, 2025
Changes from 23 commits
Commits (28)
bf480ff
add logreg and two step classifer
anna-charlotte Jan 16, 2025
e6d3e3b
add config param to enable 2-step-classifier
anna-charlotte Jan 16, 2025
1c2065c
fix FDRManger two-step-classifier parameter
anna-charlotte Jan 16, 2025
29168e4
extract fdr utility functions due to circular import
anna-charlotte Jan 16, 2025
124260f
fix logreg initialization
anna-charlotte Jan 16, 2025
6480c6a
fix fdr_utils test
anna-charlotte Jan 16, 2025
e450f24
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 16, 2025
5850360
fix fdr_utils refactoring
anna-charlotte Jan 16, 2025
e8fcc3f
remove redundant perform_fdr_new function
anna-charlotte Jan 17, 2025
052c109
revert fdr_utils changes
anna-charlotte Jan 17, 2025
9df38a8
clean up and add docstrings
anna-charlotte Jan 17, 2025
9e8c0c8
add missing docstring
anna-charlotte Jan 17, 2025
1605cfa
clean up fdrexperimental.py
anna-charlotte Jan 17, 2025
8551503
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 17, 2025
2d2be2a
formatting
anna-charlotte Jan 17, 2025
7a1f4a7
move models to new fdr_analysis module
anna-charlotte Jan 17, 2025
ceabe7c
move files from fdr_analysis to fdrx module
anna-charlotte Jan 17, 2025
fdf62db
add max_iteration parameter to 2-step-classifier.fit_predict()
anna-charlotte Jan 20, 2025
1c8157f
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 20, 2025
e6d6b95
refactoring of two-step-classifier helper functions
anna-charlotte Jan 21, 2025
65a11bb
add unit tests
anna-charlotte Jan 21, 2025
ef6fc45
formatting
anna-charlotte Jan 21, 2025
584bab7
fix test for get_target_decoy_partners
anna-charlotte Jan 21, 2025
c5e3eed
addressing PR comments
anna-charlotte Jan 24, 2025
21da1be
addressing PR comments
anna-charlotte Jan 24, 2025
7dd43a1
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 24, 2025
8c5547a
fix formatting
anna-charlotte Jan 24, 2025
3d602af
addressing pr comments, private attributes
anna-charlotte Jan 24, 2025
1 change: 1 addition & 0 deletions alphadia/constants/default.yaml
@@ -233,6 +233,7 @@ fdr:
keep_decoys: false
channel_wise_fdr: false
inference_strategy: "heuristic"
enable_two_step_classifier: false

search_output:
peptide_level_lfq: false
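The new `enable_two_step_classifier` flag defaults to `false`, so existing configs keep the current single-classifier behavior. A minimal sketch of how such a boolean flag might gate classifier selection follows; the function and the returned names are hypothetical placeholders, not alphadia's actual code:

```python
# Hypothetical sketch (not alphadia's implementation): a boolean config
# flag such as `enable_two_step_classifier` gates which classifier path
# is taken. The returned strings stand in for the real classifier objects.

def build_classifier(fdr_config: dict) -> str:
    """Pick a classifier path based on the config flag (illustrative only)."""
    if fdr_config.get("enable_two_step_classifier", False):
        return "TwoStepClassifier"  # placeholder for the two-step path
    return "SingleStepClassifier"  # placeholder for the default path

config = {"keep_decoys": False, "enable_two_step_classifier": False}
print(build_classifier(config))  # the flag defaults to the single-step path
```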
4 changes: 4 additions & 0 deletions alphadia/fdrx/models/__init__.py
@@ -0,0 +1,4 @@
from .logistic_regression import LogisticRegressionClassifier
from .two_step_classifier import TwoStepClassifier

__all__ = ["LogisticRegressionClassifier", "TwoStepClassifier"]
128 changes: 128 additions & 0 deletions alphadia/fdrx/models/logistic_regression.py
@@ -0,0 +1,128 @@
import logging

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

from alphadia.fdrexperimental import Classifier

logger = logging.getLogger()


class LogisticRegressionClassifier(Classifier):
def __init__(self) -> None:
"""Binary classifier using a logistic regression model."""
self.scaler = StandardScaler()
self.model = LogisticRegression()
self._fitted = False

@property
def fitted(self) -> bool:
return self._fitted

def fit(self, x: np.ndarray, y: np.ndarray) -> None:
"""Fit the classifier to the data.

Parameters
----------

x : np.array, dtype=float
Training data of shape (n_samples, n_features).

y : np.array, dtype=int
Target values of shape (n_samples,) or (n_samples, n_classes).

"""
x_scaled = self.scaler.fit_transform(x)
self.model.fit(x_scaled, y)
self._fitted = True

def predict(self, x: np.ndarray) -> np.ndarray:
"""Predict the class of the data.

Parameters
----------

x : np.array, dtype=float
Data of shape (n_samples, n_features).

Returns
-------

y : np.array, dtype=int
Predicted class labels of shape (n_samples,).

"""
x_scaled = self.scaler.transform(x)
return self.model.predict(x_scaled)

def predict_proba(self, x: np.ndarray) -> np.ndarray:
"""Predict the class probabilities of the data.

Parameters
----------

x : np.array, dtype=float
Data of shape (n_samples, n_features).

Returns
-------

y : np.array, dtype=float
Predicted class probabilities of shape (n_samples, n_classes).

"""
x_scaled = self.scaler.transform(x)
return self.model.predict_proba(x_scaled)

def to_state_dict(self) -> dict:
"""Return the state of the classifier as a dictionary.

Returns
-------

dict : dict
Dictionary containing the state of the classifier.

"""
state_dict = {"_fitted": self._fitted}

if self._fitted:
state_dict.update(
{
"scaler_mean": self.scaler.mean_,
"scaler_var": self.scaler.var_,
"scaler_scale": self.scaler.scale_,
"scaler_n_samples_seen": self.scaler.n_samples_seen_,
"model_coef": self.model.coef_,
"model_intercept": self.model.intercept_,
"model_classes": self.model.classes_,
"is_fitted": self._fitted,
}
)

return state_dict

def from_state_dict(self, state_dict: dict) -> None:
"""Load the state of the classifier from a dictionary.

Parameters
----------

state_dict : dict
Dictionary containing the state of the classifier.

"""
self._fitted = state_dict["_fitted"]

if self.fitted:
Collaborator:
please check if this should rather be if self._fitted:? if not, add a comment which deconfuses me :-)

Contributor Author:

I figured that since we implement Classifier, which has the @property def fitted(self): ..., one would use self._fitted for setting but self.fitted for accessing. Is that right? If so, I'll add a comment; otherwise I'll change it 😃

Collaborator:

as it's the class accessing its own instance variable, it's fine to use self._fitted (this is logically equivalent, since the property here is a 1:1 wrapper that makes self._fitted public).
More than "fine", actually: I would prefer it, for consistency :-)

@GeorgWa why do we have these properties anyway? they don't seem to be used

Collaborator:

@anna-charlotte just FYI in case you also want to set your properties, you'd use the @value.setter decorator

class DummyClass:
    def __init__(self, value):
        self._value = value

    @property
    def value(self):
        print("getter")
        return self._value

    @value.setter
    def value(self, new_value):
        print("setter")
        self._value = new_value

Contributor Author:

yeah, I figured we don't have the setter here, as we would want the fitted attribute to be read-only for the user. Otherwise, would there be an advantage in this property being private, versus just having a .fitted attribute?
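The read-only behavior discussed above can be demonstrated in a small, self-contained sketch: a property without a setter lets internal code mutate the private backing attribute, while external assignment to the public name raises AttributeError.

```python
# Illustrative sketch of the read-only property pattern: no setter is
# defined, so assigning to the property from outside raises
# AttributeError, while the class itself writes the private attribute.

class Model:
    def __init__(self):
        self._fitted = False  # private backing attribute

    @property
    def fitted(self) -> bool:
        """Read-only public view of the private flag."""
        return self._fitted

    def fit(self):
        self._fitted = True  # internal code writes the private name

m = Model()
m.fit()
assert m.fitted is True

try:
    m.fitted = False  # no setter defined, so this fails
except AttributeError:
    print("fitted is read-only")
```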

self.scaler = StandardScaler()
self.scaler.mean_ = np.array(state_dict["scaler_mean"])
self.scaler.var_ = np.array(state_dict["scaler_var"])
self.scaler.scale_ = np.array(state_dict["scaler_scale"])
self.scaler.n_samples_seen_ = np.array(state_dict["scaler_n_samples_seen"])

self.model = LogisticRegression()
self.model.coef_ = np.array(state_dict["model_coef"])
self.model.intercept_ = np.array(state_dict["model_intercept"])
self.model.classes_ = np.array(state_dict["model_classes"])
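The to_state_dict / from_state_dict pair above serializes the fitted scaler and model attributes into a plain dictionary and restores them into fresh instances. The same round-trip idea can be sketched without any dependencies; TinyLogReg and its attribute names below are hypothetical stand-ins, not alphadia's API:

```python
# Dependency-free sketch (hypothetical class, not alphadia's API) of the
# state-dict round trip: fitted parameters are exported to a plain dict
# and loaded into a fresh instance, which then predicts identically.
import math

class TinyLogReg:
    def __init__(self):
        self.coef = None
        self.intercept = None
        self._fitted = False

    def predict_proba(self, x):
        # logistic model: sigmoid of the linear combination
        z = sum(c * xi for c, xi in zip(self.coef, x)) + self.intercept
        return 1.0 / (1.0 + math.exp(-z))

    def to_state_dict(self) -> dict:
        state = {"_fitted": self._fitted}
        if self._fitted:  # only export parameters that exist after fitting
            state.update({"coef": self.coef, "intercept": self.intercept})
        return state

    def from_state_dict(self, state: dict) -> None:
        self._fitted = state["_fitted"]
        if self._fitted:
            self.coef = state["coef"]
            self.intercept = state["intercept"]

model = TinyLogReg()
model.coef, model.intercept, model._fitted = [0.5, -0.25], 0.1, True

clone = TinyLogReg()
clone.from_state_dict(model.to_state_dict())
assert clone.predict_proba([1.0, 2.0]) == model.predict_proba([1.0, 2.0])
```

Guarding the parameter export behind the fitted flag, as the PR does, keeps an unfitted classifier serializable without touching attributes that do not exist yet.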