
Add two step classifier #431

Draft
wants to merge 23 commits into base: main

Commits (23)
bf480ff
add logreg and two step classifier
anna-charlotte Jan 16, 2025
e6d3e3b
add config param to enable 2-step-classifier
anna-charlotte Jan 16, 2025
1c2065c
fix FDRManager two-step-classifier parameter
anna-charlotte Jan 16, 2025
29168e4
extract fdr utility functions due to circular import
anna-charlotte Jan 16, 2025
124260f
fix logreg initialization
anna-charlotte Jan 16, 2025
6480c6a
fix fdr_utils test
anna-charlotte Jan 16, 2025
e450f24
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 16, 2025
5850360
fix fdr_utils refactoring
anna-charlotte Jan 16, 2025
e8fcc3f
remove redundant perform_fdr_new function
anna-charlotte Jan 17, 2025
052c109
revert fdr_utils changes
anna-charlotte Jan 17, 2025
9df38a8
clean up and add docstrings
anna-charlotte Jan 17, 2025
9e8c0c8
add missing docstring
anna-charlotte Jan 17, 2025
1605cfa
clean up fdrexperimental.py
anna-charlotte Jan 17, 2025
8551503
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 17, 2025
2d2be2a
formatting
anna-charlotte Jan 17, 2025
7a1f4a7
move models to new fdr_analysis module
anna-charlotte Jan 17, 2025
ceabe7c
move files from fdr_analysis to fdrx module
anna-charlotte Jan 17, 2025
fdf62db
add max_iteration parameter to 2-step-classifier.fit_predict()
anna-charlotte Jan 20, 2025
1c8157f
Merge remote-tracking branch 'origin/main' into add-two-step-classifier
anna-charlotte Jan 20, 2025
e6d6b95
refactoring of two-step-classifier helper functions
anna-charlotte Jan 21, 2025
65a11bb
add unit tests
anna-charlotte Jan 21, 2025
ef6fc45
formatting
anna-charlotte Jan 21, 2025
584bab7
fix test for get_target_decoy_partners
anna-charlotte Jan 21, 2025
1 change: 1 addition & 0 deletions alphadia/constants/default.yaml

@@ -233,6 +233,7 @@ fdr:
keep_decoys: false
channel_wise_fdr: false
inference_strategy: "heuristic"
+ enable_two_step_classifier: false

search_output:
  peptide_level_lfq: false
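For orientation, here is a minimal sketch (not part of this diff) of how the new flag can be read from the shipped defaults. The file path matches the diff above; using PyYAML's yaml.safe_load is an assumption for illustration.

    import yaml

    # Load the default config shipped with alphadia and read the new fdr flag.
    with open("alphadia/constants/default.yaml") as f:
        config = yaml.safe_load(f)

    use_two_step = config["fdr"]["enable_two_step_classifier"]  # False by default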
4 changes: 4 additions & 0 deletions alphadia/fdrx/models/__init__.py

@@ -0,0 +1,4 @@
from .logistic_regression import LogisticRegressionClassifier
from .two_step_classifier import TwoStepClassifier

__all__ = ["LogisticRegressionClassifier", "TwoStepClassifier"]
Collaborator:
do we need the logic in this file? (especially the __all__ -> we're not using that idiom anywhere else in alphadia)
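With these re-exports in place, both models can be imported directly from the package, e.g.:

    # Import the two classifiers via the new fdrx.models package.
    from alphadia.fdrx.models import LogisticRegressionClassifier, TwoStepClassifier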

128 changes: 128 additions & 0 deletions alphadia/fdrx/models/logistic_regression.py

@@ -0,0 +1,128 @@
import logging

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

from alphadia.fdrexperimental import Classifier

logger = logging.getLogger()


class LogisticRegressionClassifier(Classifier):
def __init__(self) -> None:
"""Binary classifier using a logistic regression model."""
self.scaler = StandardScaler()
self.model = LogisticRegression()
self._fitted = False

@property
def fitted(self) -> bool:
return self._fitted

def fit(self, x: np.ndarray, y: np.ndarray) -> None:
"""Fit the classifier to the data.

Parameters
----------

x : np.array, dtype=float
Training data of shape (n_samples, n_features).

y : np.array, dtype=int
Target values of shape (n_samples,) or (n_samples, n_classes).

"""
x_scaled = self.scaler.fit_transform(x)
self.model.fit(x_scaled, y)
self._fitted = True

def predict(self, x: np.ndarray) -> np.ndarray:
"""Predict the class of the data.

Parameters
----------

x : np.array, dtype=float
Data of shape (n_samples, n_features).
Comment on lines +46 to +47
Collaborator:
at other places you are using capital X .. please choose one ;-) (check also what the rest of the code uses)

(and adapt also x_scaled)


Returns
-------

y : np.array, dtype=int
Predicted class labels of shape (n_samples,).

"""
x_scaled = self.scaler.transform(x)
return self.model.predict(x_scaled)

def predict_proba(self, x: np.ndarray) -> np.ndarray:
"""Predict the class probabilities of the data.

Parameters
----------

x : np.array, dtype=float
Data of shape (n_samples, n_features).

Returns
-------

y : np.array, dtype=float
Predicted class probabilities of shape (n_samples, n_classes).

"""
x_scaled = self.scaler.transform(x)
return self.model.predict_proba(x_scaled)

def to_state_dict(self) -> dict:
"""Return the state of the classifier as a dictionary.

Returns
-------

state_dict : dict
Dictionary containing the state of the classifier.

"""
state_dict = {"_fitted": self._fitted}

if self._fitted:
state_dict.update(
{
"scaler_mean": self.scaler.mean_,
"scaler_var": self.scaler.var_,
"scaler_scale": self.scaler.scale_,
"scaler_n_samples_seen": self.scaler.n_samples_seen_,
"model_coef": self.model.coef_,
"model_intercept": self.model.intercept_,
"model_classes": self.model.classes_,
"is_fitted": self._fitted,
}
)

return state_dict

def from_state_dict(self, state_dict: dict) -> None:
"""Load the state of the classifier from a dictionary.

Parameters
----------

state_dict : dict
Dictionary containing the state of the classifier.

"""
self._fitted = state_dict["_fitted"]

if self.fitted:
Collaborator:
please check if this should rather be if self._fitted:? if not, add a comment which deconfuses me :-)

self.scaler = StandardScaler()
self.scaler.mean_ = np.array(state_dict["scaler_mean"])
self.scaler.var_ = np.array(state_dict["scaler_var"])
self.scaler.scale_ = np.array(state_dict["scaler_scale"])
self.scaler.n_samples_seen_ = np.array(state_dict["scaler_n_samples_seen"])

self.model = LogisticRegression()
self.model.coef_ = np.array(state_dict["model_coef"])
self.model.intercept_ = np.array(state_dict["model_intercept"])
self.model.classes_ = np.array(state_dict["model_classes"])
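To illustrate the intended API, a short round-trip sketch on synthetic data (the class and method names are exactly those defined in the file above; the data itself is made up for illustration):

    import numpy as np

    from alphadia.fdrx.models import LogisticRegressionClassifier

    # Synthetic two-class data: 100 samples, 5 features.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(100, 5))
    y = (X[:, 0] > 0).astype(int)

    clf = LogisticRegressionClassifier()
    clf.fit(X, y)
    probabilities = clf.predict_proba(X)  # shape (100, 2)

    # Serialize the fitted state and restore it into a fresh instance.
    state = clf.to_state_dict()
    restored = LogisticRegressionClassifier()
    restored.from_state_dict(state)
    assert np.allclose(restored.predict_proba(X), probabilities)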