From c9ed838c9d5b47a3de8501494c11705443e89a7d Mon Sep 17 00:00:00 2001 From: ph_ Date: Tue, 2 Jun 2020 15:54:22 -0400 Subject: [PATCH 1/3] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 596e8f7..d155306 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ ![A responsible machine learning workingflow](/img/rml_diagram_no_hilite.png) -A Responsible Machine Learning Workflow Diagram. **Source:** [*Information*, 11(3) (March 2020)](https://www.mdpi.com/2078-2489/11/3) +A Responsible Machine Learning Workflow Diagram. **Source:** [*Information*, 11(3) (March 2020)](https://www.mdpi.com/2078-2489/11/3). ## GWU_DNSC 6290: Course Outline From 30f56194e8e3a28e3a2fd084c566c18419df7671 Mon Sep 17 00:00:00 2001 From: patrickh Date: Wed, 3 Jun 2020 21:11:56 -0400 Subject: [PATCH 2/3] lecture 3 --- lecture_3.ipynb | 7816 +++++++++++++++++++++++++++++++++++++++++++++ rmltk/debug.py | 73 + rmltk/evaluate.py | 107 +- 3 files changed, 7995 insertions(+), 1 deletion(-) create mode 100644 lecture_3.ipynb diff --git a/lecture_3.ipynb b/lecture_3.ipynb new file mode 100644 index 0000000..0f851af --- /dev/null +++ b/lecture_3.ipynb @@ -0,0 +1,7816 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## License \n", + "\n", + "Copyright 2020 Patrick Hall (jphall@gwu.edu)\n", + "\n", + "Licensed under the Apache License, Version 2.0 (the \"License\");\n", + "you may not use this file except in compliance with the License.\n", + "You may obtain a copy of the License at\n", + "\n", + " http://www.apache.org/licenses/LICENSE-2.0\n", + "\n", + "Unless required by applicable law or agreed to in writing, software\n", + "distributed under the License is distributed on an \"AS IS\" BASIS,\n", + "WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n", + "See the License for the specific language governing permissions and\n", + "limitations under the License." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**DISCLAIMER:** This notebook is not legal compliance advice." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "***" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Testing a Constrained Model for Discrimination and Remediating Discovered Discrimination\n", + "\n", + "Fairness is a difficult and complex topic. So much so that leading scholars have yet to agree on a strict definition. However, there is a practical way to discuss and handle observational fairness, or how your model predictions affect different groups of people. This procedure is often known as disparate impact analysis (DIA). This example DIA notebook starts by loading a pre-trained constrained, monotonic gradient boosting machine (GBM) classifier. A probability cutoff for making credit decisions is selected by maximizing Youden's J statistic and confusion matrices are generated to summarize the GBM’s decisions across male and female customers. A basic DIA procedure is then conducted using the information stored in the confusion matrices and several traditional fair lending measures are also calculated.\n", + "\n", + "Because DIA only considers groups of people, it's also important to look for any local, or individual, discrimination that would not be flagged in the group fairness quantities. This notebook closes by illustrating a basic search for cases of individual discrimination." 
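To make the group-level arithmetic behind DIA concrete before the notebook cells begin, here is a minimal sketch of building per-group confusion matrices at a chosen cutoff and comparing acceptance rates through an adverse impact ratio (AIR). This is an illustration only, not the rmltk routines used later in the notebook; the helper name, its output layout, and the example call are hypothetical, although the column names and the 0.22 cutoff match the data and results shown below.

```python
# Illustrative sketch only: group confusion matrices and an adverse impact ratio (AIR).
# Not the rmltk implementation used later in this notebook; names here are hypothetical.
import pandas as pd

def group_confusion_and_air(frame, y, yhat, group, cutoff, reference_level):
    """Per-group confusion matrix cells, acceptance rates, and AIR vs. a reference group.
    'Acceptance' here means a predicted non-default (probability <= cutoff)."""
    rows = {}
    for level, g in frame.groupby(group):
        decision = (g[yhat] > cutoff).astype(int)  # 1 = predicted default
        rows[level] = {
            'tp': int(((decision == 1) & (g[y] == 1)).sum()),
            'fp': int(((decision == 1) & (g[y] == 0)).sum()),
            'tn': int(((decision == 0) & (g[y] == 0)).sum()),
            'fn': int(((decision == 0) & (g[y] == 1)).sum()),
            'acceptance_rate': float((decision == 0).mean())}
    ref_rate = rows[reference_level]['acceptance_rate']
    for level in rows:
        rows[level]['air'] = rows[level]['acceptance_rate'] / ref_rate
    return pd.DataFrame(rows).T

# hypothetical usage with the frames built later in the notebook:
# group_confusion_and_air(valid_yhat, 'DEFAULT_NEXT_MONTH', 'p_DEFAULT_NEXT_MONTH',
#                         'SEX', 0.22, reference_level='male')
```

Framing "acceptance" as a predicted non-default mirrors the credit-decision reading of the confusion matrices used throughout the notebook.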
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Global hyperparameters" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "SEED = 12345 # global random seed for better reproducibility" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Python imports and inits" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [
+ "Checking whether there is an H2O instance running at http://localhost:54321 ..... not found.\n",
+ "Attempting to start a local H2O server...\n",
+ " Java Version: openjdk version \"1.8.0_252\"; OpenJDK Runtime Environment (build 1.8.0_252-8u252-b09-1~18.04-b09); OpenJDK 64-Bit Server VM (build 25.252-b09, mixed mode)\n",
+ " Starting server from /home/patrickh/Workspace/GWU_rml/env_rml/lib/python3.6/site-packages/h2o/backend/bin/h2o.jar\n",
+ " Ice root: /tmp/tmpwbigsuw9\n",
+ " JVM stdout: /tmp/tmpwbigsuw9/h2o_patrickh_started_from_python.out\n",
+ " JVM stderr: /tmp/tmpwbigsuw9/h2o_patrickh_started_from_python.err\n",
+ " Server is running at http://127.0.0.1:54321\n",
+ "Connecting to H2O server at http://127.0.0.1:54321 ... successful.\n",
+ "Warning: Your H2O cluster version is too old (9 months and 10 days)! Please download and install the latest version from http://h2o.ai/download/\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "
H2O cluster uptime:00 secs
H2O cluster timezone:America/New_York
H2O data parsing timezone:UTC
H2O cluster version:3.26.0.3
H2O cluster version age:9 months and 10 days !!!
H2O cluster name:H2O_from_python_patrickh_8fev5r
H2O cluster total nodes:1
H2O cluster free memory:1.879 Gb
H2O cluster total cores:24
H2O cluster allowed cores:24
H2O cluster status:accepting new members, healthy
H2O connection url:http://127.0.0.1:54321
H2O connection proxy:None
H2O internal security:False
H2O API Extensions:Amazon S3, XGBoost, Algos, AutoML, Core V3, Core V4
Python version:3.6.9 final
" + ], + "text/plain": [ + "-------------------------- ---------------------------------------------------\n", + "H2O cluster uptime: 00 secs\n", + "H2O cluster timezone: America/New_York\n", + "H2O data parsing timezone: UTC\n", + "H2O cluster version: 3.26.0.3\n", + "H2O cluster version age: 9 months and 10 days !!!\n", + "H2O cluster name: H2O_from_python_patrickh_8fev5r\n", + "H2O cluster total nodes: 1\n", + "H2O cluster free memory: 1.879 Gb\n", + "H2O cluster total cores: 24\n", + "H2O cluster allowed cores: 24\n", + "H2O cluster status: accepting new members, healthy\n", + "H2O connection url: http://127.0.0.1:54321\n", + "H2O connection proxy:\n", + "H2O internal security: False\n", + "H2O API Extensions: Amazon S3, XGBoost, Algos, AutoML, Core V3, Core V4\n", + "Python version: 3.6.9 final\n", + "-------------------------- ---------------------------------------------------" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "from rmltk import debug, evaluate, model # simple module for evaluating, debugging, and training models\n", + "\n", + "# h2o Python API with specific classes\n", + "import h2o \n", + "from h2o.estimators.gbm import H2OGradientBoostingEstimator # for GBM\n", + "\n", + "import numpy as np # array, vector, matrix calculations\n", + "import pandas as pd # DataFrame handling\n", + "\n", + "import matplotlib.pyplot as plt # general plotting\n", + "pd.options.display.max_columns = 999 # enable display of all columns in notebook\n", + "\n", + "# display plots in-notebook\n", + "%matplotlib inline \n", + "\n", + "h2o.init(max_mem_size='2G') # start h2o\n", + "h2o.remove_all() # remove any existing data structures from h2o memory" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 1. Download, Explore, and Prepare UCI Credit Card Default Data\n", + "\n", + "UCI credit card default data: https://archive.ics.uci.edu/ml/datasets/default+of+credit+card+clients\n", + "\n", + "The UCI credit card default data contains demographic and payment information about credit card customers in Taiwan in the year 2005. The data set contains 23 input variables: \n", + "\n", + "* **`LIMIT_BAL`**: Amount of given credit (NT dollar)\n", + "* **`SEX`**: 1 = male; 2 = female\n", + "* **`EDUCATION`**: 1 = graduate school; 2 = university; 3 = high school; 4 = others \n", + "* **`MARRIAGE`**: 1 = married; 2 = single; 3 = others\n", + "* **`AGE`**: Age in years \n", + "* **`PAY_0`, `PAY_2` - `PAY_6`**: History of past payment; `PAY_0` = the repayment status in September, 2005; `PAY_2` = the repayment status in August, 2005; ...; `PAY_6` = the repayment status in April, 2005. The measurement scale for the repayment status is: -1 = pay duly; 1 = payment delay for one month; 2 = payment delay for two months; ...; 8 = payment delay for eight months; 9 = payment delay for nine months and above. \n", + "* **`BILL_AMT1` - `BILL_AMT6`**: Amount of bill statement (NT dollar). `BILL_AMNT1` = amount of bill statement in September, 2005; `BILL_AMT2` = amount of bill statement in August, 2005; ...; `BILL_AMT6` = amount of bill statement in April, 2005. \n", + "* **`PAY_AMT1` - `PAY_AMT6`**: Amount of previous payment (NT dollar). `PAY_AMT1` = amount paid in September, 2005; `PAY_AMT2` = amount paid in August, 2005; ...; `PAY_AMT6` = amount paid in April, 2005. \n", + "\n", + "Demographic variables will not be used as model inputs as is common in credit scoring models. 
However, demographic variables will be used after model training to test for disparate impact." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Import data and clean" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "# import XLS file\n", + "path = 'default_of_credit_card_clients.xls'\n", + "data = pd.read_excel(path,\n", + " skiprows=1)\n", + "\n", + "# remove spaces from target column name \n", + "data = data.rename(columns={'default payment next month': 'DEFAULT_NEXT_MONTH'}) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Assign modeling roles" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "y = DEFAULT_NEXT_MONTH\n", + "X = ['LIMIT_BAL', 'PAY_0', 'PAY_2', 'PAY_3', 'PAY_4', 'PAY_5', 'PAY_6', 'BILL_AMT1', 'BILL_AMT2', 'BILL_AMT3', 'BILL_AMT4', 'BILL_AMT5', 'BILL_AMT6', 'PAY_AMT1', 'PAY_AMT2', 'PAY_AMT3', 'PAY_AMT4', 'PAY_AMT5', 'PAY_AMT6']\n" + ] + } + ], + "source": [ + "# assign target and inputs for GBM\n", + "y_name = 'DEFAULT_NEXT_MONTH'\n", + "x_names = [name for name in data.columns if name not in [y_name, 'ID', 'AGE', 'EDUCATION', 'MARRIAGE', 'SEX']]\n", + "print('y =', y_name)\n", + "print('X =', x_names)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Helper function for recoding values in the UCI credit card default data\n", + "This simple function maps the original integer values of the demographic input variables found in the dataset to longer, more understandable character string values taken from the UCI credit card default data dictionary." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Parse progress: |█████████████████████████████████████████████████████████| 100%\n" + ] + } + ], + "source": [ + "def recode_cc_data(frame):\n", + " \n", + " \"\"\" Recodes numeric categorical variables into categorical character variables\n", + " with more transparent values. \n", + " \n", + " Args:\n", + " frame: Pandas DataFrame version of UCI credit card default data.\n", + " \n", + " Returns: \n", + " H2OFrame with recoded values.\n", + " \n", + " \"\"\"\n", + " \n", + " # define recoded values\n", + " sex_dict = {1:'male', 2:'female'}\n", + " education_dict = {0:'other', 1:'graduate school', 2:'university', 3:'high school', \n", + " 4:'other', 5:'other', 6:'other'}\n", + " marriage_dict = {0:'other', 1:'married', 2:'single', 3:'divorced'}\n", + " \n", + " # recode values using apply() and lambda function\n", + " frame['SEX'] = frame['SEX'].apply(lambda i: sex_dict[i])\n", + " frame['EDUCATION'] = frame['EDUCATION'].apply(lambda i: education_dict[i]) \n", + " frame['MARRIAGE'] = frame['MARRIAGE'].apply(lambda i: marriage_dict[i]) \n", + " \n", + " return h2o.H2OFrame(frame)\n", + "\n", + "data = recode_cc_data(data)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Split data into training and validation partitions\n", + "Fairness metrics will be calculated for the validation data to give a better idea of how these measures will look on future, unseen data."
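Before the data are split in the next cell, a quick, purely illustrative aside (not part of the original notebook): tabulating observed default rates by demographic group gives a simple baseline to compare against the model-driven disparate impact measures later on. It assumes only the recoded `data` H2OFrame created above.

```python
# Illustrative aside (not part of the original notebook): observed default rates by
# demographic group in the recoded data, as a baseline for the disparate impact
# analysis later on. `data` is the recoded H2OFrame created above.
data_pd = data.as_data_frame()  # convert the H2OFrame back to pandas for quick tabulation
for col in ['SEX', 'MARRIAGE', 'EDUCATION']:
    print('\nObserved default rate by %s:' % col)
    print(data_pd.groupby(col)['DEFAULT_NEXT_MONTH'].mean().round(3))
```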
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Train data rows = 21060, columns = 25\n", + "Validation data rows = 8940, columns = 25\n" + ] + } + ], + "source": [ + "# split into training and validation\n", + "train, valid = data.split_frame([0.7], seed=12345)\n", + "\n", + "# summarize split\n", + "print('Train data rows = %d, columns = %d' % (train.shape[0], train.shape[1]))\n", + "print('Validation data rows = %d, columns = %d' % (valid.shape[0], valid.shape[1]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2. Load Pre-trained Monotonic GBM\n", + "Load the model known as `mgbm5` from the first lecture." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Model Details\n", + "=============\n", + "H2OGradientBoostingEstimator : Gradient Boosting Machine\n", + "Model Key: best_mgbm\n", + "\n", + "\n", + "Model Summary: " + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
number_of_treesnumber_of_internal_treesmodel_size_in_bytesmin_depthmax_depthmean_depthmin_leavesmax_leavesmean_leaves
046.046.06939.03.03.03.05.08.07.369565
\n", + "
" + ], + "text/plain": [ + " number_of_trees number_of_internal_trees model_size_in_bytes \\\n", + "0 46.0 46.0 6939.0 \n", + "\n", + " min_depth max_depth mean_depth min_leaves max_leaves mean_leaves \n", + "0 3.0 3.0 3.0 5.0 8.0 7.369565 " + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "\n", + "ModelMetricsBinomial: gbm\n", + "** Reported on train data. **\n", + "\n", + "MSE: 0.13637719864300343\n", + "RMSE: 0.3692928358945018\n", + "LogLoss: 0.4351274080189972\n", + "Mean Per-Class Error: 0.2913939696264273\n", + "AUC: 0.7716491282246187\n", + "pr_auc: 0.5471826859054356\n", + "Gini: 0.5432982564492375\n", + "\n", + "Confusion Matrix (Act/Pred) for max f1 @ threshold = 0.21968260039166268: " + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
01ErrorRate
0013482.02814.00.1727(2814.0/16296.0)
111907.02743.00.4101(1907.0/4650.0)
2Total15389.05557.00.2254(4721.0/20946.0)
\n", + "
" + ], + "text/plain": [ + " 0 1 Error Rate\n", + "0 0 13482.0 2814.0 0.1727 (2814.0/16296.0)\n", + "1 1 1907.0 2743.0 0.4101 (1907.0/4650.0)\n", + "2 Total 15389.0 5557.0 0.2254 (4721.0/20946.0)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Maximum Metrics: Maximum metrics at their respective thresholds\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
metricthresholdvalueidx
0max f10.2196830.537474248.0
1max f20.1278590.630227329.0
2max f0point50.4466990.583033147.0
3max accuracy0.4466990.821493147.0
4max precision0.9502471.0000000.0
5max recall0.0506091.000000395.0
6max specificity0.9502471.0000000.0
7max absolute_mcc0.3251590.413494194.0
8max min_per_class_accuracy0.1775420.698495281.0
9max mean_per_class_accuracy0.2196830.708606248.0
\n", + "
" + ], + "text/plain": [ + " metric threshold value idx\n", + "0 max f1 0.219683 0.537474 248.0\n", + "1 max f2 0.127859 0.630227 329.0\n", + "2 max f0point5 0.446699 0.583033 147.0\n", + "3 max accuracy 0.446699 0.821493 147.0\n", + "4 max precision 0.950247 1.000000 0.0\n", + "5 max recall 0.050609 1.000000 395.0\n", + "6 max specificity 0.950247 1.000000 0.0\n", + "7 max absolute_mcc 0.325159 0.413494 194.0\n", + "8 max min_per_class_accuracy 0.177542 0.698495 281.0\n", + "9 max mean_per_class_accuracy 0.219683 0.708606 248.0" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Gains/Lift Table: Avg response rate: 22.20 %, avg score: 22.00 %\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
groupcumulative_data_fractionlower_thresholdliftcumulative_liftresponse_ratescorecumulative_response_ratecumulative_scorecapture_ratecumulative_capture_rategaincumulative_gain
010.0100740.8139273.6078833.6078830.8009480.8434460.8009480.8434460.0363440.036344260.788259260.788259
120.0203380.7955753.5198083.5634320.7813950.8051530.7910800.8241190.0361290.072473251.980795256.343177
230.0303160.7636793.4053283.5113940.7559810.7839700.7795280.8109050.0339780.106452240.532798251.139446
340.0400080.7151383.2618913.4509540.7241380.7398150.7661100.7936840.0316130.138065226.189099245.095388
450.0500810.6644163.1168693.3837550.6919430.6866950.7511920.7721640.0313980.169462211.686898238.375473
560.1000190.5433842.8594633.1219840.6347990.6017940.6930790.6871010.1427960.312258185.946339212.198445
670.1500050.3662372.2242932.8228490.4937920.4469510.6266710.6070760.1111830.423441122.429306182.284922
780.2056720.2927651.5955102.4906590.3542020.3127770.5529250.5274220.0888170.51225859.551043149.065864
890.3012510.1966481.1745042.0730770.2607390.2344990.4602220.4344850.1122580.62451617.450421107.307684
9100.4000290.1738170.8643271.7746040.1918800.1848440.3939610.3728420.0853760.709892-13.56728477.460410
10110.5002860.1514310.7014181.5595370.1557140.1613350.3462160.3304550.0703230.780215-29.85824955.953665
11120.6003060.1312140.6192371.4028700.1374700.1407090.3114360.2988410.0619350.842151-38.07634240.286982
12130.7006590.1147940.5593141.2820500.1241670.1228170.2846140.2736300.0561290.898280-44.06856828.204987
13140.8008210.1022260.3692931.1678870.0819830.1080620.2592700.2529210.0369890.935269-63.07069716.788724
14150.9045640.0918610.4021521.0800660.0892770.0975240.2397740.2350990.0417200.976989-59.7848088.006633
15161.0000000.0348100.2411121.0000000.0535270.0769890.2219990.2200100.0230111.000000-75.8887830.000000
\n", + "
" + ], + "text/plain": [ + " group cumulative_data_fraction lower_threshold lift \\\n", + "0 1 0.010074 0.813927 3.607883 \n", + "1 2 0.020338 0.795575 3.519808 \n", + "2 3 0.030316 0.763679 3.405328 \n", + "3 4 0.040008 0.715138 3.261891 \n", + "4 5 0.050081 0.664416 3.116869 \n", + "5 6 0.100019 0.543384 2.859463 \n", + "6 7 0.150005 0.366237 2.224293 \n", + "7 8 0.205672 0.292765 1.595510 \n", + "8 9 0.301251 0.196648 1.174504 \n", + "9 10 0.400029 0.173817 0.864327 \n", + "10 11 0.500286 0.151431 0.701418 \n", + "11 12 0.600306 0.131214 0.619237 \n", + "12 13 0.700659 0.114794 0.559314 \n", + "13 14 0.800821 0.102226 0.369293 \n", + "14 15 0.904564 0.091861 0.402152 \n", + "15 16 1.000000 0.034810 0.241112 \n", + "\n", + " cumulative_lift response_rate score cumulative_response_rate \\\n", + "0 3.607883 0.800948 0.843446 0.800948 \n", + "1 3.563432 0.781395 0.805153 0.791080 \n", + "2 3.511394 0.755981 0.783970 0.779528 \n", + "3 3.450954 0.724138 0.739815 0.766110 \n", + "4 3.383755 0.691943 0.686695 0.751192 \n", + "5 3.121984 0.634799 0.601794 0.693079 \n", + "6 2.822849 0.493792 0.446951 0.626671 \n", + "7 2.490659 0.354202 0.312777 0.552925 \n", + "8 2.073077 0.260739 0.234499 0.460222 \n", + "9 1.774604 0.191880 0.184844 0.393961 \n", + "10 1.559537 0.155714 0.161335 0.346216 \n", + "11 1.402870 0.137470 0.140709 0.311436 \n", + "12 1.282050 0.124167 0.122817 0.284614 \n", + "13 1.167887 0.081983 0.108062 0.259270 \n", + "14 1.080066 0.089277 0.097524 0.239774 \n", + "15 1.000000 0.053527 0.076989 0.221999 \n", + "\n", + " cumulative_score capture_rate cumulative_capture_rate gain \\\n", + "0 0.843446 0.036344 0.036344 260.788259 \n", + "1 0.824119 0.036129 0.072473 251.980795 \n", + "2 0.810905 0.033978 0.106452 240.532798 \n", + "3 0.793684 0.031613 0.138065 226.189099 \n", + "4 0.772164 0.031398 0.169462 211.686898 \n", + "5 0.687101 0.142796 0.312258 185.946339 \n", + "6 0.607076 0.111183 0.423441 122.429306 \n", + "7 0.527422 0.088817 0.512258 59.551043 \n", + "8 0.434485 0.112258 0.624516 17.450421 \n", + "9 0.372842 0.085376 0.709892 -13.567284 \n", + "10 0.330455 0.070323 0.780215 -29.858249 \n", + "11 0.298841 0.061935 0.842151 -38.076342 \n", + "12 0.273630 0.056129 0.898280 -44.068568 \n", + "13 0.252921 0.036989 0.935269 -63.070697 \n", + "14 0.235099 0.041720 0.976989 -59.784808 \n", + "15 0.220010 0.023011 1.000000 -75.888783 \n", + "\n", + " cumulative_gain \n", + "0 260.788259 \n", + "1 256.343177 \n", + "2 251.139446 \n", + "3 245.095388 \n", + "4 238.375473 \n", + "5 212.198445 \n", + "6 182.284922 \n", + "7 149.065864 \n", + "8 107.307684 \n", + "9 77.460410 \n", + "10 55.953665 \n", + "11 40.286982 \n", + "12 28.204987 \n", + "13 16.788724 \n", + "14 8.006633 \n", + "15 0.000000 " + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "\n", + "ModelMetricsBinomial: gbm\n", + "** Reported on validation data. **\n", + "\n", + "MSE: 0.13326994104124376\n", + "RMSE: 0.3650615578792757\n", + "LogLoss: 0.4278285715046422\n", + "Mean Per-Class Error: 0.2856607030196092\n", + "AUC: 0.7776380047998697\n", + "pr_auc: 0.5486322626112021\n", + "Gini: 0.5552760095997393\n", + "\n", + "Confusion Matrix (Act/Pred) for max f1 @ threshold = 0.27397344199105433: " + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
01ErrorRate
006093.0975.00.1379(975.0/7068.0)
11863.01123.00.4345(863.0/1986.0)
2Total6956.02098.00.203(1838.0/9054.0)
\n", + "
" + ], + "text/plain": [ + " 0 1 Error Rate\n", + "0 0 6093.0 975.0 0.1379 (975.0/7068.0)\n", + "1 1 863.0 1123.0 0.4345 (863.0/1986.0)\n", + "2 Total 6956.0 2098.0 0.203 (1838.0/9054.0)" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Maximum Metrics: Maximum metrics at their respective thresholds\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
metricthresholdvalueidx
0max f10.2739730.549951217.0
1max f20.1478350.634488307.0
2max f0point50.4366200.590736153.0
3max accuracy0.4569630.825271147.0
4max precision0.9470691.0000000.0
5max recall0.0451061.000000397.0
6max specificity0.9470691.0000000.0
7max absolute_mcc0.3472460.429999184.0
8max min_per_class_accuracy0.1815850.709970275.0
9max mean_per_class_accuracy0.2305180.714339240.0
\n", + "
" + ], + "text/plain": [ + " metric threshold value idx\n", + "0 max f1 0.273973 0.549951 217.0\n", + "1 max f2 0.147835 0.634488 307.0\n", + "2 max f0point5 0.436620 0.590736 153.0\n", + "3 max accuracy 0.456963 0.825271 147.0\n", + "4 max precision 0.947069 1.000000 0.0\n", + "5 max recall 0.045106 1.000000 397.0\n", + "6 max specificity 0.947069 1.000000 0.0\n", + "7 max absolute_mcc 0.347246 0.429999 184.0\n", + "8 max min_per_class_accuracy 0.181585 0.709970 275.0\n", + "9 max mean_per_class_accuracy 0.230518 0.714339 240.0" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Gains/Lift Table: Avg response rate: 21.94 %, avg score: 22.52 %\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
groupcumulative_data_fractionlower_thresholdliftcumulative_liftresponse_ratescorecumulative_response_ratecumulative_scorecapture_ratecumulative_capture_rategaincumulative_gain
010.0111550.8150103.2950553.2950550.7227720.8398580.7227720.8398580.0367570.036757229.505549229.505549
120.0205430.7955753.7007643.4804600.8117650.8056310.7634410.8242170.0347430.071501270.076417248.045999
230.0300420.7835503.6047213.5197490.7906980.7924410.7720590.8141700.0342400.105740260.472142251.974853
340.0400930.7431923.0058763.3909270.6593410.7613350.7438020.8009250.0302110.135952200.587630239.092657
450.0500330.6977023.4445123.4015730.7555560.7230910.7461370.7854610.0342400.170191244.451158240.157260
560.1012810.5531933.1047773.2513940.6810340.6147360.7131950.6990750.1591140.329305210.477654225.139444
670.1503200.3835642.1870462.9041710.4797300.4660670.6370320.6230610.1072510.436556118.704581190.417123
780.2000220.2969151.5804232.5752440.3466670.3278170.5648810.5496980.0785500.51510658.042296157.524427
890.3013030.2035391.1335142.0906160.2486370.2506480.4585780.4491740.1148040.62990913.351366109.061561
9100.4034680.1769700.9610681.8045950.2108110.1871900.3958390.3828360.0981870.728097-3.89319880.459549
10110.5002210.1520280.6557341.5823820.1438360.1635660.3470960.3404240.0634440.791541-34.42660358.238248
11120.5999560.1330090.5553491.4116510.1218160.1416510.3096470.3073810.0553880.846928-44.46507641.165144
12130.7022310.1150620.4923231.2777570.1079910.1235490.2802770.2806070.0503520.897281-50.76768527.775745
13140.8019660.1023800.3534041.1628020.0775190.1078340.2550610.2591210.0352470.932528-64.65959416.280206
14150.9053460.0918610.3799091.0734050.0833330.0975850.2354520.2406750.0392750.971803-62.0090637.340501
15161.0000000.0348100.2978991.0000000.0653440.0768840.2193510.2251720.0281971.000000-70.2101410.000000
\n", + "
" + ], + "text/plain": [ + " group cumulative_data_fraction lower_threshold lift \\\n", + "0 1 0.011155 0.815010 3.295055 \n", + "1 2 0.020543 0.795575 3.700764 \n", + "2 3 0.030042 0.783550 3.604721 \n", + "3 4 0.040093 0.743192 3.005876 \n", + "4 5 0.050033 0.697702 3.444512 \n", + "5 6 0.101281 0.553193 3.104777 \n", + "6 7 0.150320 0.383564 2.187046 \n", + "7 8 0.200022 0.296915 1.580423 \n", + "8 9 0.301303 0.203539 1.133514 \n", + "9 10 0.403468 0.176970 0.961068 \n", + "10 11 0.500221 0.152028 0.655734 \n", + "11 12 0.599956 0.133009 0.555349 \n", + "12 13 0.702231 0.115062 0.492323 \n", + "13 14 0.801966 0.102380 0.353404 \n", + "14 15 0.905346 0.091861 0.379909 \n", + "15 16 1.000000 0.034810 0.297899 \n", + "\n", + " cumulative_lift response_rate score cumulative_response_rate \\\n", + "0 3.295055 0.722772 0.839858 0.722772 \n", + "1 3.480460 0.811765 0.805631 0.763441 \n", + "2 3.519749 0.790698 0.792441 0.772059 \n", + "3 3.390927 0.659341 0.761335 0.743802 \n", + "4 3.401573 0.755556 0.723091 0.746137 \n", + "5 3.251394 0.681034 0.614736 0.713195 \n", + "6 2.904171 0.479730 0.466067 0.637032 \n", + "7 2.575244 0.346667 0.327817 0.564881 \n", + "8 2.090616 0.248637 0.250648 0.458578 \n", + "9 1.804595 0.210811 0.187190 0.395839 \n", + "10 1.582382 0.143836 0.163566 0.347096 \n", + "11 1.411651 0.121816 0.141651 0.309647 \n", + "12 1.277757 0.107991 0.123549 0.280277 \n", + "13 1.162802 0.077519 0.107834 0.255061 \n", + "14 1.073405 0.083333 0.097585 0.235452 \n", + "15 1.000000 0.065344 0.076884 0.219351 \n", + "\n", + " cumulative_score capture_rate cumulative_capture_rate gain \\\n", + "0 0.839858 0.036757 0.036757 229.505549 \n", + "1 0.824217 0.034743 0.071501 270.076417 \n", + "2 0.814170 0.034240 0.105740 260.472142 \n", + "3 0.800925 0.030211 0.135952 200.587630 \n", + "4 0.785461 0.034240 0.170191 244.451158 \n", + "5 0.699075 0.159114 0.329305 210.477654 \n", + "6 0.623061 0.107251 0.436556 118.704581 \n", + "7 0.549698 0.078550 0.515106 58.042296 \n", + "8 0.449174 0.114804 0.629909 13.351366 \n", + "9 0.382836 0.098187 0.728097 -3.893198 \n", + "10 0.340424 0.063444 0.791541 -34.426603 \n", + "11 0.307381 0.055388 0.846928 -44.465076 \n", + "12 0.280607 0.050352 0.897281 -50.767685 \n", + "13 0.259121 0.035247 0.932528 -64.659594 \n", + "14 0.240675 0.039275 0.971803 -62.009063 \n", + "15 0.225172 0.028197 1.000000 -70.210141 \n", + "\n", + " cumulative_gain \n", + "0 229.505549 \n", + "1 248.045999 \n", + "2 251.974853 \n", + "3 239.092657 \n", + "4 240.157260 \n", + "5 225.139444 \n", + "6 190.417123 \n", + "7 157.524427 \n", + "8 109.061561 \n", + "9 80.459549 \n", + "10 58.238248 \n", + "11 41.165144 \n", + "12 27.775745 \n", + "13 16.280206 \n", + "14 7.340501 \n", + "15 0.000000 " + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "\n", + "Scoring History: " + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " 
\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
timestampdurationnumber_of_treestraining_rmsetraining_loglosstraining_auctraining_pr_auctraining_lifttraining_classification_errorvalidation_rmsevalidation_loglossvalidation_aucvalidation_pr_aucvalidation_liftvalidation_classification_error
02020-05-28 14:33:2343.415 sec0.00.4155910.5294270.5000000.0000001.0000000.7780010.4138150.5261050.5000000.0000001.0000000.780649
12020-05-28 14:33:2343.443 sec1.00.4078220.5118640.7161310.5347173.4749120.2363700.4055380.5074960.7267310.5371253.4442640.187652
22020-05-28 14:33:2343.467 sec2.00.4014830.4987460.7446460.5321723.5297060.2287310.3988080.4936980.7529090.5345883.4223070.232825
32020-05-28 14:33:2343.489 sec3.00.3964710.4890130.7481890.5356213.5297060.2286360.3933940.4832730.7564480.5356923.4223070.214491
42020-05-28 14:33:2343.515 sec4.00.3924420.4814300.7501210.5353583.5297060.2107800.3890300.4751350.7585110.5360953.4223070.217915
52020-05-28 14:33:2343.535 sec5.00.3891410.4753750.7500580.5351983.5297060.2450590.3854530.4686300.7585050.5356593.4223070.214270
62020-05-28 14:33:2343.570 sec6.00.3863990.4703320.7569860.5350243.5297060.2439610.3824470.4631570.7647220.5360393.4223070.229843
72020-05-28 14:33:2343.592 sec7.00.3841910.4663160.7570050.5354183.5297060.2439610.3800450.4588340.7646340.5364113.4223070.220013
82020-05-28 14:33:2343.614 sec8.00.3823410.4627600.7611060.5401763.5143590.2474460.3780630.4550490.7703400.5420433.4575240.204330
92020-05-28 14:33:2343.639 sec9.00.3807010.4595890.7625150.5408803.5182790.2356540.3761840.4514640.7723580.5435223.4575240.223548
102020-05-28 14:33:2343.668 sec10.00.3792020.4567050.7625220.5414243.5182790.2356060.3745830.4483800.7729820.5438933.4575240.226309
112020-05-28 14:33:2343.697 sec11.00.3780520.4544670.7616480.5415053.5213320.2310230.3733540.4459730.7729250.5445533.4608820.228960
122020-05-28 14:33:2343.729 sec12.00.3770430.4524200.7627670.5416583.5213320.2299720.3721990.4436700.7734120.5431953.4608820.224542
132020-05-28 14:33:2343.762 sec13.00.3761370.4505170.7647950.5432643.5258990.2343170.3713690.4419320.7741610.5436323.4480380.227413
142020-05-28 14:33:2343.796 sec14.00.3753570.4489630.7651450.5431133.5258990.2356540.3705490.4403350.7741760.5432023.4480380.228076
152020-05-28 14:33:2343.848 sec15.00.3746990.4475430.7661180.5440373.5284170.2332190.3699990.4391610.7745920.5437093.4480380.228297
162020-05-28 14:33:2343.903 sec16.00.3740980.4463410.7665290.5438963.5607130.2291610.3693900.4379260.7750210.5448513.4248550.226751
172020-05-28 14:33:2343.949 sec17.00.3735340.4451150.7663120.5442083.5683700.2314520.3688100.4366690.7749270.5459573.4429290.225425
182020-05-28 14:33:2344.004 sec18.00.3731210.4441710.7667850.5447203.5683700.2293520.3684960.4359090.7752560.5455863.4429290.226530
192020-05-28 14:33:2344.054 sec19.00.3727220.4433600.7671450.5450593.5683700.2264390.3680470.4350060.7754740.5459223.4429290.224652
\n", + "
" + ], + "text/plain": [ + " timestamp duration number_of_trees training_rmse \\\n", + "0 2020-05-28 14:33:23 43.415 sec 0.0 0.415591 \n", + "1 2020-05-28 14:33:23 43.443 sec 1.0 0.407822 \n", + "2 2020-05-28 14:33:23 43.467 sec 2.0 0.401483 \n", + "3 2020-05-28 14:33:23 43.489 sec 3.0 0.396471 \n", + "4 2020-05-28 14:33:23 43.515 sec 4.0 0.392442 \n", + "5 2020-05-28 14:33:23 43.535 sec 5.0 0.389141 \n", + "6 2020-05-28 14:33:23 43.570 sec 6.0 0.386399 \n", + "7 2020-05-28 14:33:23 43.592 sec 7.0 0.384191 \n", + "8 2020-05-28 14:33:23 43.614 sec 8.0 0.382341 \n", + "9 2020-05-28 14:33:23 43.639 sec 9.0 0.380701 \n", + "10 2020-05-28 14:33:23 43.668 sec 10.0 0.379202 \n", + "11 2020-05-28 14:33:23 43.697 sec 11.0 0.378052 \n", + "12 2020-05-28 14:33:23 43.729 sec 12.0 0.377043 \n", + "13 2020-05-28 14:33:23 43.762 sec 13.0 0.376137 \n", + "14 2020-05-28 14:33:23 43.796 sec 14.0 0.375357 \n", + "15 2020-05-28 14:33:23 43.848 sec 15.0 0.374699 \n", + "16 2020-05-28 14:33:23 43.903 sec 16.0 0.374098 \n", + "17 2020-05-28 14:33:23 43.949 sec 17.0 0.373534 \n", + "18 2020-05-28 14:33:23 44.004 sec 18.0 0.373121 \n", + "19 2020-05-28 14:33:23 44.054 sec 19.0 0.372722 \n", + "\n", + " training_logloss training_auc training_pr_auc training_lift \\\n", + "0 0.529427 0.500000 0.000000 1.000000 \n", + "1 0.511864 0.716131 0.534717 3.474912 \n", + "2 0.498746 0.744646 0.532172 3.529706 \n", + "3 0.489013 0.748189 0.535621 3.529706 \n", + "4 0.481430 0.750121 0.535358 3.529706 \n", + "5 0.475375 0.750058 0.535198 3.529706 \n", + "6 0.470332 0.756986 0.535024 3.529706 \n", + "7 0.466316 0.757005 0.535418 3.529706 \n", + "8 0.462760 0.761106 0.540176 3.514359 \n", + "9 0.459589 0.762515 0.540880 3.518279 \n", + "10 0.456705 0.762522 0.541424 3.518279 \n", + "11 0.454467 0.761648 0.541505 3.521332 \n", + "12 0.452420 0.762767 0.541658 3.521332 \n", + "13 0.450517 0.764795 0.543264 3.525899 \n", + "14 0.448963 0.765145 0.543113 3.525899 \n", + "15 0.447543 0.766118 0.544037 3.528417 \n", + "16 0.446341 0.766529 0.543896 3.560713 \n", + "17 0.445115 0.766312 0.544208 3.568370 \n", + "18 0.444171 0.766785 0.544720 3.568370 \n", + "19 0.443360 0.767145 0.545059 3.568370 \n", + "\n", + " training_classification_error validation_rmse validation_logloss \\\n", + "0 0.778001 0.413815 0.526105 \n", + "1 0.236370 0.405538 0.507496 \n", + "2 0.228731 0.398808 0.493698 \n", + "3 0.228636 0.393394 0.483273 \n", + "4 0.210780 0.389030 0.475135 \n", + "5 0.245059 0.385453 0.468630 \n", + "6 0.243961 0.382447 0.463157 \n", + "7 0.243961 0.380045 0.458834 \n", + "8 0.247446 0.378063 0.455049 \n", + "9 0.235654 0.376184 0.451464 \n", + "10 0.235606 0.374583 0.448380 \n", + "11 0.231023 0.373354 0.445973 \n", + "12 0.229972 0.372199 0.443670 \n", + "13 0.234317 0.371369 0.441932 \n", + "14 0.235654 0.370549 0.440335 \n", + "15 0.233219 0.369999 0.439161 \n", + "16 0.229161 0.369390 0.437926 \n", + "17 0.231452 0.368810 0.436669 \n", + "18 0.229352 0.368496 0.435909 \n", + "19 0.226439 0.368047 0.435006 \n", + "\n", + " validation_auc validation_pr_auc validation_lift \\\n", + "0 0.500000 0.000000 1.000000 \n", + "1 0.726731 0.537125 3.444264 \n", + "2 0.752909 0.534588 3.422307 \n", + "3 0.756448 0.535692 3.422307 \n", + "4 0.758511 0.536095 3.422307 \n", + "5 0.758505 0.535659 3.422307 \n", + "6 0.764722 0.536039 3.422307 \n", + "7 0.764634 0.536411 3.422307 \n", + "8 0.770340 0.542043 3.457524 \n", + "9 0.772358 0.543522 3.457524 \n", + "10 0.772982 0.543893 3.457524 \n", + "11 0.772925 0.544553 3.460882 \n", + "12 
0.773412 0.543195 3.460882 \n", + "13 0.774161 0.543632 3.448038 \n", + "14 0.774176 0.543202 3.448038 \n", + "15 0.774592 0.543709 3.448038 \n", + "16 0.775021 0.544851 3.424855 \n", + "17 0.774927 0.545957 3.442929 \n", + "18 0.775256 0.545586 3.442929 \n", + "19 0.775474 0.545922 3.442929 \n", + "\n", + " validation_classification_error \n", + "0 0.780649 \n", + "1 0.187652 \n", + "2 0.232825 \n", + "3 0.214491 \n", + "4 0.217915 \n", + "5 0.214270 \n", + "6 0.229843 \n", + "7 0.220013 \n", + "8 0.204330 \n", + "9 0.223548 \n", + "10 0.226309 \n", + "11 0.228960 \n", + "12 0.224542 \n", + "13 0.227413 \n", + "14 0.228076 \n", + "15 0.228297 \n", + "16 0.226751 \n", + "17 0.225425 \n", + "18 0.226530 \n", + "19 0.224652 " + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "See the whole table with table.as_data_frame()\n", + "\n", + "Variable Importances: " + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
variablerelative_importancescaled_importancepercentage
0PAY_02794.4448241.0000000.693347
1PAY_2307.2373660.1099460.076231
2PAY_3215.1528930.0769930.053383
3PAY_4155.4344480.0556230.038566
4PAY_AMT1127.9863130.0458000.031755
5PAY_5127.5386280.0456400.031644
6PAY_6102.3516010.0366270.025395
7LIMIT_BAL82.4323500.0294990.020453
8PAY_AMT258.9341350.0210900.014623
9PAY_AMT458.8580470.0210630.014604
\n", + "
" + ], + "text/plain": [ + " variable relative_importance scaled_importance percentage\n", + "0 PAY_0 2794.444824 1.000000 0.693347\n", + "1 PAY_2 307.237366 0.109946 0.076231\n", + "2 PAY_3 215.152893 0.076993 0.053383\n", + "3 PAY_4 155.434448 0.055623 0.038566\n", + "4 PAY_AMT1 127.986313 0.045800 0.031755\n", + "5 PAY_5 127.538628 0.045640 0.031644\n", + "6 PAY_6 102.351601 0.036627 0.025395\n", + "7 LIMIT_BAL 82.432350 0.029499 0.020453\n", + "8 PAY_AMT2 58.934135 0.021090 0.014623\n", + "9 PAY_AMT4 58.858047 0.021063 0.014604" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/plain": [] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# load saved best model from lecture 1 \n", + "best_mgbm = h2o.load_model('best_mgbm')\n", + "\n", + "# display model details\n", + "best_mgbm" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3. Select a Probability Cutoff by Maximizing Youden's J Statistic" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Bind model predictions to test data for further calculations" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "gbm prediction progress: |████████████████████████████████████████████████| 100%\n" + ] + } + ], + "source": [ + "# cbind predictions to training frame\n", + "# give them a nice name\n", + "yhat_name = 'p_DEFAULT_NEXT_MONTH'\n", + "preds1 = valid['ID'].cbind(best_mgbm.predict(valid).drop(['predict', 'p0']))\n", + "preds1.columns = ['ID', yhat_name]\n", + "valid_yhat = valid.cbind(preds1[yhat_name]).as_data_frame()\n", + "valid_yhat.reset_index(drop=True, inplace=True) # necessary for later searches/joins" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Select best cutoff based on Youden's J\n", + "Maximizing Youden's J statistic corresponds to selecting the cutoff where the AUC curve is the farthest from the baseline. This is appropriate for classifiers that were trained to maximize AUC. However, cutoff selection has a strong impact on group fairness measures. Other options for the cutoff are also presented below." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0.22\n" + ] + } + ], + "source": [ + "j_frame = evaluate.get_youdens_j(valid_yhat, y_name, yhat_name)\n", + "best_cut = j_frame.loc[j_frame['J'].idxmax(), 'cutoff'] # Find cutoff w/ max F1\n", + "### !!! UNCOMMENT LINES BELOW TO REMEDIATE MINOR FAIRNESS ISSUES !!! ###\n", + "#best_cut = 0.31 # lowest cutoff that passess discrimination tests\n", + "#best_cut = 0.456963 # max accuracy\n", + "#best_cut = 0.347246 # max MCC\n", + "print('%.2f' % best_cut)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Plot ROC Curve\n", + "An receiver operating characteristic (ROC) curve for true positive rate (TPR) and false negative rate (FNR) is a typical way to visualize TPR and FNR for a binomial predictive model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3XucXGWd5/HPr6ovCSQETEeBXGESwASTyPQQlFsUHAJKsrsgEmAURTO6huGlDMh6gQjjjKI4q2NcjFxUFkEuO5qVKKMCBnwBElaSIWi0iUg6QQlNSNKh09ff/nFOJ5W6nK7urlNVp+r7fr3qlarneerULyed+vVzOc8xd0dERKSQVKUDEBGR6qZEISIikZQoREQkkhKFiIhEUqIQEZFIShQiIhJJiUKkxpjZm8xsrZntNrObLHC7me0ws19XOj5JHiUKwcweCb9EmvOUfzirbKGZtWe8djPbY2adZrbVzL5qZums97zHzH4dtuswszvNbEpWmyPM7FYzeyn8gvudmX3ezA4uEHOTma0wsz+Ex33BzG4zsxmjPR9xMLOzMr68t5vZL81scZHvzfl3GMIy4BXgEHe/EjgFeBcwxd1PHH70kbF9wsz+bGa7wvPfXKDdSWb2MzN7Nfz732tmR2TUX2Vmz4bn549mdlUp45TRUaKoc+EX66mAA0V9ceUxz93HAacD7wM+lHH884HvA/8TaAHmAN3AY2Z2WNjmDcDjwFjgbe4+nuCL7VDgrwp85n1hvBcBE4B5wNPAGcMN3swahvueYR7/fOBe4HvAFOBNwLXAuTF95HTgOd9/Ne104AV331PKDzGzs4BrCM75dOBo4PMFmh8GrAJmhG13A7dnHg54f9huEbDczC4sZbwyCu6uRx0/CL6wfgV8FfhxVt0jwIezyhYC7RmvHZiZ8foeYGX43IA/AVdnHSMFPAtcH77+J+A/gVSRMZ8JdAFTI9q8AJyZ8XoF8L/D5zPCuC8DXgTWAj8BlmcdYz3w38LnxwE/A14FNgEXFBmrhZ9xVUSbfbFlxdcAfAHoB/YCncA3wjZvB54CdoZ/vj0s/w7QC/SE7f8+fG9/+PrzJfzZ+T7wzxmvzwD+XOR7TwB2R9R/Hfi3Sv//0CN4qEch7wfuDB9nmdmbRnogMzuOoHfSFhYdC0wj+G16H3cfAO4n6DVA8MX/f8LyYpwJ/Nrdt4w01tDpwJuBs4C7gKWDFWY2m+A33wfC4a+fEXwxvhG4EPhm2AYzu8jMNhT4jGOBqQQ9oGFz988AjxIksXHuvjzsgT1A8GU6kSDJP2BmE939UoJ/yxvD9t8CPgo8Hr6+LvszzOwUM3st4nFKgfDmECTTQeuBN5nZxCL+aqcBG/NVmJkR/BzlrZfyU6KoY+EXwHTgHnd/GnieYChnuP6fme0BfkvQC/lmWN4S/vlSnve8lFE/sUCbQobbvpAV7r7H3buAfwfmm9n0sO5iguTVDbyHYOjmdnfvc/ffECS69wK4+/fdfW5ErJQo3kHvBv7g7neE8dwF/I4RDmW5+2PufmjE47ECbx1H0KMZNPh8fNTnmdlcgp5soXmIFQTfTbcXqJcyU6Kobx8A/sPdXwlffz8sG9QHNGa9p5FgaCPTCQRfGu8DFgCDE9CDxz2CXEdk1HcUaFPIcNsXsq9H4u67CX5LHxwXX0rwmzkEyXRB5m/ZBInk8CJjpUTxDjqSYEgv05+AySX8jGJ0AodkvB58vrvQG8xsJsEw3xXu/mie+uUEvdx3h0laqoASRZ0ys7HABcDp4aqVPwOfAOaZ2byw2YsE4+WZjiL3SwoP3EMwKX1tWLwJaCf8zTvjs1PAecAvwqKfA/81LC/Gz4ETs1dOZdkDHJTxOt+XevbWyXcBS83sbcAY4OGwfAvwy6zfsse5+8eKiHVT+P7zRhFrdpzbCJJXpmnA1iLiyWFmp4ar1go9Ti3w1o0EiwgGzQP+4u4d+RqHvbWfAze4+x156j9EODnu7u3Z9VJBlZ4k0aMyD4LfmF8l+II5POOxFrgpbHMW8DJwIsGk7DEEw0sfzThO9mT2Wwi++A4PX78P2EUwpDUm/IzbCJLQxLDNGwgmn+8ApodlkwnG3ucWiH81wSTuXxNM+o4nGIv/UFh/J0EPqRFoJei9ZE9mN2QdsxnYQTAf8a8Z5eMJkuPfhcdrBP4GeHOR5/p8gmGZDxL81p0iWLK6Kqx/VxjfNIIVXD/KjA+4mwMnjScCr4XntCE8x68BLWH9d4B/ymh/KfBYDD9Di4A/A7MJVqg9BHyxQNvJBEOb/1ig/uLwWEWdUz3K+6h4AHpU6B8efjqYELLKLwj/ww5+SX2I4DfHXQST1NeQsTopO1GEZT/JPDawJPxS30OQnO4ia8USwXDKbeFn7yYYc78OOKhA/E0ESzHbwuP+CbgFmBbWHw08STA8MjjxG5kowrpbw7q/ySo/NjzOdoLhpIeA+WHdxcDGIc73IoJJ6c7wGI8QDK8M1q8Mv+zbgI9kJYq3Ab8nSGJfD8tOIVgOvDP885SMY5UlUYTH/iTwl/Dn43agOaNuI3Bx+Py68O/UmfnIaPtHgiHNzPqbK/3/RI/gYeE/koiISF6aoxARkUhKFCIiEkmJQkREIilRiIhIpFg3Q4tDS0uLz5gxo9JhVLdNm4I/jz22snGISNV4+umnX3H3SSN5b+ISxYwZM1i3bl2lw6huCxcGfz7ySCWjEJEqYmY5F8oWS0NPIiISSYlCREQiKVGIiEgkJQoREYmkRCEiIpGUKEREJJIShYiIRIotUZjZbWb2spk9W6DezOzrZtZmZhvM7IS4YhERqVcdnd18/eebaHjD5JkjPUacF9x9B/gG8L0C9WcDs8LHAuB/hX+KiEgBHZ3dtO/oYsphY5k4rjmy/rG2V7ji7mcASDWNnTDSz4wtUbj7WjObEdFkCfA9D26I8YSZHWpmR7h7KW9CLyKSOIWSwY+e2cqn7t9AYypF78AAN543l8XzJ+et7+kfoLd/oCTxVHILj8lk3Nye4N7Kk4GcRGFmy4BlANOmTStLcCIilVAoGXR0dvOp+zewt3eAvQQJ4Or7N3DyzBYmjmvOW18qiZjMdvdV7t7q7q2TJo1oTysRkYrp6Oxm/ZbX6OjsHrLd4Jf97u4+9vYOcPX9G/b1MBpTB35lN6ZStO/oAshbXyqV7FFsBaZmvJ4SlomI1IyhhosyDX7ZZ/YIBpPBlMPG0jtwYE+hd2CAKYeNBchb35g2evtHf7vrSvYoVgPvD1c/nQTs1PyEiNSSqB5CPlHJYOK4Zm48by5jGlOMb25gTGOKG8+bu28OI1/9Te+dx9OfPZNPnjmTgZ6unSP9e8TWozCzu4CFQIuZtQPXAY0A7n4zsAY4B2gDXgc+GFcsIiKVENVDyLdiafDL/uqsHshg28X
zJ3PyzJaCq54K1f/Dmcdyxatb20b694hz1dPSIeod+Hhcny8iUmlDDRflM1QymDiuOW+SKbZ+JBIxmS0ikkRDDRdFvW/e1ENL/oU/Uom7w52ISJIM1UNIAiUKEZGYxTEcVE4aehKRRCr22gQZPfUoRCRxhnNtgoyeehQikijDvTZBRk+JQkQSZaitLKT0lChEJFFGcm2CjI4ShYgkykivTZCR02S2iIzYUDfRiUstXJuQJEoUIjIilV55lPRrE5JEQ08iMmxaeVRflChEZNi08qi+KFGIyLBp5VF9UaIQkWHTyqP6oslsERkRrTyqH0oUIjWiEktVtfKoPihRiNSASi9VldqmOQqRhNNSVYmbEoVIwmmpqsRNiUIk4bRUVeKmRCGScFqqKnHTZLZImcWxOklLVSVOShQiZRTn6iQtVZW4aOhJpEy0OkmSSolCpEy0OkmSSolCpEy0OkmSSolCpEy0OkmSSpPZImWk1UmSREoUIkMo9XJWrU6SpIk1UZjZIuBrQBq4xd2/mFU/DfgucGjY5hp3XxNnTCLDoc32RGKcozCzNLASOBuYDSw1s9lZzT4L3OPubwUuBL4ZVzwiw6XlrCKBOCezTwTa3H2zu/cAdwNLsto4cEj4fAKwLcZ4RIZFy1lFAnEmisnAlozX7WFZphXAJWbWDqwBLs93IDNbZmbrzGzd9u3b44hVJIeWs4oEKr08dinwHXefApwD3GFmOTG5+yp3b3X31kmTJpU9SEmujs5u1m95bUTDRVrOKhKIczJ7KzA14/WUsCzTZcAiAHd/3MzGAC3AyzHGJXWiFBPRWs4qEm+P4ilglpkdZWZNBJPVq7PavAicAWBmbwbGABpbklEr5UT0xHHNzJt6qJKE1K3YEoW79wHLgQeB3xKsbtpoZteb2eKw2ZXAR8xsPXAXcKm7e1wxSf3QRLRI6cR6HUV4TcSarLJrM54/B5wcZwxSnzQRLVI6lZ7MFok00sloTUSLlI628JCqNdrJaE1Ei5SGEoVUpczJ6L0EQ0hX37+Bk2e2DOsLX/sqiYyehp6kKmkyWqR6KFFIVdJktEj1UKKQqqTJaJHqoTkKidVo7uWgyWiR6qBEIbEpxRYamowWqTwNPUksdC8HkdqhRCGx0KolkdqhRCGx0KolkdqhRCGx0KolkdqhyWyJjVYtidQGJQoZ0miWuGrVkkjyKVFIpFIscRWRZNMchRSkJa4iAkoUEkFLXEUElCgkgpa4iggoUUgELXEVEdBktgxBS1xFRIlChqQlriL1TUNPIiISSYlCREQiKVHUsI7ObtZveU3XPYjIqGiOoka90tnNKV96SFdUi8ioqUdRg3r7B9i8fY+uqBaRklCiqEHdfQOYHVimK6pFZKSUKBIqav6huSGF+4FluqJaREZKcxQJNNSOro3pFEdPOpgxjakD2uhaCBEZCSWKBMi8HwSwb0fXvQT7MF19/wZOntlyQCJoGdfMrz71Tl1RLSKjFmuiMLNFwNeANHCLu38xT5sLgBWAA+vd/aI4Y0qa7N7DxxfOpDGV2pckYP/8Q3Yy0BXVIlIKsSUKM0sDK4F3Ae3AU2a22t2fy2gzC/gfwMnuvsPM3hhXPEmUeT+IwcTwjYfbCHLqfpp/EJE4xTmZfSLQ5u6b3b0HuBtYktXmI8BKd98B4O4vxxhPYgxOVG/ctivnfhBN6RTL3zFLO7qKSNnEOfQ0GdiS8bodWJDV5hgAM/sVwfDUCnf/afaBzGwZsAxg2rRpsQRbLTKHmnr6B+jPcz+IixZM46IF0zT/ICJlUenJ7AZgFrAQmAKsNbO3uPtrmY3cfRWwCqC1tdWzD1Ir8g01NaaN5gZoSqdzVi8pQYhIOcSZKLYCUzNeTwnLMrUDT7p7L/BHM/s9QeJ4Ksa4qs7gqqadXb05E9VjGtKsvPitTBjbpN6DiFREnIniKWCWmR1FkCAuBLJXNP0QWArcbmYtBENRm2OMqeoUM9Q058gJShAiUjGxJQp37zOz5cCDBPMPt7n7RjO7Hljn7qvDur81s+eAfuAqd++IK6ZqMdiDOLgpPayhJhGRSoh1jsLd1wBrssquzXjuwCfDR13I7EF09w9gWXttaKhJRKpNpSez60pHZzdX37ee7j4/YB4ik4aaRKTaKFGU0Z1Pvkh3X1YPojHFwIDT3KChJhGpTkoUZdDR2c3GbTv5xkN/yKlzd9b8w6ns6enXUJOIVCUlihh1dHZz55MvsvLhP5BOpejpz70EZPk7ZjHzTeMrEJ2ISHGUKGLyo2e2cvV9G+juG5yL6M9p09yQ4qIFtX2luYgk37D3ejKzlJldHEcwtWLwCuv9SeJABzWmGdOY4svnaz5CRKpfwR6FmR0CfJxgz6bVwM+A5cCVwHrgznIEmEQbt+0iheWta25IcfPf/TVzjjxESUJEEiFq6OkOYAfwOPBh4NOAAf/F3Z8pQ2yJlDvktF9zg/Hl8+dy2jGTKhCZiMjIRCWKo939LQBmdgvwEjDN3feWJbIEavvLbq66d33OpHVT2rj8nbO4aME09SJEJHGiEkXv4BN37zezdiWJwn70zFauum9DTpI4qCnNzZecwGnH6J5MIpJMUYlinpntgn2D7WMzXru7HxJ7dAkxOHndk2e4acCdOUdOqEBUIiKlUTBRuHu6nIEkWfuOrpztwSEYctKV1iKSdFGrnsYAHwVmAhsIdn/tK1dgSTLlsLH0Zm0P3tSQYs3lp+hiOhFJvKjrKL4LtAL/CZwD3FSWiBJo4rhmbjxv7gH3sf7K+XOVJESkJkTNUczOWPV0K/Dr8oRU/QbvJ5G5N9Pi+ZM5eWaL7mMtIjWn2FVPfWb5LyCrN5n3kxjc7XXx/MlA0LNQghCRWhM19DTfzHaFj93A3MHn4eqnujO4umlv7wC7u/vY2zvA1fdvoKOzu9KhiYjEJqpHsd7d31q2SBIg3+qmxlSK9h1d6kmISM2K6lHk7oldxzo6u9nZ1UNP/4G7wPYODDDlsLEVikpEJH5RPYo3mlnBe1m7+1djiKcqZc5LDDg0pGBsY4PuSCcidSEqUaSBcVBgG9Q6MHhnuuz7XDc3pFh58QnaAVZE6kJUonjJ3a8vWyRVZrAXkTLLuc91UzrFhLGNShIiUheiEkXd9iTa/rKbf7z3GXpzb0oHaF5CROpLVKI4o2xRVJEfPbOVK+9ZT76b0x3UmGYA17yEiNSVqE0BXy1nINWgo7Obq+5dT99A7oKvprTpznQiUpeiehR159uPbs65n8Sgy985S3emE5G6FHUdRV1Z9cvnufmXm/PWNaWNixZMK3NEIiLVQYmCIEn8809+l7cubfCV987TcJOI1K26H3q684k/FUwSDSnjp1ecqu3CRaSuxdqjMLNFZrbJzNrM7JqIdueZmZtZa5zxZOvo7GbF/91YsP7zi+coSYhI3YstUZhZGlgJnA3MBpaa2ew87cYDVwBPxhVLId9+dDO9BSavP33OcVx80vQyRyQiUn3i7FGcCLS5+2Z37wHuBpbkaXcD8CVgb4yx5IiavP70Ocex7LS/Kmc4IiJVK85EMRnYkvG6PS
zbx8xOAKa6+wNRBzKzZWa2zszWbd++fdSBRU1ef2zh0UoSIiIZKrbqycxSwFeBK4dq6+6r3L3V3VsnTRrdtQxRk9eNaePDpxw9quOLiNSaOBPFVmBqxuspYdmg8cDxwCNm9gJwErA6zgntjs5uPvfDZwvWrzh3jpbBiohkiTNRPAXMMrOjzKwJuBBYPVjp7jvdvcXdZ7j7DOAJYLG7r4sroFse3UyeLZwATV6LiBQSW6Jw9z5gOfAg8FvgHnffaGbXm9niuD63kI7Obm597I956y59+3TNS4iIFBDrBXfuvgZYk1V2bYG2C+OMpX1HF55ns790KtjHSURE8qubLTy++cgf6M1zycSnzjpO8xIiIhHqIlF865fP8+DGl3PKG1Ow4OiJFYhIRCQ5aj5RdHR28y8FlsM66E51IiJDqPlE8fjzHQXrrtawk4jIkGo+UbzSmX9nkPlTJ7DsdK10EhEZSs0niu6+/Jv+feX8eWWOREQkmWo6UXR0dnPjT3PnJy5onaztw0VEilTTieKWRzeTbxfx98ydnFsoIiJ51Wyi6OjsZtXa/NuIB+udRESkGDWbKDZu25W3N5EymHPkhPIHJCKSUDWbKB5//pW85X9/2tFaEisiMgw1mSgKbQDYkIIPn6r7TYiIDEdNJoo7n3yRnjzjTleccYx6EyIiw1RziaKjs5uVD7fllDeljYsWTKtARCIiyVZziaJ9Rxd4bm/i8nfOUm9CRGQEai5RHNyUpjvPsNPZxx9egWhERJKv5hLFtp3593YqVC4iItFqLlEUvphOF9mJiIxEzSWKLa925ZQ1pk0X2YmIjFBNJYqOzm5ueOC5nPIV587RRLaIyAjVVKJo39FF2uyAsoOb0xw/Wb0JEZGRqqlE8ezWnezp6T+grH/AdbtTEZFRqJlEUWjY6XPvnq1hJxGRUaiZRKFhJxGReNRMotCwk4hIPGoiUWjYSUQkPjWRKNp3dOEDB15Q15Q2DTuJiJRATSSKfPs79fQ7BzelKxSRiEjtqIlEsaenn+aGAyeyxzSmcuYsRERk+GoiUTy7dSfdfbl7OWkiW0Rk9GJNFGa2yMw2mVmbmV2Tp/6TZvacmW0ws1+Y2fThfkZHZzfXrX42p1wT2SIipRFbojCzNLASOBuYDSw1s9lZzX4DtLr7XOA+4Mbhfs7GbbvoG8gtn/qGg4Z7KBERySPOHsWJQJu7b3b3HuBuYElmA3d/2N1fD18+AUwZ/sdoW3ERkTjFmSgmA1syXreHZYVcBvwkX4WZLTOzdWa2bvv27QfUaVtxEZF4VcVktpldArQCX85X7+6r3L3V3VsnTZq0r7yjs5vrf7wxp722FRcRKZ2GGI+9FZia8XpKWHYAMzsT+Axwurt3D+cD7nzyxZzVTtrfSUSktOLsUTwFzDKzo8ysCbgQWJ3ZwMzeCnwLWOzuLw/n4B2d3ax8uC2nvK9/QMtiRURKKLZE4e59wHLgQeC3wD3uvtHMrjezxWGzLwPjgHvN7BkzW13gcDnad3SB505YL3/HLA07iYiUUJxDT7j7GmBNVtm1Gc/PHOmx823bAXD28YeP9JAiIpJHVUxmj8Senn4asqJvSKFtO0RESiyxieLgpnTOhXZ9A2gjQBGREktsoti2c++wykVEZGQSmyi+vfb5vOW7unrKHImISG1LZKJo+8tuHm3ryFt3yNjGMkcjIlLbEpkoHmt7JW+5gbbuEBEpsUQmipYC10l84G3TdQ2FiEiJJTJRHHf4+Lzll5w07NtZiIjIEBKZKHQNhYhI+SQyUegaChGR8klkoljz7J9zysY0ptSjEBGJQeISRd+A87Wf/z6n3N21a6yISAwSlyj29vSTZy9APnTyUVrxJCISg8Qliv48W4sDzDnykDJHIiJSHxKXKEREpLwSlyjSZnnLtXWHiEg8EpcoxjSlaUwfmCwa06atO0REYpK4RNGQMlacO4fGtDG2MUVzQ4qb3jtPE9kiIjFJXKJ47fVebnjgOZobUvQ7XHvubBbPn1zpsEREalbiEkX7a6+zt3eAzu5+evoGuOHHz9HR2V3psEREalbiEoWRNT+RStG+o6tC0YiI1L7EJQrnwOsoegcGdEW2iEiMGiodwHAdOWEsDWmjMZ2i350bz5uriWwRkRglrkexbWcXTQ0pegecz71HE9kiInFLXKJwRxPZIiJllLhEkUkT2SIi8Ut0oujq7dNEtohIzBKdKKzAvk8iIlI6iU4UYxrSGnoSEYlZohNFT3+/hp5ERGIWa6Iws0VmtsnM2szsmjz1zWb2g7D+STObMZzjnz3ncF1DISISs9gShZmlgZXA2cBsYKmZzc5qdhmww91nAv8KfGk4n/HSLg07iYjELc4exYlAm7tvdvce4G5gSVabJcB3w+f3AWfYMGao//TK6yUJVERECoszUUwGtmS8bg/L8rZx9z5gJzAx+0BmtszM1pnZuv7Xd+4r7+kbKHXMIiKSJRGT2e6+yt1b3b01fdD+O9mdd4K27xARiVuciWIrMDXj9ZSwLG8bM2sAJgAdxRw8BXzm3ONHH6WIiESKM1E8Bcwys6PMrAm4EFid1WY18IHw+fnAQ+7uRGhMGx85eTqbv/jukgcsIiK5Yttm3N37zGw58CCQBm5z941mdj2wzt1XA7cCd5hZG/AqQTKJdNzhh6gnISJSRrHej8Ld1wBrssquzXi+F3hvnDGIiMjoJGIyW0REKkeJQkREIilRiIhIJCUKERGJpEQhIiKRlChERCSSDXF9W9Uxs93ApkrHUSVagFcqHUSV0LnYT+diP52L/Y519/EjeWOs11HEZJO7t1Y6iGpgZut0LgI6F/vpXOync7Gfma0b6Xs19CQiIpGUKEREJFISE8WqSgdQRXQu9tO52E/nYj+di/1GfC4SN5ktIiLllcQehYiIlJEShYiIRKraRGFmi8xsk5m1mdk1eeqbzewHYf2TZjaj/FGWRxHn4pNm9pyZbTCzX5jZ9ErEWQ5DnYuMdueZmZtZzS6NLOZcmNkF4c/GRjP7frljLJci/o9MM7OHzew34f+TcyoRZ9zM7DYze9nMni1Qb2b29fA8bTCzE4o6sLtX3YPgRkfPA0cDTcB6YHZWm/8O3Bw+vxD4QaXjruC5eAdwUPj8Y/V8LsJ244G1wBNAa6XjruDPxSzgN8Bh4es3VjruCp6LVcDHwuezgRcqHXdM5+I04ATg2QL15wA/AQw4CXiymONWa4/iRKDN3Te7ew9wN7Akq80S4Lvh8/uAM8zMyhhjuQx5Ltz9YXd/PXz5BMH9yWtRMT8XADcAXwL2ljO4MivmXHwEWOnuOwDc/eUyx1guxZwLBw4Jn08AtpUxvrJx97UEdwstZAnwPQ88ARxqZkcMddxqTRSTgS0Zr9vDsrxt3L0P2AlMLEt05VXMuch0GcFvDLVoyHMRdqWnuvsD5QysAor5uTgGOMbMfmVmT5jZorJFV17FnIsVwCVm1k5w183LyxNa1Rnu9wmQzC08pAAzuwRoBU6vdCyVYGYp4KvApRUOpVo0EAw/LSToZa41s7e4+2sVjaoylgLfcfebz
OxtwB1mdry7D1Q6sCSo1h7FVmBqxuspYVneNmbWQNCd7ChLdOVVzLnAzM4EPgMsdvfuMsVWbkOdi/HA8cAjZvYCwRjs6hqd0C7m56IdWO3uve7+R+D3BImj1hRzLi4D7gFw98eBMQQbBtabor5PslVrongKmGVmR5lZE8Fk9eqsNquBD4TPzwce8nC2psYMeS7M7K3AtwiSRK2OQ8MQ58Ldd7p7i7vPcPcZBPM1i919xJuhVbFi/o/8kKA3gZm1EAxFbS5nkGVSzLl4ETgDwMzeTJAotpc1yuqwGnh/uPrpJGCnu7801JuqcujJ3fvMbDnwIMGKhtvcfaOZXQ+sc/fVwK0E3cc2gsmbCysXcXyKPBdfBsYB94bz+S+6++KKBR2TIs9FXSjyXDwI/K2ZPQf0A1e5e831uos8F1cC3zazTxBMbF9ai79YmtldBL8ctITzMdcBjQDufjPB/Mw5QBvwOvDBoo5bg+dKRERKqFqHnkREpEooUYiISCQlChERiaREISIikZQoREQkkhKFSBHMrN/Mnsl4zDCzheEOtedmtPuxmS0Mnz8S7mi63syeMrP5FfvkUI/tAAAA6UlEQVQLiIyCEoVIcbrcfX7G44WwvJ3givhCLnb3ecA3Ca53EUkcJQqR0VkP7DSzdw3R7nGK2HxNpBopUYgUZ2zGsNO/Z9V9AfjsEO9fRLClhkjiVOUWHiJVqMvd884xuPtaM8PMTslTfWe4/9A4QHMUkkjqUYiURqFexcUEd177LvBvZY1IpESUKERKwN3/AzgMmJunzoHPASeZ2XHljk1ktJQoRErnCxy41/8+7t4F3ARcVdaIREpAu8eKiEgk9ShERCSSEoWIiERSohARkUhKFCIiEkmJQkREIilRiIhIJCUKERGJ9P8BPAMYcFvcejAAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "# Plot AUROC w/ best cutoff\n", + "title_ = 'AUROC Curve: Cutoff = ' + str(best_cut)\n", + "ax = j_frame.plot(x='FNR', y='TPR', kind='scatter', title=title_, xlim=[0,1])\n", + "_ = ax.axvline(best_cut, color='r')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 4. Report Raw Confusion Matrices\n", + "\n", + "The basic DIA procedure in this notebook is based on measurements found commonly in confusion matrices, so confusion matrices are calculated as a precursor to DIA and to provide a basic summary of the GBM's behavior in general and across men and women." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Overall confusion matrix" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
             | actual: 1 | actual: 0
predicted: 1 |      1179 |      1192
predicted: 0 |       824 |      5745
\n", + "
" + ], + "text/plain": [ + " actual: 1 actual: 0\n", + "predicted: 1 1179 1192\n", + "predicted: 0 824 5745" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "evaluate.get_confusion_matrix(valid_yhat, y_name, yhat_name, cutoff=best_cut)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The general confusion matrix shows that the GBM is more accurate than not because the true positive and true negative cells contain the largest values by far. But the GBM seems to make a larger number of type II errors or false negative predictions. False negatives can be a disparity issue, because for complex reasons, many credit scoring and other models tend to over-estimate the likelihood of non-reference groups - typically people other than white males - to default. This is both a sociological discrimination problem and a financial problem if an unpriviledged group is not recieving the credit they deserve, in favor of undeserving white males. Deserving people miss out on potentially life-changing credit and lenders incur large write-off costs." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Report confusion matrices by `SEX`\n", + "\n", + "The only values for `SEX` in the dataset are `female` and `male`. " + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['female', 'male']" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "sex_levels = list(valid_yhat['SEX'].unique())\n", + "sex_levels" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Confusion matrix for `SEX = male`" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
             | actual: 1 | actual: 0
predicted: 1 |       511 |       482
predicted: 0 |       360 |      2124
\n", + "
" + ], + "text/plain": [ + " actual: 1 actual: 0\n", + "predicted: 1 511 482\n", + "predicted: 0 360 2124" + ] + }, + "execution_count": 13, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "male_cm = evaluate.get_confusion_matrix(valid_yhat, 'DEFAULT_NEXT_MONTH', 'p_DEFAULT_NEXT_MONTH', by='SEX', level='male', cutoff=best_cut)\n", + "male_cm" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Confusion matrix for `SEX = female`" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
             | actual: 1 | actual: 0
predicted: 1 |       668 |       710
predicted: 0 |       464 |      3621
\n", + "
" + ], + "text/plain": [ + " actual: 1 actual: 0\n", + "predicted: 1 668 710\n", + "predicted: 0 464 3621" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "female_cm = evaluate.get_confusion_matrix(valid_yhat, 'DEFAULT_NEXT_MONTH', 'p_DEFAULT_NEXT_MONTH', by='SEX', level='female', cutoff=best_cut)\n", + "female_cm" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Both confusion matrices reflect the global confusion matrix: more accurate than not with a larger number of false negative predictions (type II errors) than false positive predictions (type I errors)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 5. Disparate Accuracies and Errors" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To perform the following basic DIA many different values from the confusion matrices reflecting different prediction behavior are calculated. These metrics essentially help us understand the GBM's overall performance and how it behaves when predicting:\n", + "\n", + "* Default correctly\n", + "* Non-default correctly\n", + "* Default incorrectly (type I errors)\n", + "* Non-default incorrectly (type II errors)\n", + "\n", + "In a real-life lending scenario, type I errors essentially amount to false accusations of financial impropriety and type II errors result in awarding loans to undeserving customers. Both types of errors can be costly to the lender too. Type I errors likely result in lost interest and fees. Type II errors often result in write-offs." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Calculate and report group fairness metrics" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
       | Prevalence | Accuracy | True Positive Rate | Precision | Specificity | Negative Predicted Value | False Positive Rate | False Discovery Rate | False Negative Rate | False Omissions Rate
female | 0.207212 | 0.785100 | 0.590106 | 0.484761 | 0.836066 | 0.886414 | 0.163934 | 0.515239 | 0.409894 | 0.113586
male | 0.250503 | 0.757837 | 0.586682 | 0.514602 | 0.815042 | 0.855072 | 0.184958 | 0.485398 | 0.413318 | 0.144928
\n", + "
" + ], + "text/plain": [ + " Prevalence Accuracy True Positive Rate Precision Specificity \\\n", + "female 0.207212 0.785100 0.590106 0.484761 0.836066 \n", + "male 0.250503 0.757837 0.586682 0.514602 0.815042 \n", + "\n", + " Negative Predicted Value False Positive Rate False Discovery Rate \\\n", + "female 0.886414 0.163934 0.515239 \n", + "male 0.855072 0.184958 0.485398 \n", + "\n", + " False Negative Rate False Omissions Rate \n", + "female 0.409894 0.113586 \n", + "male 0.413318 0.144928 " + ] + }, + "execution_count": 15, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "cm_dict = {'male': male_cm, 'female': female_cm} # group fairness metrics are based on confusion matrices\n", + "metrics_frame, disp_frame = debug.get_metrics_ratios(cm_dict, 'male') # calculate metrics and ratios\n", + "metrics_frame # display results" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From eyeballing the raw metrics it appears that the model is treating men and women roughly similarly as groups. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Plot false omissions rate by `SEX`" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Because the confusion matrices indicated there might be a problem with non-default predictions, false omissions rate will be examined closely. False omissions measures how many customers the model predicted *incorrectly* would not default, out of the customers in the group the model *predicted* would not default. " + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAEhCAYAAACTNXDdAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAGAVJREFUeJzt3Xu0ZGV95vHvY7egCIJCGxVoGwImadRlpGw18ZIRL40jtLOCCZAsMUMWY5RlRo0Rx5gQTOJlEplZIxNlBpXgEEQmOj2KkgxMYsYLcpoQzIH0eESE5jI2dxG5NPzmj9qtRVndp87p012nz/v9rHUWtd/97r1/Re1+9q53165KVSFJasNjJl2AJGnXMfQlqSGGviQ1xNCXpIYY+pLUEENfkhpi6GuXSPKpJKdPuo7ZJHlvko/uwPInJfniQtYkLSRDX3OS5PokP0xy78Df0ydYz4uT/G1Xx91J/keSn53v+qrqfVX1ph1Y/tyqOnq+y89Hkk0Dr8mtST6e5AljLntYEm/WaYihr/k4pqr2Hvi7eRJFJHkJ8CXgIuCpwKHANcBXkqyaRE0TdHRV7Q08D1gD/O6E69EiZehrQSR5TJKLujPNu7qz75/bRt+nJLm463dHki8PzDsoyWeTbE7ynSRv2c5mPwR8vKo+UlX3VtXtVfVu4Erg97v1vaJ7d/Lubp03JzkmyWuTfKvb/o8CMskfJflk93ivJOcnub2r9RtJDujmndyt9/tJrktyfNf+m0n+dmB9L04y1b0L+UaSFwzM+z9J/jDJV7v1fCnJk2fb9vZ0B+C/Bp47sJ1jk1yV5J4kNyR578AiX+76bH3X9vyB5/HPSe5M8sUkB8+2be0eDH0tpM8Dh9M/6/4n4Lxt9HsncB2wouv7e9A/cHTruAI4EHgl8M4kRw2vIMk+wAuAz4xY/4XdslsdRH9ffzrwPuAc4Hjg54FfAs5IsnLEen4D2Ktbfn/gzcD9SZ4IfBh4ZVXtA/wicPWIGg8AvgD8Wbf8fwIuTvKkgW4nAicBPwU8AXj79rY9osbhbR4MrAVmBprvBX4N2A84BvjtJK/t5r0UYOBd2xVJfpn+a7SO/mt0OXD+bNvW7sHQ13x8rjv7vCvJ5wCq6pGq+mRVfb+q7gdOB47cxtjyQ/QDeGVVPVhVW8/0XwQ8sar+pGuf4ccBPWx/IMAtI+bdAgyeFd8PfKCqHgIuoB9kZ3bvDq4GNgLP2UadBwCHVdXDVTVVVfd28wp4VpLHVdUtVXXNiOWPAaar6i+raktVnUf/YPcvB/qcU1Xfqqr76B/Atp6hb2/bo3w+yfeBG4BNwBlbZ1TVZVU13b1G/9j9P3jZdtb1JuBPqmpjVW0B/ghYk+TA7Syj3YShr/l4XVXt1/29DiDJsiQf6oY67uHHZ5qjhiQ+AHwXuDTJt5O8s2t/BrBy4IByF/2x6aeOWMcd9IP3aSPmPQ24bWD6tqp6uHv8w+6//29g/g+BvUes55PA/wIuTHJTkg8kWV5V9wAnAG8Bbk3y+STPHLH807vnOei79N/FbHXrwOP7BuoYue0R29jqtd27jqOA1cCTt85I8qJuuG1zkruB32T067LVM4CzBl6D24BH6L/r0G7O0NdCeQPwGuDlwL7AYV17hjtW1T1V9baqWgW8DnhXkpcBNwLfGjig7FdV+1TVMaPWAXwDeP2IWn4FuHRHn1D3buP0qvo54MXAv6I/TEJVfbGqXkH/ADMDfGzEKm6mH6CDVgI37ci2Z1nuMuC/Af9+oPkC4L8DB1fVvsB/5cevy6hP7twInDz0Ojy+qi6fbfta/Ax9LZR9gAeA2+mPRf/xtjp2F1J/OkmAu4GH
6Z9Jfg14MMk7kjyue/fw7CRHbmNV7wJOTvKWJHsneXKS9wM9BoY35ivJy5M8q7vWcA/9IZdHkjytew57AQ8CP+jqH/Z54Igkv5pkeZIT6R8MvzDfbY9Z+pnAa5I8q5veB7ijqu5P8kIePVz2PaCSHDrQ9lHgPekuxCfZL8lxY25bi5yhr4XyCfpntjcD08BXt9P3Z4DL6F9g/ArwH6vq77vx49fQ/8jh9fSHFT4GPHHUSqrq74Cj6Z/Z39ot8yzgF6vquh1+Rv3hmb+iH7rT9IdbzgeW0b/QeQv9g9wv0B/qGa5vM3As/YPT7cDb6A/D3LkD255VVd1K/2x/66d0fgt4fzfm/+/oX+je2vf7wPuBy7vhnF5VfYb+herPdEN1VwOvHmfbWvzij6hIUjs805ekhhj6ktQQQ1+SGmLoS1JDDH1Jasj27vCbiAMOOKBWrVo16TIkabeyYcOG26pqxWz9Fl3or1q1iqmpqUmXIUm7lSTDX/kxksM7ktQQQ1+SGmLoS1JDDH1JaoihL0kNMfQlqSGGviQ1xNCXpIYsupuzJO24/MSPVGq+ltpPjnimL0kNMfQlqSGGviQ1ZKzQT7I2ycYkM0lOGzH/pUmuTLIlyXEj5j8xyaYkH1mIoiVJ8zNr6CdZBpwFHA2sBk5Isnqo2w3AG4Hzt7Ga9wFfnn+ZkqSFMM6Z/hpgpqquq6oHgQuAdYMdqur6qroaeGR44SRHAj8F/PUC1CtJ2gHjhP6BwI0D05u6tlkleQzwZ8DvzNLvlCRTSaY2b948zqolSfOwsy/kvhm4uKo2ba9TVZ1dVb2q6q1YMesPv0iS5mmcm7NuAg4emD6oaxvHi4CXJHkzsDewR5J7q+onLgZLkna+cUL/CuDwJIfQD/vjgRPHWXlV/drWx0neCPQMfEmanFmHd6pqC3AqcAlwLXBhVU0nOSPJsQBJnp9kE/B64GNJpndm0ZKk+Uktsi+W6PV65Q+jSzvG795ZOIssIrcpyYaq6s3WzztyJakhhr4kNcTQl6SGGPqS1BBDX5IaYuhLUkMMfUlqiKEvSQ0x9CWpIYa+JDXE0Jekhhj6ktQQQ1+SGmLoS1JDDH1JaoihL0kNMfQlqSGGviQ1xNCXpIaMFfpJ1ibZmGQmyWkj5r80yZVJtiQ5bqD9uUm+lmQ6ydVJfnUhi5ckzc2soZ9kGXAWcDSwGjghyeqhbjcAbwTOH2q/D3hDVR0BrAX+Q5L9drRoSdL8LB+jzxpgpqquA0hyAbAOuGZrh6q6vpv3yOCCVfV/Bx7fnOR7wArgrh2uXJI0Z+MM7xwI3Dgwvalrm5Mka4A9gG/PdVlJ0sLYJRdykzwNOA/4jap6ZMT8U5JMJZnavHnzrihJkpo0TujfBBw8MH1Q1zaWJE8EvgC8p6q+PqpPVZ1dVb2q6q1YsWLcVUuS5mic0L8CODzJIUn2AI4H1o+z8q7/Z4G/qKqL5l+mJGkhzBr6VbUFOBW4BLgWuLCqppOckeRYgCTPT7IJeD3wsSTT3eK/ArwUeGOSq7q/5+6UZyJJmlWqatI1PEqv16upqalJlyHt1pJJV7B0LLKI3KYkG6qqN1s/78iVpIYY+pLUEENfkhpi6EtSQ8b5GgaN4IWyhbW7XCyTdnee6UtSQwx9SWqIoS9JDTH0Jakhhr4kNcTQl6SGGPqS1BBDX5IaYuhLUkMMfUlqiKEvSQ0x9CWpIYa+JDXE0Jekhhj6ktSQsUI/ydokG5PMJDltxPyXJrkyyZYkxw3NOynJt7q/kxaqcEnS3M0a+kmWAWcBRwOrgROSrB7qdgPwRuD8oWWfDPwB8AJgDfAHSZ6042VLkuZjnDP9NcBMVV1XVQ8CFwDrBjtU1fVVdTXwyNCyrwb+pqruqKo7gb8B1i5A3ZKkeRgn9A8EbhyY3tS1jWOsZZOckmQqydTmzZvHXLUkaa4WxYXcqjq7qnpV1VuxYsWky5GkJWuc0L8JOHhg+qCubRw7sqwkaYGNE/pXAIcnOSTJHsDxwPox138J8KokT+ou4L6qa5MkTcCsoV9VW4BT6Yf1tcCFVTWd5IwkxwIkeX6STcDrgY8lme6WvQN4H/0DxxXAGV2bJGkCUlWTruFRer1eTU1NTbqMWSWTrmBpWWS74W7P/XPh7C77ZpINVdWbrd+iuJArSdo1DH1JaoihL0kNMfQlqSGGviQ1xNCXpIYY+pLUEENfkhpi6EtSQwx9SWqIoS9JDTH0Jakhhr4kNcTQl6SGGPqS1BBDX5IaYuhLUkMMfUlqiKEvSQ0ZK/STrE2yMclMktNGzN8zyae7+ZcnWdW1PzbJuUm+meTaJO9e2PIlSXMxa+gnWQacBRwNrAZOSLJ6qNvJwJ1VdRhwJvDBrv31wJ5V9WzgSODfbD0gSJJ2vXHO9NcAM1V1XVU9CFwArBvqsw44t3t8EXBUkgAFPCHJcuDxwIPAPQtSuSRpzsYJ/QOBGwemN3VtI/tU1RbgbmB/+geAHwC3ADcAf1pVd+xgzZKkedrZF3LXAA8DTwcOAd6R5NDhTklOSTKVZGrz5s07uSRJatc4oX8TcPDA9EFd28g+3VDOvsDtwInAl6rqoar6HvAVoDe8gao6u6p6VdVbsWLF3J+FJGks44T+FcDhSQ5JsgdwPLB+qM964KTu8XHAZVVV9Id0Xg6Q5AnAC4F/XojCJUlzN2vod2P0pwKXANcCF1bVdJIzkhzbdTsH2D/JDPB2YOvHOs8C9k4yTf/g8Ymqunqhn4QkaTzpn5AvHr1er6ampiZdxqySSVewtCyy3XC35/65cHaXfTPJhqr6ieHzYd6RK0kNMfQlqSGGviQ1xNCXpIYY+pLUEENfkhpi6EtSQwx9SWqIoS9JDTH0Jakhhr4kNcTQl6SGGPqS1BBDX5IaYuhLUkMMfUlqiKEvSQ0x9CWpIYa+JDXE0JekhowV+knWJtmYZCbJaSPm75nk0938y5OsGpj3nCRfSzKd5JtJHrdw5UuS5mLW0E+yDDgLOBpYDZyQZPVQt5OBO6vqMOBM4IPdssuBTwFvqqojgF8CHlqw6iVJczLOmf4aYKaqrquqB4ELgHVDfdYB53aPLwKOShLgVcDVVfWPAFV1e1U9vDClS5LmapzQPxC4cWB6U9c2sk9VbQHuBvYHnglUkkuSXJnkd3e8ZEnSfC3fBet/MfB84D7g0iQbqurSwU5JTgFOAVi5cuVOLkmS2jXOmf5NwMED0wd1bSP7dOP4+wK3039X8OWquq2q7gMuBp43vIGqOruqelXVW7FixdyfhSRpLOOE/hXA4UkOSbIHcDywfqjPeuCk7vFxwGVVVcAlwLOT7NUdDF4GXLMwpUuS5mrW4Z2q2pLkVPoBvgz4eFVNJzkDmKqq9cA5wHlJZoA76B8YqKo7k3yY/oGjgIur6gs76blIkmaR/gn54tHr9WpqamrSZcwqmXQFS8si2w13e+6fC2d32Te766W92fp5R64kNcTQl6SGGPqS1BBDX5IaYuhLUkMMfUlqiKEvSQ0x9CWpIYa+JDXE0Jekhhj6ktQQQ1+SGmLoS1JDDH1JaoihL0k
NMfQlqSGGviQ1xNCXpIYY+pLUkLFCP8naJBuTzCQ5bcT8PZN8upt/eZJVQ/NXJrk3ye8sTNmSpPmYNfSTLAPOAo4GVgMnJFk91O1k4M6qOgw4E/jg0PwPA1/c8XIlSTtinDP9NcBMVV1XVQ8CFwDrhvqsA87tHl8EHJUkAEleB3wHmF6YkiVJ8zVO6B8I3DgwvalrG9mnqrYAdwP7J9kbeBfwhzteqiRpR+3sC7mnA2dW1b3b65TklCRTSaY2b968k0uSpHYtH6PPTcDBA9MHdW2j+mxKshzYF7gdeAFwXJIPAfsBjyS5v6o+MrhwVZ0NnA3Q6/VqPk9EkjS7cUL/CuDwJIfQD/fjgROH+qwHTgK+BhwHXFZVBbxka4ckpwP3Dge+JGnXmTX0q2pLklOBS4BlwMerajrJGcBUVa0HzgHOSzID3EH/wCBJWmTSPyFfPHq9Xk1NTU26jFn1P5ukhbLIdsPdnvvnwtld9s0kG6qqN1s/78iVpIYY+pLUEENfkhpi6EtSQwx9SWqIoS9JDTH0Jakhhr4kNcTQl6SGGPqS1BBDX5IaYuhLUkMMfUlqiKEvSQ0x9CWpIYa+JDXE0Jekhhj6ktQQQ1+SGmLoS1JDxgr9JGuTbEwyk+S0EfP3TPLpbv7lSVZ17a9MsiHJN7v/vnxhy5ckzcWsoZ9kGXAWcDSwGjghyeqhbicDd1bVYcCZwAe79tuAY6rq2cBJwHkLVbgkae7GOdNfA8xU1XVV9SBwAbBuqM864Nzu8UXAUUlSVf9QVTd37dPA45PsuRCFS5LmbpzQPxC4cWB6U9c2sk9VbQHuBvYf6vPLwJVV9cD8SpUk7ajlu2IjSY6gP+Tzqm3MPwU4BWDlypW7oiRJatI4Z/o3AQcPTB/UtY3sk2Q5sC9wezd9EPBZ4A1V9e1RG6iqs6uqV1W9FStWzO0ZSJLGNk7oXwEcnuSQJHsAxwPrh/qsp3+hFuA44LKqqiT7AV8ATquqryxU0ZKk+Zk19Lsx+lOBS4BrgQurajrJGUmO7bqdA+yfZAZ4O7D1Y52nAocBv5/kqu7vKQv+LCRJY0lVTbqGR+n1ejU1NTXpMmaVTLqCpWWR7Ya7PffPhbO77JtJNlRVb7Z+3pErSQ0x9CWpIYa+JDXE0Jekhhj6ktQQQ1+SGmLoS1JDDH1JaoihL0kNMfQlqSGGviQ1xNCXpIYY+pLUEENfkhpi6EtSQwx9SWqIoS9JDTH0Jakhhr4kNcTQl6SGjBX6SdYm2ZhkJslpI+bvmeTT3fzLk6wamPfurn1jklcvXOmSpLmaNfSTLAPOAo4GVgMnJFk91O1k4M6qOgw4E/hgt+xq4HjgCGAt8J+79UmSJmCcM/01wExVXVdVDwIXAOuG+qwDzu0eXwQclSRd+wVV9UBVfQeY6dYnSZqA5WP0ORC4cWB6E/CCbfWpqi1J7gb279q/PrTsgcMbSHIKcEo3eW+SjWNVr3EcANw26SJmk0y6Ak3Iot8/d6N98xnjdBon9He6qjobOHvSdSxFSaaqqjfpOqRR3D93vXGGd24CDh6YPqhrG9knyXJgX+D2MZeVJO0i44T+FcDhSQ5Jsgf9C7Prh/qsB07qHh8HXFZV1bUf33265xDgcOAbC1O6JGmuZh3e6cboTwUuAZYBH6+q6SRnAFNVtR44BzgvyQxwB/0DA12/C4FrgC3AW6rq4Z30XDSaw2ZazNw/d7H0T8glSS3wjlxJaoihL0kNMfQlqSGGvqRdKsnjk/zMpOtolaG/xCR5ZpJLk/xTN/2cJL836bokgCTHAFcBX+qmn5tk+CPg2okM/aXnvwDvBh4CqKqr6T5CKy0Cp9P//q27AKrqKuCQSRbUGkN/6dmrqoZvgNsykUqkn/RQVd091ObnxnehRfHdO1pQtyX5abp/SEmOA26ZbEnSj0wnORFYluRw4K3AVydcU1O8OWuJSXIo/bscfwG4E/gO8OtVdf0k65IAkuwFvAd4FRD6d/q/r6run2hhDTH0l6gkTwAeU1Xfn3QtkhYPQ3+JSPL27c2vqg/vqlqkYUn+J9sZu6+qY3dhOU1zTH/p2GfSBUjb8aeTLkB9nulLUkM8019ikjyO/g/VHwE8bmt7Vf3riRUldbpP7LwfWM2j989DJ1ZUY/yc/tJzHvBU4NXA39H/tTIv5mqx+ATw5/TvHfkXwF8An5poRY1xeGeJSfIPVfXzSa6uquckeSzw91X1wknXJiXZUFVHJvlmVT17sG3StbXC4Z2l56Huv3cleRZwK/CUCdYjDXogyWOAb3W/yHcTsPeEa2qKwztLz9lJngS8l/5vFF8DfGiyJUk/8tvAXvTvxD0S+HXgDROtqDEO70jaZZL06N+R+wzgsV1zVdVzJldVWwz9JSbJfvTPnFYxMHxXVW+dVE3SVkk2Au8Evgk8srW9qr47saIa45j+0nMx8HWG/lFJi8TmqvL78yfIM/0lJsmVVfW8SdchjZLkKOAE4FLgga3tVfVXEyuqMYb+EpPkbcC9wOd59D+qOyZWlNRJ8ingZ4FpfvxOtLx5cNcx9JeYJG8B/pj+LxNtfXHLOx61GCTZWFX+Pu4EOaa/9LwDOKyqbpt0IdIIX02yuqqumXQhrTL0l54Z4L5JFyFtwwuBq5J8h/7wY/Ajm7uUob/0/ID+P6r/zaPH9P3IphaDtZMuoHWG/tLzue5PWnT8PP7keSF3CUryeGBlVW2cdC2SFhe/e2eJSXIMcBXwpW76uUm8GUYSYOgvRacDa+h/ZJOqugrw45qSAEN/KXqoqu4eavPrGCQBXshdiqaTnAgs636a7q3AVydck6RFwjP9JSLJed3Db9P/fdwHgL8E7gH+7aTqkrS4+OmdJSLJNcArgC/S/+3RR/G7dySBwztLyUfpf3PhocDUQHvofwePF3Mleaa/1CT586r6rUnXIWlxMvQlqSFeyJWkhhj6ktQQQ1+SGmLoS1JDDH1Jasj/B0V3Cso/SUYUAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "_ = metrics_frame['False Omissions Rate'].plot(kind='bar', color='b', title='False Omissions Rate')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Calculate and report disparity\n", + "To calculate disparity we compare the metrics to a user-defined reference level and to user-defined thresholds. 
In this case, we take the class of people who seem most priviledged as the reference level, i.e. `SEX = male`. (Usually the reference level would be `race = white` or `sex = male`.) According to the four-fifths rule (https://en.wikipedia.org/wiki/Disparate_impact#The_80%_rule) thresholds are set such that metrics 20% lower or higher than the reference level metric will be flagged as disparate. **Technically, the four-fifths rule only applies to the adverse impact ratio, discussed further below, but it will be applied to all other displayed metrics here as a rule of thumb.**" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
       | Prevalence Ratio | Accuracy Ratio | True Positive Rate Ratio | Precision Ratio | Specificity Ratio | Negative Predicted Value Ratio | False Positive Rate Ratio | False Discovery Rate Ratio | False Negative Rate Ratio | False Omissions Rate Ratio
female | 0.827183 | 1.03597 | 1.00584 | 0.94201 | 1.02579 | 1.03665 | 0.886334 | 1.06148 | 0.991716 | 0.783745
male | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
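The ratio table above comes from the course's `debug.get_metrics_ratios` helper, whose internals are not shown in this notebook. As a rough illustration of the idea only, the hedged sketch below divides each group's metrics by the reference group's metrics and flags ratios outside a user-defined band; the function name and signature are hypothetical, not the notebook's API.

```python
# Hypothetical sketch of the ratio-and-threshold check described above;
# not the implementation of debug.get_metrics_ratios.
import pandas as pd

def flag_disparity(metrics_frame: pd.DataFrame, reference: str,
                   low: float = 0.8, high: float = 1.25):
    """Divide each group's metrics by the reference group's metrics and
    mark ratios that fall outside [low, high]."""
    ratios = metrics_frame.div(metrics_frame.loc[reference], axis=1)
    flags = (ratios < low) | (ratios > high)  # rule-of-thumb band from the text above
    return ratios, flags

# Usage with the metrics_frame computed earlier in this notebook:
# ratios, flags = flag_disparity(metrics_frame, reference='male')
```

Applied to the metrics above, only the female False Omissions Rate ratio (roughly 0.78) falls outside the 0.8 to 1.25 band, which matches the single flagged value discussed below.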
" + ], + "text/plain": [ + "" + ] + }, + "execution_count": 17, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "parity_threshold_low = 0.8 # user-defined low threshold value\n", + "parity_threshold_hi = 1.25 # user-defined high threshold value\n", + "\n", + "\n", + "# small utility function to format pandas table output\n", + "def disparate_red(val):\n", + " \n", + " color = 'blue' if (parity_threshold_low < val < parity_threshold_hi) else 'red'\n", + " return 'color: %s' % color \n", + "\n", + "# display results\n", + "disp_frame.style.applymap(disparate_red)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For the selected thresholds, the GBM appears to have only one value for metrics that is low out-of-range, false omissions rate. The flagged false omissions rate disparity indicates males may be receiving too many loans they cannot pay back, potentially preventing females from recieving these loans." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Plot false omissions rate disparity" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAEhCAYAAACEF+AUAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAGMtJREFUeJzt3Xu8XWV95/HPV9AKgqIkViWJwQooRSbqGRRsRyxYCcPFeckgKCgdNIg6iqKjULAUEKz3aQtIRm1aGEVE6wAG6YBSrXgh1IgGDaZ4SRAkRK4FBfTXP9YK7nNJzk6yk52z8nm/XueVtdfz7PX8ss863/3sZ99SVUiSuuVRwy5AkjR4hrskdZDhLkkdZLhLUgcZ7pLUQYa7JHWQ4a5HJLkwyWnDrmMySU5N8rENuP5rk1wxyJq2NEk+nuTkYdehNTPcOyjJT5I8kOS+np+nDbGeP0pyTVvH3Un+X5Jnre/xquqMqnrDBlz/76tq7vpef30kWdHzO7ktySeTPK7P6z4zyXq/IWX19dux703y4yTvXIfrvy7JNb37qup1VXXW+takjc9w766Dq2q7np+fD6OIJH8MfAm4BHgK8AzgRuDrSWYPo6YhmltV2wHPA/YC/temHLw9D7YHjgD+MslLNuX42rQM9y1IkkcluaSdOd7VzqafvYa+T06ysO33yyRf7WmbkeQfk6xsZ4FvWsuw7wc+WVV/W1X3VdWqqjoJ+FfgPe3x9m8fbZzUHvPnSQ5OclCSH7XjPxKESc5MsqDd3jbJp5Ksamv9dpJpbdux7XHvTXJzkiPa/aNmou0ji0Xto4pvJ3lBT9u/JPnLJNe2x/lSkidNNvbatHe0/wTM6RnnkCSLk9yT5GdJTu25ylfbPqsfhf3nnv/HD5PcmeSKJDMnG7sd/1vAD8eMf0p7G92bZEmSQ9r9zwH+Fvjjduw72v2jlvCSvCHJsva2+EKSp/ZTizYew33LczmwC80s+vvABWvo907gZmB62/cUaO4g2mNcB+wEvBR4Z5L9xh4gyfbAC4DPTnD8i9vrrjaD5nx8GnAG8AmaGeZzgX2B05PMmuA4fwZs215/R+CNwK+SPB74MPDSdrb6IuCGCWqcBnwR+FB7/b8BFiZ5Yk+3VwGvBX4feBzw9rWNPUGNY8ecCRwALOvZfR/wamAH4GDgrUkOatv+Czwy896uqq5L8gqa39GhNL+jbwGf6mPsJHkR8Owx499Ecxs9AXgv8Kkkv19V3wPeDHytHXvcnVeSPwVOBw6jOSd+DvzfyWrRxmW4d9cX2tnkXUm+AFBVv62qBVV1b1X9CjgNeP4a1n4fognaWVX1YFWtnrnvDTy+qs5q9y/jd0E81o5AgFsnaLsV6A2KXwHvq6qHgItoAusj7Wz/BmApsOca6pwGPLOqflNVi6rqvratgD2SPLaqbq2qGye4/sHAkqr6dFU9XFUX0Nyp/deePp+oqh9V1f00d1SrZ7xrG3silye5F/gZsIImEJtCq75cVUva39F329vgxWs51huAs6pqaVU9DJwJ7JVkpzVdIcldwP3AvwB/TXMnvXr8i9vb6LdV9SngJ8DIWsbv9Wrg41W1uD2v3g28OMmMPq+vjcBw766XV9UO7c/LAZJsleT97cPve/jdzG2ipYT3AT8Frk7ybz1PwD0dmNVzx3EXzdrxUyY4xi9pAnaih+hPBe7ouXxHVf2m3X6g/fcXPe0PANtNcJwFwFXAxUluSfK+JFtX1T3AkcCbgNuSXJ5k1wmu/7T2/9nrpzQz0NVu69m+v6eOCceeYIzVDmofRewH7A48aXVDkr3bZbKVSe4GXsfEv5fVng6c0/M7uAP4Lc2jiAlV1Q5t7e+ieTT0SK1Jjkny3Z7jPWuS8XuNug3b2/5ORt+G2sQM9y3La4ADgT+hefj9zHZ/xnasqnuq6m1VNRt4OfCuJC8GlgM/6rnj2KGqtq+qgyc6BvBt4L9PUMvhwNUb+h9qHz2cVlXPBv4I+G80M0mq6oqq2p/mjmQZcP4Eh/g5TVD2mgXcsiFjT3K9L9MsW3ygZ/dFwOeAmVX1BODj/O73MtErZZYDx475PWzTrqevbezfVNX722MeB5DkGcB5wPHAju2dwA8nGb/XqNuwXY57In3chtp4DPcty/bAr4FVNGvF711Tx/YJzT9IEuBu4Dc0M8NvAA8mOTHJY9tHA89J8vw1HOpdwLFJ3pRkuyRPSnI2zUP+09dwnb4l+ZMke7TPBdxDs1Ty2yRPbf8P2wIPAv/e1j/W5cAfJnllkq2TvIrmTu+L6zt2n6V/BDgwyR7t5e2BX1bVr5K8kNHLXL
cD1Ybwah8D/jztE+JJdkhyWJ9jQ/PI7F1JHkMzmy9gZXOovJ5m5r7aL4AZSR69hmN9muZ3vGeS3wPOplmjX7EO9WjADPcty9/RzLJ+DiwBrl1L392AL9M80fd14H9X1dfa9d0DaV7K9xOa5YDzgcdPdJCq+mdgLs1M/bb2OnsAL6qqmzf4f9QsCXyeJlyX0CyTfArYiuYJx1tp7sz2oVmiGVvfSuAQmjuhVcDbaJZP7tyAsSdVVbfRzN5XvyrmeODsdk3+ZJonnFf3vZcmML/VLpuMVNVnaZ4w/my7xHYD8LJ+xm5dSrPEdGz7nMbf0DzKupXmd9/7COD/Az8CfpHktrEHqqov0dxR/2N7/Vn08QhGG1f8sg5J6h5n7pLUQYa7JHWQ4S5JHWS4S1IHGe6S1EFrezfdRjVt2rSaPXv2sIaXpCnp+uuvv6Oqpk/Wb2jhPnv2bBYtWjSs4SVpSkoy9uMyJuSyjCR1kOEuSR1kuEtSBxnuktRBhrskdZDhLkkdZLhLUgcN7XXuLF0K++47et/hh8Mb3wj33w8HHjj+Oscc0/zccQccNsH3Ehx/PLzylbB8ORx99Pj2E0+Egw9uxj7uuPHtp5wC++8PixfDCSeMbz/rLNhnH7j2Wjj55PHtH/0ozJkDV10FZ545vv3882G33eCyy+BDHxrffsEFMHMmfOYzcN5549svuQSmTYMFC5qfsRYuhG23hXPPhYsvHt9+zTXNvx/8IFx++ei2bbaBK65ots84A64e8yVJO+4In/tcs33SSfCNb4xunzEDLryw2T7hhOY27LXrrjB/frM9bx7cdNPo9jlzmtsP4KijYMWY73nYe284++xm+xWvgFWrRrfvtx+c2n40+ty58MADo9sPOgje8Y5me+x5B557nnvN9lQ899bAmbskddDQvqxjZGSkfIeqJK2bJNdX1chk/Zy5S1IHGe6S1EGGuyR10KThnuSTSW5P8v01tL86yQ1Jvpfk2iT/afBlSpLWRT8z9wXAAWtp/zHw4qp6DnAGMH8AdUmSNsCkr3Ovqq8mmb2W9mt7Ln4TmLHhZUmSNsSg19yPBa5YU2OSeUkWJVm0cuXKAQ8tSVptYOGe5CU04f6uNfWpqvlVNVJVI9OnT/otUZKk9TSQjx9IsifwcWBuVa2arL8kaePa4Jl7klnA54Gjq+qmyfpLkja+SWfuST4N7AtMS7IC+Avg0QBV9THgPcCOwLlJAB7u562xkqSNp59Xyxw5SfvrgNcNrCJJ0gbzHaqS1EGGuyR1kOEuSR1kuEtSBxnuktRBhrskdZDhLkkdZLhLUgcZ7pLUQYa7JHWQ4S5JHWS4S1IHGe6S1EGGuyR1kOEuSR1kuEtSBxnuktRBhrskdZDhLkkdZLhLUgcZ7pLUQYa7JHWQ4S5JHWS4S1IHTRruST6Z5PYk319De5L8dZJlSW5I8rzBlylJWhf9zNwXAAespX0usEv7Mw84b8PLkiRtiEnDvaq+CvxyLV0OBf6hGt8Edkjy1EEVKElad4NYc98JWN5zeUW7T5I0JFtvysGSzKNZumHWrFmbcmipc5JhV9AtVcOuYLAGMXO/BZjZc3lGu2+cqppfVSNVNTJ9+vQBDC1Jmsggwv1S4DXtq2ZeCNxdVbcO4LiSpPU06bJMkk8D+wLTkqwA/gJ4NEBVfQxYCBwILAPuB/5sYxUrSerPpOFeVUdO0l7AmwZWkSRpg/kOVUnqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYMMd0nqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYP6CvckByRZmmRZkndP0D4ryVeSfCfJDUkOHHypkqR+TRruSbYCzgHmArsDRybZfUy3U4CLq+q5wBHAuYMuVJLUv35m7nsBy6rq5qp6ELgIOHRMnwIe324/Afj54EqUJK2rrfvosxOwvOfyCuAFY/qcBvxTkv8JPA7Yf9KjLl0K++47et/hh8Mb3wj33w8HTrCyc8wxzc8dd8Bhh41vP/54eOUrYflyOPro8e0nnggHH9yMfdxx49tPOQX23x8WL4YTThjfftZZsM8+cO21cPLJ49s/+lGYMweuugrOPHN8+/nnw267wWWXwYc+NL79ggtg5kz4zGfgvPPGt19yCUybBgsWND9jLVwI224L554LF188vv2aa5p/P/hBuPzy0W3bbANXXNFsn3EGXH316PYdd4TPfa7ZPukk+MY3RrfPmAEXXthsn3BCcxv22nVXmD+/2Z43D266aXT7nDnN7Qdw1FGwYsXo9r33hrPPbrZf8QpYtWp0+377wamnNttz58IDD4xuP+ggeMc7mu2x5x1MyXPvKz3NJ/BRvssc9uMqTmH8uXcc53MTu3EQl3Ei48+9o7mAFczkcD7D8Yw/9w7jElYxjdeygGNYMK79QBbyANtyPOdyOOPPvZdwTXMz8EEOYvS59wDbcCDNuXcKZ7Afo8+9VezIYTTn3lmcxN6MPvdWMIOjac69j3ACcxh97t3ErhxHc+6dzzx2ZfS5t5g5vI0pfO6twaCeUD0SWFBVM4ADgQuSjDt2knlJFiVZ9NBDDw1oaEnSWKmqtXdI9gZOq6qXtZdPAqiqs3v6LAEOqKrl7eWbgRdW1e1rOu7IyEgtWrRow/8H0hYqGXYF3TJJFG42klxfVSOT9etnWeY6YJckOwO30Dxh+qoxfX4G7AcsSPJs4LHAynUrefPkH9BgTZU/IGmqm3RZpqoeBt4MXAn8gOZVMUuSnJ7kkLbbicDrk3wX+DRwTE32kECStNH0M3OnqhYCC8fse0/P9o3AiwZbmiRpffkOVUnqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYMMd0nqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYP6CvckByRZmmRZknevoc/hSW5MsiTJpwZbpiRpXWw9WYckWwHnAC8FVgDXJbm0qm7s6bMLcBLwoqq6M8mTN1bBkqTJ9TNz3wtYVlU3V9WDwEXAoWP6vB44p6ruBKiq2wdbpiRpXfQT7jsBy3sur2j39doV2DXJ15N8M8kBgypQkrTuJl2WWYfj7ALsC8wAvprkOVV1V2+nJPOAeQCzZs0a0NCSpLH6mbnfAszsuTyj3ddrBXBpVT1UVT8GbqIJ+1Gqan5VjVTVyPTp09e3ZknSJPoJ9+uAXZLsnOQxwBHApWP6fIFm1k6SaTTLNDcPsE5J0jqYNNyr6mHgzcCVwA+Ai6tqSZLTkxzSdrsSWJXkRuArwDuratXGKlqStHapqqEMPDIyUosWLRrK2OsiGXYF3TKk062TPDcHa6qcm0mur6qRyfr5DlVJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYMMd0nqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6y
HCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYMMd0nqIMNdkjrIcJekDjLcJamD+gr3JAckWZpkWZJ3r6XfK5JUkpHBlShJWleThnuSrYBzgLnA7sCRSXafoN/2wFuBbw26SEnSuuln5r4XsKyqbq6qB4GLgEMn6HcG8FfArwZYnyRpPfQT7jsBy3sur2j3PSLJ84CZVfXFAdYmSVpPG/yEapJHAR8GTuyj77wki5IsWrly5YYOLUlag37C/RZgZs/lGe2+1bYH9gCuSfIT4IXApRM9qVpV86tqpKpGpk+fvv5VS5LWqp9wvw7YJcnOSR4DHAFcurqxqu6uqmlVNbuqZgPfBA6pqkUbpWJJ0qQmDfeqehh4M3Al8APg4qpakuT0JIds7AIlSetu6346VdVCYOGYfe9ZQ999N7wsSdKG8B2qktRBhrskdZDhLkkdZLhLUgcZ7pLUQYa7JHWQ4S5JHWS4S1IHGe6S1EGGuyR1kOEuSR1kuEtSBxnuktRBhrskdZDhLkkdZLhLUgcZ7pLUQYa7JHWQ4S5JHWS4S1IHGe6S1EGGuyR1kOEuSR1kuEtSBxnuktRBfYV7kgOSLE2yLMm7J2h/e5Ibk9yQ5OokTx98qZKkfk0a7km2As4B5gK7A0cm2X1Mt+8AI1W1J3AJ8P5BFypJ6l8/M/e9gGVVdXNVPQhcBBza26GqvlJV97cXvwnMGGyZkqR10U+47wQs77m8ot23JscCV0zUkGRekkVJFq1cubL/KiVJ62SgT6gmOQoYAT4wUXtVza+qkaoamT59+iCHliT12LqPPrcAM3suz2j3jZJkf+DPgRdX1a8HU54kaX30M3O/Dtglyc5JHgMcAVza2yHJc4HzgUOq6vbBlylJWheThntVPQy8GbgS+AFwcVUtSXJ6kkPabh8AtgM+m2RxkkvXcDhJ0ibQz7IMVbUQWDhm33t6tvcfcF2SpA3gO1QlqYMMd0nqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QOMtwlqYMMd0nqIMNdkjrIcJekDjLcJamDDHdJ6iDDXZI6yHCXpA4y3CWpgwx3Seogw12SOshwl6QO6ivckxyQZGmSZUnePUH77yX5TNv+rSSzB12oJKl/k4Z7kq2Ac4C5wO7AkUl2H9PtWODOqnom8BHgrwZdqCSpf/3M3PcCllXVzVX1IHARcOiYPocCf99uXwLslySDK1OStC627qPPTsDynssrgBesqU9VPZzkbmBH4I7eTknmAfPai/clWbo+RWtC0xhze2+OvMvfInluDtbT++nUT7gPTFXNB+ZvyjG3FEkWVdXIsOuQxvLcHI5+lmVuAWb2XJ7R7puwT5KtgScAqwZRoCRp3fUT7tcBuyTZOcljgCOAS8f0uRR4bbt9GPDlqqrBlSlJWheTLsu0a+hvBq4EtgI+WVVLkpwOLKqqS4FPABckWQb8kuYOQJuWy13aXHluDkGcYEtS9/gOVUnqIMNdkjrIcJekDjLcJW0USbZJstuw69hSGe5TVJJdk1yd5Pvt5T2TnDLsuiSAJAcDi4EvtZfnJBn7EmptRIb71PV/gJOAhwCq6gZ8Cao2H6fRfC7VXQBVtRjYeZgFbWkM96lr26r69ph9Dw+lEmm8h6rq7jH7fN31JrRJP1tGA3VHkj+g/YNJchhw63BLkh6xJMmrgK2S7AK8Bbh2yDVtUXwT0xSV5Bk07/zbB7gT+DFwVFX9ZJh1SQBJtgX+HPhTIDTvcD+jqn411MK2IIb7FJfkccCjqureYdciafNhuE8xSd6+tvaq+vCmqkUaK8llrGVtvaoO2YTlbNFcc596th92AdJafHDYBajhzF2SOsiZ+xSV5LE0X0z+h8BjV++vqv8xtKKkVvsKmbOB3Rl9fj5jaEVtYXyd+9R1AfAU4GXAP9N8Q5ZPqmpz8XfAeTTvvXgJ8A/AhUOtaAvjsswUleQ7VfXcJDdU1Z5JHg18rapeOOzapCTXV9Xzk3yvqp7Tu2/YtW0pXJaZuh5q/70ryR7AbcCTh1iP1OvXSR4F/Kj9JrdbgO2GXNMWxWWZqWt+kicCp9J8h+2NwPuHW5L0iLcC29K8M/X5wFHAa4Za0RbGZRlJA5dkhOYdqk8HHt3urqrac3hVbVkM9ykqyQ40M6HZ9CyvVdVbhlWTtFqSpcA7ge8Bv129v6p+OrSitjCuuU9dC4FvMuaPR9pMrKwqP799iJy5T1FJ/rWqnjfsOqSJJNkPOBK4Gvj16v1V9fmhFbWFMdynqCRvA+4DLmf0H88vh1aU1EpyIfAsYAm/e2RZvslu0zHcp6gkbwLeS/NNN6t/ieU7ALU5SLK0qvz+1CFyzX3qOhF4ZlXdMexCpAlcm2T3qrpx2IVsqQz3qWsZcP+wi5DW4IXA4iQ/plk2DL4UcpMy3Keuf6f54/kKo9fcfSmkNgcHDLuALZ3hPnV9of2RNju+nn34fEJ1CkuyDTCrqpYOuxZJmxc/W2aKSnIwsBj4Unt5ThLfNCIJMNynstOAvWheCklVLQZ8GaQkwHCfyh6qqrvH7PNjCCQBPqE6lS1J8ipgq/Yrzd4CXDvkmiRtJpy5TzFJLmg3/43m+1N/DXwauAc4YVh1Sdq8+GqZKSbJjcD+wBU03005ip8tIwlclpmKPkbzSXvPABb17A/NZ8z4pKokZ+5TVZLzqur4YdchafNkuEtSB/mEqiR1kOEuSR1kuEtSBxnuktRBhrskddB/AD95qWx9a1VrAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "ax = disp_frame['False Omissions Rate Ratio'].plot(kind='bar', color='b', title='False Omissions Rate Ratio')\n", + "_ = ax.axhline(parity_threshold_low, color='r', linestyle='--')\n", + "_ = ax.axhline(parity_threshold_hi, color='r', linestyle='--')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The model is suffering from a minor disparity problem due to it's propensity to make false negative predictions for males. 
To address such discrimination, users could tune the GBM variables, cutoff or regularization, could try new methods for reweighing data prior to model training, try new modeling methods specifically designed for fairness, or post-process the decisions. Before attempting remediation here, more traditional fair lending measures will be assessed and, local, or individual, fairness will also be investigated. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 6. Traditional Fair Lending Measures\n", + "\n", + "Along with adverse impact ratio (AIR), several measures have long-standing legal precedence in fair lending, including marginal effect and standardized mean difference. These measures are calculated and discussed here." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Calculate adverse impact ratio (AIR)\n", + "AIR is perhaps the most well-known discrimination measure. It was first delineated by the U.S. Equal Employment Opportunity Commission (EEOC) and AIR is associated with the convenient 4/5ths, or 0.8, cutoff threshold. AIR values below 0.8 can be considered evidence of illegal discrimination in many lending or employment scenarios in the U.S." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Male proportion accepted: 0.714\n", + "Female proportion accepted: 0.748\n", + "Adverse impact ratio: 1.047\n" + ] + } + ], + "source": [ + "print('Adverse impact ratio: %.3f' % debug.air(cm_dict, 'male', 'female'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Typical desirable ranges of AIR are above the 0.8 marker set by the 4/5ths rule. Here we see an almost ideal result where the protected and reference groups have very similar acceptance rates and AIR is near 1. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Calculate marginal effect\n", + "Marginal effect describes the difference between the percent of the reference group awarded a loan and the percent of the protected group awarded a loan under a model. " + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Male accepted: 71.44%\n", + "Female accepted: 74.78%\n", + "Marginal effect: -3.33%\n" + ] + } + ], + "source": [ + "print('Marginal effect: %.2f%%' % debug.marginal_effect(cm_dict, 'male', 'female'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "About 77% of men are awarded a loan by the model. About 79.6% of women are awarded a loan. This results in a marginal effect of -3.33%. Given that the marginal effect is negative, indicating that a higher percentage of individuals in the protected group were awarded a loan than in the reference group, this value would likely not indicate a discrimination problem in most scenarios. The magnitude of the marginal effect is also relatively small, another sign that discrimination concerning SEX is low under the model. Generally, larger marginal effects may be tolerated in newer credit products, whereas smaller marginal effects are expected in established credit products." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Calculate standardized mean difference\n", + "The standardized mean difference (SMD), i.e. 
Cohen's D, is the mean value of the prediction for the protected group minus the mean prediction for the reference group, all divided by the standard deviation of the prediction. Like AIR, SMD has some prescribed thresholds: 0.2, 0.5, and 0.8 for small, medium, and large differences, respectively. The standardized mean difference can also be used on continuous values like credit limits or loan amounts." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Male mean yhat: 0.23\n", + "Female mean yhat: 0.21\n", + "P_Default_Next_Month std. dev.: 0.18\n", + "Standardized Mean Difference: -0.08\n" + ] + } + ], + "source": [ + "print('Standardized Mean Difference: %.2f' % debug.smd(valid_yhat, 'SEX', yhat_name, 'male', 'female'))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For this model, in the validation set, men receive a higher average probability of default than do women. This difference is evident even after standardizing with the standard deviation of the predictions. However, the difference is quite small, below the 0.2 threshold for a small difference. SMD also points to low disparity between men and women under this model." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 7. Investigate Individual Disparity " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similar people can be treated differenly by the model, so even if the model is mostly fair for most kinds of people, there could still be people the model treated unfairly. This could occur for multiple reasons, including the functional form of the learned model or because different variables are combined by the model to represent strong signals. If a variable is important in a dataset, model, or problem domain it's likely that a nonlinear model will find combinations of other variables to act as proxies for the problematic variable -- potentially even different combinations for different rows of data! So by simply testing for group fairness, you may miss instances of individual discrimination." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Augment predictions with decisions and logloss residuals for women with false positive predictions\n", + "In this notebook, residuals for false positive predictions for women will be examined in an attempt to locate any individual instances of model discrimination. These are women who the model said would default, but they did not default. So they may have experienced some discrimination under the model." 
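Before the residual calculation in the next cell, it may help to spell out what a basic individual-level check can look like once the false positive predictions for women are isolated. The hedged sketch below is one possible approach, not the notebook's own method: for each flagged woman, find the most similar men on the model's input features and see whether near-identical applicants received very different predictions. The helper name, the standardization choice, and the `x_names` list (assumed to hold the numeric model inputs) are illustrative assumptions.

```python
# Hypothetical individual-level check; assumes x_names is a list of numeric
# model input columns and that valid_yhat / valid_yhat_female_fp exist as in
# the surrounding cells. Not the notebook's own implementation.
import numpy as np

def similar_counterexamples(row, candidates, x_names, n=5):
    """Return the n rows of candidates closest to row on standardized x_names."""
    X = candidates[x_names].astype(float)
    mu, sd = X.mean(), X.std().replace(0, 1)        # avoid divide-by-zero
    Z = (X - mu) / sd
    z_row = (row[x_names].astype(float) - mu) / sd
    dist = np.sqrt(((Z - z_row) ** 2).sum(axis=1))  # Euclidean distance
    return candidates.assign(distance=dist).nsmallest(n, 'distance')

# e.g., compare one flagged false-positive woman to her closest male peers:
# valid_yhat_male = valid_yhat[valid_yhat['SEX'] == 'male']
# similar_counterexamples(valid_yhat_female_fp.iloc[0], valid_yhat_male, x_names)
```

If a very similar man received a much lower predicted probability of default, that pair is a candidate case of individual discrimination worth reviewing by hand.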
+ ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [], + "source": [ + "valid_yhat_female = valid_yhat[valid_yhat['SEX'] == 'female'].copy(deep=True)\n", + "\n", + "\n", + "valid_yhat_female['d_DEFAULT_NEXT_MONTH'] = 0\n", + "valid_yhat_female.loc[valid_yhat_female[yhat_name] > best_cut, 'd_DEFAULT_NEXT_MONTH'] = 1\n", + "\n", + "valid_yhat_female['r_DEFAULT_NEXT_MONTH'] = -valid_yhat_female[y_name]*np.log(valid_yhat_female[yhat_name]) -\\\n", + " (1 - valid_yhat_female[y_name])*np.log(1 - valid_yhat_female[yhat_name]) \n", + " \n", + "valid_yhat_female_fp = valid_yhat_female[(valid_yhat_female[y_name] == 0) &\\\n", + " (valid_yhat_female['d_DEFAULT_NEXT_MONTH'] == 1)]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Plot logloss residuals\n", + "Residuals are a common way to visualize the errors of a model, and in just two dimensions." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfwAAAHxCAYAAACIzA7NAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzs3Xt4XGd57/3vveas0VmyFduyYztxQoyDQ3COUEiaBpJAG1qgkFI2p5JNC7u0bHa74W1LgW7a0hbKIRQC5OWluw19QyFNN6EQQtJwzgkEsSFOYjuyHMWOLI0ljea87v3HjBLFWPHY1ngkze9zXbo8s9aaNbdytfz0POs5mLsjIiIiy1vQ7AJERESk8RT4IiIiLUCBLyIi0gIU+CIiIi1AgS8iItICFPgiIiItQIEvskyY2RvM7DvNruNEmNk6M5s2s8g85//czP73An2Xm9npC3EvkaVAgS/SBGa2x8x+pdl1HC8zW18LzOnazx4z+58nel93H3b3dnevLESdIvKUaLMLEJElrdvdy2a2DfhPM7vP3W9rdlEi8ovUwhdZZMzsLWb2sJmNm9ktZrZ6zrkXm9mDZnbIzD5pZv9pZr8zz30uNrN7atfeY2YXzzn3BjPbZWZTZrbbzF5bO3567Z6HzGzMzP6lnprd/V5gO3DOnO9YbWb/amZP1L7j9+ecO9/M7jWzSTPbb2Yfrh2f7TmI1t5vqNUzZWa3Af1z7nGJmY0c9js/2XNS+47vm1nGzEbN7BNmFp/nv9VVZraj9j37zOxd9fzeIkuJAl9kETGzXwb+EvhNYBXwKPDF2rl+4EvAu4E+4EHg4nnu0wt8FfhY7doPA181sz4zS9eOX+nuHbV7/Lj20Q8A3wB6gEHg43XWfSGwBXi49j4A/h0YAtYAlwF/YGYvqX3ko8BH3b0TOA34/+e59T8D91EN+g8Ar6+nnpoK8Ie1z15Uq+H35rn2c8B/rf332AJ86xi+R2RJUOCLLC6vBW5w9/vdvUA13C8ys/XAVcB2d/+yu5ephvbj89znpcBD7v6P7l529xuBnwO/WjsfAlvMLOXuo+6+vXa8BJwKrHb3vLsfbRDgmJnlgO8DnwRurh0/D1jh7u9396K77wI+A7xmzvecbmb97j7t7j84/MZmtq52nz9194K730X1j4i6uPt97v6D2u+/B/g08KJ5Li8Bm82s090n3P3+er9HZKlQ4IssLquptuoBcPdp4CDVVvJqYO+ccw6MHH6DI92n5lFgjbtngVcDbwVGzeyrZvas2jV/BBhwt5ltN7M3HaXefqAd+O/AJUCsdvxUYHWtOz1jZhngPcBA7fybgTOAn9ceN7xsnt9holbv3N+hLmZ2hpn9HzN73MwmgQ8y55HAYV5B9Q+qR2uPEC6q93tElgoFvsji8hjVsASg1v3eB+wDRql2s8+es7nvn+k+Netq98Hdv+7ul1N9bPBzqq1v3P1xd3+Lu68G/ivwyaNNXXP3irt/GMjzVJf5XmC3u3fP+elw96tqn3nI3a8BVgJ/DXyp9rvONQr0HHZ83ZzXWaBtzn+PCLBizvl/qP1um2qPDt5D9Y+ZI/0O97j71bV6bmb+RwwiS5YCX6R5YmaWnPMTBW4E3mhm55hZgmqr9Ie1LumvAmeb2ctr174NOGWee98KnGFmv2VmUTN7NbAZ+D9mNmBmV9eCtABMU+3ix8xeZWazf0RMAD57rg5/BfyRmSWBu4EpM/tjM0uZWcTMtpjZebXv+W0zW+HuIZCpff5p3+PujwL3Au8zs7iZvYCnHkkA7ASSZvZSM4sBfwIk5pzvACaB6VoPxu8eqejavV9rZl3uXqp9pt7fWWTJUOCLNM+tQG7Oz5+7+zeBPwX+lWoL9zRqz73dfQx4FfAhqt38m6kGYuHwG7v7QeBlVLvaD1Ltqn9Z7R4B8E6qvQDjVJ9rz4bhecAPzWwauAV4R+35ez2+SvWPhLfU5tG/jOqo/d3AGPBZoKt27RXA9tr3fBR4jbvnjnDP3wIuqNX5XuALc37HQ1R7FD5Lteciy9Mfcbyr9vkpqj0YzzTj4HXAnlrX/1upjqUQWVas+hhQRJaa2kj4EeC17n5Hs+sRkcVNLXyRJcTMXmJm3bXu/tln0r8wwl1E5HAKfJGl5SLgEapd5L8KvHyernARkadRl76IiEgLUAtfRESkBSjwRUREWsCy2i2vv7/f169f3+wyRERETpr77rtvzN1XHO26ZRX469ev59577212GSIiIieNmdW15LS69EVERFqAAl9ERKQFKPBFRERagAJfRESkBSjwRUREWoACX0REpAUo8EVERFqAAl9ERKQFKPBFRERagAJfRESkBSjwRUREWoACX0REpAUo8EVERFqAAl9ERKQFKPBFRERaQLTZBYiIiLSC0UyOoZEM49kivek4Wwe7WdWdOmnfrxa+iIhIg41mcty2Yz+5YoX+9gS5YoXbduxnN
JM7aTWohS8iItJgdz64n91j05RDpyMRY31/Gx3JKEMjmZPWylcLX0REpIFGMzm+v2scw+hMxiiUQ3689xCFcoXxbPGk1aHAFxERaaChkQx96TgWgJmRikdIxSI8+PgUven4SatDgS8iItJA49kiZ57SSa4YkiuVcXeckLHpIlsHu09aHQp8ERGRBupNx0lEA85Z200iGjCZL+EOF5/Wp1H6IiIiy8XWwW6m8mViEeOctT2cu66HDf3tXHLmypNahwJfRESkgVZ1p7h88wCpeISx6QKpeITLNw+c1NY9aFqeiIhIw63qTp30gD+cWvgiIiItQIEvIiLSAhT4IiIiLUCBLyIi0gI0aE9ERGQBNXtXvPmohS8iIrJAFsOuePNR4IuIiCyQoZEMHckoHckYgRkdydiTu+I1mwJfRERkgYxni6QTT39ank5ET+quePNR4IuIiCyQ3nScbKH8tGPZQvmk7oo3HwW+iIjIApldN38qXyJ0ZypfYipfPqm74s1HgS8iIrJAFsu6+UeiaXkiIiILaDGsm38kauGLiIi0AAW+iIhIC1Dgi4iItAAFvoiISAtoaOCb2Vozu8PMdpjZdjN7xxGuMTP7mJk9bGY/MbNz55x7vZk9VPt5fSNrFRERWc4aPUq/DPx3d7/fzDqA+8zsNnffMeeaK4FNtZ8LgH8ALjCzXuC9wDbAa5+9xd0nGlyziIjIstPQFr67j7r7/bXXU8DPgDWHXXY18AWv+gHQbWargJcAt7n7eC3kbwOuaGS9IiIiy9VJe4ZvZuuB5wI/POzUGmDvnPcjtWPzHRcREZFjdFIC38zagX8F/sDdJxf43tea2b1mdu8TTzyxkLcWERFZNhoe+GYWoxr2/+TuXz7CJfuAtXPeD9aOzXf8adz9enff5u7bVqxYsXCFi4iILCONHqVvwOeAn7n7h+e57Bbgv9RG618IHHL3UeDrwIvNrMfMeoAX146JiIjIMWr0KP3nA68DfmpmP64dew+wDsDdPwXcClwFPAzMAG+snRs3sw8A99Q+9353H29wvSIiIstSQwPf3b8D2FGuceBt85y7AbihAaWJiIi0FK20JyIi0gIU+CIiIi1AgS8iItICFPgiIiItQIEvIiLSAhT4IiIiLUCBLyIi0gIU+CIiIi1AgS8iItICFPgiIiItQIEvIiLSAhT4IiIiLUCBLyIi0gIU+CIiIi1AgS8iItICFPgiIiItQIEvIiLSAhT4IiIiLUCBLyIi0gIU+CIiIi1AgS8iItICFPgiIiItINrsAkRERBaL0UyOoZEM49kivek4Wwe7WdWdanZZC0ItfBEREaphf9uO/eSKFfrbE+SKFW7bsZ/RTK7ZpS0IBb6IiAgwNJKhIxmlIxkjMKMjGaMjGWVoJNPs0haEAl9ERAQYzxZJJ57+pDudiDKeLTapooWlwBcREQF603GyhfLTjmULZXrT8SZVtLAU+CIiIsDWwW6m8mWm8iVCd6byJabyZbYOdje7tAWhwBcREQFWdae4fPMAqXiEsekCqXiEyzcPLJtR+pqWJyIiLW05T8WbSy18ERFpWct9Kt5cCnwREWlZy30q3lwKfBERaVnLfSreXAp8ERFpWct9Kt5cCnwREWlZy30q3lwKfBERaVnLfSreXJqWJyIiLW1Vd2pZBvzh1MIXERFpAQp8ERGRFqDAFxERaQEKfBERkRagwBcREWkBCnwREZEWoMAXERFpAQ2dh29mNwAvAw64+5YjnP8fwGvn1HIWsMLdx81sDzAFVICyu29rZK0iIiLLWaMX3vk88AngC0c66e5/A/wNgJn9KvCH7j4+55JL3X2swTWKiMgy1ir73R9NQ7v03f0uYPyoF1ZdA9zYwHJERKTFtNJ+90ezKJ7hm1kbcAXwr3MOO/ANM7vPzK59hs9ea2b3mtm9TzzxRKNLFRGRJaSV9rs/msWylv6vAt89rDv/Be6+z8xWAreZ2c9rPQZP4+7XA9cDbNu2zU9OuSIistgNDU/wTz98lEKpQk86wbZTe9jQ3046EWVsutDs8k66RdHCB17DYd357r6v9u8B4CvA+U2oS0RElqCh4Qmuv2s3BnQmYuRLFb7+wH52j00v2/3uj6bpgW9mXcCLgH+bcyxtZh2zr4EXAw80p0IREVlqbrpvL5P5IsVyhccm84QVaEsEfO+Rg8t2v/ujafS0vBuBS4B+MxsB3gvEANz9U7XLfh34hrtn53x0APiKmc3W+M/u/h+NrFVERJaH6qj8Q6zsiNORTBINioxlC3SmorhXlu1+90fT0MB392vquObzVKfvzT22C9jamKpERGQ5GxrJ0J+OU64YyZjR1ZYgFY+SL1XYvLqzJcMeFkGXvoiIyEIazxa5YGMv2UKJ6XyZMAwplEImciWu2rKq2eU1jQJfRESWld50nIHOFC9+9gDJWMD4TJFIAC/dMsDWdT3NLq9pFsu0PBERkQWxdbCb23bsZ0VHklc8b5BsocxUvszlmweaXVpTqYUvIiLLyqruFJdvHiAVjzA2XSAVj7TsQL251MIXEZElb2h4glsfGGX/ZJ6BziRXbVnFFS38vP5IFPgiIrKkDQ1P8L5bHmD/VIFi2YlHjXt2HeS9v7alpZ/ZH05d+iIisqRdd8dOdh2cAYz2ZAQwdh2c4bo7dja7tEVFLXwREVnS7t49TqkCmUqJaGCk4wHpWIShfZPNLm1RUQtfRESWrKHhCbLFEMeJBuDuHMqVKZTKhJVmV7e4KPBFRGTJuvWBUXpSMUI3QowgMDCYKlbY2N/W7PIWFQW+iIgsWfsn82w7tYdkLCAMQ0qVEA+daCTCtS/c2OzyFhU9wxcRkSVnNJPjzgcP8ODjUxQrIYPdSSbzFfLlCrEg4OzVnVz2bE3Lm0uBLyIiS8poJseX7h9h9xNZTluRZvtjU8wUQwY6Ewz2tFEsO9e+cEOzy1x0FPgiIrKkDI1kGJ8u0JOOkYqlSMYi7Hx8mrFskZ62OG+79HTNvz8CBb6IiCwp49kipUpIVzwOwCldbQx0pjiUK7F5dafCfh4atCciIktKbzpOLBKQLz817y5fColFjN50vImVLW4KfBERWVK2DnbT255gIltiplhiplAmkyvS2xZn62B3s8tbtBT4IiKypKzqTvHKcwc5b30P+VJIoVJh26ndvHLb2pbfEe+Z6Bm+iIgsCaOZXHXAXrZIbzrOJWeu5JoLTm12WUuGAl9ERBa927ePcsP39pArhaxoj3PWqk4OTBa0z/0xUJe+iIgsakPDE3zk9ocZmy6QL5bZOz7D7T8/wHi2wNBIptnlLRkKfBERWdT+3+/uYv+hHNOFMvlyiGPMFCvcPzzOeLbY7PKWDHXpi4jIojU0PMF3Hj5IqRJiZgQ45UqZtnjA3omcpuEdA7XwRURkURrN5Ljxnr1EIgGpeISyQ7ZUJnRnMlcmwDQN7xgo8EVEZFEaGslQCZ0NfSlCdxJRIwJMF0oUKiEv0YC9Y6IufRERWZTGs0X60nEC2imUnPFsASIBQSTg7DWdvO5ibZBzLBT4IiKy6Nx09x4+fecjjM8UiUUinNbf
xlmruxifKRALAv7bpZvUuj9GCnwREVlUbrp7Dx/+5iPEo0ZXMkauVGH76BSbHU4/pZNrzlurDXKOgwJfREQWjdFMjk/c8QjlsEIqiJOKBcRjUYJckSemC3z8MrXsj5cG7YmIyKIwmslx2479HMqXaY9HcIeZYkh7IsJgT4qKu8L+BCjwRURkUbjzwf3sHpsmCGB8pkQYVogEAdOFClPFCn3pRLNLXNIU+CIi0nSjmRzf3zWOYZyzpotS2ZnIFSmVS0zli2TzFX77grXNLnNJU+CLiEjT3fngAabyZR7cP0UyFuOcdV3EIhHGZ8q0xSK881dO41Xnr292mUuaBu2JiEhTjWZyfPNn+ymWKxyYLDAxU6Q9EeGys1aSK4a856qz9Ox+AaiFLyIiTXXng/vJFspEgwhre9qIBcYTU0X2Hpzh4tP6FPYLRIEvIiJN9cBjk6zra8NxIpGANT1trO1poxw6l5y5stnlLRsKfBERaSp3aE9E2difJhYYM6Uy0Qis6Umpdb+AFPgiItJUZ6/pIjNTJhIxNqxIc9qKdrrbElywoa/ZpS0rCnwREWmqS85cyYYVacLQOZQrEobOhhVpdecvMI3SFxGRplrVneKV5w4yNJJhPFukNx1n62C3uvMXmAJfRESablW3ntc3mrr0RUREWoACX0REpAU0NPDN7AYzO2BmD8xz/hIzO2RmP679/Nmcc1eY2YNm9rCZ/c9G1ikiIrLcNbqF/3ngiqNc8213P6f2834AM4sA1wFXApuBa8xsc0MrFRERWcYaGvjufhcwfhwfPR942N13uXsR+CJw9YIWJyIi0kIWwzP8i8xsyMy+ZmbPrh1bA+ydc81I7ZiIiIgch2ZPy7sfONXdp83sKuBmYNOx3MDMrgWuBVi3bt3CVygiIidkNJPTHPtFoKmB7+6Tc17famafNLN+YB+wds6lg7VjR7rH9cD1ANu2bfMGlisiIsdoaHiCG+/ZSyV0+tJx8qXqFriXbx5Q6J9kTe3SN7NTzMxqr8+v1XMQuAfYZGYbzCwOvAa4pXmViojIsRrN5LjxnmEiAazsSFCqOA8fyFIOQ4ZGMs0ur+U0tIVvZjcClwD9ZjYCvBeIAbj7p4BXAr9rZmUgB7zG3R0om9nbga8DEeAGd9/eyFpFRGRhDY1kai37BGZGKh4B4MBkgWQs0uTqWs9RA9/MvgLM21Xu7r/xDOeueaZ7u/sngE/Mc+5W4Naj1SciIovPaCbHXTuf4PFDecazRdb2tNGejJGMBRyYKnDOuu5ml9hy6mnhzwayAf8AvLVx5YiIyFI3mslx2479JKIBq7pT7B2f4cH9U2waaKcSOpHA2DqowD/Zjhr47n777Gszm577XkRE5HBDIxk6klHOWtXJj/ce4tTeNAemcjx8YJp1vW1cc946DdhrgmN9hq9R8CIi8ozGs0X62xMEZpyztos9YzNEI4bjvP3STQr7JqnnGX7nnLcRM+ug2r0PPH1qnYiItLbRTI5HD2b50XCGFe0J1ve3ce6pPUzlS6TiEYV9E9XTwt9OtWU/G/I75rx3QKvdiIhIbc79MNP5MplciVKlwsRMkU0D7USDgAs39jW7xJZWT+Bf5O4jDa9ERESWrOqc+71EAuPUvjTt2SL7MjN0JkP2T+Z5w8Ub1LpvsnoC/xbg3EYXIiIiS9fhc+772xOkExHiEWNtb1phvwjUs9KeHf0SERFpZePZYm3p3PDJY8lohLHp6vr50nz1tPDXmNmH5zvp7u9cwHpERGQJ6q2tk//wgSwAyVhAJlfUnPtFpJ7Az1EduCciIvKkoeEJbn1glP2TedoTUVLRCKevTHNgssCBqQKRwDTnfhGpJ/APuvvnGl6JiIgsGUPDE1x/126601FWd6U4lC+x6+AMve0x1vW1cc66bm2Du8jUE/iVhlchIiJLyk337WUyXyRbKjERK7KyI8UpXQkyuTK/e+kZzS5PjqCewL/azFbPd9LdH1vAekREZJEbzeQYGjnEyo44iWiUUiVk19g0p/a1sX8y3+zyZB71BP7tPH3hHWrve4EVVLevFRGRFjE0kqE/HadcMZIxIx6txsCesSybV3ce5dPSLEedlufuZ7n75tq/ZwEvpvpHwDTwrkYXKCIii8t4tsgFG3vJFkpM58uEYUihFDKRK3HVllXNLk/mUc88fADMbKOZfRb4JtVR+5vd/SMNq0xERBal3nScgc4UL372AMlYwPhMkUgAL90ywNZ1Pc0uT+ZRz+Y5ZwHvobra3t8Ab3X3cqMLExGRxWnrYDe37djPio4kr3jeINlCmal8mcs3DzS7NHkG9TzDfwDYS3WJ3ecAHzJ76nG+Ft4REWktq7pTXL55gKGRDGPTBXrTcS7c2KcpeItcPYF/bcOrEBGRJWVVd0oBv8QcNfDnLrpjZsnaMc27EBERWULqGrRnZm8xs13A48DjZvaImanlLyIiskTUM2jv3cAlwBXuvrN27Azgo2bW5+5/2dgSRUTkZJu7Tv5AZ5KrtqzSCPwlrp4W/huAl8+GPUDt9SuANzaoLhERaZLZdfKnC2VWd6WYLpS5/q7dDA1PNLs0OQH1BL67e+4IB2eA8AjXi4jIEnbrA6N0p6P0tCUIgoCetgTd6Si3PjDa7NLkBNQT+KNmdsnhB83sRVSf6YuIyDKyfzJPVzL2tGNdyZjWyV/i6pmW9/vAzWZ2B3Bf7dg2qs/1X96gukREpEkGOpMcypfoaUs8eexQvsRAZ7KJVcmJqmct/Z8CW4C7gWfVfu4Gzq6dExGRZeSqLavIZMtMzBQIw5CJmQKZbFnr5C9x9bTwqT3Dv77BtYiIyCKwdV0P176w+iz/sUM5BjqT/Obz1mqU/hJXz7S8Carb4f7CKaoD+noXvCoREWmqret6FPDLTD0t/O8CK4EvAV8E9jW0IhEREVlw9TzDfxnwEmACuAG4DXgz0OHulcaWJyIiIguh3mf4E8BnzOyzwGuAjwMdwN81sDYREWmg0UyOoZEM49kivek4Wwe7tSHOMlbvWvrnm9lHgB8BLwJeBXy4kYWJiEjjjGZy3LZjP7lihf72BLlihdt27Gc08wvrrMkyUc+gvUeAKarP798MlGqnzjYz3P0nDaxPREQaYGgkQ0cySkdtgZ3Zf4dGMmrlL1P1dOmPUh2l/1LgKqqj82c58MIG1CUiIg00ni3S35542rF0IsrYdKFJFUmjHTXw3f0F9dzIzH7Z3b914iWJiEij9abjZAvlJ1v2ANlCmd50vIlVSSPVNWivTn8LnLuA9xMRkQVy+AC9VZ1JhkYOAdWWfbZQZipf5sKNfU2uVBqlrkF7dbKjXyIiIifbkQboDY0cYutgF6l4hLHpAql4hMs3D+j5/TK2kC38I63GJyIiTTbfAL3RyTxXaH38lrGQLXwREVmExrNF0omnt+/SiSjj2WKTKpJmOGrgm9l5dd5r7wnWIiIiDTA7QG8uDdBrPfW08D9dz43c/eoTrEVERBpg62A3U/kyU/kSoTtT+RJT+TJbB7ubXZqcROrSFxFZ5lZ1p7h884AG6LW4egbtbTSzL8930t1
/YwHrERGRBljVnVLAt7h6Av8J4LrjubmZ3QC8DDjg7luOcP61wB9TndI3Bfyuuw/Vzu2pHasAZXffdjw1iIiISH2BP+3utx/n/T8PfAL4wjzndwMvcvcJM7sSuB64YM75S9197Di/W0RERGrqCfzh4725u99lZuuf4fz35rz9ATB4vN8lIiIi86tn0N63zOwXVtEzsx4zq2sEf53eDHxtznsHvmFm95nZtQv4PSIiIi2nnsDfCtxnZk92tdcC+EfAQwtRhJldSjXw/3jO4Re4+7nAlcDbzOyIu/KZ2bVmdq+Z3fvEE08sRDkiIiLLTj275b3JzH4JuN7M7gfOotrN/wJ3HznRAszsOcBngSvd/eCc791X+/eAmX0FOB+46wj1XU/12T/btm3T8r4i0lIO3xRn62C3RuPLEdW7lv6PgO8Cv1p7/84FCvt1wJeB17n7zjnH00Dg7lO11y8G3n+i3ycispwMDU9w4z17qYROXzpOvlThwGRBc+zliOpZWvcaYAjYB5wGvAL4qJndYGb9R/nsjcD3gTPNbMTM3mxmbzWzt9Yu+TOgD/ikmf3YzO6tHR8AvmNmQ8DdwFfd/T+O5xcUEVmORjM5brxnmEgAKzsSlCrOwweylMOQoZFMs8uTRaieFv5vA7/i7rtr7++uPc9/G9Uw3jjfB939mme6sbv/DvA7Rzi+i+rYAREROcxoJsfnv7ebnfunWdmRIBoET+6Ad2CyQDIWaXKFshgdtYXv7i+dE/azx0J3/zhwUcMqExGRXzC7t/14tsjK9ji5YoXdY1mm8iWSsYCDtWf5Ioerp0v/7+a8fvthpz+44BWJiMi8Zve2X9GRoDudwKkuVbp/Mk8mVyQSmDbFkSOqZ1repXNev+mwc89dwFpEROQoZve2X9/XTmDG6u4UqVjAgakCldC55ry1GrAnR1TPM3yb57WIiJxks3vb96bjnLO2mz0Hp6mEIaf2p3nDxRsU9jKvegI/MLMOqr0Bs69ng18jQ0REGmzuXHvDGcuWGOyG7rYYmyIdnNKZ0lQ8Oap6Ar8P2M5TIb9jzjktdCMi0kCzg/Q6klH62xNkC2UIi+RLFXKlCr3pOBdu7FPYy1HVs9KeNrQREWmS2UF6s9PuOpIxBnvbSMUjXLFlVZOrk6XkqIFvZquf6by7P7Zw5YiIyFzj2SL97YmnHUsnooxNF5pUkSxV9XTp3w5PzvyY5UAvsAI9xxcRaZjZQXqzLXzgyUF7IseinoV3znL3zbV/z6K6rv3twDTwrkYXKCLSyrYOdjOVLzOVLxG6M5UvMZUva669HLN65uEDYGYbzeyzwDepDuLb7O4faVhlIiLCqu7qCPxUPMLYdIFUPKIR+XJc6nmGfxbwHuBc4G+At7p7udGFiYi0kmfa5nZVd0oBLyesnmf4DwB7gVuA5wAfMnvqcb67v7MxpYmItIbqNrfDVEKnvz1OoRRyYHK/WvKyoOoJ/GsbXoWISIuqbnO7l0hg9KUT5MsVHjowzaaV7QyNZBT4smDqmYdZ9oQtAAAgAElEQVT/uXpuZGZ/7+5/cOIliYi0jqGRDJXQ6UsnMDNSsShQ5sBUjkSs7mFWIke1kP/X9MIFvJeISEsYzxbpS8fJl8InjyWjEcamtc2tLCz9+Sgi0kS96TgrOxPkShVyxQrurm1upSEU+CIiTbR1sJtoEHD6yjSxiNW2uYVrzlun5/eyoOoZtFcvbZ0rInIUR5p+d/nmAYZGMiRjEc5Z1/20KXkiC6WeefgfcPc/reNen1iAekRElq0j7Xx3247q9DtthCONVk+X/kvruVG9o/lFRFrRaCbH57+3m5/uO8RD+6fJzBTpSMboSEYZGsk0uzxpAfV06UfMrIN5uuzdfXJhSxIRWV5mW/bj2SIr2hMUyiE/3nuIc9Z20d0W1853clLUE/jPorp2/uG75Vnt33UNqEtEZNmY3dN+RUc17FPx6v/07hmbYdNAoOl3clLUE/g73P25Da9ERGSZmt3Tfn1fOz/emwHKJKIBT0wXOKUryYUb+5pdorQATcsTEWmw2T3te9Nxzlnb/WTY96ZjWi9fTpp6Wvjzjr43s79193ctYD0iIkve4VPvVnUmGRo5BEB3W4xNkQ5O6Uwp7OWkOmoL/yij739zAWsREVnyZgfo5YoV+tsT5IoVhkYOsXWwS3vaS1Od6MI7WmxHRKRmdurdeLbIio7qM/vZAXmjk3nNtZemqmfhnd75TqHAFxEBZve038vO/VOsaE8QGGRmypyztpvutpim3knT1dPCv4+npuEdrriw5YiILD1DwxN86Os/pxSGeOhkZorkShVWdSXYc3CaTZEOTb2Tpjtq4Lv7hpNRiIjIUnT79lHe9+872D+VJ2IBbYkIsaixridNZqZM6HBKZ0pT76Tp6unSP/ewQw6MufvexpQkIrI0DA1P8L9u/RmPT+YBKBMynXfiMWN8Kk8pdNb3t2mAniwK9XTp/90RjvWaWRy4xt1/vMA1iYgsCTfdt5ex6QKxaEBgRqnihB5SrsBUKeR5Ax284eINCntZFOrp0r/0SMfNbBvwMeCFC12UiMhS8OD+KTAjGYFixYhFjHLFKZSdciXkmvPWKuxl0TjuaXnufq+ZtS9kMSIii93cRXWmc2XikYCyQyIK5dAph0Ys4rzg9H62rutpdrkiTzrupXXNbIDq83wRkZZw+KI6G/vbAPDQca/+BAarupK88fka7yyLSz2D9j7OLwZ7L3Ax8I5GFCUistiMZnL89dd2sGN0CgdWdSZ5zmAn2WKFRw9mmSk7EXfWdST5g8tOV+teFp16uvTvPey9AweBd7r7gYUvSURkcRnN5Pj8d3dzz6MZupJRopGAxw7lmcgW+aUz+lnT28azV3fRm46zdbBbz+1lUapn0N7/d6TjZrbWzP6Hu//NwpclIrI4jGZyfOJbO7njwSeYKVYwoKctTjoRpVR2fjY6xavPX6dlc2XRO6ZBe2a2AngVcA2wGvhKI4oSEVkMbt8+yif/cxePPJGlUCqTjkeYypcoh05fOkbozli2yNbB7maXKnJU9TzD7wB+A/gt4Azgy8AGdx9scG0iIk1z+/ZR/uJrDzJTKBELoGBGthTSkYhQqTgTMyW6kjG2DnapC1+WhHpa+AeAu4E/Ab7j7m5mv97YskREmmdoeIIPfeNBJnNFDIgGRtSMQqXCTBE6U1FikYBVXSle9by1zS5XpC71TMt7N5AAPgm828xOa2xJIiLNMzQ8wQe+uoORiTyFYplcsUKuXKE9GaUtHqEcQrninNqb4h2/skmj8WXJOGrgu/vfu/uFwNW1QzcDq83sj83sjGf6rJndYGYHzOyBec6bmX3MzB42s5/MXbffzF5vZg/Vfl5/DL+TiMhxGRqe4C++uoNdT2QJzDEzKh5SKIUUyhXikYC1vSled/EG/vqV5yjsZUmpe+Edd9/l7h9097OBbUAncOtRPvZ54IpnOH8lsKn2cy3wDwBm1gu8F7gAOB94r5np/7NEpGFGMzluvGeYiVyJjkSEVDRKySEWRDB3soUyFTde/KyVvPLcQT23ly
 "text/plain": [
 "<Figure: Logloss residuals for female false positives (base64 PNG data omitted)>"
 ]
 },
 "metadata": {
 "needs_background": "light"
 },
 "output_type": "display_data"
 }
 ],
 "source": [
 "# initialize figure\n",
 "fig, ax_ = plt.subplots(figsize=(8, 8)) \n",
 "\n",
 "# plot logloss residuals for female false positives against model predictions\n",
 "ax_.plot(valid_yhat_female_fp[yhat_name],\n",
 " valid_yhat_female_fp['r_DEFAULT_NEXT_MONTH'],\n",
 " marker='o', linestyle='', alpha=0.3)\n",
 "\n",
 "# annotate plot\n",
 "_ = plt.xlabel(yhat_name)\n",
 "_ = plt.ylabel('r_DEFAULT_NEXT_MONTH')\n",
 "_ = ax_.legend(loc=4)\n",
 "_ = plt.title('Logloss Residuals')"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "#### Examine low logloss residual individuals"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "Individuals with low residuals near the probability cutoff are among the most likely to have been treated differently from individuals who received the credit product. Those individuals are displayed below."
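 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "A minimal sketch of how such a slice could be constructed, assuming a Pandas DataFrame `valid_yhat` containing the model predictions (`yhat_name`), logloss residuals (`r_DEFAULT_NEXT_MONTH`), decisions (`d_DEFAULT_NEXT_MONTH`), known outcomes (`DEFAULT_NEXT_MONTH`), and the `SEX` column; the variable names mirror those used elsewhere in this notebook but the exact filtering logic shown here is illustrative, not the author's original code:\n",
 "\n",
 "```python\n",
 "# hypothetical sketch: select female false positives and sort by residual\n",
 "# (low residuals near the cutoff are the most likely candidates for\n",
 "#  individual-level disparate treatment)\n",
 "valid_yhat_female_fp = valid_yhat[\n",
 "    (valid_yhat['SEX'] == 'female') &\n",
 "    (valid_yhat['d_DEFAULT_NEXT_MONTH'] == 1) &   # predicted default\n",
 "    (valid_yhat['DEFAULT_NEXT_MONTH'] == 0)       # did not actually default\n",
 "]\n",
 "\n",
 "# individuals closest to the cutoff, i.e. lowest logloss residuals\n",
 "low_residual_fp = valid_yhat_female_fp.sort_values(\n",
 "    by='r_DEFAULT_NEXT_MONTH', ascending=True)\n",
 "```"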
 "text/plain": [
 "<DataFrame: 150 low-residual female false positives, sorted ascending by r_DEFAULT_NEXT_MONTH; columns include ID, LIMIT_BAL, SEX, EDUCATION, MARRIAGE, AGE, PAY_0-PAY_6, BILL_AMT1-BILL_AMT6, PAY_AMT1-PAY_AMT6, DEFAULT_NEXT_MONTH, p_DEFAULT_NEXT_MONTH, d_DEFAULT_NEXT_MONTH, r_DEFAULT_NEXT_MONTH (HTML table output omitted)>"
 ]
 },
 "execution_count": 24,
 "metadata": {},
 "output_type": "execute_result"
 }
 ],
 "source": [
 "valid_yhat_female_fp.sort_values(by='r_DEFAULT_NEXT_MONTH', ascending=True).head(n=150)"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "Examining the low-residual false positives, it can be seen that the cutoff selected by Youden's J is a bit too conservative. Many women just above the cutoff have missed only zero to two payments, and were only one to two months late on the few payments they did miss, if any. This potential discrimination problem can be remediated by increasing the cutoff in cell 9."
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "#### Simple adversarial de-biasing approach\n",
 "Create a dataset containing the protected variable and the predictions of the model."
\n", + "7802 0 0 0.256474 \n", + "7904 0 0 0.256474 \n", + "7767 12000 0 0.256474 \n", + "1900 8000 0 0.256996 \n", + "1506 2000 0 0.257666 \n", + "4801 7505 0 0.257957 \n", + "1789 0 0 0.258259 \n", + "7500 0 0 0.259063 \n", + "4540 1300 0 0.259240 \n", + "124 0 0 0.259361 \n", + "4561 177 0 0.259361 \n", + "3494 1435 0 0.260427 \n", + "5621 5000 0 0.260522 \n", + "2752 0 0 0.260620 \n", + "2118 0 0 0.261220 \n", + "4269 1000 0 0.261636 \n", + "434 700 0 0.262570 \n", + "7922 0 0 0.263146 \n", + "13 1622 0 0.263551 \n", + "263 2000 0 0.263843 \n", + "6989 30451 0 0.264505 \n", + "3508 0 0 0.264719 \n", + "1510 789 0 0.264719 \n", + "5247 0 0 0.264719 \n", + "4457 0 0 0.265787 \n", + "5708 6013 0 0.265848 \n", + "1306 0 0 0.266314 \n", + "1919 0 0 0.266314 \n", + "4315 0 0 0.267487 \n", + "6739 0 0 0.267769 \n", + "\n", + " d_DEFAULT_NEXT_MONTH r_DEFAULT_NEXT_MONTH \n", + "5735 1 0.248509 \n", + "7870 1 0.248509 \n", + "3093 1 0.248841 \n", + "2821 1 0.248881 \n", + "151 1 0.249220 \n", + "6536 1 0.249528 \n", + "8244 1 0.249628 \n", + "625 1 0.249665 \n", + "175 1 0.249705 \n", + "7768 1 0.249778 \n", + "5682 1 0.249778 \n", + "4293 1 0.249778 \n", + "8663 1 0.249778 \n", + "4563 1 0.249778 \n", + "5792 1 0.249778 \n", + "5874 1 0.249778 \n", + "2716 1 0.249790 \n", + "3194 1 0.250791 \n", + "1460 1 0.250791 \n", + "4075 1 0.251678 \n", + "3036 1 0.253309 \n", + "8678 1 0.253309 \n", + "1805 1 0.253391 \n", + "6543 1 0.253445 \n", + "7790 1 0.253560 \n", + "4080 1 0.253615 \n", + "3842 1 0.253615 \n", + "697 1 0.254171 \n", + "3316 1 0.255223 \n", + "8469 1 0.255263 \n", + "... ... ... \n", + "7802 1 0.296352 \n", + "7904 1 0.296352 \n", + "7767 1 0.296352 \n", + "1900 1 0.297054 \n", + "1506 1 0.297956 \n", + "4801 1 0.298348 \n", + "1789 1 0.298755 \n", + "7500 1 0.299840 \n", + "4540 1 0.300079 \n", + "124 1 0.300241 \n", + "4561 1 0.300241 \n", + "3494 1 0.301682 \n", + "5621 1 0.301810 \n", + "2752 1 0.301943 \n", + "2118 1 0.302756 \n", + "4269 1 0.303319 \n", + "434 1 0.304585 \n", + "7922 1 0.305366 \n", + "13 1 0.305915 \n", + "263 1 0.306312 \n", + "6989 1 0.307211 \n", + "3508 1 0.307503 \n", + "1510 1 0.307503 \n", + "5247 1 0.307503 \n", + "4457 1 0.308956 \n", + "5708 1 0.309039 \n", + "1306 1 0.309674 \n", + "1919 1 0.309674 \n", + "4315 1 0.311274 \n", + "6739 1 0.311659 \n", + "\n", + "[150 rows x 28 columns]" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "valid_yhat_female_fp.sort_values(by='r_DEFAULT_NEXT_MONTH', ascending=True).head(n=150)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examining the low-residual false positives, it can be seen that the cutoff selected by Youden's J is a bit too conservative. Many women just above the cutoff have missed 0-2 payments, and only been late 1-2 months on the few payments they missed, if any. This potential discrimination problem can be remediated by increasing the cutoff in cell 9." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Simple adversarial de-biasing approach\n", + "Create a dataset with the protected variable and the predictions of the model." 
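 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "A minimal sketch of the adversarial setup, assuming the `h2o` Python package and the `valid_yhat` Pandas DataFrame described above; `model.gbm_grid` is this repository's own grid-search helper (its exact signature is taken from the cell below, not from the H2O API), and the 70/30 split fraction is just the value used in this notebook:\n",
 "\n",
 "```python\n",
 "import h2o\n",
 "\n",
 "# build an H2OFrame holding only the protected attribute and the\n",
 "# constrained model's predictions -- the adversary sees nothing else\n",
 "adv_frame = h2o.H2OFrame(valid_yhat[['ID', 'SEX', yhat_name]])\n",
 "\n",
 "# split into adversarial training and validation partitions\n",
 "adv_train, adv_valid = adv_frame.split_frame([0.7])\n",
 "\n",
 "# the adversary then tries to predict SEX from the predictions alone;\n",
 "# an AUC near 0.5 suggests the predictions encode little information\n",
 "# about the protected attribute\n",
 "adv_gbm = model.gbm_grid(yhat_name, 'SEX', adv_train, adv_valid, SEED)\n",
 "```"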
 ]
 },
 {
 "cell_type": "code",
 "execution_count": 25,
 "metadata": {},
 "outputs": [
 {
 "name": "stdout",
 "output_type": "stream",
 "text": [
 "Parse progress: |█████████████████████████████████████████████████████████| 100%\n"
 ]
 }
 ],
 "source": [
 "# combine the protected attribute with the model predictions,\n",
 "# then split into adversarial training and validation frames\n",
 "adv_valid = h2o.H2OFrame(valid_yhat[['ID', 'SEX', yhat_name]])\n",
 "adv_train, adv_valid = adv_valid.split_frame([0.7])\n",
 "adv_train"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "#### Train adversarial GBM\n",
 "This adversarial GBM tries to predict whether a customer is a man or a woman using only the predictions of `best_mgbm`. The predictions of this adversarial model provide a row-by-row measure of whether a prediction encodes information about `SEX`."
 ]
 },
 {
 "cell_type": "code",
 "execution_count": 26,
 "metadata": {},
 "outputs": [
 {
 "name": "stdout",
 "output_type": "stream",
 "text": [
 "gbm Grid Build progress: |████████████████████████████████████████████████| 100%\n",
 "Adversarial GBM AUC: 0.51\n"
 ]
 }
 ],
 "source": [
 "# fit the adversarial GBM and report its validation AUC;\n",
 "# an AUC near 0.5 indicates the adversary cannot recover SEX\n",
 "adv_gbm = model.gbm_grid(yhat_name, 'SEX', adv_train, adv_valid, SEED)\n",
 "print('Adversarial GBM AUC: %.2f' % adv_gbm.auc(valid=True))"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "Because the adversarial GBM cannot predict the `SEX` of a customer from the predictions of `best_mgbm`, this is a good sign that `best_mgbm` does not discriminate strongly against women."
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "#### Examine a few predictions from the adversarial GBM"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": 33,
 "metadata": {},
 "outputs": [
 {
 "name": "stdout",
 "output_type": "stream",
 "text": [
 "Parse progress: |█████████████████████████████████████████████████████████| 100%\n",
 "gbm prediction progress: |████████████████████████████████████████████████| 100%\n"
 ]
 },
 {
 "data": {
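 "text/plain": [
 "<Note: a row-by-row check could score the adversary on held-out rows and inspect those with the highest predicted probability of SEX; the snippet below is a hypothetical sketch of that step, assuming adv_gbm.predict exists on the fitted H2O model as used in this notebook>"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "```python\n",
 "# hypothetical sketch: score the adversarial GBM on the validation frame\n",
 "# and look at rows where it is most confident about SEX -- these are the\n",
 "# individual predictions most likely to be encoding protected information\n",
 "adv_preds = adv_gbm.predict(adv_valid).as_data_frame()\n",
 "adv_scored = adv_valid.as_data_frame().join(adv_preds)\n",
 "adv_scored.sort_values(by='female', ascending=False).head(10)\n",
 "```"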
<div>\n",
+       "[HTML rendering omitted: 1378 rows × 27 columns; the same validation frame appears in the text/plain output below]\n",
+       "</div>
" + ], + "text/plain": [ + " ID LIMIT_BAL SEX EDUCATION MARRIAGE AGE PAY_0 PAY_2 \\\n", + "7870 26436 160000 female university single 44 1 -2 \n", + "5735 19327 230000 female university married 40 1 -2 \n", + "3093 10387 20000 female graduate school single 29 -1 -1 \n", + "2821 9466 50000 female university married 34 0 0 \n", + "151 501 30000 female university married 38 0 0 \n", + "8558 28732 240000 female university married 35 -1 2 \n", + "4571 15332 180000 female graduate school married 45 -1 2 \n", + "6536 21992 150000 female graduate school single 29 -1 2 \n", + "8244 27652 50000 female graduate school single 23 1 -1 \n", + "3738 12517 50000 female high school single 50 0 0 \n", + "625 2058 50000 female high school married 51 0 0 \n", + "175 586 20000 female graduate school single 25 0 0 \n", + "4563 15316 300000 female graduate school single 35 1 -2 \n", + "5874 19793 320000 female graduate school married 45 1 -2 \n", + "5682 19140 260000 female university married 37 1 1 \n", + "4293 14369 450000 female graduate school single 54 1 -2 \n", + "5792 19497 150000 female graduate school married 49 1 -2 \n", + "8663 29086 220000 female graduate school married 56 1 -2 \n", + "465 1479 210000 female graduate school single 30 1 -2 \n", + "7768 26079 210000 female university single 36 1 -2 \n", + "2716 9116 30000 female university single 22 0 0 \n", + "1460 4905 30000 female university single 22 0 0 \n", + "3194 10715 30000 female university married 38 0 0 \n", + "5224 17599 30000 female university married 30 0 0 \n", + "4075 13647 240000 female other single 36 -2 -1 \n", + "6030 20307 30000 female university married 56 0 0 \n", + "8678 29125 20000 female high school single 56 0 0 \n", + "2893 9668 20000 female university single 48 0 0 \n", + "3036 10148 20000 female high school single 35 0 0 \n", + "4869 16387 160000 female graduate school single 29 -1 -1 \n", + "... ... ... ... ... ... ... ... ... 
\n", + "2414 8115 120000 female university single 26 3 3 \n", + "5594 18868 180000 female university single 28 8 7 \n", + "8269 27731 30000 female university single 25 3 2 \n", + "2314 7744 20000 female university married 42 3 2 \n", + "5605 18911 110000 female university single 29 3 2 \n", + "6755 22725 100000 female university married 38 3 2 \n", + "5740 19345 20000 female high school married 48 2 2 \n", + "4720 15854 60000 female university single 27 3 2 \n", + "2866 9594 20000 female graduate school single 26 2 2 \n", + "5221 17586 20000 female high school married 39 2 3 \n", + "8674 29116 20000 female university married 59 3 2 \n", + "6727 22621 20000 female university married 43 4 4 \n", + "6438 21668 20000 female university single 24 3 2 \n", + "1223 4014 20000 female university single 22 4 3 \n", + "3094 10390 20000 female university married 25 2 2 \n", + "7570 25434 100000 female high school single 28 3 2 \n", + "203 650 20000 female university single 46 8 7 \n", + "5120 17233 10000 female university divorced 46 3 2 \n", + "4875 16401 10000 female high school single 44 2 2 \n", + "5731 19316 110000 female graduate school married 41 3 2 \n", + "4746 15948 30000 female university married 30 3 2 \n", + "8322 27916 50000 female graduate school single 27 3 2 \n", + "1760 5916 110000 female graduate school married 41 2 2 \n", + "359 1147 70000 female graduate school single 31 2 2 \n", + "929 3087 30000 female university single 24 2 2 \n", + "4454 14892 50000 female university married 29 2 2 \n", + "3653 12192 10000 female high school married 55 2 2 \n", + "8362 28055 20000 female university married 32 3 2 \n", + "5270 17757 10000 female high school married 51 3 2 \n", + "1865 6272 10000 female graduate school single 23 2 2 \n", + "\n", + " PAY_3 PAY_4 PAY_5 PAY_6 BILL_AMT1 BILL_AMT2 BILL_AMT3 BILL_AMT4 \\\n", + "7870 -1 -1 -1 -1 -4 -4 3454 0 \n", + "5735 -1 -1 -1 -1 0 0 3504 4272 \n", + "3093 -1 -1 -1 -1 342 792 677 -1 \n", + "2821 2 0 0 0 11367 10982 10243 10826 \n", + "151 0 0 2 2 20344 21705 22537 24161 \n", + "8558 -1 -1 -1 -1 528 264 264 264 \n", + "4571 -1 -1 -1 -1 1560 316 316 316 \n", + "6536 -1 -1 0 0 2599 1530 390 1366 \n", + "8244 -1 -2 -2 -2 -697 11361 0 0 \n", + "3738 2 0 0 0 47405 50138 25768 26142 \n", + "625 2 0 0 0 44766 48047 46640 40551 \n", + "175 0 0 0 2 14603 15661 16394 16723 \n", + "4563 -2 -2 -1 -1 0 0 0 0 \n", + "5874 -2 -1 -1 -1 0 0 0 370 \n", + "5682 -2 -2 -1 0 5917 -15910 -15910 -15910 \n", + "4293 -2 -1 -1 -1 -237 -2400 -2400 3990 \n", + "5792 -2 -1 -1 -1 0 0 0 12200 \n", + "8663 -2 -2 -1 -1 0 0 0 0 \n", + "465 -2 -2 -1 -1 0 0 0 0 \n", + "7768 -2 -2 -1 0 0 0 212 3066 \n", + "2716 -1 -1 2 2 17358 -36 23114 24727 \n", + "1460 0 0 2 0 25536 26635 27383 29061 \n", + "3194 0 0 2 0 7979 9046 10279 10610 \n", + "5224 0 0 2 0 24061 25156 25949 28478 \n", + "4075 2 0 0 -2 -235 1765 871 871 \n", + "6030 0 2 0 0 10232 11564 14099 13580 \n", + "8678 0 2 0 0 11471 12188 15074 14426 \n", + "2893 0 2 0 0 14218 15247 16986 16418 \n", + "3036 0 2 0 0 10704 11352 13297 12418 \n", + "4869 2 -1 0 -1 1116 2599 1302 1852 \n", + "... ... ... ... ... ... ... ... ... 
\n", + "2414 2 2 3 2 12034 12548 12056 13958 \n", + "5594 6 5 4 3 197231 194309 189981 185559 \n", + "8269 2 2 2 2 9095 10297 9988 11708 \n", + "2314 2 2 2 2 18464 19465 19649 19978 \n", + "5605 2 4 4 3 600 600 600 600 \n", + "6755 2 3 3 3 750 750 750 750 \n", + "5740 2 2 2 2 13836 14805 15865 16062 \n", + "4720 2 2 4 3 56670 57252 55764 64522 \n", + "2866 2 2 2 2 300 300 300 300 \n", + "5221 2 2 3 2 15307 14769 15232 16677 \n", + "8674 3 2 2 4 8803 11137 10672 11201 \n", + "6727 3 2 2 2 28447 27721 27009 27170 \n", + "6438 2 2 2 2 322 322 322 322 \n", + "1223 2 2 2 2 19529 18937 18335 19530 \n", + "3094 4 4 4 4 1650 1650 1650 1650 \n", + "7570 2 5 5 4 1250 1250 1250 1250 \n", + "203 6 5 4 3 21075 20795 20206 19617 \n", + "5120 2 2 2 4 5997 5753 9629 9328 \n", + "4875 2 2 2 2 10422 9775 10964 11153 \n", + "5731 2 7 7 7 150 150 150 150 \n", + "4746 2 7 7 7 2400 2400 2400 2400 \n", + "8322 2 7 7 7 300 300 300 300 \n", + "1760 7 7 7 7 150 150 150 150 \n", + "359 7 7 7 7 2400 2400 2400 2400 \n", + "929 7 7 7 7 300 300 300 300 \n", + "4454 7 7 7 6 2550 2550 2550 2550 \n", + "3653 4 4 4 4 420 420 420 420 \n", + "8362 2 7 7 7 2400 2400 2400 2400 \n", + "5270 2 7 7 7 2400 2400 2400 2400 \n", + "1865 7 7 7 6 2400 2400 2400 2400 \n", + "\n", + " BILL_AMT5 BILL_AMT6 PAY_AMT1 PAY_AMT2 PAY_AMT3 PAY_AMT4 PAY_AMT5 \\\n", + "7870 3312 0 0 3458 0 3312 0 \n", + "5735 2977 1900 0 3504 4272 2977 1900 \n", + "3093 213 856 792 677 0 214 856 \n", + "2821 11699 10146 3000 1000 1000 1000 2000 \n", + "151 25128 24576 2000 1500 2000 1500 0 \n", + "8558 264 414 0 264 264 264 414 \n", + "4571 316 316 0 316 316 316 316 \n", + "6536 780 0 0 390 1366 0 0 \n", + "8244 0 0 12058 0 0 0 0 \n", + "3738 26771 27175 4130 0 1100 1200 1000 \n", + "625 19398 0 4000 0 811 1000 0 \n", + "175 18056 17618 1600 1300 600 1600 0 \n", + "4563 150000 0 0 0 0 150000 0 \n", + "5874 9301 0 0 0 370 9301 0 \n", + "5682 24090 13977 0 0 0 40000 507 \n", + "4293 30050 9993 0 0 6390 30050 9993 \n", + "5792 16961 3000 0 0 12200 16961 3000 \n", + "8663 5889 300 0 0 0 5889 300 \n", + "465 49525 0 0 0 0 49525 0 \n", + "7768 13206 10583 0 212 3066 13206 212 \n", + "2716 24192 25905 0 23150 2000 0 2119 \n", + "1460 28492 29850 1800 1500 2100 0 1800 \n", + "3194 10339 8710 1200 1423 754 0 1000 \n", + "5224 27754 28186 1800 1500 3000 0 1001 \n", + "4075 -155 -155 2000 0 0 155 0 \n", + "6030 13657 16356 1510 3042 0 600 2942 \n", + "8678 14526 15026 1214 3100 0 500 500 \n", + "2893 16615 16944 1562 2301 0 610 604 \n", + "3036 12581 12305 1517 2852 0 500 700 \n", + "4869 736 3542 2599 0 1852 0 3542 \n", + "... ... ... ... ... ... ... ... 
\n", + "2414 13468 6144 1000 0 2400 100 0 \n", + "5594 181137 184009 0 0 0 0 6000 \n", + "8269 11223 12079 1500 0 1892 0 1042 \n", + "2314 20512 20831 1600 800 950 1000 800 \n", + "5605 600 300 0 0 0 0 0 \n", + "6755 750 750 0 0 0 0 0 \n", + "5740 15509 16558 1500 1595 750 0 1452 \n", + "4720 62945 61700 2100 0 9700 0 0 \n", + "2866 300 300 0 0 0 0 0 \n", + "5221 16119 16548 0 1000 2000 0 1000 \n", + "8674 12721 11946 2800 0 1000 2000 0 \n", + "6727 26295 34171 0 0 924 0 9648 \n", + "6438 322 322 0 0 0 0 0 \n", + "1223 19076 20444 0 0 1500 0 1700 \n", + "3094 1650 1650 0 0 0 0 0 \n", + "7570 1250 650 0 0 0 0 0 \n", + "203 18737 18148 0 0 0 0 0 \n", + "5120 11411 10652 0 4000 0 2395 0 \n", + "4875 10762 10126 0 2500 1000 400 0 \n", + "5731 150 150 0 0 0 0 0 \n", + "4746 2400 2400 0 0 0 0 0 \n", + "8322 300 300 0 0 0 0 0 \n", + "1760 150 150 0 0 0 0 0 \n", + "359 2400 2400 0 0 0 0 0 \n", + "929 300 300 0 0 0 0 0 \n", + "4454 2550 1950 0 0 0 0 0 \n", + "3653 420 420 0 0 0 0 0 \n", + "8362 2400 2400 0 0 0 0 0 \n", + "5270 2400 2400 0 0 0 0 0 \n", + "1865 2400 1800 0 0 0 0 0 \n", + "\n", + " PAY_AMT6 DEFAULT_NEXT_MONTH p_DEFAULT_NEXT_MONTH p_FEMALE_ADVERSARY \n", + "7870 0 0 0.220037 0.591124 \n", + "5735 0 0 0.220037 0.591124 \n", + "3093 0 0 0.220296 0.591124 \n", + "2821 2000 0 0.220327 0.591124 \n", + "151 1200 0 0.220592 0.591124 \n", + "8558 264 1 0.220832 0.591124 \n", + "4571 316 1 0.220832 0.591124 \n", + "6536 431 0 0.220832 0.591124 \n", + "8244 0 0 0.220909 0.591124 \n", + "3738 1100 1 0.220939 0.591124 \n", + "625 0 0 0.220939 0.591124 \n", + "175 800 0 0.220969 0.591124 \n", + "4563 267 0 0.221026 0.591124 \n", + "5874 0 0 0.221026 0.591124 \n", + "5682 656 0 0.221026 0.591124 \n", + "4293 0 0 0.221026 0.591124 \n", + "5792 493 0 0.221026 0.591124 \n", + "8663 165 0 0.221026 0.591124 \n", + "465 0 1 0.221026 0.591124 \n", + "7768 0 0 0.221026 0.591124 \n", + "2716 0 0 0.221036 0.591124 \n", + "1460 1000 0 0.221815 0.591124 \n", + "3194 3000 0 0.221815 0.591124 \n", + "5224 2500 1 0.221815 0.591124 \n", + "4075 0 0 0.222505 0.591124 \n", + "6030 600 1 0.223772 0.591124 \n", + "8678 0 0 0.223772 0.591124 \n", + "2893 779 1 0.223772 0.591124 \n", + "3036 500 0 0.223772 0.591124 \n", + "4869 0 1 0.223836 0.591124 \n", + "... ... ... ... ... 
\n", + "2414 57258 0 0.823498 0.591124 \n", + "5594 0 1 0.824548 0.591124 \n", + "8269 700 1 0.825212 0.591124 \n", + "2314 0 1 0.825360 0.591124 \n", + "5605 0 1 0.825784 0.591124 \n", + "6755 1500 0 0.825784 0.591124 \n", + "5740 0 1 0.826592 0.591124 \n", + "4720 0 1 0.826812 0.591124 \n", + "2866 0 1 0.827260 0.591124 \n", + "5221 1000 1 0.827260 0.591124 \n", + "8674 0 1 0.836850 0.591124 \n", + "6727 2000 1 0.836922 0.591124 \n", + "6438 0 1 0.836922 0.591124 \n", + "1223 0 1 0.836922 0.591124 \n", + "3094 0 1 0.838649 0.591124 \n", + "7570 0 0 0.838691 0.591124 \n", + "203 0 0 0.844051 0.591124 \n", + "5120 0 1 0.846597 0.591124 \n", + "4875 672 1 0.847984 0.591124 \n", + "5731 0 0 0.872702 0.591124 \n", + "4746 0 1 0.874011 0.591124 \n", + "8322 0 1 0.874011 0.591124 \n", + "1760 0 0 0.879777 0.591124 \n", + "359 0 1 0.881023 0.591124 \n", + "929 0 0 0.881023 0.591124 \n", + "4454 0 1 0.881023 0.591124 \n", + "3653 780 1 0.891281 0.591124 \n", + "8362 0 1 0.918990 0.591124 \n", + "5270 0 1 0.947069 0.591124 \n", + "1865 0 1 0.950247 0.591124 \n", + "\n", + "[1378 rows x 27 columns]" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "valid_yhat['p_FEMALE_ADVERSARY'] = adv_gbm.predict(h2o.H2OFrame(valid_yhat))['female'].as_data_frame()\n", + "valid_yhat[(valid_yhat['SEX'] == 'female') & \n", + " (valid_yhat['p_FEMALE_ADVERSARY'] > 0.58) & \n", + " (valid_yhat[yhat_name] > best_cut)]\\\n", + " .sort_values(by=yhat_name)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some of the women in this set also appear to have missed only 0-2 payments, and been late only 1-2 months on the few payments they missed, if any." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Shutdown H2O" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Are you sure you want to shutdown the H2O instance running at http://127.0.0.1:54321 (Y/N)? 
n\n" + ] + } + ], + "source": [ + "# be careful, this can erase your work!\n", + "h2o.cluster().shutdown(prompt=True)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.9" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/rmltk/debug.py b/rmltk/debug.py index 55728b4..c1f194e 100644 --- a/rmltk/debug.py +++ b/rmltk/debug.py @@ -1,3 +1,5 @@ +import pandas as pd + """ Copyright 2020 - Patrick Hall (jphall@gwu.edu) @@ -31,6 +33,76 @@ # TODO: model documentation # TODO: CV metrics for all measures, not just accuracy +# represent metrics as dictionary for use later +METRIC_DICT = { + +#### overall performance +'Prevalence': '(tp + fn) / (tp + tn +fp + fn)', # how much default actually happens for this group +'Accuracy': '(tp + tn) / (tp + tn + fp + fn)', # how often the model predicts default and non-default correctly for this group + +#### predicting default will happen +# (correctly) +'True Positive Rate': 'tp / (tp + fn)', # out of the people in the group *that did* default, how many the model predicted *correctly* would default +'Precision': 'tp / (tp + fp)', # out of the people in the group the model *predicted* would default, how many the model predicted *correctly* would default + +#### predicting default won't happen +# (correctly) +'Specificity': 'tn / (tn + fp)', # out of the people in the group *that did not* default, how many the model predicted *correctly* would not default +'Negative Predicted Value': 'tn / (tn + fn)', # out of the people in the group the model *predicted* would not default, how many the model predicted *correctly* would not default + +#### analyzing errors - type I +# false accusations +'False Positive Rate': 'fp / (tn + fp)', # out of the people in the group *that did not* default, how many the model predicted *incorrectly* would default +'False Discovery Rate': 'fp / (tp + fp)', # out of the people in the group the model *predicted* would default, how many the model predicted *incorrectly* would default + +#### analyzing errors - type II +# costly ommisions +'False Negative Rate': 'fn / (tp + fn)', # out of the people in the group *that did* default, how many the model predicted *incorrectly* would not default +'False Omissions Rate':'fn / (tn + fn)' # out of the people in the group the model *predicted* would not default, how many the model predicted *incorrectly* would not default +} + + +def get_metrics_ratios(cm_dict, _control_level): + + """ Calculates confusion matrix metrics in METRIC_DICT for each level of demographic feature. + Tightly coupled to cm_dict. + + :param cm_dict: Dictionary of Pandas confusion matrices, one matrix for each level. + :param _control_level: Control level in cm_dict. + :return: Tuple, Pandas frame of metrics for each level of demographic feature, Pandas frame of ratio metrics for + each level of demographic feature. 
+ + """ + + levels = sorted(list(cm_dict.keys())) + + eps = 1e-20 # for safe numerical operations + + # init return frames + metrics_frame = pd.DataFrame(index=levels) # frame for metrics + + # nested loop through: + # - levels + # - metrics + for level in levels: + + for metric in METRIC_DICT.keys(): + + # parse metric expressions into executable Pandas statements + expression = METRIC_DICT[metric].replace('tp', 'cm_dict[level].iat[0, 0]') \ + .replace('fp', 'cm_dict[level].iat[0, 1]') \ + .replace('fn', 'cm_dict[level].iat[1, 0]') \ + .replace('tn', 'cm_dict[level].iat[1, 1]') + + # dynamically evaluate metrics to avoid code duplication + metrics_frame.loc[level, metric] = eval(expression) + + # calculate metric ratios + ratios_frame = (metrics_frame.loc[:, :] + eps) / (metrics_frame.loc[_control_level, :] + eps) + ratios_frame.columns = [col + ' Ratio' for col in ratios_frame.columns] + + return metrics_frame, ratios_frame + def air(cm_dict, reference, protected): @@ -121,3 +193,4 @@ def smd(valid, x_name, yhat_name, reference, protected): print(yhat_name.title() + ' std. dev.: %.2f' % sigma) return (protected_yhat_mean - reference_yhat_mean) / sigma + diff --git a/rmltk/evaluate.py b/rmltk/evaluate.py index ce20910..b296a65 100644 --- a/rmltk/evaluate.py +++ b/rmltk/evaluate.py @@ -188,6 +188,111 @@ def cv_model_rank_select(valid, seed_, train_results, model_prefix, 'METRICS': best_model_frame} +def get_prauc(frame, y, yhat, pos=1, neg=0, res=0.01): + + """ Calculates precision, recall, and f1 for a pandas dataframe of y and yhat values. + + Args: + frame: Pandas dataframe of actual (y) and predicted (yhat) values. + y: Name of actual value column. + yhat: Name of predicted value column. + pos: Primary target value, default 1. + neg: Secondary target value, default 0. + res: Resolution by which to loop through cutoffs, default 0.01. + + Returns: + Pandas dataframe of precision, recall, and f1 values. + """ + + frame_ = frame.copy(deep=True) # don't destroy original data + dname = 'd_' + str(y) # column for predicted decisions + eps = 1e-20 # for safe numerical operations + + # init p-r roc frame + prauc_frame = pd.DataFrame(columns=['cutoff', 'recall', 'precision', 'f1']) + + # loop through cutoffs to create p-r roc frame + for cutoff in np.arange(0, 1 + res, res): + # binarize decision to create confusion matrix values + frame_[dname] = np.where(frame_[yhat] > cutoff, 1, 0) + + # calculate confusion matrix values + tp = frame_[(frame_[dname] == pos) & (frame_[y] == pos)].shape[0] + fp = frame_[(frame_[dname] == pos) & (frame_[y] == neg)].shape[0] + fn = frame_[(frame_[dname] == neg) & (frame_[y] == pos)].shape[0] + + # calculate precision, recall, and f1 + recall = (tp + eps) / ((tp + fn) + eps) + precision = (tp + eps) / ((tp + fp) + eps) + f1 = 2 / ((1 / (recall + eps)) + (1 / (precision + eps))) + + # add new values to frame + prauc_frame = prauc_frame.append({'cutoff': cutoff, + 'recall': recall, + 'precision': precision, + 'f1': f1}, + ignore_index=True) + + # housekeeping + del frame_ + + return prauc_frame + + +def get_youdens_j(frame, y, yhat, pos=1, neg=0, res=0.01): + + """ Calculates TPR, TNR, and Youden's J for a Pandas DataFrame of actual (_y_name) and predicted (_yhat_name) values + to select best cutoff for AUC-optimized classifier. + + :param frame: Pandas DataFrame of actual (_y_name) and predicted (_yhat_name) values. + :param y: Name of actual value column. + :param yhat: Name of predicted value column. + :param pos: Primary target value, default 1. 
+ :param neg: Secondary target value, default 0. + :param res: Resolution by which to loop through cutoffs, default 0.01. + :return: Pandas DataFrame of sensitivity, specificity, and Youden's J values. + + """ + + frame_ = frame.copy(deep=True) # don't destroy original data + dname = 'd_' + str(y) # column for predicted decisions + eps = 1e-20 # for safe numerical operations + + # init j_frame + j_frame = pd.DataFrame(columns=['cutoff', 'TPR', 'TNR', 'J']) + + # loop through cutoffs to create j_frame + for cutoff in np.arange(0, 1 + res, res): + + # binarize decision to create confusion matrix values + frame_[dname] = np.where(frame_[yhat] > cutoff, 1, 0) + + # calculate confusion matrix values + tp = frame_[(frame_[dname] == pos) & (frame_[y] == pos)].shape[0] + fp = frame_[(frame_[dname] == pos) & (frame_[y] == neg)].shape[0] + tn = frame_[(frame_[dname] == neg) & (frame_[y] == neg)].shape[0] + fn = frame_[(frame_[dname] == neg) & (frame_[y] == pos)].shape[0] + + # calculate precision, recall, and Youden's J + tpr = (tp + eps) / ((tp + fn) + eps) + tnr = (tn + eps) / ((tn + fp) + eps) + fnr = 1 - tnr + j = tpr + tnr - 1 + + # add new values to frame + j_frame = j_frame.append({'cutoff': cutoff, + 'TPR': tpr, + 'TNR': tnr, + 'FNR': fnr, + 'J': j}, + ignore_index=True) + + # housekeeping + del frame_ + + return j_frame + + def get_confusion_matrix(valid, y_name, yhat_name, by=None, level=None, cutoff=0.5): """ Creates confusion matrix from pandas DataFrame of y and yhat values, can be sliced @@ -199,8 +304,8 @@ def get_confusion_matrix(valid, y_name, yhat_name, by=None, level=None, cutoff=0 :param by: By variable to slice frame before creating confusion matrix, default None. :param level: Value of by variable to slice frame before creating confusion matrix, default None. :param cutoff: Cutoff threshold for confusion matrix, default 0.5. - :return: Confusion matrix as pandas DataFrame. + """ # determine levels of target (y) variable From 9c82319ecb0919ea7d7a927609b6f66aacd84d2e Mon Sep 17 00:00:00 2001 From: patrickh Date: Wed, 3 Jun 2020 21:25:21 -0400 Subject: [PATCH 3/3] add more lecture 3 materials --- README.md | 31 +++++++++++++++++++++++++++++++ 1 file changed, 31 insertions(+) diff --git a/README.md b/README.md index d155306..235032d 100644 --- a/README.md +++ b/README.md @@ -15,6 +15,8 @@ Materials for a technical, nuts-and-bolts course about increasing transparency, Corrections or suggestions? Please file a [GitHub issue](https://github.com/jphall663/GWU_rml/issues/new). +*** + ## Lecture 1: Interpretable Machine Learning Models ![Histogram, partial dependence, and ICE for a monotonic GBM and a credit card customer's most recent repayment status](/img/lecture_1.png) @@ -53,6 +55,8 @@ Corrections or suggestions? Please file a [GitHub issue](https://github.com/jpha * [When a Computer Program Keeps You in Jail](https://www.nytimes.com/2017/06/13/opinion/how-computers-are-harming-criminal-justice.html) * [When an Algorithm Helps Send You to Prison](https://www.nytimes.com/2017/10/26/opinion/algorithm-compas-sentencing-bias.html) + + ## Lecture 2: Post-hoc Explanation ![A decision tree surrogate model forms a flow chart of a more complex monotonic GBM](/img/lecture_2.png) @@ -93,6 +97,33 @@ Corrections or suggestions? 
Please file a [GitHub issue](https://github.com/jpha
 * [Machine Bias](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)
 * [Gender Shades](http://gendershades.org/)
 * [Explainable Neural Networks based on Additive Index Models](https://arxiv.org/pdf/1806.01933.pdf)
+
+
+
+## Lecture 3: Discrimination Testing and Remediation
+
+### Lecture 3 Class Materials
+
+* [Lecture Notes]()
+* [Lecture Video]()
+* Software Example: [Testing a Constrained Model for Discrimination and Remediating Discovered Discrimination](https://nbviewer.jupyter.org/github/jphall663/GWU_rml/blob/master/lecture_3.ipynb)
+
+### Lecture 3 Suggested Software
+
+* Python:
+  * [`aequitas`](https://github.com/dssg/aequitas)
+  * [`AIF360`](https://github.com/IBM/AIF360)
+  * [`Themis`](https://github.com/LASER-UMASS/Themis)
+
+### Lecture 3 Suggested Reading
+
+* **Introduction and Background**:
+
+* **Discrimination Testing and Remediation Techniques**:
+
+* **Links from Lecture**:
+
+
 
 
 ## Using Class Software Resources
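
The Lecture 3 software example in this patch leans on group-wise confusion matrices and ratio metrics such as the adverse impact ratio (AIR). The sketch below is a minimal, self-contained illustration of that arithmetic only; the helper functions, the toy data, and the 0.22 cutoff are assumptions made here for demonstration and do not reproduce the `rmltk` API or the notebook's actual results.

```python
# Minimal disparate impact analysis (DIA) sketch: group confusion matrices and AIR.
# Illustrative only; function names, toy data, and the cutoff are assumptions.
import numpy as np
import pandas as pd


def group_confusion_matrix(frame, y, yhat, group, level, cutoff):
    """Return tp, fp, tn, fn counts for one level of a demographic column."""
    sliced = frame[frame[group] == level]
    decision = (sliced[yhat] > cutoff).astype(int)  # 1 = predicted default
    return {'tp': int(((decision == 1) & (sliced[y] == 1)).sum()),
            'fp': int(((decision == 1) & (sliced[y] == 0)).sum()),
            'tn': int(((decision == 0) & (sliced[y] == 0)).sum()),
            'fn': int(((decision == 0) & (sliced[y] == 1)).sum())}


def adverse_impact_ratio(cm_protected, cm_reference):
    """AIR = favorable-outcome rate for the protected group divided by the same rate
    for the reference group; here 'favorable' means predicted non-default (tn + fn)."""
    def favorable_rate(cm):
        total = sum(cm.values())
        return (cm['tn'] + cm['fn']) / total if total else float('nan')
    return favorable_rate(cm_protected) / favorable_rate(cm_reference)


if __name__ == '__main__':
    # toy scored data: 1 = default (unfavorable outcome), 0 = no default (favorable outcome)
    rng = np.random.default_rng(12345)
    toy = pd.DataFrame({'SEX': rng.choice(['male', 'female'], size=1000),
                        'DEFAULT_NEXT_MONTH': rng.integers(0, 2, size=1000),
                        'p_DEFAULT_NEXT_MONTH': rng.uniform(0, 1, size=1000)})

    cms = {level: group_confusion_matrix(toy, 'DEFAULT_NEXT_MONTH', 'p_DEFAULT_NEXT_MONTH',
                                         'SEX', level, cutoff=0.22)
           for level in ('male', 'female')}

    # AIR near 1.0 suggests similar acceptance rates across groups
    print('AIR (female vs. male): %.2f' % adverse_impact_ratio(cms['female'], cms['male']))
```

In this credit default setting a predicted non-default is treated as the favorable decision, which is why the group acceptance rate uses `tn + fn`; AIR values well below 1.0 (traditionally below roughly 0.8, the four-fifths rule) are a common fair lending red flag.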