National Action Council for Minorities in Engineering (NACME) Google Applied Machine Learning Intensive (AMLI) at the PARTICIPATING_UNIVERSITY
Developed by:
- Guysnove Lutumba, Northern Kentucky University
- Ella Kapanga, University of Kentucky
- Ricardo Medina, University of Texas
- Allan Joseph, Stevens Institute of Technology
The objective of this project is to predict the level of distress for patients using sensor data. Students will find a solution to this problem using graph neural networks (GNNs) where sensors are represented as graph nodes and their proximity is represented by edges.
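To make the graph representation concrete, here is a minimal sketch of how sensors could become graph nodes with proximity-based edges. The sensor positions and the distance threshold below are illustrative assumptions, not values from the project:

```python
import numpy as np

# Hypothetical 2D placements for four body-worn sensors (illustrative only).
positions = np.array([
    [0.0, 0.0],   # RESP (chest)
    [0.1, 0.0],   # HR (chest)
    [0.8, 0.3],   # GSR (palm)
    [0.7, 0.4],   # EMG (arm)
])

# Pairwise Euclidean distances between every sensor pair (broadcasting).
dists = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)

# Sensors closer than an assumed threshold get an edge; self-loops excluded.
# The resulting adjacency matrix is what a GNN layer would consume.
threshold = 0.5
adjacency = ((dists < threshold) & (dists > 0)).astype(int)
print(adjacency)
```

In a GNN, each node would additionally carry a feature vector (the sensor's signal window), and message passing would run along the edges defined by this adjacency matrix.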
This project was built to improve the Assuage app. Assuage was designed for remote symptom monitoring in rural cancer patients: it asks survey-based questions to determine whether or not they are in distress. This project builds on top of the Assuage project. With our application, health officials would use their patients' biometrics, collected through their Apple Watches or Fitbits, to assess health instead of asking them questions.
The dataset used in this project is from the study Wearable and Automotive Systems for Affect Recognition from Physiology. In this study, researchers collected biometrics from nine individuals across twenty-seven driving runs, though only ten of the runs were usable. The biometrics collected were respiration rate (RESP), heart rate (HR), galvanic skin response (GSR), electromyogram (EMG), blood volume pulse (BVP), and electrocardiogram (EKG).
In this project we used only RESP, HR, GSR, and EMG to measure whether or not a patient is in distress. Our parameter selection was influenced by the study Classification of Stress Recognition using Artificial Neural Network, which focused on determining whether or not someone was stressed. Stress is different from distress, but there are similarities between the two. That study was able to determine when someone is stressed with 99% accuracy, so we used the same parameters in creating our models. We obtained promising results but were not able to reproduce the study's results: our highest-accuracy model was a logistic regression model, at 86% accuracy.
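The kind of classifier described above can be sketched with scikit-learn. The data below is synthetic stand-in data (random features labeled by a toy rule), not the driving-study biometrics, so the printed accuracy is illustrative only:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for windowed biometric features: RESP, HR, GSR, EMG.
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))

# Toy labeling rule (assumption for illustration): elevated HR + GSR
# loosely indicate distress, plus a little noise.
y = (X[:, 1] + X[:, 2] + 0.3 * rng.normal(size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Binary distress / no-distress classifier on the four features.
model = LogisticRegression().fit(X_train, y_train)
print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

On the real dataset, the features would be statistics (e.g. per-window means) computed from the RESP, HR, GSR, and EMG signals rather than raw random draws.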
It is possible to measure patients' distress levels with an Artificial Neural Network (ANN), but at the moment it is not possible to implement such a model and system in an app. The biometrics that Apple Watch hardware can measure are limited: Apple Watches can currently measure RESP and HR, but GSR and EMG are not available, and those signals are crucial for determining whether or not someone is in distress according to the studies we read. That being said, if Apple adds the missing signals in the future, implementing the ANN model would be feasible.
- Fork this repo
- Change directories into your project
- On the command line, install the dependencies:
pip3 install pyECG
pip3 install wfdb
pip3 install seaborn
pip3 install jupyter
pip3 install numpy
- On the command line, start the notebook server:
jupyter notebook
- Locate and open the file you downloaded from this repo