Upload Docker image and deploy to Kubernetes #49

Closed
wants to merge 5 commits
Empty file modified Supporting-material/resize.sh
100644 → 100755
15 changes: 9 additions & 6 deletions project-ml-microservice-kubernetes/Dockerfile
@@ -1,18 +1,21 @@
FROM python:3.7.3-stretch
FROM python:3.9.6

## Step 1:
# Create a working directory

WORKDIR /app
## Step 2:
# Copy source code to working directory

COPY app.py /app/
COPY model_data /app/model_data
COPY requirements.txt /app/
## Step 3:
# Install packages from requirements.txt
# hadolint ignore=DL3013

RUN pip install --upgrade pip &&\
pip install --trusted-host pypi.python.org -r requirements.txt
## Step 4:
# Expose port 80

EXPOSE 80
## Step 5:
# Run app.py at container launch

CMD ["python", "app.py"]
8 changes: 4 additions & 4 deletions project-ml-microservice-kubernetes/app.py
@@ -3,7 +3,7 @@
import logging

import pandas as pd
from sklearn.externals import joblib
import joblib
from sklearn.preprocessing import StandardScaler

app = Flask(__name__)
@@ -20,7 +20,7 @@ def scale(payload):

@app.route("/")
def home():
html = f"<h3>Sklearn Prediction Home</h3>"
html = "<h3>Sklearn Prediction Home</h3>"
return html.format(format)

@app.route("/predict", methods=['POST'])
@@ -61,11 +61,11 @@ def predict():
# scale the input
scaled_payload = scale(inference_payload)
# get an output prediction from the pretrained model, clf
# load pretrained model as clf
clf = joblib.load("./model_data/boston_housing_prediction.joblib")
prediction = list(clf.predict(scaled_payload))
# TO DO: Log the output prediction value
return jsonify({'prediction': prediction})

if __name__ == "__main__":
# load pretrained model as clf
clf = joblib.load("./model_data/boston_housing_prediction.joblib")
app.run(host='0.0.0.0', port=80, debug=True) # specify port=80
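The change from sklearn.externals.joblib to the standalone joblib package matches scikit-learn 0.23+, where the bundled alias was removed. A quick, optional sanity check that the pre-trained model still deserializes under the new imports (a sketch, assumed to run from project-ml-microservice-kubernetes/ with the updated requirements installed):

# Hypothetical one-off check; prints the unpickled estimator's type.
python -c "import joblib; print(type(joblib.load('./model_data/boston_housing_prediction.joblib')))"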
Binary file not shown.
@@ -1 +1,22 @@
<paste log output from Docker prediction, here>
REPOSITORY TAG IMAGE ID CREATED SIZE
api latest a05ce67946c3 7 seconds ago 1.26GB
<none> <none> e15b07d66fc6 14 minutes ago 1.26GB
* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: Do not use the development server in a production environment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 194-414-639
[2023-11-19 11:38:58,322] INFO in app: JSON payload:
{'CHAS': {'0': 0}, 'RM': {'0': 6.575}, 'TAX': {'0': 296.0}, 'PTRATIO': {'0': 15.3}, 'B': {'0': 396.9}, 'LSTAT': {'0': 4.98}}
[2023-11-19 11:38:58,343] INFO in app: Inference payload DataFrame:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
[2023-11-19 11:38:58,354] INFO in app: Scaling Payload:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
172.17.0.1 - - [19/Nov/2023 11:38:58] "POST /predict HTTP/1.1" 200 -
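For reference, a request of the shape logged above can be reproduced against the port published by run_docker.sh (a sketch; the project may drive this through its own prediction script instead):

# POST the same housing-features payload that appears in the log.
curl -X POST http://localhost:8000/predict \
  -H "Content-Type: application/json" \
  -d '{"CHAS":{"0":0},"RM":{"0":6.575},"TAX":{"0":296.0},"PTRATIO":{"0":15.3},"B":{"0":396.9},"LSTAT":{"0":4.98}}'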
@@ -1 +1,33 @@
<paste log output from Kubernetes-mediated prediction, here>
pod/api-microservices created
NAMESPACE NAME READY STATUS RESTARTS AGE
default api-microservices 1/1 Running 0 12s
kube-system coredns-66bff467f8-cbzpm 1/1 Running 0 19h
kube-system coredns-66bff467f8-nppkt 1/1 Running 0 19h
kube-system etcd-khaled-virtual-machine 1/1 Running 0 12m
kube-system kube-apiserver-khaled-virtual-machine 1/1 Running 0 19h
kube-system kube-controller-manager-khaled-virtual-machine 1/1 Running 1 19h
kube-system kube-proxy-c7lr2 1/1 Running 0 19h
kube-system kube-scheduler-khaled-virtual-machine 1/1 Running 1 19h
kube-system storage-provisioner 1/1 Running 1 19h
Forwarding from 127.0.0.1:8000 -> 80
Forwarding from [::1]:8000 -> 80

* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: Do not use the development server in a production environment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 161-774-587
[2023-11-19 19:38:23,813] INFO in app: JSON payload:
{'CHAS': {'0': 0}, 'RM': {'0': 6.575}, 'TAX': {'0': 296.0}, 'PTRATIO': {'0': 15.3}, 'B': {'0': 396.9}, 'LSTAT': {'0': 4.98}}
[2023-11-19 19:38:23,864] INFO in app: Inference payload DataFrame:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
[2023-11-19 19:38:23,872] INFO in app: Scaling Payload:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
172.17.0.1 - - [19/Nov/2023 19:38:23] "POST /predict HTTP/1.1" 200 -
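The pod listing above looks like the output of kubectl get pods --all-namespaces, and the Flask lines can be tailed directly from the pod named in that listing (a sketch using the pod name shown above, which may differ on another cluster):

# Show all pods, then follow the application logs from the running pod.
kubectl get pods --all-namespaces
kubectl logs api-microservices --follow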
28 changes: 14 additions & 14 deletions project-ml-microservice-kubernetes/requirements.txt
@@ -1,14 +1,14 @@
Click==7.0
Flask==1.0.2
itsdangerous==1.1.0
Jinja2==2.10.3
MarkupSafe==1.1.1
numpy==1.17.2
pandas==0.24.2
python-dateutil==2.8.0
pytz==2019.3
scikit-learn==0.20.3
scipy==1.3.1
six==1.12.0
Werkzeug==0.16.0
pylint==2.4.4
Click
Flask
itsdangerous
Jinja2
MarkupSafe
numpy
pandas
python-dateutil
pytz
scikit-learn
scipy
six
Werkzeug
pylint
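Removing the version pins lets pip resolve whatever releases are current at build time, which keeps the install compatible with Python 3.9 but makes builds non-reproducible. If pinning is wanted again later, one option (a sketch, run inside the environment where the packages were installed) is to capture the resolved set:

# Re-pin the dependencies that pip actually resolved.
pip freeze > requirements.txt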
4 changes: 4 additions & 0 deletions project-ml-microservice-kubernetes/run_docker.sh
@@ -4,9 +4,13 @@

# Step 1:
# Build image and add a descriptive tag
docker build --tag=api .

# Step 2:
# List docker images
docker image ls

# Step 3:
# Run flask app
docker run -p 8000:80 api
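With the additions above, the image is tagged api and container port 80 is published on host port 8000. A minimal smoke test, assuming the container is running in another terminal:

# The home route should return the "Sklearn Prediction Home" heading.
curl http://localhost:8000/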

8 changes: 6 additions & 2 deletions project-ml-microservice-kubernetes/run_kubernetes.sh
@@ -5,14 +5,18 @@
# Step 1:
# This is your Docker ID/path
# dockerpath=<>
#list nodes
dockerpath=longtony/api-microservices:v1.0.0

# Step 2
# Run the Docker Hub container with kubernetes

minikube start
kubectl create deploy api-microservice --image=$dockerpath

# Step 3:
# List kubernetes pods
kubectl get nodes

# Step 4:
# Forward the container port to a host

kubectl port-forward pod/api-microservice 8000:80
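Two observations on this script, offered as a hedged sketch rather than a required change: Step 3 lists nodes even though its comment refers to pods, and kubectl create deploy generates pod names with a hash suffix, so forwarding the deployment tends to be more reliable than forwarding a literal pod name:

# List the pods the deployment created (matches the Step 3 comment).
kubectl get pods --all-namespaces

# Forward through the deployment so the generated pod name does not matter.
kubectl port-forward deployment/api-microservice 8000:80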
7 changes: 7 additions & 0 deletions project-ml-microservice-kubernetes/setup_env.sh
@@ -0,0 +1,7 @@
#!/usr/bin/env bash
wget https://github.com/hadolint/hadolint/releases/download/v2.12.0/hadolint-Linux-x86_64
sudo mv hadolint-Linux-x86_64 /usr/local/bin/hadolint
sudo chmod +x /usr/local/bin/hadolint

curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube-linux-amd64
sudo install minikube-linux-amd64 /usr/local/bin/minikube
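An optional check that both tools landed on the PATH after this script runs (not part of the script itself):

hadolint --version
minikube version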
5 changes: 4 additions & 1 deletion project-ml-microservice-kubernetes/upload_docker.sh
@@ -6,10 +6,13 @@
# Step 1:
# Create dockerpath
# dockerpath=<your docker ID/path>

dockerpath=longtony/api-microservices:v1.0.0
docker tag api longtony/api-microservices:v1.0.0
# Step 2:
# Authenticate & tag
echo "Docker ID and Image: $dockerpath"
docker login

# Step 3:
# Push image to a docker repository
docker push longtony/api-microservices:v1.0.0
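Once the push succeeds, the tag referenced by run_kubernetes.sh is available from Docker Hub and can be pulled on any machine (sketch):

docker pull longtony/api-microservices:v1.0.0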