
Commit

upload docker image and deploy to Kubernetes
longtony committed Nov 19, 2023
1 parent 4f1e504 commit 4391742
Showing 11 changed files with 87 additions and 14 deletions.
Empty file modified Supporting-material/resize.sh
100644 → 100755
Empty file.
Empty file.
13 changes: 8 additions & 5 deletions project-ml-microservice-kubernetes/Dockerfile
@@ -2,17 +2,20 @@ FROM python:3.7.3-stretch

## Step 1:
# Create a working directory

WORKDIR /app
## Step 2:
# Copy source code to working directory

COPY app.py /app/
COPY model_data /app/model_data
COPY requirements.txt /app/
## Step 3:
# Install packages from requirements.txt
# hadolint ignore=DL3013

RUN pip install --upgrade pip &&\
    pip install --trusted-host pypi.python.org -r requirements.txt
## Step 4:
# Expose port 80

EXPOSE 80
## Step 5:
# Run app.py at container launch

CMD ["python", "app.py"]
6 changes: 3 additions & 3 deletions project-ml-microservice-kubernetes/app.py
@@ -20,7 +20,7 @@ def scale(payload):

@app.route("/")
def home():
html = f"<h3>Sklearn Prediction Home</h3>"
html = "<h3>Sklearn Prediction Home</h3>"
return html.format(format)

@app.route("/predict", methods=['POST'])
@@ -61,11 +61,11 @@ def predict():
# scale the input
scaled_payload = scale(inference_payload)
# get an output prediction from the pretrained model, clf
# load pretrained model as clf
clf = joblib.load("./model_data/boston_housing_prediction.joblib")
prediction = list(clf.predict(scaled_payload))
# TO DO: Log the output prediction value
return jsonify({'prediction': prediction})

if __name__ == "__main__":
# load pretrained model as clf
clf = joblib.load("./model_data/boston_housing_prediction.joblib")
app.run(host='0.0.0.0', port=80, debug=True) # specify port=80
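Once the app is reachable on host port 8000 (via docker run -p 8000:80 or the kubectl port-forward below), the /predict route can be exercised directly; a sketch using the same JSON payload that appears in the captured logs:

# POST the sample housing-features payload to the prediction endpoint
curl -X POST http://localhost:8000/predict \
     -H "Content-Type: application/json" \
     -d '{"CHAS":{"0":0},"RM":{"0":6.575},"TAX":{"0":296.0},"PTRATIO":{"0":15.3},"B":{"0":396.9},"LSTAT":{"0":4.98}}'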
Binary file not shown.
@@ -1 +1,22 @@
<paste log output from Docker prediction, here>
REPOSITORY TAG IMAGE ID CREATED SIZE
api latest a05ce67946c3 7 seconds ago 1.26GB
<none> <none> e15b07d66fc6 14 minutes ago 1.26GB
* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: Do not use the development server in a production environment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 194-414-639
[2023-11-19 11:38:58,322] INFO in app: JSON payload:
{'CHAS': {'0': 0}, 'RM': {'0': 6.575}, 'TAX': {'0': 296.0}, 'PTRATIO': {'0': 15.3}, 'B': {'0': 396.9}, 'LSTAT': {'0': 4.98}}
[2023-11-19 11:38:58,343] INFO in app: Inference payload DataFrame:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
[2023-11-19 11:38:58,354] INFO in app: Scaling Payload:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
172.17.0.1 - - [19/Nov/2023 11:38:58] "POST /predict HTTP/1.1" 200 -
@@ -1 +1,33 @@
<paste log output from Kubernetes-mediated prediction, here>
pod/api-microservices created
NAMESPACE NAME READY STATUS RESTARTS AGE
default api-microservices 1/1 Running 0 12s
kube-system coredns-66bff467f8-cbzpm 1/1 Running 0 19h
kube-system coredns-66bff467f8-nppkt 1/1 Running 0 19h
kube-system etcd-khaled-virtual-machine 1/1 Running 0 12m
kube-system kube-apiserver-khaled-virtual-machine 1/1 Running 0 19h
kube-system kube-controller-manager-khaled-virtual-machine 1/1 Running 1 19h
kube-system kube-proxy-c7lr2 1/1 Running 0 19h
kube-system kube-scheduler-khaled-virtual-machine 1/1 Running 1 19h
kube-system storage-provisioner 1/1 Running 1 19h
Forwarding from 127.0.0.1:8000 -> 80
Forwarding from [::1]:8000 -> 80

* Serving Flask app "app" (lazy loading)
* Environment: production
WARNING: Do not use the development server in a production environment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://0.0.0.0:80/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 161-774-587
[2023-11-19 19:38:23,813] INFO in app: JSON payload:
{'CHAS': {'0': 0}, 'RM': {'0': 6.575}, 'TAX': {'0': 296.0}, 'PTRATIO': {'0': 15.3}, 'B': {'0': 396.9}, 'LSTAT': {'0': 4.98}}
[2023-11-19 19:38:23,864] INFO in app: Inference payload DataFrame:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
[2023-11-19 19:38:23,872] INFO in app: Scaling Payload:
CHAS RM TAX PTRATIO B LSTAT
0 0 6.575 296.0 15.3 396.9 4.98
172.17.0.1 - - [19/Nov/2023 19:38:23] "POST /predict HTTP/1.1" 200 -
4 changes: 3 additions & 1 deletion project-ml-microservice-kubernetes/run_docker.sh
@@ -4,9 +4,11 @@

# Step 1:
# Build image and add a descriptive tag

docker build --tag=api .
# Step 2:
# List docker images
docker image ls

# Step 3:
# Run flask app
docker run -p 8000:80 api
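After run_docker.sh starts the container, a short smoke test can confirm it is serving; a sketch that reuses the api tag and the 8000:80 port mapping from the script above:

# the home route should return the "Sklearn Prediction Home" heading
curl http://localhost:8000/
# confirm a container built from the api image is running
docker ps --filter ancestor=api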
8 changes: 6 additions & 2 deletions project-ml-microservice-kubernetes/run_kubernetes.sh
@@ -5,14 +5,18 @@
# Step 1:
# This is your Docker ID/path
# dockerpath=<>
dockerpath=longtony/api-microservices:v1.0.0

# Step 2
# Run the Docker Hub container with kubernetes

minikube start
kubectl create deploy api-microservice --image=$dockerpath

# Step 3:
# List kubernetes pods
kubectl get pods --all-namespaces

# Step 4:
# Forward the container port to a host

kubectl port-forward deployment/api-microservice 8000:80
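Before capturing the Kubernetes output, the deployment can be verified; a sketch assuming the api-microservice deployment name used in the script:

# check the deployment and its Flask startup log
kubectl get deploy api-microservice
kubectl logs deployment/api-microservice
# hit the app through the port-forward started above
curl http://localhost:8000/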
8 changes: 8 additions & 0 deletions project-ml-microservice-kubernetes/setup_env.sh
@@ -0,0 +1,8 @@
#!/usr/bin/env bash
wget https://github.com/hadolint/hadolint/releases/download/v2.12.0/hadolint-Linux-x86_64
sudo mv hadolint-Linux-x86_64 /usr/local/bin/hadolint
sudo chmod +x /usr/local/bin/hadolint


curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube-linux-amd64
sudo install minikube-linux-amd64 /usr/local/bin/minikube
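A quick way to confirm setup_env.sh put both tools on the PATH:

# both commands should print a version string
hadolint --version
minikube version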
5 changes: 4 additions & 1 deletion project-ml-microservice-kubernetes/upload_docker.sh
@@ -6,10 +6,13 @@
# Step 1:
# Create dockerpath
# dockerpath=<your docker ID/path>

dockerpath=longtony/api-microservices:v1.0.0
docker tag api $dockerpath
# Step 2:
# Authenticate & tag
echo "Docker ID and Image: $dockerpath"
docker login

# Step 3:
# Push image to a docker repository
docker push $dockerpath
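To confirm the push landed on Docker Hub, the same tag can be pulled back; this uses the dockerpath value from the script:

docker pull longtony/api-microservices:v1.0.0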
