
Machine Learning Model Deployment – Jose Garcia – Medium


There are many machine learning courses online, but very few bridge the gap between analysis and deployment. A perfect machine learning model that is 100% accurate is worthless if it cannot be deployed. Oftentimes when trying to deploy a model, you run into the issue of installing dependencies on each user's computer, and after you get through management and IT security, it can be months before the model makes it into production. This is where Docker containers come in.

Docker containers are lightweight, self-contained environments that can be deployed across multiple environments in seconds.

Why containerization?

  • Improves scalability
  • Apps deploy quickly, reliably, and consistently across deployment environments
  • Increased developer productivity

I will walk through a brief example of how Docker containers can be used to deploy a machine learning model to AWS Elastic Beanstalk in just a few minutes. All of the source code for this project is located on my Github.

Iris Flower Model Example:

The Iris Flower Kaggle Dataset is a popular machine learning dataset for classifying iris species from sepal and petal measurements. After training the model and saving it to a pickle file, it can be containerized and deployed to a website in minutes. I used Flask and Bootstrap to build the application. Beyond the web app itself, the only files required to containerize it are a Dockerfile and a requirements.txt.
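The training step itself isn't shown in the post; here is a minimal sketch of it, assuming scikit-learn's bundled copy of the iris data in place of the Kaggle CSV, and the KNN classifier implied by the pickle file's label in the directory listing:

```python
# Sketch only: scikit-learn's built-in iris data stands in for the Kaggle CSV.
import pickle

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# KNN classifier (the repo's pickle is labeled "KNN pickled model")
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")

# Serialize the trained model to the file name used in the article
with open("pickel_model.pkl", "wb") as f:
    pickle.dump(model, f)
```

The pickled file is what the Flask app loads at prediction time, so the container never needs the training data.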

Dockerfile setup:

FROM continuumio/anaconda3:4.4.0
COPY ./flask_demo /var/www/python/
WORKDIR /var/www/python/
RUN pip install -r requirements.txt
EXPOSE 8000
# app.py stands in here for the Flask entry point; the exact file name is in the Github source
CMD python app.py
  • FROM — sets the base image for the container
  • COPY — copies files from the host computer into the container
  • EXPOSE — exposes a port from the container
  • WORKDIR — sets the working directory for the container
  • RUN — runs pip install to install all modules listed in requirements.txt
  • CMD — runs the Flask application

After creating the Dockerfile, make sure the file has no extension (some editors silently append .txt); Docker will not find a file named Dockerfile.txt.

The requirements.txt file lists all dependencies required by the application.
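The original list did not survive as text; a plausible minimal requirements.txt for this app (package set assumed from its use of Flask and a scikit-learn KNN model, versions unpinned) would be:

```text
flask
scikit-learn
numpy
```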


After creating the Dockerfile and requirements.txt, the file structure should look like the directory below. All files are located on my Github.

--flask_demo #flask app directory
  --static #static file directory
  --templates #templates directory
  --pickel_model.pkl #KNN pickled model
  --requirements.txt #requirements file
  --… #flask application
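The Flask application itself lives on the author's Github; as a rough sketch of what it might look like, assuming the file is named app.py and guessing the route names and form fields:

```python
# Sketch of the Flask app; file name, routes, and form fields are assumptions.
import pickle

from flask import Flask, render_template, request

app = Flask(__name__)


def load_model(path="pickel_model.pkl"):
    # Load the pickled KNN model saved during training
    with open(path, "rb") as f:
        return pickle.load(f)


@app.route("/")
def index():
    # Bootstrap-styled form where the user enters flower measurements
    return render_template("index.html")


@app.route("/predict", methods=["POST"])
def predict():
    model = load_model()
    features = [[float(request.form[k]) for k in
                 ("sepal_length", "sepal_width", "petal_length", "petal_width")]]
    return render_template("result.html", species=model.predict(features)[0])


if __name__ == "__main__":
    # Bind to 0.0.0.0 so the app is reachable through Docker's port mapping
    app.run(host="0.0.0.0", port=8000)
```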

Testing Container on Local Machine

Before continuing, make sure you have Docker installed and running.

From your terminal window, navigate to the directory where the Dockerfile exists. You will first build the Docker container by executing the docker command below. This will take a few minutes.

docker build -t iris_flower_app .

After you have a successful build, you will execute the docker run command and specify the port that you will be using.

docker run -p 8000:8000 iris_flower_app

After this has run successfully, you should be able to navigate to localhost on port 8000 and see the web app. For my application, the Flask server listens on 0.0.0.0 inside the container; binding to 127.0.0.1 would make it unreachable through the port mapping.

AWS Elastic Beanstalk Deployment

To get started, you will first need to zip the Dockerfile and the flask_demo folder together, so that both sit at the root of the archive. If you zip the top-level directory that contains them instead, the deployment will fail because Elastic Beanstalk will not find the Dockerfile.

Step 1: Navigate to the AWS Elastic Beanstalk page

Step 2: Create a new Application

Step 3: Select Environment Tier

Step 4: Choose Docker as the preconfigured platform, upload the zip file and create the environment.

Step 5: The deployment process will take a few minutes. Once it completes, you can navigate to your web application at the URL Elastic Beanstalk provides.
