How to set up a cloud-ready machine learning model and Web API using Flask

A well-trained machine learning model is not useful until we can use it to make predictions on future data. Predictive models like this are used to solve all kinds of problems, so in this post we are going to learn how to use a trained machine learning model to predict on new data that is fed to it through an API.

In order to serve a machine learning model through a callable API, we need a trained model in the first place. In this post we are not going to build a model from scratch (I will write another post on how to create a basic artificial neural network model using TensorFlow); instead, we are going to use a model trained with the TensorFlow and Keras libraries on the famous 'Iris' (flower) dataset, whose target variable has 3 classes (setosa, versicolor, virginica).
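For context, here is a minimal, illustrative sketch of how a model and scaler like the ones loaded later in this post could be created and saved. The layer sizes, epoch count and lack of a train/test split are assumptions for brevity, not my actual training setup; the only thing that matters here is that a my_model.h5 file and a my_scaler.pkl file end up in the working directory.

import joblib
from sklearn.datasets import load_iris
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical

# Load the Iris data: 4 features, 3 classes
X, y = load_iris(return_X_y=True)

# Fit a MinMaxScaler on the features (illustrative: no train/test split here)
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)

# A small, illustrative feed-forward network
model = Sequential([
    Dense(8, activation='relu', input_shape=(4,)),
    Dense(8, activation='relu'),
    Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_scaled, to_categorical(y), epochs=150, verbose=0)

# Save the artifacts that the Flask app will load later
model.save("my_model.h5")
joblib.dump(scaler, "my_scaler.pkl")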

Another thing we need is the Flask library to create the API, so let's install Flask using pip:

pip install flask

After installing Flask, let's open Sublime Text 3 (or any text editor) and create a Python file. I am going to name this file my_app.py. Then I am going to check that the environment is set up properly by creating a basic Flask application with only one '/' route.

from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return '<h1>FLASK APP IS RUNNING!</h1>'

if __name__ == '__main__':
    app.run()

Now, let's open the command prompt in the working directory where my_app.py is located and run the program with the python my_app.py command. The Flask development server should start and print the address it is listening on (http://127.0.0.1:5000 by default).


If we open http://127.0.0.1:5000 in the browser, we should see the 'FLASK APP IS RUNNING!' message. If you see that page, we are set to proceed.


Now, we are going to follow the steps below to create an API using Flask and call it through a Python script.

  • First, we will load the model using the tensorflow.keras load_model() method.
  • Then, we will load the fitted Scaler object (saved as a pickle file) using the joblib load() method to transform the new incoming data.
  • Create a function that takes a model object, a scaler object and the new data, and returns a prediction.
  • Create an API endpoint using Flask.
  • Test the API using Postman.
  • Call the API using a Python script and get the prediction.

Load the model

We are going to import load_model from tensorflow.keras.models, since the model was built with TensorFlow. load_model() reads a model saved with Keras' model.save() method (here, an HDF5 .h5 file).

from tensorflow.keras.models import load_model

Then we will use the code below to load the model into a variable called flower_model. Here, the model was saved as my_model.h5 in the current working directory.

flower_model = load_model("my_model.h5")

Load the scaler object

I saved my fitted MinMaxScaler object as a pickle file, my_scaler.pkl, in the same working directory. So, I am going to use joblib to load this scaler object into a variable called flower_scaler. Before that, we need to import the joblib module.

import joblib
flower_scaler = joblib.load("my_scaler.pkl")

Creating a function that returns prediction

We are going to create a function that takes a model, a scaler and the new data. The data will arrive in JSON format, so we will read each field and save it into a variable. This way we can build a list with the same shape as the training features. We have 4 features in this case: sepal_length, sepal_width, petal_length and petal_width. Then we will transform the data with the scaler object and use the model to predict the class. The function code is given below.

def return_prediction(model, scaler, data):
    # Read the four features from the incoming JSON payload
    s_len = data["sepal_length"]
    s_wid = data["sepal_width"]
    p_len = data["petal_length"]
    p_wid = data["petal_width"]

    classes = np.array(['setosa', 'versicolor', 'virginica'])

    # Shape the values like a single training row: (1, 4)
    flower = [[s_len, s_wid, p_len, p_wid]]

    # Apply the same scaling that was used during training
    flower = scaler.transform(flower)

    # predict_classes() was removed in newer TensorFlow releases,
    # so take the argmax of the softmax output instead
    class_ind = np.argmax(model.predict(flower), axis=-1)[0]

    return classes[class_ind]

We need to import numpy as np at the top in order to convert the list of labels (['setosa', 'versicolor', 'virginica']) into a numpy array.

Creating API endpoint using Flask

I am going to create a route, /api/flower, that responds to HTTP POST requests. I am going to use Flask's request object to extract the JSON data posted by a client. Then I am going to use the return_prediction() function to predict on this new data and return a JSON response using jsonify.

@app.route("/api/flower", methods=['POST'])
def flower_prediction():
    # Parse the JSON body sent by the client
    content = request.json
    result = return_prediction(flower_model, flower_scaler, content)
    return jsonify(result)

We need to import request and jsonify from the flask library:

from flask import request, jsonify
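Putting the pieces together, the top of my_app.py now has all of the imports and the loading code in one place. This is the same code from the snippets above, just gathered in order; it is followed by the index() route, the return_prediction() function and the /api/flower route shown earlier.

from flask import Flask, request, jsonify
from tensorflow.keras.models import load_model
import joblib
import numpy as np

app = Flask(__name__)

# Load the trained model and the fitted scaler once, at startup
flower_model = load_model("my_model.h5")
flower_scaler = joblib.load("my_scaler.pkl")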

Testing the API using Postman

Now, I am going to run the program from the working directory with the command python my_app.py. This will launch the server, and we are ready to test the API.

Now, let's open Postman and issue a POST request to the endpoint http://127.0.0.1:5000/api/flower. We will create the request body in JSON format and send the request.

{
    "sepal_length":5.1,
    "sepal_width":3.5,
    "petal_length":1.4,
    "petal_width":0.2
}


The response comes back with a 200 OK status code, and the API returns its prediction, 'setosa', based on the input data.

Calling the API using Python script

Next, we are going to write a Python script to call the API. We need to import the requests library in order to do that.

import requests

Then we will create a dictionary object to send to the API.

flower_example = {
    "sepal_length":5.1,
    "sepal_width":3.5,
    "petal_length":1.4,
    "petal_width":0.2
}

Finally, we will call the API using the POST method and print the result.

result = requests.post("http://127.0.0.1:5000/api/flower", json=flower_example)
# The API returns JSON, so decode it to see the actual prediction
print(result.json())
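If you have several rows to score, you can simply loop over them and post each one. This is just a sketch to show the pattern; the measurements below are made-up examples, and the printed predictions will be whatever the model returns.

import requests

samples = [
    {"sepal_length": 5.1, "sepal_width": 3.5, "petal_length": 1.4, "petal_width": 0.2},
    {"sepal_length": 6.3, "sepal_width": 2.9, "petal_length": 5.6, "petal_width": 1.8},
]

for sample in samples:
    response = requests.post("http://127.0.0.1:5000/api/flower", json=sample)
    response.raise_for_status()  # fail loudly if the API returned an error status
    print(sample, "->", response.json())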


That's all for today. We will talk about how to deploy this model to the cloud in my next post. Follow me on My Twitter for software development related content. Have a great one!