Operationalizing Machine Learning Models on Azure

Author: Viji Ekambaram

Introduction

As Data Scientists, one of the most pressing challenges we face is how to operationalize machine learning models so that they are robust, cost-effective, and scalable enough to handle traffic demand. With advanced cloud technologies and serverless computing, there are now cost-effective platforms (pay based on usage) that auto-scale (scaling in and out with traffic). Data scientists can use these to accelerate machine learning model deployment without having to worry about the infrastructure.

This blog discusses one such methodology: taking machine learning code and a model developed locally in a Jupyter notebook and deploying them to the Azure environment for real-time predictions.

ML Implementation Architecture

[Figure: ML Implementation Architecture on Azure]

We have used Azure Functions to deploy the Model Scoring and Feature Store Creation code into production. Azure Functions is a FaaS (Function as a Service) offering: event-based, serverless computing that accelerates development without the need to manage infrastructure. Azure Functions comes with some useful capabilities:

1. Choice of Programming Languages
You can work with the language of your choice: C#, Node.js, Java, or Python.

2. Event-driven and Scalable
You can use built-in triggers and bindings such as HTTP, event, timer, and queue triggers to define when a function is invoked (a minimal sketch follows this list). The platform scales automatically with the workload.
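
To make this concrete, here is a minimal sketch of an HTTP-triggered Azure Function in Python; the greeting logic is purely illustrative, not from the original post:

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Invoked whenever the HTTP trigger defined in function.json fires.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)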

ML Implementation Process

Once the code is developed, follow the best practices below to make the machine learning code production-ready, then deploy it as an Azure Function using the steps that follow.

[Figure: ML Implementation Process]

Azure Function Deployment Steps Walkthrough

The Visual Studio Code editor with the Azure Functions extension is used to create a serverless HTTP endpoint in Python.

1. Sign in to Azure


2. Create a new project. In the prompts that follow, select Python as the language and HTTP trigger as the trigger (based on the requirement)


3. The Azure Function is created with the folder structure shown below. Write your logic in __init__.py, or copy the code in if it is already developed

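The Azure Functions extension typically generates a layout like the following, where <functionname> is the name chosen when creating the function:

<project-root>/
├── <functionname>/
│   ├── __init__.py
│   └── function.json
├── host.json
├── local.settings.json
└── requirements.txt

A minimal sketch of a model-scoring __init__.py, assuming hypothetical load_model and load_features helpers (fleshed out in steps 7 and 8 below) and a scikit-learn-style model:

import json
import logging

import azure.functions as func

from .scoring import load_model, load_features  # hypothetical helper module

# Load the model once per worker process so it is reused across invocations.
model = load_model()

def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        payload = req.get_json()
        features = load_features(payload["customer_id"])
        prediction = model.predict(features)
        return func.HttpResponse(
            json.dumps({"prediction": prediction.tolist()}),
            mimetype="application/json",
        )
    except Exception:
        logging.exception("Model scoring failed")
        return func.HttpResponse("Scoring error", status_code=500)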

4. function.json defines the bindings for the function; in this case, an HTTP trigger

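For reference, the HTTP-trigger function.json generated by the Python template looks roughly like this (authLevel and methods depend on the choices made when creating the function):

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}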

5. local.settings.json contains all the environment variables used in the code as key-value pairs

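A representative local.settings.json; the SQL and Blob connection-string keys here are illustrative names, not from the original post:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-connection-string>",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "SQL_CONNECTION_STRING": "<azure-sql-odbc-connection-string>",
    "MODEL_BLOB_CONNECTION_STRING": "<blob-storage-connection-string>"
  }
}

Note that local.settings.json is used only on your machine; mirror these values in the function app's application settings once deployed to Azure.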

6. requirements.txt lists all the libraries that need to be pip-installed

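An illustrative requirements.txt for a scoring function that reads from Blob Storage and Azure SQL; the exact set depends on your model:

azure-functions
azure-storage-blob
pandas
pyodbc
scikit-learn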

7. As the model is stored in Blob Storage, add code along these lines to read it from the blob

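A minimal sketch, assuming the model was pickled into a hypothetical models container as model.pkl and that the connection string is exposed through the MODEL_BLOB_CONNECTION_STRING setting from step 5:

import os
import pickle

from azure.storage.blob import BlobServiceClient

def load_model():
    # Connect to the storage account that holds the serialized model.
    service = BlobServiceClient.from_connection_string(
        os.environ["MODEL_BLOB_CONNECTION_STRING"]
    )
    blob = service.get_blob_client(container="models", blob="model.pkl")
    # Download the blob contents and deserialize the model object.
    return pickle.loads(blob.download_blob().readall())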

8. Read the Feature Store data from Azure SQL DB

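A minimal sketch using pyodbc and pandas; the feature_store table and customer_id column are hypothetical stand-ins for your own schema:

import os

import pandas as pd
import pyodbc

def load_features(customer_id: str) -> pd.DataFrame:
    # The ODBC connection string comes from application settings (step 5).
    conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
    # A parameterized query keeps request input out of the SQL text.
    query = "SELECT * FROM feature_store WHERE customer_id = ?"
    return pd.read_sql(query, conn, params=[customer_id])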

9. Test locally. Choose Debug -> Start Debugging; the function runs locally and exposes a local API endpoint

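Once the host is running, you can exercise the local endpoint from a quick script; model_scoring is a hypothetical function name, and 7071 is the Functions host's default local port:

import requests

# POST a sample payload to the locally running function.
resp = requests.post(
    "http://localhost:7071/api/model_scoring",
    json={"customer_id": "12345"},
)
print(resp.status_code, resp.json())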

10. Publish to your Azure account using the following command:

func azure functionapp publish <functionappname> --build remote --additional-packages "python3-dev libevent-dev unixodbc-dev build-essential libssl-dev libffi-dev"


11. Log in to the Azure Portal and go to the Azure Functions resource to get the API endpoint for model scoring


Conclusion

The deployed API can also be integrated with front-end applications for real-time predictions.
Happy Learning!
