
Tuesday, December 24, 2019

Publishing Keras Model API with TensorFlow Serving

Building an ML model is a crucial task. Running an ML model in production is no less complex or important. I wrote a post in the past about serving an ML model through a Flask REST API - Publishing Machine Learning API with Python Flask. While this approach works, it lacks some important features:

  • Model versioning 
  • Request batching 
  • Multithreading 

TensorFlow comes with a set of tools to help you run ML models in production. One of these tools is TensorFlow Serving. There is an excellent tutorial that describes how to configure and run it - TensorFlow Serving with Docker. I follow the same steps in my example.

Read more in my Towards Data Science post.

Monday, April 1, 2019

Publishing Machine Learning API with Python Flask

Flask is fun and easy to set up, as the Flask website says. And that's true. This microframework for Python offers a powerful way of annotating Python functions with REST endpoints. I'm using Flask to publish an ML model API so it is accessible to third-party business applications.

This example is based on XGBoost.

For better code maintenance, I would recommend using a separate Jupyter notebook from which the ML model API will be published. Import the Flask module along with Flask-CORS:


The model is trained on the Pima Indians Diabetes Database. The CSV data can be downloaded from here. To construct the Pandas data frame variable used as input for the model predict function, we need to define an array of dataset columns:
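For illustration, the column list and a sample data frame could look like this (column names follow the standard Pima Indians Diabetes CSV header; adjust them to match your file):

```python
import pandas as pd

# Feature columns of the Pima Indians Diabetes dataset, in training order
dataset_columns = ["Pregnancies", "Glucose", "BloodPressure", "SkinThickness",
                   "Insulin", "BMI", "DiabetesPedigreeFunction", "Age"]

# One sample row shaped the way model.predict expects it
sample_df = pd.DataFrame([[6, 148, 72, 35, 0, 33.6, 0.627, 50]],
                         columns=dataset_columns)
```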


The previously trained and saved model is loaded using Pickle:
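A minimal loading helper might look like this (the file name is hypothetical; use whatever path the model was saved to):

```python
import pickle

def load_model(path="diabetes_model.pkl"):
    """Load a previously trained model saved with pickle.dump."""
    with open(path, "rb") as f:
        return pickle.load(f)
```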


It is always good practice to do a test run and check that the model performs well. Construct a data frame with an array of column names and an array of data (using new data, not present in the train or test datasets). Call two functions - model.predict and model.predict_proba. I often prefer model.predict_proba: it returns the probability of class 0/1, which helps interpret the result against a certain range (0.25 to 0.75, for example). A Pandas data frame is constructed with the sample payload and then the model prediction is executed:
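The range-based interpretation could be sketched as below. It works with any classifier exposing predict_proba (XGBoost in this post); the 0.25/0.75 cut-offs are just the example range from the text:

```python
def interpret_prediction(model, df, low=0.25, high=0.75):
    """Return the probability of class 1 plus a coarse verdict based on a range."""
    proba = float(model.predict_proba(df)[0][1])
    if proba <= low:
        verdict = "likely 0"
    elif proba >= high:
        verdict = "likely 1"
    else:
        verdict = "uncertain"
    return proba, verdict
```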


Flask API. Make sure you enable CORS, otherwise the API call will not work from another host. Write the annotation before the function you want to expose through the REST API. Provide an endpoint name and the supported REST methods (POST in this example). Payload data is retrieved from the request, a Pandas data frame is constructed, and the model predict_proba function is executed:


The response JSON string is constructed and returned as the function result. I'm running Flask in a Docker container, which is why 0.0.0.0 is used as the host it runs on. Port 5000 is mapped as an external port, which allows calls from the outside.

While it works to start the Flask interface directly in a Jupyter notebook, I would recommend converting it to a Python script and running it from the command line as a service. Use the Jupyter nbconvert command to convert it to a Python script:

jupyter nbconvert --to python diabetes_redsamurai_endpoint_db.ipynb

The Python script with the Flask endpoint can be started as a background process with the PM2 process manager. This allows running the endpoint as a service and starting other processes on different ports. PM2 start command:

pm2 start diabetes_redsamurai_endpoint_db.py


pm2 monit displays info about the running processes:


The ML model classification REST API call from Postman, through the endpoint served by Flask:


More info:

- GitHub repo with source code
- Previous post about XGBoost model training

Tuesday, November 13, 2018

Amazon SageMaker Model Endpoint Access from Oracle JET

If you are implementing a machine learning model with Amazon SageMaker, you will likely want to know how to access the trained model from the outside. There is a good article posted on the AWS Machine Learning Blog on this topic - Call an Amazon SageMaker model endpoint using Amazon API Gateway and AWS Lambda. I went through the described steps and implemented a REST API for my own model. I went one step further and tested the API call from a JavaScript application implemented with Oracle JET, the free and open source JavaScript toolkit.

I will not go deep into the machine learning part in this post; I will focus exclusively on the AWS SageMaker endpoint. I'm using the Jupyter notebook from Chapter 2 of the book Machine Learning for Business. At the end of the notebook, once the machine learning model is created, we initialize the AWS endpoint (name: order-approval). Think of it as a kind of access point: through this endpoint we can call the prediction function:


Wait around 5 minutes until the endpoint starts. Then you should see the endpoint entry in SageMaker:


How do we expose the endpoint so it is accessible from outside? Through AWS Lambda and AWS API Gateway.

AWS Lambda

Go to the AWS Lambda service and create a new function. I already have a function, with Python 3.6 set as the runtime. AWS Lambda acts as a proxy between the endpoint and the API. This is the place where we can prepare input data and parse the response, before returning it to the API:


The function must be granted a role to access SageMaker resources:


This is the function implementation. The endpoint name is moved out into an environment variable. The function gets the input, calls the SageMaker endpoint, and does some minimal processing of the response:
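A sketch of such a handler is below. The payload key and the parse_prediction post-processing are assumptions for illustration; the endpoint name comes from the ENDPOINT_NAME environment variable as described:

```python
import os

def lambda_handler(event, context):
    """Proxy between API Gateway and the SageMaker endpoint."""
    import boto3  # lazy import: boto3 is provided by the Lambda runtime
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=os.environ["ENDPOINT_NAME"],  # set in Lambda configuration
        ContentType="text/csv",
        Body=event["data"])                        # hypothetical payload key
    raw = response["Body"].read().decode("utf-8")
    return parse_prediction(raw)

def parse_prediction(raw):
    """Minimal response processing: map the returned probability to a decision."""
    proba = float(raw)
    return {"approval_required": proba >= 0.5, "probability": proba}
```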


We can test the Lambda function with a test payload. This is the test payload I'm using - an encoded list of parameters for the machine learning model. The parameters describe a purchase order, and the model decides whether manual approval is required. Decision rule: if the PO was raised by someone not from IT, but they order an IT product, manual approval is required. Read more about it in the book mentioned above. Test payload data:


Run a test execution; the model responds that manual approval for the PO is required:


AWS API Gateway

The final step is to define the API Gateway. The client will call the Lambda function through the API:


I have defined a REST resource and a POST method for the API gateway. The client request goes through the API call and is then directed to the Lambda function, which calls SageMaker for a prediction based on the client input data:


The POST method is set to call the Lambda function (a function with this name was created above):


Once the API is deployed, we get a URL. Make sure to add the REST resource name at the end. From Oracle JET we can use a simple jQuery call to execute the POST method. Once the asynchronous response is received, we display a notification message:


Oracle JET displays the prediction received from SageMaker - manual review is required for the current PO:


Download the Oracle JET sample application with the AWS SageMaker API call from my GitHub repo.

Tuesday, September 27, 2016

BPM Worklist API 12.2.1.1 and Custom ADF 12.2.1.1 Application

I have updated my sample app showing BPM API usage in an ADF application to 12.2.1.1. Originally this was developed with ADF/BPM 11.1.1.7 - Dynamic ADF Buttons Solution for Oracle BPM Outcomes. There are several changes in BPM library usage. I will describe them all step by step.

Download the sample application - adfbpm12211.zip. This archive contains both the BPM and ADF apps. The BPM process implements two roles - request holiday and approve holiday:


The main goal of this use case: we don't want to use the out-of-the-box Oracle Worklist app, but prefer to develop our own business logic and manage the BPM process from a custom ADF app through the BPM API. It is important to initialize the Workflow context once during login; this can be a heavy operation and we should not repeat it each time the BPM API is invoked:


I'm using the authenticateOnBehalfOf method. This allows using an admin user as a proxy for the business user connection. Once the Workflow context is established, we can get the BPM context to use for BPM API calls; all this is done during login into the ADF app:


Assigned tasks are fetched through the Workflow context:


We can initiate a new task through the BPM API in our custom ADF app:


There is a way to generate the buttons that control task actions dynamically. Buttons can be generated from the task outcomes list obtained through the BPM API:


A task action can be executed with payload info, which allows passing the correct data to the next step in the process:


Let's see how it works. A user can start a new BPM task from ADF:


When the task is submitted for approval, the manager is assigned a task to approve or reject the holiday request. Buttons are generated dynamically based on the task outcomes:


To double-check the flow executed through the BPM API from ADF, we can review it in EM control:


The ADF application must import the BPM API JARs to be able to compile the Java code. In ADF 12.2.1.1 it is enough to import the BPM Services JARs:


There is no need to package these JARs into the application archive; they should be referenced at compile time only:

Sunday, July 22, 2012

Lightweight ADF Task Flow for BPM Human Tasks Overview

We can customize and include the available Oracle BPM Workspace task flows in our own ADF 11g application. In addition to the out-of-the-box task flows in Oracle BPM 11g, we can use the Oracle BPM 11g Worklist API and build custom ADF task flows. While requirements and scenarios differ, in the typical case I would suggest using the customized out-of-the-box Oracle BPM 11g task flows for core requirements and additionally implementing your own custom task flows based on the available Worklist API. Most likely, custom task flows based on the Worklist API will cover less functional, but more lightweight requirements.

I have implemented such a sample lightweight task flow - it shows a list of assigned tasks. Download the sample application - IntegratedBPMWorklistApp_v4.zip. Here is how this task flow looks - a simple table with assigned tasks:


The same application includes the BPM Workspace task flow with complete functionality for Human tasks:


In order to implement your own ADF task flow with access to the BPM context, follow the instructions from the Oracle documentation - 33 Building a Custom Worklist Client. I'm using the sample application from a previous post - Tips & Tricks How to Run Oracle BPM 11g PS5 Workspace from Custom ADF 11g Application. This application is extended with the additional libraries required to access the BPM API:


You can access BPM functionality through the BPM context. It takes time to initialize the BPM context, which is why Oracle recommends creating it once and keeping it in session scope. The ideal place to create the BPM context is the login method. Check the initBPMContext(username, password) method - it authenticates to the BPM environment and, when the context is created, stores it in session scope:


The ADF task flow contains one fragment with a read-only table listing the assigned tasks:


The table is created from a data control generated on top of a Java class, where we access the BPM context and retrieve the assigned tasks:


The data control class is responsible for constructing a list of tasks with task state "assigned". I'm reusing the BPM context from session scope:


A task query is created; we iterate over the results and populate the list of tasks:


The task list construction method is executed automatically when the table is rendered in the UI. For refresh functionality to work, we need to re-execute the retrieveAssignedUserTasks() method and refresh the iterator. Here is the page definition for the fragment:


Refresh invokes the task list construction and, at the end, calls the iterator refresh: