Tuesday, July 16, 2019

Fix for Oracle VBCS "Error 404--Not Found"

We are using a Pay As You Go Oracle VBCS instance and had an issue accessing the VBCS home page after starting the service. The service started successfully, without errors, but when accessing the VBCS home page URL - "Error 404--Not Found" was returned.

I raised a support ticket and must say - I received the response and help promptly. If you encounter a similar issue yourself, hopefully this post will shed some light on it.

Apparently "Error 404--Not Found" was returned because the VBCS instance wasn't initialized during instance start. It wasn't initialized because of expired passwords for the VBCS DB schemas. Yes, you read that right - internal system passwords expire in the Cloud too.

Based on the instructions given by Oracle Support, I was able to extract logs from the VBCS WebLogic instance (by connecting through SSH to the VBCS cloud machine) and provide them to the support team (yes, VBCS runs on WebLogic). They found password expiry errors in the log, similar to this:

weblogic.application.ModuleException: java.sql.SQLException: ORA-28001: the password has expired

Based on the provided instructions, I extracted the VBCS DB schema name and connected through SQL Developer. Then I executed a SQL statement given by the support team to reset all VBCS DB passwords in bulk. The next password expiry is set for 2019/January. Should it expire at all?

Summary: if you encounter "Error 404--Not Found" after starting a VBCS instance and trying to access the VBCS home page, then most likely (but not always) it will be related to the VBCS DB schema password expiry issue.

Monday, July 15, 2019

Forecast Model Tuning with Additional Regressors in Prophet

I’m going to share my experiment results with Prophet additional regressors. My goal was to check how an extra regressor would affect the forecast calculated by Prophet.

I’m using a dataset from Kaggle — the Bike Sharing in Washington D.C. Dataset. The data comes with the number of bike rentals per day and the weather conditions. I have created and compared three models:

1. A time series Prophet model with date and number of bike rentals
2. A model with an additional regressor — weather temperature
3. A model with additional regressors — weather temperature and weather state (raining, sunny, etc.)

We should see the effect of the regressors by comparing these three models; a minimal sketch of the second model is shown below.
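To make the setup concrete, here is a sketch of the second model, assuming the Kaggle daily file day.csv with columns dteday (date), cnt (rentals per day) and temp (temperature); the file and column names are my assumptions, not the exact experiment code:

    import pandas as pd
    from fbprophet import Prophet  # in newer releases: from prophet import Prophet

    # load the daily bike sharing data and rename the columns Prophet expects
    df = pd.read_csv('day.csv', parse_dates=['dteday'])
    df = df.rename(columns={'dteday': 'ds', 'cnt': 'y'})[['ds', 'y', 'temp']]

    m = Prophet()
    m.add_regressor('temp')  # weather temperature as an extra regressor
    m.fit(df)

    # regressor values must also be provided for the future dates
    future = m.make_future_dataframe(periods=30)
    future = future.merge(df[['ds', 'temp']], on='ds', how='left')
    future['temp'] = future['temp'].fillna(df['temp'].mean())

    forecast = m.predict(future)
    print(forecast[['ds', 'yhat']].tail())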

Read more in my Towards Data Science post.

Wednesday, July 3, 2019

Oracle ADF - A Status Update

Oracle posted an information update for Oracle ADF - "With the continuous investment and usage of Oracle ADF inside Oracle we expect external customers will also continue to enjoy the benefits of Oracle ADF for many more years."

Read the complete post here: https://blogs.oracle.com/jdeveloperpm/oracle-adf-a-status-update

Happy to read the update; it sounds positive. Thanks to Oracle for taking the time to publish this information. #adf #middleware #javascript #oracle #cloud #oraclefusion

Serving Prophet Model with Flask — Predicting Future

This solution demonstrates how to serve a Prophet model API on the Web with Flask. Prophet is an open-source Python library developed by Facebook to predict time series data.

An accurate forecast and future prediction are crucial for almost any business. This is an obvious thing and needs no explanation. There is a concept of time series data: this data is ordered by date, and typically each date is assigned one or more values specific to that date. Machine Learning powered models can generate forecasts based on time series data. Such forecasts can be an important source of information for business decisions; a minimal serving sketch is shown below.
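As a quick illustration of the idea, serving an already trained Prophet model with Flask could look roughly like this (the model file name and endpoint are assumptions; the full solution is in the post):

    import pickle
    from flask import Flask, request

    app = Flask(__name__)
    # hypothetical file with a previously trained Prophet model
    model = pickle.load(open('prophet_model.pkl', 'rb'))

    @app.route('/forecast', methods=['POST'])
    def forecast():
        periods = int(request.json.get('periods', 30))
        future = model.make_future_dataframe(periods=periods)
        result = model.predict(future)
        # return only the forecasted horizon as JSON
        return result[['ds', 'yhat', 'yhat_lower', 'yhat_upper']] \
            .tail(periods).to_json(orient='records')

    if __name__ == '__main__':
        app.run(host='0.0.0.0', port=5000)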

Read more in my Towards Data Science post.

Wednesday, June 26, 2019

Oracle Developer Tools - Do They Still Exist?

People are frustrated about @OracleADF @JDeveloper on social media - "ADF boat has no captain", etc. I agree @Oracle is to blame big time for such lame handling of its own Developer Tools stack. @Oracle please wake up and spend some budget on @OracleADF. Read more:

Oracle VBCS - right now this tool gets most of Oracle's focus. It is supposed to offer a declarative #JavaScript development experience in the Cloud. Not well received by the community. Are there any VBCS customers? Please respond if yes.

Oracle APEX - comes with a very strong community (mostly backed by DB folks), but it is not strategic for Oracle. It is more likely to be used by PL/SQL guys than by Java or Web developers.

Oracle JET - highly promoted by Oracle. A set of open-source #JavaScript libs, glued together by an Oracle layer. Nice, but it can't be used as a direct replacement for @OracleADF - JET is the UI layer only. Oracle folks often confuse the community by saying Oracle JET is a great option to replace ADF.

Oracle Forms - still alive, but obviously it can't be a strategic Oracle platform. A few years ago, Oracle was promoting Forms modernization to @OracleADF.

Summary - the Oracle Developer tools offering is weak. The lack of Oracle investment in development tools makes the Oracle developer community shrink.

Saturday, June 15, 2019

Running Oracle JET as Progressive Web App

Progressive Web Apps (PWA) are a hot topic in web development these days. Read more about them here - Progressive Web Apps. The beauty and power behind PWA: a user can install a web app on their mobile device without going through the app store. This simplifies the update process too - when a new version of the app is available, the user gets it straight away, because it is essentially a Web page wrapped to look like an installed app.

Inspired by this post - A Simple Progressive Web App Tutorial - I decided to add PWA config to an Oracle JET app and test how it works (on Android; I didn't test on iOS, but there is nothing JET specific here, so if PWA is supported on iOS, it should work).

The Oracle JET PWA sample app is deployed on Heroku (PWA will work only if the app is served through HTTPS) and is available under this URL. The sample app is available in a GitHub repo. A Node.js wrapper for this sample is available in another GitHub repo; you can use it to deploy on Heroku or another service.

Access the JET app URL; the app will be loaded and you should see an Android notification at the bottom. The Google Chrome mobile browser automatically recognizes a PWA by its manifest and offers to "install" it by adding it to the home screen.


Select the notification and you will get a confirmation message.


Select "Add" and the Web app will be added to the home screen. It will look like a real mobile app to the user. For example, the user can get runtime stats for the app and check storage and data usage metrics.


The app is added to the home screen (look for the Oracle JET icon).


Select the app icon and the app will open. There is no URL address bar in the header, and indeed it looks like a mobile app, not a Web page.


The app will be recognized as a PWA if certain config steps are implemented. One of them: you need to add a manifest file (in Oracle JET, add it in the same folder as index.html) and provide app icons, name, etc.:
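A minimal manifest could look like this (a sketch; the app name, colors and icon paths are illustrative, not the sample app's actual values):

    {
      "name": "Oracle JET PWA Sample",
      "short_name": "JET PWA",
      "start_url": "index.html",
      "display": "standalone",
      "background_color": "#ffffff",
      "theme_color": "#0572ce",
      "icons": [
        { "src": "css/images/icon-192.png", "sizes": "192x192", "type": "image/png" },
        { "src": "css/images/icon-512.png", "sizes": "512x512", "type": "image/png" }
      ]
    }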


The manifest file must be included through a reference in the app entry point - the index page (the Oracle JET index.html page, for example):
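The reference is a single link tag in the page head (the file name assumes the manifest sits next to index.html):

    <link rel="manifest" href="manifest.json">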


In addition to the manifest, the app must define a service worker (as with the manifest file, you can create this file in the same directory as the Oracle JET index.html). PWA doesn't only bring the visual experience of a real app to a Web application; you can also define a cache store for the app files. This means that next time, when offline, the app files will load from local cache storage and there will be no need to download them from the internet:
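A minimal service worker sketch (the cache name and file list are illustrative; a real JET app would list its generated js/css files):

    // sw.js - pre-cache the app shell and serve it cache-first
    var CACHE_NAME = 'jet-pwa-cache-v1';
    var urlsToCache = ['index.html', 'css/app.css', 'js/main.js'];

    self.addEventListener('install', function (event) {
      // populate the cache when the service worker is installed
      event.waitUntil(
        caches.open(CACHE_NAME).then(function (cache) {
          return cache.addAll(urlsToCache);
        })
      );
    });

    self.addEventListener('fetch', function (event) {
      // try the cache first, fall back to the network
      event.respondWith(
        caches.match(event.request).then(function (response) {
          return response || fetch(event.request);
        })
      );
    });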


The service worker can be registered from the main.js file, where the Oracle JET context is initialized on initial application load. Add the service worker registration at the bottom of main.js:
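A registration sketch (assuming the service worker file is named sw.js and sits next to index.html):

    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('sw.js').then(function (registration) {
        console.log('Service worker registered, scope:', registration.scope);
      }).catch(function (error) {
        console.log('Service worker registration failed:', error);
      });
    }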


The idea of this post was to share a simple PWA example for Oracle JET. It should help you get started quickly with PWA support config for an Oracle JET app.

Friday, June 7, 2019

Running Oracle JET on Heroku with Node.js (JET Showcase)

I have implemented a JET (more about Oracle JET) showcase app using data visualization components. The app shows historical weather data for the city of Boston; the dataset is taken from Kaggle. Switching years changes the data visualization to show new data - I love how the polar chart is updated. The calendar displays the temperature for each day of the year using the JET picto chart component:


The app is deployed on Heroku and available at this URL. Heroku provides a $7 per month account with analytics and better resources, but there is a free option too (it comes with sleep after 30 minutes of inactivity) - the free option is good for experimentation, as in this case.

Heroku dashboard for the deployed JET app:


The free deployment comes without the analytics option:


The app comes with two options - Dashboard and Histogram. The dashboard allows switching between years and shows a polar chart along with a daily temperature calendar:


The histogram displays the same data in a different view:


This app comes with a Web Component implementation - yes, Web Components are a standard feature in JET. The toolbar where you can switch years is implemented as a Web Component:


The Web Component is used in both UIs - dashboard and histogram:
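Usage could look roughly like this (the tag name is hypothetical, not the actual component from the repo; a JET composite is used like any HTML element):

    <!-- two-way binding into the view model's selected year -->
    <year-toolbar selected-year="{{selectedYear}}"></year-toolbar>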


Visualization components get their data through Knockout.js observable variables:


The variables are initialized in JS functions:
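The general pattern is sketched below (names are illustrative, not the actual app code; getDataForYear stands for whatever logic filters the Kaggle dataset):

    define(['knockout'], function (ko) {
      function DashboardViewModel() {
        var self = this;
        self.selectedYear = ko.observable(2016);
        self.temperatureData = ko.observableArray([]);

        self.loadYear = function (year) {
          // update the observable - charts bound to it re-render automatically
          self.temperatureData(getDataForYear(year)); // hypothetical helper
        };
      }
      return DashboardViewModel;
    });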


Resources:

1. Heroku deployment guide for Node.js
2. The Node.js app which is deployed on Heroku - GitHub. JET content is inside the public folder. The JET content is copied from the JET app web folder after running ojet build --release
3. Oracle JET app - GitHub

Tuesday, June 4, 2019

ADF Faces and Client Side Value with innerHTML

In ADF Faces you can leverage the full power of JavaScript. I will explain how you can assign a value from an ADF Faces component to a plain HTML div.

The sample app is available in a GitHub repo. It doesn't require a DB connection; you can run it straight away in Oracle JDeveloper.

Look into the JSF page. I have implemented an ADF Faces input component with a value change listener. Below this component there is an HTML div with ID ot1. We will assign a text value to this div programmatically from the JS function passClientSideValue:
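The page could look roughly like this (a sketch; the input ID and bean name are my assumptions, ot1 comes from the sample):

    <af:inputText label="Text" id="it1" autoSubmit="true"
                  valueChangeListener="#{helloBean.valueChanged}"/>
    <div id="ot1"></div>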


The JavaScript function reads the value by client ID from the ADF Faces component and assigns it to the innerHTML property of the HTML div:
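A sketch of such a function (passClientSideValue and ot1 come from the sample; the exact client API usage is my assumption):

    function passClientSideValue(inputId) {
      // look up the ADF Faces client component by its absolute client ID
      var input = AdfPage.PAGE.findComponentByAbsoluteId(inputId);
      if (input !== null) {
        document.getElementById('ot1').innerHTML = input.getValue();
      }
    }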


When the ADF Faces value is changed, the value change listener is invoked through the ADF auto-submit event. In the value change listener, we extract the client ID of the input component and pass it to the JS function through a JavaScript call from Java:
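The listener could look like this (a sketch; calling JS from a managed bean via ExtendedRenderKitService is the standard ADF pattern, the method name is my assumption):

    import javax.faces.context.FacesContext;
    import javax.faces.event.ValueChangeEvent;
    import oracle.adf.view.rich.render.ExtendedRenderKitService;
    import org.apache.myfaces.trinidad.util.Service;

    public void valueChanged(ValueChangeEvent valueChangeEvent) {
        FacesContext context = FacesContext.getCurrentInstance();
        String clientId = valueChangeEvent.getComponent().getClientId(context);
        // invoke the JS function on the client, passing the client ID
        ExtendedRenderKitService service =
            Service.getRenderKitService(context, ExtendedRenderKitService.class);
        service.addScript(context, "passClientSideValue('" + clientId + "');");
    }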


This is how the end result looks:


In particular, this approach can be useful when you want to bypass the ADF Faces validation lifecycle and display an updated value despite current validation errors in the form.

Sunday, May 5, 2019

Cat or Dog — Image Classification with Convolutional Neural Network

The goal of this post is to show how a convnet (CNN - Convolutional Neural Network) works. I will be using the classical cat/dog classification example described in François Chollet's book — Deep Learning with Python. Source code for this example is available on François Chollet's GitHub. I’m using this source code to run my experiment.

A convnet works by abstracting image features from detail to higher-level elements. An analogy can be drawn with the way humans think: each of us knows how an airplane looks, but most likely, when thinking about an airplane, we are not thinking about every little bit of its structure. In a similar way, a convnet learns to recognize higher-level elements in an image, and this helps to classify new images when they look similar to the ones used for training.

The image classification model should be trained using this notebook (you will find a description there of where to download the image dataset with cat and dog images). The model is used and the classification prediction is invoked in this notebook. For convenience, I uploaded my own notebooks (based on the code from the Deep Learning with Python book) to GitHub; the network structure is sketched below.
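Roughly, the convnet from the book stacks four convolution/pooling blocks before a dense classifier (this sketch follows chapter 5 of the book; see the notebooks for the exact code):

    from keras import layers, models

    model = models.Sequential()
    model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(150, 150, 3)))
    model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Conv2D(64, (3, 3), activation='relu'))
    model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Conv2D(128, (3, 3), activation='relu'))
    model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Conv2D(128, (3, 3), activation='relu'))
    model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Flatten())
    model.add(layers.Dense(512, activation='relu'))
    model.add(layers.Dense(1, activation='sigmoid'))  # binary output: cat or dog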

Read more in my Towards Data Science article.

Tuesday, April 30, 2019

Run Oracle VBCS Application on Your Own Server

The latest VBCS release brings an option to export a VBCS application and run it on your own server (or a different cloud provider). This is a truly strong step forward for VBCS. Read more about it in Shay Shmeltzer's blog post. If you decide to keep running the VBCS app within VBCS itself, then you get the additional functionality of VBCS Business Services, Oracle Cloud security, etc. out of the box. If you export the VBCS application and run it in your own environment, these features are not included, but then you don't need to pay for the VBCS Cloud runtime when hosting the app. It is great to have alternatives; depending on the customer, either one or the other use case will work.

In one of the use cases, the customer doesn't even need to have its own VBCS instance. We could develop an Oracle JET app in our VBCS instance, export it, and deploy it in the customer's environment. Later we could provide support for version upgrades.

I have exported a sample VBCS app with an external REST service call (REST service) and deployed the app on our own server. You can try it yourself - http://138.68.79.219:7001/vbcsapp/webApps/countries/:


I must say it is simple to export a VBCS app, no hassle at all. Make sure the VBCS app you are exporting is set to anonymous access (this will disable the Oracle Cloud security model). You will need to implement security and secure backend calls yourself:


Next, go to the REST service control and specify the Bypass Proxy option (this will enable direct REST service calls from the VBCS app, bypassing the Oracle Cloud proxy service). Important: to work with the Bypass Proxy option, the REST service must be invoked through HTTPS:


Nothing else is needed on the VBCS side. Next, you need to push the application code to the Oracle Developer Cloud Service Git repository and build an artifact which can be exported. I suggest reading Shay Shmeltzer's blog post about how to proceed with the VBCS and Oracle Developer Cloud Service setup.

In VBCS, do a push to Git for the selected app:


If it is your first time with Oracle Developer Cloud Service, you will need to set up a build job (refer to Shay's post mentioned above). Create a build job configuration and point it to the Git repo:


Provide a set of parameters for the build job:


Add a Unix Shell script to the build job. This script will execute a Node.js NPM command to run the vb-build job, which constructs the artifact that can be exported and deployed in your own environment. It is important to make sure that the property values used in the script match the property values defined in the build job earlier. To execute the npm command, make sure to use an Oracle Developer Cloud Service machine with Node.js support:


Run the job; once it completes, and if there are no errors, go to the job artifacts and download optimized.zip - this is the archive with the VBCS application you can deploy:


Important: when the exported VBCS application is accessed, it loads a bunch of scripts and executes HTTPS requests. There is one request which slows down the initial loading of the VBCS application - the call to _currentuser. The app tries to execute the _currentuser request against the VBCS instance, but if the instance is down, it will wait until a timeout and only then proceed with application loading. To fix that, search for the _currentuser URL in the exported code and change the URL to some dummy value, so that this request fails immediately and does not keep the VBCS application from continuing to load.

Wednesday, April 24, 2019

Build it Yourself — Chatbot API with Keras/TensorFlow Model

It is not as complex as you may think to build your own chatbot (or assistant - this word is a new trendy term for chatbot). Various chatbot platforms use classification models to recognize user intent. While you obviously get a strong head start when building a chatbot on top of an existing platform, it never hurts to study the background concepts and try to build one yourself. Why not use a similar model yourself? The main challenges in chatbot implementation are:
  1. Classify user input to recognize intent (this can be solved with Machine Learning, I’m using Keras with TensorFlow backend)
  2. Keep context. This part is programming and there is nothing much ML related here. I’m using Node.js backend logic to track conversation context (while in context, typically we don’t require a classification for user intents — user input is treated as answers to chatbot questions)
Complete source code for this article, with readme instructions, is available in my GitHub repo (open source).

This is the list of Python libraries used in the implementation. The Keras deep learning library is used to build the classification model. Keras runs training on top of the TensorFlow backend. The Lancaster stemming library is used to collapse distinct word forms:
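For reference, the imports could look like this (a sketch matching the libraries above; the exact code is in the GitHub repo):

    import json
    import pickle
    import random

    import nltk
    from nltk.stem.lancaster import LancasterStemmer
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense, Dropout
    from keras.optimizers import SGD

    # nltk.download('punkt') may be required once for word_tokenize
    stemmer = LancasterStemmer()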


Chatbot intents and patterns to learn are defined in a plain JSON file. There is no need for a huge vocabulary - our goal is to build a chatbot for a specific domain. A classification model can be created for a small vocabulary too; it will be able to recognize the set of patterns provided for training:
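The shape of such a file (the intents here are illustrative; the real ones are in the repo):

    {
      "intents": [
        {
          "tag": "greeting",
          "patterns": ["Hi", "Hello", "Good day"],
          "responses": ["Hello, thanks for visiting"]
        },
        {
          "tag": "goodbye",
          "patterns": ["Bye", "See you later", "Goodbye"],
          "responses": ["See you later, thanks for visiting"]
        }
      ]
    }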


Before we can start classification model training, we need to build the vocabulary first. Patterns are processed to build the vocabulary. Each word is stemmed to produce a generic root; this helps to cover more combinations of user input:
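A vocabulary-building sketch along the lines of the classic contextual chatbot tutorial this approach follows (the intents.json file name is an assumption):

    words = []
    classes = []
    documents = []
    ignore_words = ['?']

    with open('intents.json') as json_data:
        intents = json.load(json_data)

    for intent in intents['intents']:
        for pattern in intent['patterns']:
            # tokenize each pattern and remember which intent it belongs to
            w = nltk.word_tokenize(pattern)
            words.extend(w)
            documents.append((w, intent['tag']))
            if intent['tag'] not in classes:
                classes.append(intent['tag'])

    # stem and lowercase each word, remove duplicates
    words = sorted(set(stemmer.stem(w.lower()) for w in words if w not in ignore_words))
    classes = sorted(set(classes))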


This is the output of the vocabulary creation. There are 9 intents (classes) and 82 vocabulary words.


Training cannot run on a vocabulary of words - words are meaningless to the machine. We need to translate the words into bags of words: arrays containing 0/1. The array length will be equal to the vocabulary size, and 1 will be set when a word from the current pattern is located at the given position:
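A sketch of how the training arrays could be built from the documents collected above:

    training = []
    output_empty = [0] * len(classes)

    for doc in documents:
        # bag of words: 1 if the vocabulary word occurs in this pattern
        pattern_words = [stemmer.stem(word.lower()) for word in doc[0]]
        bag = [1 if w in pattern_words else 0 for w in words]
        # one-hot encoded intent
        output_row = list(output_empty)
        output_row[classes.index(doc[1])] = 1
        training.append((bag, output_row))

    random.shuffle(training)
    train_x = [row[0] for row in training]
    train_y = [row[1] for row in training]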


Training data — X (pattern converted into array [0,1,0,1…,0]) and Y (intent converted into array [1,0,0,0,…,0]; there will be a single 1 in the intents array). The model is built with Keras, based on three layers. According to my experiments, three layers provide good results (but it all depends on the training data). The classification output will be a multiclass array, which helps to identify the encoded intent. We use softmax activation to produce the multiclass classification output (the result returns an array of 0/1: [1,0,0,…,0] — this set identifies the encoded intent):
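The three-layer network could be defined like this (layer sizes are illustrative):

    model = Sequential()
    model.add(Dense(128, input_shape=(len(train_x[0]),), activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(64, activation='relu'))
    model.add(Dropout(0.5))
    # one output neuron per intent, softmax for multiclass output
    model.add(Dense(len(train_y[0]), activation='softmax'))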


Compile the Keras model with the SGD optimizer:
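Something along these lines (the SGD hyperparameters are illustrative):

    sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
    model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])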


Fit the model — execute training and construct the classification model. I’m executing training over 200 iterations, with batch size = 5:
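The corresponding call could be:

    model.fit(np.array(train_x), np.array(train_y), epochs=200, batch_size=5, verbose=1)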


The model is built. Now we can define two helper functions. The function bow helps to translate a user sentence into a bag of words with a 0/1 array:
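A sketch of the helpers (the bow name comes from the post; clean_up_sentence is my naming for the tokenize-and-stem step):

    def clean_up_sentence(sentence):
        # tokenize and stem each word in the sentence
        sentence_words = nltk.word_tokenize(sentence)
        return [stemmer.stem(word.lower()) for word in sentence_words]

    def bow(sentence, words, show_details=False):
        # translate a sentence into a 0/1 array against the vocabulary
        sentence_words = clean_up_sentence(sentence)
        bag = [0] * len(words)
        for s in sentence_words:
            for i, w in enumerate(words):
                if w == s:
                    bag[i] = 1
                    if show_details:
                        print("found in bag: %s" % w)
        return np.array(bag)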


Check this example — translating a sentence into a bag of words:
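For example (a hypothetical sentence, not the one from the original screenshots):

    p = bow("Can you help me, please?", words, show_details=True)
    print(p)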


When the function finds a word from the sentence in the chatbot vocabulary, it sets 1 at the corresponding position in the array. This array will be sent to the model for classification, to identify which intent it belongs to.


It is good practice to save the trained model into a pickle file to be able to reuse it when publishing through the Flask REST API:
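A sketch; here I pickle the vocabulary and save the Keras model in its native format, since pickling a Keras model object directly is unreliable (the file names are assumptions):

    pickle.dump({'words': words, 'classes': classes},
                open('training_data.pkl', 'wb'))
    model.save('chatbot_model.h5')  # Keras native format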


Before publishing the model through the Flask REST API, it is always good to run an extra test. Use the model.predict function to classify user input and, based on the calculated probability, return the intent (multiple intents can be returned):
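A sketch of such a classify function (the 0.25 threshold is illustrative):

    ERROR_THRESHOLD = 0.25

    def classify(sentence):
        # generate probabilities from the model
        results = model.predict(np.array([bow(sentence, words)]))[0]
        # filter out predictions below the threshold
        results = [[i, r] for i, r in enumerate(results) if r > ERROR_THRESHOLD]
        results.sort(key=lambda x: x[1], reverse=True)
        # return (intent, probability) tuples, strongest first
        return [(classes[r[0]], float(r[1])) for r in results]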


An example of classifying a sentence:
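For example (hypothetical input):

    print(classify('Can you help me, please?'))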


The intent is calculated correctly.


To publish the same function through a REST endpoint, we can wrap it into a Flask API:
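A minimal sketch (the endpoint name and payload shape are assumptions):

    from flask import Flask, request, jsonify
    from flask_cors import CORS

    app = Flask(__name__)
    CORS(app)

    @app.route('/api/classify', methods=['POST'])
    def classify_endpoint():
        # hypothetical payload: {"sentence": "..."}
        sentence = request.json['sentence']
        return jsonify(classify(sentence))

    if __name__ == '__main__':
        app.run(host='0.0.0.0', port=5000)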


I have explained how to implement the classification part. In the GitHub repo referenced at the beginning of the post, you will find a complete example of how to maintain context. Context is maintained by logic written in JavaScript and running on a Node.js backend. The context flow must be defined in the list of intents; as soon as the intent is classified and the backend logic finds the start of a context, we enter a loop and ask related questions. How advanced the context handling is depends entirely on the backend implementation (this is beyond Machine Learning scope at this stage).

Chatbot UI:

Monday, April 1, 2019

Publishing Machine Learning API with Python Flask

Flask is fun and easy to set up, as it says on the Flask website. And that's true. This microframework for Python offers a powerful way of annotating Python functions with REST endpoints. I’m using Flask to publish an ML model API that is accessible to 3rd party business applications.

This example is based on XGBoost.

For better code maintenance, I would recommend using a separate Jupyter notebook where the ML model API will be published. Import the Flask module along with Flask CORS:
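For example (a sketch; pandas and pickle are imported here as well since they are used below):

    from flask import Flask, request, jsonify
    from flask_cors import CORS
    import pandas as pd
    import pickle

    app = Flask(__name__)
    CORS(app)  # allow calls from other hosts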


The model is trained on the Pima Indians Diabetes Database. The CSV data can be downloaded from here. To construct a Pandas data frame variable as input for the model predict function, we need to define an array of dataset columns:
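Something like this (the column names are illustrative; the order must match the order used during training):

    headers = ['pregnancies', 'glucose', 'blood_pressure', 'skin_thickness',
               'insulin', 'bmi', 'diabetes_pedigree', 'age']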


The previously trained and saved model is loaded using Pickle:
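A sketch (the file name is hypothetical):

    with open('diabetes_xgb_model.pkl', 'rb') as f:
        model = pickle.load(f)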


It is always good practice to do a test run and check if the model performs well. Construct a data frame with the array of column names and an array of data (using new data, not present in the train or test datasets). Call the two functions — model.predict and model.predict_proba. Often I prefer model.predict_proba; it returns the probability which describes how likely 0/1 will be, and this helps to interpret the result based on a certain range (0.25 to 0.75, for example). A Pandas data frame is constructed with the sample payload, and then model prediction is executed:
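For example (the sample values are illustrative, not from the original post):

    sample = pd.DataFrame([[1, 125, 70, 24, 80, 30.5, 0.45, 32]], columns=headers)
    print(model.predict(sample))        # hard 0/1 class
    print(model.predict_proba(sample))  # probability per class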


The Flask API. Make sure you enable CORS, otherwise the API call will not work from another host. Write the annotation before the function you want to expose through the REST API. Provide an endpoint name and the supported REST methods (POST in this example). Payload data is retrieved from the request, a Pandas data frame is constructed, and the model predict_proba function is executed:
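A sketch of the endpoint (the route name and payload shape are assumptions):

    @app.route('/api/predict', methods=['POST'])
    def predict():
        # hypothetical payload: {"data": [1, 125, 70, 24, 80, 30.5, 0.45, 32]}
        payload = request.json
        input_df = pd.DataFrame([payload['data']], columns=headers)
        probability = model.predict_proba(input_df)[0][1]
        return jsonify({'probability': float(probability)})

    if __name__ == '__main__':
        # 0.0.0.0 so the Docker container accepts calls from outside, on port 5000
        app.run(host='0.0.0.0', port=5000)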


A response JSON string is constructed and returned as the function result. I’m running Flask in a Docker container; that's why 0.0.0.0 is used as the host it runs on. Port 5000 is mapped as an external port, and this allows calls from the outside.

While it works to start the Flask interface directly in a Jupyter notebook, I would recommend converting it to a Python script and running it from the command line as a service. Use the Jupyter nbconvert command to convert it to a Python script:

jupyter nbconvert --to python diabetes_redsamurai_endpoint_db.ipynb

A Python script with a Flask endpoint can be started as a background process with the PM2 process manager. This allows running the endpoint as a service and starting other processes on different ports. The PM2 start command:

pm2 start diabetes_redsamurai_endpoint_db.py


pm2 monit helps to display info about running processes:


ML model classification REST API call from Postman through endpoint served by Flask:


More info:

- GitHub repo with source code
- Previous post about XGBoost model training