Monday, October 25, 2021

MLOps: Extend Skipper ML Services

The goal of this video is to explain Skipper from an MLOps user's perspective: the different building blocks of Skipper and how they fit together. I show how a sample set of ML services works and how you could replace it or add your own service. The Skipper engine is implemented in Python, but you can add a service container implemented in any language. Everything runs on Kubernetes.
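To make the "add your own service" idea concrete, here is a minimal sketch of the pattern: services listen on a broker queue and dispatch messages to registered handlers. The handler registration scheme, the "predict" task name, and the message format are all hypothetical illustrations, and the broker is simulated with an in-process queue.Queue so the sketch stays self-contained; real Skipper services would talk to an actual message broker.

```python
import json
import queue

requests = queue.Queue()   # stands in for the broker queue a service listens on
HANDLERS = {}              # task name -> callable (hypothetical registration scheme)

def service(name):
    """Register a function as a service handler for a given task name."""
    def wrap(fn):
        HANDLERS[name] = fn
        return fn
    return wrap

@service("predict")
def predict(payload):
    # a real service would invoke its ML model here
    return {"result": sum(payload["features"])}

def dispatch_one():
    """Take one message off the queue and route it to the matching handler."""
    msg = json.loads(requests.get())
    return HANDLERS[msg["task"]](msg["payload"])

requests.put(json.dumps({"task": "predict", "payload": {"features": [1, 2, 3]}}))
print(dispatch_one())  # {'result': 6}
```

Because dispatch is keyed only by the task name in the message, a replacement service in another language just has to consume the same queue and speak the same message format.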


Monday, October 18, 2021

Running Kubernetes on Oracle Cloud OCI

Oracle Cloud OCI provides a good environment to run your Kubernetes workloads. In this video, I show how to access a Kubernetes cluster in OCI and explain the artifacts related to the cluster. I show how the Skipper API runs on Kubernetes deployed on OCI. The cluster runtime is accessed through Cloud Shell.


Monday, October 11, 2021

MLOps: Scaling TensorFlow Model on Kubernetes

An ML model serving/prediction API can be scaled on Kubernetes by adding or removing Pod instances. In a live demo, I show and explain how scaling can be done for a TensorFlow model running on Kubernetes.
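Manual scaling is a one-liner such as `kubectl scale deployment tf-serving --replicas=3` (where `tf-serving` is a hypothetical deployment name). When scaling is automated, the Kubernetes Horizontal Pod Autoscaler chooses the replica count from the observed metric with the rule desired = ceil(current * metric / target), sketched below as plain Python:

```python
import math

def desired_replicas(current_replicas: int, current_metric: float, target_metric: float) -> int:
    """Kubernetes HPA scaling rule: desired = ceil(current * metric / target)."""
    return math.ceil(current_replicas * (current_metric / target_metric))

# 4 serving Pods averaging 90% CPU against a 60% target -> scale out to 6
print(desired_replicas(4, 90, 60))  # 6

# 6 Pods averaging 30% CPU against a 60% target -> scale in to 3
print(desired_replicas(6, 30, 60))  # 3
```

The same rule works in both directions: a metric above the target grows the Deployment, a metric below it shrinks the Deployment back down.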

Sunday, October 3, 2021

MLOps: Sharing Model Across Services

Typically you would want to scale ML model training and ML inference/prediction services separately. This means the services should not share file storage, at least in a Kubernetes cluster environment. I explain how you can transfer files across services using RabbitMQ.
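The core of the idea can be sketched in a few lines: serialize the file into a message body on one side and write it back to disk (or memory) on the other. Real code would publish through pika to a RabbitMQ broker; here the broker is simulated with an in-process queue.Queue so the sketch is self-contained, and the file name and payload are made up for illustration.

```python
import base64
import json
import queue

broker = queue.Queue()  # stands in for a RabbitMQ queue; real code would use pika

def publish_file(name: str, payload: bytes) -> None:
    """Send a file as a message: base64 keeps the binary payload JSON-safe."""
    body = {"name": name, "data": base64.b64encode(payload).decode("ascii")}
    broker.put(json.dumps(body))

def consume_file() -> tuple[str, bytes]:
    """Receive one file message and decode it back to raw bytes."""
    msg = json.loads(broker.get())
    return msg["name"], base64.b64decode(msg["data"])

publish_file("model.h5", b"\x00weights\x01")
name, data = consume_file()
print(name)  # model.h5
```

Because the model travels inside the message rather than over a shared volume, the training and prediction services can be scheduled and scaled independently.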