Empowering the ML Workflow with DevOps within a Microservice Architecture, Deploying to a Hybrid Multi-Cloud, and Maintaining a CI/CD Pipeline: An OpenShift Orchestration of MLOps

Vedant Bhatt, Harvinder Singh Diwan, Satish Kumar Alaria, Yashika Saini


Machine Learning is one of the most widely used cutting-edge technologies of the present era. The heart of any Machine Learning application is the ML model: a file produced by training over a dataset so as to recognize various kinds of patterns. Training the model involves a sequence of steps known as the ML workflow. This workflow requires manual interaction at several points, one of which is hyper-parameter tuning. Such manual interaction introduces considerable delay into the production of ML-based applications in industry, which is not tolerable. The issue is similar to the software development bottleneck that was resolved by adopting the DevOps workflow. DevOps automates processes as far as possible through automation tools, reducing human interaction at every possible point. By integrating DevOps with the ML workflow, we can automate the ML pipeline for creating models, and this automation can further be extended through deployment and monitoring. In our research, we performed an in-depth anatomy of the ML workflow, inspected every point where automation could be introduced, and successfully minimized human interaction. Ultimately, we built an MLOps-based application integrated end-to-end with DevOps: as a demonstration, we created a heart disease predictor model, deployed it in a hybrid multi-cloud, and orchestrated it using Red Hat OpenShift.
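The kind of manual step the abstract singles out, hyper-parameter tuning, can be sketched as a small automated grid search that a CI/CD pipeline could run unattended. This is a minimal illustrative sketch only; the function and parameter names are assumptions, not the authors' actual pipeline code, and the scoring function stands in for training and validating a real heart-disease model.

```python
# Minimal sketch: automating hyper-parameter tuning so no human has to
# pick the values by hand. All names here are illustrative assumptions.
import itertools

def validation_score(learning_rate, depth):
    # Stand-in for "train a model with these hyper-parameters and score
    # it on a validation set"; a real pipeline would fit a classifier here.
    return 1.0 - abs(learning_rate - 0.1) - 0.01 * abs(depth - 4)

# The search space a CI job would iterate over automatically.
grid = {
    "learning_rate": [0.01, 0.1, 0.5],
    "depth": [2, 4, 8],
}

best_params, best_score = None, float("-inf")
for values in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    score = validation_score(**params)
    if score > best_score:
        best_params, best_score = params, score

print(best_params)  # winning combination, ready for automated retraining
```

In a pipeline such as the one the abstract describes, a job like this would run on every data or code change, with the selected parameters feeding the build, deployment, and monitoring stages without manual intervention.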
