While data may be at the heart of every Machine Learning (ML) application, the variety of ML tools, frameworks, and libraries each requires its own configuration, dependencies, and skill set. Managing these components across the ML pipeline is as time-consuming and complex as data collection and preparation itself, and is a particular challenge for small-to-medium enterprises trying to build new ML applications. Containerisation has become a popular way of packaging an ML application, together with all its necessary libraries, frameworks, and dependencies, into a reusable container - minimising the pain points associated with setting up new computing instances each time an ML application needs to be executed. In this manner, multiple microservice containers can be orchestrated to execute on demand, each providing a small slice of functionality that links to the next.
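As a minimal illustration of the packaging step described above, a container image definition can pin an ML application's code and dependencies together. This is a generic sketch, not the platform's own build recipe; the base image, file names, and script are all hypothetical:

```dockerfile
# Hypothetical example: package an ML training script with pinned dependencies
FROM python:3.11-slim

WORKDIR /app

# Pin the ML libraries the application needs so the image is reproducible
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself
COPY train.py .

# The same image now runs identically on any host with a container runtime
ENTRYPOINT ["python", "train.py"]
```

Once built, the image can be pushed to a registry and pulled on demand, removing the need to reinstall dependencies on each new computing instance.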
This technology offer is a unified container orchestration platform that orchestrates the end-to-end Machine Learning pipeline, from data preparation to model deployment. It aligns with the iterative nature of developing AI applications and simplifies the provisioning of the computing infrastructure needed to support model development and deployment. The technology is purpose-built for small-to-medium businesses with limited technical manpower and resources, as it allows such organisations to collaborate internally (through reusable, replayable development workspaces) on on-premises computing infrastructure or on public cloud platforms.
The platform comprises a suite of microservices using container-based computing orchestrated through Kubernetes. The microservices abstract the complexity of ML pipeline setup and automate the provisioning of infrastructure, computing resources, machine learning development tools, and other underlying dependencies required by the different stages of the AI development pipeline. Key features include:
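To make the orchestration concrete, one pipeline stage might be declared to Kubernetes as a batch Job, letting the cluster schedule the container and allocate its resources. This is an illustrative sketch only; the names, image reference, and resource figures are assumptions, not the platform's actual manifests:

```yaml
# Hypothetical Kubernetes Job for one pipeline stage (data preparation);
# all names and the image reference are illustrative
apiVersion: batch/v1
kind: Job
metadata:
  name: data-prep-stage
spec:
  template:
    spec:
      containers:
      - name: data-prep
        image: registry.example.com/pipeline/data-prep:1.0
        resources:
          requests:
            cpu: "2"
            memory: 4Gi
      restartPolicy: Never
```

Declaring each stage this way is what lets the platform replay a workflow on-premises or on a public cloud: the same manifests describe the work, and Kubernetes handles placement and resource provisioning.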
This technology serves as a development and deployment platform that enables SMEs to rapidly develop and deploy AI applications. Alternatively, it can be used as a training tool to help software engineers transition into developing AI applications as AI engineers.
As an integrated AI development and deployment platform, it offers the following benefits: