Kubeflow Pipelines on Tekton reaches 1.0, Watson Studio Pipelines now available in open beta – IBM Developer


Our last blog post announcing Kubeflow Pipelines on Tekton discussed how Kubeflow Pipelines became a primary vehicle to address the needs of both DevOps engineers and data scientists. As a reminder, Kubeflow Pipelines on Tekton is a project in the MLOps ecosystem, and offers the following benefits:

  • For DevOps folks, Kubeflow Pipelines taps into the Kubernetes ecosystem, leveraging its scalability and containerization principles.
  • For data scientists and MLOps practitioners, Kubeflow Pipelines offers a Python interface to define and deploy pipelines, enabling metadata collection and lineage tracking.
  • For DataOps folks, Kubeflow Pipelines brings in ETL bindings, providing support for multiple ETL components and use cases and enabling closer collaboration with peers.

The pipelines team has been busy over the past few months developing enhancements for Kubeflow Pipelines on Tekton to address more MLOps and DataOps needs, and creating a stable, production-ready deliverable. As part of this, we're excited to announce that the project has reached the 1.0 milestone. Additionally, IBM's offering built on top of this open source project, Watson Studio Pipelines, is now available in open beta!

Kubeflow Pipelines on Tekton 1.0 release

We're excited to announce the 1.0 release of the Kubeflow Pipelines on Tekton (KFP-Tekton) project. Many features, such as graph recursion, conditional loops, caching, the AnySequencer, and dynamic parameter support, were added to the project in the process of reaching this milestone. These new features weren't supported natively in the Tekton project, but they're essential for running real-world machine learning workflows with Kubeflow Pipelines.

This blog highlights some of the new functionality introduced in this version, particularly features that address data flows.

These enhancements include:

Pipeline loops

The current Tekton design doesn't allow any loop or sub-pipeline inside the pipeline definition. Recently, Tekton introduced the concept of Tekton custom tasks, which let users define their own workload definitions by building their own controller reconcile methods. This opened the door for us to support Kubeflow Pipelines loops and recursion, which weren't possible before on Tekton. We're contributing these enhancements back to the Tekton community.

The ParallelFor loop in Kubeflow Pipelines is a loop that runs tasks over a set of parameters in parallel. For Tekton, the KFP-Tekton team built a Tekton custom task controller that reconciles multiple Tekton sub-pipelines in parallel over a set of parameters (both static and dynamic), and supports a parallelism setting to control the number of sub-pipelines running at once.

This is a huge step forward for what we can achieve on Tekton, and it allows Tekton to handle much more complex pipelines.

The diagram below describes the flows for three different types of parallel loops.

  • Typical loops traverse a list of tasks over one argument.
  • Multi-args loops are similar to typical loops, but take multiple arguments.
  • Condition loops can break or continue based on a certain condition.

pipeline loop image


Recursion

Recursion allows the same code block to execute and exit based on dynamic conditions. Current Tekton features don't allow recursion.

However, with the new Tekton custom task controller that the KFP-Tekton team built for loops and sub-pipelines, we can now run sub-pipelines with conditions that refer back to themselves to create recursion, and this can be extended to cover nested parallel loops within recursion. This demonstrates how the KFP-Tekton team is driving some of the cutting-edge features for Tekton and contributing them back to the Tekton community.

The following diagram shows that the recursive function is defined as a sub-pipeline and can refer back to itself to create recursion.

Recursion flow

Pluggable Tekton custom tasks

The KFP-Tekton team also worked on a new way for users to plug their own Tekton custom tasks into a Kubeflow pipeline. For example, a user might want to evaluate an expression without creating a new worker pod. In that case, the user can plug in the Common Expression Language (CEL) custom task from Tekton to evaluate the expression inside a shared controller, without spinning up a new worker pod.

Pluggable Tekton custom tasks in Kubeflow Pipelines give more flexibility to users who want to optimize their pipelines further and compose tasks that aren't currently possible with the default Tekton task API. The KFP-Tekton team also contributes to Tekton to make the custom task API more feature complete, such as supporting timeouts, retries, and inlined custom task specs.

The image below shows how the regular tasks A and D run inside new dedicated pods, while the custom tasks B and C run inside a shared controller to save pod provisioning time and cluster resources.

image showing Tekton tasks completion


AnySequencer

AnySequencer is a dependent task that starts when any one of its task or condition dependencies completes successfully. The benefit of AnySequencer over a logical OR condition is that with AnySequencer, the order in which the dependencies execute doesn't matter. The pipeline doesn't wait for all of the task dependencies to complete before moving on to the next step. You can apply conditions to verify that the task dependencies complete as expected.

The following image shows how the AnySequencer can start a new task while an original task is still waiting on a dependency.

AnySequencer image
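The semantics can be sketched in plain Python, independent of Tekton (this is an analogy for illustration, not the controller's code): the dependent step unblocks as soon as the first of its upstream dependencies succeeds.

```python
import time
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait


def upstream_task(name, seconds):
    # Stand-in for a pipeline task that takes `seconds` to succeed.
    time.sleep(seconds)
    return name


with ThreadPoolExecutor() as pool:
    dependencies = {
        pool.submit(upstream_task, 'fast-dep', 0.1),
        pool.submit(upstream_task, 'slow-dep', 1.0),
    }
    # AnySequencer semantics: unblock as soon as ANY dependency succeeds,
    # regardless of submission order or the other tasks' progress.
    done, pending = wait(dependencies, return_when=FIRST_COMPLETED)
    trigger = next(iter(done)).result()
    print('dependent task starts after:', trigger)
```

Here the dependent step starts after `fast-dep` finishes, while `slow-dep` is still running; a logical OR evaluated only after all dependencies finish would have waited the full second.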


Caching

Kubeflow Pipelines caching provides task-level output caching. Unlike Argo, Tekton by design doesn't generate the task template in the annotations to perform caching. To support caching on Tekton, we enhanced the Kubeflow Pipelines cache server to auto-generate the task template for Tekton as the hash code, which caches all identical workloads with the same inputs.

By default, compiling a pipeline adds metadata annotations and labels so that results from tasks within a pipeline run can be reused if the same task recurs in a new pipeline run. This saves the pipeline run from re-executing a task whose results are already known.

The following diagram shows the caching mechanism for Kubeflow Pipelines on Tekton (KFP-Tekton). All task executions and results are stored as hash codes in the database to identify cached tasks.

caching flow
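The hashing idea can be sketched in a few lines (a simplification for illustration, not the cache server's actual code): a deterministic hash over the task template means identical workloads with identical inputs map to the same cache key.

```python
import hashlib
import json


def cache_key(task_template):
    # Serialize deterministically so identical workloads with identical
    # inputs always hash to the same key.
    canonical = json.dumps(task_template, sort_keys=True)
    return hashlib.sha256(canonical.encode('utf-8')).hexdigest()


task = {'image': 'alpine:3.14', 'command': ['echo', 'hi'], 'inputs': {'msg': 'hi'}}
reordered = {'inputs': {'msg': 'hi'}, 'image': 'alpine:3.14', 'command': ['echo', 'hi']}
changed = {'image': 'alpine:3.14', 'command': ['echo', 'hi'], 'inputs': {'msg': 'bye'}}

print(cache_key(task) == cache_key(reordered))  # True: cache hit
print(cache_key(task) == cache_key(changed))    # False: cache miss
```

On a cache hit, the stored result is returned instead of re-running the task in a new pod.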

Watson Studio Pipelines now available in open beta!

We're excited to announce that Watson Studio Pipelines is now available in open beta! This new Watson Studio offering lets users create repeatable and scheduled flows that automate notebook, data refinery, and machine learning pipelines: from data ingestion to model training, testing, and deployment. With an intuitive user interface, Watson Studio Pipelines exposes all of the state-of-the-art data science tools available in Watson Studio and lets users combine them into automation flows, creating continuous integration / continuous development pipelines for AI.

Watson Studio Pipelines is built on Kubeflow Pipelines on the Tekton runtime and is fully integrated into the Watson Studio platform, letting users combine tools including:

  • Notebooks
  • Data Refinery flows
  • AutoAI experiments
  • Web service / online deployments
  • Batch deployments
  • Import and export of project and space assets

New features, driven by DataOps scenarios and leveraging the new Tekton extensions, are coming soon.

The following example showcases how to import datasets into Watson Studio using a DataStage flow, create and run AutoAI experiments with hyperparameter optimization, and serve the best tuned model as a web service. It sends a notification in case of a failure, and finally executes a custom user script.


To experience this AI lifecycle automation for yourself, please go to the Watson Studio Pipelines beta page.

Join us to build cloud-native data and AI pipelines with Kubeflow Pipelines and Tekton

Please join us on the Kubeflow Pipelines with Tekton GitHub repository, try it out, give feedback, and raise issues. Additionally, you can connect with us via the following:

  • To contribute and build an enterprise-grade, end-to-end machine learning platform on OpenShift and Kubernetes, please join the Kubeflow community and reach out with any questions, comments, and feedback!
  • To get access to Watson AI Pipelines, sign up for the beta access list.
  • If you would like help deploying and managing Kubeflow on your on-premises Kubernetes platform, OpenShift, or IBM Cloud, please connect with us.
  • To run notebook-based pipelines using a drag-and-drop canvas, please check out the Elyra project, which provides AI-centric extensions to JupyterLab.
  • Check out OpenDataHub if you are interested in open source projects in the data and AI portfolio, namely Kubeflow, Kafka, Hive, Hue, and Spark, and how to bring them together in a cloud-native way.


This blog post introduced you to some of the new enhancements that we've been working on to make Kubeflow Pipelines on Tekton more extensible for users. We hope you'll find the new functionality helpful in addressing your DataOps needs.

Thanks to our contributors

Thanks to the many contributors to Kubeflow Pipelines with Tekton for contributing to the various aspects of the project, both internally and externally. A few I want to specifically call out include:

  • Adam Massachi
  • Christian Kadner
  • Jun Feng Liu
  • Yi-Hong Wang
  • Prashant Sharma
  • Feng Li
  • Andrew Butler
  • Jin Chi He
  • Michalina Kotwica
  • Andrea Fritolli
  • Priti Desai
  • Gang Pu
  • Peng Li
  • Błażej Rutkowski

Additionally, thanks to the OpenShift Pipelines and Tekton teams from Red Hat, and the Elyra team, for their feedback. Last but not least, thanks to the Kubeflow Pipelines team from Google for helping out and providing support.

