Concepts for quick start
Pipeline
A pipeline is a definition of a workflow that composes one or more components together to form a computational directed acyclic graph (DAG).
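As a rough sketch of how such a DAG is expressed with the KFP v2 SDK (the component and pipeline names below are hypothetical):

```python
from kfp import dsl

# Two minimal components (hypothetical), defined inline so the sketch
# is self-contained.
@dsl.component
def say_hello(name: str) -> str:
    return f"Hello, {name}!"

@dsl.component
def shout(text: str) -> str:
    return text.upper()

# The pipeline composes the components into a DAG: shout depends on
# say_hello because it consumes say_hello's output.
@dsl.pipeline(name="hello-pipeline")
def hello_pipeline(name: str = "kubeflow"):
    hello_task = say_hello(name=name)
    shout(text=hello_task.output)
```

Each component call inside the pipeline function becomes a node of the DAG; passing one task's output into another creates an edge.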
Component
A component is a core building block of Kubeflow Pipelines. It consists of (see the sketch after this list):
- code (the function body that runs in the step).
- dependency support (e.g. packages to install at runtime).
- a base image (the container image the code runs in).
- inputs and outputs.
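A minimal sketch of how those pieces map onto the @dsl.component decorator in the KFP v2 SDK; the function name, package pin, and image tag are illustrative assumptions:

```python
from kfp import dsl

@dsl.component(
    base_image="python:3.11",               # base image
    packages_to_install=["pandas==2.2.0"],  # dependency support
)
def count_rows(csv_text: str) -> int:       # input and output
    # code: this body runs in a container built on the base image
    # with the listed packages installed.
    from io import StringIO
    import pandas as pd
    return len(pd.read_csv(StringIO(csv_text)))
```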
Input, output, artifact, etc.
Inputs and outputs can be defined in the signature of the component function. Besides plain parameters, artifact types such as Model and Metrics can also be used. A list of artifact types can be found in the KFP documentation.
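For instance, a component can mix plain parameters with artifact inputs and outputs; the component name and body below are hypothetical:

```python
from kfp import dsl
from kfp.dsl import Dataset, Input, Metrics, Model, Output

@dsl.component(base_image="python:3.11")
def train(
    train_data: Input[Dataset],   # artifact input
    model: Output[Model],         # artifact output
    metrics: Output[Metrics],     # metrics artifact output
    learning_rate: float = 0.01,  # plain parameter input
):
    # Hypothetical body: write a placeholder "model" and log a metric.
    with open(model.path, "w") as f:
        f.write(f"trained with lr={learning_rate}")
    metrics.log_metric("accuracy", 0.9)
```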
Example
Here is an example you can check.
Basically, the components read from and write to a bucket. The bucket (object storage such as MinIO, S3, or GCS) is an important piece of a Kubeflow Pipelines deployment, since it is where artifacts are persisted between steps.
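A small sketch of that read-from-bucket / write-to-bucket pattern: inside a component, artifact .path attributes are local file paths that KFP stages from and to the deployment's artifact bucket (the component name below is hypothetical):

```python
from kfp import dsl
from kfp.dsl import Dataset, Input, Output

@dsl.component(base_image="python:3.11")
def copy_dataset(src: Input[Dataset], dst: Output[Dataset]):
    # src.path is staged from the artifact bucket before the code runs;
    # whatever is written to dst.path is uploaded back to the bucket after.
    with open(src.path) as f_in, open(dst.path, "w") as f_out:
        f_out.write(f_in.read())
```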
Misc
How to compile a pipeline file for uploading to the UI?
The main snippet doing the job is:
```python
from kfp import dsl, compiler

...

compiler.Compiler().compile(
    pipeline_func=classification_pipeline,
    package_path="iris_pipeline.yaml",
)
```

Ways of triggering a pipeline.
There are a lot of creative ways, including but not limited to:
a. a one-off trigger in the UI or using the kfp client (Python SDK) (see the sketch below)
b. a recurring trigger in the UI
c. thinking outside the box, a "trigger/hook" outside Kubeflow also provides event-driven triggering; e.g. a GitHub Action can trigger a run in its workflow using the kfp client
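As a sketch of option a, a one-off run can be submitted with the kfp client; the host URL, experiment name, and empty arguments are assumptions for illustration:

```python
from kfp import Client

# Assumed KFP API endpoint; adjust for your deployment and auth setup.
client = Client(host="http://localhost:8080")

# Submit the compiled package (e.g. the iris_pipeline.yaml from above)
# as a one-off run.
run = client.create_run_from_pipeline_package(
    "iris_pipeline.yaml",
    arguments={},                # pipeline parameters, if any
    experiment_name="default",   # hypothetical experiment name
)
print(run.run_id)
```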