# Kubeflow Pipelines - GitOps Repository

This repository contains ML pipeline definitions managed via ArgoCD.
## Structure

```
.
├── pipelines/     # Pipeline Python definitions
│   └── examples/  # Example pipelines
├── components/    # Reusable pipeline components
├── experiments/   # Experiment configurations
├── runs/          # Scheduled/triggered runs
└── manifests/     # K8s manifests for ArgoCD
```
## Usage

- Add a pipeline: create a Python file in `pipelines/`
- Push to `main`: ArgoCD auto-deploys
- Monitor: check the Kubeflow UI at <KUBEFLOW_URL>
## Quick Start

```python
from kfp import dsl


@dsl.component
def hello_world() -> str:
    return "Hello from Kubeflow!"


@dsl.pipeline(name="hello-pipeline")
def hello_pipeline():
    hello_world()
```
## Environment

- Kubeflow: <KUBEFLOW_URL>
- MinIO: <MINIO_URL>
- ArgoCD: <ARGOCD_URL>