
Kubeflow Pipelines - GitOps Repository

This repository contains ML pipeline definitions managed via ArgoCD.

Structure

.
├── pipelines/           # Pipeline Python definitions
│   └── examples/        # Example pipelines
├── components/          # Reusable pipeline components
├── experiments/         # Experiment configurations
├── runs/                # Scheduled/triggered runs
└── manifests/           # K8s manifests for ArgoCD

Usage

  1. Add a pipeline: Create a Python definition in pipelines/
  2. Push to main: ArgoCD detects the commit and deploys automatically
  3. Monitor: Check run status in the Kubeflow UI at <KUBEFLOW_URL>

Quick Start

from kfp import dsl

# A lightweight component: runs in its own container and returns a string.
@dsl.component
def hello_world() -> str:
    return "Hello from Kubeflow!"

# A single-step pipeline that wires the component into a DAG.
@dsl.pipeline(name="hello-pipeline")
def hello_pipeline():
    hello_world()

Environment

  • Kubeflow: <KUBEFLOW_URL>
  • MinIO: <MINIO_URL>
  • ArgoCD: <ARGOCD_URL>