Can I use a RuntimeBatchRequest that I defined with a Spark DataFrame together with Checkpoints, or is this currently unsupported?
The configuration I have works fine with Validation Operators, but I couldn't figure out from the documentation how (or whether) this can be done with the newer Checkpoints.
I followed this guide for V3 to define the batch request:
runtime_batch_request = RuntimeBatchRequest(
    # datasource_name, data_connector_name, data_asset_name and
    # runtime_parameters (with the Spark DataFrame) omitted here
    batch_identifiers={
        "some_key_maybe_pipeline_stage": "ingestion step 1",
        "some_other_key_maybe_airflow_run_id": "run 18",
    },
)
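For context, the Checkpoint config I've been experimenting with looks roughly like this (a sketch; the checkpoint name, run_name_template, and suite name are placeholders I made up):

name: my_spark_checkpoint
config_version: 1.0
class_name: Checkpoint
run_name_template: "%Y%m%d-%H%M%S-spark-ingestion"
validations: []  # intending to supply the runtime batch request at run time

My assumption was that I could then call something like `context.run_checkpoint(checkpoint_name="my_spark_checkpoint", validations=[{"batch_request": runtime_batch_request, "expectation_suite_name": "my_suite"}])`, passing the runtime batch request (and with it the in-memory Spark DataFrame) at run time rather than baking it into the config, but I'm not sure if that's the intended pattern.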