Continue If
Class: ContinueIfBlockV1
Source: inference.core.workflows.core_steps.flow_control.continue_if.v1.ContinueIfBlockV1
Based on the provided configuration, the block decides whether execution should continue along the pointed execution path.
Type identifier
Use the following identifier in the step's "type" field to add the block as a step in your workflow: roboflow_core/continue_if@v1
Properties
Name | Type | Description | Refs |
---|---|---|---|
name | str | Enter a unique identifier for this step. | ❌ |
condition_statement | StatementGroup | Define the conditional logic. | ❌ |
The Refs column marks whether the property can be parametrised with dynamic values available at workflow runtime. See Bindings for more info.
Available Connections
Compatible Blocks
Check which blocks you can connect to Continue If in version v1.
- inputs:
Google Gemini
,Keypoint Visualization
,Detections Stabilizer
,Path Deviation
,Gaze Detection
,Reference Path Visualization
,Stitch Images
,Image Blur
,Florence-2 Model
,Barcode Detection
,Local File Sink
,Relative Static Crop
,Clip Comparison
,Cosine Similarity
,Icon Visualization
,Time in Zone
,Dimension Collapse
,Polygon Zone Visualization
,Identify Outliers
,Instance Segmentation Model
,Dynamic Zone
,Dynamic Crop
,Grid Visualization
,Property Definition
,Detections Consensus
,VLM as Classifier
,Single-Label Classification Model
,Camera Calibration
,Perception Encoder Embedding Model
,VLM as Classifier
,Pixel Color Count
,QR Code Generator
,SIFT
,Camera Focus
,Detections Filter
,Llama 3.2 Vision
,Line Counter Visualization
,Triangle Visualization
,Multi-Label Classification Model
,Email Notification
,Roboflow Dataset Upload
,Time in Zone
,Image Slicer
,Byte Tracker
,Single-Label Classification Model
,OCR Model
,Pixelate Visualization
,Byte Tracker
,Object Detection Model
,Dot Visualization
,Image Slicer
,Roboflow Dataset Upload
,OpenAI
,Model Monitoring Inference Aggregator
,VLM as Detector
,Buffer
,Stability AI Outpainting
,Trace Visualization
,Multi-Label Classification Model
,Overlap Filter
,CogVLM
,Corner Visualization
,First Non Empty Or Default
,Background Color Visualization
,Halo Visualization
,Ellipse Visualization
,OpenAI
,Anthropic Claude
,Keypoint Detection Model
,Image Contours
,Circle Visualization
,Image Threshold
,Absolute Static Crop
,Perspective Correction
,Color Visualization
,QR Code Detection
,Instance Segmentation Model
,Blur Visualization
,PTZ Tracking (ONVIF)
,Keypoint Detection Model
,Environment Secrets Store
,Stability AI Inpainting
,Cache Get
,SIFT Comparison
,Detections Merge
,Roboflow Custom Metadata
,SmolVLM2
,Depth Estimation
,Template Matching
,Stability AI Image Generation
,Crop Visualization
,Stitch OCR Detections
,Time in Zone
,Rate Limiter
,Continue If
,Segment Anything 2 Model
,Expression
,Dominant Color
,SIFT Comparison
,Data Aggregator
,Bounding Rectangle
,Detection Offset
,Size Measurement
,Model Comparison Visualization
,CLIP Embedding Model
,Object Detection Model
,Twilio SMS Notification
,Clip Comparison
,LMM
,CSV Formatter
,Path Deviation
,Detections Transformation
,Mask Visualization
,Byte Tracker
,Qwen2.5-VL
,JSON Parser
,Webhook Sink
,Cache Set
,Velocity
,Slack Notification
,Detections Classes Replacement
,Delta Filter
,YOLO-World Model
,Classification Label Visualization
,Polygon Visualization
,OpenAI
,LMM For Classification
,Line Counter
,Moondream2
,Bounding Box Visualization
,Distance Measurement
,Image Preprocessing
,Image Convert Grayscale
,Google Vision OCR
,Label Visualization
,Line Counter
,Detections Stitch
,Florence-2 Model
,Identify Changes
,VLM as Detector
- outputs: None
Input and Output Bindings
The available connections depend on the block's binding kinds. Check which binding kinds Continue If in version v1 has.
Bindings
- input
  - evaluation_parameters (*): Data to be used in the conditional logic.
  - next_steps (step): Steps to execute if the condition evaluates to true.
- output: None
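Because evaluation_parameters accepts any data kind (*), each bound value can come from a workflow input or from an upstream step's output, and its key must match the operand_name of a DynamicOperand inside condition_statement, while next_steps lists the steps gated by the condition. A minimal sketch of that pairing, assuming a hypothetical upstream step named some_previous_step whose output field is called output (neither is defined on this page):

{
  "evaluation_parameters": {
    "left": "$steps.some_previous_step.output"
  },
  "next_steps": [
    "$steps.on_true"
  ]
}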
Example JSON definition of step Continue If in version v1
{
"name": "<your_step_name_here>",
"type": "roboflow_core/continue_if@v1",
"condition_statement": {
"statements": [
{
"comparator": {
"type": "(Number) =="
},
"left_operand": {
"operand_name": "left",
"type": "DynamicOperand"
},
"right_operand": {
"type": "StaticOperand",
"value": 1
},
"type": "BinaryStatement"
}
],
"type": "StatementGroup"
},
"evaluation_parameters": {
"left": "$inputs.some"
},
"next_steps": [
"$steps.on_true"
]
}
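For context, below is a sketch of a complete workflow specification that embeds this step so that an Object Detection Model step runs only when the input parameter equals 1. The overall layout (version, inputs, steps, outputs) follows the general Workflows definition schema; the downstream step's type identifier (roboflow_core/roboflow_object_detection_model@v1), its images and model_id fields, and the model alias yolov8n-640 are assumptions not taken from this page and should be verified against the Object Detection Model block reference.

{
  "version": "1.0",
  "inputs": [
    { "type": "WorkflowImage", "name": "image" },
    { "type": "WorkflowParameter", "name": "some" }
  ],
  "steps": [
    {
      "type": "roboflow_core/continue_if@v1",
      "name": "continue_if",
      "condition_statement": {
        "type": "StatementGroup",
        "statements": [
          {
            "type": "BinaryStatement",
            "left_operand": { "type": "DynamicOperand", "operand_name": "left" },
            "comparator": { "type": "(Number) ==" },
            "right_operand": { "type": "StaticOperand", "value": 1 }
          }
        ]
      },
      "evaluation_parameters": { "left": "$inputs.some" },
      "next_steps": [ "$steps.on_true" ]
    },
    {
      "type": "roboflow_core/roboflow_object_detection_model@v1",
      "name": "on_true",
      "images": "$inputs.image",
      "model_id": "yolov8n-640"
    }
  ],
  "outputs": [
    { "type": "JsonField", "name": "predictions", "selector": "$steps.on_true.predictions" }
  ]
}

When the condition evaluates to false, the on_true branch is not executed, so the predictions output will not carry detections for that run.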