Detection Offset
Class: DetectionOffsetBlockV1
Source: inference.core.workflows.core_steps.transformations.detection_offset.v1.DetectionOffsetBlockV1
Apply a fixed offset to the width and height of a detection.
You can use this block to add padding around detected bounding boxes. This is useful when a model's boxes sit tightly inside or within the object, and you want to analyze a slightly larger region that fully surrounds it. For example, with offset_width and offset_height set to 10 pixels, each bounding box becomes 10 pixels wider and 10 pixels taller.
Type identifier

Use the following identifier in the step "type" field to add the block as a step in your workflow:

roboflow_core/detection_offset@v1
Properties

Name | Type | Description | Refs |
---|---|---|---|
name | str | Enter a unique identifier for this step. | ❌ |
offset_width | int | Offset for box width. | ✅ |
offset_height | int | Offset for box height. | ✅ |
units | str | Units for offset dimensions. | ❌ |
The Refs column indicates whether the property can be parametrised with dynamic values available at workflow runtime. See Bindings for more info.
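For example, a property marked ✅ can take a selector instead of a literal value. The sketch below binds both offsets to a hypothetical workflow input named padding (an illustrative name, not part of this block; declare it among your workflow inputs):

```json
{
    "name": "detection_offset",
    "type": "roboflow_core/detection_offset@v1",
    "predictions": "$steps.object_detection_model.predictions",
    "offset_width": "$inputs.padding",
    "offset_height": "$inputs.padding"
}
```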
Available Connections

Compatible Blocks

Check what blocks you can connect to Detection Offset in version v1.
- inputs: Segment Anything 2 Model, Detections Filter, Perspective Correction, Object Detection Model, Path Deviation, Detections Consensus, SIFT Comparison, Detection Offset, Keypoint Detection Model, VLM as Detector, Image Contours, Byte Tracker, Distance Measurement, Velocity, Google Vision OCR, Bounding Rectangle, Pixel Color Count, Gaze Detection, Line Counter, Detections Stabilizer, Template Matching, Dynamic Zone, Detections Transformation, Detections Stitch, Time in Zone, YOLO-World Model, Instance Segmentation Model, Detections Classes Replacement
- outputs: Segment Anything 2 Model, Detections Filter, Stitch OCR Detections, Stability AI Inpainting, Pixelate Visualization, Perspective Correction, Path Deviation, Roboflow Custom Metadata, Detections Consensus, Detection Offset, Roboflow Dataset Upload, Ellipse Visualization, Model Comparison Visualization, Halo Visualization, Crop Visualization, Byte Tracker, Trace Visualization, Distance Measurement, Circle Visualization, Velocity, Dot Visualization, Background Color Visualization, Bounding Rectangle, Size Measurement, Florence-2 Model, Corner Visualization, Bounding Box Visualization, Dynamic Crop, Line Counter, Detections Stabilizer, Label Visualization, Mask Visualization, Triangle Visualization, Dynamic Zone, Detections Transformation, Detections Stitch, Keypoint Visualization, Model Monitoring Inference Aggregator, Time in Zone, Color Visualization, Blur Visualization, Detections Classes Replacement, Polygon Visualization (see the example after this list)
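For instance, the offset predictions can feed a downstream block such as Dynamic Crop, so that every crop includes the extra padding. A minimal sketch (the Dynamic Crop type identifier and field names are quoted from memory; confirm them against that block's documentation):

```json
{
    "name": "crops",
    "type": "roboflow_core/dynamic_crop@v1",
    "images": "$inputs.image",
    "predictions": "$steps.detection_offset.predictions"
}
```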
Input and Output Bindings

The available connections depend on the block's binding kinds. Check what binding kinds Detection Offset in version v1 has.
Bindings

- input
  - predictions (Union[instance_segmentation_prediction, keypoint_detection_prediction, object_detection_prediction]): Model predictions to offset dimensions for.
  - offset_width (integer): Offset for box width.
  - offset_height (integer): Offset for box height.
- output
  - predictions (Union[object_detection_prediction, instance_segmentation_prediction, keypoint_detection_prediction]): An sv.Detections(...) object holding detected bounding boxes (object_detection_prediction), bounding boxes with segmentation masks (instance_segmentation_prediction), or bounding boxes with detected keypoints (keypoint_detection_prediction).
Example JSON definition of step Detection Offset in version v1:

```json
{
    "name": "<your_step_name_here>",
    "type": "roboflow_core/detection_offset@v1",
    "predictions": "$steps.object_detection_model.predictions",
    "offset_width": 10,
    "offset_height": 10,
    "units": "Pixels"
}
```
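For context, below is a minimal sketch of a complete workflow definition that chains an Object Detection Model step into Detection Offset and exposes the padded predictions as an output. The model block's type identifier and field names, as well as the model_id value yolov8n-640, are assumptions for illustration; check them against the Object Detection Model block's documentation.

```json
{
    "version": "1.0",
    "inputs": [
        {"type": "WorkflowImage", "name": "image"}
    ],
    "steps": [
        {
            "type": "roboflow_core/roboflow_object_detection_model@v1",
            "name": "object_detection_model",
            "images": "$inputs.image",
            "model_id": "yolov8n-640"
        },
        {
            "type": "roboflow_core/detection_offset@v1",
            "name": "detection_offset",
            "predictions": "$steps.object_detection_model.predictions",
            "offset_width": 10,
            "offset_height": 10
        }
    ],
    "outputs": [
        {
            "type": "JsonField",
            "name": "predictions",
            "selector": "$steps.detection_offset.predictions"
        }
    ]
}
```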