Path Deviation

v2

Class: PathDeviationAnalyticsBlockV2 (there are multiple versions of this block)

Source: inference.core.workflows.core_steps.analytics.path_deviation.v2.PathDeviationAnalyticsBlockV2

Warning: This block has multiple versions. Please refer to the specific version for details. You can learn more about how versions work here: Versioning

Measure how closely tracked objects follow a reference path by calculating the Fréchet distance between the object's actual trajectory and the expected reference path, enabling path compliance monitoring, route deviation detection, quality control in automated systems, and behavioral analysis workflows.

How This Block Works

This block compares the actual movement path of tracked objects against a predefined reference path to measure deviation. The block:

  1. Receives tracked detection predictions with unique tracker IDs, an image with embedded video metadata, and a reference path definition
  2. Extracts video metadata from the image:
    • Accesses video_metadata from the WorkflowImageData object
    • Extracts video_identifier to maintain separate path tracking state for different videos
    • Uses video metadata to initialize and manage path tracking state per video
  3. Validates that detections have tracker IDs (required for tracking object movement across frames)
  4. Initializes or retrieves path tracking state for the video:
    • Maintains a history of positions for each tracked object per video
    • Stores object paths using video_identifier to separate state for different videos
    • Creates new path tracking entries for objects appearing for the first time
  5. Extracts anchor point coordinates for each detection:
    • Uses the triggering_anchor to determine which point on the bounding box to track (default: CENTER)
    • Gets the (x, y) coordinates of the anchor point for each detection in the current frame
    • The anchor point represents the position of the object used for path comparison
  6. Accumulates object paths over time:
    • Appends each object's anchor point to its path history as frames are processed
    • Maintains separate path histories for each unique tracker_id
    • Builds complete trajectory paths by accumulating positions across all processed frames
  7. Calculates Fréchet distance for each tracked object:
    • Fréchet Distance: Measures the similarity between two curves (paths) considering both location and ordering of points
    • Compares the object's accumulated path (actual trajectory) against the reference path (expected trajectory)
    • Uses dynamic programming to compute the minimum "leash length" required to traverse both paths simultaneously
    • Accounts for the order of points along each path, not just point-to-point distances
    • Lower values indicate the object follows the reference path closely; higher values indicate greater deviation
  8. Stores path deviation in detection metadata:
    • Adds the Fréchet distance value to each detection's metadata
    • Each detection includes path_deviation representing how much it deviates from the reference path
    • Distance is measured in pixels (same units as image coordinates)
  9. Maintains persistent path tracking:
    • Path histories accumulate across frames for the entire video
    • Each object's deviation is calculated based on its complete path from the start of tracking
    • Separate tracking state maintained for each video_identifier
  10. Returns detections enhanced with path deviation information:
    • Outputs detection objects with added path_deviation metadata
    • Each detection now includes the Fréchet distance measuring its deviation from the reference path

The Fréchet distance is a metric that measures the similarity between two curves by finding the minimum length of a "leash" that connects a point moving along one curve to a point moving along the other curve, where both points move forward along their respective curves. Unlike simple Euclidean distance, Fréchet distance considers the ordering and continuity of points along paths, making it ideal for comparing trajectories where the sequence of movement matters. An object that follows the reference path exactly will have a Fréchet distance of 0, while objects that deviate significantly will have larger distances.
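The dynamic-programming computation of the discrete Fréchet distance can be sketched as follows. This is an illustrative implementation, not the block's actual internals; the function and variable names are hypothetical.

```python
from math import dist


def frechet_distance(path_a, path_b):
    """Discrete Fréchet distance between two polylines given as (x, y) points."""
    n, m = len(path_a), len(path_b)
    # memo[i][j] holds the minimum "leash length" needed to walk
    # path_a[:i + 1] and path_b[:j + 1] in order, simultaneously.
    memo = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = dist(path_a[i], path_b[j])
            if i == 0 and j == 0:
                memo[i][j] = d
            elif i == 0:
                memo[i][j] = max(memo[i][j - 1], d)
            elif j == 0:
                memo[i][j] = max(memo[i - 1][j], d)
            else:
                # Either path (or both) may advance; take the cheapest option,
                # but the leash must still reach the current pair of points.
                memo[i][j] = max(
                    min(memo[i - 1][j], memo[i][j - 1], memo[i - 1][j - 1]), d
                )
    return memo[n - 1][m - 1]


frechet_distance([(0, 0), (10, 0)], [(0, 0), (10, 0)])  # identical paths -> 0.0
frechet_distance([(0, 5), (10, 5)], [(0, 0), (10, 0)])  # parallel, offset by 5 -> 5.0
```

Because the recurrence respects point ordering, a trajectory that retraces the reference path backwards scores a large distance even though its points lie on the path, which plain point-to-point distances would miss.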

Common Use Cases

  • Path Compliance Monitoring: Monitor whether vehicles, robots, or objects follow predefined routes (e.g., verify vehicles stay in lanes, check robots follow programmed paths, ensure objects follow expected routes), enabling compliance monitoring workflows
  • Quality Control: Detect deviations in manufacturing or assembly processes where objects should follow specific paths (e.g., detect conveyor belt deviations, monitor assembly line paths, check product movement patterns), enabling quality control workflows
  • Traffic Analysis: Analyze vehicle movement patterns and detect lane departures or route deviations (e.g., detect vehicles leaving lanes, monitor route adherence, analyze traffic pattern compliance), enabling traffic analysis workflows
  • Security Monitoring: Detect suspicious movement patterns or deviations from expected paths in security scenarios (e.g., detect unauthorized route deviations, monitor perimeter breach attempts, track movement compliance), enabling security monitoring workflows
  • Automated Systems: Monitor and validate that automated systems (robots, AGVs, drones) follow expected paths correctly (e.g., verify robot navigation accuracy, check automated vehicle paths, validate drone flight paths), enabling automated system validation workflows
  • Behavioral Analysis: Study movement patterns and path adherence in behavioral research (e.g., analyze animal movement patterns, study path following behavior, measure route preference deviations), enabling behavioral research workflows

Connecting to Other Blocks

This block receives tracked detections, an image with embedded video metadata, and a reference path, and produces detections enhanced with path_deviation metadata:

  • After Byte Tracker blocks to measure path deviation for tracked objects (e.g., measure tracked vehicle path compliance, analyze tracked person route adherence, monitor tracked object path deviations), enabling tracking-to-path-analysis workflows
  • After object detection or instance segmentation blocks with tracking enabled to analyze movement paths (e.g., analyze vehicle paths, track object route compliance, measure path deviations), enabling detection-to-path-analysis workflows
  • Before visualization blocks to display path deviation information (e.g., visualize paths and deviations, display reference and actual paths, show deviation metrics), enabling path deviation visualization workflows
  • Before logic blocks like Continue If to make decisions based on path deviation thresholds (e.g., continue if deviation exceeds limit, filter based on path compliance, trigger actions on route violations), enabling path-based decision workflows
  • Before notification blocks to alert on path deviations or compliance violations (e.g., alert on route deviations, notify on path compliance issues, trigger deviation-based alerts), enabling path-based notification workflows
  • Before data storage blocks to record path deviation measurements (e.g., log path compliance data, store deviation statistics, record route adherence metrics), enabling path deviation data logging workflows

Version Differences

Enhanced from v1:

  • Simplified Input: Uses image input that contains embedded video metadata instead of requiring a separate metadata field, simplifying workflow connections and reducing input complexity
  • Improved Integration: Better integration with image-based workflows since video metadata is accessed directly from the image object rather than requiring separate metadata input

Requirements

This block requires tracked detections with tracker_id information (detections must come from a tracking block like Byte Tracker). The reference path must be defined as a list of at least 2 points, where each point is a tuple or list of exactly 2 coordinates (x, y). The image's video_metadata should include video_identifier to maintain separate path tracking state for different videos. The block maintains persistent path tracking across frames for each video, accumulating complete trajectories, so it should be used in video workflows where frames are processed sequentially. For accurate path deviation measurement, detections should be provided consistently across frames with valid tracker IDs. The Fréchet distance is calculated in pixels (same units as image coordinates).

Type identifier

Use the following identifier in the step "type" field: roboflow_core/path_deviation_analytics@v2 to add the block as a step in your workflow.

Properties

Name Type Description Refs
name str Unique identifier for this step.
triggering_anchor str Point on the bounding box used to track object position for path calculation. Options include CENTER (default), BOTTOM_CENTER, TOP_CENTER, CENTER_LEFT, CENTER_RIGHT, etc. This anchor point's coordinates are accumulated over frames to build the object's trajectory path, which is compared against the reference path using Fréchet distance.
reference_path List[Any] Expected reference path as a list of at least 2 points, where each point is a tuple or list of [x, y] coordinates. Example: [(100, 200), (200, 300), (300, 400)] defines a path with 3 points. The Fréchet distance measures how closely tracked objects follow this reference path. Points should be ordered along the expected trajectory.

The Refs column marks whether the property can be parametrised with dynamic values available at workflow runtime. See Bindings for more info.

Available Connections

Compatible Blocks

Check what blocks you can connect to Path Deviation in version v2.

Input and Output Bindings

The available connections depend on the block's binding kinds. Check what binding kinds Path Deviation in version v2 has.

Bindings
  • input

    • image (image): Image with embedded video metadata. The video_metadata contains video_identifier to maintain separate path tracking state for different videos. Required for persistent path accumulation across frames.
    • detections (Union[instance_segmentation_prediction, object_detection_prediction]): Tracked object detection or instance segmentation predictions. Must include tracker_id information from a tracking block. The block tracks anchor point positions across frames to build object trajectories and compares them against the reference path. Output detections include path_deviation metadata containing the Fréchet distance from the reference path.
    • triggering_anchor (string): Point on the bounding box used to track object position for path calculation. Options include CENTER (default), BOTTOM_CENTER, TOP_CENTER, CENTER_LEFT, CENTER_RIGHT, etc. This anchor point's coordinates are accumulated over frames to build the object's trajectory path, which is compared against the reference path using Fréchet distance.
    • reference_path (list_of_values): Expected reference path as a list of at least 2 points, where each point is a tuple or list of [x, y] coordinates. Example: [(100, 200), (200, 300), (300, 400)] defines a path with 3 points. The Fréchet distance measures how closely tracked objects follow this reference path. Points should be ordered along the expected trajectory.
  • output

    • path_deviation_detections (Union[object_detection_prediction, instance_segmentation_prediction]): Predictions returned as an sv.Detections(...) object: detected bounding boxes for object detection predictions, or detected bounding boxes and segmentation masks for instance segmentation predictions.
Example JSON definition of step Path Deviation in version v2
{
    "name": "<your_step_name_here>",
    "type": "roboflow_core/path_deviation_analytics@v2",
    "image": "<block_does_not_provide_example>",
    "detections": "$steps.object_detection_model.predictions",
    "triggering_anchor": "CENTER",
    "reference_path": [
        [
            100,
            200
        ],
        [
            200,
            300
        ],
        [
            300,
            400
        ]
    ]
}
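Since the step definition is plain JSON, it can be assembled programmatically. The helper below is a hypothetical sketch that builds the v2 step shown above and enforces the documented constraints (at least 2 points, 2 coordinates per point) before the definition reaches the workflow compiler; the `$inputs.image` and `$steps.byte_tracker.tracked_detections` selectors are illustrative assumptions about how the surrounding workflow is wired.

```python
import json


def make_path_deviation_step(name, detections_ref, reference_path,
                             triggering_anchor="CENTER"):
    """Build a roboflow_core/path_deviation_analytics@v2 step definition."""
    if len(reference_path) < 2:
        raise ValueError("reference_path needs at least 2 points")
    if any(len(point) != 2 for point in reference_path):
        raise ValueError("each reference_path point must be an (x, y) pair")
    return {
        "name": name,
        "type": "roboflow_core/path_deviation_analytics@v2",
        "image": "$inputs.image",                  # illustrative input selector
        "detections": detections_ref,
        "triggering_anchor": triggering_anchor,
        "reference_path": [list(point) for point in reference_path],
    }


step = make_path_deviation_step(
    name="path_check",
    detections_ref="$steps.byte_tracker.tracked_detections",  # assumed tracker output
    reference_path=[(100, 200), (200, 300), (300, 400)],
)
print(json.dumps(step, indent=2))
```

The resulting dictionary can be appended to the "steps" list of a workflow definition alongside the tracking block it consumes from.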

v1

Class: PathDeviationAnalyticsBlockV1 (there are multiple versions of this block)

Source: inference.core.workflows.core_steps.analytics.path_deviation.v1.PathDeviationAnalyticsBlockV1

Warning: This block has multiple versions. Please refer to the specific version for details. You can learn more about how versions work here: Versioning

Measure how closely tracked objects follow a reference path by calculating the Fréchet distance between the object's actual trajectory and the expected reference path, enabling path compliance monitoring, route deviation detection, quality control in automated systems, and behavioral analysis workflows.

How This Block Works

This block compares the actual movement path of tracked objects against a predefined reference path to measure deviation. The block:

  1. Receives tracked detection predictions with unique tracker IDs, video metadata, and a reference path definition
  2. Validates that detections have tracker IDs (required for tracking object movement across frames)
  3. Initializes or retrieves path tracking state for the video:
    • Maintains a history of positions for each tracked object per video
    • Stores object paths using video_identifier to separate state for different videos
    • Creates new path tracking entries for objects appearing for the first time
  4. Extracts anchor point coordinates for each detection:
    • Uses the triggering_anchor to determine which point on the bounding box to track (default: CENTER)
    • Gets the (x, y) coordinates of the anchor point for each detection in the current frame
    • The anchor point represents the position of the object used for path comparison
  5. Accumulates object paths over time:
    • Appends each object's anchor point to its path history as frames are processed
    • Maintains separate path histories for each unique tracker_id
    • Builds complete trajectory paths by accumulating positions across all processed frames
  6. Calculates Fréchet distance for each tracked object:
    • Fréchet Distance: Measures the similarity between two curves (paths) considering both location and ordering of points
    • Compares the object's accumulated path (actual trajectory) against the reference path (expected trajectory)
    • Uses dynamic programming to compute the minimum "leash length" required to traverse both paths simultaneously
    • Accounts for the order of points along each path, not just point-to-point distances
    • Lower values indicate the object follows the reference path closely; higher values indicate greater deviation
  7. Stores path deviation in detection metadata:
    • Adds the Fréchet distance value to each detection's metadata
    • Each detection includes path_deviation representing how much it deviates from the reference path
    • Distance is measured in pixels (same units as image coordinates)
  8. Maintains persistent path tracking:
    • Path histories accumulate across frames for the entire video
    • Each object's deviation is calculated based on its complete path from the start of tracking
    • Separate tracking state maintained for each video_identifier
  9. Returns detections enhanced with path deviation information:
    • Outputs detection objects with added path_deviation metadata
    • Each detection now includes the Fréchet distance measuring its deviation from the reference path

The Fréchet distance is a metric that measures the similarity between two curves by finding the minimum length of a "leash" that connects a point moving along one curve to a point moving along the other curve, where both points move forward along their respective curves. Unlike simple Euclidean distance, Fréchet distance considers the ordering and continuity of points along paths, making it ideal for comparing trajectories where the sequence of movement matters. An object that follows the reference path exactly will have a Fréchet distance of 0, while objects that deviate significantly will have larger distances.
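The per-video, per-tracker state the steps above describe can be simulated with a few lines of Python. This is a simplified sketch, not the block's actual internals: anchor points accumulate into a path history keyed by (video_identifier, tracker_id), so the same tracker id in two different videos keeps separate trajectories.

```python
from collections import defaultdict

# (video_identifier, tracker_id) -> [(x, y), ...] accumulated across frames
paths = defaultdict(list)


def accumulate(video_id, frame_detections):
    """Append this frame's anchor points; frame_detections maps tracker_id -> (x, y)."""
    for tracker_id, anchor_xy in frame_detections.items():
        paths[(video_id, tracker_id)].append(anchor_xy)


# Two frames from video "cam-1": tracker 7 moves right along y = 0.
accumulate("cam-1", {7: (0, 0)})
accumulate("cam-1", {7: (50, 0)})
# A different video keeps its own state, even for the same tracker id.
accumulate("cam-2", {7: (5, 5)})
```

After each frame, the block computes the Fréchet distance between each accumulated path and the reference path, so the reported deviation always reflects the object's complete trajectory so far.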

Common Use Cases

  • Path Compliance Monitoring: Monitor whether vehicles, robots, or objects follow predefined routes (e.g., verify vehicles stay in lanes, check robots follow programmed paths, ensure objects follow expected routes), enabling compliance monitoring workflows
  • Quality Control: Detect deviations in manufacturing or assembly processes where objects should follow specific paths (e.g., detect conveyor belt deviations, monitor assembly line paths, check product movement patterns), enabling quality control workflows
  • Traffic Analysis: Analyze vehicle movement patterns and detect lane departures or route deviations (e.g., detect vehicles leaving lanes, monitor route adherence, analyze traffic pattern compliance), enabling traffic analysis workflows
  • Security Monitoring: Detect suspicious movement patterns or deviations from expected paths in security scenarios (e.g., detect unauthorized route deviations, monitor perimeter breach attempts, track movement compliance), enabling security monitoring workflows
  • Automated Systems: Monitor and validate that automated systems (robots, AGVs, drones) follow expected paths correctly (e.g., verify robot navigation accuracy, check automated vehicle paths, validate drone flight paths), enabling automated system validation workflows
  • Behavioral Analysis: Study movement patterns and path adherence in behavioral research (e.g., analyze animal movement patterns, study path following behavior, measure route preference deviations), enabling behavioral research workflows

Connecting to Other Blocks

This block receives tracked detections, video metadata, and a reference path, and produces detections enhanced with path_deviation metadata:

  • After Byte Tracker blocks to measure path deviation for tracked objects (e.g., measure tracked vehicle path compliance, analyze tracked person route adherence, monitor tracked object path deviations), enabling tracking-to-path-analysis workflows
  • After object detection or instance segmentation blocks with tracking enabled to analyze movement paths (e.g., analyze vehicle paths, track object route compliance, measure path deviations), enabling detection-to-path-analysis workflows
  • Before visualization blocks to display path deviation information (e.g., visualize paths and deviations, display reference and actual paths, show deviation metrics), enabling path deviation visualization workflows
  • Before logic blocks like Continue If to make decisions based on path deviation thresholds (e.g., continue if deviation exceeds limit, filter based on path compliance, trigger actions on route violations), enabling path-based decision workflows
  • Before notification blocks to alert on path deviations or compliance violations (e.g., alert on route deviations, notify on path compliance issues, trigger deviation-based alerts), enabling path-based notification workflows
  • Before data storage blocks to record path deviation measurements (e.g., log path compliance data, store deviation statistics, record route adherence metrics), enabling path deviation data logging workflows

Requirements

This block requires tracked detections with tracker_id information (detections must come from a tracking block like Byte Tracker). The reference path must be defined as a list of at least 2 points, where each point is a tuple or list of exactly 2 coordinates (x, y). The block requires video metadata with video_identifier to maintain separate path tracking state for different videos. The block maintains persistent path tracking across frames for each video, accumulating complete trajectories, so it should be used in video workflows where frames are processed sequentially. For accurate path deviation measurement, detections should be provided consistently across frames with valid tracker IDs. The Fréchet distance is calculated in pixels (same units as image coordinates).

Type identifier

Use the following identifier in the step "type" field: roboflow_core/path_deviation_analytics@v1 to add the block as a step in your workflow.

Properties

Name Type Description Refs
name str Unique identifier for this step.
triggering_anchor str Point on the bounding box used to track object position for path calculation. Options: CENTER (default), BOTTOM_CENTER, TOP_CENTER, CENTER_LEFT, CENTER_RIGHT, etc. This anchor point's coordinates are accumulated over frames to build the object's trajectory path, which is compared against the reference path using Fréchet distance.
reference_path List[Any] Expected reference path as a list of at least 2 points, where each point is a tuple or list of [x, y] coordinates. Example: [(100, 200), (200, 300), (300, 400)] defines a path with 3 points. The Fréchet distance measures how closely tracked objects follow this reference path. Points should be ordered along the expected trajectory.

The Refs column marks whether the property can be parametrised with dynamic values available at workflow runtime. See Bindings for more info.

Available Connections

Compatible Blocks

Check what blocks you can connect to Path Deviation in version v1.

Input and Output Bindings

The available connections depend on the block's binding kinds. Check what binding kinds Path Deviation in version v1 has.

Bindings
  • input

    • metadata (video_metadata): Video metadata containing video_identifier to maintain separate path tracking state for different videos. Required for persistent path accumulation across frames.
    • detections (Union[instance_segmentation_prediction, object_detection_prediction]): Tracked object detection or instance segmentation predictions. Must include tracker_id information from a tracking block. The block tracks anchor point positions across frames to build object trajectories and compares them against the reference path. Output detections include path_deviation metadata containing the Fréchet distance from the reference path.
    • triggering_anchor (string): Point on the bounding box used to track object position for path calculation. Options: CENTER (default), BOTTOM_CENTER, TOP_CENTER, CENTER_LEFT, CENTER_RIGHT, etc. This anchor point's coordinates are accumulated over frames to build the object's trajectory path, which is compared against the reference path using Fréchet distance.
    • reference_path (list_of_values): Expected reference path as a list of at least 2 points, where each point is a tuple or list of [x, y] coordinates. Example: [(100, 200), (200, 300), (300, 400)] defines a path with 3 points. The Fréchet distance measures how closely tracked objects follow this reference path. Points should be ordered along the expected trajectory.
  • output

    • path_deviation_detections (Union[object_detection_prediction, instance_segmentation_prediction]): Predictions returned as an sv.Detections(...) object: detected bounding boxes for object detection predictions, or detected bounding boxes and segmentation masks for instance segmentation predictions.
Example JSON definition of step Path Deviation in version v1
{
    "name": "<your_step_name_here>",
    "type": "roboflow_core/path_deviation_analytics@v1",
    "metadata": "<block_does_not_provide_example>",
    "detections": "$steps.object_detection_model.predictions",
    "triggering_anchor": "CENTER",
    "reference_path": [
        [
            100,
            200
        ],
        [
            200,
            300
        ],
        [
            300,
            400
        ]
    ]
}