
Stability AI Inpainting

Class: StabilityAIInpaintingBlockV1

Source: inference.core.workflows.core_steps.models.foundation.stability_ai.inpainting.v1.StabilityAIInpaintingBlockV1

The block wraps the Stability AI inpainting API and lets users use instance segmentation results to change the content of images in a creative way.

Type identifier

Use the following identifier in the step "type" field: roboflow_core/stability_ai_inpainting@v1 to add the block as a step in your workflow.

Properties

| Name | Type | Description | Refs |
| --- | --- | --- | --- |
| name | str | Enter a unique identifier for this step. | ❌ |
| prompt | str | Prompt to inpainting model (what you wish to see). | ✅ |
| negative_prompt | str | Negative prompt to inpainting model (what you do not wish to see). | ✅ |
| api_key | str | Your Stability AI API key. | ✅ |

The Refs column indicates whether the property can be parametrised with dynamic values available at workflow runtime. See Bindings for more info.
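
For example, a property marked in the Refs column can be bound with a selector instead of a literal value. The sketch below assumes the workflow declares inputs named prompt and stability_api_key; those names are illustrative, not required by the block:

{
    "name": "inpainting",
    "type": "roboflow_core/stability_ai_inpainting@v1",
    "image": "$inputs.image",
    "segmentation_mask": "$steps.model.predictions",
    "prompt": "$inputs.prompt",
    "negative_prompt": "blurry, low quality",
    "api_key": "$inputs.stability_api_key"
}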

Available Connections

Compatible Blocks

Check what blocks you can connect to Stability AI Inpainting in version v1.

Input and Output Bindings

The available connections depend on the block's binding kinds. Check what binding kinds Stability AI Inpainting in version v1 has.

Bindings
  • input

    • image (image): The input image to be inpainted.
    • segmentation_mask (instance_segmentation_prediction): Segmentation masks defining the regions to inpaint.
    • prompt (string): Prompt to inpainting model (what you wish to see).
    • negative_prompt (string): Negative prompt to inpainting model (what you do not wish to see).
    • api_key (Union[secret, string]): Your Stability AI API key.
  • output

    • image (image): The generated (inpainted) image.
Example JSON definition of step Stability AI Inpainting in version v1
{
    "name": "<your_step_name_here>",
    "type": "roboflow_core/stability_ai_inpainting@v1",
    "image": "$inputs.image",
    "segmentation_mask": "$steps.model.predictions",
    "prompt": "my prompt",
    "negative_prompt": "my prompt",
    "api_key": "xxx-xxx"
}
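
For context, the sketch below shows how the step above could sit in a complete workflow definition, with an upstream instance segmentation model producing the segmentation_mask input. The model block type, model_id, and input/output names are illustrative assumptions, not requirements of this block:

{
    "version": "1.0",
    "inputs": [
        {"type": "WorkflowImage", "name": "image"},
        {"type": "WorkflowParameter", "name": "prompt"},
        {"type": "WorkflowParameter", "name": "stability_api_key"}
    ],
    "steps": [
        {
            "type": "roboflow_core/roboflow_instance_segmentation_model@v2",
            "name": "model",
            "images": "$inputs.image",
            "model_id": "yolov8n-seg-640"
        },
        {
            "type": "roboflow_core/stability_ai_inpainting@v1",
            "name": "inpainting",
            "image": "$inputs.image",
            "segmentation_mask": "$steps.model.predictions",
            "prompt": "$inputs.prompt",
            "negative_prompt": "blurry, low quality",
            "api_key": "$inputs.stability_api_key"
        }
    ],
    "outputs": [
        {"type": "JsonField", "name": "inpainted_image", "selector": "$steps.inpainting.image"}
    ]
}

The output selector $steps.inpainting.image refers to the block's image output listed under Bindings.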