Inference CLI¶
Roboflow Inference CLI is a command-line interface for the `inference` ecosystem, providing an easy way to:

- run and manage an `inference` server locally
- process data with Workflows
- benchmark `inference` performance
- make predictions from your models
- deploy an `inference` server in the cloud
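As a minimal sketch, the tasks above map onto subcommands of the `inference` executable. The model ID, image path, and API key below are placeholders, and the exact flags may differ between versions — run `inference --help` to see what your installed version supports:

```shell
# Start a local inference server (serves on port 9001 by default)
inference server start

# Run a prediction on a local image with a model hosted on Roboflow
# ("your-project/1" and the API key variable are placeholders)
inference infer -i ./image.jpg -m "your-project/1" --api-key "$ROBOFLOW_API_KEY"

# Stop the local server when done
inference server stop
```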
Installation¶
```bash
pip install inference-cli
```
Note that if you have installed the `inference` Python package, the CLI is already included.
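To confirm the CLI is available after installation, you can ask it to list its top-level commands (the output shown by `--help` will depend on your installed version):

```shell
# The package installs an `inference` executable on your PATH;
# --help prints the available subcommands
inference --help
```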
Supported Devices¶
Roboflow Inference CLI currently supports the following device targets:
- x86 CPU
- ARM64 CPU
- NVIDIA GPU (including Jetson)
For Jetson-specific inference server images, check out the Roboflow Inference package, or pull the images directly following the instructions in the official Roboflow Inference documentation.
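As a sketch, pulling and running a server image directly might look like the following. The image names here are assumptions — verify the current names and tags in the official Roboflow Inference documentation or on Docker Hub before using them:

```shell
# CPU image (assumed name; check Docker Hub for the current tag)
docker pull roboflow/roboflow-inference-server-cpu

# NVIDIA GPU image (assumed name; check Docker Hub for the current tag)
docker pull roboflow/roboflow-inference-server-gpu

# Run the server, exposing its default port 9001 on the host
docker run -d --name inference-server -p 9001:9001 \
    roboflow/roboflow-inference-server-cpu
```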