
Using DeGirum PySDK, DeGirum Tools, and DeGirum CLI with Hailo Hardware

This repository provides a comprehensive guide on using DeGirum PySDK, DeGirum Tools, and DeGirum CLI with Hailo hardware for efficient AI inference. These tools simplify edge AI development by enabling seamless integration, testing, and deployment of AI models on multiple hardware platforms, including Hailo-8 and Hailo-8L.


Table of Contents

  1. Introduction
  2. Prerequisites
  3. Setting Up the Environment
  4. Installing DeGirum CLI
  5. Verifying Installation
  6. Example Usage
  7. Additional Resources

Introduction

DeGirum provides a powerful suite of tools to simplify the development and deployment of edge AI applications:

  • DeGirum PySDK: The core library for integrating AI inference capabilities into applications.
  • DeGirum Tools: Utilities for benchmarking, streaming, and interacting with DeGirum's model zoo.
  • DeGirum CLI: A command-line interface for testing and managing AI models.

These tools are designed to be hardware-agnostic, enabling developers to build scalable, flexible solutions without being locked into a specific platform.


Prerequisites

  • Hailo Tools Installed: Ensure that Hailo's tools and SDK are properly installed and configured. Refer to Hailo's documentation for detailed setup instructions. Also, enable the HailoRT Multi-Process service, as per HailoRT documentation:

    sudo systemctl enable --now hailort.service  # for Ubuntu
  • Python 3.9 or Later: Ensure Python is installed on your system. You can check your Python version using:

    python3 --version

Setting Up the Environment

To keep your Python environment clean and avoid conflicts, it's recommended to use a virtual environment for installing the required packages.

Linux/macOS

  1. Navigate to the directory where you'd like to create the environment.
  2. Run the following commands:
    python3 -m venv degirum_env
    source degirum_env/bin/activate

Windows

  1. Navigate to the directory where you'd like to create the environment.
  2. Run the following commands:
    python -m venv degirum_env
    degirum_env\Scripts\activate

Update pip

Ensure pip is up-to-date within your virtual environment:

pip install --upgrade pip

Installing DeGirum CLI

Install the DeGirum CLI package from PyPI using pip. This package includes degirum, degirum_tools, and degirum_cli for easy testing and development:

pip install degirum_cli

This will automatically install:

  • degirum: The core PySDK library for AI inference.
  • degirum_tools: Additional tools for streaming, benchmarking, and other utilities.
  • degirum_cli: A command-line interface for interacting with DeGirum PySDK.
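
As a quick sanity check, you can also confirm from Python that the packages installed into the active virtual environment. The snippet below is a minimal sketch; it assumes the __version__ attribute exposed by recent PySDK releases.

import degirum as dg
import degirum_tools  # imported only to confirm the package is present

# Print the PySDK version (__version__ is assumed to be available in recent releases)
print("DeGirum PySDK version:", dg.__version__)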

Verifying Installation

To verify the installation, run the following commands:

Check CLI Installation

degirum_cli --help

You should see a list of available commands and their usage.

Check Hailo Hardware Integration

Run the following command to verify that the Hailo hardware is recognized by the DeGirum package:

degirum sys-info

Look for hailort in the output to ensure the Hailo hardware is properly integrated. Below is an example output when Hailo hardware is detected:

Devices:
  HAILORT/HAILO8:
  - '@Index': 0
    Board Name: Hailo-8
    Device Architecture: HAILO8
    Firmware Version: 4.19.0
    ID: '0000:02:00.0'
    Part Number: HM218B1C2LA
    Product Name: HAILO-8 AI ACCELERATOR M.2 B+M KEY MODULE
    Serial Number: SomeSerialNumber

Note: DeGirum PySDK supports Hailo Runtime version 4.19.0. Ensure your Hailo environment is configured to use this version.
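
To go one step further in Python, you can connect PySDK to the local inference host and list the Hailo models available in DeGirum's model zoo. The snippet below is a sketch: it assumes dg.connect() accepts the same @local host address and degirum/models_hailort zoo URL used by the CLI examples below, and a DeGirum cloud token may be required to access the cloud model zoo.

import degirum as dg

# Connect to the local inference host and DeGirum's Hailo model zoo
# (a cloud token may be required; replace the placeholder with your own)
zoo = dg.connect("@local", "degirum/models_hailort", token="<your DeGirum cloud token>")

# List models compiled for Hailo-8
hailo_models = [name for name in zoo.list_models() if "hailo8" in name]
print("\n".join(hailo_models))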


Example Usage

Image Inference

Linux/macOS

degirum_cli predict-image \
    --inference-host-address @local \
    --model-name yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1 \
    --model-zoo-url degirum/models_hailort

Windows

degirum_cli predict-image ^
    --inference-host-address @local ^
    --model-name yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1 ^
    --model-zoo-url degirum/models_hailort
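
The same image inference can be run directly from Python with PySDK. The snippet below is a minimal sketch mirroring the CLI options above; the image path is a placeholder, dg.load_model() with these keyword arguments is assumed from recent PySDK releases, and a DeGirum cloud token may be required to access the cloud model zoo.

import degirum as dg

# Load the model onto the local Hailo device, mirroring the CLI options above
model = dg.load_model(
    model_name="yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1",
    inference_host_address="@local",
    zoo_url="degirum/models_hailort",
    token="<your DeGirum cloud token>",  # may be needed to access the cloud zoo
)

# Run inference on a local image file (placeholder path) and print the detections
result = model("path/to/image.jpg")
print(result)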

Video Inference

Linux/macOS

degirum_cli predict-video \
    --inference-host-address @local \
    --model-name yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1 \
    --model-zoo-url degirum/models_hailort

Windows

degirum_cli predict-video ^
    --inference-host-address @local ^
    --model-name yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1 ^
    --model-zoo-url degirum/models_hailort
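
Video inference can also be driven from Python using the streaming helpers in DeGirum Tools. The sketch below assumes degirum_tools.predict_stream() and degirum_tools.Display() from the degirum_tools package; the video source and token are placeholders to adjust for your setup.

import degirum as dg
import degirum_tools

# Load the detection model onto the local Hailo device
model = dg.load_model(
    model_name="yolov8n_relu6_coco--640x640_quant_hailort_hailo8_1",
    inference_host_address="@local",
    zoo_url="degirum/models_hailort",
    token="<your DeGirum cloud token>",
)

video_source = 0  # camera index, video file path, or stream URL

# Stream frames through the model and display annotated results in a window
with degirum_tools.Display("Hailo Video Inference") as display:
    for result in degirum_tools.predict_stream(model, video_source):
        display.show(result)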

Additional Resources


Feel free to clone this repository and contribute by submitting pull requests or raising issues.
