
Lava Deep Learning 0.5.0


Lava Deep Learning v0.5.0 Release Notes

November 9, 2023

What's Changed

New Features and Improvements

  • Lava-dl SLAYER now has extended support for training and inference of video object detection networks, along with the associated pre- and post-processing utilities used for object detection. The object detection module is available as lava.lib.dl.slayer.obd. Its submodules are described below:

    | Module | Description |
    | --- | --- |
    | obd.yolo_base | Foundational model for YOLO object detection training, which can be used to build a variety of YOLO models |
    | obd.models | Selected pre-trained YOLO SDNN models which can be fine-tuned for user-specific applications |
    | obd.dataset | Object detection dataset library (will be progressively extended) |
    | obd.bbox.metrics | Modules to evaluate object detection models |
    | obd.{bbox, dataset}.utils | Utilities to manipulate bounding boxes and process datasets, including frame visualization and video export |

    Extensive tutorials are also available.

    In addition, the lava-dl SLAYER tutorials now include an XOR regression tutorial as a basic example to get started with lava-dl training.

    Finally, lava-dl SLAYER now supports the SpikeMoid loss, the official implementation of the spike-based loss introduced in Jurado et al., Spikemoid: Updated Spike-based Loss Methods for Classification, which enables more advanced tuning of SNNs for classification.
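    The evaluation utilities in obd.bbox.metrics revolve around bounding-box overlap. As a quick illustration of the underlying idea (plain Python, not lava-dl's API; the function name and the (x_min, y_min, x_max, y_max) box layout are assumptions for this sketch), intersection over union (IoU) between two axis-aligned boxes can be computed as:

    ```python
    def iou(box_a, box_b):
        """Intersection over union of two axis-aligned boxes.

        Boxes are (x_min, y_min, x_max, y_max) tuples; this layout is an
        illustrative assumption, not lava-dl's internal representation.
        """
        ax1, ay1, ax2, ay2 = box_a
        bx1, by1, bx2, by2 = box_b

        # Overlap rectangle (zero area if the boxes do not intersect).
        inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
        inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
        inter = inter_w * inter_h

        area_a = (ax2 - ax1) * (ay2 - ay1)
        area_b = (bx2 - bx1) * (by2 - by1)
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0
    ```

    Detection metrics such as mAP are built from exactly this kind of pairwise overlap between predicted and ground-truth boxes.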

  • Lava-dl NetX now lets users configure inference of fully connected layers with sparse synapses instead of the default dense synapses. This allows the network to leverage the compression offered by sparse synapses when the fully connected weights are sparse enough. It is as simple as setting sparse_fc_layer=True when initializing a netx.hdf5.Network. netx.hdf5.Network also supports global control of the spike exponent (the fractional portion of the spike message) via the spike_exp keyword. This gives users finer-grained control over network behavior and can help avoid data overflow on Loihi hardware.
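    A usage sketch of these two options (the sparse_fc_layer and spike_exp keywords come from this release; the file name and the spike_exp value of 6 are placeholders, and this fragment assumes lava-dl is installed):

    ```python
    from lava.lib.dl import netx

    # Load a trained network exported to hdf5, asking NetX to realize
    # fully connected layers with sparse synapses and fixing the spike
    # exponent globally across the network.
    net = netx.hdf5.Network(
        net_config='network.net',  # placeholder path to your exported model
        sparse_fc_layer=True,      # use sparse instead of dense synapses
        spike_exp=6,               # assumed value; tune to avoid overflow
    )
    ```

    Sparse synapses only pay off when the fully connected weights are actually sparse; for dense weight matrices the default dense synapse remains the better choice.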

    In addition, lava-dl NetX now includes sequential modules, netx.modules. These allow the creation of PyTorch-style callable constructs whose behavior is described in their forward function. They also allow non-critical but expensive management work between calls to run in a parallel thread, so that the execution flow is not blocked.
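    A minimal sketch of this pattern in generic Python (not the netx.modules implementation; all class and method names here are assumptions): a callable whose \_\_call\_\_ runs forward and then hands off housekeeping to a background thread.

    ```python
    import threading

    class CallableModule:
        """PyTorch-style callable: __call__ runs forward(), then starts
        non-critical housekeeping in a background thread so the caller
        is not blocked.  Illustrative sketch only."""

        def __init__(self):
            self._worker = None

        def forward(self, x):
            raise NotImplementedError

        def cleanup(self):
            """Expensive but non-critical management; runs off the hot path."""

        def __call__(self, x):
            # Make sure the previous call's housekeeping has finished.
            if self._worker is not None:
                self._worker.join()
            out = self.forward(x)
            self._worker = threading.Thread(target=self.cleanup)
            self._worker.start()
            return out

    class Scale(CallableModule):
        """Toy module: multiplies inputs and counts cleanup runs."""

        def __init__(self, factor):
            super().__init__()
            self.factor = factor
            self.cleanups = 0

        def forward(self, x):
            return [v * self.factor for v in x]

        def cleanup(self):
            self.cleanups += 1
    ```

    The join-before-forward step keeps calls sequentially consistent while still overlapping housekeeping with whatever the caller does next.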

    netx.modules.Quantize and netx.modules.Dequantize are now pre-built to allow for consistent quantization and dequantization to/from the fixed precision representation in the NetX network. Their usage can be seen in the YOLO SDNN inference on Lava and Loihi tutorial.
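    Conceptually (a sketch, not the pre-built netx.modules.Quantize/Dequantize implementations; class signatures and the exponent value are assumptions), the pair brackets the fixed-precision network so that callers can work entirely in floating point:

    ```python
    class Quantize:
        """Map floating-point inputs to the fixed-point integers a
        fixed-precision network expects.  Sketch only."""

        def __init__(self, exp=6):
            self.scale = 1 << exp  # 2**exp fraction steps per unit

        def __call__(self, x):
            return [int(round(v * self.scale)) for v in x]

    class Dequantize:
        """Map fixed-point network outputs back to floating point."""

        def __init__(self, exp=6):
            self.scale = 1 << exp

        def __call__(self, x):
            return [v / self.scale for v in x]

    # Typical bracket around a fixed-precision network `net`:
    #   y = Dequantize(exp)(net(Quantize(exp)(x)))
    ```

    Using the same exponent on both ends keeps the round trip consistent with the network's internal representation.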

Bug Fixes and Other Changes

  • Lava-dl SLAYER is now compatible with PyTorch 2.0, allowing users to take advantage of Torch 2.0+ features.
  • Fixes have been included in lava-dl SLAYER that enable hdf5 export of affine blocks and proper handling of out-of-bound delays during hdf5 export.

Breaking Changes

  • No breaking changes in this release.

Known Issues

  • No known issues in this release.

New Contributors

Full Changelog: v0.4.0...v0.5.0