Condense


Description

This Python package is the result of my bachelor thesis (https://github.com/SirBubbls/pruning-ba). It provides pruning operations for artificial neural network models.

Maintainers

  • @SirBubbls

Installation

  • pip install condense (stable release from PyPI)
  • python -m pip install git+https://github.com/SirBubbls/condense.git (latest development version)

Usage

Please refer to the official docsify documentation for a detailed guide.

API documentation is also available.

Using the Keras Compatibility Module (condense.keras)

import condense
import keras

# Load your model
model = keras.models.load_model('...')

# Apply the PruningWrapper automatically to all possible layers of the model
pruned = condense.keras.wrap_model(model,
                                   # constant 50% sparsity target
                                   condense.optimizer.sparsity_functions.Constant(0.5))

# You need to recompile your model after pruning it
pruned.compile('adam', 'mse')

# Either train your model from scratch or one-shot prune it.
# In both cases you need to call fit(): it triggers the PruningCallback,
# which invokes the pruning operation of each wrapped layer.
pruned.fit(data_generator,
           epochs=30,  # use 1 epoch for a rough one-shot approach
           steps_per_epoch=1,
           callbacks=[condense.keras.PruningCallback()])  # Important

# weights are now pruned
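
To verify that pruning actually took effect, you can count the zero-valued parameters directly. This is a minimal sketch in plain Keras/numpy, not part of the condense API; if the wrapper keeps its masks as extra weight tensors, the reported number is only approximate.

import numpy as np

# Global fraction of parameters that are exactly zero after pruning
weights = pruned.get_weights()
zeros = sum(np.sum(w == 0) for w in weights)
total = sum(w.size for w in weights)
print(f'sparsity: {zeros / total:.1%}')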

Simple one-shot pruning (condense.one_shot)

import condense
import keras

# Load your model
model = keras.models.load_model('...')

# Prune the model with a 30% sparsity target
pruned = condense.one_shot(model, 0.3)

# weights are now pruned
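
Because one-shot pruning removes weights without any retraining, it is worth measuring the accuracy cost. A short sketch, assuming the model is already compiled and test_generator is your evaluation data (both placeholders, not part of condense); recompiling afterwards mirrors the wrap_model example and may not be strictly necessary here.

# Score before and after one-shot pruning
baseline = model.evaluate(test_generator)

pruned = condense.one_shot(model, 0.3)
pruned.compile('adam', 'mse')  # recompile after pruning, as in the wrap_model example

print('before:', baseline, 'after:', pruned.evaluate(test_generator))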

Automated pruning with condense.keras.Trainer

A more sophisticated approach is to first train and prune a model M. After this first training run, the model is reset to its initial parameter configuration and the sparsity mask from step one is applied. The resulting smaller network P ⊂ M is then trained on the same data and should yield better results than the original network.

This is an implementation of the lottery ticket hypothesis (arXiv.org).
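
Conceptually, the Trainer automates a loop like the one below. This is only an illustrative sketch of the idea in plain Keras/numpy, assuming simple magnitude-based pruning; it is not the actual Trainer implementation, and names like initial and masks are local variables, not condense API.

import numpy as np

initial = model.get_weights()          # snapshot the initial parameters
model.fit(train_generator, epochs=50)  # 1) train the dense model M

# 2) build binary masks keeping the largest-magnitude 20% of each tensor
#    (for simplicity every tensor is masked here, including biases)
masks = []
for w in model.get_weights():
    threshold = np.quantile(np.abs(w), 0.8)  # 80% sparsity target
    masks.append((np.abs(w) >= threshold).astype(w.dtype))

# 3) rewind to the initial parameters and apply the masks, yielding P
model.set_weights([w * m for w, m in zip(initial, masks)])

# 4) retrain P; in a real run the masks must be re-applied after every
#    update step, which is what the PruningCallback takes care of
model.fit(train_generator, epochs=50)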

import keras
import condense

# Prints out information about the training process
import logging
condense.logger.setLevel(logging.INFO)

model = ...
train_generator = ...  # Train data
test_generator = ...  # Test data

# the target sparsity is 80% for training
trainer = condense.keras.Trainer(model, 0.8)

trainer.train(train_generator,
              epochs=50,
              steps_per_epoch=2,
              eval_data=test_generator)

pruned_model = trainer.training_model  # the model used during training, now pruned
masks = trainer.mask                   # sparsity masks found during step one
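
The masks can be used to see how much of each layer was removed. A sketch, under the assumption that trainer.mask is a list of binary numpy arrays, one per pruned weight tensor (check the API documentation for the exact structure):

import numpy as np

# Per-tensor sparsity implied by the binary masks
for i, m in enumerate(masks):
    print(f'mask {i}: {1 - np.mean(m):.1%} of weights pruned')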