TNT

TNT is a library providing powerful dataloading, logging and visualization utilities for Python. It is closely integrated with PyTorch and is designed to enable rapid iteration with any model or training regimen.


Installation

TNT can be installed with pip. To do so, run:

pip install torchnet

If you run into issues, make sure that PyTorch is installed first.

You can also install the latest version from master. Just run:

pip install git+https://github.com/pytorch/tnt.git@master

To update to the latest version from master:

pip install --upgrade git+https://github.com/pytorch/tnt.git@master

About

TNT (imported as torchnet) is a framework for PyTorch that provides a set of abstractions aimed at encouraging code reuse and modular programming. It provides powerful dataloading, logging, and visualization utilities.

The project was inspired by TorchNet, and legend says the name stood for “TorchNetTwo”. Since the deprecation of TorchNet, TNT has developed on its own.

For example, TNT provides simple methods to record model performance in the torchnet.meter module and to log them to Visdom (or, in the future, TensorboardX) with the torchnet.logging module.
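As a hedged sketch of what that looks like (using AverageValueMeter from the list of ported meters below; the exact logger class names vary between TNT versions, so only the meter is shown):

import torch
from torchnet.meter import AverageValueMeter

loss_meter = AverageValueMeter()
for _ in range(10):
    loss = torch.rand(1).item()   # stand-in for a real training loss
    loss_meter.add(loss)

mean, std = loss_meter.value()    # running mean and standard deviation
print(f"loss: {mean:.4f} +/- {std:.4f}")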

In the future, TNT will also provide strong support for multi-task learning and transfer learning applications. It currently supports joint training data loading through torchnet.utils.MultiTaskDataLoader.

Most of the modules support NumPy arrays as well as PyTorch tensors as input, and so could potentially be used with other frameworks.
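For instance, a ClassErrorMeter (listed below) can be fed prediction scores and targets as tensors; the sketch below also assumes NumPy inputs are converted internally, which is worth verifying for the specific meter you use:

import numpy as np
import torch
from torchnet.meter import ClassErrorMeter

meter = ClassErrorMeter(topk=[1])
scores = torch.tensor([[0.9, 0.1], [0.2, 0.8]])   # N x K class scores
targets = torch.tensor([0, 1])                    # ground-truth class indices
meter.add(scores, targets)

# Same call with NumPy inputs (assumed to be converted internally):
meter.add(np.array([[0.7, 0.3]]), np.array([1]))
print(meter.value())                              # top-1 classification error (%)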

Getting Started

See some of the examples in https://github.com/pytorch/examples. We would like to include some walkthroughs in the docs (contributions welcome!).

[LEGACY] Differences with the Lua version

What's been ported so far:

  • Datasets:
    • BatchDataset
    • ListDataset
    • ResampleDataset
    • ShuffleDataset
    • TensorDataset [new]
    • TransformDataset
  • Meters:
    • APMeter
    • mAPMeter
    • AverageValueMeter
    • AUCMeter
    • ClassErrorMeter
    • ConfusionMeter
    • MovingAverageValueMeter
    • MSEMeter
    • TimeMeter
  • Engines:
    • Engine
  • Logger
    • Logger
    • VisdomLogger
    • MeterLogger [new, makes it easy to plot multiple meters via Visdom]

Any dataset can now be plugged into torch.utils.data.DataLoader, or you can call .parallel(num_workers=8) on it to take advantage of multiprocessing.
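A minimal sketch of the DataLoader route, assuming torchnet.dataset.ListDataset(elem_list, load) lazily applies load to each element of a plain Python list:

import torch
from torch.utils.data import DataLoader
from torchnet.dataset import ListDataset

# Wrap a list of indices; `load` turns each one into a tensor on access.
dataset = ListDataset(list(range(100)), load=lambda i: torch.tensor([float(i)]))

# The standard PyTorch DataLoader handles batching, shuffling, and workers.
loader = DataLoader(dataset, batch_size=8, shuffle=True)
for batch in loader:
    pass  # training step would go here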
