Releases · danielegrattarola/spektral
v1.3
v1.2
This release brings some new features and improvements.
New features
- New convolutional layer CensNetConv
- New batch-mode version of GINConv
- New pooling layers: JustBalancePool and DMoNPool
- New datasets: DBLP and Flickr
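As a quick way to try the additions, the sketch below loads one of the new datasets. It assumes DBLP (and likewise Flickr) is importable from spektral.datasets like the other built-in datasets, and the choice of SingleLoader is purely illustrative.

```python
from spektral.data import SingleLoader
from spektral.datasets import DBLP  # assumed to be exposed at the package level; Flickr works the same way

# Both new datasets are single-graph, node-classification datasets,
# so a SingleLoader is a natural way to feed them to a Keras model.
dataset = DBLP()
print(dataset[0])  # a Graph with node features, adjacency matrix, and labels

loader = SingleLoader(dataset)
# model.fit(loader.load(), steps_per_epoch=loader.steps_per_epoch, ...)
```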
Compatibility changes
- Python 3.6 is no longer officially supported
API changes
- XENetDenseConv is now called XENetConvBatch
Bugfixes
- Fix crash when using Disjoint2Batch and improve the performance of the layer
- Fix minor bug that would block kwargs forwarding in SRC layers (only affects custom layers, not the ones in the library)
- Fix preprocess method in DiffusionConv
v1.1
This release mostly introduces the new Select, Reduce, Connect API for pooling layers and a bunch of features, improvements, and bugfixes from previous patches.
Most of the new features are backward compatible, with two notable exceptions:
- Pooling layers must be ported to the new SRC interface. See the documentation for more details.
- Custom MessagePassing layers that used get_i and get_j must be updated to use get_targets and get_sources (see the sketch below). This only affects you if you have a custom implementation based on the MessagePassing class; otherwise the change will be transparent.
This version of Spektral supports Python 3.6 and above, and TensorFlow 2.2 and above.
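For reference, here is a minimal sketch of what the rename looks like in a custom layer. The layer itself (the name, the Dense transform, sum aggregation) is made up for illustration; the only part taken from this release is the switch from get_j/get_i to get_sources/get_targets.

```python
import tensorflow as tf
from spektral.layers import MessagePassing

class MySumConv(MessagePassing):
    """Illustrative custom layer: gather source-node features, sum them, transform."""

    def __init__(self, channels, **kwargs):
        super().__init__(aggregate="sum", **kwargs)
        self.dense = tf.keras.layers.Dense(channels)

    def message(self, x, **kwargs):
        # Features of the source node of every edge.
        # Before 1.1: self.get_j(x); from 1.1: self.get_sources(x)
        # (and self.get_i(x) becomes self.get_targets(x)).
        return self.get_sources(x)

    def update(self, embeddings, **kwargs):
        return self.dense(embeddings)
```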
New features
- New general class for pooling methods based on the Select, Reduce, Connect framework (https://arxiv.org/abs/2110.05292)
- Support for node-level labels in BatchLoader
- New GCN model
- GNNExplainer model
- XENetConv convolutional layer
- LaPool pooling layer
- GATConv now supports weighted adjacency matrices
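As an example of the new GCN model class, the sketch below trains it on a citation dataset with a SingleLoader; the dataset choice, hyperparameters, and the omission of train/validation masks are simplifications for illustration rather than a recommended setup.

```python
from spektral.data import SingleLoader
from spektral.datasets import Citation
from spektral.models import GCN
from spektral.transforms import GCNFilter

# Illustrative: Cora with the standard GCN preprocessing applied as a transform.
dataset = Citation("cora", transforms=[GCNFilter()])

model = GCN(n_labels=dataset.n_labels)
model.compile(optimizer="adam", loss="categorical_crossentropy")

loader = SingleLoader(dataset)
model.fit(loader.load(), steps_per_epoch=loader.steps_per_epoch, epochs=10)
```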
Compatibility changes
- Update minimum supported Python version to 3.6
- Update minimum supported TensorFlow version to 2.2
API changes
- Remove channels argument from CrystalConv (output must be the same size as input)
- All pooling layers are now based on SRC and have a unified interface. See docs for more details (migration from the old layers should be straightforward by changing the relevant keyword arguments)
- Rename "i" and "j" to "targets" and "sources" in the MessagePassing-based classes
Bugfixes
- Fix bug in GlobalAttnSumPool that caused the readout to apply attention to the full disjoint batch
- Fix parsing of QM9 to return the full 19-dimensional labels
Other
- Minor fixes in examples
- GCN/GAT examples are now more consistent with the original papers
v1.0
The 1.0 release of Spektral is an important milestone for the library and brings many new features and improvements.
If you have already used Spektral in your projects, the only major change that you need to be aware of is the new datasets API.
This is a summary of the new features and changes:
- The new Graph and Dataset containers standardize how Spektral handles data. This does not impact your models, but makes it easier to use your data in Spektral.
- The new Loader class hides away all the complexity of creating graph batches. Whether you want to write a custom training loop or use Keras' famous model-dot-fit approach, you only need to worry about the training logic and not the data (a minimal sketch is included at the end of these notes).
- The new transforms module implements a wide variety of common operations on graphs, that you can now apply() to your datasets.
- The new GeneralConv and GeneralGNN classes let you build models that are, well... general. Using state-of-the-art results from recent literature means that you don't need to worry about which layers or architecture to choose. The defaults will work well everywhere.
- New datasets: QM7 and ModelNet10/40, and a new wrapper for OGB datasets.
- Major clean-up of the library's structure and dependencies.
- New examples and tutorials.
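To make the new data API concrete, here is a minimal sketch that builds a toy Dataset of random graphs, applies a transform, and feeds the data to Keras through a BatchLoader; the toy dataset itself (sizes, random features, the ToyDataset name) is invented for illustration and is not part of the library.

```python
import numpy as np
from spektral.data import BatchLoader, Dataset, Graph
from spektral.transforms import GCNFilter

class ToyDataset(Dataset):
    """Invented example: 100 small random graphs with a binary graph-level label."""

    def read(self):
        graphs = []
        for _ in range(100):
            n = np.random.randint(5, 10)
            x = np.random.rand(n, 3)                           # node features
            a = np.random.randint(0, 2, (n, n)).astype(float)  # adjacency matrix
            y = np.array([float(np.random.randint(0, 2))])     # graph label
            graphs.append(Graph(x=x, a=a, y=y))
        return graphs

dataset = ToyDataset()
dataset.apply(GCNFilter())  # a transform from spektral.transforms, applied to every graph

loader = BatchLoader(dataset, batch_size=8)
# The loader yields zero-padded (x, a), y batches that can go straight to Keras:
# model.fit(loader.load(), steps_per_epoch=loader.steps_per_epoch)
```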