This is a concise implementation of a Graph Convolutional Network (GCN) for educational purposes, using NumPy.
- **Adjacency Matrix** ($A$): Represents the graph structure, where $A_{ij} = 1$ if there is an edge between nodes $i$ and $j$, and $A_{ij} = 0$ otherwise.
- **Input Feature Matrix** ($X$): Represents node features, where each row corresponds to a node and each column corresponds to a feature.
- **Weight Matrices** ($W^{(l)}$): Initialize weight matrices for each layer $l$ of the GCN.
- **Bias Vectors** ($b^{(l)}$, optional): Optionally, initialize bias vectors for each layer.
- **Normalized Graph Laplacian** ($\hat{A}$): Compute the normalized graph Laplacian: $\hat{A} = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}$, where $\tilde{A} = A + I$ and $\tilde{D}$ is the diagonal degree matrix of $\tilde{A}$.
- **Graph Convolution Operation**: Compute the node representation matrix at layer $l+1$: $H^{(l+1)} = \sigma\left(\hat{A} H^{(l)} W^{(l)} + b^{(l)}\right)$, with $H^{(0)} = X$, where $\sigma$ is the activation function (see the NumPy sketch after this list).
- **Loss Function**: Compute the loss function for node classification (in our case, Categorical Cross-Entropy): $\mathcal{L} = -\sum_{i \in \mathcal{Y}_L} \sum_{c} Y_{ic} \ln \hat{Y}_{ic}$, where $\mathcal{Y}_L$ is the set of labelled nodes.
- **Gradient Computation**: Compute the gradients of the loss function with respect to the model parameters using backpropagation, e.g. $\frac{\partial \mathcal{L}}{\partial W^{(l)}}$ and $\frac{\partial \mathcal{L}}{\partial b^{(l)}}$.
- **Parameter Update**: Update the model parameters using gradient descent or another optimization algorithm.
- **Iteration**: Repeat the graph convolution, loss computation, gradient computation, and parameter update steps iteratively until convergence or for a fixed number of epochs.
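
To make these steps concrete, here is a minimal NumPy sketch of a two-layer GCN trained with plain gradient descent. It assumes dense matrices, a random toy graph, and a boolean mask over the labelled nodes; the names (`normalize_adjacency`, `W1`, `W2`, ...) are illustrative and do not correspond one-to-one to the code in `train.py`.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization with self-loops: D~^{-1/2} (A + I) D~^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def softmax(Z):
    e = np.exp(Z - Z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def masked_cross_entropy(P, Y, mask):
    """Categorical cross-entropy averaged over the labelled nodes only."""
    return -np.mean(np.sum(Y[mask] * np.log(P[mask] + 1e-12), axis=1))

# --- toy setup (shapes only; the repo trains on the real Karate Club graph) ---
rng = np.random.default_rng(0)
n, f, h, c = 34, 34, 16, 4                       # nodes, features, hidden units, classes
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.triu(A, 1)
A = A + A.T                                      # random symmetric toy graph
X = np.eye(n)                                    # identity node features
Y = np.eye(c)[rng.integers(0, c, n)]             # one-hot toy labels
mask = np.zeros(n, dtype=bool)
mask[:4] = True                                  # pretend only 4 nodes are labelled

A_hat = normalize_adjacency(A)
W1 = rng.normal(0.0, 0.1, (f, h))
W2 = rng.normal(0.0, 0.1, (h, c))
lr = 0.1

for epoch in range(200):
    # Forward pass: H1 = ReLU(A_hat X W1),  Z = softmax(A_hat H1 W2)
    H1_pre = A_hat @ X @ W1
    H1 = np.maximum(H1_pre, 0.0)
    Z = softmax(A_hat @ H1 @ W2)
    loss = masked_cross_entropy(Z, Y, mask)

    # Backpropagation (gradients flow only from the labelled nodes)
    dLogits = np.zeros_like(Z)
    dLogits[mask] = (Z[mask] - Y[mask]) / mask.sum()
    dW2 = (A_hat @ H1).T @ dLogits
    dH1 = A_hat.T @ dLogits @ W2.T
    dW1 = (A_hat @ X).T @ (dH1 * (H1_pre > 0))

    # Gradient-descent parameter update
    W1 -= lr * dW1
    W2 -= lr * dW2

    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  loss {loss:.4f}")
```

The symmetric normalization with self-loops and the layer-wise propagation rule follow Kipf & Welling (2016); the random graph is only there to keep the snippet self-contained.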
- Clone the repo

```bash
$ git clone https://github.com/HamzaGbada/GCN-Numpy.git
```

- Install the required libraries

```bash
$ pip install -r requirements.txt
```

- Train

```bash
$ python train.py
```
We used PyTorch Geometric to load Zachary's Karate Club graph dataset.
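
For reference, loading that dataset and converting it to dense NumPy arrays might look like the sketch below; this is generic PyTorch Geometric usage, not necessarily the exact code in `train.py`.

```python
import numpy as np
from torch_geometric.datasets import KarateClub
from torch_geometric.utils import to_dense_adj

# Zachary's Karate Club: 34 nodes with identity features and community labels.
data = KarateClub()[0]

A = to_dense_adj(data.edge_index)[0].numpy()    # dense adjacency matrix (34 x 34)
X = data.x.numpy()                              # node feature matrix
y = data.y.numpy()                              # community labels
train_mask = data.train_mask.numpy()            # labelled nodes used for training

print(A.shape, X.shape, y.shape, train_mask.sum())
```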
- Kipf, Thomas N. & Welling, Max (2016). Semi-Supervised Classification with Graph Convolutional Networks. arXiv:1609.02907.