\documentclass[a4paper,10pt]{article}
\usepackage[utf8]{inputenc}
\usepackage[margin=3cm]{geometry}
\usepackage{amsmath}
\usepackage{amssymb}
\usepackage{amsthm}
\usepackage{fancyhdr}
\usepackage{seminar}
\usepackage{graphicx}
% \usepackage{subfigure}
\usepackage{float}
\usepackage{hyperref}
\usepackage[round]{natbib} % omit 'round' option if you prefer square brackets
\usepackage{hhline}
\usepackage{multirow}
\usepackage{array}
\newcommand{\PreserveBackslash}[1]{\let\temp=\\#1\let\\=\temp}
\newcolumntype{C}[1]{>{\PreserveBackslash\centering}p{#1}}
\usepackage{mathrsfs}
\usepackage{commath}
\usepackage{mathtools}
\usepackage{subcaption}
\usepackage{appendix}
\DeclareMathOperator*{\argmax}{arg\,max}
\DeclareMathOperator*{\argmin}{arg\,min}
\pagestyle{fancy}
%You can add theorem-like environments (e.g. remark, definition, ...) if you want
\newtheorem{theorem}{Theorem}
\title{Learning Graph Distances} % Replace with your title
% \newcommand{\shorttitle}{\title}
% \shorttitle{Learning graph distances}
\author{Johannes Pitz} % Replace with your name
\institute{\textit{Guided Research: Data Analytics and Machine Learning Group \protect\\ TUM Department of Informatics}}
\makeatletter
\let\runauthor\@author
\let\runtitle\@title
\makeatother
\lhead{\runauthor}
\rhead{\runtitle}
\begin{document}
\title{Learning Graph Distances \protect\\ via Graph Neural Networks and Node Matching}
\maketitle
\begin{abstract}
Computing meaningful distances between graphs is often difficult due to the combinatorial explosion of possible transformations on graphs. Recently, researchers have started using graph neural networks in an attempt to replace classical search algorithms with learning methods. However, these learning methods often rely on fixed-size graph embeddings, which prevent them from scaling to larger graphs. We propose to use soft matchings between nodes to overcome this problem. Our novel combination of a classical cost matrix, motivated by bipartite graph matching algorithms, and the modern Sinkhorn distance shows state-of-the-art results for predicting the ubiquitous graph edit distance. Additionally, we report strong empirical results on graph matching with the same model. Our implementation\footnote{\url{https://gitlab.lrz.de/gdn/graph-distance}} efficiently handles sparse inputs and large real-world graphs.
\end{abstract}
\input{01_introduction.tex}
\section{Background}
Let $\mathcal{G}$ be a set of graphs, where each graph $G = (V, E) \in \mathcal{G}$ with nodes $V$ and edges $E$ is drawn from a common distribution. The nodes and edges may carry arbitrary features. Let $N$ denote the number of nodes $\vert V \vert$. Usually, we expect the number of edges $\vert E \vert$ to be much smaller than the number of possible edges, $N^2$. Possible distributions include chemical molecules, control flow graphs, or synthetic preferential attachment graphs (\citealp{pref_att2002}).

Graph neural networks based on the message passing scheme (\citealp{gilmer2017}) aggregate, for each node $i$, the features of its neighbors $j \in \mathcal{N}_1(i)$ together with the corresponding edge features $e_{j,i}$. The aggregate $a_i^l$ is then used to update the node feature $h_i^{l-1}$ to obtain $h_i^l$ in layer $l$:
\begin{equation}
a_i^l = \operatorname{Aggregate}^l\left(\{\!\!\{(h_j^{l-1}, e_{j,i}) : j \in \mathcal{N}_1(i)\}\!\!\}\right), \quad
h_i^l = \operatorname{Update}^l\left( h_i^{l-1}, a_i^l \right)
\end{equation}
Here $h_i^0 = v_i \in V$ and $\{\!\!\{ \dots \}\!\!\}$ denotes a multiset.
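As a minimal illustration only, the following plain-Python sketch implements one such layer under simplifying assumptions (sum aggregation and a toy additive message $h_j + e_{j,i}$; in actual message passing networks, $\operatorname{Aggregate}^l$ and $\operatorname{Update}^l$ are typically learned functions):
\begin{verbatim}
# Minimal sketch of one message-passing layer (assumptions: sum
# aggregation, toy additive message h_j + e_ji; the real Aggregate
# and Update functions are typically learned neural networks).
def message_passing_layer(h, edges, update):
    # h:      dict node -> feature list, i.e. h_i^{l-1}
    # edges:  list of (j, i, e_ji) triples; messages flow j -> i
    # update: callable implementing Update^l(h_i^{l-1}, a_i^l)
    agg = {i: [0.0] * len(feat) for i, feat in h.items()}
    for j, i, e_ji in edges:
        msg = [hj + ej for hj, ej in zip(h[j], e_ji)]  # message from j to i
        agg[i] = [a + m for a, m in zip(agg[i], msg)]  # permutation-invariant sum
    return {i: update(h[i], agg[i]) for i in h}        # h_i^l for every node
\end{verbatim}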
\input{01c_related_work.tex}
\input{01b_derivation.tex}
\input{02_setup.tex}
\input{03_results.tex}
\input{03b_graph_matching.tex}
\input{04_conclusion.tex}
\input{05_outlook.tex}
% \newpage
\bibliographystyle{plainnat}
\bibliography{egbib}
% \cite -> Name (year)
% \citealp -> Name, year
\newpage
\input{05_appendix.tex}
\end{document}