TY - GEN
T1 - DyGRAIN: An Incremental Learning Framework for Dynamic Graphs
T2 - 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
AU - Kim, Seoyoon
AU - Yun, Seongjun
AU - Kang, Jaewoo
N1 - Funding Information:
This work was funded by the National Research Foundation of Korea (NRF-2020R1A2C3010638), the Ministry of Health & Welfare, Republic of Korea (HR20C0021), and the ICT Creative Consilience program (IITP-2021-0-01819).
Publisher Copyright:
© 2022 International Joint Conferences on Artificial Intelligence. All rights reserved.
PY - 2022
Y1 - 2022
AB - Graph-structured data provide a powerful representation of complex relations or interactions. Many variants of graph neural networks (GNNs) have emerged to learn graph-structured data whose underlying graphs are static, although graphs in many real-world applications are dynamic (e.g., their structure evolves). To account for the dynamic nature of graphs that change over time, the need to apply incremental learning (i.e., continual learning or lifelong learning) to the graph domain has been emphasized. However, unlike incremental learning on Euclidean data, graph-structured data contain dependencies between existing nodes and newly appearing nodes, so the receptive fields of existing nodes vary with new inputs (e.g., nodes and edges). In this paper, we identify time-varying receptive fields as a crucial challenge of incremental learning for dynamic graphs, and propose a novel incremental learning framework, DyGRAIN, to mitigate time-varying receptive fields and catastrophic forgetting. Specifically, our method incrementally learns dynamic graph representations by reflecting influential changes in the receptive fields of existing nodes and retaining previous knowledge of informative nodes that are prone to be forgotten. Our experiments on large-scale graph datasets demonstrate that our method improves performance by effectively capturing pivotal nodes and preventing catastrophic forgetting.
UR - http://www.scopus.com/inward/record.url?scp=85137901159&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85137901159
T3 - IJCAI International Joint Conference on Artificial Intelligence
SP - 3157
EP - 3163
BT - Proceedings of the 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
A2 - De Raedt, Luc
PB - International Joint Conferences on Artificial Intelligence
Y2 - 23 July 2022 through 29 July 2022
ER -