Title: Deep Representation Learning on Graph Structured Data: A Scalable and Hierarchical Perspective
Host: Steven Zucker
The ubiquity of graph structures in science and industry necessitates effective and efficient machine learning models that can capture the underlying inductive biases of relational data. My research aims to learn deep representations that leverage the highly complex connectivity information of the graph structure, and to utilize these representations in scientific domains to make predictions at the level of nodes, links, subgraphs, and entire graphs. My research contributes fundamental pieces of graph neural networks. In this talk I will focus on effective graph neural network architectures that scale to billions of edges, as well as the use of hierarchy as an inductive bias for learning highly expressive representations of nodes and graphs.
Rex Ying is a 5th-year PhD student in the StatsML group at Stanford University, advised by Jure Leskovec. His research spans machine learning on graph-structured data, with applications to physical simulations, social networks, and knowledge graphs. He pioneered key methods in the field of graph neural networks, including GraphSAGE and DiffPool. Recently, he collaborated with DeepMind to use graph neural networks to learn realistic, large-scale physical simulations. He also developed the first billion-scale graph embedding service at Pinterest. He has served as a program committee member for AAAI, ICML, NeurIPS, KDD, and WWW for 4 years, and organized the Graph Representation Learning workshop at ICML 2020.