Seminar by Dr. Aleksandar Pavlovic (Online)
- 【Date&Time】
- 21 January 2026 (Wednesday), 14:30–15:30
Admission Free, No Booking Necessary
- 【Place】
- Zoom
Link: https://us06web.zoom.us/j/83686684175?pwd=44swcgdSFIfxUVlGpzacNI2RzawArD.1
Meeting ID: 836 8668 4175
Passcode: 312492
- 【Speaker】
- Aleksandar Pavlovic
- 【Title】
- Reshaping Graph Learning: Expressivity, Inductive Reasoning, and Optimization
- 【Abstract】
Knowledge Graphs (KGs) provide a powerful abstraction for modeling complex networked systems, from financial ownership structures to protein–gene interaction networks. Unlocking insights from these graphs is essential for vital applications such as money-laundering detection and drug discovery. However, KGs are inherently incomplete, hindering the immediate utilization of their stored knowledge. Therefore, substantial research has been directed toward Knowledge Graph Completion (KGC), i.e., predicting missing triples from the information represented in the KG. While recent graph learning approaches, such as Knowledge Graph Embedding Models (KGEs) and Graph Neural Networks (GNNs), have achieved promising results in KGC, fundamental challenges remain unresolved, especially when data is either abundant or scarce:
(P1) Scalability vs. Expressivity: In some domains, such as pharmacological network analysis, KGs can reach billions of edges. Current models struggle to balance computational efficiency with theoretical expressivity, limiting their ability to capture implicit patterns and reason over complex rule sets.
(P2) Knowledge Transfer vs. Interpretability: In other domains, such as microbiome community linking, the data available for any single sample is sparse. This requires training on one graph and performing inference on another, disjoint graph, a setting known as inductive KGC. However, most models that offer a geometric interpretation of their parameters do not support inductive KGC.
In this talk, I will explore how these challenges translate into embedding model design, and present approaches that push the frontier of expressivity and interpretability. We will discuss (1) ExpressivE and SpeedE, which were the first KGEs to jointly capture hierarchy and composition rules, and (2) ReshufflE, a GNN-based approach that additionally captures arbitrary sets of closed-path rules, while supporting inductive link prediction. These models combine geometric interpretability with GPU-efficient formulations, opening new directions for scalable reasoning and robust inference in high- and low-data regimes. Finally, I will outline open problems at the intersection of optimization, symbolic reasoning, and graph learning, including (currently missed) opportunities for convex optimization.
Bio: Aleksandar Pavlovic completed his PhD in Computer Science at TU Wien in 2024. Following a research position at the University of Applied Sciences Wiener Neustadt, he is currently a senior researcher at the University of Applied Sciences Campus Wien. His current research focuses on scalable, efficient, and explainable methods for knowledge graph completion.


