Publications
Publications by categories in reversed chronological order. Generated by jekyll-scholar.
2025
- [Preprint] Predicting Dynamical Systems across Environments via Diffusive Model Weight Generation. Ruikun Li, Huandong Wang, Jingtao Ding, and 3 more authors. arXiv preprint arXiv:2505.13919, 2025.
Data-driven methods offer an effective equation-free solution for predicting physical dynamics. However, the same physical system can exhibit significantly different dynamic behaviors in various environments, causing prediction functions trained for specific environments to fail when transferred to unseen ones. Cross-environment prediction therefore requires modeling the dynamic functions of different environments. In this work, we propose EnvAd-Diff, a model weight generation method. EnvAd-Diff operates in the weight space of the dynamic function, generating suitable weights from scratch based on environmental conditions for zero-shot prediction. Specifically, we first train expert prediction functions on dynamic trajectories from a limited set of visible environments to create a model zoo, thereby constructing sample pairs of prediction function weights and their corresponding environments. Subsequently, we train a latent-space diffusion model conditioned on the environment to model the joint distribution of weights and environments. Considering the lack of environmental prior knowledge in real-world scenarios, we propose a physics-informed surrogate label to distinguish different environments. Generalization experiments across multiple systems demonstrate that a 1M-parameter prediction function generated by EnvAd-Diff outperforms a pre-trained 500M-parameter foundation model.
@article{li2025predicting3,
  title = {Predicting Dynamical Systems across Environments via Diffusive Model Weight Generation},
  author = {Li, Ruikun and Wang, Huandong and Ding, Jingtao and Yuan, Yuan and Liao, Qingmin and Li, Yong},
  journal = {arXiv preprint arXiv:2505.13919},
  year = {2025},
}
- Predicting the Energy Landscape of Stochastic Dynamical System via Physics-informed Self-supervised Learning. Ruikun Li, Huandong Wang, Qingmin Liao, and 1 more author. ICLR, 2025.
Energy landscapes play a crucial role in shaping dynamics of many real-world complex systems. System evolution is often modeled as particles moving on a landscape under the combined effect of energy-driven drift and noise-induced diffusion, where the energy governs the long-term motion of the particles. Estimating the energy landscape of a system has been a longstanding interdisciplinary challenge, hindered by the high operational costs or the difficulty of obtaining supervisory signals. Therefore, the question of how to infer the energy landscape in the absence of true energy values is critical. In this paper, we propose a physics-informed self-supervised learning method to learn the energy landscape from the evolution trajectories of the system. It first maps the system state from the observation space to a discrete landscape space by an adaptive codebook, and then explicitly integrates energy into the graph neural Fokker-Planck equation, enabling the joint learning of energy estimation and evolution prediction. Experimental results across interdisciplinary systems demonstrate that our estimated energy has a correlation coefficient above 0.9 with the ground truth, and evolution prediction accuracy exceeds the baseline by an average of 17.65%.
@article{li2025predicting,
  title = {Predicting the Energy Landscape of Stochastic Dynamical System via Physics-informed Self-supervised Learning},
  author = {Li, Ruikun and Wang, Huandong and Liao, Qingmin and Li, Yong},
  journal = {ICLR},
  year = {2025},
}
- [KDD] Predicting the Dynamics of Complex System via Multiscale Diffusion Autoencoder. Ruikun Li, Jingwen Cheng, Huandong Wang, and 2 more authors. KDD, 2025.
Predicting the dynamics of complex systems is crucial for various scientific and engineering applications. The accuracy of predictions depends on the model’s ability to capture the intrinsic dynamics. While existing methods capture key dynamics by encoding a low-dimensional latent space, they overlook the inherent multiscale structure of complex systems, making it difficult to accurately predict complex spatiotemporal evolution. Therefore, we propose a Multiscale Diffusion Prediction Network (MDPNet) that leverages the multiscale structure of complex systems to discover the latent space of intrinsic dynamics. First, we encode multiscale features through a multiscale diffusion autoencoder to guide the diffusion model for reliable reconstruction. Then, we introduce an attention-based graph neural ordinary differential equation to model the co-evolution across different scales. Extensive evaluations on representative systems demonstrate that the proposed method achieves an average prediction error reduction of 53.23% compared to baselines, while also exhibiting superior robustness and generalization.
@article{li2025predicting2,
  title = {Predicting the Dynamics of Complex System via Multiscale Diffusion Autoencoder},
  author = {Li, Ruikun and Cheng, Jingwen and Wang, Huandong and Liao, Qingmin and Li, Yong},
  journal = {KDD},
  year = {2025},
}
- [Preprint] MLLM-based Discovery of Intrinsic Coordinates and Governing Equations from High-Dimensional Data. Ruikun Li, Yan Lu, Shixiang Tang, and 2 more authors. arXiv preprint arXiv:2505.11940, 2025.
Discovering governing equations from scientific data is crucial for understanding the evolution of systems, and is typically framed as a search problem within a candidate equation space. However, the high-dimensional nature of dynamical systems leads to an exponentially expanding equation space, making the search process extremely challenging. The visual perception and pre-trained scientific knowledge of multimodal large language models (MLLMs) hold promise for providing effective navigation in high-dimensional equation spaces. In this paper, we propose a zero-shot MLLM-based method for automatically discovering physical coordinates and governing equations from high-dimensional data. Specifically, we design a series of enhanced visual prompts to strengthen the MLLM's spatial perception. In addition, the MLLM's domain knowledge is employed to navigate the search process within the equation space. Quantitative and qualitative evaluations on two representative types of systems demonstrate that the proposed method effectively discovers the physical coordinates and equations from both simulated and real experimental data, with long-term extrapolation accuracy improved by approximately 26.96% compared to the baseline.
@article{li2025mllm,
  title = {MLLM-based Discovery of Intrinsic Coordinates and Governing Equations from High-Dimensional Data},
  author = {Li, Ruikun and Lu, Yan and Tang, Shixiang and Qi, Biqing and Ouyang, Wanli},
  journal = {arXiv preprint arXiv:2505.11940},
  year = {2025},
}
- [Preprint] Beyond Equilibrium: Non-Equilibrium Foundations Should Underpin Generative Processes in Complex Dynamical Systems. Jiazhen Liu†, Ruikun Li†, Huandong Wang, and 4 more authors. arXiv preprint arXiv:2505.18621, 2025.
This position paper argues that next-generation non-equilibrium-inspired generative models will provide the essential foundation for better modeling real-world complex dynamical systems. While many classical generative algorithms draw inspiration from equilibrium physics, they are fundamentally limited in representing systems with transient, irreversible, or far-from-equilibrium behavior. We show that non-equilibrium frameworks naturally capture non-equilibrium processes and evolving distributions. Through empirical experiments on a dynamic Printz potential system, we demonstrate that non-equilibrium generative models better track temporal evolution and adapt to non-stationary landscapes. We further highlight future directions such as integrating non-equilibrium principles with generative AI to simulate rare events, infer underlying mechanisms, and represent multi-scale dynamics across scientific domains. Our position is that embracing non-equilibrium physics is not merely beneficial but necessary for generative AI to serve as a scientific modeling tool, offering new capabilities for simulating, understanding, and controlling complex systems.
@article{liu2025beyond,
  title = {Beyond Equilibrium: Non-Equilibrium Foundations Should Underpin Generative Processes in Complex Dynamical Systems},
  author = {Liu, Jiazhen and Li, Ruikun and Wang, Huandong and Yu, Zihan and Liu, Chang and Ding, Jingtao and Li, Yong},
  journal = {arXiv preprint arXiv:2505.18621},
  year = {2025},
}
- [Preprint] Sparse Diffusion Autoencoder for Test-time Adapting Prediction of Complex Systems. Jingwen Cheng†, Ruikun Li†, Huandong Wang, and 1 more author. arXiv preprint arXiv:2505.17459, 2025.
Predicting the behavior of complex systems is critical in many scientific and engineering domains, and hinges on the model's ability to capture their underlying dynamics. Existing methods encode the intrinsic dynamics of high-dimensional observations through latent representations and predict autoregressively. However, these latent representations lose the inherent spatial structure of spatiotemporal dynamics, so the predictor cannot effectively model spatial interactions and neglects emerging dynamics during long-term prediction. In this work, we propose SparseDiff, which introduces a test-time adaptation strategy that dynamically updates the encoding scheme to accommodate emergent spatiotemporal structures during the long-term evolution of the system. Specifically, we first design a codebook-based sparse encoder, which coarsens the continuous spatial domain into a sparse graph topology. Then, we employ a graph neural ordinary differential equation to model the dynamics and guide a diffusion decoder for reconstruction. SparseDiff autoregressively predicts the spatiotemporal evolution and adjusts the sparse topological structure to adapt to emergent spatiotemporal patterns via adaptive re-encoding. Extensive evaluations on representative systems demonstrate that SparseDiff achieves an average prediction error reduction of 49.99% compared to baselines, while requiring only 1% of the spatial resolution.
@article{cheng2025sparse,
  title = {Sparse Diffusion Autoencoder for Test-time Adapting Prediction of Complex Systems},
  author = {Cheng, Jingwen and Li, Ruikun and Wang, Huandong and Li, Yong},
  journal = {arXiv preprint arXiv:2505.17459},
  year = {2025},
}
2024
- [KDD] Predicting Long-term Dynamics of Complex Networks via Identifying Skeleton in Hyperbolic Space. Ruikun Li, Huandong Wang, Jinghua Piao, and 2 more authors. KDD, 2024.
Learning complex network dynamics is fundamental for understanding, modeling, and controlling real-world complex systems. Though great efforts have been made to predict the future states of nodes on networks, the capability of capturing long-term dynamics remains largely limited, because existing methods overlook the fact that long-term dynamics in complex networks are predominantly governed by their inherent low-dimensional manifolds, i.e., skeletons. Therefore, we propose the Dynamics-Invariant Skeleton Neural Network (DiskNet), which identifies skeletons of complex networks based on the renormalization group structure in hyperbolic space to preserve both topological and dynamical properties. Specifically, we first condense complex networks with various dynamics into simple skeletons through physics-informed hyperbolic embeddings. Further, we design graph neural ordinary differential equations to capture the condensed dynamics on the skeletons. Finally, we recover the skeleton networks and dynamics to the original ones using a degree-based super-resolution module. Extensive experiments across three representative dynamics, five real-world networks, and two synthetic networks demonstrate the superior performance of the proposed DiskNet, which outperforms state-of-the-art baselines by an average of 10.18% in terms of long-term prediction accuracy.
@article{li2024predicting,
  title = {Predicting Long-term Dynamics of Complex Networks via Identifying Skeleton in Hyperbolic Space},
  author = {Li, Ruikun and Wang, Huandong and Piao, Jinghua and Liao, Qingmin and Li, Yong},
  journal = {KDD},
  year = {2024},
}
- [Preprint] Artificial intelligence for complex network: Potential, methodology and application. Jingtao Ding, Chang Liu, Yu Zheng, and 8 more authors. arXiv preprint arXiv:2402.16887, 2024.
This tutorial will explore the fascinating domain of empirical network modeling through artificial intelligence (AI) techniques, with applications across social media, web systems, and urban environments. Participants will gain valuable insights into incorporating advanced AI methods, such as graph machine learning, deep reinforcement learning, and generative models, within complex network science. The goal is to provide a comprehensive understanding of how these models can effectively represent, predict, and control empirical networked systems with heterogeneous structures and dynamic processes. The tutorial will begin by introducing essential background knowledge, outlining motivations and challenges, exploring recent methodological advances, and highlighting key applications.
@article{ding2024artificial,
  title = {Artificial intelligence for complex network: Potential, methodology and application},
  author = {Ding, Jingtao and Liu, Chang and Zheng, Yu and Zhang, Yunke and Yu, Zihan and Li, Ruikun and Chen, Hongyi and Piao, Jinghua and Wang, Huandong and Liu, Jiazhen and others},
  journal = {arXiv preprint arXiv:2402.16887},
  year = {2024},
}
2023
- [KDD] Learning slow and fast system dynamics via automatic separation of time scales. Ruikun Li, Huandong Wang, and Yong Li. KDD, 2023.
Learning the underlying slow and fast dynamics of a system is instrumental for many practical applications related to the system. However, existing approaches are limited in discovering the appropriate time scale to separate the slow and fast variables and in effectively learning their dynamics based on correct-dimensional representation vectors. In this paper, we introduce a framework that effectively learns slow and fast system dynamics in an integrated manner. We propose a novel intrinsic dimensionality (ID) driven learning method based on a time-lagged autoencoder framework to simultaneously identify appropriate time scales for separating slow and fast variables and their IDs. Further, we propose an integrated framework to concurrently learn the system's slow and fast dynamics, which can incorporate prior knowledge of time scales and IDs and model the complex coupled slow and fast variables. Extensive experimental results on two representative dynamical systems show that our proposed framework efficiently learns slow and fast system dynamics. Specifically, long-term prediction performance improves by 36% on average compared with four representative baselines. Furthermore, our proposed framework extracts interpretable slow and fast dynamics highly correlated with the known slow and fast variables in the dynamical systems.
@article{li2023learning,
  title = {Learning slow and fast system dynamics via automatic separation of time scales},
  author = {Li, Ruikun and Wang, Huandong and Li, Yong},
  journal = {KDD},
  year = {2023},
}