Update: After two and a half years at the Pennsylvania State University, our group is moving to Purdue University, West Lafayette, starting January 2026. Thank you to all the wonderful people at Penn State for the great (and continuing) research and fond memories.
ISCL members perform research at the intersection of data science, applied mathematics, and high-performance computing to enhance the understanding of complex multifidelity and multiphysics phenomena in various applications. In other words - we create science-based AI algorithms for applications in mechanical and aerospace engineering, Earth systems modeling, nuclear fusion, and more. We also run a scientific machine learning seminar series with leading researchers in our area of study - check it out here!
An overview of our various research directions may be found in the following talks (oldest first): [1], [2], [3], [4], [5], [6], [7]. Further information about our publications can be found on Google Scholar, and our software contributions are available on GitHub. ISCL has access to multiple HPC resources such as Bebop/Swing/Polaris/Aurora/Sophia (Argonne), Gilbreth/Anvil (Purdue), and Perlmutter (NERSC).
ISCL eagerly welcomes opportunities for education, collaboration, and consulting! Feel free to reach out to us with any questions.
News
- Two new members join our group in January 2026: Dr. Melissa Adrian, through the prestigious Lillian Gilbreth Postdoctoral Fellowship, and Kanad Sen as a PhD student. Welcome!
- Our paper on diffusion models for data and model fusion in atmospheric super-resolution has been accepted in IOP Machine Learning: Earth! A preprint can be found here. This is one of the first demonstrations of using diffusion models to assimilate real-world data for generating reanalysis data.
- A new preprint from our group is out, based on work led by former postdoc Dr. Xuyang Li (now faculty at UNC Charlotte) in collaboration with Prof. John Harlim at Penn State. In this preprint, we show that a novel hybrid formulation of the neural ordinary differential equation, leveraging both weak- and strong-form discretizations, leads to superior short-term accuracy and long-term statistical recovery.
- Pleased to announce a new preprint from the group based on work done by Hojin Kim. In this study, we revisit the classical discrete empirical interpolation algorithm and find that it can be very useful for interpretable deployments of neural differential equations. Learn more here. Update - this paper has been accepted to the AAAI 2026 Workshop on XAI in Singapore!
- Haiwen Guan's paper on efficient AI emulators for climate has been accepted to JAMES! Learn more here.
- See other archived news here.
Sponsors