ISCL dinner, Spring 2024. From left to right: DJ, Haiwen, Romit. Not pictured: Shivam Barwey.
Welcome to the group page of the Interdisciplinary Scientific Computing Laboratory (ISCL)!
ISCL performs research at the intersection of data science, applied mathematics, and high-performance computing to solve exciting problems in computational science. Specifically, we are interested in increasing the predictability of complex (typically dynamical) systems. Some applications we are actively working on are 1) weather and climate model development, 2) multiscale modeling for nuclear fusion, 3) improved modeling and layout optimization of wind turbines and farms, and 4) fundamental computational science and machine learning research that addresses challenges in these domains and beyond. Some algorithms we are interested in include neural ordinary differential equations (aka differentiable physics), neural operator learning, geometric deep learning, vision transformers, quantum reservoir computing and variational methods in quantum algorithms, neural density estimation, and surrogate-based optimization, data assimilation, and control. However, we put the problem before the tool, and simple solutions to complex problems are highly prized in our group!
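To give a flavor of the neural-ODE idea mentioned above, here is a minimal illustrative sketch (not code from our group): a small MLP parameterizes the vector field dh/dt = f_theta(h), and the state is rolled out in continuous time with explicit Euler steps. The weights here are random and untrained; in practice f_theta would be fit to data by differentiating through the solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP vector field f_theta: R^2 -> R^2
W1, b1 = rng.normal(size=(16, 2)) * 0.5, np.zeros(16)
W2, b2 = rng.normal(size=(2, 16)) * 0.5, np.zeros(2)

def f_theta(h):
    """Learned vector field (here with random, untrained weights)."""
    return W2 @ np.tanh(W1 @ h + b1) + b2

def integrate(h0, t0=0.0, t1=1.0, n_steps=100):
    """Integrate dh/dt = f_theta(h) from t0 to t1 with explicit Euler."""
    h, dt = h0.copy(), (t1 - t0) / n_steps
    trajectory = [h.copy()]
    for _ in range(n_steps):
        h = h + dt * f_theta(h)   # one Euler step of the learned dynamics
        trajectory.append(h.copy())
    return np.array(trajectory)

traj = integrate(np.array([1.0, 0.0]))
print(traj.shape)  # (101, 2): initial state plus 100 integration steps
```

In a differentiable-physics workflow, the same rollout would be written in an autodiff framework so that gradients of a trajectory-matching loss flow back through every solver step into the weights of f_theta.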
ISCL is housed in the Information Sciences and Technology Department at Pennsylvania State University, as well as the Mathematics and Computer Science Division at Argonne National Laboratory. Our team is composed of postdoctoral fellows, graduate students, and visiting students across these two locations (with opportunities to travel between them). A high-level overview of our research may be found in the following talks: [1], [2], [3], [4], [5]. Further information about our publications can be found on Google Scholar, and our software contributions are available on GitHub. ISCL eagerly welcomes possibilities for collaboration! Feel free to reach out to us with any questions.
Members of our group have a unique platform for impactful research with access to Argonne's state-of-the-art supercomputing resources and the ability to work on large-scale research projects of strategic importance.
News
Our new paper on using ensemble data assimilation to improve the Spalart-Allmaras RANS turbulence model has been published in Physical Review Fluids. This excellent piece of work was led by Deepinder (as a Research Aide at Argonne National Laboratory).
We are pleased to announce a new collaborative project with the EVS division of Argonne National Laboratory on Foundation Models in Surface Hydrology. This will be a collaboration with Jeremy Feinstein, Ross Alexander, Hong Zhang, and Rao Kotamarthi.
Our paper on learning chaotic dynamical systems using a novel neural ordinary differential equation approach is now on arXiv (link to paper). This work was led by Dibyajyoti Chakraborty in collaboration with Kevin Chung at Lawrence Livermore National Laboratory.
A new preprint on using the Spherical Fourier Neural Operator to build climate emulators with quantified uncertainty is out (link to paper). This work was led by Haiwen Guan and is in collaboration with Dr. Troy Arcomano (Argonne National Laboratory) and Prof. Ashesh Chattopadhyay (UC Santa Cruz). Update: This work has now been accepted at the ICML workshop on Machine Learning for Earth System Modeling (2024)!
Our paper on multifidelity PINNs is published in CMAME! See a preprint of this paper here. Thank you to Dr. Sunwoong Yang and ISCL member Hojin Kim for their work on this collaboration.
Congratulations to Dibyajyoti Chakraborty and Haiwen Guan for securing visiting researcher positions at Los Alamos National Laboratory and Argonne National Laboratory! They will work on collaborations between ISCL and these institutions on neural differential equations and climate model emulation research, respectively.
A collaboration with Prof. Aditya Grover (UCLA) and Rao Kotamarthi (Argonne National Laboratory) has led to a best paper award at the ICLR 2024 Climate Change for AI workshop! See a preprint of this paper here. We introduce a novel vision transformer-based emulator for the atmosphere.
A new paper written in collaboration with Gianmarco Mengaldo (NUS), Oliver Schmidt (UCSD), and Marcin Rogowski, Matteo Parsani, and Lisandro Dalcin (KAUST) has been published in Computer Physics Communications. Read our preprint here.
Our paper on the use of the Quantum Approximate Optimization Algorithm to build reduced-order models of fluid flows is published in the Journal of Computational Physics! This work was done in collaboration with Argonne National Laboratory and the University of Glasgow. Read more here.
The research manifesto of the Interdisciplinary Scientific Computing Laboratory.
Grand challenge problems
Improving geophysical forecast models with data science. We are building models to forecast geophysical variables using scientific machine learning (credit: Troy Arcomano, collaborator).
Tokamak disruption mitigation for nuclear fusion (image taken from Boozer et al., 2012). We are building closures and surrogate models to accelerate the computation of complex nuclear fusion reactors (i.e., system-level simulations).
Scientific ML algorithm development
An algorithm that builds deep learning function approximation for sparse, unstructured, and time-varying observations (Nature Machine Intelligence 3 (11), 945-951, 2021).
A novel graph neural network architecture that not only makes predictions but also identifies salient physical features in an interpretable latent space (credit: Shivam Barwey, postdoc).
Applied machine learning
Building a wind-turbine wake model for on-shore wind farms in the Texas panhandle using LIDAR and meteorological data (Neural Computing and Applications 34 (8), 6171-6186, 2022).
Characterizing the schooling of fish using generative machine learning. Here, the variation in the density of fish is captured by an optimal transport based forecast model (credit: Jonah Botvinick-Greenhouse, visiting graduate student).
High performance heterogeneous computing
Building a scalable and reproducible ecosystem for scientific machine learning research (Journal of Computational Science 62 (2022): 101750).