Welcome to the group page of the Interdisciplinary Scientific Computing Laboratory (ISCL). ISCL performs research at the intersection of data science, applied mathematics, and high-performance computing to solve grand challenge problems in computational science. ISCL is housed in the Information Science and Technology Department at Pennsylvania State University, University Park, as well as at Argonne National Laboratory in the suburbs of Chicago. Our team is composed of postdoctoral fellows, graduate students, and visiting students across these two locations (with opportunities to travel between them). A high-level overview of our research may be found in the following talks: [1], [2], [3], [4]. Further information about our publications can be found on Google Scholar, and our software contributions are available on GitHub.
Members of our group have a unique platform for impactful research, with access to Argonne's state-of-the-art supercomputing resources and the opportunity to work on large-scale research projects of strategic importance.
Romit will be giving a plenary talk about his experiences in interdisciplinary research at the Oklahoma State University Mechanical and Aerospace Engineering Graduate Research Symposium on March 24, 2023.
Our work on multifidelity reinforcement learning for airfoil shape optimization has been published in the Journal of Computational Physics. Congrats to our summer interns Sahil Bhola and Suraj Pawar! Read more here.
Our workshop paper on the practical implications of invariant-preserving geometric deep learning for computational fluid dynamics has been published at the ICLR Physics4ML workshop. Congratulations to Varun Shankar and Shivam Barwey!
The research manifesto of the Interdisciplinary Scientific Computing Laboratory.
Grand challenge problems
Improving geophysical forecast models with data science. We are building models to forecast variables (such as the daytime maximum temperature shown here) using scientific machine learning.
Tokamak disruption mitigation for nuclear fusion (image taken from Boozer et al., 2012). We are building closures and surrogate models to accelerate simulations of complex nuclear fusion reactors (i.e., system-level simulations).
Scientific ML algorithm development
An algorithm that builds deep learning function approximations from sparse, unstructured, and time-varying observations (Nature Machine Intelligence 3 (11), 945-951, 2021).
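To give a flavor of the general idea, here is a minimal sketch of one common way to fit a continuous field to sparse, unstructured, time-varying observations: a coordinate-based neural network that maps space-time coordinates to the observed quantity. This is not the published algorithm; the data, architecture, and hyperparameters below are illustrative assumptions.

```python
# Generic coordinate-based regression sketch (NOT the published algorithm):
# fit a small MLP that maps space-time coordinates (x, y, t) to an observed
# quantity, using only a scattered set of sensor readings.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical sparse, unstructured, time-varying observations:
# 500 samples at random locations and times, with synthetic values.
coords = torch.rand(500, 3)  # columns: x, y, t in [0, 1]
values = torch.sin(4 * coords[:, :1]) * torch.cos(3 * coords[:, 1:2]) + 0.1 * coords[:, 2:3]

model = nn.Sequential(
    nn.Linear(3, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(coords), values)
    loss.backward()
    optimizer.step()

# Once trained, the network can be queried anywhere, not just at sensor sites.
query = torch.tensor([[0.5, 0.5, 0.75]])
print(model(query))
```

Because the network takes raw coordinates as input, no mesh or regular grid is required, which is what makes this style of model attractive for unstructured sensor data.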
A novel graph neural network architecture that not only makes predictions but also identifies salient physical features in an interpretable latent space (credit Shivam Barwey - postdoc).
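For readers unfamiliar with graph neural networks, the toy sketch below shows the basic pattern of message passing over a graph followed by pooling into a low-dimensional latent vector that can be inspected alongside the prediction. It is a generic illustration, not Shivam's architecture; all node features, adjacency structure, and layer sizes are assumptions.

```python
# Minimal message-passing sketch (NOT the architecture referenced above):
# one round of neighbor averaging, then a pooled graph-level latent vector
# that could be inspected or visualized.
import torch
import torch.nn as nn

class TinyGNN(nn.Module):
    def __init__(self, in_dim=4, hidden_dim=16, latent_dim=2, out_dim=1):
        super().__init__()
        self.message = nn.Linear(in_dim, hidden_dim)
        self.encode = nn.Linear(hidden_dim, latent_dim)  # low-dimensional latent for inspection
        self.decode = nn.Linear(latent_dim, out_dim)

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) node features; adj: (num_nodes, num_nodes) adjacency
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = torch.relu(self.message(adj @ x / deg))  # mean aggregation over neighbors
        z = self.encode(h)                           # per-node latent coordinates
        graph_latent = z.mean(dim=0)                 # pooled graph-level latent
        return self.decode(graph_latent), z          # prediction plus latent for inspection

# Hypothetical 5-node graph with random features and connectivity.
x = torch.rand(5, 4)
adj = (torch.rand(5, 5) > 0.5).float()
pred, latent = TinyGNN()(x, adj)
print(pred.shape, latent.shape)
```

Returning the per-node latent coordinates alongside the prediction is what allows a low-dimensional latent space to be examined for salient features.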
Applied machine learning
Building a wind-turbine wake model for onshore wind farms in the Texas Panhandle using LIDAR and meteorological data (Neural Computing and Applications 34 (8), 6171-6186, 2022).
Characterizing the schooling of fish using generative machine learning. Here, the variation in fish density is captured by an optimal-transport-based forecast model (credit Jonah Botvinick-Greenhouse - visiting graduate student).
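As a concrete illustration of the optimal transport idea (not the actual forecast model used in this work), the one-dimensional sketch below interpolates between two synthetic densities using their quantile functions, which is the 1D form of displacement interpolation. The densities, grid, and interpolation time are illustrative assumptions.

```python
# Hedged 1D optimal-transport illustration (NOT the group's model):
# in one dimension the optimal transport map between two densities can be
# written with inverse CDFs, and intermediate "forecast-like" densities
# follow from displacement interpolation of the quantiles.
import numpy as np

x = np.linspace(0.0, 10.0, 200)
p0 = np.exp(-0.5 * (x - 3.0) ** 2)            # density at the earlier time
p1 = np.exp(-0.5 * ((x - 7.0) / 1.5) ** 2)    # density at the later time
p0 /= np.trapz(p0, x)
p1 /= np.trapz(p1, x)

def quantile(p, x, q):
    """Inverse CDF of a density p on grid x, evaluated at quantiles q."""
    cdf = np.cumsum(p)
    cdf /= cdf[-1]
    return np.interp(q, cdf, x)

q = np.linspace(0.01, 0.99, 99)
t = 0.5  # interpolation time between the two snapshots
# Displacement interpolation: move each quantile part of the way along the map.
x_t = (1 - t) * quantile(p0, x, q) + t * quantile(p1, x, q)

# Histogram the transported quantiles to estimate the intermediate density.
p_t, edges = np.histogram(x_t, bins=40, range=(0.0, 10.0), density=True)
print(p_t.sum() * (edges[1] - edges[0]))  # integrates to ~1
```

The appeal of this viewpoint is that mass moves smoothly between the two snapshots rather than fading out in one place and reappearing in another, which is the behavior one wants when tracking a drifting density of fish.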
High performance heterogeneous computing
Building a scalable and reproducible ecosystem for scientific machine learning research (Journal of Computational Science 62, 101750, 2022).
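To give a flavor of what "scalable and reproducible" can mean in practice, the sketch below combines fixed random seeding with PyTorch DistributedDataParallel for multi-GPU data-parallel training. It is a generic pattern, not the ecosystem described in the paper; the launch command, model, and data are assumptions (e.g., launched with `torchrun --nproc_per_node=4 train.py`).

```python
# Hedged sketch of two common ingredients of such an ecosystem:
# deterministic seeding for reproducibility and DistributedDataParallel
# for multi-GPU scaling. Model and data are synthetic placeholders.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Reproducibility: fix seeds so independent runs produce identical results.
    torch.manual_seed(42)

    # Scalability: one process per GPU, coordinated through a process group.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    device = f"cuda:{local_rank}"

    model = torch.nn.Linear(32, 1).to(device)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

    # Synthetic data; in practice each rank would load its own shard.
    x = torch.randn(256, 32, device=device)
    y = torch.randn(256, 1, device=device)

    for _ in range(100):
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()  # gradients are all-reduced across ranks automatically
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Pinning seeds, configurations, and software environments is what makes results repeatable across machines, while the process-group pattern is what lets the same script scale from a workstation to a supercomputer node count.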