Welcome to the group page of the Interdisciplinary Scientific Computing Laboratory (ISCL). ISCL performs research at the intersection of data science, applied mathematics, and high-performance computing to solve grand-challenge problems in computational science. ISCL is housed in the Information Science and Technology Department at Pennsylvania State University, University Park, as well as at Argonne National Laboratory in the suburbs of Chicago. Our team is composed of postdoctoral fellows, graduate students, and visiting students across these two locations (with opportunities to travel between them). A high-level overview of our research may be found in the following talks: [1], [2], [3], [4]. Further information about our publications is available on Google Scholar, and our software contributions are available on GitHub.
Members of our group have a unique platform for impactful research, with access to Argonne's state-of-the-art supercomputing resources and the opportunity to work on large-scale research projects of strategic importance.
News
Romit will be visiting the National University of Singapore for the International Workshop on Reduced Order Methods from May 21-25, 2023. Thank you to the organizers (and particularly Gianmarco Mengaldo) for the kind invitation!
Romit will attend the topical workshop on Mathematical and Scientific Machine Learning at ICERM (Brown University), June 4-9, 2023, where he will give an invited talk on neural architecture search for scientific machine learning. Thank you, ICERM, for the invitation!
The research manifesto of the Interdisciplinary Scientific Computing Laboratory (ISCL).
Grand challenge problems
Improving geophysical forecast models with data science. We are building models to forecast variables (such as the daytime maximum temperature shown here) using scientific machine learning; a rough code sketch of this style of surrogate modeling appears after the next example.
Tokamak disruption mitigation for nuclear fusion (image taken from Boozer et al., 2012). We are building closures and surrogate models to accelerate simulations of complex nuclear fusion reactors (i.e., system-level simulations).
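Both problems above lean on data-driven surrogate modeling: an inexpensive learned map that stands in for, or corrects, an expensive physics-based computation. As a rough illustration only (not one of our production models, and with placeholder data, shapes, and variable names), the PyTorch sketch below fits a small fully connected network to emulate a mapping from input state features to a scalar forecast quantity.

```python
# Minimal sketch of a neural-network surrogate (illustrative only).
# The data below are synthetic; real models use domain-specific inputs/outputs.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder data: 1024 samples, 8 input features (e.g., coarse model state),
# 1 target (e.g., daytime maximum temperature or a closure term).
X = torch.randn(1024, 8)
y = X[:, :4].sum(dim=1, keepdim=True) + 0.1 * torch.randn(1024, 1)

surrogate = nn.Sequential(
    nn.Linear(8, 64), nn.GELU(),
    nn.Linear(64, 64), nn.GELU(),
    nn.Linear(64, 1),
)

optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(surrogate(X), y)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")
```

In practice such a surrogate is trained on archives of simulation or observational data and then embedded in the outer forecast or reactor-simulation workflow, which is where the acceleration comes from.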
Scientific ML algorithm development
An algorithm that builds deep-learning function approximations from sparse, unstructured, and time-varying observations (Nature Machine Intelligence 3 (11), 945-951, 2021).
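One schematic way to picture this class of method (not the specific algorithm in the paper) is as learning a continuous field f(x, t) directly from scattered space-time samples, so that the model is never tied to a fixed grid. The sketch below, with a synthetic field and placeholder names, fits a small coordinate-based network to randomly scattered noisy observations.

```python
# Sketch: fit a continuous field f(x, t) to sparse, unstructured observations.
# Illustrative only; the synthetic "truth" stands in for real sensor data.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Scattered space-time observation coordinates in [0, 1] x [0, 1].
coords = torch.rand(500, 2)                                  # columns: (x, t)
truth = torch.sin(4 * math.pi * coords[:, :1]) * torch.exp(-coords[:, 1:])
obs = truth + 0.05 * torch.randn_like(truth)                 # noisy observations

field = nn.Sequential(
    nn.Linear(2, 128), nn.Tanh(),
    nn.Linear(128, 128), nn.Tanh(),
    nn.Linear(128, 1),
)

opt = torch.optim.Adam(field.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = ((field(coords) - obs) ** 2).mean()
    loss.backward()
    opt.step()

# Once trained, the field can be queried anywhere, not only at sensor locations.
print(field(torch.tensor([[0.3, 0.5]])))
```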
A novel graph neural network architecture that not only makes predictions but also identifies salient physical features in an interpretable latent space (credit: Shivam Barwey, postdoc).
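For the curious, here is a bare-bones message-passing sketch (emphatically not Shivam's architecture) showing the general pattern: node features are aggregated over graph edges, compressed into a latent representation that can be inspected, and then mapped to a prediction. The dimensions and the toy graph are placeholders.

```python
# Bare-bones graph neural network sketch (not the architecture referenced above).
# One mean-aggregation message-passing layer, a pooled latent vector that could
# in principle be inspected for salient features, and a scalar readout.
import torch
import torch.nn as nn

class SimpleGNN(nn.Module):
    def __init__(self, in_dim, hidden_dim, latent_dim):
        super().__init__()
        self.message = nn.Linear(in_dim, hidden_dim)
        self.update = nn.Linear(in_dim + hidden_dim, hidden_dim)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)
        self.readout = nn.Linear(latent_dim, 1)

    def forward(self, x, adj):
        # x: (num_nodes, in_dim); adj: dense (num_nodes, num_nodes) adjacency.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        neighbor_mean = adj @ self.message(x) / deg      # aggregate messages
        h = torch.relu(self.update(torch.cat([x, neighbor_mean], dim=1)))
        z = self.to_latent(h)                            # per-node latent features
        graph_latent = z.mean(dim=0)                     # pooled, inspectable latent
        return self.readout(graph_latent), z

# Tiny example graph: 4 nodes on a ring, 3 features per node.
x = torch.randn(4, 3)
adj = torch.tensor([[0., 1., 0., 1.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [1., 0., 1., 0.]])
prediction, node_latents = SimpleGNN(3, 16, 8)(x, adj)
print(prediction.shape, node_latents.shape)
```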
Applied machine learning
Building a wind-turbine wake model for onshore wind farms in the Texas Panhandle using LIDAR and meteorological data (Neural Computing and Applications 34 (8), 6171-6186, 2022).
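As a loose illustration of this kind of data-driven wake modeling (not the model from the paper), the sketch below regresses a synthetic wake quantity onto a few hypothetical inflow features with scikit-learn; the feature names and data are invented for the example.

```python
# Illustrative regression sketch for a wake-deficit style model (placeholder data).
# The actual model in the paper is trained on LIDAR and meteorological measurements.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical features: hub-height wind speed, wind direction, turbulence intensity.
n = 2000
X = np.column_stack([
    rng.uniform(3, 25, n),      # wind speed [m/s]
    rng.uniform(0, 360, n),     # wind direction [deg]
    rng.uniform(0.02, 0.3, n),  # turbulence intensity [-]
])
# Synthetic "wake deficit" target, only to make the example runnable.
y = 0.3 * np.exp(-0.05 * X[:, 0]) * (1 + X[:, 2]) + 0.01 * rng.standard_normal(n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```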
Characterizing the schooling of fish using generative machine learning. Here, the variation in the density of fish is captured by an optimal-transport-based forecast model (credit: Jonah Botvinick-Greenhouse, visiting graduate student).
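To give a feel for what "optimal-transport-based" can mean (purely as a schematic of the general technique, not the model in this work), the sketch below computes the one-dimensional Wasserstein distance between two empirical densities and a displacement interpolant between them by interpolating quantile functions.

```python
# 1D optimal-transport sketch: Wasserstein distance between two empirical
# distributions and a displacement interpolant via quantile interpolation.
# Schematic only; the samples stand in for fish positions at two times.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
density_t0 = rng.normal(loc=0.0, scale=1.0, size=5000)   # positions at time t0
density_t1 = rng.normal(loc=3.0, scale=0.5, size=5000)   # positions at time t1

print("W1 distance:", wasserstein_distance(density_t0, density_t1))

def displacement_interpolant(samples_a, samples_b, alpha, n_quantiles=200):
    """In 1D, interpolating quantile functions gives the OT (McCann) interpolant."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    qa = np.quantile(samples_a, q)
    qb = np.quantile(samples_b, q)
    return (1.0 - alpha) * qa + alpha * qb

# A crude "forecast" of the density halfway between the two observed snapshots.
halfway = displacement_interpolant(density_t0, density_t1, alpha=0.5)
print("interpolated mean position:", halfway.mean())
```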
High performance heterogeneous computing
Building a scalable and reproducible ecosystem for scientific machine learning research (Journal of Computational Science 62, 101750, 2022).
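The following is a generic PyTorch DistributedDataParallel sketch of the kind of data-parallel training such an ecosystem coordinates; it is not the software described in the paper, and it assumes a launcher such as torchrun sets the usual distributed environment variables.

```python
# Generic data-parallel training sketch with PyTorch DDP (not the paper's software).
# Assumed launch: `torchrun --nproc_per_node=4 train_sketch.py`, which sets
# the RANK / WORLD_SIZE / LOCAL_RANK environment variables.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="gloo")   # "nccl" on GPU systems
    rank = dist.get_rank()
    torch.manual_seed(rank)

    model = DDP(torch.nn.Linear(16, 1))
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

    # Each rank trains on its own shard of (synthetic) data; gradients are
    # averaged across ranks automatically during backward().
    X = torch.randn(256, 16)
    y = torch.randn(256, 1)

    for step in range(100):
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(X), y)
        loss.backward()
        optimizer.step()

    if rank == 0:
        print("final loss on rank 0:", loss.item())
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```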