Research Experience for Undergraduates

The Research Experience for Undergraduates (REU) Site in Multidisciplinary Computational Science is a program in which students work collaboratively on a wide variety of computational science projects. Students learn to use current cyberinfrastructure tools and receive training in fundamental scientific computing skills and techniques. Each summer, seven US students spend 10 weeks at the University of Notre Dame. Participants receive a stipend of up to $5,000 for the 10-week program, along with travel costs and accommodation in Notre Dame dorms. Funding is provided by an NSF REU grant through the CRC.

The Center for Research Computing (CRC) at the University of Notre Dame is an ideal setting for REU students to become familiar with interdisciplinary computational research. The CRC provides access to research groups working on a diverse range of computational problems, including, but not limited to:

  • Modeling highly complex blood clotting processes to advance the understanding and treatment of heart disease, strokes and hemophilia
  • Using molecular simulation to develop a fundamental understanding of the link between the physical properties of materials and their chemical constitution
  • Discovering planets outside the solar system to help further astrophysicists' study of star and planet formation
  • Creating computational models of coastal ocean hydrodynamics that can be applied to real-world problems, from dealing with coastal flooding due to hurricanes to helping map currents for shipping operations, dredging and harbor design

Potential topics of interest for the CRC REU include:

  • Multi-scale Computations in Engineering and Science
    Over the past several years, advances in computational modeling of multi-scale physics have prompted widespread use of numerical simulations across engineering and science. These advances are embodied in improved mathematical models, numerical solution algorithms, and computational hardware. Most critically, they have enabled a more robust scientific design procedure in which a priori predictions are more reliable. The key to past and future gains in predictive ability lies in resolving physical events that evolve over a wide range of spatial and temporal scales.


  • Multidisciplinary Molecular Modeling and Simulation
    The field of Molecular Modeling and Simulation (MMS) refers to research in which the behavior of matter is studied using advanced computational techniques. MMS covers a wide spectrum of length and time scales, ranging from quantum mechanics, through the atomic scale of individual molecules and the nanometer scale of small collections of molecules, to the mesoscale of large collections of molecules such as those found in cells.
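    As a minimal illustration of the atomistic end of this spectrum, the sketch below evaluates the Lennard-Jones 12-6 pair potential, a standard model interaction in molecular simulation (the parameter values and reduced units here are illustrative placeholders, not drawn from any particular CRC project):

```python
def lj_energy(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones 12-6 pair potential: 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# The potential crosses zero at r = sigma and reaches its minimum,
# with depth -epsilon, at r = 2**(1/6) * sigma.
r_min = 2.0 ** (1.0 / 6.0)
print(lj_energy(1.0))    # 0.0 at r = sigma
print(lj_energy(r_min))  # approximately -epsilon
```

    Summing this pair energy over all particle pairs (with appropriate cutoffs) is the core inner loop of many classical molecular simulations.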


  • Validation and Verification
    As scientific calculations are increasingly used to support policy-level decision makers in government and industry, the accuracy of those calculations can be a matter of life and death. Nearly all quantitative predictive science starts from a mathematical model. We focus on models described by systems of partial differential equations; while not all-encompassing, these cover a wide breadth of important problems in nature. Exact solutions are typically unavailable, so one writes a less exact, discrete representation of the model; it is this discrete system that is translated into a program and solved on the computer. The procedure introduces several kinds of error: 1) errors due to the finite size of the discretization, 2) errors in translating the mathematics into computer code (both human and algorithmic), 3) errors in solving the discrete equations, and 4) errors due to the finite precision of the computer. Finally, the model itself may be missing important physics, which may not become apparent until the calculation has been performed. Verification is a systematic but widely under-utilized procedure for minimizing or eliminating errors of types 1-4; its procedures are mathematically well defined and robust. Only when a computational model is fully verified is it appropriate to compare predictions to measurements.
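    A minimal sketch of one such verification check for type-1 (discretization) error: a grid-refinement study that confirms a central-difference formula attains its theoretical second-order accuracy. The test function and step sizes are illustrative choices, not taken from any specific project:

```python
import math

def second_derivative(f, x, h):
    # Central-difference approximation with O(h**2) discretization error
    return (f(x - h) - 2.0 * f(x) + f(x + h)) / h**2

# Manufactured problem with a known answer: d2/dx2 sin(x) = -sin(x)
x0 = 1.0
exact = -math.sin(x0)

# Halve the step size repeatedly and record the error at each level
step_sizes = [0.1, 0.05, 0.025]
errors = [abs(second_derivative(math.sin, x0, h) - exact) for h in step_sizes]

# Observed order of accuracy p = log2(e_coarse / e_fine); for a verified
# second-order scheme it should approach 2 as the grid is refined
for e_coarse, e_fine in zip(errors, errors[1:]):
    p = math.log(e_coarse / e_fine, 2)
    print(f"observed order of accuracy: {p:.3f}")
```

    If the observed order fails to match the theoretical order, that signals a bug of type 2 or 3 somewhere between the mathematics and the code, before any comparison with experimental data is attempted.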


  • Cyberinfrastructure (CI) Development
    CI is a rapidly growing component of information technology focused on distributed computing, data, and communications technology. Hardware and software systems are rapidly being developed to build virtual research communities, along with the collaborative tools that knit these communities together. CI is applied to areas such as supercomputing, large-scale data repositories, digitized scientific data management and high-capacity mass-storage systems, and scalable interactive visualization. State-of-the-art high-performance networks deliver connectivity to an array of distributed software tools and services, including grids and middleware that hide the complexity of large heterogeneous systems while seeking to provide users with ubiquitous access and enhanced usability. The mission of the CI projects within this REU is both to train a new workforce and to develop new CI elements that support campus and national computational research.

2014 Application Information

  • Program dates: Tuesday, May 27 through Friday, August 1, 2014
  • Application submission opens: Monday, January 6, 2014
  • Application deadline: Saturday, March 1, 2014
  • Notification date: Wednesday, March 26, 2014
  • 2014 REU Application

2014 CRC REU Project Topics

Previous REU Sessions
