The February 2016 Newsletter has been released with the latest information on activities and events.

Visit the Newsletter page for back issues.

The Center for Open Science (COS) has entered into a multi-year partnership with the University of Notre Dame to support long-term solutions for open science.

Combining the services of the Center for Research Computing and the Hesburgh Libraries at Notre Dame with COS's Open Science Framework (OSF), a free and open source web application that brings together all of the components of the research lifecycle, will promote the long-term preservation, discoverability, and sharing of research data between collaborators.

“We are thrilled to engage in this new partnership with the Center for Open Science. Not only do our research missions align, but being able to contribute our resources to support open, reliable, and reproducible science will help propel scientific research into an uninhibited realm in the future,” said Jarek Nabrzyski, Director of the Center for Research Computing (CRC).

The OSF is a free, scholarly commons that increases the efficiency of research by organizing the entire research lifecycle in one location. Simplified project management tools, such as automated versioning, fine-grained privacy controls, connections to other applications, and bibliographic citations that include persistent identifiers, make conducting and sharing research seamless.

Andrew Sallans, COS Partnerships, Collaborations, and Funding Manager, states: “This partnership highlights how aligned the Center for Open Science and Notre Dame are in increasing transparency in scientific research. OSF for Institutions is a big step for COS in refining the Open Science Framework in ways that add value for institutions, and we found a wonderful development partner for this effort. Through integrated authentication and institutional branding, OSF for Institutions will help the University highlight the breadth of research being conducted there, allow for easier curation of data, and showcase the impact of research outputs.”

As part of the project, the CRC and the Hesburgh Libraries will contribute extensive expertise in high-performance computing, cyberinfrastructure development, research software development, data management, semantics, and metadata. Notre Dame is already a major partner in several open science initiatives, including the National Data Service (NDS), a community-driven effort to simplify finding, using, and publishing data; the NSF-funded Data and Software Preservation for Open Science (DASPOS), a collaborative effort to explore preservation techniques for high energy physics datasets, software, and architecture; and the Vector-Borne Disease Network (VecNet), a repository for modeling the impacts of interventions on malaria transmission and control.

Notre Dame will also integrate its institutional repository, CurateND, with the OSF while performing an audit of OSF features against the project management needs of Notre Dame. This will enable researchers to archive their research data directly into CurateND from the active research environment. Further, both groups will test the registration of the VecNet digital library of malaria data files on OSF, use the OSF as the interface for the backend dashboard developed for the NDS, and work on a reproducible software engineering environment by creating and documenting a development environment for the OSF. Notre Dame will connect these initiatives with the OSF, providing users with a more seamless workflow experience.

“In keeping with our mission, the Hesburgh Libraries are committed to helping manage, preserve, curate, and globally share the research and associated data created at Notre Dame. This exciting, forward-thinking partnership with the Center for Open Science will lead the way in streamlining this process, meeting data sharing mandates, and increasing opportunities for impact for all researchers around the world,” says Edward H. Arnold University Librarian Diane Parr Walker.


About Center for Open Science

The Center for Open Science (COS) is a non-profit technology startup founded in 2013 with a mission to increase openness, integrity, and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open source software tools. The Open Science Framework (OSF), COS's flagship product, is a web application that connects and supports the research workflow, enabling scientists to increase the efficiency and effectiveness of their research. Researchers use the OSF to collaborate, document, archive, share, and register research projects, materials, and data.


About the University of Notre Dame

Founded in 1842, the University of Notre Dame provides a distinctive voice in higher education that is at once rigorously intellectual, unapologetically moral in orientation, and firmly embracing of a service ethos. The nation's pre-eminent Catholic university and rated among the top 20 of all U.S. institutions of higher learning, Notre Dame is organized into four undergraduate colleges — Arts and Letters, Science, Engineering, and the Mendoza College of Business — the School of Architecture, the Law School, the Graduate School, 10 major research institutes, more than 40 centers and special programs, and the University library system. Located adjacent to the city of South Bend, Indiana, which has a metropolitan population of more than 300,000, Notre Dame is highly residential, with 80 percent of students living on campus, and also is known for the quality of its physical plant and the beauty of its campus, including the Golden Dome of the Main Building, the world's most recognized university landmark. For more information on Notre Dame research, please see @UNDResearch.


The Center for Research Computing at the University of Notre Dame is an innovative and multidisciplinary research environment that supports collaboration to facilitate discoveries in science and engineering, the arts, the humanities, and the social sciences through advanced computation, data analysis, and other digital research tools. The Center enhances the University's cyberinfrastructure, provides support for interdisciplinary research and education, and conducts computational research.


The Hesburgh Libraries is a diverse system featuring the flagship Hesburgh Library, which houses specialty libraries and centers, and eight branch libraries located throughout the Notre Dame campus. Home to nearly 200 library faculty and staff, the Libraries hold more than 3.5 million monographs and subscribe to more than 35,000 serials. The vast array of expertise, services, resources and spaces helps support the teaching, learning and research at Notre Dame. Digital library services include CurateND and the Hesburgh Libraries' Center for Digital Scholarship (CDS). The CDS houses and supports state-of-the-art technologies, enabling students and faculty to explore new methodologies, analyze complex data, share research results and work with collaborative multidisciplinary teams. The center's current research, support, consulting and referral services include: Geographic Information Systems; Data Usage and Analysis; Text Mining and Analysis; Data Management Planning; and Metadata and Digitization Services.


The DASPOS project team includes computer science experts from the University of Notre Dame and the University of Chicago, physicists from the ATLAS and CMS experiments at the LHC, the DØ experiment at the Tevatron, experts in other data-intensive fields such as bioinformatics and astrophysics, and digital librarians with broad experience in the preservation of large datasets in the sciences and humanities. The DASPOS project has been funded in whole or in part with Federal funds from the National Science Foundation, under Award No. 1247316.


The Vector-Borne Disease Network (VecNet) was founded in 2011. The VecNet portal hosts two mathematical models: the Epidemiological Model of Disease (developed by the Institute for Disease Modeling at Intellectual Ventures) and the OpenMalaria model (developed by the Swiss Tropical and Public Health Institute), as well as a fully searchable digital library that allows users to access malaria information and data for modeling intervention efficacy.

The December 2015 Newsletter has been released with the latest information on activities and events.

Visit the Newsletter page for back issues.

The November 2015 Newsletter has been released with the latest information on activities and events.

Visit the Newsletter page for back issues.

Thursday, November 5th, 2015

3:30 p.m.

109 Pasquerilla Center/ROTC


Dennis Harding

Senior Software Engineer, Institute for Disease Modeling


Constructing and Managing Custom Data Sets for Disease Modeling

Modeling Workflow – Parameter collection, modeling, analysis

The problems:

  • Data design: SQL vs. NoSQL (why the war?)
  • Grouping
  • Location
  • Dealing with Terabytes and Petabytes when you only need Megabytes
  • Provenance
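The last two bullets point at a common pattern: most of a terabyte-scale source is irrelevant to a given model run, so in practice one streams the source, keeps only the needed slice, and records where that slice came from. A minimal sketch of this idea (the field names, region values, and source file name are hypothetical illustrations, not taken from the talk):

```python
def extract_subset(rows, region, provenance):
    """Stream rows, keep only the region of interest, and record
    provenance (the filter used, rows scanned, rows kept) so the
    derived subset can be traced back to its source."""
    kept = []
    scanned = 0
    for row in rows:
        scanned += 1
        if row["region"] == region:
            kept.append(row)
    provenance.update({
        "filter": {"region": region},
        "rows_scanned": scanned,
        "rows_kept": len(kept),
    })
    return kept

# A few rows standing in for a terabyte-scale surveillance file.
raw = [
    {"region": "Kano", "cases": 12},
    {"region": "Garki", "cases": 7},
    {"region": "Kano", "cases": 9},
]
prov = {"source": "surveillance.csv"}  # assumed source name
subset = extract_subset(iter(raw), "Kano", prov)
print(prov)
```

Because the rows are consumed as an iterator, the same pattern works whether the source is an in-memory list or a chunked read of a file far too large to load at once.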

About Dennis Harding

Dennis Harding has over twenty years of software and engineering experience, with a Bachelor of Science degree in Electrical Engineering from San Jose State University as well as two years of graduate work in Computer Science at Santa Clara University. Dennis' past work has included microwave communications as well as computer science; he holds two patents, with five more pending. Dennis is a member of the IEEE, a Scrum Master, and is experienced with Agile development.

Dennis' IDM work is focused on large data: he leads IDM's development efforts related to the generation, management, and storage of extremely large data sets, such as those required for accurate simulation modeling.

About the Institute for Disease Modeling

The Institute for Disease Modeling (IDM) develops detailed, geographically-specific, and mechanistic stochastic simulations of disease transmission through the use of extensive and complex software modeling. IDM shares this modeling software with the research community to advance the understanding of disease dynamics.

IDM's epidemiological modeling software, called EMOD, helps determine the combination of health policies and intervention strategies that can lead to disease eradication. EMOD calculates how diseases may spread in particular areas and is used to analyze the effects of current and future health policies and intervention strategies. It supports infectious disease campaign planning, data gathering, new product development, and policy decisions for four generic transmission types: vector-borne, water-borne, airborne, and sexually transmitted (e.g., malaria, TB, influenza, pertussis, HIV).
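To make "mechanistic stochastic simulation" concrete, a toy compartmental model can illustrate the idea: each day, infections and recoveries are sampled randomly rather than computed deterministically, so repeated runs yield a distribution of outbreak trajectories. This is a generic SIR sketch with made-up parameter values, not EMOD itself:

```python
import random

def stochastic_sir(s, i, r, beta, gamma, days, seed=0):
    """Minimal stochastic SIR model: each day, each susceptible is
    infected with a probability driven by the current number of
    infectious individuals, and each infectious individual recovers
    with probability gamma. Returns the daily (S, I, R) history."""
    rng = random.Random(seed)
    n = s + i + r
    history = [(s, i, r)]
    for _ in range(days):
        # Chance a given susceptible is infected today, given i contacts.
        p_inf = 1 - (1 - beta / n) ** i
        new_inf = sum(rng.random() < p_inf for _ in range(s))
        new_rec = sum(rng.random() < gamma for _ in range(i))
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        history.append((s, i, r))
    return history

hist = stochastic_sir(s=990, i=10, r=0, beta=0.3, gamma=0.1, days=100)
s, i, r = hist[-1]
print(s + i + r)  # population is conserved: 1000
```

Mechanistic tools like EMOD extend this basic skeleton with geography, vectors, interventions, and far richer disease dynamics, but the day-by-day random sampling of events is the same underlying principle.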


Friday, November 13th, 2015

2:00 p.m.

114 Pasquerilla Center/ROTC


Oliver Gutsche

Staff Scientist and Assistant Head, Scientific Computing Division


Presentation Slides

Exascale and Exabytes: Future directions in HEP software and computing

Current and future HEP experiments will record and simulate larger and larger volumes of data, some going well beyond the petabyte scale. To succeed, analysts will need to master modern software and computing technologies to extract physics results from these large datasets. Gutsche will review current trends for HEP software and computing and show possible future directions for data analysis in the exabyte era.



About Oliver Gutsche

Oliver Gutsche received his PhD in Experimental Particle Physics at the University of Hamburg in 2005, working on electron-proton collisions with the ZEUS experiment at HERA, DESY. He joined Fermilab and the CMS experiment at the LHC soon after as a postdoc. Since 2014, Oliver has been a staff scientist at Fermilab and shares his time between his new position as Assistant Head of the Scientific Computing Division and studying Standard Model and Beyond the Standard Model physics at the LHC with CMS.

The October 2015 Newsletter has been released with the latest information on activities and events.

Visit the Newsletter page for back issues.

Thursday, October 8th, 2015

11:00 a.m.

112 Pasquerilla/ROTC


How might future HPC architectures utilize emerging neuromorphic chip technology?

Rick Stevens

Associate Director, Computing, Environment, and Life Sciences
Argonne National Laboratory

Professor of Computer Science
University of Chicago


In this talk Rick Stevens will discuss the class of neuromorphic computing devices that are emerging and how they might be used to augment classical processors in future large-scale machines. Neuromorphic technology is being developed by a number of companies and academic groups. These devices implement low-power circuits that structurally resemble neurons and synapses in silicon, and they have been successfully demonstrated in pattern recognition applications. Possible uses of neuromorphic technology include system-wide monitoring for errors, acceleration of application load balancing, data analysis, and power management. In addition to these examples, he will describe testbeds that could be built to test these ideas.



About Rick Stevens

Rick L. Stevens is the Associate Laboratory Director of Computing, Environment, and Life Sciences at Argonne National Laboratory, which is the U.S. Department of Energy’s (DOE’s) oldest lab for science and energy research. He heads Argonne’s computational genomics program and co-leads the DOE laboratories planning effort for exascale computing research. He is a professor of computer science at the University of Chicago (UChicago) and is involved in several interdisciplinary studies at the Argonne/UChicago Computation Institute and at the Argonne/UChicago Institute for Genomics and Systems Biology, where he holds senior fellow appointments.

Stevens is co-principal investigator, chief technical officer, and chief architect of the DOE Systems Biology Knowledgebase project, an emerging software and data environment designed to enable researchers to collaboratively generate, test, and share new hypotheses about gene and protein functions, perform large-scale analyses on a scalable computing infrastructure, and model interactions in microbes, plants, and their communities. Stevens is also principal investigator for the NIAID Bioinformatics Resource Center program, where his group has developed computational tools and genomics databases to support infectious disease research.

Stevens is interested in the development of innovative tools and techniques that enable computational scientists to solve important large-scale problems on advanced computers. His research focuses on two principal areas: high-performance computer architectures and computational problems in the life sciences. In addition to his research work, Stevens teaches courses on computer architecture, collaboration technology, parallel computing, and computational science. He serves on many national and international advisory committees and still finds time to occasionally write code and play with his 3D printer.


Video: Tim Wallace Keynote, Newsroom Geography


GIS Day 2015 Keynote Poster

Slides from Panel:

Video: COS Reproducibility Panel at University of Notre Dame


Wednesday, September 9, 2015

Noon-1:00 pm

Center for Digital Scholarship (CDS), Hesburgh Library

This Reproducibility panel event is locally sponsored and co-organized by the Hesburgh Libraries' Center for Digital Scholarship and the Center for Research Computing.

A backup Google On Air stream will be available if the Hangout gets too full.

Members of COS' Reproducibility Project: Psychology (RP:P) and its companion project, Reproducibility Project: Cancer Biology (RP:CB), will participate remotely via webcast. The panel will be moderated onsite at Notre Dame by COS Partnerships Manager Andrew Sallans, who will engage Notre Dame's faculty, staff, and students in a discussion with the Reproducibility panelists.



  • Tim Errington, Project Manager for Reproducibility Project: Cancer Biology and lead of the metascience efforts at COS [via Webcast]
  • Johanna Cohoon, Project Coordinator for Reproducibility Project: Psychology (tentative) [via Webcast]
  • Mallory Kidwell, Project Coordinator for Reproducibility Project: Psychology [via Webcast]
  • Andrew Sallans, Partnerships, Collaborations, & Funding Manager, COS [On Site]


Lunch will be provided to on-site attendees and panelists. Please register if you will be attending onsite at Notre Dame's CDS. You may also attend this event virtually via Google Hangouts.

Launched nearly four years ago and coordinated by the Center for Open Science, the Reproducibility Project: Psychology has produced the most comprehensive investigation ever done of the rate and predictors of reproducibility in a field of science. On August 27, 2015, 270 researchers investigating the reproducibility of psychological science published their findings in Science. The project conducted replications of 100 published findings from three prominent psychology journals. They found that, regardless of the analytic method or criteria used, fewer than half of the replications produced the same findings as the original study.


Read more about The Reproducibility Project:

Open Science Collaboration. Estimating the reproducibility of psychological science. Science, 28 August 2015: 349(6251), aac4716. [DOI: 10.1126/science.aac4716]

The August 2015 Newsletter has been released with the latest information on activities and events.

Visit the Newsletter page for back issues.