Improving Reproducibility of Experiments in Parallel Computing


Jesper Larsson Träff (ORCID: 0000-0002-4864-9226)
  • Grant DOI 10.55776/P26124
  • Funding program Principal Investigator Projects
  • Status ended
  • Start October 1, 2013
  • End September 30, 2016
  • Funding amount € 218,185

Disciplines

Computer Sciences (100%)

Keywords

    Parallel Computing, Reproducibility, Computational Experiments, Survey

Abstract

It is common practice in Computer Science, and in parallel computing in particular, to base scientific contributions on observations and measurements of specific properties of the system under consideration. Experimental work is as important as advances in theory, and experiments matter especially in parallel computing, since parallel hardware changes rapidly and no generally applicable model of a machine or application exists. However, authors often pay little attention to describing their experimental setup, even though the information needed to reproduce the findings should be a central part of every research article. As a consequence, most computational results are hard, and often impossible, to reproduce. This lack of reproducibility has many disadvantages for researchers in Computer Science. Most importantly, extending the work of others or comparing new approaches to published methods becomes almost impossible. Reviewing articles also becomes harder, as it is often unclear whether published findings have a significant impact; reviewers rarely have the chance to examine the source code or to verify the experimental design and data analysis. In the extreme case, articles describing experiments that cannot be reproduced have no scientific value.

This project aims at improving the reproducibility of experiments in parallel computing. First, we will survey the current state of reproducibility of experiments reported in parallel-computing articles: we will attempt to reproduce experimental results from a broad range of publications, from higher-ranked journals to lower-ranked conferences, and classify each attempt by whether and how far the results can be reproduced. Second, building on this survey, we will identify the requirements for reproducible parallel experiments, formalize them, and develop a description language for defining the experimental steps needed to ensure reproducibility. Third, we will develop small, portable tools that help researchers make their experiments reproducible, e.g., by taking snapshots of the source code and parameters for each experimental run. Finally, we will apply the formalism and tools developed in this project to our own research, attempting to publish an article that allows others to reproduce our experimental findings.
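The per-run snapshot tool mentioned above could, as a minimal sketch, look like the following. This is an illustration of the idea, not the project's actual tooling; the function name, record fields, and JSON-lines output format are all hypothetical.

```python
import hashlib
import json
import platform
import time

def snapshot_run(source_path, parameters, out_path):
    """Record what is needed to re-run one experiment: a hash of
    the source code, the run parameters, and basic machine details.
    Appends one JSON record per experimental run (hypothetical
    format, for illustration only)."""
    with open(source_path, "rb") as f:
        source_hash = hashlib.sha256(f.read()).hexdigest()
    record = {
        "timestamp": time.time(),
        "source_file": source_path,
        "source_sha256": source_hash,
        "parameters": parameters,
        "machine": platform.machine(),
        "system": platform.platform(),
    }
    with open(out_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

Hashing the exact source used for a run, rather than relying on a version string, means a reader can later verify that the published code matches what was actually measured.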

Final report

Scientific advances in many research disciplines, including Computer Science, are driven by the experimental validation of hypotheses, where experimental results are used either to support or to refute a scientific hypothesis. A problem arises when independent researchers are unable to reproduce the work of others: in such cases, published results can neither be validated nor refuted.

We addressed the problem of reproducibility of experimental results in the field of parallel computing. Specifically, our research is concerned with data communication operations on large parallel machines, such as computer clusters or supercomputers. Data communication in this context is most often done using the Message Passing Interface (MPI), which provides a standardized set of communication operations. The goal of this project was therefore to analyze, and where possible improve, the state of reproducibility of experimental results in research targeting MPI.

We defined a minimal set of criteria that scientific papers must fulfill (e.g., providing a link to the source code or describing the hardware used) so that other scientists have a chance of reproducing the claimed findings. We then analyzed the state of reproducibility in our domain in a survey that measured the reproducibility potential of papers against these criteria. The outcome was rather disappointing: most of the results published in the papers we examined could not be reproduced.

A second goal of our project was therefore to improve this situation. In particular, we examined the problem of reproducibly measuring the time to complete several MPI communication operations. Through a comprehensive and thorough experimental evaluation, we identified experimental factors that every published experimental description (for this particular case of MPI communication operations) must contain so that results can be reproduced and fairly compared. These factors include, for example, the compiler flags, the operating frequency of each processor, and the caching strategy used. Our analysis also helped us devise a novel experimentation procedure for timing MPI communication operations. This procedure allows scientists to significantly reduce the variance between experimental runs, which leads to far better reproducibility.
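The reporting side of such a timing procedure can be illustrated with a small sketch. This is not the project's published procedure (which also involves synchronizing clocks and processes across machines); it only shows the general idea, under the common assumption that robust summary statistics over many repetitions, such as the median, are far less sensitive to outlier runs caused by system noise than a single mean:

```python
import statistics

def summarize_timings(timings):
    """Summarize repeated measurements of one communication
    operation. Median and interquartile range are robust to the
    occasional outlier run, so repeating the whole experiment
    tends to reproduce the same summary (hypothetical helper,
    for illustration only)."""
    ordered = sorted(timings)
    n = len(ordered)
    q1 = ordered[n // 4]
    q3 = ordered[(3 * n) // 4]
    return {
        "median": statistics.median(ordered),
        "iqr": q3 - q1,
        "min": ordered[0],
        "runs": n,
    }
```

For example, with run times of roughly 1 ms and one 50 ms outlier, the median stays near 1 ms while a mean would be pulled far off, which is exactly the kind of run-to-run variance the project's procedure aims to suppress.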

Research institution(s)
  • Technische Universität Wien - 100%
International project participants
  • Arnaud Legrand, INRIA Mescal - France
  • Peter Sanders, Universität Karlsruhe - Germany
  • Torsten Höfler, Eidgenössische Technische Hochschule Zürich - Switzerland

Research Output

  • 80 Citations
  • 8 Publications
Publications
  • 2014
    Title Reproducible MPI Micro-Benchmarking Isn't As Easy As You Think
    DOI 10.1145/2642769.2642785
Type Conference Proceeding
    Author Hunold S
    Pages 69-76
  • 2015
    Title On the Impact of Synchronizing Clocks and Processes on Benchmarking MPI Collectives
    DOI 10.1145/2802658.2802662
Type Conference Proceeding
    Author Hunold S
    Pages 1-10
  • 2015
    Title MPI Benchmarking Revisited: Experimental Design and Reproducibility
    DOI 10.48550/arxiv.1505.07734
    Type Preprint
    Author Hunold S
  • 2015
    Title A Survey on Reproducibility in Parallel Computing
    DOI 10.48550/arxiv.1511.04217
    Type Preprint
    Author Hunold S
  • 2016
    Title Reproducible MPI Benchmarking is Still Not as Easy as You Think
    DOI 10.1109/tpds.2016.2539167
    Type Journal Article
    Author Hunold S
    Journal IEEE Transactions on Parallel and Distributed Systems
    Pages 3617-3630
  • 2014
    Title Stepping Stones to Reproducible Research: A Study of Current Practices in Parallel Computing
    DOI 10.1007/978-3-319-14325-5_43
    Type Book Chapter
    Author Carpen-Amarie A
    Publisher Springer Nature
    Pages 499-510
    Link Publication
  • 2014
    Title One step toward bridging the gap between theory and practice in moldable task scheduling with precedence constraints
    DOI 10.1002/cpe.3372
    Type Journal Article
    Author Hunold S
    Journal Concurrency and Computation: Practice and Experience
    Pages 1010-1026
  • 2016
    Title PGMPI: Automatically Verifying Self-Consistent MPI Performance Guidelines
    DOI 10.48550/arxiv.1606.00215
    Type Preprint
    Author Hunold S

Contact

Austrian Science Fund (FWF)
Georg-Coch-Platz 2
(Entrance Wiesingerstraße 4)
1010 Vienna

office(at)fwf.ac.at
+43 1 505 67 40

© Österreichischer Wissenschaftsfonds FWF