Today’s neuroscientific results are based on complex image analyses that involve a large variety of open source software. Unfortunately, recent years have shown that many neuroscientific findings cannot be replicated, and that data sets are not yet shared widely enough to enable efficient reuse of data and alternative analyses.
In this talk I want to give an overview of how the neuroscience field is working towards open and reproducible analyses, and how this enables better research and more reliable findings. I want to show how the field has progressed to open data formats for storing raw data acquired from MRI scanners (ISMRMRD); how processed images, which can reach multiple terabytes in size, are stored together with data provenance information (MINC); and how these data are analysed reproducibly using software containers (Singularity), which allow an analysis to scale from a small development platform to large high performance computing systems, make it practical to manage different software versions, and keep analyses reproducible. Finally, I want to touch on sharing neuroscientific data in open repositories, which allows the reuse and pooling of large datasets to derive new results that would be impossible without data sharing.
Dr Steffen Bollmann is a postdoctoral research fellow at the Centre for Advanced Imaging of the University of Queensland. He obtained a Bachelor’s degree in biomedical engineering at the Ilmenau University of Technology, followed by a Master’s degree in biomedical engineering & bioelectromagnetism. Following this, Dr Bollmann completed a PhD investigating multimodal imaging in children, adolescents and adults with ADHD at the Neuroscience Centre Zurich and the Centre for MR-research, University Children’s Hospital Zurich. Dr Bollmann develops algorithms to process quantitative imaging data, with the goal of identifying early biomarkers for neurodegenerative diseases at the single-subject level. In his research, Dr Bollmann analyses large MRI datasets that can be on the order of multiple terabytes and require specialised high performance computing systems that can handle high I/O demands.