The workshop teaches the basics of supercomputing needed to start using HPC systems for (neuroscience) research. It combines introductory lectures and hands-on sessions on scientific computing in Python with an introduction to the usage of HPC systems and (big) data management. In addition, the students receive hands-on training with tools and applications that can be used both on a supercomputer and on the user's local computer, for instance the simulators NEST (for point-neuron models) and Arbor (for morphologically detailed neuron models), as well as visualisation tools that can handle the large imaging or simulation data generated on a supercomputer. The tools and applications presented are developed in the HBP High Performance Analytics and Computing (HPAC) Platform. The introductory lectures also enable the students to make efficient use of the other HBP Platforms, in particular the Neuroinformatics, Brain Simulation and Neurorobotics Platforms, which use the HPAC Platform as a back-end.
Topics: Neuroscience, scientific computing, technology, simulation, neural networks, supercomputing, high-performance computing (HPC), data management, Python, NEST, Arbor, HBP Platforms, visual data analysis