
APL Colloquium

October 11, 2024

Colloquium Topic: Productive Parallel Programming with the Chapel Language

Do you want it to be easier to write software that runs really fast? Do you enjoy the productivity that Python offers but yearn for better performance? Come to this talk to learn about the productivity and performance available in the Chapel programming language. Learn how the Chapel language helps users solve their most challenging parallel programming problems across many areas, including physical simulation and data science.

Chapel is unique among programming languages in making it easier to create programs that are both fast and scalable. Programmers are more productive thanks to built-in support for scalable parallel computing, which enables clean, concise code relative to conventional high-performance computing (HPC) approaches such as Fortran/C/C++, OpenMP, MPI, and CUDA. One group of users found that students could be 8 times more productive! Best of all, this productivity doesn’t come at the cost of program performance. Chapel programs have demonstrated speed and scalability on both CPUs and GPUs, and benchmarks show that Chapel is competitive with conventional HPC approaches. Whether you are working on a laptop, a workstation, or a million cores of a supercomputer, the Chapel programming language is ready to help you achieve your performance goals.
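To give a flavor of the conciseness described above, here is a minimal, hypothetical Chapel sketch (not drawn from the talk): a single forall loop expresses a parallel computation, and a reduce expression performs a parallel reduction, work that would typically require OpenMP pragmas or explicit threading in C/C++.

```chapel
// Illustrative sketch of Chapel's data parallelism (not from the talk).
// The forall loop is parallelized across the machine's cores automatically;
// no OpenMP pragmas or MPI calls are needed.

config const n = 1000000;   // can be overridden at run time, e.g. ./a.out --n=10000

var A: [1..n] real;

forall i in 1..n do         // parallel loop over all indices
  A[i] = 2.0 * i;

writeln("sum of A = ", + reduce A);   // parallel reduction over the array
```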



Colloquium Speaker: Michael Ferguson

Michael Ferguson is a principal software engineer at HPE who works on the Chapel programming language and its compiler. He has worked in many areas of the Chapel project, from LLVM-based code generation to I/O support. He’s currently the technical leader of an effort to dramatically improve the compiler, including making it more interactive and reducing compile times.

Before working with the Chapel programming language, Michael wrote parallel applications, including the open-source indexing and search system FEMTO. He came away from these experiences thinking that the tools available for parallel programming could be a lot better! Programs were esoteric, fragile, and non-portable. He has been working to address these challenges and to make parallel computing more accessible and productive.