AllScale API
DOI: https://doi.org/10.31577/cai_2020_4_808

Keywords: API, programming interface, parallel programming, shared memory, distributed memory, parallel operator, data structure

Abstract
Effectively implementing scientific algorithms in distributed memory parallel applications is a difficult task for domain scientists, as evidenced by the large number of domain-specific languages and libraries available today that attempt to facilitate the process. However, they usually provide a closed set of parallel patterns and are not open for extension without vast modifications to the underlying system. In this work, we present the AllScale API, a programming interface for developing distributed memory parallel applications with the ease of shared memory programming models. The AllScale API is closed for modification but open for extension, allowing new user-defined parallel patterns and data structures to be implemented based on existing core primitives and therefore fully supported in the AllScale framework. Focusing on high-level functionality directly offered to application developers, we present the advantages of such an API design, detail some of its specifications, and evaluate it using three real-world use cases. Our results show that AllScale decreases the complexity of implementing scientific applications for distributed memory while attaining comparable or higher performance than MPI reference implementations.
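As a rough illustration of this "open for extension" idea, a minimal sketch in plain C++ is given below; the names and the recursion primitive are hypothetical stand-ins, not the actual AllScale API. It shows a user-defined parallel pattern expressed entirely on top of a single generic divide-and-conquer core primitive, without modifying that core.

```cpp
#include <cstddef>
#include <future>
#include <vector>

// Hypothetical "core primitive": a generic divide-and-conquer recursion
// over an index range (illustrative only, not the actual AllScale core API).
template <typename Body>
void parallel_recurse(std::size_t begin, std::size_t end,
                      std::size_t cutoff, Body body) {
    if (end - begin <= cutoff) {                 // base case: run sequentially
        for (std::size_t i = begin; i < end; ++i) body(i);
        return;
    }
    std::size_t mid = begin + (end - begin) / 2;
    // split step: left half runs asynchronously, right half on this thread
    auto left = std::async(std::launch::async,
                           [=] { parallel_recurse(begin, mid, cutoff, body); });
    parallel_recurse(mid, end, cutoff, body);
    left.wait();
}

// User-defined pattern built purely on the core primitive:
// a parallel for-each over a vector, added without touching the "core".
template <typename T, typename Fn>
void my_pfor(std::vector<T>& data, Fn fn) {
    parallel_recurse(0, data.size(), 1024,
                     [&](std::size_t i) { fn(data[i]); });
}

int main() {
    std::vector<double> v(1 << 20, 1.0);
    my_pfor(v, [](double& x) { x *= 2.0; });     // usage: double every element
    return 0;
}
```

In the same spirit, new patterns and data structures layered on such primitives remain fully visible to the underlying runtime, which is the property the AllScale API exploits for distributed memory execution.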
Published
2021-01-12
How to Cite
Gschwandtner, P., Jordan, H., Thoman, P., & Fahringer, T. (2021). AllScale API. Computing and Informatics, 39(4), 808–837. https://doi.org/10.31577/cai_2020_4_808
Section
Special Section Articles