Copenhagen Programming Language Seminar
Abstract: Designers of functional array language compilers and interpreters try to reduce the number of arrays created during application execution, because the performance impact of creating intermediate arrays is so dramatic. Just as the Three Bears had different requirements for their own satisfaction, so do differing array shapes have different requirements for their elimination. The problem itself is a bear: scalar operations are the baby bear, typified here by dynamic programming and the Floyd-Warshall algorithm; operations on small arrays, such as numerically intense computations on complex arrays, are the mama bear; and operations on large arrays, typified by acoustic signal processing, are the papa bear. We compare interpreted to compiled APL performance for several applications with different array shapes, and give an overview of the optimizations that enable the resulting speedups, in both serial and parallel contexts.
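For readers unfamiliar with the baby-bear case, the following is a minimal illustrative sketch (in Python, not the speaker's APL) of the Floyd-Warshall all-pairs shortest-path algorithm. Its innermost update is a single scalar min/add step; a naive array-language formulation would instead evaluate whole-array expressions, materializing a fresh temporary array on each step, which is precisely the kind of array creation a compiler can eliminate by fusing the work into scalar loops like these.

```python
# Illustrative Floyd-Warshall all-pairs shortest paths in plain Python.
# Each innermost update is a scalar min/add -- the "baby bear" workload.
# A naive array-language interpreter would create a temporary array per
# step k; a fusing compiler reduces the work to scalar operations.

INF = float("inf")

def floyd_warshall(dist):
    """dist: square matrix (list of lists) of edge weights; INF = no edge.
    Returns the matrix of shortest-path distances (updated in place)."""
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Scalar operations only: no intermediate arrays created.
                through_k = dist[i][k] + dist[k][j]
                if through_k < dist[i][j]:
                    dist[i][j] = through_k
    return dist
```

For example, on a three-node graph with edges 0→1 (weight 3), 1→2 (weight 1), and 2→0 (weight 2), the computed shortest distance from 0 to 2 is 4, routed through node 1.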
Biography: Robert Bernecky has designed and developed APL systems since 1971. While at I.P. Sharp Associates Limited, he was one of the people responsible for the design and development of SHARP APL, a system that set the standard for performance of large-scale APL systems. He has authored papers on language design, algorithm design, and interpreter performance. He developed APEX -- the APL Parallel Executor -- a high-performance, retargetable APL compiler for serial and parallel computers. Bernecky is the CEO of Snake Island Research Inc, a consulting and research firm headquartered in Toronto. Bernecky holds a BA in philosophy from SUNY at Buffalo, and an MSc in Computer Science from the University of Toronto.
Host: Martin Elsman. Administrative host: Jette Møller.
All are welcome.
The Copenhagen Programming Language Seminar (COPLAS) is a collaboration between DIKU, DTU, ITU, and RUC.
COPLAS is part of the FIRST Research School.
To receive information about COPLAS talks by email, send a message to email@example.com with the word 'subscribe' as subject or in the body.
For more information about COPLAS, see http://www.coplas.org