Automated Amortised Resource Analysis of Data Structures
Disciplines
Computer Sciences (60%); Mathematics (40%)
Keywords
- Amortized Cost Analysis, Functional Programming, Data Structures, Lazy Evaluation, Automation, Constraint Solving
Algorithms play a central role in the science and practice of computing. Often, however, we are not only interested in `what` task a given algorithm performs, say sorting a deck of cards, but also in `how` efficiently it performs this task. That is, beyond the correctness of the algorithm, whether or not it performs the task as intended, we are interested in its efficiency and want to analyse its complexity, that is, its execution time.

One complexity measure of an algorithm or a data structure is its amortised complexity. Imagine you have a collection of objects, again like a deck of cards, and you want to know how long it will take to perform some operation on all of them. By looking at the cost of the operation for a single object and multiplying this cost by the number of objects in your collection, you obtain an upper bound on the cost of performing the entire operation. In all likelihood, however, the upper bound thus obtained would be too coarse, as the operation may be slower or faster for particular objects in your collection.

This is where an amortised analysis comes in. For a given operation, certain situations, e.g. preparation steps or the specific type of the object considered, may imply a significant cost in resources, whereas other situations may not be as costly. An amortised analysis considers both the costly and the less costly operations together over the whole sequence of operations. This may include accounting for different types of input, the length of the input, and other factors that affect performance. In sum, amortised analysis is a method for the cost analysis of algorithms and data structures in which a single operation is considered as part of a larger sequence of operations. The cost analysis of sophisticated data structures, such as self-adjusting binary search trees, has been a main objective since the initial proposal of amortised analysis.
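A classic illustration of this idea, not taken from the project itself but a standard textbook example, is a queue built from two stacks: enqueuing is always cheap, while dequeuing is occasionally expensive but cheap on average over any sequence of operations. A minimal sketch in Python (class and method names are illustrative):

```python
class TwoStackQueue:
    """A queue built from two stacks, a standard target of amortised analysis.

    enqueue is always O(1); dequeue is O(1) except when the front stack is
    empty, in which case the back stack is reversed in O(n). Since each
    element is moved from back to front at most once, any sequence of n
    operations costs O(n) in total, i.e. O(1) amortised per operation,
    even though a single dequeue may take O(n).
    """

    def __init__(self):
        self.front = []  # elements ready to be dequeued (top = oldest)
        self.back = []   # newly enqueued elements (top = newest)

    def enqueue(self, x):
        self.back.append(x)  # always a single cheap push

    def dequeue(self):
        if not self.front:
            # The rare, expensive case: move everything over, once per element.
            while self.back:
                self.front.append(self.back.pop())
        if not self.front:
            raise IndexError("dequeue from empty queue")
        return self.front.pop()  # cheap in the common case
```

Multiplying the worst-case cost of a single dequeue (O(n)) by the number of operations would give the coarse O(n²) bound described above; the amortised view, which charges the cost of the occasional reversal to the earlier cheap enqueues, yields the tight O(n) bound.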
Analysing these data structures requires sophisticated manual (pen-and-paper) complexity analysis. In previous work, we have taken preliminary steps towards a fully automated amortised analysis of these data structures. Apart from this pilot project, such an analysis has so far eluded automation. In this project, we target an automated analysis of the most common data structures with good, i.e. sublinear, complexity, as they are typically used in the standard libraries of programming languages. Our goals are the verification of textbook data structures, the confirmation and improvement (in coefficients) of previously reported complexity bounds, and the automated analysis of realistic data structure implementations. Initially, we will only consider strict (first-order) functional programming languages. Later on, we will extend our research towards lazy evaluation in order to allow for the analysis of persistent data structures. Moreover, we will also consider probabilistic data structures.
- Technische Universität Wien - 50%
- Universität Innsbruck - 50%
- Florian Zuleger, Technische Universität Wien, associated research partner
- Laura Kovacs, Technische Universität Wien, national collaboration partner
- Rene Thiemann, Universität Innsbruck, national collaboration partner
- Christoph Matheja, Technical University of Denmark - Denmark
- Martin Avanzini, Université Côte d'Azur - France
- Niki Vazou - Spain
- Jan Hoffmann, Carnegie Mellon University - USA
Research Output
- 1 Publication
- 2025: Regular Grammars for Sets of Graphs of Tree-Width 2. Bozga M. Conference Proceeding, pages 704-717. DOI: 10.1109/lics65433.2025.00059