Theoretical Computer Science Group

The theory of computability, as developed at the beginning of the last century, lies at the very heart of computer science. The study of computations whose resources are limited constitutes computational complexity theory. On the other hand, computations with limited resources have also been intensively studied in the area of program analysis, which offers techniques for predicting the intensional behaviour of programs.

While these two conceptions of computation with limited resources have traditionally been studied in quite separate communities, our research aims to unify the approaches and get the best of both worlds. Thus we seek to transfer results from the analysis of feasible algorithms to the analysis of feasible problems and vice versa. Our methodology encompasses investigations into the logical foundations of computation (with limited resources) as well as the (automated) resource analysis of programs.


The Past

Initially, our research was concerned with the derivational and runtime complexity of term rewrite systems and corresponding work on implicit complexity theory. One example of putting fundamental theoretical studies to use in practical application areas was our work on the time and space complexity analysis of interaction net systems. A milestone of this work is the development of our award-winning Tyrolean Complexity Tool (TcT); TcT is open source and freely available.
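
To illustrate runtime complexity, consider the standard rewrite rules for addition on Peano numerals, rendered here as a Haskell program (a minimal sketch of our own for illustration, not an excerpt from TcT):

    -- Rewrite rules:  add(0, y) -> y ;  add(s(x), y) -> s(add(x, y))
    data Nat = Zero | Succ Nat

    add :: Nat -> Nat -> Nat
    add Zero     y = y               -- base rule: one final step
    add (Succ x) y = Succ (add x y)  -- step rule: one step per Succ

Reducing add(s^n(0), m) to normal form takes exactly n + 1 rewrite steps, so the runtime complexity of this system is linear; bounds of this kind are what tools like TcT infer fully automatically.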


The Present

Probabilistic rewriting is a natural extension of rewriting that allows a clean and elegant modelling of stochastic processes. Adopting this methodology as the operational semantics of probabilistic programs allowed us to apply the lessons learnt in the development of TcT to the automated analysis of expected costs and expected values of imperative and functional programs.
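
As a minimal illustration of expected cost (our own toy example, not output of any of our tools), consider a loop that terminates with probability p in each round and otherwise pays one unit of cost and repeats:

    -- The rule  loop ->_p stop  |  loop ->_{1-p} tick; loop
    -- has expected cost E satisfying  E = p * 0 + (1 - p) * (1 + E),
    -- hence E = (1 - p) / p.
    expectedCost :: Rational -> Rational
    expectedCost p = (1 - p) / p

    -- Sanity check: truncate the series  E = sum_{k>=0} k * p * (1-p)^k.
    approxCost :: Double -> Int -> Double
    approxCost p n = sum [fromIntegral k * p * (1 - p) ^ k | k <- [0 .. n]]

For a fair coin (p = 1/2) this yields an expected cost of 1; deriving such bounds automatically for full imperative and functional programs is what our tools target.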

A landmark result in this respect is the fully automated analysis of the expected amortised cost of sophisticated probabilistic data structures (like randomised splay trees, randomised splay heaps and randomised meldable heaps), as carried out by our prototype implementation ATLAS. In short, we tackle the automated verification (and sometimes improvement) of the analysis of sophisticated probabilistic programs. Our interest in the verification of probabilistic programs and stochastic processes squares nicely with our ongoing interest in the interpretability and verification of machine learning methods.
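
To give a flavour of such a data structure, the following is a minimal sketch of a randomised meldable heap (our own illustration, not ATLAS input): melding descends into a child chosen by a coin flip, which yields logarithmically many expected steps per operation.

    data Heap a = Empty | Node a (Heap a) (Heap a)

    -- `cs` supplies the coin flips; in practice, an infinite stream
    -- drawn from a PRNG.
    meld :: Ord a => [Bool] -> Heap a -> Heap a -> (Heap a, [Bool])
    meld cs Empty h = (h, cs)
    meld cs h Empty = (h, cs)
    meld (c:cs) h1@(Node x l r) h2@(Node y _ _)
      | y < x     = meld (c:cs) h2 h1              -- keep smaller root on top
      | c         = let (m, cs') = meld cs r h2 in (Node x l m, cs')
      | otherwise = let (m, cs') = meld cs l h2 in (Node x m r, cs')
    meld [] _ _ = error "exhausted coin-flip stream"

    insert :: Ord a => [Bool] -> a -> Heap a -> (Heap a, [Bool])
    insert cs x = meld cs (Node x Empty Empty)

    -- ghci> fst (insert (cycle [True, False]) 3 Empty)

Bounds on the expected amortised cost of exactly such coin-flipping operations are what ATLAS certifies.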


The Future

Expanding on earlier improvements to the modularity and scalability of the analysis of probabilistic programs (as witnessed by our prototype implementation eco-imp), we have recently established a sound framework for the expected cost analysis of mixed classical-quantum programs. We expect that this formalisation, in the style of a weakest-precondition calculus, provides a solid foundation for further automated program analyses of mixed classical-quantum programs.
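
To indicate the style of such a calculus, here is a minimal expected-cost transformer for a toy, loop-free probabilistic language (our own simplification, not the calculus from our work: quantum operations and loops, which require least fixed points, are omitted, the constructor names are invented, and since the toy language manipulates no state, post-expectations degenerate to constants):

    data Prog = Skip
              | Tick Rational                -- consume resources
              | Seq Prog Prog                -- sequential composition
              | Choice Rational Prog Prog    -- branch with probability r

    -- ert p f: expected cost of running p, where f is the expected
    -- cost of the continuation (the "post-expectation").
    ert :: Prog -> Rational -> Rational
    ert Skip           f = f
    ert (Tick c)       f = c + f
    ert (Seq p q)      f = ert p (ert q f)
    ert (Choice r p q) f = r * ert p f + (1 - r) * ert q f

    -- ghci> ert (Choice (1/2) (Tick 1) Skip) 0   ==>  1 % 2

Reasoning backwards through the program in this fashion is what makes weakest-precondition style calculi amenable to automation.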

For the future, we envision the consolidation of our work on (expected) cost analysis and seek to intensify our work on quantum computing. On the theoretical side, we aim to exploit earlier results on the feasibility of graph-based computation to study the limits of feasible provability.