# Symbolic Summation for Computer Science

### Project Description

The research area of symbolic computation aims to algorithmize large parts of mathematics and “put them on the computer”. Problems that previously could be solved only with paper and pencil can now be handled at the press of a key on a smartphone, a notebook or, for really big problems, a high-performance computer. In particular, symbolic summation can now simplify, in an impressive way, complicated sums arising in many scientific areas, such as mathematics (counting problems or number theory) or elementary particle physics, that until recently were completely out of reach. Although simplifying sums has been a central tool in computer science from the beginning (see, for example, the runtime analysis of algorithms in Donald Knuth’s legendary book series “The Art of Computer Programming”), current summation algorithms are rarely used in theoretical computer science.

This project aims to combine these new developments in the simplification of sums with theoretical computer science and to initiate new interdisciplinary interactions. The driving force is a constructive difference ring theory in which the sequences underlying the summation objects are represented formally in algebraic structures (so-called commutative rings and fields) equipped with a shift operator (a ring automorphism). The central goal of the project is to significantly expand the class of summation objects that can be treated within such difference rings and to develop novel algorithms that simplify expressions in these extended objects and solve linear recurrences, which arise frequently in computer science. Among other things, new software packages are being developed to, for example,

- further automate the analysis of algorithms, i.e., predict the runtime of an algorithm as a function of the input size,
- accelerate the evaluation of non-trivial expressions by discovering redundancy, or
- automatically prove that the input-output behavior of an algorithm is correct.
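To make the connection to runtime analysis concrete, here is a small sketch using SymPy (an assumed, general-purpose tool, not the project’s own software; SymPy implements classical summation algorithms such as Gosper’s rather than the difference-ring machinery described above): the loop counts that arise when analyzing simple algorithms are sums that symbolic summation turns into closed forms.

```python
# Sketch (assumed tooling): symbolic summation for runtime analysis with SymPy.
from sympy import Sum, factor, simplify, symbols

n, k = symbols('n k', integer=True, positive=True)

# Comparisons in a double loop over i < j: sum_{k=1}^{n} k.
double_loop = factor(Sum(k, (k, 1, n)).doit())
print(double_loop)  # n*(n + 1)/2

# A weighted sum as in bottom-up heap construction: sum_{k=0}^{n} k*2**k.
weighted = simplify(Sum(k * 2**k, (k, 0, n)).doit())
print(weighted)     # closed form equivalent to (n - 1)*2**(n + 1) + 2
```

The point of such closed forms is that evaluating them costs constant time, whereas evaluating the original sum costs time linear in n.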

Conversely, technologies from computer science and logic are being integrated in novel ways to extend the existing summation algorithms. Ideally, this will open up new areas of application not only in computer science and mathematics, but also in other technical and scientific research areas.
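The linear recurrences mentioned above can be illustrated with a brief sketch, again using SymPy as a stand-in (an assumption; the project develops its own packages): `rsolve` finds closed-form solutions of linear recurrences with given initial values, of the kind that describe algorithmic costs.

```python
# Sketch (assumed tooling): solving linear recurrences with SymPy's rsolve.
from sympy import Function, rsolve, simplify, symbols

n = symbols('n', integer=True, positive=True)
y = Function('y')

# Worst-case comparisons of insertion sort: y(n) = y(n-1) + (n-1), y(1) = 0.
insertion = rsolve(y(n) - y(n - 1) - (n - 1), y(n), {y(1): 0})
print(insertion)  # closed form equivalent to n*(n - 1)/2

# Tower of Hanoi moves: y(n) = 2*y(n-1) + 1, y(1) = 1.
hanoi = rsolve(y(n) - 2*y(n - 1) - 1, y(n), {y(1): 1})
print(hanoi)      # closed form equivalent to 2**n - 1
```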