Complexity theory is an area of theoretical computer science that aims to show lower bounds on the resources needed to solve computational problems. The types of resources vary widely: computation time is the most prominent, but space, communication cost, the amount of randomness, language resources in logical descriptions, the length or shape of proofs in formal proof systems, the number of arithmetic operations, the description size of data structures, and many other measures have also been considered. In many settings, unconditional lower bounds have so far turned out to be elusive, so research has focused on showing bounds based on complexity assumptions for benchmark problems or classes of problems; the most well-known and successful instantiation of this is the theory of NP-completeness.
Complexity theory studies problems from many different application domains of mathematics and computer science, e.g. databases, artificial intelligence, bioinformatics, operations research, and quantum computing. Moreover, it borrows techniques and results from many different areas: traditionally mostly combinatorics and logic, but also fields as diverse as physics, information theory, coding theory, and pure mathematics, including (algebraic) geometry, Fourier analysis, and group theory. This combination of different aims, applications, and tools makes complexity theory very rich and diverse. In particular, while it often complements algorithmic results, it has its own distinct culture and techniques that set it apart from other areas.
The aim of Complexity Days is to gain perspective on research in complexity theory in France, to bring together researchers working in the area, to let them present their interests and results, and to provide a forum for exchanging ideas on techniques and challenges.