Research in Complexity Theory
Tue 03.15.16
Computational inefficiency is a common experience: the computer cannot complete a certain task due to lack of resources such as time, memory, or bandwidth. The theory of computational complexity classifies — or aims to classify — computational tasks according to their inherent inefficiency. Inefficiency can also be harnessed to our advantage: indeed, modern cryptography and electronic commerce rely on the (presumed) inefficiency of certain computational tasks. From the study of computational complexity have arisen questions that stand as grand challenges of contemporary science. The objective of this project is to enrich the theory of computational complexity with new directions and techniques, and to use these techniques to make progress on long-standing open problems. Specific areas of investigation include the complexity of sampling tasks, the complexity of distributed tasks, and randomness. The investigator has a track record of fruitful exchanges with the mathematics community and will foster further cross-fertilization between mathematics and computer science. The project will also develop publicly available educational material, including lecture notes, surveys, slides, and videos, at both the advanced and the introductory level.
In more detail, the project will seek to prove computational lower bounds for sampling tasks. These lower bounds provide an under-explored angle of attack on problems concerning extractors and data structures; preliminary results by the investigator have already found application in the breakthrough construction of two-source extractors. The project will also bring a new set of group-theoretic techniques to bear on communication complexity and cryptography, strengthening these techniques and developing further applications. Finally, the project will explore extensions of small-bias generators, which have the potential to answer central open questions in pseudo-randomness and beyond.
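To make the notion of a small-bias generator concrete: a distribution over n-bit strings is epsilon-biased if every nonempty parity (XOR of a subset of the bits) is within epsilon of being perfectly balanced. The sketch below is an illustrative brute-force check of this definition on the uniform distribution over a given set of points; it is not one of the project's constructions, and the function name `max_bias` is chosen here for exposition.

```python
from itertools import product

def max_bias(points):
    """Maximum bias over all nonempty parity tests S of the uniform
    distribution on `points` (a list of equal-length 0/1 tuples).

    The bias of a test S is |Pr[XOR of bits in S is 0] - Pr[it is 1]|;
    the set is epsilon-biased iff the returned value is at most epsilon.
    """
    n = len(points[0])
    # Precompute each point as an integer bitmask for fast parity tests.
    masks = [int("".join(map(str, x)), 2) for x in points]
    worst = 0.0
    for s in range(1, 1 << n):  # every nonempty subset S of [n], as a bitmask
        ones = sum(bin(s & m).count("1") % 2 for m in masks)
        bias = abs(1.0 - 2.0 * ones / len(masks))
        worst = max(worst, bias)
    return worst

# The full cube {0,1}^3 is 0-biased: every nonempty parity is balanced.
print(max_bias(list(product([0, 1], repeat=3))))  # → 0.0
```

A genuine small-bias generator achieves bias epsilon using a support of size polynomial in n and 1/epsilon, exponentially smaller than the full cube; this brute-force checker runs in time 2^n times the support size and is only feasible for small n.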