Mondelli Group

Data Science, Machine Learning, and Information Theory

We are at the center of a revolution in information technology, with data being the most valuable commodity. Exploiting these rapidly growing data sets requires addressing complex inference problems, and the Mondelli group works to develop mathematically principled solutions.

These inference problems span different fields and arise in a variety of applications in engineering and the natural sciences. In particular, the Mondelli group focuses on wireless communications and machine learning. In wireless communications, given a transmission channel, the goal is to send information encoded as a message while optimizing certain metrics, such as complexity, reliability, latency, throughput, or bandwidth. In machine learning, given a model for the observations, the goal is to understand how many samples convey sufficient information to perform a certain task and how to make optimal use of those samples. Both the vision and the toolkit adopted by the Mondelli group are inspired by information theory, which leads to the investigation of the following fundamental questions: What is the minimal amount of information necessary to solve a given inference problem? Given this minimal amount of information, is it possible to design a low-complexity algorithm? What are the fundamental trade-offs between the parameters at play (e.g., the dimensionality of the problem, the size of the data sample, and the computational complexity)?
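As a toy illustration of the first question, consider the classical binary symmetric channel: each transmitted bit is flipped independently with probability p, and Shannon's channel coding theorem states that reliable communication is possible exactly at rates below the capacity C = 1 - h(p), where h is the binary entropy function. The short Python sketch below (a generic textbook computation, not code from the group's publications) evaluates this fundamental limit:

import numpy as np

def binary_entropy(p: float) -> float:
    # Binary entropy h(p) in bits, with h(0) = h(1) = 0 by convention.
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(p: float) -> float:
    # Capacity of the binary symmetric channel: C = 1 - h(p) bits per use.
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.11, 0.5):
    print(f"p = {p:.2f}: capacity = {bsc_capacity(p):.3f} bits/use")

For instance, at crossover probability p = 0.11 the capacity is about 0.5 bits per channel use, so at least two channel uses are needed per information bit, no matter how sophisticated the code. Designing low-complexity codes that approach such limits, e.g., the polar codes appearing in the publications below, is one concrete instance of the group's research agenda.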


Current Projects

Fundamental limits and efficient algorithms for deep learning | Non-convex optimization in high dimensions | Optimal code design for short block lengths


Mondelli M, Hashemi SA, Cioffi JM, Goldsmith A. 2021. Sublinear latency for simplified successive cancellation decoding of polar codes. IEEE Transactions on Wireless Communications. 20(1), 18–27.

Fazeli A, Hassani H, Mondelli M, Vardy A. 2020. Binary linear codes with optimal scaling: Polar codes with large kernels. IEEE Transactions on Information Theory.

Shevchenko A, Mondelli M. 2020. Landscape connectivity and dropout stability of SGD solutions for over-parameterized neural networks. Proceedings of the 37th International Conference on Machine Learning. vol. 119, 8773–8784.

Nguyen Q, Mondelli M. 2020. Global convergence of deep networks with one wide layer followed by pyramidal topology. 34th Conference on Neural Information Processing Systems (NeurIPS). vol. 33, 11961–11972.

Hashemi SA, Condo C, Mondelli M, Gross WJ. 2019. Rate-flexible fast polar decoders. IEEE Transactions on Signal Processing. 67(22), 8854897.



Career

since 2019 Assistant Professor, IST Austria
2017 – 2019 Postdoc, Stanford University, Stanford, USA
2018 Research Fellow, Simons Institute for the Theory of Computing, Berkeley, USA
2016 PhD, EPFL, Lausanne, Switzerland

Selected Distinctions

2019 Lopez-Loreta Prize
2018 Simons-Berkeley Research Fellowship
2018 EPFL Doctorate Award
2017 Early Postdoc Mobility Fellowship, Swiss National Science Foundation
2016 Best Paper Award, ACM Symposium on Theory of Computing (STOC)
2015 Best Student Paper Award, IEEE International Symposium on Information Theory (ISIT)
2015 Dan David Prize Scholarship

Additional Information

Download CV

View Marco Mondelli’s website
