Features: Faculty Insights


The Faculty welcomes Ioannis Kontoyiannis as the new Churchill Professor of Mathematics, based at the Department of Pure Mathematics and Mathematical Statistics (DPMMS).

Writing a profile of Kontoyiannis is a challenge because his work spans a whole range of topics. "The reason for this is that whenever I find something exciting, I read about it and then try to do something about it," he says. When asked to talk about a favourite area of research, he picks a concept that is equally far-ranging: information.

Understanding information

Information is as slippery as it is ubiquitous. There is information in this article, in your brain, and also in your DNA. It can be copied from one medium to another, travel along all sorts of channels, and pass down through generations. Information can be man-made and also nature-made. So what exactly is it? And how can we measure it? "We want to think of information as a physical commodity, which means that we want to have a scientific description of it," explains Kontoyiannis. "But that's difficult because information has very different characteristics than other things we are used to studying."

The first person to tackle information from a mathematical viewpoint was Claude Shannon, in a ground-breaking paper published in 1948. Shannon's motivation was practical: he was interested in how best to compress information, or transmit it over long distances. But he came up with a way of thinking about it that we are now familiar with: as strings of symbols, such as the 0s and 1s inside a computer (find out more in this article on Plus magazine).

High on entropy

Kontoyiannis' passion for information was sparked by another concept Shannon had a hand in developing: entropy. "I came across the concept of entropy when I was doing Part III here at Cambridge and it was a huge revelation for me. I felt like I was high; for three weeks I walked around like I was drunk."

What fascinated Kontoyiannis about entropy was its enormous power of description. Originally, the concept stems from physics, where it was first developed to quantify the annoying inefficiency of steam engines. It then transmuted into a description of the amount of disorder in a physical system, and is implicated in a law of nature we experience all too often: the second law of thermodynamics, which states that the amount of disorder (entropy) in a closed system only ever goes up, but never down.

Fascinatingly, there's a link between disorder and information. A very orderly, non-complex string of symbols, such as one consisting of a million 0s, does not contain much information, in the sense that it can be described very succinctly: you can replace the million 0s with the words "here's a string of a million 0s." A completely disordered, complex string, on the other hand, for example one coming from a million fair coin flips, contains a lot of information in the sense that it can't be compressed: when you want to communicate it, you have no choice but to write down all the symbols. Thus, entropy can also be thought of as a measure of information. And indeed, Shannon came up with a definition of entropy which is just that. (You can find out more about the many faces of entropy in this article on Plus magazine.)
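The two extremes described above can be checked directly. The sketch below (a minimal illustration, not from the article) computes the empirical Shannon entropy of each string, in bits per symbol, and uses an off-the-shelf compressor as a rough proxy for how succinctly each string can be described: the orderly string shrinks to a tiny fraction of its size, while the coin-flip string cannot go below its information content of about one bit per symbol.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Empirical Shannon entropy in bits per symbol:
    H = -sum over symbols x of p(x) * log2(p(x)),
    with p(x) the observed symbol frequencies."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

# An orderly string: a million 0s. Entropy is 0 bits per symbol.
ordered = "0" * 1_000_000

# A disordered string: a million simulated fair coin flips.
# Entropy is essentially 1 bit per symbol.
random.seed(0)
disordered = "".join(random.choice("01") for _ in range(1_000_000))

print(shannon_entropy(ordered))     # 0.0: fully predictable
print(shannon_entropy(disordered))  # very close to 1.0

# A general-purpose compressor tells the same story: the orderly
# string compresses to almost nothing, while the coin flips cannot
# be squeezed below roughly 10^6 bits, i.e. ~125,000 bytes.
print(len(zlib.compress(ordered.encode())))
print(len(zlib.compress(disordered.encode())))
```

The compressor here is only a stand-in for "length of the shortest description"; Shannon's entropy gives the fundamental limit that no compressor can beat on average.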

When Kontoyiannis was doing his Part III he was passionately devoted to pure mathematics, but entropy makes an appearance here too, in the area that deals with abstract dynamical systems. These can behave in a very unpredictable, even chaotic, fashion, bringing us back to questions of randomness, disorder and complexity — and entropy.

Kontoyiannis' thrilling first encounter with entropy in pure maths opened his eyes to the wonders of information theory and the fact that the boundary between applied and pure mathematics is, in some sense, immaterial. And what is more, it inspired him to pursue a PhD in an Engineering Department in the US.

Spotting hackers

A lot of Kontoyiannis' research has involved foundational questions about how information behaves as a physical, scientific quantity, and has been on the theoretical side. But he has also delved into applications. One research project, funded by NASA, involved storing satellite images of the Earth's surface. He has also worked with neuroscientists trying to measure the way information is communicated in the brain. And he has explored information contained in genomes of different organisms — including the SARS-CoV-2 virus, which causes COVID-19. "I find it fascinating to translate intuitive notions of information into precise mathematical statements," he says.

Kontoyiannis has also applied his mathematical expertise to work with private companies. A favourite problem he helped solve, for a company in Silicon Valley, involved hacking. "They wanted to be able to spot when users of online platforms do specific things. For example, imagine you are [an online retailer] with its hundreds of thousands of servers. At any point in time you can simultaneously observe what all your users are doing, and you want to figure out which one of those users is a hacker trying to do something malicious. This was fun: the question was how you can identify patterns of activity that are not typical, without knowing what those patterns are."
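The flavour of this problem can be sketched in information-theoretic terms. The code below is an illustration of the general idea only, not the company's actual system: the event names, the first-order Markov model, and the scoring rule are all assumptions. The trick is to learn a statistical model of typical behaviour and score each observed session by its ideal codelength under that model; a session that would take many more bits per event than usual to describe is flagged as atypical, without anyone ever specifying what a "hacker pattern" looks like.

```python
import math
from collections import Counter, defaultdict

class TypicalityScorer:
    """Scores event sequences by their ideal codelength, in bits per
    event, under a first-order Markov model fitted to typical
    behaviour. Atypical sequences cost many more bits to describe."""

    def __init__(self, alphabet):
        self.alphabet = set(alphabet)
        self.transitions = defaultdict(Counter)

    def fit(self, sequences):
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                self.transitions[a][b] += 1

    def bits_per_event(self, seq):
        bits = 0.0
        for a, b in zip(seq, seq[1:]):
            counts = self.transitions[a]
            n = sum(counts.values())
            # Laplace smoothing: unseen transitions get a small but
            # non-zero probability, hence a large but finite codelength.
            p = (counts[b] + 1) / (n + len(self.alphabet))
            bits += -math.log2(p)
        return bits / (len(seq) - 1)

# Hypothetical event alphabet and "typical" training sessions.
events = {"login", "browse", "buy", "export", "logout"}
typical = ([["login", "browse", "browse", "buy", "logout"]] * 50
           + [["login", "browse", "buy", "logout"]] * 50)

scorer = TypicalityScorer(events)
scorer.fit(typical)

normal_session = ["login", "browse", "buy", "logout"]
odd_session = ["login", "export", "export", "export", "logout"]

print(scorer.bits_per_event(normal_session))  # small: looks typical
print(scorer.bits_per_event(odd_session))     # much larger: flag it
```

A real detector would use far richer models and principled thresholds; the point is only that "atypical" can be made precise as "expensive to describe under the model of typical behaviour", echoing the link between information and compressibility above.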

A similar reverse engineering type of problem also involves online platforms. "Suppose we go to lunch and you mention you bought a book on Amazon. I then go on Amazon to look up the book, and it says, 'people who bought this also bought that'. How much information does that give me about you and your preferences, tastes and habits? It's endless how much fun one can have with these problems!"

Do what you enjoy doing!

Kontoyiannis came to DPMMS from the Engineering Department at Cambridge, where he was Professor of Information and Communications, and Head of the Signal Processing and Communications Laboratory.

While he is comfortable in all sorts of environments, he is very happy to have landed in a mathematics department. "In mathematics there is less of a drive for quick rewards," he says. "It's more ok to think about things for longer without being able to produce results. This department in particular is one of the most open-minded in the world. People are doing very good things without looking over their shoulder at what other people are doing. People do very applied work and very theoretical work, and they co-exist peacefully."

This intellectual freedom, says Kontoyiannis, is also very important for students. "If students read this, one thing I want to say loud and clear is, only do stuff that you find fun. I may be a victim of 'survivor bias' here, but I don't think I have ever met an academic who regretted following their passion."

Another piece of advice Kontoyiannis would like to give to students is to read. "There is a fine line that separates what is known and rather trivial and what is unknown and currently unknowable. But there is a very thin strip of things that are not known, but that we can say something about, and that something can shed light into the unknown. I have found that the best way to find cool stuff to do is to approach that border from the inside. And nothing beats reading!"