Features: Faculty Insights

Dr Anders Hansen, from the Department of Applied Mathematics and Theoretical Physics (DAMTP), has received a Philip Leverhulme Prize for his work on solving very hard problems and opening new directions in areas of great impact in applied analysis.

Every year 30 outstanding researchers whose work has already attracted international recognition and whose future career is exceptionally promising are awarded Philip Leverhulme Prizes. The prizes are each worth £100,000 which can be used over two or three years to advance the winners' research.

Understanding complexity

One of Hansen's notable achievements is developing the solvability complexity index. Understanding the complexity of problems is one of the foundations of computational mathematics. Classically, complexity is categorised in terms of the length of time a process will take based on the size of its input. Polynomial time algorithms, for example, take on the order of n^m steps to finish, where m is some finite integer and n is the size of the input to the algorithm. The selection sort method of ordering a list of items (such as arbitrary numbers in numerical order) takes on the order of n^2 steps, where n is the size of the list.
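To make the n^2 figure concrete, here is a minimal sketch of selection sort instrumented to count comparisons (the function and variable names are illustrative, not from the article):

```python
def selection_sort(items):
    """Sort a sequence, counting the comparisons made along the way."""
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        smallest = i
        # scan the unsorted tail for the smallest remaining item
        for j in range(i + 1, len(a)):
            comparisons += 1
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, comparisons

for n in (10, 100, 1000):
    _, c = selection_sort(range(n, 0, -1))
    # comparisons = n*(n-1)/2, i.e. on the order of n^2
    print(n, c)
```

Running this prints 45, 4950 and 499500 comparisons for lists of 10, 100 and 1000 items: multiplying the input size by 10 multiplies the work by roughly 100, the signature of an order-n^2 algorithm.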

"In logic and computer science one typically works with a finite input and everything is over the integers," says Hansen. "When we do computations in science, like in physics, chemistry or biology, we work with models that are over the continuum - that use real numbers. And not only do we have real numbers, we have an infinite amount of input."

The classical concept of complexity assumes a finite input. But there are many real problems from science that can't be characterised with these classical classes. "These are much, much harder. But ironically, we compute with them on a daily basis." The solvability complexity index is a tool for understanding such problems. Rather than classifying them in terms of the time it will take to solve them, they are classified by the number of constraints that must be placed on the problem in order for it to be computable.

Compressed sensing

Moving from the finite to the infinite also characterises another area of Hansen's work that was recognised by the Leverhulme Prize. Compressed sensing, developed in the mid-2000s, "turned lots of fields upside down, especially medical imaging." One example of its impact is in magnetic resonance imaging (MRI), which gives Fourier samples of the image you are trying to reconstruct. "You have to pay a time unit for each sample and time units are costly. You don't want a patient to stay in the MRI machine for a long time." Compressed sensing was a new way of allowing one to take far fewer samples and still be able to recover the information that was important.
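The core idea behind compressed sensing is that a signal which is sparse (mostly zeros in some basis) can be recovered exactly from far fewer measurements than its length. A minimal sketch, using orthogonal matching pursuit as one standard recovery algorithm (the dimensions, sensing matrix and function names below are illustrative assumptions, not the article's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 128, 40, 4           # signal length, measurements taken, sparsity
x = np.zeros(n)                # a k-sparse signal: only k nonzero entries
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix, m << n
y = A @ x                                  # only m = 40 measurements of a 128-long signal

# Orthogonal matching pursuit: greedily pick the columns of A
# that best explain the measurements, then least-squares fit.
residual = y.copy()
chosen = []
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ residual)))
    chosen.append(j)
    coef, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
    residual = y - A[:, chosen] @ coef

x_hat = np.zeros(n)
x_hat[chosen] = coef
print(np.allclose(x_hat, x, atol=1e-6))    # the sparse signal is recovered
```

The point of the sketch is the ratio: 40 random measurements suffice to reconstruct a 128-sample signal, because the signal's real information content is its 4 nonzero entries. In MRI the same principle lets a scanner take fewer Fourier samples per image.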

The initial theory of compressed sensing was based on a discrete setup: a finite matrix of sensor measurements, and a finite problem to solve. "Again the issue with the sciences is that many of these problems are truly infinite-dimensional." Hansen wanted a theory for how compressed sensing would work in the infinite setting. And although the mathematics is very different from his work in computational mathematics, it came with the same motivation: "the discrete versus the continuum."

Hansen worked with his colleague Ben Adcock to develop a theory for the infinite dimensional case. Their work involved many people, including engineers who worked on the implementation in the real world, and ultimately provides better sampling patterns for applications such as MRI. Compressed sensing was approved by the Food and Drug Administration in the US in June 2017 as a result of work done by the entire compressed sensing community, allowing MRI machine manufacturers to implement this technique in the next generation of machines.

You can read more about other recent Cambridge winners of this prize here.