
Features: Faculty Insights

 

James Fergusson is leading cutting-edge research on the use of artificial intelligence (AI) in academia and industry. As well as being Professor of Theoretical Cosmology in the Department of Applied Mathematics and Theoretical Physics (DAMTP), Fergusson is Director of the Infosys-Cambridge AI Centre and Executive Director of the Data Intensive Science programmes, which include an innovative new Masters programme.

Bridging cutting-edge research in academia and industry

Fergusson's innovative collaboration with the multinational technology and consulting company Infosys is an outstanding example of how the University and industry can work together on cutting-edge research, with real impact and benefits for both sectors. The Infosys-Cambridge AI Centre continues this partnership by providing a gateway for businesses and industry to connect with world-leading researchers from the University.

 "Out of our conversations with Infosys, we realised that a lot of the research challenges we have [in academia] are very similar to the challenges that their customers have," says Fergusson.  For example, data is growing exponentially in scientific research as well as in every industry and sector. Scientists from across all areas, from biology to physics and mathematics, are all thinking about quite similar things to large telecoms or manufacturing companies: everyone wants to know how to get knowledge out of these vast terabytes of data.  And Fergusson says this is illustrative of the way their worlds are coming together.  "[Everyone] wants to develop a real understanding of the system they are trying to study, whether that's their manufacturing supply chain or some particle physics experiment. There's very little boundary between being an AI researcher in academia and being one in industry."

Challenges become opportunities

Of course, industry also faces challenges that rarely arise in academic research, such as regulatory constraints, legacy systems and legal considerations. For example, the sources of scientific data usually don't sue you if you get your analysis wrong, Fergusson drily notes: "If you misclassify a galaxy, the galaxy stays pretty relaxed about that."

Being exposed to industry-specific challenges, such as the need for scalable systems, is particularly enriching for Fergusson and his colleagues. Academic research tends to be more exploratory, working with cutting-edge AI systems crafted specifically for the research needs of a small team of experts: "Academics can deal with quite creaky and fragile systems because it's generally only used by two or three people who are experts in the field."

 

These AI tools developed for academic research could be crucial in meeting real-world industry challenges.  "One of the really interesting things we are talking to companies about is how do you take these ideas from research and build them into robust, reliable systems that can be widely used," says Fergusson. "Industry has vast resources to do things on really big scales that academia would normally struggle with.  It's really interesting for us to learn from experts like Infosys, who have huge teams of software developers who are really good at taking ideas and deploying them in robust, scalable ways." 

Training the next generation of researchers

As well as exploring the limits of what AI can do in industry, Fergusson plays a key role in equipping the next generation of researchers through the innovative Masters programme he developed: the MPhil in Data Intensive Science [link]. Rather than a more traditional Masters course focused on one specific subject, the MPhil, Fergusson says, is intended to be more like vocational training, giving students the full package of skills that researchers in this field will need.

"My great hope is that people who come to us on this Masters take the great training they have from their undergraduate courses, and then use this vocational training on how to use these cutting-edge tools to drive the next round of exciting breakthroughs."  Fergusson thinks recent prizes for AI enabled discoveries, such as the Nobel Prize for the use of AlphaFold to solve protein folding, are just the start.  "I think we'll see more of these AI-powered tools driving breakthroughs in scientific areas."

One thing that makes the MPhil programme so successful is the broad range of students it attracts: nearly 50% women, a wide age range, and students with very different academic and industry backgrounds. "One of the really nice things about the MPhil is that it is a melting pot – a rich cohort of individuals from a lot of different areas. They learn an enormous amount from each other."

Another key to the programme's success is Fergusson's focus on student support and welfare, something recognised by his 2025 Pilkington Prize. As well as his own regular drop-in sessions for students, the MPhil provides team-building days, a psychologist to help students develop a mindset for success, and training in presentation, writing and all the other skills essential for a scientific career. "We really try to give them all of the skills that they need, the academic ones, but also the soft skills that they need to be successful."

Theoretical physics: the interface between theory and data

Fergusson and his colleagues in theoretical physics work at the interface between theory and data. He says you can make progress working on theory alone, but ultimately you need to validate those theories against data; and if you work with experimental data, you need to interpret it to develop a theoretical understanding. "AI can help with both of these sides."

One very recent and exciting area of work for Fergusson and his colleagues is Denario, an end-to-end data analysis AI system. It is a multi-agent system, using about 40 different large language models (the agents) that are trained to do specific tasks. "They then work like a team of very efficient, but not too bright, people. Provided you can break the problem down into simple steps, then you can build up teams which are capable of completing quite sophisticated analysis," says Fergusson.
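
To make that division of labour concrete, here is a minimal sketch of how a small team of specialised LLM agents might be wired together, in the spirit Fergusson describes. The agent roles, the prompts, and the use of the OpenAI client with a particular model are illustrative assumptions only, not details of how Denario itself is built.

```python
# Minimal sketch of a multi-agent analysis pipeline.
# NOTE: illustrative only -- the roles, prompts and choice of LLM backend
# are assumptions, not Denario's actual implementation.

from dataclasses import dataclass
from openai import OpenAI  # any LLM backend could be substituted here

client = OpenAI()  # assumes an API key is configured in the environment


@dataclass
class Agent:
    """One specialised worker: a role prompt plus a single, narrow task."""
    name: str
    role: str

    def run(self, task: str) -> str:
        # Each agent only ever sees its own small, well-defined step.
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # hypothetical model choice
            messages=[
                {"role": "system", "content": self.role},
                {"role": "user", "content": task},
            ],
        )
        return response.choices[0].message.content


# A tiny "team": a planner breaks the problem into steps, an analyst works
# through them one by one, and a writer summarises the results.
planner = Agent("planner", "Break the analysis request into short, numbered steps.")
analyst = Agent("analyst", "Carry out the given analysis step and report the outcome.")
writer = Agent("writer", "Summarise the analysis results as a short report.")


def analyse(dataset_description: str) -> str:
    plan = planner.run(f"Plan an analysis of this data set: {dataset_description}")
    findings = [analyst.run(step) for step in plan.splitlines() if step.strip()]
    return writer.run("\n".join(findings))


if __name__ == "__main__":
    print(analyse("galaxy survey catalogue with positions, redshifts and magnitudes"))
```

The point of the design is the one Fergusson makes: each agent only ever handles a simple, well-defined step, and the sophistication comes from chaining many such steps together.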

Denario doesn't replace researchers, but it can massively scale up what they can do. "We talk about artificial intelligence but these [agents] are not intelligent: they can't come up with any new brilliant ideas," says Fergusson. "What they can do is do things the LLMs have seen in training, quickly, efficiently and at scale."

A recent test of the system's limits gave Denario data sets from across all parts of science and had it automatically produce 80 academic papers, which were then reviewed by researchers. "A reasonable chunk of them was not very interesting, but some really good papers came out of it," says Fergusson. "Whereas a human could only try two or three ways to analyse the data, Denario can explore all the possible methods."

What will researchers discover with Denario? It will be fascinating to see what happens next. With this and all their work, Fergusson and his colleagues are at the forefront of exploring how AI can support researchers and industry in overcoming traditional challenges and limitations.