Hello Sylvie. First of all, thank you for agreeing to this interview. Could you introduce yourself and tell us a little about your background?
After high school, I entered a preparatory class at the Lycée Charlemagne, in the Marais district. I was then admitted to the École Normale Supérieure de Sèvres, which at the time was the ENS for young women. It was physics that sparked my interest in research. My initial ambition was to work in astronomy: it was the sky that attracted me. As I explored further, I became interested in meteorology and climate. At the time, people were talking about desertification in the Sahel, and climate change was not yet on the agenda. I was asked to work on the climate of the last glacial maximum, some 20,000 years ago, which fit well with my interest in prehistory. That became the subject of my thesis.
I joined the CNRS in 1983, at the Laboratoire de Météorologie Dynamique. I then took part, with Jean Jouzel, in creating the Laboratoire de Modélisation du Climat et de l'Environnement at the CEA, the ancestor, so to speak, of the laboratory in which I am currently based, and I set up a team there focused on modeling past climates.
Then, in the 2000s, I had the opportunity to move into the management of the CNRS's Institut National des Sciences de l'Univers (INSU), first as deputy director in charge of the ocean-atmosphere domain and then as director of INSU. Even in this research management role, I remained interested in supercomputing, a key tool for many fields of research. Research also led me to participate in the work of the IPCC. I have been Director of Research Emeritus since October 1, 2023.
What was the driving force behind your career path and how did you come to high-performance computing?
What really captivated me was the strong link between climate science and society, based on climate modeling. It was during my thesis in the climate field that I started using supercomputers, the first CRAYs at the time. At first, my knowledge of computers was limited, and we had very little training, just a brief introduction. In reality, I had to learn on the job. But I loved working in numerical calculation. There's something very playful about solving equations numerically and tracking down bugs. I have a certain nostalgia for that period when I could devote time to using these superb computing machines.
So the use of computation in climate research has been at the heart of your work?
Initially, I was interested in simulations of the climate of twenty thousand years ago, in particular by developing ways to represent, within the models, the indicators that are observed in ice cores from Antarctica and Greenland.
I wanted to study the relationship between paleoclimatic indicators, such as water isotopes or desert dust, and climate, using the model as a mock-up, a means of studying how this relationship is affected by a different climate. Why, for example, is there more dust in Antarctica during the glacial maximum? Where does this dust come from? Is it linked to a drier climate, a larger desert area, or stronger winds?
In addition, past temperatures are estimated from water isotopes measured in the ice. In fact, as water vapor is transported from the tropics to the poles, it becomes depleted in heavy isotopes during successive precipitations, and the colder the temperature, the fewer heavy isotopes are found in the ice. But is this thermometer, calibrated to the current climate, valid for past climates? Climate models are one way of validating this approach. This thesis work, based on the use of supercomputing, led me to explore many aspects of the model, which is fortunate, as one doesn't always have the opportunity to do so during the course of a thesis.
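To make the distillation idea concrete, here is a minimal sketch of Rayleigh fractionation, the textbook process behind the isotopic thermometer. It is purely illustrative, not the model used in the thesis work: the fractionation factor, the source conditions, and the Magnus formula for saturation vapor pressure are all simplifying assumptions.

```python
# A minimal sketch of Rayleigh distillation (illustrative only, not the
# model used in the thesis work): as an air mass cools on its way poleward,
# vapor condenses out and the remainder is progressively depleted in the
# heavy isotope H2(18)O, so colder sites receive more depleted snow.

import numpy as np

ALPHA = 1.0098        # assumed equilibrium fractionation factor (liquid/vapor)
R_SMOW = 2005.2e-6    # 18O/16O ratio of Standard Mean Ocean Water

def saturation_vapor_pressure(t_celsius):
    """Approximate saturation vapor pressure (hPa), Magnus formula."""
    return 6.112 * np.exp(17.67 * t_celsius / (t_celsius + 243.5))

def delta18o_of_vapor(t_site, t_source=25.0):
    """delta-18O (permil) of the vapor left after cooling to t_site.

    The remaining vapor fraction f is taken proportional to the drop in
    saturation vapor pressure (a crude Clausius-Clapeyron argument), and
    Rayleigh's law gives R = R0 * f**(ALPHA - 1).
    """
    f = saturation_vapor_pressure(t_site) / saturation_vapor_pressure(t_source)
    r0 = R_SMOW * 0.99   # assume source vapor slightly depleted vs. ocean water
    r = r0 * f ** (ALPHA - 1.0)
    return (r / R_SMOW - 1.0) * 1000.0

for t in [0, -10, -20, -30, -40]:
    print(f"site at {t:4d} C -> vapor delta-18O ~ {delta18o_of_vapor(t):6.1f} permil")
```

Lower site temperatures give more negative delta-18O values, which is the thermometer; whether its present-day calibration still holds under glacial conditions is exactly the question the models were used to test.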
What approaches to computation do you use in climate science?
The study of climate is based on two approaches: observation and modeling. Observation relies on continuous measurements or field campaigns, of the atmosphere, the oceans and even, for past climates, the ice. Models are a kind of scale model of how the atmosphere and oceans work. They enable us to understand how the climate works and the interactions between its various components, to study the climate system as a whole and, above all, to play with it, in the sense that we can simulate the impact of any given process or condition and see how the system reacts. Computation is the answer to the impossibility of experimenting on the climate system itself. What's more, if we want to study how the climate might evolve in the future, we can only do so through numerical simulation. It is a decisive key to building our knowledge of the climate and its evolution.
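As a toy illustration of what "playing with" the system means, here is a zero-dimensional energy balance model, a deliberately crude sketch bearing no resemblance to the coupled models discussed in this interview: the planet's mean temperature adjusts until emitted infrared balances absorbed sunlight, and one can perturb it with a greenhouse forcing and watch the response. The emissivity, heat capacity and forcing values are illustrative assumptions.

```python
# A toy zero-dimensional energy balance model (a deliberately crude sketch,
# nothing like the coupled models discussed here): global mean temperature
# adjusts until emitted infrared balances absorbed sunlight, and we can
# "experiment" on it by adding a greenhouse forcing.

S0 = 1361.0        # solar constant (W/m^2)
ALBEDO = 0.30      # planetary albedo
SIGMA = 5.67e-8    # Stefan-Boltzmann constant (W/m^2/K^4)
EPSILON = 0.61     # assumed effective emissivity standing in for the greenhouse effect
C = 4.0e8          # assumed heat capacity (J/m^2/K), roughly 100 m of ocean

def step(temp, forcing=0.0, dt=86400.0 * 30):
    """Advance temperature (K) by one monthly step with an extra forcing (W/m^2)."""
    absorbed = S0 / 4.0 * (1.0 - ALBEDO) + forcing
    emitted = EPSILON * SIGMA * temp ** 4
    return temp + dt * (absorbed - emitted) / C

temp = 288.0
for _ in range(12 * 200):          # spin up two centuries to the unforced balance
    temp = step(temp)
baseline = temp
for _ in range(12 * 200):          # then switch on a ~2xCO2 forcing (3.7 W/m^2)
    temp = step(temp, forcing=3.7)
print(f"equilibrium warming for a 2xCO2-like forcing: {temp - baseline:.2f} K")
```

With no feedbacks, this toy warms by only about 1 K for a doubled-CO2 forcing; real models, which resolve water vapor, cloud and ice feedbacks, give substantially more, which is precisely why the full complexity matters.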
This of course applies to all scientific communities beyond climate?
When I was director of INSU, I supported the creation of a computing foresight committee so that we could think about the needs of scientific fields and the challenges ahead. This concerned different communities, not just those linked to climate. We need supercomputing in many fields, including astrophysics, chemistry, biology and physics. I was convinced, and still am, of the importance of these communities expressing their needs for HPC. I'd already had this experience through my participation in foresight exercises at INSU and CEA. So I made sure that this approach was also adopted across the whole of CNRS, in order to overcome the boundaries inherent in the division of scientific fields. This proved to be very useful, as the analysis we carried out helped to promote HPC at national and European level. We were at a time when we felt that we were falling behind the rest of the world in terms of computing resources. This analysis helped fuel the thinking that led to GENCI at national level and PRACE at European level.
More recently, I also chaired the GENCI evaluation committee from 2016 to 2021. In this context, most of the work is done by the chairs of the thematic committees. The chair's role is simply to make sure that everything works together, to oil the wheels so that the various communities can make the best possible use of the computing resources available at national level. These thematic committees provide important insight into the needs of the various communities, so that we can prepare future machines. I'm continuing my involvement in HPC by chairing ORAP, which aims to promote high-performance computing and keep communities informed.
Computing has been a common thread throughout my career: first as a user, then in a foresight and steering role, making sure that scientific modeling communities do indeed have the means to work.
Do you think that providing communities with computing resources and training remains decisive today for climate knowledge?
Computational simulation has been at the heart of the evidence for global warming. Without these calculations, we wouldn't have been able to prove that global warming is linked to the increase in greenhouse gases. Simulation is essential to move from observing global warming to demonstrating the role of greenhouse gases. It is also crucial for anticipating future climate change.
Climate modeling needs intensive computing resources to move forward in several directions. We need to further refine spatial resolution to represent certain processes. We also need long simulations to capture slow changes: to study past climates, we must simulate not just a few hundred years but several thousand, or even the ten thousand years of the last great deglaciation. Because of the chaotic nature of the climate system, we also need to run ensembles of simulations to represent the system's internal variability: slightly modified initial conditions can lead to different paths in the evolution of the system. And we need to represent the complexity of the system, as physics alone is not enough. We also integrate atmospheric chemistry mechanisms, and representing biology is just as important for the carbon cycle, which involves interactions with vegetation and marine biology. The success of these advances depends crucially on our computational resources.
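The sensitivity to initial conditions mentioned here can be illustrated with the Lorenz-63 system, the classic chaotic toy. It is a three-variable caricature, not a climate model, and the perturbation size and run length below are arbitrary choices: members started a hundred-millionth apart end up scattered, so only the statistics of the ensemble are meaningful.

```python
# Why ensembles: the Lorenz-63 system (a classic chaotic toy, not a climate
# model) integrated from ten initial states that differ by only ~1e-8.
# The trajectories diverge completely, so a single run tells us little and
# only the ensemble's statistics are meaningful.

import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 1.0])
members = [base + rng.normal(scale=1e-8, size=3) for _ in range(10)]

for _ in range(4000):                      # integrate 40 model time units
    members = [lorenz_step(m) for m in members]

spread = np.std([m[0] for m in members])
print(f"ensemble spread in x after 40 time units: {spread:.2f}")
```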
Exascale will therefore contribute to meeting this scientific need?
To me, it's undeniable that we need to turn to more powerful, exascale-class machines, in the hope of better representing the complexity of the climate system and helping to anticipate future changes. At present, however, we find ourselves in a difficult situation. The machines are becoming more complex, with more elaborate architectures, and the technological leap required is very large and demands a great deal of expertise. This is all the more true given that climate modeling relies on some seven coupled codes, not just one, and that the computation chains are heavy and not easily ported from one machine to another. At the same time, we face the dilemma of having to reduce the energy consumption and greenhouse gas emissions of digital computing and supercomputing! A compromise has to be found. We need computing to anticipate, adapt and prepare. If we restrict computing capacity too severely, we will be blinding ourselves, and that would be a disservice to society. All the more so in a geopolitically complex world, where knowledge built from data remains a key asset.
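To give an idea of why a coupled system is heavy to port, here is a skeletal coupling loop. It is purely illustrative, not the code of any real climate model, and the component names and relaxation "physics" are placeholders: each component advances its own state and exchanges boundary fields through a coupler at every coupling step.

```python
# A skeletal coupling loop, purely illustrative (not the code of any real
# climate model): several components, each with its own state, advance in
# turn and exchange boundary fields through a coupler at every coupling
# step. In a real system each component is a full code on its own grid,
# often on its own processors, which is what makes the chain hard to port.

class Component:
    def __init__(self, name, init):
        self.name = name
        self.state = float(init)

    def advance(self, dt, boundary_input):
        # Stand-in for the component's physics: relax toward its input.
        self.state += dt * (boundary_input - self.state)

    def export_field(self):
        return self.state

def run_coupled(components, n_steps=20, dt=0.5):
    """Advance all components, exchanging fields once per coupling step."""
    fields = {c.name: c.export_field() for c in components}
    for _ in range(n_steps):
        for c in components:
            # Each component sees the mean field exported by the others.
            others = [fields[o.name] for o in components if o is not c]
            c.advance(dt, sum(others) / len(others))
        fields = {c.name: c.export_field() for c in components}
    return fields

names = ["atmosphere", "ocean", "sea_ice", "land", "chemistry", "carbon", "ice_sheets"]
print(run_coupled([Component(n, i) for i, n in enumerate(names)]))
```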
You mention international issues. Could you tell us about your involvement in the work of the IPCC?
I first became involved with the IPCC as part of its third report, published in 2001, in the chapter on model evaluation. My involvement was closely linked to having coordinated a program to compare paleoclimate simulation models. Within the framework of the IPCC, the aim was to assess the ability of models to simulate a climate different from the current one. To do this, it was important to carry out comparable simulations, i.e. with the same experimental conditions, so as to distinguish between model-dependent results and robust, model-independent ones. I was also able to contribute to the next two reports in different roles. The IPCC plays a key role in establishing the state of knowledge. It relies on a huge body of work by the scientific community, which constitutes a mine of knowledge for decision-makers and researchers alike. It enables us to take a broader view than our own small field of expertise.
Going beyond our "little domains" means widening the field of possibilities, doesn't it? On this International Day of Women and Girls in Science, how could we open up those possibilities for all those who today don't see themselves in STEM, or who don't want to or can't find their place there?
When it comes to the place of women, the situation in climate research is not the same as in HPC. While it's true that the computing community includes few women, that's not the case for the climate community, perhaps because of the strong link between climate and society. It seems to me that highlighting societal issues can help spark young women's interest in the world of computing, and more broadly in science. Putting computing at the service of science, and science at the service of society, can be an asset in bringing more women into the scientific world and attracting them to HPC!
It's important that the girls and women who read these lines know that science is not reduced to purely technical questions, and that they will be able to produce knowledge whose usefulness extends beyond their own field. We're touching here on a broader issue, one that concerns women as well as men, in fact the whole of society: confidence in science. Restoring everyone's confidence in science, and their desire to pursue a scientific career, is a challenge that must mobilize us.