Interview

Can supercomputing technology help us predict how climate change will unfold?

Interview with Francisco Doblas-Reyes, professor at ICREA and director of the Earth Sciences Department at the BSC

Tags: 'Artificial intelligence' 'Big data' 'Digital Future Society'


Francisco Javier Doblas-Reyes is the Director of the Earth Sciences Department at Barcelona Supercomputing Center - Centro Nacional de Supercomputación (BSC-CNS). He led the Climate Forecast Unit at the Institut Català de Ciències del Clima (IC3) from 2010 to 2015. The Department hosts more than 100 engineers, physicists, mathematicians and social scientists who apply the latest developments in supercomputing and data analysis to provide the best information and services on climate and air quality. Author of more than 170 peer-reviewed papers, Francisco is a member of several international scientific committees and supervises several postdocs, engineers and PhD students.

To become the Director of the Earth Sciences Department at Barcelona Supercomputing Center you had to go through a tough selection process, with an international evaluation committee of front-line experts, in competition with 23 candidates from all over the world. How did you first become interested in climate variability?

I became interested in climate variability a long time ago. I was doing my PhD in Madrid and decided to experience what research would be like in France. I had a fantastic experience. I joined a team of people who had developed global models to understand the climate, and it was really motivating to work with them because they were performing cutting-edge research, something I couldn't have done here in Spain. At the time, climate change was not the buzzword it is now. It was very exciting because I was dealing with an aspect of nature that had immediate implications and was of great interest to society.

You manage a broad interdisciplinary team. Can you give us an overview of your work? What are your main research areas?

The BSC, Barcelona Supercomputing Center, is a public consortium funded by the Spanish government and the Generalitat, and linked to the Universitat Politècnica de Catalunya. The core funding we receive is devoted mainly to providing a platform for scientists, in Spain and in Europe, to run their own computing problems. We are around 700 people working on different aspects of research, organized in different departments. We work on problems in life sciences, such as the design of vaccines using computers; in computer sciences, asking how to develop these kinds of machines and make them more efficient and more accessible; in computer engineering, using a machine like this one to solve problems such as the installation of renewable power plants or the design of a car; and finally in earth sciences: climate, weather and air quality. This is the department that I lead. Our work requires both supercomputing and access to big data sets to address needs that we identify in both society and the private sector.

What are you working on right now?

We have many different projects ongoing right now in my department. We're addressing the issue of air pollution in cities, and we also work on global climate problems. One of the problems that requires the largest amount of resources is the prediction of climate change for the near future. It's quite common to hear about the evolution of climate change on a global scale at the end of the century, but we have found that there is a lot of interest in what's going to happen in the next 10 or 20 years, because this is the timescale on which we can take action. It's a timescale that interests renewable energy producers, wind farms, agriculture and food production. For these purposes, what will happen at the end of the century is basically irrelevant, because these services have to be provided now. These kinds of problems tend to be very sensitive to climate variability, and they affect societies all over the world: in tropical regions, in the mid-latitudes where we live, and even in the Arctic this is quite an important problem.


We have developed a global climate model. It runs on a machine like the one we have here, MareNostrum 4. The model represents the evolution in time of the ocean, the atmosphere, the sea ice and the biosphere, with all the components talking to each other. This is fundamental because we know that changes in soil moisture in continental areas affect precipitation farther east, that changes in the ocean affect the atmosphere, and so on. The climate system is a complex system, interacting continuously.


We have to reproduce what happens on a global scale, in many different systems and for a very long time. So this is a very intense computational problem, which at the same time produces a huge amount of data. We are predicting what the climate is going to be in the next few years because both the public administration and the private sector are interested in this information. So we transform the data into climate information that can be ingested by wind farm investors, food production managers and many other sectors. We try to identify opportunities that can be brought to the fore.


In 2013, The Fifth Assessment Report of the Intergovernmental Panel on Climate Change used the word ‘unequivocal’ to describe climate change. Assessments showed that the climate was changing as a result of human activities. What impact did this report have, if any? Do we have any evidence of a curb in activity?

The Fifth Assessment Report of the Intergovernmental Panel on Climate Change was published in 2013. Sponsored by the United Nations, hundreds of scientists participated, assessing the knowledge available at the time of publication from the research community at large. They focused on the physical aspects of climate change, as well as on the adaptation and mitigation measures that could reduce climate change of anthropogenic origin.


The 2013 report caught the attention of the public because it made it very clear that climate change is not something that will happen at the end of the century: it is happening now, and it is unequivocally associated with human activities. Those two messages together created enough momentum for the Paris Agreement to be taken more seriously by governments. The targets set by the Paris Agreement had a lot of impact, raising awareness of how important it is to set thresholds. The very ambitious 1.5 degree target is critical in the sense that we are already at one degree of warming with respect to pre-industrial levels, so we are just half a degree away from reaching that first target. It's very important to be aware that dangerous thresholds are in the near future. The IPCC offers a framework for scientists to be policy relevant, to provide information that is usable by policy makers.

What will climate change look like over the next ten years?

We have estimates of climate change for the next 10 years. This is part of an initiative that the World Meteorological Organization has created and sponsored, in which the BSC plays a critical part in the provision of data. The data we have gathered from the simulations performed this year tell us that the one degree level of warming with respect to pre-industrial levels has already been established. There is a probability of 10 to 20% that one of the next 10 years will have a mean temperature 1.5 degrees warmer than in the 19th century. The warmest years on record over the last 10 years have been 1.1 degrees above pre-industrial levels. So this means we may see fast global warming in the next 10 years. It doesn't necessarily have to happen, but it could, and the risk is high.

You are currently working on a chapter for the Sixth Assessment Report of the Intergovernmental Panel on Climate Change to be published at the end of 2021. Can you tell us about this? What has changed since the last assessment?

The Sixth Assessment Report of the IPCC is under preparation. This report has several aspects, at least as far as the physical aspects of climate are concerned, that are different from previous reports. The first is that we have more observational evidence of the anthropogenic origin of the global warming that is happening. The second is that the warming could be higher than we originally expected. If we translate the possibility of reaching a higher temperature at the end of the century into the time we have left for action, it means that the time for action is shorter than we expected. What will happen in 20, 30 or 50 years is almost impossible to know, because there is an interaction between the physical climate system and the socio-economic activities of humankind. But taking into account different scenarios of socio-economic development, the time we have before reaching the 1.5 degree threshold that was set as a minimum target for the Paris Agreement is actually shorter than we expected.


Another difference between the Sixth Assessment Report and the Fifth is that regions matter. People live in a location; they have interests in the regions where they produce food or where their investments are located, but nobody lives in a delocalized, idealized global mean temperature. These reports are making an effort to bring the climate information available in the literature closer to where the action can take place, where people live, where assets are located. How do global warming and these thresholds translate into the variables that are relevant for the places where people live? Temperature, precipitation, wind, local sea level rise… What does it really mean for cities, coastal cities like Barcelona, for instance? This is one of the big differences between this report and the previous ones.


Barcelona Supercomputing Center is the national supercomputing centre in Spain, specialising in high performance computing and managing MareNostrum, one of the most powerful supercomputers in Europe. 

What is a supercomputer? Is there a limit to computer sophistication?

A supercomputer is basically an infrastructure where one can find the latest technology in microprocessors, connected by very fast interconnection networks, with memory and disk storage quickly connected to the rest of the elements of the machine. Supercomputers tend to have tens of thousands of processors, similar to the ones we have in our laptops and desktops, but with better performance and better connections between the different elements of the computer. What makes a supercomputer very fast, able to perform experiments that we could not perform with 10,000 desktops, is basically this interconnection: the ability of all those different processors to talk to each other in a very fast, very efficient way.


As for whether there is a limit to the sophistication of a supercomputer, that is quite difficult to answer, because if I said no I would be proven wrong in a very short time. The technology we use in these computers is evolving very fast. For instance, here at BSC we're working on the development of a new microprocessor targeting specific applications, in particular autonomous cars and other autonomous machines and devices. These microprocessors will be an integral component of future supercomputers. Future supercomputers will have a characteristic that makes them quite different from the machines we have here: they will be heterogeneous, with many different types of processors inside that will allow them to target different parts of the models and codes that we write. What we'll have to address in the future is how to evolve our codes to make the best use of that investment.

How accurate are the predictions generated by the system models and how can these predictions be turned into operations?

We try to reproduce many different processes from many different components of the climate system, the ocean, the atmosphere, everything talking to each other, at the highest resolution possible: many different points around the globe, millions, even tens of millions of points, going all the way down to the depths of the ocean and up to the top of the atmosphere. We try to increase the number of points where we perform the calculations, because we have found that by increasing the resolution we make the simulation of the physical processes more realistic. Having a more realistic representation of those physical processes gives us hope of more precision in our predictions and in our projections for the end of the century.


This is a service that institutions like the BSC provide to the global community in collaboration with the services of Canada, the UK and Germany; we are the four global producing centers of this kind of information. Every year we run simulations of what the climate is going to be like in the next 10 years. We know that the precision of those predictions will be limited by our ability to observe the processes that are taking place, but comparing our past predictions with the best observations we can get, from satellites and from buoys in the ocean, makes us think that predictions of temperature, in particular in summer in places like Europe and North America, are fairly precise and usable by, for instance, the institutions in charge of food security in Europe.


This global climate model has limited precision, in the sense that it cannot tell us exactly what is going to happen in the next five years, but it is much better than guessing that the future is going to be like the past, because we know that is not the case. The climate is changing. It's changing now.

What's your relationship with other supercomputing centers?

Collaboration is critical when it comes to climate, weather and air quality research. Meteorology was one of the first fields to establish open access to data, in the 1950s or probably earlier. The reason is that climate and weather are not local; they are global problems. One cannot really predict what the weather is going to be like in Barcelona for the next five days without knowing what happens in the North Atlantic, or with the high and low pressures near Greenland. The same happens with the climate: the climate of the coming years depends on what is happening in remote areas like the equatorial Pacific or even central Asia. So these enterprises are only possible if data is exchanged quickly, openly and freely. The World Meteorological Organization established a system to exchange observations of the climate system many decades ago.

As supercomputing advances, power requirements are approaching those of a small city. How can we mitigate the climate impact of emerging technologies?

Mitigating the impact of new technologies is a really interesting question, in the sense that most of the new technologies that are expanding require energy. 5G is a good example, and it's only one of them. Supercomputers are another: they need more and more energy to power the machines and also to cool them down. Beyond the fact that we have to keep investing in energy efficiency, there is an obvious goal that society should consider: every new technology should come along with an equivalent investment in green energy production, in solar energy, in biomass resources, in wind energy.

In terms of environmental action, where is Spain when compared to other countries across Europe?

Spain has a relatively poor record in terms of policy making, and even research, on climate change. This is unfortunate, but it's probably the result of the lack of tradition in this field and of the lack of investment by successive governments. Spain still lacks a law on climate change. There is a plan, but there is no law yet, while many other countries, particularly in Northern Europe, have had one for years. Spain also lacks a consultative body for policymakers to make the most of the climate information that is now available in the public domain, something that other countries, the UK for instance, have had for probably 15 years now.


Spain is also missing serious assessments of the economic impacts of climate change. By serious I mean a proper assessment of how different sectors, in both the public and private spheres, are going to suffer from what is going to happen in the next five, 10, 50 or 100 years, starting from now. So we have a long way to go. The good thing about Spain is that the degree of awareness in society is quite high, the level of skepticism is quite low, and the interest that young people show in this problem is also quite high, comparable to what one can find in most other European countries.

What is Spain currently doing to foster environmental science and scientific climate research? Which are the leading initiatives?

There are quite a few initiatives in terms of controlling, measuring and mitigating the impact of air pollution and, increasingly, of noise pollution in cities. This is very close to society and the citizens, which means there is more awareness and higher demand for this kind of information on a timely basis. In terms of climate change there are very few initiatives in the country. There are quite a few on the ecosystem impacts of climate change, and that is very relevant because ecosystem services are one of the emerging areas of policy making concerning the impacts of climate change. But in terms of what Spain can do to produce better climate information at the national and regional scales, we are only just starting, and investment is very poor given the size of the problem we are facing.

Do you think programs like Digital Future Society are important in the sense of raising awareness?

Digital Future Society is very important in the sense that it targets the impact that digital and emerging technologies might have. We need these kinds of initiatives, emerging from the digital world itself, to make societal problems like the ones I work on, climate change and air quality, understandable to that world. Only an internal initiative can really translate what researchers like me are doing into a vocabulary and a language that this community understands.