The African Rainfall Project

Nick van de Giesen and Camille Le Coz from Delft University of Technology, the Netherlands, give us some detailed background on the African Rainfall Project


In many parts of the world, weather predictions have improved remarkably over the past decades. This is largely due to the availability of supercomputers for numerical weather prediction and of weather satellites. The first computer-based weather prediction was performed in 1950 on the world’s first general-purpose computer, the ENIAC, with the famous John von Neumann as one of the investigators. It took them 24 hours to predict 24 hours of weather over a relatively small domain. Interestingly, they mentioned that one cause of the prediction errors found was “too large a space increment”, in other words, the model was too coarse.

In Africa, weather predictions are, unfortunately, still not very good. When it comes to rainfall, forecasts for tomorrow are as (un)reliable as ten-day forecasts in North America or Europe. This is mainly due to two facts. First, the ground measurement network in Africa is about one hundred times sparser than in, say, Europe. If one does not know today’s state of the atmosphere, it is very difficult to predict tomorrow’s state. Although satellites are helpful, they cannot “see” essential variables such as barometric pressure. With the TAHMO project, we try to fill that gap. The second fact is that most rain falls in so-called convective storms, similar to the heavy summer rainstorms in temperate climates. These are notoriously difficult to predict because they are very localized and develop over scales of 5-20 kilometers. This implies that the numerical models have to be very fine as well. The global models that provide the basis for all forecasts in Africa run on a much coarser grid, so we can say that, just as in 1950, there is “too large a space increment.”

One can run weather models on finer grids, but this is computationally quite demanding. The problem is four-dimensional: time plus the three spatial dimensions. One cannot simply refine one dimension; they have to be refined in tandem. So if one wants to halve the grid size, one needs sixteen times more computing power. The famous Moore’s Law states that the number of transistors on a chip doubles about every two years. That means that if we simply waited for technical improvements, we would have to wait eight years to double the resolution of our weather models.
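The arithmetic behind this can be sketched in a few lines. This is a back-of-the-envelope illustration, not part of the project's actual model code: halving the grid spacing doubles the work in each of the three spatial dimensions and, because the time step must shrink along with the grid, in time as well, giving 2^4 = 16 times the work, which Moore's-law doubling supplies in roughly eight years.

```python
import math

def cost_factor(refinement: float, dims: int = 4) -> float:
    """Relative compute cost of refining the grid by `refinement`
    in every dimension (3 spatial dimensions plus time)."""
    return refinement ** dims

def years_to_catch_up(cost: float, doubling_years: float = 2.0) -> float:
    """Years of Moore's-law doubling needed to supply `cost` times
    more computing power."""
    return doubling_years * math.log2(cost)

halved = cost_factor(2)           # halving the grid size: 2**4 = 16x the work
print(halved)                     # 16
print(years_to_catch_up(halved))  # 8.0 years at one doubling every two years
```

The same functions show why volunteer computing is attractive: a grid of thousands of machines delivers that factor immediately instead of in years.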

This, of course, is where the World Community Grid comes in. By using the idle computing power of thousands of machines, it becomes possible to run weather simulations at a resolution fine enough to resolve convective storms. In fact, we run weather simulations at a scale of 1 km over the whole of sub-Saharan Africa for a full year. We have now completed 70% of the total number of simulations. Soon, the results will also be available on an interactive website. Stay tuned!

Heavy rain in Africa. Photo: Jan Friesen