The Simulations Underpinning the Government's Response to Covid-19 Explained

The Covid-19 pandemic presents the biggest challenge in living memory to governments worldwide, and epidemiologists are rushing to provide guidance to those in power on how best to deal with this catastrophe. How do they go about formulating the models and simulations that help predict how this pandemic will evolve? How accurate are they? How effective are the social distancing measures they call for?

Big data science and modelling is normally something you only have first-hand experience of if you spend your time studying fluid dynamics, galaxy evolution, the electronic structure of interacting particles, or some other scientific pursuit. That was until Covid-19 brought the modelling of infectious diseases to the centre of government policy and our lives.

A few weeks ago, research from Imperial College London indicated that the United Kingdom's health service would soon be overwhelmed with severe cases of Covid-19 and that the country might face more than 500,000 deaths, causing a change in government policy [1]. Prior to this, government officials had based their approach on allowing the disease to spread while protecting the oldest in society, motivated by the idea that large numbers of infected people would recover and provide herd immunity for the rest [2]. Following the release of this report, a 'focusing of the minds' led the government to impose stringent social distancing measures, which remain in place [3]. Given the relevance of such research, I believe it is imperative that we all have some understanding of how these models are formulated.

The details of each model are unique to each research group; in general, however, they are built around understanding how quickly people move between three main states: individuals are susceptible (S) to the virus, become infected (I), and then either recover (R) or die [3]. To build a model precise and accurate enough for governments to decide which social distancing measures need to be introduced, modellers split the population into small groups.
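The three states above form the classic S-I-R ('susceptible-infected-removed') compartmental model. As a rough sketch – with a transmission rate, recovery rate, and population chosen purely for illustration, not taken from any of the cited reports – it can be stepped forward in time like this:

```python
# A bare-bones S-I-R model stepped forward with Euler integration.
# beta (transmission rate), gamma (recovery rate), and the population
# size are illustrative guesses, not values from the cited reports.

def sir_simulation(beta=0.3, gamma=0.1, population=1_000_000,
                   initial_infected=10, days=200, dt=0.1):
    """Return a list of daily (S, I, R) values."""
    s = population - initial_infected
    i = float(initial_infected)
    r = 0.0
    history = []
    steps_per_day = int(round(1 / dt))
    for _ in range(days):
        history.append((s, i, r))
        for _ in range(steps_per_day):
            new_infections = beta * s * i / population * dt
            new_recoveries = gamma * i * dt
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
    return history

history = sir_simulation()
peak_infected = max(inf for _, inf, _ in history)
```

The ratio of the transmission rate to the recovery rate (0.3/0.1 = 3 here) determines whether each infected person passes the virus on to more or fewer than one other person, and hence whether the outbreak grows or fades.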

By gathering detailed information on population size and density, demographics such as age, transport links, the size of social networks, and health-care provision (as was done by the London School of Hygiene & Tropical Medicine (LSHTM) and others [4]), modellers can build a virtual copy of an area. Differential equations are then used to govern the movements and interactions of population groups in space and time. If you are interested in how these equations are derived and used, Tian et al. [5] provide a succinct explanation of the power laws involved, though it is worth noting that they specifically look at Germany. Once this modelling is completed, the researchers seed this constructed world with an infection and watch how things unfold.
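To give a flavour of how splitting the population into groups works, here is a deliberately crude two-group version of the same idea: an invented mixing matrix couples the groups, and the infection is seeded in one of them. All numbers here (group sizes, rates, mixing entries) are made up for illustration and are far simpler than the demographic detail the real models use:

```python
# Two population groups coupled by an assumed (invented) mixing matrix.
MIXING = [[0.8, 0.2],   # how strongly group 0 mixes with groups 0 and 1
          [0.2, 0.5]]   # how strongly group 1 mixes with groups 0 and 1

def seeded_run(populations=(600_000.0, 400_000.0), beta=0.4, gamma=0.1,
               seed_group=0, seed_size=10, days=300):
    """Seed an infection in one group and step both forward day by day."""
    n = len(populations)
    s = list(populations)
    i = [0.0] * n
    r = [0.0] * n
    s[seed_group] -= seed_size
    i[seed_group] += seed_size
    for _ in range(days):
        # Force of infection on each group, summed over the groups it meets.
        force = [beta * sum(MIXING[g][h] * i[h] / populations[h]
                            for h in range(n))
                 for g in range(n)]
        for g in range(n):
            new_inf = force[g] * s[g]
            new_rec = gamma * i[g]
            s[g] -= new_inf
            i[g] += new_inf - new_rec
            r[g] += new_rec
    return s, i, r

s_end, i_end, r_end = seeded_run()
```

Even though only group 0 is seeded, the off-diagonal mixing terms carry the infection into group 1 – which is exactly what lets modellers ask how an outbreak spreads between, say, age bands or regions.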

Doing this requires another important parameter: the basic reproduction number (R0) – in short, the number of people to whom one infected person will pass the virus. At the start of a pandemic this number can only be loosely estimated, with a large confidence interval. By piecing together the virus's basic properties from incomplete information from China, as well as data from the previous MERS and SARS outbreaks, modellers at Imperial estimated an R0 value of 2.3(0.3) [1]. A deterministic simulation using this parameter will always give the same result; a stochastic one, with in-built randomness, will not. When a stochastic model is run several times, a range of likely outcomes is produced, from which an average value with confidence intervals can be reported. Once a simulation has been completed, it can be repeated with certain variables changed, to see what the outcome would be with a reduction in social mixing and how R0 would be affected. The aim is to achieve R0 < 1, and the necessary social distancing measures can then be imposed by the government as soon as possible.
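The stochastic element can be illustrated with a simple branching process: every case makes a fixed number of contacts and infects each with a probability chosen so that the average number of onward infections is R0, and running it many times gives a spread of outbreak sizes rather than one answer. The value 2.3 echoes the estimate quoted above, but the contact structure and all other numbers are invented for this toy – it is not the Imperial model:

```python
import random
import statistics

def outbreak_size(r0=2.3, initial_cases=5, generations=7, rng=None):
    """Total cases after a fixed number of transmission generations."""
    rng = rng or random
    cases = total = initial_cases
    for _ in range(generations):
        # Each case makes 10 contacts; each contact is infected with
        # probability r0/10, so on average each case infects r0 people.
        new_cases = sum(1 for _ in range(cases * 10)
                        if rng.random() < r0 / 10)
        total += new_cases
        cases = new_cases
    return total

rng = random.Random(42)                 # fixed seed for reproducibility
uncontrolled = [outbreak_size(r0=2.3, rng=rng) for _ in range(200)]
controlled = [outbreak_size(r0=0.9, rng=rng) for _ in range(200)]
mean_size = statistics.mean(uncontrolled)
```

Comparing the two batches of runs shows the point of the social distancing measures: with R0 pushed below 1, almost every simulated outbreak sputters out instead of growing.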

In the UK, the Imperial paper was released on the 16th of March; however, the government did not impose a strict lockdown until the 23rd of March [6]. The author of the Imperial paper stated on the 25th of March that he was “reasonably confident” that total deaths in the United Kingdom could be held below 20,000. While mercifully lower than the 500,000 deaths predicted if no action were taken, this is still considerably higher than the number of deaths that would have occurred had the government acted sooner – see table four of reference [1].

As scientists discover more about the virus, the key variables continue to be updated. A report on the 26th of March from the same group, looking at the global impact of Covid-19, altered the R0 value to 2.9(0.4) [7], whilst a report on the 30th of March changed it to 3.9(0.8) [8].

Given that these numbers have changed somewhat, it is perhaps prudent to question how valid these models are. Unfortunately, during a pandemic it is difficult to obtain precise data – on infection rates, for example – against which to judge a model’s projections. A proper validity test is not yet possible, due to the UK’s chronic lack of testing combined with the absence of an effective test for asymptomatic cases. As John Edmunds, who builds such models at LSHTM, puts it: “The total numbers of cases reported, is that accurate? No. Accurate anywhere? No.” [3].

It is essential, therefore, that such a test is developed as soon as possible and distributed to the wider populace. Not only will this help medical professionals in the short term, but it will also provide epidemiologists with vital information about the situation they are currently facing [9]. This information will enable them to determine when to lift the lockdown, ensuring there is not a second wave (as happened with Spanish flu) while also making sure people are not kept in lockdown for longer than is necessary [1].

Joe Baker


1) – Ferguson, N. et al. 2020. ‘Report 9: Impact of non-pharmaceutical interventions (NPIs) to reduce COVID-19 mortality and healthcare demand.’ Imperial College COVID-19 Response Team, Imperial College London.

2) – Ellison, J. 2020. ‘COVID-19 strategies built around “herd immunity” are problematic.’ Medical Press / University of Washington.

3) – Adam, D. 2nd April 2020. Nature. ‘Special report: The simulations driving the world’s response to COVID-19.’

4) – Klepac, P. et al. 2020. medRxiv. ‘Contacts in context: large-scale setting-specific social mixing matrices from the BBC Pandemic project.’

5) – Tian, L. et al. 2020. arXiv preprint arXiv:2003.07353. ‘Pre-symptomatic Transmission in the Evolution of the COVID-19 Pandemic.’

6) – Scofield, C. 6th April 2020. The Yorkshire Post. ‘When will the UK lockdown end? How long restriction rules could last – and how they’re enforced.’

7) – Walker, P.G. et al. 2020. ‘The global impact of COVID-19 and strategies for mitigation and suppression.’ Imperial College COVID-19 Response Team, Imperial College London.

8) – Flaxman, S. et al. 2020. ‘Estimating the number of infections and the impact of non-pharmaceutical interventions on COVID-19 in 11 European countries.’ Imperial College COVID-19 Response Team, Imperial College London.

9) – Lourenço, J. et al. 2020. medRxiv. ‘Fundamental principles of epidemic spread highlight the immediate need for large-scale serological surveys to assess the stage of the SARS-CoV-2 epidemic.’

Photo credits, from top to bottom: NIAID, under the Attribution 2.0 Generic license.


MSci Physics student at the University of Nottingham
