
# 1 Set-up ----

# load libraries
library(tidyverse) # managing data
library(ggdag) # drawing DAG
library(kableExtra) # visualising table
library(here) # handling path
library(rstan) # running Bayesian models
library(plotly) # interactive plot

# stan setup
options(mc.cores = parallel::detectCores()-1)
rstan::rstan_options(auto_write = TRUE) # speed up running time of compiled model

Introduction

From Tutorial 1 to Tutorial 3, we built a bottom-up population model, brick by brick. In Tutorial 1 we modelled population count as a Poisson-Lognormal compound, accounting for settled area. In Tutorial 2, we added a hierarchical random intercept by settlement type and region, which differentiates parameter estimation by the natural clustering of the observations. In Tutorial 3 we modelled small-scale variations in population density by adding a linear regression based on geospatial covariates. We thus covered all the fundamental building blocks of the WorldPop bottom-up population model described in Leasure et al. (2020).

In this tutorial we will cover advanced model diagnostics for a complete goodness-of-fit assessment. Once we have checked that the model passes the different diagnostics, we will predict population count for every grid cell of the study area. Predicting population will be the opportunity to discuss the uncertainty of the estimates.

Goals

  1. Explore convergence issues
  2. Understand posterior predictive checks
  3. Understand cross-validation and overfitting
  4. Predict population for every grid cell of the study area
  5. Visualise prediction uncertainty
  6. Aggregate predictions into custom spatial units

Supporting readings

Extra packages

We will use some additional packages, mainly for GIS manipulation:

  • raster for predicting the gridded population in a raster format

  • sf for aggregating the predictions into custom spatial units

  • tmap to plot spatial data

  • RColorBrewer to use nice color palettes in plots

install.packages('raster')
install.packages('RColorBrewer')
install.packages('sf')
install.packages('tmap')

Vignettes can be found here: raster, RColorBrewer, sf and tmap.

Advanced model diagnostics

Before predicting population count for the entire study area, we want to make sure that the model is correct. We thus perform additional checks: on the model convergence with traceplots and stan warnings, on the estimated parameters with a posterior predictive check, and on the predicted population count at the study sites with a cross-validation approach.

We will work with the first relevant population model that we built in Tutorial 1, that is, the “basic” Poisson-Lognormal model:

\[\begin{equation} population \sim Poisson( pop\_density * settled\_area) \\ pop\_density \sim Lognormal( \mu, \sigma) \\ \\ \mu \sim Normal( 5, 4 ) \\ \sigma \sim Uniform( 0, 4 )\tag{1} \end{equation}\]

We do some setup that should now be familiar.

# 1. Model diagnostics ----

# load data
data <- readxl::read_excel(here('tutorials/data/nga_demo_data.xls'))
data <- data %>% 
  mutate(
    id = as.character(1:nrow(data)),
    pop_density = N / A # density = population count divided by settled area
  )
# mcmc settings
chains <- 4
iter <- 500
seed <- 1789

# stan data
stan_data <- list(
  population = data$N,
  n = nrow(data),
  area = data$A)

# set parameters to monitor
pars <- c('mu', 'sigma', 'density_hat', 'population_hat')

Assessing convergence

In this section we will explore models that have convergence issues and discover how to diagnose where they stem from and how to fix them.

First, let’s look at the impact of the warmup length: the warmup is the early phase of the sampling, during which the chains move towards the bulk of the posterior distribution.

We choose a warmup of 20 iterations (instead of 250).

# 1.1 Short warmup period ----

# set short warmup
warmup <- 20

# mcmc
fit_lowWarmup <- rstan::stan(file = file.path('../tutorial1/tutorial1_model2.stan'), 
                          data = stan_data,
                          iter = warmup + iter, 
                          chains = chains,
                          warmup = warmup, 
                          pars = pars,
                          seed = seed)
## Warning: There were 1097 transitions after warmup that exceeded the maximum treedepth. Increase max_treedepth above 10. See
## http://mc-stan.org/misc/warnings.html#maximum-treedepth-exceeded
## Warning: There were 2 chains where the estimated Bayesian Fraction of Missing Information was low. See
## http://mc-stan.org/misc/warnings.html#bfmi-low
## Warning: Examine the pairs() plot to diagnose sampling problems

Stan shows several warnings. You can explore their meaning here.

Looking at the traceplot is, in this case, very informative:

traceplot(fit_lowWarmup, c('mu', 'sigma'), inc_warmup=T)

The traceplots indicate that some exploration is still happening after the shaded area, that is, after the end of the warmup period. We just need to increase the length of the warmup period.

Next we will look at a more serious convergence issue. Let’s say that we choose ill-suited priors for our model. For example, we can declare a Uniform prior with overly strict bounds for \(\sigma\):

\[\begin{equation} \sigma \sim Uniform( 0, 1 ) \end{equation}\]

We store the model under tutorial4/tutorial1_model2_wrong.stan and estimate it.

# 1.2 Wrong prior ----

warmup <- 250

# mcmc
fit_wrong <- rstan::stan(file = file.path('tutorial1_model2_wrong.stan'), 
                   data = stan_data,
                   iter = warmup + iter, 
                   chains = chains,
                   warmup = warmup, 
                   pars = pars,
                   seed = seed)
## Warning: There were 1981 divergent transitions after warmup. See
## http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup
## to find out why this is a problem and how to eliminate them.
## Warning: There were 4 chains where the estimated Bayesian Fraction of Missing Information was low. See
## http://mc-stan.org/misc/warnings.html#bfmi-low
## Warning: Examine the pairs() plot to diagnose sampling problems
## Warning: The largest R-hat is 4.4, indicating chains have not mixed.
## Running the chains for more iterations may help. See
## http://mc-stan.org/misc/warnings.html#r-hat
## Warning: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable.
## Running the chains for more iterations may help. See
## http://mc-stan.org/misc/warnings.html#bulk-ess
## Warning: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable.
## Running the chains for more iterations may help. See
## http://mc-stan.org/misc/warnings.html#tail-ess

Many metrics detect a convergence issue. The most worrying one is the divergent transitions, which are often related to issues in the model structure itself and not only in the estimation process.

traceplot(fit_wrong, c('mu', 'sigma'))

The traceplot for \(\mu\) indicates that the chains didn’t mix. Moreover, the traceplot for \(\sigma\) clearly shows the bounding issue, with a ceiling at 1 that overly constrained the estimation.
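
Beyond reading the printed warnings, rstan provides helper functions to query these diagnostics programmatically, which is convenient when fitting many models. A quick check on the misspecified fit:

# print a summary of the divergence, treedepth and energy diagnostics
rstan::check_hmc_diagnostics(fit_wrong)

# count the post-warmup divergent transitions
rstan::get_num_divergent(fit_wrong)

# R-hat per parameter (values far above 1 indicate unmixed chains)
summary(fit_wrong, pars = c('mu', 'sigma'))$summary[, 'Rhat']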

Checking predicted posterior

Let’s come back to a correctly fitted model.

# 1.3 Predicted posterior check ----

# mcmc
fit <- rstan::stan(file = file.path('../tutorial1/tutorial1_model2.stan'), 
                          data = stan_data,
                          iter = warmup + iter, 
                          chains = chains,
                          warmup = warmup, 
                          pars = pars,
                          seed = seed)

As shown in the previous paragraph, Bayesian estimation relies heavily on choosing appropriate priors for the parameters. These priors should reflect expert knowledge about the quantity of interest, which is then confronted with the observations. The prior model defines the space for the posterior estimation. We should thus ensure that the prior support did not overly constrain the posterior. This is called a posterior predictive check.

We first need to extract the estimated posterior distribution from the stanfit object. We do it using the extract function from rstan.

# extract estimated mu
mus <- data.frame(
    # extract parameter
    posterior=extract(fit, pars='mu')$mu,
    parameter = 'mu'
    )
glimpse(mus)
## Rows: 2,000
## Columns: 2
## $ posterior <dbl> 4.765670, 4.735604, 4.788244, 4.726230, 4.746974, 4.783040, ~
## $ parameter <chr> "mu", "mu", "mu", "mu", "mu", "mu", "mu", "mu", "mu", "mu", ~

mus contains the estimated mu for each post-warmup iteration (500) of each Markov chain (4), that is 2000 draws in total.

We want to recreate the prior distribution. In model 2 of Tutorial 1, we used as prior: \[\mu \sim Normal(5,4)\] We sample from this prior distribution the same number of draws that we have for the posterior distribution (that is, 2000).

# retrieve stan parameter
post_warmup_draws <- iter - warmup

# simulate from the prior distribution
mus$prior <- rnorm(chains * post_warmup_draws, 5, 4)

We build for \(\sigma\) a dataframe similar to mus, contrasting its prior distribution with its posterior distribution, and bind it together with mus.

# build dataframe with parameters distribution comparison
parameters <-  rbind(
  # mu
  mus ,
  #sigma
  data.frame(
      posterior=extract(fit, pars='sigma')$sigma,
      prior = runif(chains * post_warmup_draws, 0, 4),
      parameter = 'sigma')
  )  %>% 
  pivot_longer(-parameter,
    names_to = 'distribution',
    values_to = 'value'
  )

# plot distribution comparison for both parameter
ggplotly(ggplot(parameters, aes(x=value, color=distribution, fill=distribution))+
  geom_density()+
  theme_minimal()+
    facet_wrap(.~parameter, scales = 'free'))

Figure 1: Posterior predictive checks for mu and sigma

Figure 1 shows a posterior predictive check for both \(\mu\) and \(\sigma\). It compares the prior distribution that we set in the model with the estimated posterior distribution. Don’t hesitate to zoom in, as the range of values is very different between the prior and the posterior.

We confirm that our priors did not falsely constrain the posterior: they allow a very wide range of possible values. The only constraint we can observe is the lower bound on \(\sigma\) at 0, but this is expected as \(\sigma\) has to be positive.
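
To complement the visual check, we can quantify the contrast between the two distributions, reusing the parameters dataframe built above:

# compare the spread of the prior and the posterior for each parameter
parameters %>% 
  group_by(parameter, distribution) %>% 
  summarise(
    mean = mean(value),
    sd = sd(value),
    .groups = 'drop')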

Cross-validation of model predictions

After checking the model structure, we revisit the goodness-of-fit metrics based on population predictions.

In Tutorials 1 to 3, we checked the goodness-of-fit of the modelled predictions in-sample, that is, we predicted population for the study sites that were already used to fit the model and compared the predictions to the observations.

In-sample checks do not account for model overfitting. Overfitting occurs when the model reproduces not only the signal in the observations but also the noise. We want the model to accurately predict population for new, unseen study sites as well. This real-world scenario can be reproduced by keeping a percentage of the observations for model fitting and leaving the remaining study sites for prediction, a mechanism called cross-validation.

We first need to split our data in two: a training set (70%) that will be used for fitting the model and a test set (30%) that will be used for model prediction:

# 1.4 Cross validation of predictions ----

set.seed(2004)
# sample observations
train_size <-  round(nrow(data)*0.7)
train_idx <- sample(1:nrow(data), size = train_size, replace=F)

# build train datasets
train_data <- data[train_idx,]

# build test datasets
test_data <- data[-train_idx,]

In stan, cross-validation can be done simultaneously with model fitting by providing the test dataset to the generated quantities block. Because all variables have to be defined for each set, this substantially lengthens the code.

data{
  
  int<lower=0> n_train; // number of microcensus clusters in training set
  int<lower=0> n_test; // number of microcensus clusters in predicting set

  int<lower=0> population_train[n_train]; // count of people
  
  vector<lower=0>[n_train] area_train; // settled area in training
  vector<lower=0>[n_test] area_test; // settled area in testing
}

parameters{
  // population density
  vector<lower=0>[n_train] pop_density_train;
  
  //intercept
  real mu;

  // variance
  real<lower=0> sigma; 
}

model{
  
  // population totals
  population_train ~ poisson(pop_density_train .* area_train);
  pop_density_train ~ lognormal( mu, sigma );
  
  //  intercept
  mu ~ normal(5, 4);
  
  // variance
  sigma ~ uniform(0, 4);
}

generated quantities{
  
  int<lower=0> population_hat[n_test];
  real<lower=0> density_hat[n_test];

  for(idx in 1:n_test){
    density_hat[idx] = lognormal_rng( mu, sigma );
    population_hat[idx] = poisson_rng(density_hat[idx] * area_test[idx]);
  }
}

We prepare the data for stan and run the model:

# prepare data
stan_data_xval <- list(
  population_train = train_data$N,
  n_train = nrow(train_data),
  n_test = nrow(test_data),

  area_train = train_data$A,
  area_test = test_data$A,

  seed=seed
)

# mcmc setting
pars <- c('mu','sigma', 'population_hat',  'density_hat')

# mcmc
fit_xval <- rstan::stan(file = file.path('tutorial1_model2xval.stan'), 
                   data = stan_data_xval,
                   iter = warmup + iter, 
                   chains = chains,
                   warmup = warmup, 
                   pars = pars,
                   seed = seed)

We now have access to two prediction sets: one, fit, from the in-sample prediction, and one, fit_xval, from the cross-validation prediction.

We compare the goodness-of-fit metrics from the two models:

# predict population
getPopPredictions <- function(model_fit, 
                              estimate='population_hat',
                              obs='N', reference_data=data){
  # extract predictions
  predicted_pop <- as_tibble(extract(model_fit, estimate)[[estimate]])
  colnames(predicted_pop) <- reference_data$id
  
  # summarise predictions
  predicted_pop <- predicted_pop %>% 
    pivot_longer(everything(),names_to = 'id', values_to = 'predicted') %>% 
    group_by(id) %>% 
    summarise(
      across(everything(), list(mean=~mean(.), 
                                upper=~quantile(., probs=0.975), 
                                lower=~quantile(., probs=0.025)))
      ) %>% 
    # merge with observations
    left_join(reference_data %>% 
                rename('reference'=all_of(obs)) %>% 
                select(id, reference), by = 'id') %>%
    # compute residuals and credible interval size
    mutate(
      residual= predicted_mean - reference,
      ci_size = predicted_upper- predicted_lower,
      estimate = estimate
      )
return(predicted_pop)
}

# build comparison dataframe between in-sample and xvalidation
comparison_df <- rbind(
 getPopPredictions(fit) %>% 
   mutate(Model='In-sample'),
 getPopPredictions(fit_xval, reference_data = test_data) %>% 
   mutate(Model='Cross-validation'))

# compute goodness-of-fit metrics
comparison_df %>% group_by(Model) %>% 
  summarise( `Bias`= mean(residual),
    `Inaccuracy` = mean(abs(residual)),
        `Imprecision` = sd(residual)
) %>%  kbl(caption = 'Goodness-of-fit metrics computed in-sample vs cross-validation') %>% kable_minimal()
Table 1: Goodness-of-fit metrics computed in-sample vs cross-validation
Model Bias Inaccuracy Imprecision
Cross-validation 96.40268 263.2451 313.2041
In-sample 61.22336 265.6971 337.8884

Table 1 shows that the bias increases under cross-validation, whereas the inaccuracy and the imprecision stay in the same range. Overall the metrics remain comparable between the two settings, which suggests that our model is not overfitting.

A good practice is to always assess the model fit on a held-out test set to avoid producing overly optimistic goodness-of-fit metrics.
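
Note that a single 70/30 split can be sensitive to which observations land in the test set. Below is a minimal sketch of a k-fold extension that reuses the stan file and the settings defined above (not run here):

# k-fold sketch (not run): refit the model k times,
# holding out a different fold each time
k <- 5
folds <- sample(rep(1:k, length.out = nrow(data)))

xval_fits <- lapply(1:k, function(fold){
  stan_data_fold <- list(
    population_train = data$N[folds != fold],
    n_train = sum(folds != fold),
    n_test = sum(folds == fold),
    area_train = data$A[folds != fold],
    area_test = data$A[folds == fold])

  rstan::stan(file = file.path('tutorial1_model2xval.stan'),
              data = stan_data_fold,
              iter = warmup + iter, chains = chains,
              warmup = warmup, pars = pars, seed = seed)
})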

Gridded population prediction

We will cover in this section model predictions for the entire study area at the grid-cell level. Population can be predicted for any spatial unit as long as:

  1. all the covariates and input data can be produced for the spatial unit;
  2. the predicting spatial unit is not too different in size from the fitting spatial unit, such that the estimated relationship between covariates and population density still applies.

We will predict population for grid cells of 100m x 100m, using our best population model: model 2 from Tutorial 3.

# 3. Population prediction ----

# load the fitted model from tutorial 3
model_fit <- readRDS("../tutorial3/tutorial3_model2_fit.rds")

Prediction requires extracting the estimated parameter distributions: \(\hat\alpha_{t,r}, \hat\sigma, \hat\beta^{fixed}, \hat\beta^{random}_t\). These estimated parameters are then combined with the input rasters, following this modified model for prediction:

\[\begin{equation} population\_predicted \sim Poisson( \hat{pop\_density} * settled\_area) \\ \hat{pop\_density} \sim Lognormal(\hat\mu, \: \hat\sigma) \\ \hat\mu = \hat\alpha_{t,r} + \hat\beta^{random}_t X^{random} + \hat\beta^{fixed} X^{fixed} \tag{2} \end{equation}\]

\(X\) and \(settled\_area\) are the input data at grid-cell level.

Grid-cell-based model prediction thus requires 10 input rasters:

  • settled area

  • settlement type

  • administrative region

  • the six covariates

  • a mastergrid that defines the grid cells for which to predict the population. These grid cells are (1) contained inside the study area, and (2) considered as settled.

Unfortunately we are not allowed to redistribute the settlement map used in Nigeria v1.2, which was provided by Oak Ridge National Laboratory (2018). We thus provide the code to show all the steps of population prediction, but you will not be able to run it.

We will focus on region 8, which contains only 671 110 grid cells, compared to the 166 412 498 cells for the entire country.

Preparing data for prediction

To predict gridded population, we convert the set of rasters to a table with the grid cells in rows and each raster’s values in columns. We don’t need to keep the spatial structure of the rasters for model prediction: every grid cell can be estimated independently, as long as we know its administrative region, settlement type and covariate attributes.

library(raster)
wd_path <- 'Z:/Projects/WP517763_GRID3/Working/NGA/ed/bottom-up-tutorial/'

# function that loads a raster and returns its cell values as a vector
getRasterValues <- function(x){
  return( raster(paste0(wd_path,x))[])
}

# get a list of the rasters
rasterlist <- list.files(wd_path,
                         pattern='.tif', full.names = F)
# apply function to raster list
raster_df <- as_tibble(sapply(rasterlist, getRasterValues))

The raster table contains 671 110 rows and 10 columns. 95% of the rows are NAs because the corresponding grid cells are not considered as settled.
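
A quick sanity check of that proportion before filtering:

# share of settled grid cells in the raster table
sum(raster_df$mastergrid == 1, na.rm = TRUE) / nrow(raster_df)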

settled <- raster_df %>% filter(mastergrid==1)
settled %>% head() %>% kbl() %>% kable_minimal()
mastergrid settled_area x1 x2 x3 x4 x5 x6 gridcell_id settlementType
1 0.3764444 -0.0819475 0.0126183 4.663749 -0.1662194 -0.1876594 -0.1262890 47 5
1 0.0235278 -0.0629407 0.0189274 4.664463 -0.1640482 -0.1852137 -0.1265523 48 5
1 0.0411736 -0.0036624 0.0000000 4.661014 -0.1278646 -0.1549067 -0.1385387 90 5
1 0.0176458 0.0634254 0.0000000 4.662555 -0.1018127 -0.1279219 -0.1385805 91 5
1 0.0117639 0.0478863 1.2744479 4.871751 0.9074216 1.0802944 0.2204351 208 5
1 0.0999931 0.0948045 1.3028392 4.877525 0.8835372 1.0512134 0.2195443 209 5

We need to scale the covariates with the scaling factors computed during the fitting stage.

#load scaling factor
scale_factor <- read_csv('tutorials/data/scale_factor.csv')

scaled_covs <- settled %>% 
  # subset covs column
  dplyr::select(starts_with('x'), gridcell_id) %>% 
  
  # convert table to long format
  pivot_longer(-gridcell_id, names_to='cov', values_to = 'value') %>%
  
  # add scaling factor
  left_join(scale_factor) %>% 
  
  # scale covs
  mutate(value_scaled= (value-cov_mean)/cov_sd) %>% 
  dplyr::select(gridcell_id, cov, value_scaled) %>% 
  
  # convert table back to wide format
  pivot_wider(names_from = cov, values_from = value_scaled)

# replace the covs with their scaled version
raster_df <- raster_df %>% dplyr::select(-starts_with('x')) %>% 
  left_join(scaled_covs)
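
The join above assumes that scale_factor stores one row per covariate (join key cov) with the mean and standard deviation computed at the fitting stage. An illustrative mock-up of its structure, with made-up values:

# illustrative structure of scale_factor (values are made up)
tibble(
  cov = paste0('x', 1:6),
  cov_mean = c(0.02, 0.15, 4.7, 0.1, 0.2, 0.05),
  cov_sd = c(1.1, 0.9, 0.3, 1.2, 1.0, 0.8)
)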

Extracting estimated parameters

The second step is to extract the estimated parameter distributions. We use the model fitted with the full observation dataset, and not the one from the cross-validation, to use the maximum information.

#extract betas
beta_fixed <- t(rstan::extract(model_fit, pars='beta_fixed')$beta_fixed)
beta_random <- t(rstan::extract(model_fit, pars='beta_random')$beta_random)
#extract alphas
alphas <- rstan::extract(model_fit, pars='alpha_t_r')$alpha_t_r
#extract sigmas
sigmas <-  rstan::extract(model_fit, pars='sigma')$sigma

We first focus on producing \(\hat\beta^{fixed} X^{fixed}\) from Equation (2), the covariates modelled with a fixed effect.

\(\hat\beta^{fixed}\) is a 2000 x 5 matrix: 2000 iterations for each of the 5 covariates modelled with a fixed effect. We transpose it so that it can be multiplied with \(X^{fixed}\), a 310 512 x 5 matrix of covariate values for every settled cell.

# extract covariates modelled with a fixed effect
cov_fixed <- settled %>% 
  dplyr::select(x1:x3, x5:x6) %>% 
  as.matrix()

cov_fixed <- cov_fixed %*% beta_fixed
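
A quick dimension check confirms the matrix algebra described above:

# sanity check on the matrix dimensions
dim(beta_fixed) # 5 covariates x 2000 iterations (after transposing)
dim(cov_fixed)  # settled grid cells x 2000 iterations (after the product)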

We then produce \(\hat\beta^{random}_t X^{random}\), the covariates modelled with a random effect.

\(\hat\beta^{random}_t\) is a 2000 x 5 matrix: 2000 iterations for each of the 5 settlement types. We need to associate each grid cell with the correct \(\hat\beta^{random}_t\) based on the grid cell’s settlement type, and then multiply the covariate values \(X^{random}\) by it.

beta_random <- as_tibble(beta_random)
# add settlement type to beta random
beta_random$settlementType <- 1:5

# extract covariates modelled with a random effect
cov_random <- settled %>% 
  # subset random covariate and settlement type
  dplyr::select(settlementType,x4) %>% 
  # associate correct estimated beta_t
  left_join(beta_random) %>% 
  # multiply cov by slope
  mutate(
    across(starts_with('V'), ~ .x*x4)
  ) %>% 
  # keep only the estimates
  dplyr::select(-settlementType, -x4) %>% 
  as.matrix()

We eventually need to associate the correct \(\hat\alpha_{t,8}\) with each grid cell.

\(\hat\alpha_{t,8}\) has a similar format as \(\hat\beta^{random}_t\). We will thus proceed in the same way.

# subset alpha_t for region 8
alpha_t_8 <- as_tibble(t(alphas[,,8]))
# assign settlement type
alpha_t_8$settlementType <- 1:5

alpha_predicted <- settled %>% 
  dplyr::select(settlementType) %>% 
  left_join(alpha_t_8) %>% 
  dplyr::select(-settlementType) %>% 
  as.matrix()

We can finally compute \(\hat\mu\) as the sum of \(\hat\alpha_{t,r}\), \(\hat\beta^{random}_t X^{random}\) and \(\hat\beta^{fixed} X^{fixed}\):

# sum mu components
mu_predicted <- alpha_predicted + cov_fixed + cov_random

Predicting population count for every grid cell

To estimate the grid-cell population, we need to simulate from Equation (2) with the estimated \(\hat\mu\) and \(\hat\sigma\).

predictions <- as_tibble(mu_predicted) %>% 
  # add grid cell id and the settled area to mu
  mutate(
      gridcell_id= settled$gridcell_id,
      settled_area = settled$settled_area
    )  %>% 
  # long format
  pivot_longer(
    -c(gridcell_id, settled_area), 
    names_to = 'iterations', values_to = 'mu_predicted') %>%
  mutate(
    # add sigma iterations for every grid cell
    sigma=rep(sigmas, nrow(mu_predicted)),
    # draw density for a log normal
    density_predicted = rlnorm(n(), mu_predicted, sigma),
    # draw population count from a Poisson
    population_predicted = rpois(n(), density_predicted*settled_area)
    ) %>% 
  # keep only the identifiers and the predicted population count
  dplyr::select(gridcell_id, iterations, population_predicted) %>% 
  # convert the table back to a grid cell x iterations matrix
  pivot_wider(
      names_from = iterations,
      values_from = population_predicted
    ) 

predictions contains 2000 population predictions for every settled grid cell. Let’s see how it looks for the first 5 settled grid cells:

predictions[1:5, ] %>% dplyr::select(gridcell_id, everything()) %>% 
  kbl() %>% kable_minimal()  %>% scroll_box(width = "100%")
(table output omitted: grid cells 47, 48, 90, 91 and 208, each with its 2000 predicted population counts V1 to V2000)

Note: Model prediction is memory-intensive. We therefore usually run it in parallel over blocks of grid cells, as sketched below.
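The sketch below illustrates the idea under simplified assumptions; it is not the exact pipeline used to produce the predictions object. It assumes two hypothetical objects: posterior, a data frame holding the posterior draws of mu and sigma from the basic model, and settled_df, a data frame with the settled area of every settled grid cell.

# a minimal sketch of block-wise parallel prediction
# (hypothetical objects: posterior, settled_df)
library(parallel)

predict_block <- function(block, posterior){
  # one predicted population count per grid cell and per posterior draw
  sapply(seq_len(nrow(posterior)), function(i){
    density_hat <- rlnorm(nrow(block), meanlog = posterior$mu[i],
                          sdlog = posterior$sigma[i])
    rpois(nrow(block), lambda = density_hat * block$settled_area)
  })
}

# split the settled grid cells into blocks of 50 000 cells
blocks <- split(settled_df, (seq_len(nrow(settled_df)) - 1) %/% 50000)
# predict each block on a separate core (on Windows, use parLapply instead)
predictions_list <- mclapply(blocks, predict_block, posterior = posterior,
                             mc.cores = parallel::detectCores() - 1)
predictions_blocks <- do.call(rbind, predictions_list)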

Predicting distributions

Bayesian modelling is stochastic: it accounts for the unknowns caused, for example, by:

  • Weak predictors

  • Errors in input data

  • Incomplete sample representativeness

  • Natural variation in the observations

These unknowns result in prediction uncertainty: rather than a single value, the model predicts a distribution of likely population counts.

As seen in the previous section, we have 2000 predictions for every grid cell. Figure 2 shows an example for five grid cells.

prediction_ex <- predictions %>% 
  slice(1:5) %>% 
  pivot_longer(-gridcell_id, names_to = 'iterations', values_to = 'prediction') %>% 
  group_by(gridcell_id) %>% 
  mutate(
    mean_pop = paste0(gridcell_id, ' (mean=', round(mean(prediction)), ')')
  )


ggplotly(ggplot(prediction_ex,   aes(x=prediction, color=mean_pop)) +
  geom_density()+
  theme_minimal()+
  theme(axis.title.y=element_blank(),
        axis.text.y=element_blank(),
        axis.ticks.y=element_blank())+
  labs(x='Population predicted', color='\n\nGridcell id \n(mean population)')) %>%
  layout(margin=list(t=100))

Figure 2: Example of predicted population count for 5 grid cells and their mean population count

A meaningful statistic to retrieve from the distribution is the mean, which we use as the point estimate of the predicted population count.
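Other point statistics can be derived from the same draws. As a quick sketch, we can compare the mean with the median for the first grid cell of the predictions object:

# posterior draws of the first grid cell (dropping the id column)
draws <- predictions %>% 
  slice(1) %>% 
  dplyr::select(-gridcell_id) %>% 
  unlist()

mean(draws)   # point estimate used in the rest of the tutorial
median(draws) # often lower when the distribution is right-skewed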

Gridded population

The gridded bottom-up population estimates that WorldPop releases are based on the mean value per grid cell.

Let’s see how we can create a gridded population from our predictions.

We first compute the mean prediction for every grid cell:

mean_prediction <- tibble( 
  mean_prediction = apply(predictions %>% dplyr::select(-gridcell_id), 1, mean)
  )
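
Note that, for large matrices, the vectorised rowMeans() computes the same statistic faster:

# equivalent, vectorised computation
mean_prediction <- tibble(
  mean_prediction = rowMeans(predictions %>% dplyr::select(-gridcell_id))
)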

We then add the unsettled grid cells by joining with raster_df, which contains every grid cell.

mean_prediction <- mean_prediction %>% 
  # add gridcell_id
  mutate(gridcell_id = predictions$gridcell_id) %>% 
  # join unsettled grid cells
  right_join(raster_df %>% 
               dplyr::select(gridcell_id), by = 'gridcell_id') %>% 
  # sort according to position in the raster
  arrange(gridcell_id)

To retrieve the raster structure we load a reference raster, typically the mastergrid.

library(raster)

wd_path <- 'Z:/Projects/WP517763_GRID3/Working/NGA/ed/bottom-up-tutorial/'
mastergrid <- raster(paste0(wd_path, 'mastergrid.tif'))

And assign the predicted population count.

# create raster
mean_r <- mastergrid
# assign mean prediction 
mean_r[] <- mean_prediction$mean_prediction
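
If needed, the gridded population can be written to disk with writeRaster() (the file name below is just an example):

# save the gridded population to disk (example file name)
writeRaster(mean_r, paste0(wd_path, 'mean_population_prediction.tif'),
            overwrite=TRUE)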

We can finally plot the raster to see our gridded population:

library(RColorBrewer)
# create colour breaks from quantiles (7 breaks define 6 classes)
cuts <- quantile(mean_prediction$mean_prediction,
                 p=seq(from=0, to=1, length.out=7), na.rm=T)
col <- brewer.pal(n = 6, name = "YlOrRd")

# plot mean prediction with the quantile breaks
plot(mean_r, breaks=cuts, col=col)

Figure 3: Gridded mean population prediction for region 8

Figure 3 shows a typical spatial distribution of population. In the top-left corner we see a cluster of cities, with denser neighbourhoods in their centres. Lines of denser settlement follow the distribution of population along roads. Unsettled grid cells are displayed in white.

Gridded uncertainty

We can also retrieve from the full posterior distribution the 95% credible interval (CI) of every prediction, and the related relative uncertainty computed as:

\[\begin{equation} uncertainty = \frac{upper\_CI - lower\_CI}{mean\_prediction} \end{equation}\]

The relative uncertainty highlights areas where the mean prediction is less reliable. For example, a grid cell with a mean prediction of 10 people and a 95% CI of [4, 22] has a relative uncertainty of (22 - 4) / 10 = 1.8.

To compute the gridded uncertainty, we follow the same process as for the gridded mean prediction.

# retrieve the 95% credible interval
ci_prediction <- apply(predictions %>% dplyr::select(-gridcell_id), 1,
                       quantile, p=c(0.025, 0.975))

ci_prediction <- as_tibble(t(ci_prediction)) %>% 
  # add gridcell_id
  mutate(gridcell_id = predictions$gridcell_id) %>% 
  # join unsettled grid cells
  right_join(raster_df %>% 
               dplyr::select(gridcell_id), by = 'gridcell_id') %>% 
  # sort according to position in the raster
  arrange(gridcell_id) %>% 
  mutate(
    mean_prediction = mean_prediction$mean_prediction,
    uncertainty = (`97.5%` - `2.5%`) / mean_prediction
  )

# create uncertainty raster
uncertainty_r <- mastergrid
uncertainty_r[] <- ci_prediction$uncertainty

# plot uncertainty raster with quantile breaks
cuts <- quantile(ci_prediction$uncertainty,
                 p=seq(from=0, to=1, length.out=7), na.rm=T)
col <- brewer.pal(n = 6, name = "Blues")
plot(uncertainty_r, breaks=cuts, col=col)

We see that higher uncertainty is observed on the outskirts of denser settlements. This is often due to the rapid expansion of settlement extents, which is difficult to capture.

Aggregating predictions

Gridded populations are great for visualising the spatial distribution of population across an area. But often we want aggregates, that is, a population estimate for a custom area such as a city or the catchment area of a hospital.

Let’s look at an example: we manually drew the extent of one city in our study area.

Disclaimer: this extent is not the official city boundary.

library(sf)
library(tmap)
tmap_options(check.and.fix = TRUE)
city <- st_read(paste0(wd_path, 'study_city.gpkg'), quiet=T)

names(mean_r) <- 'Gridded population'
tmap_mode('view')
tm_shape(city)+
  tm_borders()+
  tm_shape(mean_r)+
  tm_raster()+
  tm_basemap('Esri.WorldGrayCanvas')

Getting the mean population prediction for the city corresponds to computing a zonal statistic on the gridded population, that is, summing all the grid cells contained in the city.

mean_city <- extract(mean_r, city, fun=sum, na.rm=TRUE, df=TRUE)
mean_city %>%  
  mutate(features = 'city', .after=ID) %>% 
  rename("Mean population prediction"=Gridded.population) %>% 
  kbl(caption='Mean population prediction for the city computed with the gridded population') %>% kable_minimal(full_width = F)
Table 2: Mean population prediction for the city computed with the gridded population

ID  features  Mean population prediction
1   city      376610.6

Getting the uncertainty of that estimate is more complicated: we can’t sum the grid-cell uncertainties to obtain the uncertainty of the aggregate, because credible intervals are based on quantiles, and quantiles can’t be summed.
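A quick simulation with toy numbers illustrates why quantiles are not additive:

# toy example: two independent Poisson variables
set.seed(2020)
a <- rpois(2000, lambda = 5)
b <- rpois(2000, lambda = 10)

quantile(a, p=0.975) + quantile(b, p=0.975) # sum of the upper bounds
quantile(a + b, p=0.975)                    # upper bound of the sum: smaller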

We thus need to reconstruct the full predicted population distribution for the city by aggregating the grid-cell prediction distributions iteration by iteration.

We first convert the city polygon to a raster with the same extent as the gridded population. The resulting raster contains 1s for grid cells inside the city extent and NAs for grid cells outside it.

city_r <- rasterize(city, mean_r)

We extract the city raster values and join them to raster_df to get the grid cell ids. We keep the grid cells that belong to the city and then merge them with the predictions:

city_prediction <- raster_df %>% 
  # select grid cell id from raster df
  dplyr::select(gridcell_id) %>% 
  # add city dummy
  mutate(city = city_r[]) %>% 
  # keep grid cells inside the city
  filter(city==1) %>% 
  # join predictions
  left_join(predictions) %>% 
  # keep predictions
  dplyr::select(starts_with('V'))

city_prediction now contains, for each grid cell of the city, the full population prediction distribution.

We first sum the predictions across all grid cells at each iteration.

city_prediction <- tibble(value = apply(city_prediction, 2, sum, na.rm=T))

city_prediction is now a tibble with 2000 rows: the 2000 predicted population totals for the city, one per iteration.

We can then derive meaningful statistics from the distribution of population total for the city.

city_prediction_stats <- city_prediction %>% 
  summarise( 
    "Mean population" = round(mean(value)),
    "Upper bound" = round(quantile(value, p=0.975)),
    'Lower bound' = round(quantile(value, p=0.025)))

ggplot(city_prediction, aes(x=value))+
  geom_density(size=1, color='orange')+
  theme_minimal()+
    theme(axis.title.y=element_blank(),
        axis.text.y=element_blank(),
        axis.ticks.y=element_blank())+
  labs(y='', x='Predicted population for the city')+
  annotate("segment",x=city_prediction_stats$`Mean population`,
           xend=city_prediction_stats$`Mean population`, y=0, yend=0.000005, size=1)+
  annotate('text',x=city_prediction_stats$`Mean population`+5000, y=0.0000012, hjust = 0, fontface =2,
           label=paste0('Mean prediction: \n', city_prediction_stats$`Mean population`, ' people'))+
  annotate("segment",x=city_prediction_stats$`Upper bound`,
           xend=city_prediction_stats$`Upper bound`, y=0, yend=0.000005)+
  annotate('text',x=city_prediction_stats$`Upper bound`+5000, y=0.000001, hjust = 0,
           label=paste0('97.5% prediction \nbound: \n', city_prediction_stats$`Upper bound`, ' people'))+
    annotate("segment",x=city_prediction_stats$`Lower bound`,
           xend=city_prediction_stats$`Lower bound`, y=0, yend=0.000005)+
    annotate('text',x=city_prediction_stats$`Lower bound`+5000, y=0.000001, hjust = 0,
           label=paste0('2.5% prediction \nbound: \n', city_prediction_stats$`Lower bound`, ' people'))

Figure 4: Full distribution of the predicted population total for the city

Figure 4 shows that the mean prediction matches the one computed from the gridded population raster. Furthermore, we see that our model produces predictions with a very wide credible interval.

Having the full distribution lets us answer questions such as “what is the probability that there are more than xx people in the area?” or “with 95% certainty, what is the maximum number of people in that area?”
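For example, using the city distribution computed above (the 350 000 threshold is arbitrary, chosen for illustration):

# probability that the city hosts more than 350 000 people
mean(city_prediction$value > 350000)
# maximum number of people with 95% certainty
quantile(city_prediction$value, p=0.95)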

Acknowledgements

This tutorial was written by Edith Darin from WorldPop, University of Southampton, and Douglas Leasure from the Leverhulme Centre for Demographic Science, University of Oxford, with supervision from Andrew Tatem, WorldPop, University of Southampton. Funding for the work was provided by the United Nations Population Fund (UNFPA).

License

You are free to redistribute this document under the terms of a Creative Commons Attribution-NoDerivatives 4.0 International (CC BY-ND 4.0) license.

References

Leasure, Douglas R, Warren C Jochem, Eric M Weber, Vincent Seaman, and Andrew J Tatem. 2020. “National Population Mapping from Sparse Survey Data: A Hierarchical Bayesian Modeling Framework to Account for Uncertainty.” Proceedings of the National Academy of Sciences.
Oak Ridge National Laboratory. 2018. “LandScan HD: Nigeria.”