Fixed reporting in example to use LaserFrames; Added a 2D example (just migration component).

Added spatial-from-data example code.
jonathanhhb committed Feb 7, 2025
1 parent 7009c97 commit 763d97b
Showing 1 changed file with 228 additions and 22 deletions.
250 changes: 228 additions & 22 deletions docs/spatialexample.rst
@@ -69,6 +69,7 @@ Model Class
params (dict): Dictionary containing parameters such as population size,
number of nodes, timesteps, and rates for transmission/migration.
"""
self.components = []
self.params = params
self.nodes = params["nodes"]
self.population = LaserFrame(capacity=params["population_size"], initial_count=params["population_size"])
@@ -85,11 +86,25 @@ Model Class
np.random.shuffle(node_ids)
self.population.node_id[:params["population_size"]] = node_ids
# Reporting structure
self.results = np.zeros((params["timesteps"], self.nodes, 3)) # S, I, R per node
# Components
self.components = []
# Reporting structure: Use LaserFrame for reporting
self.results = LaserFrame(capacity=self.nodes)  # capacity is the number of nodes, not the number of timesteps
for state in ["S", "I", "R"]:
self.results.add_vector_property(state, length=params["timesteps"], dtype=np.int32)
# Record results: reporting could actually be a component if we wanted. Or it can be
# done in a log/report function in the relevant component (e.g., Transmission)
self.results.S[self.current_timestep, :] = np.array([
np.sum(self.population.disease_state[self.population.node_id == i] == 0)
for i in range(self.nodes)
])
self.results.I[self.current_timestep, :] = np.array([
np.sum(self.population.disease_state[self.population.node_id == i] == 1)
for i in range(self.nodes)
])
self.results.R[self.current_timestep, :] = np.array([
np.sum(self.population.disease_state[self.population.node_id == i] == 2)
for i in range(self.nodes)
])
def add_component(self, component):
"""
@@ -142,8 +157,13 @@ Model Class
"""
plt.figure(figsize=(10, 6))
for i in range(self.nodes):
prevalence = self.results[:, i, 1] / self.results[:, i, :].sum(axis=1)
prevalence = self.results.I[:, i] / (
self.results.S[:, i] +
self.results.I[:, i] +
self.results.R[:, i]
)
plt.plot(prevalence, label=f"Node {i}")
plt.title("Prevalence Across All Nodes")
plt.xlabel("Timesteps")
plt.ylabel("Prevalence of Infected Agents")
@@ -186,22 +206,6 @@ Migration Component Class
matrix[from_node][to_node] = 0
break_matrix_node(self.migration_matrix, 13, 14)
def get_gravity_migration_matrix(self, nodes):
"""
Generates a gravity-based migration matrix based on population and distances between nodes.
Args:
nodes (int): Number of nodes in the simulation.
Returns:
ndarray: A migration matrix where each row represents probabilities of migration to other nodes.
"""
pops = np.array([np.sum(self.model.population.node_id == i) for i in range(nodes)])
distances = np.ones((nodes, nodes)) - np.eye(nodes)
migration_matrix = gravity(pops, distances, k=1.0, a=0.5, b=0.5, c=2.0)
migration_matrix = migration_matrix / migration_matrix.sum(axis=1, keepdims=True)
return migration_matrix
def get_sequential_migration_matrix(self, nodes):
"""
Creates a migration matrix where agents can only migrate to the next sequential node.
@@ -368,3 +372,205 @@ Run Everything
model.run()
model.save_results("simulation_results.csv")
model.plot_results()
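
For completeness, here is a hedged sketch of a parameter dictionary and driver for the synthetic model above. The values are illustrative assumptions, the MigrationComponent constructor is assumed to mirror the 2D component shown later, and components not listed here (e.g., transmission and recovery) would need to be added and may require extra keys.

.. code-block:: python

    # Illustrative values only; not taken from the example itself.
    params = {
        "population_size": 100_000,
        "nodes": 20,
        "timesteps": 300,
        "migration_rate": 0.01,
    }

    model = MultiNodeSIRModel(params)
    model.add_component(MigrationComponent(model))  # 1D chain migration
    model.run()
    model.save_results("simulation_results.csv")
    model.plot_results()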
Discussion
^^^^^^^^^^

This simple spatial example with migration creates a set of nodes with synthetic populations and a linear connection structure. Node 0 is the 'urban' node: it has the largest population and is where we seed the infection. The migration matrix connects each node only to the next node (by index), so we expect infection to travel sequentially from node to node. We break the connection at node 13 just to show we can.
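
As a minimal sketch of that sequential matrix (the helper names mirror get_sequential_migration_matrix and break_matrix_node from the component, but the bodies here are illustrative assumptions rather than the exact code):

.. code-block:: python

    import numpy as np

    def sequential_migration_matrix(nodes):
        # Each node sends migrants only to the next node in the chain.
        matrix = np.zeros((nodes, nodes))
        for i in range(nodes - 1):
            matrix[i, i + 1] = 1.0
        return matrix

    def break_matrix_node(matrix, from_node, to_node):
        # Sever a single link so infection cannot pass that point.
        matrix[from_node][to_node] = 0

    m = sequential_migration_matrix(20)
    break_matrix_node(m, 13, 14)  # the epidemic should stall at node 13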

Now we make things a bit more interesting by taking a similar set of nodes but creating the migration matrix from the gravity function, so we effectively have a 2D network with rates proportional to population sizes. Distances are still very synthetic. We also change our migration function, since we have to be a little smarter when our matrix isn't sparse.
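
For intuition, here is a hand-rolled sketch of the kind of kernel the gravity() helper is assumed to produce (weight proportional to k * pop_i^a * pop_j^b / dist^c, with rows normalized to probabilities); the exact form used by the library may differ:

.. code-block:: python

    import numpy as np

    def gravity_sketch(pops, distances, k=1.0, a=0.5, b=0.5, c=2.0):
        # weight[i, j] ~ k * pop_i**a * pop_j**b / dist_ij**c
        pops = np.asarray(pops, dtype=float)
        dist = np.asarray(distances, dtype=float)
        with np.errstate(divide="ignore"):
            weights = k * np.outer(pops**a, pops**b) / dist**c
        np.fill_diagonal(weights, 0.0)  # no self-migration
        return weights / weights.sum(axis=1, keepdims=True)

    # Three synthetic nodes with unit distances off the diagonal
    print(gravity_sketch([1000, 200, 50], np.ones((3, 3)) - np.eye(3)))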

Migration Component (2D)
^^^^^^^^^^^^^^^^^^^^^^^^

.. code-block:: python
:linenos:
class MigrationComponent2D:
"""
Handles migration behavior of agents between nodes in the model.
Attributes:
model (MultiNodeSIRModel): The simulation model instance.
migration_matrix (ndarray): A matrix representing migration probabilities between nodes.
"""
def __init__(self, model):
"""
Initializes the MigrationComponent.
Args:
model (MultiNodeSIRModel): The simulation model instance.
"""
self.model = model
# Set initial migration timers
max_timer = int(1 / model.params["migration_rate"])
model.population.migration_timer[:] = np.random.randint(1, max_timer + 1, size=model.params["population_size"])
self.migration_matrix = self.get_gravity_migration_matrix(model.nodes)
def get_gravity_migration_matrix(self, nodes):
"""
Generates a gravity-based migration matrix based on population and distances between nodes.
Args:
nodes (int): Number of nodes in the simulation.
Returns:
ndarray: A migration matrix where each row represents probabilities of migration to other nodes.
"""
pops = np.array([np.sum(self.model.population.node_id == i) for i in range(nodes)])
distances = np.ones((nodes, nodes)) - np.eye(nodes)
migration_matrix = gravity(pops, distances, k=1.0, a=0.5, b=0.5, c=2.0)
migration_matrix = migration_matrix / migration_matrix.sum(axis=1, keepdims=True)
return migration_matrix
def step(self):
"""
Executes the migration step for the agent-based model.
This function selects a fraction of agents to migrate based on expired migration timers.
It then changes their node_id according to the migration matrix, ensuring that movements
follow the prescribed probability distributions.
Steps:
- Selects a subset of the population for migration.
- Determines the origin nodes of migrating agents.
- Uses a multinomial draw to assign new destinations based on the migration matrix.
- Updates the agents' node assignments accordingly.
Returns:
None
"""
# Decrement migration timers
self.model.population.migration_timer -= 1
# Identify agents ready to migrate
migrating_indices = np.where(self.model.population.migration_timer <= 0)[0]
if migrating_indices.size == 0:
return
np.random.shuffle(migrating_indices)
origins = self.model.population.node_id[migrating_indices]
origin_counts = np.bincount(origins, minlength=self.model.params["nodes"])
offset = 0
for origin in range(self.model.params["nodes"]):
count = origin_counts[origin]
if count == 0:
continue
origin_slice = migrating_indices[offset : offset + count]
offset += count
row = self.migration_matrix[origin]
row_sum = row.sum()
if row_sum <= 0:
continue
fraction_row = row / row_sum
destination_counts = np.random.multinomial(count, fraction_row)
destinations = np.repeat(np.arange(self.model.params["nodes"]), destination_counts)
self.model.population.node_id[origin_slice] = destinations[:count]
# Reset migration timers for migrated agents
self.model.population.migration_timer[migrating_indices] = np.random.randint(
1, int(1 / self.model.params["migration_rate"]) + 1, size=migrating_indices.size
)
Discussion
^^^^^^^^^^

This example is more advanced than the first: it moves from one dimension to two, and the network is fully connected. We should see infection spread from the large seed node to most, or potentially all, of the other nodes (depending on transmission rates, migration rates, and stochasticity), at a pace that is broadly a function of the destination nodes' populations. Because the smaller nodes do not vary greatly in population and the model is stochastic, expect a general correlation rather than a strict ordering.
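
One way to eyeball that claim, assuming the model has been run with the LaserFrame-based reporting above, is to plot each node's time of first infection against its population (a rough sketch, not part of the example code):

.. code-block:: python

    import numpy as np
    import matplotlib.pyplot as plt

    # Population per node and timestep at which each node first reports infection.
    pops = np.array([np.sum(model.population.node_id == i) for i in range(model.nodes)])
    arrival = np.full(model.nodes, np.nan)
    for i in range(model.nodes):
        hits = np.where(model.results.I[:, i] > 0)[0]
        if hits.size:
            arrival[i] = hits[0]

    plt.scatter(pops, arrival)
    plt.xlabel("Node population")
    plt.ylabel("Timestep of first infection")
    plt.title("Arrival time vs. population (gravity migration)")
    plt.show()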



Simple Spatial SIR Model with Real Data
=======================================

Design
------
Now we switch from synthetic population and spatial data to real population data. In this example, we have a csv file that starts like this:

::

region_id,population,centroid_lat,centroid_long,birth_rate
Ryansoro,46482.66796875,-3.707268618580818,29.79879895068512,11.65647
Ndava,72979.296875,-3.391556716979041,29.753430749757815,15.881549
Buyengero,76468.8125,-3.8487418774123014,29.53299692786253,12.503805
Bugarama,44571.8515625,-3.6904789341549504,29.400408879716224,11.025566
Rumonge,300248.03125,-3.9622108122897663,29.45711276535364,19.567726
Burambi,63219.703125,-3.798641437985548,29.452423323952797,9.199019
Kanyosha1,115017.984375,-3.4309688424403473,29.41531324224386,37.951366
Kabezi,71913.8359375,-3.5311012728218527,29.369968675926785,31.831919
Muhuta,88141.7109375,-3.623512958448508,29.415218642943234,21.598902
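
A quick sketch of loading a table like this with pandas and previewing the scaling the model applies internally; the file name and scale factor are illustrative assumptions:

.. code-block:: python

    import pandas as pd

    # Hypothetical file name; substitute the path to your own csv.
    population_data = pd.read_csv("population_data.csv")

    # Mirror of the scaling done inside the model: divide, round, cast to int.
    scale_factor = 10  # illustrative value
    scaled = (population_data["population"] / scale_factor).round().astype(int)
    print(f"nodes={len(scaled)}, total scaled population={scaled.sum()}")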



Code
^^^^

.. code-block:: python
:linenos:
class SpatialSIRModelRealData:
def __init__(self, params, population_data):
"""
Initialize the model, LASER-style, using the population_data loaded from a csv file (as a pandas DataFrame).
Create nodes and a migration_matrix based on populations and locations of each node.
"""
self.params = params
# We scale down population here from literal values but this may not be necessary.
population_data["scaled_population"] = (population_data["population"] / params["scale_factor"]).round().astype(int)
total_population = int(population_data["scaled_population"].sum())
print( f"{total_population=}" )
# Set up the properties as before
self.population = LaserFrame(capacity=total_population, initial_count=total_population)
self.population.add_scalar_property("node_id", dtype=np.int32)
self.population.add_scalar_property("disease_state", dtype=np.int32, default=0)
self.population.add_scalar_property("recovery_timer", dtype=np.int32, default=0)
self.population.add_scalar_property("migration_timer", dtype=np.int32, default=0)
node_pops = population_data["scaled_population"].values
self.params["nodes"] = len(node_pops)
node_ids = np.concatenate([np.full(count, i) for i, count in enumerate(node_pops)])
np.random.shuffle(node_ids)
self.population.node_id[:total_population] = node_ids
# seed in big node
big_node_id = np.argmax( node_pops )
available_population = population_data["scaled_population"][big_node_id]
initial_infected = int(params["initial_infected_fraction"] * available_population)
infection_indices = np.random.choice(np.where(self.population.node_id == big_node_id)[0], initial_infected, replace=False)
self.population.disease_state[infection_indices] = 1
self.population.recovery_timer[infection_indices] = np.random.uniform(params["recovery_time"] - 3, params["recovery_time"] + 3, size=initial_infected).astype(int)
pop_sizes = np.array(node_pops)
latitudes = population_data["centroid_lat"].values
longitudes = population_data["centroid_long"].values
distances = np.zeros((self.params["nodes"], self.params["nodes"]))
# Nested for loop here is written for readability, not performance
for i in range(self.params["nodes"]):
for j in range(self.params["nodes"]):
if i != j:
distances[i, j] = distance(latitudes[i], longitudes[i], latitudes[j], longitudes[j])
# Set up our migration_matrix based on gravity model and input data (pops & distances)
self.distances = distances
self.migration_matrix = gravity(pop_sizes, distances, k=10.0, a=1.0, b=1.0, c=1.0)
self.migration_matrix /= self.migration_matrix.sum(axis=1, keepdims=True) # normalize
Discussion
^^^^^^^^^^

We load the population data from the csv file, scale it down and round (optional), and assign node ids. The shuffling is probably optional but helps avoid ordering bias. We use the distance function from laser_utils (the import is not shown in the code snippet).
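
For reference, a possible set of imports for the snippets in this section; the module paths are assumptions and may differ between laser-core releases, so check your installed version:

.. code-block:: python

    # Module paths below are assumptions, not confirmed against a specific release.
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt
    from laser_core.laserframe import LaserFrame
    from laser_core.migration import gravity, distance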
