From 2f9d181e7b7baab46285881b3c5b67e3d61b83a5 Mon Sep 17 00:00:00 2001
From: Michael Deistler
Date: Mon, 25 Nov 2024 09:38:11 +0100
Subject: [PATCH] Add FAQ about rate-based networks (#531)

---
 docs/faq/question_03.md |  2 +-
 docs/faq/question_04.md | 52 +++++++++++++++++++++++++++++++++++++++++
 2 files changed, 53 insertions(+), 1 deletion(-)
 create mode 100644 docs/faq/question_04.md

diff --git a/docs/faq/question_03.md b/docs/faq/question_03.md
index 41fd3453..28375045 100644
--- a/docs/faq/question_03.md
+++ b/docs/faq/question_03.md
@@ -6,7 +6,7 @@
 
 - single-compartment (point neuron) Hodgkin-Huxley models
 - multi-compartment Hodgkin-Huxley models
-- rate-based neuron models
+- rate-based neuron models (tutorial [here](https://jaxley.readthedocs.io/en/latest/faq/question_04.html))
 
 For all of these models, `Jaxley` is flexible and accurate. For example, it can flexibly [add new channel models](https://jaxleyverse.github.io/jaxley/tutorial/05_channel_and_synapse_models/), use [different kinds of synapses (conductance-based, tanh, ...)](https://github.com/jaxleyverse/jaxley/tree/main/jaxley/synapses), and it can [insert different kinds of channels in different branches](https://jaxleyverse.github.io/jaxley/tutorial/01_morph_neurons/) (or compartments) within single cells.
 Like `NEURON`, `Jaxley` implements a backward-Euler solver for stable numerical solution of multi-compartment neurons.
diff --git a/docs/faq/question_04.md b/docs/faq/question_04.md
new file mode 100644
index 00000000..ff2ad827
--- /dev/null
+++ b/docs/faq/question_04.md
@@ -0,0 +1,52 @@
+# How can I implement rate-based neuron models in Jaxley?
+
+In this FAQ, we explain how to implement rate-based neuron models of the form:
+$$
+\tau \frac{dV}{dt} = -V + \sum w_{\text{syn}} \phi(V_{\text{pre}})
+$$
+Here, $\phi$ is a nonlinearity such as a `Tanh` or a `ReLU`.
+
+To implement this in `Jaxley`, we first set up a network consisting of
+point neurons:
+```python
+import jaxley as jx
+
+num_cells = 100
+cell = jx.Cell()  # Create a point neuron.
+net = jx.Network([cell for _ in range(num_cells)])
+```
+
+Next, we equip the neurons with a `Leak` channel to model the leak term
+$C \cdot dV/dt = -V$:
+
+```python
+from jaxley.channels import Leak
+
+net.insert(Leak())
+net.set("Leak_eLeak", 0.0)  # Center the dynamics around zero.
+net.set("Leak_gLeak", 1.0)  # We will deal with the time constant later.
+```
+
+Next, we connect the cells with `Tanh` synapses:
+```python
+from jaxley.connect import fully_connect
+from jaxley.synapses import TanhRateSynapse
+
+fully_connect(net.cell("all"), net.cell("all"), TanhRateSynapse())
+```
+
+Lastly, the time constant $\tau$ of the rate-based model corresponds to the
+`capacitance` in `Jaxley`:
+```python
+net.set("capacitance", 2.0)  # Default is 1.0.
+```
+
+That's it! As always, you can inspect your network by looking at `net.nodes` and
+`net.edges`.
+
+Equipped with this network, you can check out the
+[tutorial on how to simulate network models in Jaxley](https://jaxley.readthedocs.io/en/latest/tutorials/02_small_network.html).
+You can also check out the
+[API reference on different connect() methods](https://jaxley.readthedocs.io/en/latest/reference/jaxley.connect.html)
+(e.g. `sparse_connect()`) or the
+[tutorial on customizing synaptic parameters](https://jaxley.readthedocs.io/en/latest/tutorials/09_advanced_indexing.html).
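+
+For completeness, here is a minimal sketch of how such a rate network could be
+simulated, following the record/stimulate/integrate pattern from the network
+tutorial linked above. Treat it as a sketch rather than a reference: the chosen
+stimulus and time step are assumptions, and argument names (e.g. of
+`step_current`) may differ slightly between `Jaxley` versions.
+```python
+dt = 0.025  # Time step in ms (an assumed value, not prescribed by the model).
+
+# Record the voltage (i.e. the rate variable) of the first cell.
+net.cell(0).branch(0).loc(0.0).record("v")
+
+# Drive the second cell with a step current (delay, duration, amplitude).
+current = jx.step_current(i_delay=1.0, i_dur=5.0, i_amp=0.1, delta_t=dt, t_max=10.0)
+net.cell(1).branch(0).loc(0.0).stimulate(current)
+
+# Run the simulation and return the recorded voltages.
+voltages = jx.integrate(net, delta_t=dt)
+```
\ No newline at end of file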