
Flexible neural connectivity under constraints on total connection strength

Abstract

Neural computation is determined by neurons’ dynamics and circuit connectivity. Uncertain and dynamic environments may require neural hardware to adapt to different computational tasks, each requiring different connectivity configurations. At the same time, connectivity is subject to a variety of constraints, placing limits on the possible computations a given neural circuit can perform. Here we examine the hypothesis that the organization of neural circuitry favors computational flexibility: that it makes many computational solutions available, given physiological constraints. From this hypothesis, we develop models of connectivity degree distributions based on constraints on a neuron’s total synaptic weight. To test these models, we examine reconstructions of the mushroom bodies from the first instar larva and adult Drosophila melanogaster. We perform a Bayesian model comparison for two constraint models and a random wiring null model. Overall, we find that flexibility under a homeostatically fixed total synaptic weight describes Kenyon cell connectivity better than other models, suggesting a principle shaping the apparently random structure of Kenyon cell wiring. Furthermore, we find evidence that larval Kenyon cells are more flexible earlier in development, suggesting a mechanism whereby neural circuits begin as flexible systems that develop into specialized computational circuits.

Author summary

High-throughput electron microscopic anatomical experiments have begun to yield detailed maps of neural circuit connectivity. Uncovering the principles that govern these circuit structures is a major challenge for systems neuroscience. Healthy neural circuits must be able to perform computational tasks while satisfying physiological constraints. Those constraints can restrict a neuron’s possible connectivity, and thus potentially restrict its computation. Here we examine simple models of constraints on total synaptic weights, and calculate the number of circuit configurations they allow: a simple measure of their computational flexibility. We propose probabilistic models of connectivity that weight the number of synaptic partners according to computational flexibility under a constraint and test them using recent wiring diagrams from a learning center, the mushroom body, in the fly brain. We compare constraints that fix or bound a neuron’s total connection strength to a simple random wiring null model. Of these models, the fixed total connection strength matched the overall connectivity best in mushroom bodies from both larval and adult flies. We also provide evidence suggesting that neural circuits are more flexible in early stages of development and lose this flexibility as they grow towards specialized function.

Introduction

The connectivity of neural circuits, together with their intrinsic dynamics, determines their computation. A goal of systems neuroscience is to uncover and describe these computational mechanisms in specific circuits. For example, associative memory is a quintessential neural computation [1–3]. In cerebellar and cerebral cortices, random connectivity may form high-dimensional representations to facilitate associative memory [4–7]. The synaptic weight distributions of Purkinje cells and cortical pyramidal neurons are consistent with optimal associative memory in simple models [8–13].

At the same time, neuronal connectivity is constrained by resource limitations and homeostatic requirements. The total strength of synaptic connections between two neurons is limited by the amount of receptor and neurotransmitter available and the size of their synapses [14]. Homeostatic synaptic scaling in pyramidal neurons of mammalian cortex and hippocampus regulates their total excitatory [15–18] and inhibitory [19–23] synaptic input strengths to regulate activity levels [24].

The neural connectivity necessary for a given computation must exist or develop within these physiological constraints. Furthermore, the computations performed by a circuit may require modification based on exposure to the environment and the needs of the organism. Physiological constraints could conflict with a required circuit configuration and pose a challenge to computational learning. It might thus be advantageous for circuits, under a fixed constraint, to enable a broad array of computations. We refer to connectivity patterns that can enable many computations as “flexible” under a constraint. Here we seek to understand whether this flexibility can predict neural connectivity patterns.

We examine the interaction between constraints and computations in the mushroom body, a cerebellum-like associative memory center in Drosophila melanogaster and other insects [25, 26]. Mushroom bodies are composed largely of Kenyon cells (KCs), which receive input from sensory projection neurons (PNs; Fig 1a). KCs also connect recurrently to each other, receive inhibitory and modulatory inputs, and project to mushroom body output neurons (MBONs). A combinatorial code for odorants in KCs [27–29] forms a substrate for associative learning at KC-MBON synapses [25, 30]. In D. melanogaster, neurons from a variety of circuits exhibit homeostatic regulation of connectivity during growth from the first instar larva to the third instar larva. This includes a homeostatic regulation of mechanosensory receptive fields [31], functional motor neuron outputs [32], and nociceptive projections [33]. Changes in inputs to central motor neurons elicit structural modifications to their dendrites that homeostatically maintain input levels [34]. Finally, changes in olfactory PN activity lead to homeostatic compensations in the number and size of their output synapses in the mushroom body [35–37]. Connections in the mushroom body also are generally limited to a few synapses per connection [38, 39]. We thus hypothesize that mushroom body connectivity in D. melanogaster might be structured to be computationally flexible under physiological constraints.

Fig 1. Flexible connectivity under constraints.

(a) Mushroom body circuitry (cartoon based on [38]). (b) Synaptic weights occupy a K-dimensional space. K is the number of synaptic partners. The solution spaces for computational tasks are subspaces of the synaptic weight space, with dimension up to K. Constraints also define subspaces with dimension up to K. A tight constraint defines a small subspace with low potential overlap with computational solution spaces. (c) A loose constraint defines a large subspace, with greater potential overlap with computational solution spaces. (d) Cartoon of a postsynaptic resource constraint: a neuron with M = 3 units of postsynaptic weight (e.g., receptors) to distribute amongst two synaptic partners. (e) Cartoon of a presynaptic resource constraint: a neuron with M = 4 units of synaptic weight (e.g., vesicles) to distribute amongst two partners. (f) Number of possible connectivity configurations for different values of K and M (given by the binomial coefficient).

https://doi.org/10.1371/journal.pcbi.1008080.g001

Consider the connections from one KC to its K targets. We will describe each projection by one synaptic weight, summed across synapses if the projection is multi-synaptic, so that this KC's output weight configuration is a point in a K-dimensional synaptic weight space. Regions of this space might correspond to weight configurations that support different learned associations or computations (Fig 1b; "computation spaces"). A physiological or homeostatic constraint, such as those discussed above, also defines a region of allowed connectivity configurations (Fig 1b; "constraint space"). If a constraint is very tight, it might only allow a few configurations, and even disallow computationally useful configurations, as in Fig 1b, where the constraint and computation spaces do not overlap. In order for the KC to be both healthy and computationally useful, its connectivity must lie in the intersection of the constraint space and a computation space. This is easier if the constraint space is large, that is, if the KC's connectivity is flexible under its constraints (Fig 1c). "Flexibility" thus refers to the number of possible joint configurations of all this KC's outputs and is a property of the neuron's full K-dimensional output connectivity rather than of an individual synapse. While we discussed the flexibility of a KC's output connectivity here, the same idea can be applied to its inputs, or those of MBONs, or of other neurons in other systems.

In this study, we formulate this idea for simple models of two constraints: a (1) bounded or (2) fixed total connection strength. We propose that circuits might face a pressure to be flexible under these constraints. This motivates probabilistic models for the number of synaptic partners to a neuron. We test these models against each other and a simple random wiring null model using recently available electron microscopic wiring diagrams of the mushroom body from larval [38] and adult [39] D. melanogaster. We found that overall, the fixed net weight model provided the best description for neurons’ numbers of synaptic partners in the mushroom bodies. The one exception we saw was in the most mature KCs of the larval mushroom body, which were better described by a binomial random wiring model. This suggests a developmental progression in the pressures shaping KC wiring in the first instar larva of D. melanogaster.

Results

Measuring constraint flexibility

We begin with a simple example where a neuron has M units of synaptic weight, of size ΔJ, available. These could correspond, for example, to individual receptors or vesicles. The neuron can assign these synaptic weight units to its K partners (presynaptic partners for receptors, Fig 1d, or postsynaptic partners for vesicles, Fig 1e). We will also call the number of synaptic partners the degree or connectivity degree.

To measure how flexible the neuron is with M synaptic weight units and K partners, we can count possible connectivity configurations. Since the constraint treats all synaptic partners symmetrically, the number of possible configurations is given by the binomial coefficient "M choose K". For M = 4 and degree two, there are six possible configurations. With M = 4 and degree three, there are four possible configurations. Thus with the constraint of M = 4, the neuron is more flexible with two connections than three since there are more ways to satisfy the constraint. The neuron's flexibility under this constraint (of fixed total synaptic resources MΔJ) is a function only of the number of synaptic partners K. For different numbers of synaptic weight units M, the flexibility exhibits different profiles as a function of the degree K (Fig 1f).
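This count can be checked directly. The following Python sketch (with illustrative values of M and K) compares the closed-form count against brute-force enumeration, under the reading that each of the K partners receives at least one of the M discrete weight units and at most M units are assigned in total:

```python
# Counting the weight configurations of Fig 1d-1f: under the reading that each
# partner gets at least one unit and at most M units are used in total, the
# count is the binomial coefficient C(M, K). Verified by enumeration below.
from itertools import product
from math import comb

def n_configurations(M: int, K: int) -> int:
    """Closed-form count of weight configurations: M choose K."""
    return comb(M, K)

def brute_force_count(M: int, K: int) -> int:
    """Enumerate weight vectors with each weight >= 1 unit and total <= M units."""
    return sum(1 for w in product(range(1, M + 1), repeat=K) if sum(w) <= M)

for M in (3, 4):
    for K in range(1, M + 1):
        assert n_configurations(M, K) == brute_force_count(M, K)
        print(f"M={M}, K={K}: {n_configurations(M, K)} configurations")
# For M = 4: degree two gives 6 configurations and degree three gives 4, so
# under this constraint the neuron is more flexible with two partners than three.
```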

Synaptic weights can be made up of many small units of strength, corresponding to (for example) individual receptors or vesicles. So, we will model individual synaptic weights as continuous variables rather than using the discrete description above. Throughout, we will use "synaptic weight" and "connection strength" interchangeably to refer to the total strength of projections from one neuron to another.

Degree distributions from constraints on net synaptic weights

We consider a simple model of synaptic interactions where a neuron has K synaptic partners and the strength of projection i is $J_i$. The first constraint we consider is an upper bound on the total connection strength:
$$\sum_{i=1}^{K} J_i \le \bar{J}. \qquad (1)$$

The bound could be interpreted multiple ways, for example as a presynaptic limit due to the number of vesicles currently available before more are manufactured or a postsynaptic limit due to the amount of dendritic tree available for synaptic inputs. The value of $\bar{J}$ could be set by metabolic or resource constraints. Rather than modeling the biological origin of $\bar{J}$, we will focus on the structure this constraint imposes in the K-dimensional synaptic weight configuration space.

With K synaptic partners, the constraint (Eq 1) defines a K-dimensional volume. For a neuron with two synaptic partners, this is the portion of the plane bounded by the axes and a line that stretches from $(\bar{J}, 0)$ to $(0, \bar{J})$ (Fig 2a). For three synaptic partners, the weight configurations live in three-dimensional space and are constrained to lie in the volume under an equilateral triangle (Fig 2b). It is equilateral because its vertices are defined by the configurations where one connection uses the total weight, $J_i = \bar{J}$. In general, for K synaptic partners the synaptic weights live in the volume under a K − 1 dimensional simplex (the geometric generalization of a triangle to higher dimensions). This K-dimensional volume is [40]
$$V(K, \bar{J}) = \frac{\bar{J}^K}{K!}. \qquad (2)$$

Fig 2. Constraints on total synaptic weights.

(a-c) An upper bound on the total synaptic weight. (d-f) A fixed total synaptic weight. (a) For two inputs with a total synaptic weight of at most $\bar{J}$, the synaptic weights must live in the area under a line segment from $(\bar{J}, 0)$ to $(0, \bar{J})$ (a regular 1-simplex). (b) For three inputs, the synaptic weights must live in the volume under a regular two-simplex. (c) Volume of the K − 1 simplex as a function of the number of presynaptic partners, K, for different maximal net weights $\bar{J}$. (d) For two inputs with a total synaptic weight fixed at $\bar{J}$, the synaptic weight configurations must be on the line segment from $(\bar{J}, 0)$ to $(0, \bar{J})$. (e) The solution space for the fixed net weight constraint with three inputs is an equilateral planar triangle (a regular 2-simplex). (f) Surface area of the regular K − 1 simplex as a function of the number of presynaptic partners, K, for different net synaptic weights $\bar{J}$.

https://doi.org/10.1371/journal.pcbi.1008080.g002

This extends the counting model of Fig 1d–1f to the case of a large pool of synaptic resources (Measuring synaptic weight configuration spaces) and measures the size of the space of allowed circuit configurations under the constraint of Eq 1. The volume under the simplex increases with the maximum weight $\bar{J}$, but for fixed $\bar{J}$ it has a maximum at some K ≥ 1 and then decays as a function of K (Fig 2c).

Under our hypothesis of flexible computation under a constraint, there should be a pressure towards circuit structures with a large number of allowed synaptic weight configurations. There may be other competing pressures on the circuit architecture, which prevent it from being optimally flexible (achieving the largest number of possible configurations). To model the pressure towards flexibility, we thus stipulate that the probability of having K synaptic partners given resource limits is proportional to the number of possible configurations, i.e., the volume of the weight space with K partners. For a bounded net synaptic weight:
$$p_V(K \mid \bar{J}) = \frac{1}{Z_V} \frac{\bar{J}^K}{K!}, \qquad (3)$$
where the subscript V marks the probability distribution as proportional to the simplex's volume and the normalization constant, $Z_V$, ensures that the probability distribution sums to 1:
$$Z_V = \sum_{K=1}^{\infty} \frac{\bar{J}^K}{K!}. \qquad (4)$$

This normalization constant can be computed exactly, $Z_V = e^{\bar{J}} - 1$, to reveal a zero-truncated Poisson distribution:
$$p_V(K \mid \bar{J}) = \frac{\bar{J}^K}{K! \left(e^{\bar{J}} - 1\right)}, \quad K \ge 1. \qquad (5)$$

Note that this is a distribution for the number of synaptic partners to a neuron (its degree), not for its synaptic weights. The degree distribution is conditioned on the maximum total synaptic weight, as measured by the parameter $\bar{J}$.
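As a concrete illustration, the degree distribution of Eq 5 can be evaluated numerically; a minimal sketch (the values of $\bar{J}$ are illustrative):

```python
# Sketch of the bounded-net-weight degree model (Eq 5): a zero-truncated
# Poisson distribution over the degree K with rate parameter Jbar.
import numpy as np
from scipy.special import gammaln

def log_pV(K, Jbar):
    """Log probability of K >= 1 partners under Eq 5."""
    K = np.asarray(K, dtype=float)
    # log [ Jbar^K / (K! (e^Jbar - 1)) ]
    return K * np.log(Jbar) - gammaln(K + 1.0) - np.log(np.expm1(Jbar))

K = np.arange(1, 60)
for Jbar in (2.5, 5.5, 10.5):
    p = np.exp(log_pV(K, Jbar))
    print(Jbar, p.sum(), K[np.argmax(p)])  # sums to ~1; mode near Jbar
```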

In addition to facing resource constraints, neurons also homeostatically regulate their total input strengths. Motivated by this, the next constraint we consider holds the total synaptic weight fixed at $\bar{J}$:
$$\sum_{i=1}^{K} J_i = \bar{J}. \qquad (6)$$

This constraint is satisfied on the surfaces of the same simplices discussed above (Fig 2d and 2e). Their surface area is (Measuring synaptic weight configuration spaces):
$$A(K, \bar{J}) = \frac{\sqrt{K}\, \bar{J}^{K-1}}{(K-1)!}. \qquad (7)$$

Like the simplex's volume, the surface area increases with the total synaptic weight $\bar{J}$, but for fixed $\bar{J}$ it has a maximum at some K ≥ 1 (Fig 2f). We will also examine the size of this constraint surface as a model for degree distributions:
$$p_A(K \mid \bar{J}) = \frac{1}{Z_A} \frac{\sqrt{K}\, \bar{J}^{K-1}}{(K-1)!}, \qquad (8)$$
where $p_A$ denotes the probability proportional to the simplex's surface area and the normalization constant is
$$Z_A = \sum_{K=1}^{\infty} \frac{\sqrt{K}\, \bar{J}^{K-1}}{(K-1)!}. \qquad (9)$$

In contrast to the bounded net weight model (Eq 5), we are not aware of an exact solution for ZA. When required, we will either approximate it by truncating at large K or bound it (Model comparison: Fixed net weight model).
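A minimal numerical sketch of this model, with $Z_A$ approximated by truncating the sum in Eq 9 at a large cutoff (the cutoff here is an illustrative choice):

```python
# Sketch of the fixed-net-weight (simplex area) degree model (Eq 8), with the
# intractable normalizer Z_A (Eq 9) approximated by truncating the sum at K_max.
import numpy as np
from scipy.special import gammaln

def log_area(K, Jbar):
    """Log simplex surface area, Eq 7: sqrt(K) * Jbar^(K-1) / (K-1)!."""
    K = np.asarray(K, dtype=float)
    return 0.5 * np.log(K) + (K - 1.0) * np.log(Jbar) - gammaln(K)

def log_pA(K, Jbar, K_max=1000):
    Ks = np.arange(1, K_max + 1)
    log_ZA = np.logaddexp.reduce(log_area(Ks, Jbar))  # truncated Eq 9
    return log_area(K, Jbar) - log_ZA

K = np.arange(1, 40)
p = np.exp(log_pA(K, Jbar=8.0))
print(p.sum(), K[np.argmax(p)])  # ~1 once the tail is negligible; mode near Jbar + 1
```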

Testing degree distribution models

To test these models, Eqs 5 and 8, requires joint measurements of neurons' total synaptic weight and number of synaptic partners. One type of data with measurements reflecting both of these is dense electron microscopic (EM) reconstruction with synaptic resolution, where (in a large enough tissue sample) all of a neuron's synaptic partners can be identified and the size or number of synapses provide indirect measurements of the connection strength.

The published EM wiring diagrams of D. melanogaster mushroom bodies measure synaptic strengths by the count of synapses [38, 39]. While we are not aware of joint measurements of synapse counts and physiological connection strength in Kenyon cells, the relationship of anatomical and physiological measures of connection strength has been studied in mammalian pyramidal neurons. There, synapse size and synapse strength are highly correlated [41–45]. We thus assumed that the total number of synapses onto a neuron, S, is proportional to its total connection strength constraint:
$$\bar{J} = \alpha S, \qquad (10)$$
where α is the unknown constant relating the synapse count and the net synaptic weight. This assumption introduces α as an additional parameter in our degree distribution models, Eqs 5 and 8, so that each neuron's degree is conditioned on two things: the unknown parameter α and that neuron's number of synapses S.

In addition to the two constraint-inspired models of Eqs 5 and 8, we also examined a simple random wiring null model where the number of partners follows a zero-truncated binomial distribution:
$$p_B(K \mid N, q) = \frac{1}{1 - (1-q)^N} \binom{N}{K} q^K (1-q)^{N-K}, \quad 1 \le K \le N. \qquad (11)$$

This binomial wiring model assumes that each of N potential synaptic partners to a Kenyon cell has a fixed probability q of making a connection, and that whether or not different potential partners actually connect is independent. We used anatomical measurements for N (Model comparison: Zero-truncated binomial model) and took q as an unknown parameter for this model. Note that this is a binomial model for connections, in contrast to the binomial model for resource allocation of Fig 1d–1f.
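A short sketch of this null model, checking its normalization and mean (the values of N and q here are illustrative, not the anatomical estimates used below):

```python
# Sketch of the zero-truncated binomial null model (Eq 11): each of N potential
# partners connects independently with probability q, conditioned on K >= 1.
import numpy as np
from scipy.stats import binom

def pB(K, N, q):
    """Probability of K >= 1 partners under the zero-truncated binomial."""
    K = np.asarray(K)
    return binom.pmf(K, N, q) / (1.0 - (1.0 - q) ** N)

N, q = 40, 0.1
K = np.arange(1, N + 1)
print(pB(K, N, q).sum())                                    # ~1
print((K * pB(K, N, q)).sum(), N * q / (1 - (1 - q) ** N))  # matching means
```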

To measure and compare how well these models explain KC connectivity we computed their Bayesian evidence: the likelihood of the data under a model, marginalizing over the unknown parameters (Model comparison). For the fixed net weight, for example, the evidence is
$$p_A\left(\{K_i\} \mid \{S_i\}\right) = \int d\alpha\, p(\alpha) \prod_i p_A(K_i \mid \alpha S_i), \qquad (12)$$
where i indexes KCs and p(α) is a prior distribution for α. We have a corresponding integration over α to find the model evidence of the bounded net weight model, and for the binomial wiring model an integration over the unknown connection probability q.

Calculating the model evidence requires choosing a prior distribution for the unknown parameter (p(α) in Eq 12). We will use flat priors, as well as the Poisson Jeffreys prior. (Jeffreys priors maintain uncertainty under different scaling or unit choices.) The normalization constant for the fixed net weight model (ZA, Eq 9) was not analytically tractable, so we computed upper and lower bounds for it. These bounds for ZA then gave us bounds on the fixed net weight model’s evidence (Model comparison: Fixed net weight model).
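To illustrate the evidence computation of Eq 12, the sketch below marginalizes the bounded net weight likelihood over α by direct numerical quadrature under a flat prior on an assumed range; the degree and synapse-count arrays are hypothetical stand-ins for the connectome data, and the analyses reported here instead use the Laplace approximation described in Model comparison:

```python
# Hedged sketch: model evidence (Eq 12) for the bounded-net-weight model by
# direct quadrature over alpha. 'degrees' and 'synapses' are hypothetical data;
# the flat-prior support is an assumption for illustration only.
import numpy as np
from scipy.special import gammaln
from scipy.integrate import trapezoid

def loglik_V(alpha, degrees, synapses):
    Jbar = alpha * synapses  # Eq 10: net weight proportional to synapse count
    return np.sum(degrees * np.log(Jbar) - gammaln(degrees + 1.0)
                  - np.log(np.expm1(Jbar)))

degrees = np.array([3, 5, 2, 8, 4])        # hypothetical K_i
synapses = np.array([12, 20, 9, 30, 15])   # hypothetical S_i
alphas = np.linspace(1e-3, 2.0, 4000)      # assumed flat-prior support
logL = np.array([loglik_V(a, degrees, synapses) for a in alphas])
# log evidence = log of (integral of likelihood) times prior density 1/range
log_evidence = (np.log(trapezoid(np.exp(logL - logL.max()), alphas))
                + logL.max() - np.log(alphas[-1] - alphas[0]))
print(log_evidence)
```

The log evidence ratio of two models computed this way is the log odds used throughout the Results.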

Larval Kenyon cell outputs

We first examined KC outputs in the first instar larva, using the complete synaptic wiring diagram of its 223 KCs (110 on the left side of the brain and 113 on the right) from [38]. We excluded projections to the inhibitory APL neuron, the modulatory dopaminergic and octopaminergic neurons, and interneurons, so the out-degree of each KC measures its number of postsynaptic KCs and MBONs. (There are 48 MBONs, 24 on each side of the brain.) We obtained similar results to those reported here, however, when including those other synapses. As in previous studies, we used only reliable multi-synapse connections (but obtained similar results when including single-synapse connections) [38, 39]. Larval KCs can be morphologically classified by their age. The dendrites of mature KCs form claws around PN axons. The 78 young KCs do not have claws, and the 36 single-claw KCs are older than the 109 multi-claw KCs [38]. KCs have a wide range of out-degrees, with very different out-degree distributions for young and clawed KCs (Fig 3a, histograms).

Fig 3. Kenyon cell degree distributions in larval D. melanogaster.

(a) Distribution of number of postsynaptic partners for larval KCs. Shaded histograms: empirical distribution. Solid lines: the marginal simplex area distribution at the maximum likelihood value of α, after integrating out the number of synapses against its empirical distribution. Dotted lines: the maximum likelihood binomial distribution. (b) Lower bound for the log evidence ratio (log odds) for the fixed net weight and binomial wiring models. Positive numbers favor the fixed weight model. Evidences computed by a Laplace approximation of the marginalization over the parameters. (c) Lower bound for the log odds for the fixed net weight and bounded net weight models; positive numbers favor the fixed weight model. (d) Log likelihood of the fixed net weight model as a function of the scaling between synapse counts and net synaptic weights, α. (e-h) Same as (a-d) but for inputs to larval KCs.

https://doi.org/10.1371/journal.pcbi.1008080.g003

As a first test, we examined the maximum likelihood marginal degree distributions. That is, we computed the maximum likelihood value of α to obtain the conditional degree distribution and then marginalized out the number of synapses, S, using its observed distribution to compute the marginal degree distribution $p_A(K) = \sum_S p_A(K \mid \hat{\alpha} S)\, \hat{p}(S)$ (Fig 3a, solid lines). For the binomial model we computed the maximum likelihood value of the connection probability q (Fig 3a, dashed lines). For the young and clawed KCs, the fixed net weight model appeared to match the marginal degree distributions better than the binomial wiring model (Fig 3a).

To quantitatively compare how well two models explained the data, we computed their log evidence ratio (log odds). The log odds for the fixed net weight model $p_A$ versus the bounded net weight model $p_V$ are
$$L = \ln \frac{p_A\left(\{K_i\} \mid \{S_i\}\right)}{p_V\left(\{K_i\} \mid \{S_i\}\right)}, \qquad (13)$$
and positive L favors the fixed net weight model, while negative L favors the bounded net weight model. For example, L = 1 means that the data are exp(1) ≈ 2.72 times more likely under the fixed net weight model than the bounded net weight model and L = 10 means the data are exp(10) ≈ 22026.47 times more likely under the fixed net weight model.

We found that the log odds favored the fixed net weight model over the binomial wiring model for young and multi-claw KCs (Fig 3b; log odds at least 48.17 and 201.8 respectively), but not for single- or two-claw KCs (Figure Ab in S1 Figs; log odds at most -0.78 and -0.50 respectively). For KCs with three or more claws, the log odds for the fixed weight model over the binomial wiring model were at least 20.3 (Figure Ab in S1 Figs). The log odds favored the bounded net weight model over the binomial wiring model in the same cases: for young and clawed KCs, except for single- and two-claw KCs (Figure Ac in S1 Figs). The fixed net weight model described KC output degree distributions better than the bounded net weight for all types of KC in the larva (Fig 3c, Figure Aa in S1 Figs). To control for our choice of prior (e.g., p(α) in Eq 12) we also performed a model comparison using the Jeffreys prior for the Poisson distribution (Model comparison) for α and q. Under the Poisson Jeffreys prior, the only result that changed was that the bounded net weight was the best model for the young KC outputs (Figure Ad-f in S1 Figs).

We next asked whether the relationship between anatomical and physiological synaptic weights exhibited a similar developmental trajectory as the model likelihoods, with multi-claw KCs appearing more similar to young KCs. To this end, we examined the likelihood of the data under the fixed net weight model as a function of the scaling parameter α (Fig 3d). The maximum likelihood values of α decreased with KC age, from 0.29 for young to 0.22 for multi-claw and 0.17 for single-claw KCs (Fig 3d). The two-claw KCs exhibited a similar scaling as the single-claw KCs (0.17), while KCs with three or more claws had higher α values (0.24 for three- and four-claw KCs, 0.28 for five- and six-claw KCs). This suggests that the relationship between net synapse counts and regulated net synaptic weights in the larval mushroom body may become weaker during KC maturation.

Larval Kenyon cell inputs

We next examined the inputs to KCs in the larva. As for the outputs, we examined multi-synapse connections and excluded inputs from the inhibitory APL and modulatory neurons (but obtained similar results when including them). There is a wide distribution of in-degrees for both young and clawed KCs (Fig 3e, histograms). The maximum likelihood fit of the fixed net weight model appeared a much better fit than the maximum likelihood binomial for young KC inputs (Fig 3e, black solid vs dashed curves). For clawed KCs, it was less immediately clear which maximum likelihood model better explained the in-degree distribution (Fig 3e, blue solid vs dashed curves).

We again examined the log odds for pairs of models. The log odds favored the simplex area model over the binomial model for both young and clawed KCs (Fig 3f, log odds at least 70.46 and 0.97). We next asked whether this depended on the number of claws. The log odds favored the fixed net weight model over the binomial wiring model for multi-claw KCs, but not single-claw KCs (Fig 3f, log odds at least 9.92 for multi-claw KCs; Figure Bb in S1 Figs, log odds at most -1.57 for single-claw KCs). Within multi-claw KCs, the log odds also favored the binomial model over the fixed weight model for two-claw KCs, but not for KCs with at least three claws (Figure Bb in S1 Figs). We found similar results comparing the bounded net weight model with binomial wiring (Figure Bc in S1 Figs). The log odds favored the fixed net weight model over the bounded weight model for all KC types (Fig 3g). We obtained similar results under the Poisson Jeffreys prior for α and q (Figure B in S1 Figs). Since single-claw KCs are more mature than multi-claw KCs [38], these results together suggest that flexibility under a homeostatically fixed net weight governs KC input connectivity early in development, with other factors shaping connectivity after sensory and behavioral experience.

We next asked whether the relationship between anatomical and physiological input synaptic weights exhibited a similar developmental trajectory. We saw that the maximum likelihood value of α, $\hat{\alpha}$, decreased with KC age (Fig 3h; maximum likelihood values of α: 0.33, 0.25, and 0.21 for young, multi-claw, and single-claw KCs, respectively). The scaling for single- and two-claw KCs was similar ($\hat{\alpha}$ of 0.2 for two-claw KCs), while KCs with three or more claws had higher values of $\hat{\alpha}$. Under the simple model of Eq 10, these suggest that the translation of synapse counts into a physiologically regulated net synaptic weight becomes weaker during KC maturation. This could relate to the spatial concentration of synapses in claws of the dendrite.

Larval MBON inputs

The next stage of mushroom body processing after KCs occurs at MBONs (Fig 1a). They exhibit a wide range of in-degrees (Fig 4a), from three presynaptic KCs (for the left MBON-n1) to 105 presynaptic KCs (for the left MBON-m1) [38]. Neither the maximum likelihood fixed net weight model (Fig 4a, black) nor the maximum likelihood binomial model (Fig 4a, blue) appeared to fit the MBON in-degree distribution as well as the corresponding models fit the KCs' (Fig 3a and 3e). The fixed net weight model matched the breadth of the degree distribution, however, while the binomial model did not. We observed similar results for the bounded net weight model. To test which model provided a better explanation of the data overall, not just at a single parameter value, we again computed their log odds (Fig 4b). The log odds favored the fixed net weight model over both the bounded net weight (log odds at least 7.70) and binomial models (log odds at least 249.44). This was despite the fixed and bounded net weight models' likelihoods being sharper functions of α than the binomial model's likelihood as a function of q (Fig 4c).

Fig 4. Mushroom body output neuron degree distributions in larval D. melanogaster.

(a) Number of inputs to MBONs. Shaded histograms: empirical distribution. Black curve: the marginal simplex area distribution at the maximum likelihood value of α, after integrating out the number of synapses against its empirical distribution. Blue curve: the maximum likelihood binomial distribution. (b) Lower bound for the log evidence ratio (log odds) for the fixed net weight and binomial wiring models. Positive numbers favor the fixed weight model. Evidences computed by a Laplace approximation of the marginalization over the parameters. (c) Likelihood vs model parameter for the fixed net weight (black) and binomial (blue) models.

https://doi.org/10.1371/journal.pcbi.1008080.g004

In summary, we found that the degree distribution predicted by flexible wiring under a homeostatically fixed total connection strength was the best overall model for KC input and output degrees, and MBON input degrees (Figs 3 and 4). The one exception to this was the single-claw KCs, which were best described by a binomial wiring model (Fig 3b and 3f).

Adult Kenyon cell outputs

To test the generality of these results, we turned to a related circuit: the adult D. melanogaster mushroom body. It contains the same general types of cells as the larva, though in different numbers, with the same broad circuit structure (Fig 1a). We examined a recent connectome of the alpha lobe of the adult mushroom body from Takemura et al. [39]. The alpha lobe is defined by KC axons, so these data do not include the PN inputs which target dendrites. It contains the axons of 132 alpha prime lobe KCs and 949 alpha lobe KCs. Like in the larva, the age of adult KCs can be classified morphologically. KCs of the alpha prime lobe are born before KCs of the alpha lobe. In the alpha lobe, the 78 posterior KCs are born before the 480 surface KCs, which are in turn born before the 259 core KCs [46, 47].

Since the adult data are only for axo-axonal connectivity, we first repeated our previous analysis without KC-MBON connections to examine the axo-axonal KC output connectivity in the larva. We found similar results as for the full connectivity (Figure Cf, g in S1 Figs).

In the adult, Kenyon cells had heterogeneous out-degrees, with alpha lobe KCs exhibiting a bimodal distribution (Fig 5a). This bimodality was reflected in the out-degrees of posterior, core, and surface KCs, rather than arising from the separate alpha lobe KC types. The fixed net weight model predicted adult KC out-degree distributions better than the binomial wiring model for all KC types (Fig 5b), as did the bounded net weight model (Figure Df in S1 Figs). The fixed net weight model described the out-degree distributions of all types of adult KC better than the bounded net weight model, except when posterior, core, and surface KCs were all considered together (Fig 5c, blue). These results were consistent when using the Poisson Jeffreys prior for α and q (Figure Dd-f in S1 Figs). In summary, the degree distributions of KC outputs in the adult alpha lobe are best described by flexibility under a homeostatically fixed net synaptic weight.

Fig 5. Kenyon cell degree distributions in adult D. melanogaster.

(a) Distribution of number of postsynaptic partners for adult KCs. Shaded histograms: empirical distribution. Solid lines: the marginal simplex area distribution at the maximum likelihood value of α, after integrating out the number of synapses against its empirical distribution. Dotted lines: the maximum likelihood binomial distribution. (b) Lower bound for the log evidence ratio (log odds) for the fixed net weight and binomial wiring models. Positive numbers favor the fixed weight model. Evidences computed by a Laplace approximation of the marginalization over the parameters. (c) Lower bound for the log odds for the fixed net weight and bounded net weight models; positive numbers favor the fixed weight model. (d) Log likelihood of the fixed net weight model as a function of the scaling between synapse counts and net synaptic weights. (e-h) Same as (a-d) but for inputs to adult KCs.

https://doi.org/10.1371/journal.pcbi.1008080.g005

The scaling between synapse counts and synaptic weights varied by KC type in the adult (Fig 5d). The maximum likelihood values of α were 0.58, 0.54, 0.7, and 0.67 for alpha prime, posterior, surface, and core KCs respectively (in approximate developmental order). These suggest a divide, with less mature KCs having more synaptic weight per synapse, on average, than more mature KCs. The log likelihood for the alpha prime KCs was also much less sensitive to low values of α than that for the other KC types (Fig 5d, black vs colored curves), suggesting a more heterogeneous or flexible relationship between synapse counts and regulated output synaptic weights in alpha prime KCs.

Adult Kenyon cell inputs

As in the larva, adult KCs exhibited a range of in-degrees. KCs in the alpha prime lobe receive fewer axonal inputs than KCs in the alpha lobe (Fig 5e). As before, we computed the maximum likelihood marginal degree distributions, and saw that the binomial model appeared much worse than the fixed net weight model (Fig 5e, solid vs dashed lines). This observation was borne out by the models' evidences.

For comparison with the adult data, we again examined KC-KC connectivity in the larva, neglecting the inputs from projection neurons onto KC dendrites, and found similar results as for the full connectivity (Figure Ca-d in S1 Figs).

In the adult, the fixed net weight model explained the in-degree distribution of every KC type better than the binomial model (Fig 5f; log odds at least 232.57, 551.5, 5.36, 85.82, and 83.02 for alpha prime, all alpha lobe, posterior, core, and surface KCs respectively). The fixed net weight model also explained the in-degree distributions better than the bounded weight model (Fig 5f; log odds at least 4.59, 9.0, 0.87, 2.0, and 6.96 for alpha prime, all alpha lobe, posterior, core, and surface KCs respectively). The upper bounds for the log odds of the fixed net weight distribution were close to the lower bounds for the adult KCs (Figure Ea, b in S1 Figs). We found similar results using the Poisson Jeffreys prior for the unknown parameters α and q (Figure Ed-f in S1 Figs). Together with the consistent results for adult KC output degrees (Fig 5a–5d), these suggest that flexibility under a fixed net synaptic weight governs KC connectivity in the alpha lobe of the mushroom body.

The scaling between the synapse count and net synaptic weight, α, exhibited similar patterns for adult KC inputs and outputs (Fig 5d vs 5h). The maximum likelihood values of α were 0.68, 0.6, 0.73, and 0.72 for alpha prime, posterior, surface, and core KCs (ordered from approximately oldest to youngest). These suggest that in the alpha lobe, surface and core KCs have more input synaptic weight per synapse on average than posterior KCs. The log likelihood for the alpha prime KCs was much less sensitive to small values of α than those of the classes of alpha lobe KCs (Fig 5h, black vs colored curves), suggesting a more heterogeneous or flexible relationship between axonal input synapse counts and regulated synaptic weights in alpha prime KCs.

Measuring the cost of changing joint synaptic weight configurations

In measuring the flexibility of connectivity under a constraint, we measured the difference between two synaptic weight configurations by their straight-line (Euclidean) distance: the root sum squared difference in each synaptic weight. This was the origin of the factor $\sqrt{K}$ in the surface area of the simplex (Eq 7). It corresponds to the assumption that different connections can potentiate or depress simultaneously: for example, that a vesicle can be taken from one connection, depressing it, and given to another connection to potentiate it (Fig 1e). This implies that the cost of changing one synaptic weight by an amount d is the same as that of changing two weights by $d/\sqrt{2}$ each.

Potentiating one connection and depressing another might, however, have separate costs. This can be modeled by choosing a different norm for the space of synaptic weight configurations. For example, a neuron's connections might potentiate or depress separately, so that the cost of changing one connection by d is the same as the cost of changing two connections by d/2 each. In this case distances between configurations are measured by the 1-norm, given by the sum of absolute differences. This changes the measure of the surface area of the simplex, replacing $\sqrt{K}$ by K in Eq 7. This does not change the results of our analysis of KC input connectivity (Figures F, G in S1 Figs). For KC output connectivity, the bounded net weight was a better model than the fixed net weight, under the 1-norm for weight changes, for larval young KCs (Figure H in S1 Figs) and adult surface KCs (Figure I in S1 Figs).

Optimally flexible connectivity

Above, we examined the hypothesis that the distribution of connectivity degrees for Kenyon cells would match the flexibility of those cells under homeostatic or resource constraints on their total synaptic weight. We used a simple measure of flexibility: the size of the allowed synaptic weight configuration space (Fig 1b and 1c). We next considered a related but more restricted hypothesis: that KCs directly maximize their flexibility. For each constraint, we maximized the size of the allowed synaptic weight space to find the optimal degrees.

For the bounded net weight constraint (Eq 1), this consists of maximizing the volume under the simplex (Eq 2) and is equivalent to finding the mode of the zero-truncated Poisson distribution. We found an approximately linear relationship between the optimal degree and the maximum net connection strength:
$$K_V^* \approx \bar{J} - \frac{1}{2}. \qquad (14)$$

The derivation of this equation involves the harmonic numbers, which are defined for positive integers, so it applies only for K ≥ 1 (Optimal degrees: Bounded net weight). Under the fixed net weight constraint, we similarly found an approximately linear relationship between the optimal degree and the net connection strength (Optimal degrees: fixed net weight):
$$K_A^* \approx \bar{J} + 1. \qquad (15)$$

This equation applies only for K ≥ 2, for a similar reason as above (Optimal degrees: fixed net weight). By comparing $K_V^*$ and $K_A^*$, we see that the optimally flexible degrees under the fixed net weight constraint are higher than those for the bounded net weight constraint. Eqs (14) and (15) reveal that to leading order, the model comparisons of Figs 3–5 encapsulate linear fits of K as a function of $\bar{J}$ while accounting for the variability around that line predicted by each constraint.

In both larval and adult KCs, we observed approximately linear relationships between the total number of synapses and number of partners for each KC type (Fig 6). To quantify this linear relationship, we computed the Pearson correlation between the number of synapses and number of partners for the different KC types. In the larva, we found that less mature KCs better matched this linear relationship (Table 1: young > multi-claw > single-claw). The same was true for the adult KCs (Table 2: alpha prime > core > surface > posterior).

Fig 6. Relation between number of synaptic partners and synapse counts in D. melanogaster Kenyon cells.

(a) Inputs to Kenyon cells (KCs) of the first instar larva. (b) Outputs of the first instar KCs. (c) Inputs to adult KCs in the α lobe. (d) Outputs of adult KCs in the α lobe.

https://doi.org/10.1371/journal.pcbi.1008080.g006

Table 1. Correlation of number of synapses and number of partners in larval Kenyon cells.

https://doi.org/10.1371/journal.pcbi.1008080.t001

Table 2. Correlation of number of synapses and number of partners in adult alpha lobe Kenyon cells.

https://doi.org/10.1371/journal.pcbi.1008080.t002

The binomial model we examined above does not depend on or model synapse counts. If it were augmented with a wiring process where each connection independently sampled a number of synapses, its total synapse count and number of connections would also be linearly related. Each of these three models is thus consistent with the same qualitative result. The model comparison we performed above tests which best explains the data, accounting for the variability around the mode described by each model (Figs 3–5).

Finally, we generalized this optimization to allow the constraints on total synaptic weights to explicitly depend on the number of inputs:
$$\sum_{i=1}^{K} J_i \le \bar{J} K^p \quad \text{and} \quad \sum_{i=1}^{K} J_i = \bar{J} K^p \qquad (16)$$
for the bounded and fixed net weight constraints, respectively. This scaling of the summed synaptic weight corresponds to scaling the individual synaptic weights as $K^{p-1}$. If every synaptic weight has an order 1/K strength, the sum of the synaptic weights would be order 1 and p = 0. If every synaptic weight has an order 1 strength, the summed weight is order K and p = 1. If synaptic weights have balanced scaling [48], then the summed weight would have p = 1/2. Under this generalization of our constraint models, our Bayesian model comparisons still apply if we take the total synaptic weight to be proportional to the number of synapses: $\bar{J} K^p = \alpha S$ instead of Eq 10. That corresponds to the requirement that the scaling of synaptic weights with the number of inputs does not arise from scaling the number of synapses, but from other physiological mechanisms. This generalization still led to approximately linear relationships between the optimal degree and the total synaptic weight (Optimal degrees: Bounded net weight and Optimal degrees: fixed net weight). Writing $\bar{W} = \bar{J} (K^*)^p$ for the net synaptic weight at the optimum,
$$K_V^* \approx e^p\, \bar{W} - \frac{1}{2}, \qquad K_A^* \approx e^p\, \bar{W} + 1. \qquad (17)$$

As before, we see that the optimally flexible degree under the fixed net weight constraint, $K_A^*$, is greater than that under the bounded net weight constraint, $K_V^*$. In this generalization, we can make a similar assumption as before to relate the net synaptic weight to anatomical measures of connection strength. If we assume that $\bar{J} K^p = \alpha S$, so that the number of synapses absorbs the scaling with $K^p$, consistent with its origin reflecting the size of a neuron, the same analysis and results of Figs 3–5 follow. If we instead assumed that $\bar{J} = \alpha S$, so that the anatomically measured total synaptic weight were $\alpha S K^p$, a model comparison that also accounts for the unknown parameter p would be required.

Discussion

We hypothesized that under a particular constraint, the probability of a neuron having K synaptic partners is proportional to the size of the space of allowed circuit configurations with K partners. The general idea of considering the space of allowed configurations can be traced back to Elizabeth Gardner's pioneering work examining the associative memory capacity of a perceptron for random input patterns [49]. In the limit of infinitely many connections and input patterns, that model yields predictions for the distributions of synaptic weights [8–11]. Here, in contrast, we examined the hypothesis that the size of the space of allowed configurations—the flexibility of a neuron's connectivity under constraint—governs the distribution of the number of connections without defining a computational task. This motivated predictions for neural degree distributions, rather than synaptic weight distributions. We examined constraints on the total strength of connections to or from a neuron and found that overall, the degree distribution corresponding to flexible connectivity under a homeostatically fixed total connection strength gave the best explanation for mushroom body connectivity.

Flexible connectivity and circuit development

Computational flexibility should be desirable for an organism's fitness, allowing the organism to solve problems in a variety of environments. One mechanism of adaptability and flexibility is to build the nervous system out of computationally flexible units that may over time adapt to specific computational roles. Our results suggest that this type of strategy may be at play in the development of mushroom body connectivity in the first instar D. melanogaster larva. The log odds for larval young KCs vastly favor the constraint models over the binomial model (Fig 3b and 3f). The log odds' preference for the constraint models also decreases, approximately, with KC maturity (Figs 3 and 4). In the adult, the log odds favored flexible connectivity under constraints on the net synaptic weight over the binomial random wiring model most for the alpha prime KCs, and more for core than surface KCs (Fig 5b and 5f, Figures D, E in S1 Figs). These suggest that flexibility under constraints might also reflect a developmental or experience-dependent progression in the alpha lobe KCs, but it remains a better explanation for their connectivity than binomial wiring even in the more mature KCs of the adult. The less mature KCs in the larva and adult also showed more linear relationships between their number of synapses and number of synaptic partners (Tables 1 and 2), better matching the prediction of maximizing the space of allowed configurations under a constraint. Together, these results suggest that Kenyon cell connectivity is structured to be flexible early in development, allowing many possible connectivity configurations to support specialization as the organism matures.

Anatomical measures of connection strength

To test the hypothesis that neurons in the mushroom body are subject to a pressure towards flexible connectivity under constraints, we required measurements of the total input or output connection strength of these neurons. For this purpose, we used electron microscopic reconstructions of mushroom body circuitry [38, 39]. These published data contain anatomical measurements of connectivity: the number of synapses between neurons. The general types of constraint we considered (bounded or homeostatically fixed total connection strengths) might not operate directly on synapse counts. To account for this uncertainty, we assumed that synapse counts were proportional to the constrained total connection strength (Eq 10) [38]. Spatially detailed, biophysical neuron models could in principle be used to account for synapse locations and the passive and active membrane conductances transforming anatomical connectivity into physiological connection strengths in specific neurons. In hippocampal pyramidal cells, cerebellar Purkinje cells, and Drosophila visual neurons, dendritic structures can compensate for signal decay [50–53]. If this is also the case in mushroom body neurons, detailed spatial models to relate anatomical and physiological connection strengths might not provide additional insight. Alternatively, additional information about the processes governing homeostatic synaptic scaling or synaptic resource limits could motivate models of a different functional form than Eq 10 [54, 55].

Physiological constraints on neural circuits

We modeled constraints as requirements on synaptic weight vectors, consistent with point neuron models commonly used in studies of neural computation, rather than specifying the biophysical implementation of these constraints. Minimizing the amount of wire used to connect neural circuits can predict the spatial layout of neural systems (e.g., [56–60]) and dendritic arborizations [61, 62]. We examined setting the number of connections separately from the strengths of connections, consistent with the assumption that rewiring neural circuits is more costly than changing the strength of existing connections [63].

Neural activity faces metabolic constraints [64]. In early sensory systems, the combination of metabolic constraints with sensory encoding needs can explain the structure of neural activity [65–69]. In our model, both wiring and metabolic costs could be related to setting the parameter $\bar{J}$. We hope that, analogously to how metabolic costs and encoding performance combine in metabolically efficient coding, the idea of flexibility under constraints might be useful in determining how metabolic and wiring constraints interact with computational tasks to shape neural circuit structures.

Materials and methods

Measuring synaptic weight configuration spaces

First, consider measuring the available configurations of synaptic weights when they can vary continuously. Consider the total synaptic weight $\bar{J}$, divided into K segments (Fig 7). For one synaptic weight, the measure of the weight configurations is
$$V_1(\bar{J}) = \int_0^{\bar{J}} dJ_1 = \bar{J}. \qquad (18)$$

Fig 7. Synaptic weight configurations.

a) Two example configurations of K = 4 synaptic weights with fixed sum $\sum_i J_i = \bar{J}$. b) Two examples of K = 5 synaptic weights with sum bounded by $\bar{J}$.

https://doi.org/10.1371/journal.pcbi.1008080.g007

For two synaptic weights, the available configurations are measured by
$$V_2(\bar{J}) = \int_0^{\bar{J}} dJ_1 \int_0^{\bar{J} - J_1} dJ_2 = \frac{\bar{J}^2}{2}, \qquad (19)$$
and in general,
$$V_K(\bar{J}) = \int_0^{\bar{J}} dJ_1 \int_0^{\bar{J} - J_1} dJ_2 \cdots \int_0^{\bar{J} - \sum_{i=1}^{K-1} J_i} dJ_K = \frac{\bar{J}^K}{K!}, \qquad (20)$$
which is the volume under the simplex with vertex length $\bar{J}$.

Now consider synaptic weights that vary discretely by ΔJ, with $\bar{J} = M \Delta J$. How many ways can we assign M units of synaptic weight amongst K partners, with each partner receiving at least one unit and at most M units assigned in total? For K = 1, this is M. For K = 2, this is
$$\sum_{m=2}^{M} (m - 1) = \binom{M}{2}, \qquad (21)$$
and in general the number of combinations of M units of synaptic weight in K connections is given by the binomial coefficient
$$\binom{M}{K} = \frac{M!}{K! (M-K)!}. \qquad (22)$$

For large M, the binomial coefficient is
$$\binom{M}{K} = \frac{M^K}{K!} \left(1 + O\!\left(\frac{1}{M}\right)\right). \qquad (23)$$

Now if we measure synaptic weights relative to ΔJ, this replaces $\bar{J}$ with $M = \bar{J}/\Delta J$ in the simplex's volume. So for large M, the volume under the simplex approximates the number of allowed configurations.
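This convergence is easy to verify numerically; a small sketch:

```python
# Numeric check of Eq 23: for many small weight units, the discrete count
# C(M, K) approaches the continuum volume M^K / K!, so the simplex volume
# measures the number of allowed configurations.
from math import comb, factorial

for K in (2, 5, 10):
    for M in (10, 100, 1000, 10000):
        ratio = comb(M, K) / (M**K / factorial(K))
        print(f"K={K}, M={M}: C(M,K) / (M^K/K!) = {ratio:.4f}")
# The ratio approaches 1 as M grows with K fixed.
```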

Similarly, the surface area of the simplex (Eq 7) approximates the number of allowed configurations under the fixed net weight constraint (Eq 6) if we discretize the synaptic weights. It is measured by the K − 1 dimensional Hausdorff measure (hyper-surface area). We can compute the surface area by differentiating the volume (Eq 2) with respect to the inner radius of the simplex (the minimal distance from the origin to its surface) [70]. For the regular simplex with vertices at $\bar{J}\, \mathbf{e}_i$, where $\mathbf{e}_i$ is the ith standard basis vector, that inner radius is $r = \bar{J}/\sqrt{K}$. Differentiating the volume with respect to r thus yields the surface area
$$A(K, \bar{J}) = \frac{\partial V}{\partial r} = \frac{\sqrt{K}\, \bar{J}^{K-1}}{(K-1)!}. \qquad (24)$$

Note, however, that the inner radius and thus the surface area depend on the norm of the space of synaptic weight configurations (Distances in synaptic configuration space).
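This differentiation can be checked symbolically; a sketch using sympy, under the parametrization of the volume by the inner radius described above:

```python
# Symbolic check of Eq 24: write the volume as a function of the inner radius
# r = Jbar / sqrt(K), differentiate, and recover sqrt(K) * Jbar^(K-1) / (K-1)!.
import sympy as sp

K, r, Jbar = sp.symbols('K r Jbar', positive=True)
V = (r * sp.sqrt(K))**K / sp.gamma(K + 1)       # volume, with K! = Gamma(K+1)
A = sp.diff(V, r).subs(r, Jbar / sp.sqrt(K))    # dV/dr at r = Jbar / sqrt(K)
A_expected = sp.sqrt(K) * Jbar**(K - 1) / sp.gamma(K)
print(sp.simplify(A / A_expected))              # 1
```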

Model comparison

Under equal prior probabilities for two models X and Y, the posterior odds ratio between them is
$$\frac{P(X \mid \{K_i\})}{P(Y \mid \{K_i\})} = \frac{\int d\theta_X\, p(\theta_X) \prod_i p_X(K_i \mid \theta_X)}{\int d\theta_Y\, p(\theta_Y) \prod_i p_Y(K_i \mid \theta_Y)}, \qquad (25)$$
where i indexes data points and $\theta_X$ denotes model X's parameter (α for the constraint models, q for the binomial model). We consider the Laplace approximations for the posterior odds, obtained by writing $p_X = \exp \ln p_X$ and Taylor expanding the log likelihood $\ln p_X$ in α around its maximum likelihood value, $\hat{\alpha}$:
$$\ln p_X\left(\{K_i\} \mid \alpha\right) \approx \ln p_X\left(\{K_i\} \mid \hat{\alpha}\right) - \frac{(\alpha - \hat{\alpha})^2}{2\sigma^2}. \qquad (26)$$

Truncating at second order then yields a tractable Gaussian integral over the unknown parameter:
$$\int d\alpha\, p(\alpha)\, p_X\left(\{K_i\} \mid \alpha\right) \approx p_X\left(\{K_i\} \mid \hat{\alpha}\right) \int d\alpha\, p(\alpha) \exp\left(-\frac{(\alpha - \hat{\alpha})^2}{2\sigma^2}\right), \qquad (27)$$
where the integrals run over the allowed range for α and
$$\frac{1}{\sigma^2} = -\left.\frac{\partial^2 \ln p_X\left(\{K_i\} \mid \alpha\right)}{\partial \alpha^2}\right|_{\alpha = \hat{\alpha}}. \qquad (28)$$

Under a flat prior for non-negative α, the marginal likelihood is:
$$p_X\left(\{K_i\} \mid \hat{\alpha}\right) \sqrt{\frac{\pi \sigma^2}{2}} \left(1 + \operatorname{erf}\left(\frac{\hat{\alpha}}{\sqrt{2\sigma^2}}\right)\right), \qquad (29)$$
where $\operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}\, dt$. The simplex volume distribution is a truncated Poisson; we might reasonably use the Jeffreys prior for the Poisson distribution, $p(\alpha) \propto 1/\sqrt{\alpha}$. In that case, the marginal likelihood is
$$p_X\left(\{K_i\} \mid \hat{\alpha}\right) \frac{\pi \sqrt{\hat{\alpha}}}{2}\, e^{-\hat{\alpha}^2/(4\sigma^2)} \left[I_{-1/4}\left(\frac{\hat{\alpha}^2}{4\sigma^2}\right) + I_{1/4}\left(\frac{\hat{\alpha}^2}{4\sigma^2}\right)\right], \qquad (30)$$
where $I_{\nu}$ is the modified Bessel function of the first kind. We will drop the indices on $K_i$ and $S_i$ in most of the remaining sections, reintroducing them where necessary.

Model comparison: Bounded net weight model

Under a bounded net weight, the degree distribution is:
$$p_V(K \mid \alpha S) = \frac{1}{Z} \frac{(\alpha S)^K}{K!}. \qquad (31)$$

The normalization constant Z is
$$Z = \sum_{K=1}^{\infty} \frac{(\alpha S)^K}{K!} = e^{\alpha S} - 1, \qquad (32)$$
so the simplex volume distribution is a zero-truncated Poisson distribution. We will make a Laplace approximation for the simplex volume distribution around $\hat{\alpha}$, leading to the posterior odds Eq 29 (for a flat prior on non-negative α) or Eq 30 (for the Poisson Jeffreys prior). To calculate the Laplace approximation for the posterior odds we need $\hat{\alpha}$ and σ². The derivatives of $\ln p_V$ can be calculated directly (again dropping indices over measurements),
$$\frac{\partial \ln p_V}{\partial \alpha} = \frac{K}{\alpha} - \frac{S}{1 - e^{-\alpha S}}. \qquad (33)$$
So we have
$$\frac{1}{\sigma^2} = \sum_i \left[\frac{K_i}{\hat{\alpha}^2} - \frac{S_i^2\, e^{\hat{\alpha} S_i}}{\left(e^{\hat{\alpha} S_i} - 1\right)^2}\right], \qquad (34)$$
and the maximum likelihood solution for α satisfies
$$\sum_i \frac{K_i}{\hat{\alpha}} = \sum_i \frac{S_i}{1 - e^{-\hat{\alpha} S_i}}. \qquad (35)$$
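A sketch of this maximum likelihood computation, solving Eq 35 with a scalar root finder on hypothetical data (the form $S/(1 - e^{-\alpha S})$, via expm1, avoids overflow at large α):

```python
# Hedged sketch: solve the maximum likelihood condition (Eq 35) for alpha.
# K and S are hypothetical stand-ins for the measured degrees and synapse counts.
import numpy as np
from scipy.optimize import brentq

def score(alpha, K, S):
    # d/d(alpha) of the zero-truncated Poisson log likelihood (Eq 33), summed
    return np.sum(K / alpha - S / (-np.expm1(-alpha * S)))

K = np.array([3, 5, 2, 8, 4])
S = np.array([12, 20, 9, 30, 15])
alpha_hat = brentq(score, 1e-8, 10.0, args=(K, S))
print(alpha_hat)  # lies below sum(K)/sum(S), the untruncated Poisson estimate
```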

Model comparison: Fixed net weight model

Under the fixed net synaptic weight, our model is that the degree distribution is proportional to the surface area of the simplex:
$$p_A(K \mid \alpha S) = \frac{1}{Z} \frac{\sqrt{K}\, (\alpha S)^{K-1}}{(K-1)!}, \qquad (36)$$
where
$$Z = \sum_{K=1}^{\infty} \frac{\sqrt{K}\, (\alpha S)^{K-1}}{(K-1)!}. \qquad (37)$$

To calculate $\hat{\alpha}$ and σ² we need the derivatives of $\ln p_A$:
$$\frac{\partial \ln p_A}{\partial \alpha} = \frac{K - 1}{\alpha} - \frac{1}{Z} \frac{\partial Z}{\partial \alpha}, \qquad (38)$$
where
$$\frac{\partial Z}{\partial \alpha} = S \sum_{K=1}^{\infty} \frac{\sqrt{K+1}\, (\alpha S)^{K-1}}{(K-1)!}, \qquad (39)$$
and we use the identity
$$\sqrt{K+1} = \sqrt{\frac{K+1}{K}}\, \sqrt{K}. \qquad (40)$$

We next bound $\partial Z / \partial \alpha$:
$$\frac{\partial Z}{\partial \alpha} = S \sum_{K=1}^{\infty} \sqrt{\frac{K+1}{K}}\, \frac{\sqrt{K}\, (\alpha S)^{K-1}}{(K-1)!}. \qquad (41)$$

For K ≥ 1, $\sqrt{(K+1)/K}$ is bounded above by $\sqrt{2}$ and below by 1. So,
$$S Z \le \frac{\partial Z}{\partial \alpha} \le \sqrt{2}\, S Z. \qquad (42)$$

Inserting these into the critical point equation for $\hat{\alpha}$ provides the bounds:
$$\frac{\sum_i (K_i - 1)}{\sqrt{2}\, \sum_i S_i} \le \hat{\alpha} \le \frac{\sum_i (K_i - 1)}{\sum_i S_i}. \qquad (43)$$

We will also need the curvature of $\ln p_A$ w.r.t. α at $\hat{\alpha}$:
$$\frac{1}{\sigma^2} = \sum_i \left[\frac{K_i - 1}{\hat{\alpha}^2} + \left.\frac{\partial^2 \ln Z_i}{\partial \alpha^2}\right|_{\hat{\alpha}}\right]. \qquad (44)$$

Similarly to the first derivative,
$$\frac{\partial^2 \ln Z}{\partial \alpha^2} = \frac{1}{Z} \frac{\partial^2 Z}{\partial \alpha^2} - \left(\frac{1}{Z} \frac{\partial Z}{\partial \alpha}\right)^2, \qquad (45)$$
where
$$\frac{\partial^2 Z}{\partial \alpha^2} = S^2 \sum_{K=1}^{\infty} \frac{\sqrt{K+2}\, (\alpha S)^{K-1}}{(K-1)!}. \qquad (46)$$

We use the identity
$$\sqrt{K+2} = \sqrt{\frac{K+2}{K}}\, \sqrt{K}. \qquad (47)$$

The curvature of Z is
$$\frac{\partial^2 Z}{\partial \alpha^2} = S^2 \sum_{K=1}^{\infty} \sqrt{\frac{K+2}{K}}\, \frac{\sqrt{K}\, (\alpha S)^{K-1}}{(K-1)!}. \qquad (48)$$

The factor $\sqrt{(K+2)/K}$ is bounded above by $\sqrt{3}$ and below by 1, so
$$S^2 Z \le \frac{\partial^2 Z}{\partial \alpha^2} \le \sqrt{3}\, S^2 Z. \qquad (49)$$

Defining upper and lower bounds for $\partial^2 \ln Z / \partial \alpha^2$ using the upper and lower bounds of the first and second terms in Eq 45 yields:
$$-S^2 \le \frac{\partial^2 \ln Z}{\partial \alpha^2} \le \left(\sqrt{3} - 1\right) S^2. \qquad (50)$$

The lower bound for $\partial^2 \ln Z / \partial \alpha^2$ provides an upper bound for σ², while its upper bound provides a lower bound for σ²:
$$\left(\sum_i \left[\frac{K_i - 1}{\hat{\alpha}^2} + \left(\sqrt{3} - 1\right) S_i^2\right]\right)^{-1} \le \sigma^2 \le \left(\sum_i \left[\frac{K_i - 1}{\hat{\alpha}^2} - S_i^2\right]\right)^{-1}. \qquad (51)$$

The posterior odds for the simplex area are:
$$P\left(\{K_i\} \mid A\right) \approx \hat{p}_A \sqrt{\frac{\pi \sigma^2}{2}} \left(1 + \operatorname{erf}\left(\frac{\hat{\alpha}}{\sqrt{2\sigma^2}}\right)\right), \qquad (52)$$
where $\hat{p}_A = \prod_i p_A(K_i \mid \hat{\alpha} S_i)$. We use the upper and lower bounds for σ² (Eq 51), $\sigma_+^2$ and $\sigma_-^2$, to define upper and lower bounds, respectively, for this marginal likelihood:
$$\hat{p}_A \sqrt{\frac{\pi \sigma_-^2}{2}} \left(1 + \operatorname{erf}\left(\frac{\hat{\alpha}}{\sqrt{2\sigma_-^2}}\right)\right) \le P\left(\{K_i\} \mid A\right) \le \hat{p}_A \sqrt{\frac{\pi \sigma_+^2}{2}} \left(1 + \operatorname{erf}\left(\frac{\hat{\alpha}}{\sqrt{2\sigma_+^2}}\right)\right). \qquad (53)$$

We compute $\hat{\alpha}$ numerically by maximizing the likelihood, and compute Z numerically as well, estimating it by summing over K = 1 up to a large cutoff.
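The truncated computation of Z and the bounds in Eq 42 can be checked numerically; a sketch with illustrative values of α and S:

```python
# Numeric sketch for the fixed-net-weight model: compute Z (Eq 37) by truncation
# and verify the bounds S Z <= dZ/dalpha <= sqrt(2) S Z used in Eqs 42-43.
import numpy as np
from scipy.special import gammaln

def Z_and_dZ(alpha, S, K_max=2000):
    K = np.arange(1, K_max + 1)
    x = alpha * S
    terms = np.exp(0.5 * np.log(K) + (K - 1) * np.log(x) - gammaln(K))        # Eq 37
    dterms = S * np.exp(0.5 * np.log(K + 1) + (K - 1) * np.log(x) - gammaln(K))  # Eq 39
    return terms.sum(), dterms.sum()

alpha, S = 0.3, 40  # illustrative values
Z, dZ = Z_and_dZ(alpha, S)
print(S * Z <= dZ <= np.sqrt(2) * S * Z)  # True
```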

Bounds for the posterior odds of the fixed net weight model

The derivative of the posterior odds under the flat prior, Eq 29, with respect to σ is proportional to
$$1 - \sqrt{\frac{2}{\pi}}\, \frac{\hat{\alpha}}{\sigma}\, e^{-\hat{\alpha}^2/(2\sigma^2)} + \operatorname{erf}\left(\frac{\hat{\alpha}}{\sqrt{2}\, \sigma}\right). \qquad (54)$$

Since $\hat{\alpha} > 0$ and σ > 0, the last term is bounded between 0 and 1. The middle term is proportional to the form $x \exp(-x^2/2)$ with $x = \hat{\alpha}/\sigma$, which is maximized, with value $\sqrt{2/\pi}\, e^{-1/2}$, at x = 1. Since $\sqrt{2/\pi}\, e^{-1/2} < 1$, the middle term is less than 1 and the derivative of the posterior odds under a flat prior for α, with respect to σ, is non-negative. The upper bound for σ² thus provides an upper bound on the posterior odds. We see that the posterior likelihood increases from $\sigma_-$ to $\sigma_+$ (reflected in the log posterior odds ratio for the simplex volume vs the simplex area, Figures A, B, D-I in S1 Figs).

The derivative of the posterior odds under the Poisson Jeffreys prior, Eq 30, with respect to σ², is proportional to
$$-\frac{d}{dz}\left\{e^{-z}\left[I_{-1/4}(z) + I_{1/4}(z)\right]\right\}, \qquad z = \frac{\hat{\alpha}^2}{4\sigma^2}. \qquad (55)$$

We saw that the posterior odds for the simplex area distribution also increased with σ2 for the Poisson Jeffreys prior (S1 Figs).

Model comparison: Zero-truncated binomial model

The marginal likelihood for the zero-truncated binomial distribution $p_B$ is
$$P\left(\{K_i\} \mid B\right) = \int_0^1 dq\, p(q) \prod_i p_B(K_i \mid N, q), \qquad (56)$$
where $p_B(K_i \mid N, q)$ is given by Eq 11. For connections to larval KCs, we used the total number of traced projection neurons (PNs) and KCs as the binomial parameter N, averaged over the two sides of the brain [38]. For projections from larval KCs, we used the total number of KCs and output neurons, averaged over the two sides, as N. For projections to adult KCs, we used the number of Kenyon cells plus 150 (the estimated number of olfactory PNs) as N [71]. For projections from adult KCs, we used the number of KCs and output neurons labelled in the data as N.

The variance with respect to q is determined as in Eq (28). The derivatives of $\ln p_B$ are, again dropping indices on K,
$$\frac{\partial \ln p_B}{\partial q} = \frac{K}{q} - \frac{N - K}{1 - q} - \frac{N (1-q)^{N-1}}{1 - (1-q)^N}. \qquad (57)$$

The maximum likelihood parameter for the zero-truncated binomial, with M samples of K, each with N trials, obeys:
$$\frac{1}{M} \sum_i K_i = \frac{N \hat{q}}{1 - (1 - \hat{q})^N}, \qquad (58)$$
and the variance at $\hat{q}$ is given by
$$\frac{1}{\sigma^2} = \sum_i \left[\frac{K_i}{\hat{q}^2} + \frac{N - K_i}{(1 - \hat{q})^2}\right] - M \left[\frac{N (N-1) (1 - \hat{q})^{N-2}}{1 - (1 - \hat{q})^N} + \left(\frac{N (1 - \hat{q})^{N-1}}{1 - (1 - \hat{q})^N}\right)^2\right]. \qquad (59)$$
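A sketch of solving Eq 58 numerically for $\hat{q}$ on hypothetical data:

```python
# Hedged sketch: maximum likelihood q for the zero-truncated binomial (Eq 58),
# equating the model mean N q / (1 - (1-q)^N) to the sample mean degree.
# K is hypothetical; N would come from the anatomical counts described above.
import numpy as np
from scipy.optimize import brentq

def mean_gap(q, N, K_mean):
    return N * q / (1.0 - (1.0 - q) ** N) - K_mean

K = np.array([3, 5, 2, 8, 4])
N = 40
q_hat = brentq(mean_gap, 1e-9, 1.0 - 1e-9, args=(N, K.mean()))
print(q_hat)  # slightly below K.mean()/N once truncation is accounted for
```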

Optimal degrees: Bounded net weight

Now we examine what numbers of synaptic partners maximize the size of the allowed configuration space under the bounded net weight constraint. Here we generalize the constraint to allow the maximum total synaptic weight to explicitly depend on the number of inputs, K:
$$\sum_{i=1}^{K} J_i \le \bar{J} K^p, \qquad (60)$$
where we will typically take 0 ≤ p ≤ 1. This replaces the maximum weight $\bar{J}$ with $\bar{J} K^p$ in the volume:
$$V(K) = \frac{\left(\bar{J} K^p\right)^K}{K!}. \qquad (61)$$

The volume is non-decreasing in $\bar{J}$. We compute its derivative with respect to K by analytically continuing the factorial to real values of K as the Gamma function, yielding
$$\frac{\partial \ln V}{\partial K} = \ln \bar{J} + p \ln K + p + \gamma - H_K, \qquad (62)$$
where γ is the Euler–Mascheroni constant and $H_K$ is the Kth harmonic number,
$$H_K = \sum_{n=1}^{K} \frac{1}{n}. \qquad (63)$$

We used Euler's expansion for the harmonic numbers,
$$H_K = \ln K + \gamma + \frac{1}{2K} - \frac{1}{12 K^2} + O\!\left(\frac{1}{K^4}\right). \qquad (64)$$

At a critical point in K, truncating $O(1/K^2)$ and higher-order terms, we find
$$(1 - p) \ln K = \ln \bar{J} + p - \frac{1}{2K}. \qquad (65)$$

Exponentiating, and substituting the leading-order solution into the small correction term, yields
$$K_V^* \approx \left(e^p\, \bar{J}\right)^{\frac{1}{1-p}} - \frac{1}{2(1-p)}, \qquad (66)$$
which for p = 0 reduces to Eq 14.

Alternatively, the critical point can be calculated without first extending K to real numbers, by setting the ratio of successive volumes to 1:
$$\frac{V(K+1)}{V(K)} = \frac{\bar{J}\, (K+1)^p}{K+1} \left(\frac{K+1}{K}\right)^{p K} = 1, \qquad (67)$$
which yields
$$K_V^* \approx \left(e^p\, \bar{J}\right)^{\frac{1}{1-p}} - 1. \qquad (68)$$

Optimal degrees: Fixed net weight

If instead the net weight is fixed, $\sum_{i=1}^{K} J_i = \bar{J} K^p$, then we have the surface area of the K − 1 simplex. We consider a regular simplex (equal side lengths) with vertex length $\bar{J} K^p$ (from the origin to any vertex). Its surface area is
$$A(K) = \frac{\sqrt{K}\, \left(\bar{J} K^p\right)^{K-1}}{(K-1)!}. \qquad (69)$$

By the same method as above, the derivative with respect to K is
$$\frac{\partial \ln A}{\partial K} = \frac{1}{2K} + \ln \bar{J} + p \ln K + p\, \frac{K-1}{K} + \gamma - H_{K-1}. \qquad (70)$$

Since $H_{K-1}$ appears in the derivative, we only consider the derivative at K ≥ 2. At a critical point in K, expanding the harmonic number and truncating $O(1/K^2)$ and higher-order terms yields
$$(1 - p) \ln K = \ln \bar{J} + p + \frac{1 - p}{K}, \qquad (71)$$
and substituting the leading-order solution into the correction term yields
$$K_A^* \approx \left(e^p\, \bar{J}\right)^{\frac{1}{1-p}} + 1, \qquad (72)$$
which for p = 0 reduces to Eq 15.

If we do not continue to real K first, we instead have
$$\frac{A(K+1)}{A(K)} = \sqrt{\frac{K+1}{K}}\; \frac{\bar{J}\, K^p}{K} \left(\frac{K+1}{K}\right)^{p K} = 1, \qquad (73)$$
which yields
$$K_A^* \approx \left(e^p\, \bar{J}\right)^{\frac{1}{1-p}} + \frac{1}{2(1-p)}. \qquad (74)$$
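A numerical check of these optimal degrees at p = 0; a sketch:

```python
# Numeric check of the p = 0 optimal degrees: the integer K maximizing the
# simplex volume tracks Jbar - 1/2 (Eq 14), and the K maximizing the surface
# area tracks Jbar + 1 (Eq 15), up to integer rounding.
import numpy as np
from scipy.special import gammaln

K = np.arange(1, 500)
for Jbar in (4.5, 10.5, 20.5, 49.5):
    logV = K * np.log(Jbar) - gammaln(K + 1)                       # volume, Eq 2
    logA = 0.5 * np.log(K) + (K - 1) * np.log(Jbar) - gammaln(K)   # area, Eq 7
    print(Jbar, K[np.argmax(logV)], K[np.argmax(logA)])
# The discrete maxima lie within one unit of the continuum predictions.
```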

Distances in synaptic configuration space

Above we assumed that synaptic weight configurations could travel between different points in the synaptic weight space along straight lines, endowing the K-dimensional synaptic weight space with a Euclidean (or 2-) norm. This amounts to assuming that synaptic weights can vary together. This could be interpreted, for example, as allowing a unit of synaptic weight (a receptor, perhaps) to be transferred directly between connections. An alternative is to assume that synaptic weights must move separately, which corresponds to endowing the synaptic weight space with the 1-norm. In the above interpretation this would mean separating the removal of a receptor from one synapse from the addition of a receptor to another synapse. This changes the surface area of the simplex, since its inner radius is $\bar{J}/K$ rather than $\bar{J}/\sqrt{K}$:
$$A_1(K, \bar{J}) = \frac{K\, \bar{J}^{K-1}}{(K-1)!}. \qquad (75)$$

Changing the norm for the synaptic weights leaves the above calculation of the posterior odds for the fixed net weight model mostly unchanged. The factors of √K in the normalization constant are replaced by K; this removes the square roots in the derivation of the upper bound for the variance with respect to α, so that (76)

The optimal number of connections can be calculated in the same manner as previously. The derivative of A₁ with respect to K is (to order 1/K): (77)

At a critical point in K, truncating terms of order 1/K and higher yields (78)
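The corresponding search under the 1-norm, with the √K prefactor of the area replaced by K as noted above (our sketch; the value of the maximum weight is hypothetical):

import numpy as np
from scipy.special import gammaln

def log_area_l1(K, Wbar):
    # log of Eq 75: A_1 = K * Wbar**(K - 1) / (K - 1)!
    return np.log(K) + (K - 1) * np.log(Wbar) - gammaln(K)

Ks = np.arange(2, 500)
print(Ks[np.argmax(log_area_l1(Ks, Wbar=20.0))])  # nearly identical to the 2-norm optimum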

Supporting information

Acknowledgments

We thank Ramakrishnan Iyer, Casey Schneider-Mizell, and Saskia de Vries for helpful discussions. We wish to thank the Allen Institute founder, Paul G. Allen, for his vision, encouragement and support.

References

1. Albus JS. A theory of cerebellar function. Mathematical Biosciences. 1971;10(1):25–61.
2. Marr D. A theory of cerebellar cortex. The Journal of Physiology. 1969;202(2):437–470.
3. Hopfield JJ. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences. 1982;79(8):2554–2558.
4. Barak O, Rigotti M, Fusi S. The sparseness of mixed selectivity neurons controls the generalization-discrimination trade-off. The Journal of Neuroscience. 2013;33(9):3844–3856.
5. Rigotti M, Barak O, Warden MR, Wang XJ, Daw ND, Miller EK, et al. The importance of mixed selectivity in complex cognitive tasks. Nature. 2013;497(7451):585–590. pmid:23685452
6. Babadi B, Sompolinsky H. Sparseness and Expansion in Sensory Representations. Neuron. 2014;83(5):1213–1226.
7. Litwin-Kumar A, Harris KD, Axel R, Sompolinsky H, Abbott LF. Optimal Degrees of Synaptic Connectivity. Neuron. 2017;93(5):1153–1164.e7.
8. Brunel N, Hakim V, Isope P, Nadal JP, Barbour B. Optimal Information Storage and the Distribution of Synaptic Weights: Perceptron versus Purkinje Cell. Neuron. 2004;43(5):745–757.
9. Chapeton J, Fares T, LaSota D, Stepanyants A. Efficient associative memory storage in cortical circuits of inhibitory and excitatory neurons. Proceedings of the National Academy of Sciences. 2012;109(51):E3614–E3622.
10. Chapeton J, Gala R, Stepanyants A. Effects of homeostatic constraints on associative memory storage and synaptic connectivity of cortical circuits. Frontiers in Computational Neuroscience. 2015;9.
11. Brunel N. Is cortical connectivity optimized for storing information? Nature Neuroscience. 2016;19(5):749–755.
12. Pereira U, Brunel N. Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data. Neuron. 2018;99(1):227–238.e4.
13. Zhang D, Zhang C, Stepanyants A. Robust Associative Learning Is Sufficient to Explain the Structural and Dynamical Properties of Local Cortical Circuits. Journal of Neuroscience. 2019;39(35):6888–6904.
14. Kasai H, Matsuzaki M, Noguchi J, Yasumatsu N, Nakahara H. Structure-stability-function relationships of dendritic spines. Trends in Neurosciences. 2003;26(7):360–368.
15. Turrigiano GG, Leslie KR, Desai NS, Rutherford LC, Nelson SB. Activity-dependent scaling of quantal amplitude in neocortical neurons. Nature. 1998;391:892–896.
16. Desai NS, Cudmore RH, Nelson SB, Turrigiano GG. Critical periods for experience-dependent synaptic scaling in visual cortex. Nature Neuroscience. 2002;5(8):783–789.
17. Goel A, Jiang B, Xu LW, Song L, Kirkwood A, Lee HK. Cross-modal regulation of synaptic AMPA receptors in primary sensory cortices by visual experience. Nature Neuroscience. 2006;9(8):1001–1003.
18. Goold CP, Nicoll RA. Single-Cell Optogenetic Excitation Drives Homeostatic Synaptic Depression. Neuron. 2010;68(3):512–528.
19. Vale C, Sanes DH. The effect of bilateral deafness on excitatory and inhibitory synaptic strength in the inferior colliculus. European Journal of Neuroscience. 2002;16(12):2394–2404.
20. Hartman KN, Pal SK, Burrone J, Murthy VN. Activity-dependent regulation of inhibitory synaptic transmission in hippocampal neurons. Nature Neuroscience. 2006;9(5):642–649.
21. Joseph A, Turrigiano GG. All for One But Not One for All: Excitatory Synaptic Scaling and Intrinsic Excitability Are Coregulated by CaMKIV, Whereas Inhibitory Synaptic Scaling Is Under Independent Control. The Journal of Neuroscience. 2017;37(28):6778–6785.
22. Maffei A, Nelson SB, Turrigiano GG. Selective reconfiguration of layer 4 visual cortical circuitry by visual deprivation. Nature Neuroscience. 2004;7(12):1353–1359.
23. Kilman V, van Rossum MCW, Turrigiano GG. Activity Deprivation Reduces Miniature IPSC Amplitude by Decreasing the Number of Postsynaptic GABAA Receptors Clustered at Neocortical Synapses. Journal of Neuroscience. 2002;22(4):1328–1337.
24. Hengen KB, Lambo ME, Van Hooser SD, Katz DB, Turrigiano GG. Firing rate homeostasis in visual cortex of freely behaving rodents. Neuron. 2013;80(2):335–342.
25. Heisenberg M. Mushroom body memoir: from maps to models. Nature Reviews Neuroscience. 2003;4(4):266–275.
26. Owald D, Waddell S. Olfactory learning skews mushroom body output pathways to steer behavioral choice in Drosophila. Current Opinion in Neurobiology. 2015;35:178–184.
27. Murthy M, Fiete I, Laurent G. Testing odor response stereotypy in the Drosophila mushroom body. Neuron. 2008;59(6):1009–1023.
28. Caron SJC, Ruta V, Abbott LF, Axel R. Random convergence of olfactory inputs in the Drosophila mushroom body. Nature. 2013;497(7447):113–117.
29. Stevens CF. What the fly’s nose tells the fly’s brain. Proceedings of the National Academy of Sciences. 2015;112(30):9460–9465.
30. Srinivasan S, Greenspan RJ, Stevens CF, Grover D. Deep(er) Learning. Journal of Neuroscience. 2018;38(34):7365–7374.
31. Grueber WB, Jan LY, Jan YN. Tiling of the Drosophila epidermis by multidendritic sensory neurons. Development. 2002;129(12):2867–2878.
32. Keshishian H, Chiba A, Chang TN, Halfon MS, Harkins EW, Jarecki J, et al. Cellular mechanisms governing synaptic development in Drosophila melanogaster. Journal of Neurobiology. 1993;24(6):757–787. pmid:8251016
33. Gerhard S, Andrade I, Fetter RD, Cardona A, Schneider-Mizell CM. Conserved neural circuit structure across Drosophila larval development revealed by comparative connectomics. eLife. 2017;6:e29089.
34. Tripodi M, Evers JF, Mauss A, Bate M, Landgraf M. Structural Homeostasis: Compensatory Adjustments of Dendritic Arbor Geometry in Response to Variations of Synaptic Input. PLOS Biology. 2008;6(10):e260.
35. Kremer MC, Christiansen F, Leiss F, Paehler M, Knapek S, Andlauer TFM, et al. Structural Long-Term Changes at Mushroom Body Input Synapses. Current Biology. 2010;20(21):1938–1944. pmid:20951043
36. Pech U, Revelo NH, Seitz KJ, Rizzoli SO, Fiala A. Optical dissection of experience-dependent pre- and postsynaptic plasticity in the Drosophila brain. Cell Reports. 2015;10(12):2083–2095.
37. Doll CA, Vita DJ, Broadie K. Fragile X Mental Retardation Protein Requirements in Activity-Dependent Critical Period Neural Circuit Refinement. Current Biology. 2017;27(15):2318–2330.e3.
38. Eichler K, Li F, Litwin-Kumar A, Park Y, Andrade I, Schneider-Mizell CM, et al. The complete connectome of a learning and memory centre in an insect brain. Nature. 2017;548(7666):175–182. pmid:28796202
39. Takemura SY, Aso Y, Hige T, Wong A, Lu Z, Xu CS, et al. A connectome of a learning and memory center in the adult Drosophila brain. eLife. 2017;6:e26975.
40. Its Lebesgue or K-dimensional Hausdorff measure.
41. Harris KM, Stevens JK. Dendritic spines of CA1 pyramidal cells in the rat hippocampus: serial electron microscopy with reference to their biophysical characteristics. The Journal of Neuroscience. 1989;9(8):2982–2997.
42. Harris KM, Sultan P. Variation in the number, location and size of synaptic vesicles provides an anatomical basis for the nonuniform probability of release at hippocampal CA1 synapses. Neuropharmacology. 1995;34(11):1387–1395.
43. Schikorski T, Stevens CF. Quantitative Ultrastructural Analysis of Hippocampal Excitatory Synapses. Journal of Neuroscience. 1997;17(15):5858–5867.
44. Schikorski T, Stevens CF. Morphological correlates of functionally defined synaptic vesicle populations. Nature Neuroscience. 2001;4(4):391–395.
45. Bourne JN, Chirillo MA, Harris KM. Presynaptic ultrastructural plasticity along CA3→CA1 axons during long-term potentiation in mature hippocampus. The Journal of Comparative Neurology. 2013;521(17):3898–3912.
46. Zhu S, Chiang AS, Lee T. Development of the Drosophila mushroom bodies: elaboration, remodeling and spatial organization of dendrites in the calyx. Development. 2003;130(12):2603–2610.
47. Lin HH, Lai JSY, Chin AL, Chen YC, Chiang AS. A Map of Olfactory Representation in the Drosophila Mushroom Body. Cell. 2007;128(6):1205–1217.
48. van Vreeswijk C, Sompolinsky H. Chaos in neuronal networks with balanced excitatory and inhibitory activity. Science. 1996;274(5293):1724–1726.
49. Gardner E. The space of interactions in neural network models. Journal of Physics A: Mathematical and General. 1988;21(1):257.
50. Cuntz H, Borst A, Segev I. Optimization principles of dendritic structure. Theoretical Biology and Medical Modelling. 2007;4(1):21.
51. Jaffe DB, Carnevale NT. Passive Normalization of Synaptic Integration Influenced by Dendritic Architecture. Journal of Neurophysiology. 1999;82(6):3268–3285.
52. Chitwood RA, Hubbard A, Jaffe DB. Passive electrotonic properties of rat hippocampal CA3 interneurones. The Journal of Physiology. 1999;515(3):743–756.
53. Magee JC, Cook EP. Somatic EPSP amplitude is independent of synapse location in hippocampal pyramidal neurons. Nature Neuroscience. 2000;3(9):895–903.
54. Chowdhury D, Hell JW. Homeostatic synaptic scaling: molecular regulators of synaptic AMPA-type glutamate receptors. F1000Research. 2018;7.
55. Wen Q, Chklovskii DB. A Cost–Benefit Analysis of Neuronal Morphology. Journal of Neurophysiology. 2008;99(5):2320–2328.
56. Durbin R, Mitchison G. A dimension reduction framework for understanding cortical maps. Nature. 1990;343(6259):644–647.
57. Cherniak C. Component placement optimization in the brain. Journal of Neuroscience. 1994;14(4):2418–2427.
58. Cherniak C, Mokhtarzada Z, Rodriguez-Esteban R, Changizi K. Global optimization of cerebral cortex layout. Proceedings of the National Academy of Sciences. 2004;101(4):1081–1086.
59. Weigand M, Sartori F, Cuntz H. Universal transition from unstructured to structured neural maps. Proceedings of the National Academy of Sciences. 2017;114(20):E4057–E4064.
60. Bullmore E, Sporns O. The economy of brain network organization. Nature Reviews Neuroscience. 2012;13(5):336–349.
61. Cuntz H, Forstner F, Borst A, Häusser M. One Rule to Grow Them All: A General Theory of Neuronal Branching and Its Practical Application. PLOS Computational Biology. 2010;6(8):e1000877.
62. Cuntz H, Mathy A, Häusser M. A scaling law derived from optimal dendritic wiring. Proceedings of the National Academy of Sciences. 2012;109(27):11014–11018.
63. Chklovskii DB, Mel BW, Svoboda K. Cortical rewiring and information storage. Nature. 2004;431(7010):782–788.
64. Harris J, Jolivet R, Attwell D. Synaptic Energy Use and Supply. Neuron. 2012;75(5):762–777.
65. Laughlin S. A simple coding procedure enhances a neuron’s information capacity. Zeitschrift für Naturforschung C, Biosciences. 1981;36(9-10):910–912.
66. Atick JJ, Redlich AN. Towards a Theory of Early Visual Processing. Neural Computation. 1990;2(3):308–320.
67. Bastos AM, Usrey WM, Adams RA, Mangun GR, Fries P, Friston KJ. Canonical Microcircuits for Predictive Coding. Neuron. 2012;76(4):695–711.
68. Denève S, Machens CK. Efficient codes and balanced networks. Nature Neuroscience. 2016;19(3):375–382.
69. Chalk M, Marre O, Tkačik G. Toward a unified theory of efficient, predictive, and sparse coding. Proceedings of the National Academy of Sciences. 2017; p. 201711114.
70. Emert J, Nelson R. Volume and Surface Area for Polyhedra and Polytopes. Mathematics Magazine. 1997;70(5):365–371.
71. Jefferis GS, Marin EC, Stocker RF, Luo L. Target neuron prespecification in the olfactory map of Drosophila. Nature. 2001;414(6860):204–208.