VINNIKOV
NETWORK
music
Draw a graph. Hear what emerges.

Interactive web app that converts combinatorial threshold-linear network dynamics into real-time audio and visualisations. BSc Data Science project.

60fps real-time · 4 viz modes · RK4 integrator · MIDI export
Why this exists

Making the abstract tangible

Combinatorial threshold-linear networks (CTLNs) are a mathematical model of how network topology shapes neural dynamics. The theory is rich but abstract — the relationship between a graph's structure and its emergent behaviour is difficult to intuit from equations alone.

A 4th-order Runge-Kutta integrator solves the continuous-time ODEs in real time at 60fps. The graph topology alone determines whether the system oscillates, settles, or descends into chaos.
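The integration scheme mentioned above can be sketched as a classical RK4 step. This is an illustrative sketch, not the project's actual code; the names `Deriv` and `rk4Step` are assumptions.

```typescript
// One classical 4th-order Runge-Kutta step for dx/dt = f(x).
// Illustrative sketch only; function and type names are hypothetical.
type Deriv = (x: number[]) => number[];

function rk4Step(f: Deriv, x: number[], dt: number): number[] {
  // Helper: a + s * b, elementwise.
  const add = (a: number[], b: number[], s: number) =>
    a.map((v, i) => v + s * b[i]);
  const k1 = f(x);
  const k2 = f(add(x, k1, dt / 2));
  const k3 = f(add(x, k2, dt / 2));
  const k4 = f(add(x, k3, dt));
  // Weighted average of the four slope estimates.
  return x.map(
    (v, i) => v + (dt / 6) * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
  );
}
```

At 60fps the app would call a step like this once per animation frame with dt tied to the frame interval.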

The Playground

Build a network, simulate dynamics, hear the result

Double-click to add nodes. Shift-drag to connect. Delete to remove. Adjust parameters. Press play.

Default parameters: ε = 0.25 (max 0.332 given δ = 0.50), δ = 0.50, θ = 1.00.

The Theory

The CTLN equation

    dxi/dt = −xi + [ Σj Wij xj + θ ]+

dxi/dt: rate of change of neuron i's firing rate
−xi: decay term; without input, activity fades to zero
Σj Wij xj: weighted sum of input from all neighbours
θ: external threshold / drive
[ · ]+: negative values clipped to zero (ReLU)
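The right-hand side of the equation above can be written directly as a derivative function. This is a minimal sketch consistent with the stated equation; the name `ctlnDeriv` and its signature are assumptions, not the app's API.

```typescript
// dx/dt = -x + [ W x + θ ]+  — the CTLN right-hand side.
// Sketch only; names are illustrative.
function ctlnDeriv(W: number[][], theta: number, x: number[]): number[] {
  return x.map((xi, i) => {
    // Weighted sum of input from all neighbours, plus external drive θ.
    let input = theta;
    for (let j = 0; j < x.length; j++) input += W[i][j] * x[j];
    // ReLU: negative net input is clipped to zero.
    return -xi + Math.max(0, input);
  });
}
```

Feeding this function to an RK4 step yields the firing-rate trajectories the app sonifies.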

Parameters & weight matrix

The weight matrix W is built entirely from the directed graph G and two parameters:

Wii = 0: no self-connections
Wij = −1 + ε: if edge j→i exists (weaker inhibition)
Wij = −1 − δ: if no edge j→i (stronger inhibition)

All weights are inhibitory (negative). Edges make inhibition weaker, not excitatory — connected neurons compete less aggressively than unconnected ones.

ε (epsilon): controls edge weight. Must satisfy ε < δ/(δ+1) so non-edges always inhibit more strongly than edges.

δ (delta): controls the non-edge penalty. Higher δ means unconnected neurons suppress each other more.

θ (theta): external drive / excitability. Scales with network size (θ = N for N neurons). Higher values increase overall activity.

1 The graph is everything

The weight matrix W is fully determined by the directed graph and two parameters: ε for edges (weak inhibition) and δ for non-edges (strong inhibition). Change the graph, change the music.

Morrison & Curto, 2018

2 Cycles create rhythm

An oriented graph with no sinks has no stable fixed points — the network is forced to oscillate. Neurons take turns firing in sequences determined by the graph’s cycle structure.

Morrison et al., 2016

3 Cliques create silence

Bidirectional edges form cliques, and target-free cliques create stable fixed points — the network settles. Avoid them for continuous dynamics, or use them as musical rests.

Curto et al., 2018

4 Sinks kill dynamics

A sink is a node with no outgoing edges. If a graph contains a sink, the network converges to a stable fixed point where only the sink and its neighbours fire. Every node needs at least one outgoing edge for sustained oscillation.

Curto et al., 2018
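The sink condition in point 4 is easy to check structurally before simulating. A sketch, using the same hypothetical `[source, target]` edge encoding as above; the name `findSinks` is an assumption:

```typescript
// Return every node with no outgoing edges (a sink).
// If this list is non-empty, the dynamics will settle to a fixed point.
// Illustrative sketch; names are hypothetical.
function findSinks(n: number, edges: Array<[number, number]>): number[] {
  const outDegree = new Array(n).fill(0);
  for (const [source] of edges) outDegree[source]++;
  return outDegree
    .map((d, node) => (d === 0 ? node : -1))
    .filter((node) => node >= 0);
}
```

A UI could run this check on every graph edit and warn that sustained oscillation requires the returned list to be empty.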

References

Morrison, K. & Curto, C. (2018). Predicting neural network dynamics via graphical analysis. Algebraic and Combinatorial Computational Biology.

Morrison, K., Degeratu, A., Itskov, V. & Curto, C. (2016). Diversity of emergent dynamics in competitive threshold-linear networks.

Curto, C., Geneson, J. & Morrison, K. (2018). Fixed points of competitive threshold-linear networks.

nebneuron/CTLN-bookchapter — reference implementation.

Built as part of a BSc in Data Science, York St John University.

Tech Stack

React · TypeScript · Tailwind CSS · HTML5 Canvas · Web Audio API · Vite · Vitest · MIDI