loopstream

perpetual pass

Exploring continuously running neural networks.

Experiments

Neural Learning Loop

Live

A continuous feed-forward network that consumes data particles, processes them through hidden layers, and emits transformed generations in an endless learning cycle.

backpropagation · gradient flow · generation
5 layers / 72 neurons
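
A minimal sketch of the cycle this card describes: a small feed-forward network that keeps consuming fresh inputs, backpropagating, and emitting outputs in an endless loop. The layer sizes (three hidden layers of 24 neurons, 72 in total), the toy target, and the learning rate are illustrative assumptions, not the demo's actual configuration.

    import numpy as np

    rng = np.random.default_rng(0)
    sizes = [2, 24, 24, 24, 1]         # 5 layers / 72 hidden neurons
    Ws = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes, sizes[1:])]
    lr = 0.05

    def forward(x):
        acts = [x]
        for W in Ws:
            x = np.tanh(x @ W)
            acts.append(x)
        return acts

    for step in range(1_000):              # the live demo never stops
        x = rng.uniform(-1, 1, (8, 2))     # a batch of "data particles"
        y = np.sin(x.sum(axis=1, keepdims=True))    # toy target to learn
        acts = forward(x)
        grad = (acts[-1] - y) * (1 - acts[-1] ** 2)  # loss grad at pre-tanh
        for i in reversed(range(len(Ws))):           # backpropagate
            Ws[i] -= lr * acts[i].T @ grad / len(x)
            if i:
                grad = (grad @ Ws[i].T) * (1 - acts[i] ** 2)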

Transformer Architecture

Live

Visualizes the attention mechanism in transformer models, showing how tokens attend to each other through multiple heads and flow through the architecture.

attention · multi-head · embeddings
4 heads / 8 tokens
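
A toy version of what the visualization animates: one multi-head attention pass over 8 tokens with 4 heads. The random embeddings and projection weights are assumptions; attn[h, i, j] is the weight with which token i attends to token j in head h, the quantity drawn as edges between tokens.

    import numpy as np

    rng = np.random.default_rng(1)
    T, d_model, H = 8, 32, 4           # tokens, model width, heads
    d_head = d_model // H
    x = rng.normal(size=(T, d_model))  # token embeddings
    Wq, Wk, Wv = (rng.normal(0, d_model ** -0.5, (d_model, d_model))
                  for _ in range(3))

    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    # Project, then split into heads: shape (H, T, d_head).
    q, k, v = ((x @ W).reshape(T, H, d_head).transpose(1, 0, 2)
               for W in (Wq, Wk, Wv))
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))  # (H, T, T)
    out = (attn @ v).transpose(1, 0, 2).reshape(T, d_model)     # concat heads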

Perpetual Descent

Beta

A network designed never to converge: continuous drift, random walks, and momentum reversals keep the weights perpetually exploring the loss landscape.

non-convergent · exploration · momentum
oscillating weights
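
One way to implement the never-converging behavior described above, sketched on a simple quadratic bowl. Every coefficient here (drift, noise scale, reversal probability) is invented for illustration; the actual demo's dynamics may differ.

    import numpy as np

    rng = np.random.default_rng(2)
    w = rng.normal(size=8)              # the wandering weights
    v = np.zeros_like(w)                # momentum buffer

    def grad(w):                        # gradient of the bowl ||w||^2 / 2
        return w

    for step in range(10_000):
        v = 0.9 * v - 0.1 * grad(w)     # ordinary momentum step
        v += rng.normal(0, 0.02, w.shape)   # random-walk noise
        w += v + 0.001                  # update plus a constant drift
        if rng.random() < 0.01:         # occasional momentum reversal
            v = -v
    # The noise floor, drift, and reversals keep w circling the minimum
    # instead of settling into it.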

Hopfield Network

Live

Associative memory through energy minimization. Stored patterns act as attractors, and recurrent dynamics recall them from noisy inputs.

attractor · energy · hebbian
16 neurons / 3 patterns
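
The whole card in a dozen lines, at the same 16-neuron / 3-pattern scale: Hebbian storage as a sum of outer products, then asynchronous recall from a corrupted cue. The random patterns and the 4-bit noise level are assumptions.

    import numpy as np

    rng = np.random.default_rng(3)
    N, P = 16, 3
    patterns = rng.choice([-1, 1], size=(P, N))

    # Hebbian storage: sum of outer products, diagonal zeroed so
    # neurons do not self-excite.
    W = (patterns.T @ patterns) / N
    np.fill_diagonal(W, 0)

    state = patterns[0].copy()
    flip = rng.choice(N, size=4, replace=False)
    state[flip] *= -1                   # corrupt 4 of the 16 bits

    for _ in range(10):                 # asynchronous recurrent updates
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print(np.array_equal(state, patterns[0]))   # typically True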

LSTM Cell

Live

Long Short-Term Memory with gated architecture. Forget, input, and output gates control information flow through the cell state memory highway.

gates · memory · sequences
4 cells / 3 gates each
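
A single cell stepped through a short sequence, with the three gates written out explicitly. The sizes, random weights, and input sequence are all illustrative; biases are omitted for brevity.

    import numpy as np

    rng = np.random.default_rng(4)
    d_in, d_hid = 3, 4                  # input width, hidden/cell width

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    # One weight matrix per gate plus the candidate, acting on [x, h].
    Wf, Wi, Wo, Wc = (rng.normal(0, 0.3, (d_in + d_hid, d_hid))
                      for _ in range(4))

    h = np.zeros(d_hid)
    c = np.zeros(d_hid)                 # the cell-state memory highway

    for x in rng.normal(size=(6, d_in)):    # a short input sequence
        z = np.concatenate([x, h])
        f = sigmoid(z @ Wf)             # forget gate: what to erase from c
        i = sigmoid(z @ Wi)             # input gate: what to write to c
        o = sigmoid(z @ Wo)             # output gate: what to expose as h
        c = f * c + i * np.tanh(z @ Wc) # update the cell state
        h = o * np.tanh(c)              # gated output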

State-Space Model

Live

Linear-time sequence modeling through selective state spaces. Input-dependent parameters enable context-aware memory, achieving O(N) complexity vs O(N^2) for attention.

mamba · selective · linear
6 tokens / 16-dim state
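
A sketch of the linear-time recurrence at the card's 6-token / 16-dim scale: one state update per token, with an input-dependent step size providing the selectivity. This is a heavy simplification of selective SSMs like Mamba, and every parameter below is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(5)
    T, d_state = 6, 16                  # 6 tokens, 16-dim state
    x = rng.normal(size=T)              # one scalar channel of token features
    A = -np.exp(rng.normal(size=d_state))   # stable (negative) diagonal dynamics
    wB = rng.normal(size=d_state)       # input projection
    wC = rng.normal(size=d_state)       # readout projection

    h = np.zeros(d_state)
    ys = []
    for t in range(T):                  # one update per token: O(N) overall
        dt = np.log1p(np.exp(0.5 * x[t] + 1.0))  # input-dependent step (softplus)
        h = np.exp(A * dt) * h + dt * wB * x[t]  # discretize, then update state
        ys.append(wC @ h)               # context-aware output for token t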

Graph Neural Network

Live

Message passing on graph-structured data. Nodes aggregate neighbor information through learned transformations, with optional attention weighting (GAT).

message-passing · aggregation · attention
12 nodes / 3 layers
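
One message-passing stack at the card's scale: 12 nodes, 3 rounds of mean aggregation followed by a learned transform. The ring-with-chords graph and the shared weight matrix are assumptions; GAT-style attention would replace the uniform neighbor mean with learned per-edge weights.

    import numpy as np

    rng = np.random.default_rng(6)
    N, d = 12, 8
    X = rng.normal(size=(N, d))         # node features

    # Adjacency: a ring with two extra chords, self-loops included.
    A = np.eye(N)
    for i in range(N):
        A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1
    A[0, 6] = A[6, 0] = A[3, 9] = A[9, 3] = 1

    A_norm = A / A.sum(axis=1, keepdims=True)   # row-normalize: mean aggregation
    W = rng.normal(0, d ** -0.5, (d, d))        # one weight matrix, shared

    for _ in range(3):                  # 3 message-passing layers
        X = np.maximum(0, A_norm @ X @ W)   # aggregate, transform, ReLU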