Recurrent Neural Networks (RNNs)

This interactive demo shows how Recurrent Neural Networks process sequential data step-by-step. RNNs have a "memory" that allows them to use information from previous timesteps when processing current inputs.

You'll explore how the hidden state evolves over time and see how different weights and activation functions affect the network's behavior on various sequence patterns.
RNN Architecture Components:

Hidden State: h_t - Memory that carries information from previous timesteps
Input Weight: W_input - How much the current input affects the hidden state
Hidden Weight: W_hidden - How much the previous hidden state influences the current one
Bias: Constant term added to the computation
Activation Function: Non-linear transformation (tanh, ReLU, sigmoid)

RNN Formula:
h_t = f(W_input · x_t + W_hidden · h_{t-1} + bias)

Where f is the activation function, x_t is the input at time t, and h_{t-1} is the previous hidden state.
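
As a concrete illustration, here is a minimal Python sketch of a single timestep for a one-unit RNN, matching the formula above. The function and parameter names (rnn_step, w_input, w_hidden) are illustrative assumptions, not code from this demo:

    import math

    def rnn_step(x_t, h_prev, w_input, w_hidden, bias, f=math.tanh):
        # h_t = f(W_input * x_t + W_hidden * h_{t-1} + bias)
        return f(w_input * x_t + w_hidden * h_prev + bias)

    # One timestep starting from an empty memory (h_0 = 0):
    h_1 = rnn_step(x_t=1.0, h_prev=0.0, w_input=0.5, w_hidden=0.8, bias=0.0)
    print(h_1)  # tanh(0.5) ≈ 0.462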
How to Use:

Select a sequence from the dropdown to see different patterns (Fibonacci, linear, etc.)
Adjust weights to see how they affect hidden state evolution
Try different activation functions to understand their impact on processing
Use "Step Forward" to manually process each timestep
Watch the visualization to see how information flows through the network

Sequence Types:
Fibonacci: Each number is the sum of the previous two
Linear: Simple counting sequence
Alternating: Pattern that switches between values
Exponential: Powers of 2 sequence
Sine Pattern: Smooth oscillating values
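
For reference, all five patterns are easy to generate in a few lines of Python. This is a hypothetical sketch of the kind of data the dropdown supplies, not the demo's actual source (make_sequence is an assumed name):

    import math

    def make_sequence(kind, n=8):
        if kind == "fibonacci":     # each number is the sum of the previous two
            seq = [1, 1]
            while len(seq) < n:
                seq.append(seq[-1] + seq[-2])
            return seq[:n]
        if kind == "linear":        # simple counting sequence
            return list(range(1, n + 1))
        if kind == "alternating":   # switches between two values
            return [1 if i % 2 == 0 else -1 for i in range(n)]
        if kind == "exponential":   # powers of 2
            return [2 ** i for i in range(n)]
        if kind == "sine":          # smooth oscillating values
            return [math.sin(i * math.pi / 4) for i in range(n)]
        raise ValueError(f"unknown sequence type: {kind}")

    print(make_sequence("fibonacci"))  # [1, 1, 2, 3, 5, 8, 13, 21]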
Understanding RNN Behavior:

Input Weight (W_input): Higher values make the current input more influential
Hidden Weight (W_hidden): Higher values give the previous hidden state (the memory) more influence
Activation Functions:
- Tanh: Outputs between -1 and 1; zero-centered, which suits centered data
- ReLU: Outputs are zero or positive and unbounded above, so hidden states can grow without limit (exploding activations and gradients)
- Sigmoid: Outputs between 0 and 1; saturates easily, which can cause vanishing gradients
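
A quick way to compare these output ranges is to run the same pre-activation values through all three functions; a small self-contained sketch:

    import math

    def relu(z):
        return max(0.0, z)

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    for z in (-2.0, 0.0, 2.0):
        print(f"z={z:+.1f}  tanh={math.tanh(z):+.3f}  "
              f"relu={relu(z):.3f}  sigmoid={sigmoid(z):.3f}")
    # tanh stays in (-1, 1), ReLU is unbounded above, sigmoid stays in (0, 1)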

Observing Patterns:
Memory Effect: See how previous inputs influence current outputs
Weight Impact: Notice how different weights change the sequence processing
Activation Choice: Different functions handle the same sequence differently
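
You can reproduce the memory and weight effects outside the demo by running one sequence through two networks that differ only in W_hidden. A minimal sketch, assuming the same scalar update as the formula above:

    import math

    def rnn_step(x_t, h_prev, w_input, w_hidden, bias=0.0, f=math.tanh):
        return f(w_input * x_t + w_hidden * h_prev + bias)

    sequence = [1, 1, 2, 3, 5]        # Fibonacci prefix
    for w_hidden in (0.1, 0.9):       # weak memory vs. strong memory
        h, states = 0.0, []
        for x in sequence:            # the manual "Step Forward" loop
            h = rnn_step(x, h, w_input=0.5, w_hidden=w_hidden)
            states.append(round(h, 3))
        print(f"W_hidden={w_hidden}: {states}")
    # The stronger feedback weight pushes tanh toward saturation sooner,
    # because each step carries more of the previous hidden state forward.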

[Interactive visualization: Input Sequence → RNN Cell → Output Sequence panels, with live hidden-state readouts]