Recurrent Neural Networks (RNNs)
This interactive demo shows how Recurrent Neural Networks process sequential data step-by-step. RNNs have a "memory" that allows them to use information from previous timesteps when processing current inputs.
You'll explore how the hidden state evolves over time and see how different weights and activation functions affect the network's behavior on various sequence patterns.
RNN Architecture Components:
• Hidden State: Memory that carries information from previous timesteps
• Input Weight: How much the current input affects the hidden state
• Hidden Weight: How much the previous hidden state influences the current one
• Bias: Constant term added to the computation
• Activation Function: Non-linear transformation (tanh, ReLU, sigmoid)
RNN Formula:
h_t = f(W_x · x_t + W_h · h_{t-1} + b)
Where f is the activation function, x_t is the input at time t, h_{t-1} is the previous hidden state, W_x and W_h are the input and hidden weights, and b is the bias.
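A minimal sketch of this update in Python (the function and parameter names here are illustrative, not part of the demo): the demo's scalar weights map directly onto the formula, with the hidden state carried from one call to the next.

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b, activation=math.tanh):
    """One RNN timestep: h_t = f(W_x * x_t + W_h * h_{t-1} + b).

    Scalar version mirroring the demo's single input weight, hidden
    weight, and bias; real RNN layers use weight matrices and vectors.
    """
    return activation(w_x * x_t + w_h * h_prev + b)

# Example: process the input 1.0 starting from an empty memory (h = 0).
h1 = rnn_step(x_t=1.0, h_prev=0.0, w_x=0.5, w_h=0.8, b=0.1)
print(round(h1, 3))  # tanh(0.5*1.0 + 0.8*0.0 + 0.1) = tanh(0.6) ≈ 0.537
```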
How to Use:
• Select a sequence from the dropdown to see different patterns (Fibonacci, linear, etc.)
• Adjust weights to see how they affect hidden state evolution
• Try different activation functions to understand their impact on processing
• Use "Step Forward" to manually process each timestep
• Watch the visualization to see how information flows through the network
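For reference, here is a small Python sketch (names and default weights assumed, not taken from the demo) of roughly what "Step Forward" does: each click applies the update formula once and appends the new hidden state to the history shown in the visualization.

```python
import math

def run_rnn(sequence, w_x=0.5, w_h=0.8, b=0.0, activation=math.tanh):
    """Process a sequence one timestep at a time, like repeatedly clicking
    "Step Forward", recording the hidden state after each step."""
    h = 0.0            # initial hidden state: empty memory
    history = []
    for x_t in sequence:
        h = activation(w_x * x_t + w_h * h + b)
        history.append(h)
    return history

# Example: a simple counting (linear) sequence.
for t, h in enumerate(run_rnn([1, 2, 3, 4, 5])):
    print(f"t={t}  h_t={h:+.3f}")
```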
Sequence Types:
• Fibonacci: Each number is the sum of the previous two
• Linear: Simple counting sequence
• Alternating: Pattern that switches between values
• Exponential: Powers of 2 sequence
• Sine Pattern: Smooth oscillating values
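If you want to experiment outside the demo, the patterns above can be generated with a few lines of Python. The exact starting values and lengths used by the demo may differ, so treat these as approximations.

```python
import math

def make_sequence(kind, length=8):
    """Generate example sequence patterns (assumed forms)."""
    if kind == "fibonacci":       # each term is the sum of the previous two
        seq = [1, 1]
        while len(seq) < length:
            seq.append(seq[-1] + seq[-2])
        return seq[:length]
    if kind == "linear":          # simple counting: 1, 2, 3, ...
        return list(range(1, length + 1))
    if kind == "alternating":     # switches between two values
        return [1 if t % 2 == 0 else -1 for t in range(length)]
    if kind == "exponential":     # powers of 2: 1, 2, 4, 8, ...
        return [2 ** t for t in range(length)]
    if kind == "sine":            # smooth oscillation
        return [math.sin(t * math.pi / 4) for t in range(length)]
    raise ValueError(f"unknown sequence type: {kind}")

print(make_sequence("fibonacci"))  # [1, 1, 2, 3, 5, 8, 13, 21]
```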
Understanding RNN Behavior:
• Input Weight (W_x): Higher values make the current input more influential
• Hidden Weight (W_h): Higher values give more importance to memory (the previous hidden state)
• Activation Functions:
- Tanh: Outputs between -1 and 1, good for centered data
- ReLU: Outputs zero or positive values; because it is unbounded, hidden states and gradients can grow without limit (explode)
- Sigmoid: Outputs between 0 and 1, can suffer from vanishing gradients
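The three activation functions can be compared side by side on the same pre-activation value. This short Python snippet (standalone, not part of the demo) shows how differently they squash the same input:

```python
import math

def tanh(z):    return math.tanh(z)                 # output in (-1, 1)
def relu(z):    return max(0.0, z)                  # output in [0, inf), unbounded
def sigmoid(z): return 1.0 / (1.0 + math.exp(-z))   # output in (0, 1)

# Same pre-activation value, three different squashing behaviours.
for z in (-2.0, 0.0, 2.0):
    print(f"z={z:+.1f}  tanh={tanh(z):+.3f}  "
          f"relu={relu(z):.3f}  sigmoid={sigmoid(z):.3f}")
```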
Observing Patterns:
• Memory Effect: See how previous inputs influence current outputs
• Weight Impact: Notice how different weights change the sequence processing
• Activation Choice: Different functions handle the same sequence differently
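One way to see the memory effect and weight impact concretely is to feed the same short sequence through the update rule with two different hidden weights. The sketch below (illustrative values, not the demo's defaults) sends a single input "pulse" followed by zeros: with a small W_h the pulse fades from the hidden state almost immediately, while with a large W_h it lingers for many timesteps.

```python
import math

def run_rnn(sequence, w_x, w_h, b=0.0):
    """Tanh RNN over a sequence, returning the hidden state at each step."""
    h, history = 0.0, []
    for x_t in sequence:
        h = math.tanh(w_x * x_t + w_h * h + b)
        history.append(h)
    return history

seq = [1, 0, 0, 0, 0]   # a single "pulse" followed by silence

# Small hidden weight: the pulse is forgotten quickly.
# Large hidden weight: the pulse persists in memory across timesteps.
for w_h in (0.1, 0.9):
    states = run_rnn(seq, w_x=1.0, w_h=w_h)
    print(f"w_h={w_h}: " + " ".join(f"{h:+.3f}" for h in states))
```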