See how a single recurrent cell unfolds into a chain of computations over a sequence of characters, computing actual hidden states with randomly initialized weights.
Key idea: A recurrent neural network processes sequences by applying the same
cell at every time step: \(h_t = \tanh(W_h \cdot h_{t-1} + W_x \cdot x_t + b)\).
The folded view shows the cell with a self-loop; the unrolled view
expands it across time, revealing how information propagates through the hidden state.
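The unrolled view above can be sketched directly in code: the same cell (same `W_h`, `W_x`, `b`) is applied at every step, and the hidden state carries information forward. This is a minimal NumPy sketch; the vocabulary, hidden size, and weight scale are illustrative assumptions, not the visualization's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = "hello"                      # example character sequence (assumed)
chars = sorted(set(vocab))
idx = {c: i for i, c in enumerate(chars)}
V, H = len(chars), 8                 # vocab size, hidden size (assumed)

W_h = rng.standard_normal((H, H)) * 0.1   # hidden-to-hidden weights
W_x = rng.standard_normal((H, V)) * 0.1   # input-to-hidden weights
b = np.zeros(H)                           # bias

def one_hot(c):
    v = np.zeros(V)
    v[idx[c]] = 1.0
    return v

h = np.zeros(H)                      # initial hidden state h_0
states = []
for c in vocab:                      # unrolled view: same cell at each time step
    # h_t = tanh(W_h . h_{t-1} + W_x . x_t + b)
    h = np.tanh(W_h @ h + W_x @ one_hot(c) + b)
    states.append(h)

print(len(states), states[-1].shape)  # one hidden state per character
```

Note that only `h` changes from step to step; the weights are shared across time, which is exactly what the folded view's self-loop expresses.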