Compare optimizer trajectories on loss landscapes, experiment with learning rate schedules — cosine, step, exponential — and measure batch size effects on convergence. This simulation runs entirely in your browser — no installation, no account required, no data uploaded.
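The three schedules named above follow standard decay formulas. A minimal sketch in Python, with base learning rate and decay hyperparameters chosen for illustration (the lab's actual defaults are not stated here):

```python
import math

def cosine_lr(step, total_steps, base_lr=0.1, min_lr=0.0):
    # Cosine annealing: decays smoothly from base_lr to min_lr over total_steps.
    progress = step / total_steps
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))

def step_lr(step, base_lr=0.1, drop_every=30, gamma=0.1):
    # Step decay: multiply the rate by gamma every drop_every steps.
    return base_lr * (gamma ** (step // drop_every))

def exponential_lr(step, base_lr=0.1, gamma=0.96):
    # Exponential decay: multiply the rate by gamma at every step.
    return base_lr * (gamma ** step)

# Sample each schedule across 100 steps to compare their decay curves.
schedules = {
    "cosine": [cosine_lr(s, 100) for s in range(101)],
    "step": [step_lr(s) for s in range(101)],
    "exponential": [exponential_lr(s) for s in range(101)],
}
```

Cosine decays gradually and flattens near the end, step decay drops the rate in discrete jumps, and exponential decay shrinks it by a constant factor each step, which is what makes their optimizer trajectories diverge visibly on the same loss landscape.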
Part of the Deep Learning Labs track — 8 labs covering the full curriculum.