Metadata
Subject: Technology & Computer Science
Education level: Graduate
Cognitive goals: Analyze
Difficulty estimate: Hard
Tags: optimization, generalization, implicit regularization, overparameterization, stochastic gradient, non-convex
Number of questions: 5
Created on: -
Generation source: -
License: CC0 (public domain)
Prompt:
Assess students' ability to analyze convergence behavior and generalization trade-offs in non-convex stochastic optimization for overparameterized deep networks, including: deriving or critiquing convergence rates of SGD and its variants; explaining implicit regularization mechanisms (e.g., gradient noise, early stopping, momentum, loss-landscape geometry); connecting NTK, mean-field, or SDE approximations to generalization phenomena (double descent, sharpness vs. flatness); and designing or interpreting experiments and proof sketches that illustrate optimization–generalization trade-offs.
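For reference, a minimal sketch of two textbook results the prompt alludes to; these equations are illustrative additions using standard symbols, not part of the quiz content itself. For an L-smooth objective f with stochastic gradients whose variance is bounded by \sigma^2, SGD with a tuned step size satisfies the classical non-convex stationarity bound

\[
\min_{0 \le t < T} \mathbb{E}\big[\|\nabla f(\theta_t)\|^2\big]
\;\le\; O\!\left( \frac{L\,\Delta}{T} + \sigma \sqrt{\frac{L\,\Delta}{T}} \right),
\qquad \Delta := f(\theta_0) - f^*,
\]

and small-learning-rate SGD is commonly modeled by the stochastic differential equation

\[
d\theta_t \;=\; -\nabla f(\theta_t)\,dt \;+\; \sqrt{\eta}\,\Sigma(\theta_t)^{1/2}\,dW_t,
\]

where \eta is the learning rate and \Sigma(\theta) the minibatch gradient-noise covariance; the \sqrt{\eta}-scaled diffusion term is one standard way to formalize the implicit regularization attributed to gradient noise.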