Metadata
Subject: Technology & Computer Science
Education level: Graduate
Cognitive goals: Remember
Difficulty estimate: Medium
Tags: optimization, SGD, momentum, Adagrad, RMSprop, Adam
Number of questions: 5
Created on: -
Generation source: Fully autonomous and synthetic; generated by GENO 0.1A using GPT-5-mini
License: CC0 (Public Domain)
Prompt: Assess graduate-level recall of the mathematical update rules and primary hyperparameters for SGD, SGD with momentum, Adagrad, RMSprop, and Adam, including notation for parameters, gradients, and velocity/accumulator terms; typical default values (e.g., learning rate, momentum, epsilon, decay/beta1/beta2); and the qualitative effect of tuning each hyperparameter in common deep-learning practice.
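
For reviewers, a compact reference for the update rules the prompt targets may help. The sketch below uses standard notation (\theta_t for parameters, g_t for the gradient at step t, \eta for the learning rate); the quoted defaults are common conventions from the literature and framework implementations, not values specified on this page, and momentum is shown in one common "velocity" form.

% SGD (typical \eta \approx 0.01):
\[ \theta_{t+1} = \theta_t - \eta\, g_t \]

% SGD with momentum, velocity form (typical \mu = 0.9):
\[ v_{t+1} = \mu\, v_t - \eta\, g_t, \qquad \theta_{t+1} = \theta_t + v_{t+1} \]

% Adagrad (per-coordinate squared-gradient accumulator; typical \epsilon \approx 10^{-8}):
\[ G_{t+1} = G_t + g_t^2, \qquad \theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{G_{t+1}} + \epsilon}\, g_t \]

% RMSprop (typical decay \rho = 0.9, \eta \approx 0.001):
\[ s_{t+1} = \rho\, s_t + (1-\rho)\, g_t^2, \qquad \theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{s_{t+1}} + \epsilon}\, g_t \]

% Adam with bias correction (typical \beta_1 = 0.9, \beta_2 = 0.999, \epsilon = 10^{-8}, \eta = 0.001):
\[ m_{t+1} = \beta_1 m_t + (1-\beta_1)\, g_t, \qquad u_{t+1} = \beta_2 u_t + (1-\beta_2)\, g_t^2 \]
\[ \hat{m} = \frac{m_{t+1}}{1-\beta_1^{\,t+1}}, \qquad \hat{u} = \frac{u_{t+1}}{1-\beta_2^{\,t+1}}, \qquad \theta_{t+1} = \theta_t - \eta\, \frac{\hat{m}}{\sqrt{\hat{u}} + \epsilon} \]

Qualitatively: a larger \eta speeds early progress but risks divergence; higher momentum \mu smooths noisy gradients at the cost of overshoot; \rho and \beta_2 set how quickly the second-moment estimate adapts; and \epsilon guards against division by near-zero accumulators.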
Statistics
Remixes: 100
Shares: 100
Downloads: 100
Attempts: 100
Average score: 100%
Mock data used for demo purposes.