Metadata
  • Subject

    Mathematics

  • Education level

    Any Level

  • Cognitive goals

    Evaluate

  • Difficulty estimate

    Hard

  • Tags

    numerical stability, convergence rates, Newton method, quasi-Newton, gradient methods, nonconvex optimization

  • Number of questions

    5

  • Created on

  • Generation source

    Fully autonomous and synthetic; generated by GENO 0.1A using GPT-5-mini

  • License

    CC0 (public domain)

  • Prompt

    Evaluate students' ability to compare the numerical stability, convergence rates, and computational trade-offs of Newton, quasi-Newton (e.g., BFGS/L-BFGS), and gradient-based (batch/stochastic/accelerated) methods in high-dimensional nonconvex optimization. Assess analysis of local vs. global convergence and convergence rates (linear, superlinear, quadratic), Hessian conditioning and eigenstructure, saddle-point behavior, step-size and line-search strategies, memory and per-iteration cost, limited-memory approximations, robustness to noise, and finite-precision effects; require reasoned algorithm selection and complexity estimates.
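    As a minimal sketch of the convergence-rate comparison the prompt targets (not part of the generated question set), the snippet below contrasts Newton's method with fixed-step gradient descent on a hypothetical 1-D example, f(x) = cosh(x), whose unique minimizer is x* = 0. Newton's update converges quadratically near the minimizer, while fixed-step gradient descent converges only linearly; the function, step size, and iteration counts are all illustrative assumptions.

    ```python
    import math

    def newton(x, iters):
        # Newton's method on f(x) = cosh(x):
        # update x -= f'(x)/f''(x) = sinh(x)/cosh(x) = tanh(x).
        for _ in range(iters):
            x -= math.tanh(x)
        return x

    def gradient_descent(x, iters, lr=0.1):
        # Fixed-step gradient descent on the same f: x -= lr * f'(x) = lr * sinh(x).
        for _ in range(iters):
            x -= lr * math.sinh(x)
        return x

    # Starting from x0 = 1.0, Newton's error shrinks quadratically
    # (roughly squaring each iteration), while gradient descent's error
    # shrinks by an approximately constant factor per iteration.
    x0 = 1.0
    for k in (2, 4, 8):
        print(k, abs(newton(x0, k)), abs(gradient_descent(x0, k)))
    ```

    The same qualitative gap (quadratic vs. linear local rates, against higher per-iteration cost for Newton's Hessian solve) is what the assessment asks students to reason about in the high-dimensional setting.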
Statistics
  • Remixes

    100

  • Shares

    100

  • Downloads

    100

  • Attempts

    100

  • Average Score

    100%

Mock data used for demo purposes.