Metadata
Subject
Technology & Computer Science
-
Education level
Graduate
-
Cognitive goals
Apply
-
Difficulty estimate
Medium
-
Tags
model compression, pruning, quantization, knowledge distillation, edge deployment
-
Number of questions
5
-
Created on
-
Generation source
-
License
CC0 Public domain
-
Prompt
Assess students' ability to apply pruning, quantization, and knowledge distillation to design and evaluate compressed deep neural networks for edge devices. The prompt scope includes choosing between structured and unstructured pruning, post-training quantization versus quantization-aware training, teacher–student distillation strategies, combined compression pipelines, evaluation metrics (accuracy, latency, memory, energy), and deployment/toolchain trade-offs for constrained hardware.