Metadata
Subject
Technology & Computer Science
-
Education level
Graduate
-
Cognitive goals
Apply
-
Difficulty estimate
Medium
-
Tags
pruning, quantization, knowledge distillation, model compression, edge deployment, hardware-aware
-
Number of questions
5
-
Created on
-
Generation source
-
License
CC0 Public domain
-
Prompt
Assess graduate-level ability to apply pruning, quantization, and knowledge distillation to compress deep neural networks for resource-constrained edge devices. Tasks include selecting and justifying appropriate pruning (structured vs. unstructured), quantization (post-training quantization vs. quantization-aware training, bit-width selection), and distillation strategies; designing a hardware-aware compression and deployment pipeline; and analyzing trade-offs (accuracy, latency, memory, energy) with concrete evaluation metrics and validation experiments.
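As a minimal illustration of two of the techniques the prompt names, the sketch below shows unstructured magnitude pruning followed by symmetric per-tensor int8 post-training quantization, in plain NumPy. All function names here are hypothetical and chosen for this sketch; a real pipeline would use a framework's pruning and quantization APIs and calibrate on actual data.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Unstructured magnitude pruning: zero the smallest-|w| fraction of weights."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    return np.where(np.abs(w) <= thresh, 0.0, w)

def quantize_int8(w):
    """Symmetric per-tensor post-training quantization to int8."""
    max_abs = np.abs(w).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 codes back to float for accuracy evaluation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)   # stand-in weight tensor
w_pruned = magnitude_prune(w, sparsity=0.5)        # 50% of weights zeroed
q, scale = quantize_int8(w_pruned)                 # 4x smaller than float32
w_hat = dequantize(q, scale)                       # reconstruction for error check
sparsity_achieved = float(np.mean(w_pruned == 0))
max_quant_error = float(np.abs(w_hat - w_pruned).max())
```

Structured pruning and quantization-aware training differ from this sketch mainly in granularity (whole channels/filters instead of individual weights) and in when quantization error is modeled (during training rather than after it).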