Prune Callback
Use the Pruner within the fastai callback system
Overview
The PruneCallback integrates structured pruning into the fastai training loop. Unlike sparsification (which zeros weights), pruning physically removes network structures (filters, channels) to reduce model size and computation.
Key Differences from SparsifyCallback:
- Removes structures entirely (not just zeroed)
- Uses the torch-pruning library for dependency handling
- Supports various pruning criteria and schedules
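To make the distinction concrete, here is a small NumPy sketch (toy tensor shapes and an illustrative L1-norm importance score, not fasterai's internals) contrasting zeroing filters with physically removing them:

```python
import numpy as np

# Toy conv weight: 8 output filters, each 3 (in-channels) x 3 x 3
w = np.random.randn(8, 3, 3, 3)

# Per-filter importance: L1 norm of each filter (a common magnitude criterion)
scores = np.abs(w).reshape(8, -1).sum(axis=1)
weakest = np.argsort(scores)[:2]  # the 2 least important filters

# Sparsification: zero the weakest filters -- tensor shape is unchanged
sparse = w.copy()
sparse[weakest] = 0.0

# Structured pruning: remove them entirely -- tensor shrinks
pruned = np.delete(w, weakest, axis=0)

print(sparse.shape, pruned.shape)  # (8, 3, 3, 3) vs (6, 3, 3, 3)
```

Because pruning changes tensor shapes, downstream layers must shrink too, which is why the callback relies on torch-pruning's dependency handling.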
Parameters:
- pruning_ratio: Target ratio of parameters to remove (0-1 or 0-100). Values >1 are treated as percentages.
- schedule: When to prune (from fasterai.core.schedule). Controls how pruning progresses over training.
- context: 'local' (per-layer pruning) or 'global' (across the entire model).
- criteria: How to select what to prune (from fasterai.core.criteria).
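For intuition on how a schedule shapes progression, the classic AGP rule (Zhu and Gupta, 2017) ramps the ratio cubically: aggressive pruning early in training, tapering off toward the end. A minimal sketch of the formula (the function name and signature here are illustrative; fasterai's agp schedule may expose a different interface):

```python
def agp_schedule(progress, final_ratio):
    """Cubic AGP-style ramp: 0 at the start of training, final_ratio at the end.
    `progress` is the fraction of training completed (0.0 to 1.0)."""
    return final_ratio * (1 - (1 - progress) ** 3)

# Pruning ramps quickly at first, then flattens out:
for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"progress={p:.2f} -> prune {agp_schedule(p, 30):.2f}% of parameters")
```

Early removal gives the remaining weights more training steps to recover, which is the rationale behind gradual schedules.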
Usage Example
from fasterai.prune.prune_callback import PruneCallback
from fasterai.core.schedule import agp
from fasterai.core.criteria import large_final
# Prune 30% of parameters using automated gradual pruning schedule
cb = PruneCallback(
    pruning_ratio=30,     # Remove 30% of parameters
    schedule=agp,         # Gradual pruning (cubic decay)
    context='global',     # Prune globally across all layers
    criteria=large_final  # Keep weights with largest magnitude
)
learn.fit(10, cbs=[cb])

See Also
- Pruner - Core structured pruning class used by this callback
- Schedules - Control pruning progression during training
- Criteria - Importance measures for selecting filters to prune
- SparsifyCallback - Unstructured pruning alternative