Task-Guided Quantization Strategies
DOI: https://doi.org/10.31224/6071

Keywords: task-driven sensing, non-uniform quantization, computational imaging, joint optimization, low-bit-depth sensing, raw image processing, embedded vision

Abstract
Conventional image sensors employ uniform quantization, a process agnostic to the downstream task, often discarding task-critical information. We propose a framework for end-to-end learning of a non-uniform quantization strategy, co-designed with a neural network for visual recognition. Our method replaces the standard analog-to-digital converter with a differentiable, learned module that optimizes the allocation of discrete levels for a specific machine vision task. Extensive experiments on ImageNet, CIFAR-100, and SID demonstrate that our approach outperforms uniform quantization and other baselines at ultra-low bit-depths (4-8 bits), achieving superior accuracy and enhanced robustness to noise while significantly reducing data bandwidth.
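As a rough illustration of the idea, a non-uniform quantizer can be modeled as a codebook of K learnable levels to which each sensor reading is snapped. The sketch below shows only the forward pass with NumPy; in the actual end-to-end training described in the abstract, the non-differentiable rounding step would typically be handled with a technique such as a straight-through estimator so gradients can reach the levels. All names here are illustrative and not the paper's implementation.

```python
import numpy as np

def nonuniform_quantize(x, levels):
    """Snap each value in x to the nearest of the learned quantization levels."""
    levels = np.sort(np.asarray(levels))
    # Distance from every pixel to every level; pick the closest level per pixel.
    idx = np.abs(x[..., None] - levels).argmin(axis=-1)
    return levels[idx]

# Hypothetical 4-bit quantizer: 16 levels, initialized uniformly in [0, 1].
# During co-design these levels would be optimized for the downstream task,
# concentrating resolution where it matters for recognition.
levels = np.linspace(0.0, 1.0, 16)
x = np.random.rand(8, 8)          # stand-in for a normalized raw sensor patch
q = nonuniform_quantize(x, levels)
```

With uniformly spaced levels this reduces to ordinary uniform quantization; the learning signal from the task loss is what moves the levels into a non-uniform, task-adapted allocation.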
License
Copyright (c) 2025 Igor Szoboszlai

This work is licensed under a Creative Commons Attribution 4.0 International License.