Preprint / Version 1

Task-Guided Quantization Strategies

Authors

  • Igor Szoboszlai, Independent Researcher

DOI:

https://doi.org/10.31224/6071

Keywords:

task-driven sensing, non-uniform quantization, computational imaging, joint optimization, low-bit-depth sensing, raw image processing, embedded vision

Abstract

Conventional image sensors employ uniform quantization, a process agnostic to the downstream task, often discarding task-critical information. We propose a framework for end-to-end learning of a non-uniform quantization strategy, co-designed with a neural network for visual recognition. Our method replaces the standard analog-to-digital converter with a differentiable, learned module that optimizes the allocation of discrete levels for a specific machine vision task. Extensive experiments on ImageNet, CIFAR-100, and SID demonstrate that our approach outperforms uniform quantization and other baselines at ultra-low bit-depths (4-8 bits), achieving superior accuracy and enhanced robustness to noise while significantly reducing data bandwidth.
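The abstract describes replacing the sensor's uniform quantizer with a differentiable module whose discrete levels are learned jointly with the downstream network. A minimal sketch of how such a module can be made trainable is shown below, using learnable level positions and a straight-through estimator so that hard rounding still passes gradients; all class and variable names here are illustrative assumptions, not taken from the paper.

```python
import torch

class LearnedQuantizer(torch.nn.Module):
    """Sketch of a differentiable non-uniform quantizer.

    Level positions are learnable parameters. The forward pass snaps
    each input to its nearest level (hard quantization), while the
    backward pass routes gradients through a soft, temperature-scaled
    assignment (straight-through estimator), so a task loss can move
    the levels. Names and hyperparameters are illustrative only.
    """

    def __init__(self, bits: int = 4, temperature: float = 0.01):
        super().__init__()
        n_levels = 2 ** bits
        self.temperature = temperature
        # Start from uniform levels in [0, 1]; training reallocates them.
        self.levels = torch.nn.Parameter(torch.linspace(0.0, 1.0, n_levels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squared distance from each pixel to each level: (..., n_levels).
        d = (x.unsqueeze(-1) - self.levels) ** 2
        # Differentiable soft assignment vs. hard nearest-level lookup.
        soft = (torch.softmax(-d / self.temperature, dim=-1) * self.levels).sum(-1)
        hard = self.levels[d.argmin(-1)]
        # Straight-through: hard values forward, soft gradients backward.
        return soft + (hard - soft).detach()

# Example: quantize a random "raw" patch to 16 learned levels.
q = LearnedQuantizer(bits=4)
x = torch.rand(8, 8)
y = q(x)
```

In an end-to-end setup of the kind the abstract outlines, `y` would feed a recognition network and the task loss would backpropagate into `q.levels`, concentrating levels where the task needs precision.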

Downloads

Download data is not yet available.


Posted

2025-12-23