Issue No. 02 - Apr.-June (2017 vol. 24)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MMUL.2017.40
Ryan Janzen, University of Toronto
Steve Mann, Stanford University
The new concept of coupled dynamic dynamic-range (D²R) compositing assembles sensor information, such as images or audio, from multiple "strong" and "weak" samplings or sensor snapshots whose sensitivities drift as the amplitude-domain properties of the scene (lighting or sound conditions) change over time. The authors introduce a feedback-control method that automatically adjusts multiple exposure settings for compositing, increasing the dynamic range of a sensory process such as video capture. The method expresses the uncertainty of each sensor's measurements with a cost function, combines it with salience detection, and feeds the result into a dynamic control system. The system responds in real time to changing ambient conditions and sensor motion, asymptotically steering the sensor controls toward minimum uncertainty and thereby capturing an extremely high dynamic range for compositing.
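The idea of a feedback loop that steers several exposure settings toward minimum measurement uncertainty can be illustrated with a minimal sketch. This is not the authors' implementation: the linear clipping sensor model, the reciprocal uncertainty cost, and the log-space finite-difference update are all illustrative assumptions standing in for the paper's cost function and control system.

```python
import math

FULL_SCALE = 255.0  # assumed sensor full-scale reading (illustrative)

def sense(radiance, exposure):
    """Toy linear sensor that saturates (clips) at full scale."""
    return min(radiance * exposure, FULL_SCALE)

def uncertainty(reading):
    """Toy cost: low for mid-range readings, huge near clipping or the noise floor."""
    margin = min(reading, FULL_SCALE - reading)
    return 1.0 / (margin + 1e-6)

def composite_cost(exposures, scene):
    """Coupled cost: each scene point counts only the exposure that measures it best."""
    return sum(min(uncertainty(sense(s, e)) for e in exposures) for s in scene)

def adapt(exposures, scene, gain=0.5, steps=400):
    """Feedback loop: nudge each exposure (in log space) downhill on the cost,
    asymptotically tracking toward minimum total uncertainty."""
    exposures = list(exposures)
    for _ in range(steps):
        for i, e in enumerate(exposures):
            eps = 0.01  # relative perturbation for a finite-difference slope
            hi = exposures[:i] + [e * (1 + eps)] + exposures[i + 1:]
            lo = exposures[:i] + [e * (1 - eps)] + exposures[i + 1:]
            slope = (composite_cost(hi, scene) - composite_cost(lo, scene)) / (2 * eps)
            step = max(-0.5, min(0.5, gain * slope))  # clamp for stability
            exposures[i] = e * math.exp(-step)
    return exposures

# A scene with one dark and one bright region: the coupled cost drives the
# two exposures apart, so one stays "weak" (short) for the bright region
# while the other becomes "strong" (long) to lift the dark region.
final = adapt([0.05, 1.0], scene=[2.0, 800.0])
```

The coupling enters through `composite_cost`: because each scene point is charged only to its best exposure, the controls specialize rather than converging to the same setting, which mirrors the strong/weak compositing described in the abstract.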
Dynamic range, Cameras, Automatic generation control, Computer graphics, Real-time systems, Sensors, Data analysis, Big data
R. Janzen and S. Mann, "Extreme-Dynamic-Range Sensing: Real-Time Adaptation to Extreme Signals," in IEEE MultiMedia, vol. 24, no. 2, pp. 30-42, 2017.