Closed-loop active object recognition with constrained illumination power

Jacques Noom*, Oleg Soloviev, Carlas Smith, Michel Verhaegen

*Corresponding author for this work

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review


Abstract

Some applications require a high level of image-based classification certainty while keeping the total illumination energy as low as possible. Examples include minimally invasive visual inspection in Industry 4.0 and medical imaging systems such as computed tomography, in which the radiation dose should be kept “as low as is reasonably achievable”. We introduce a sequential object recognition scheme aimed at minimizing phototoxicity or bleaching while achieving a predefined level of decision accuracy. The novel online procedure relies on approximate weighted Bhattacharyya coefficients for the determination of future inputs. Simulation results on the MNIST handwritten digit database show how the total illumination energy is decreased with respect to a detection scheme using constant illumination.
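
The abstract only names the input-selection criterion, so the sketch below is a minimal, hypothetical Python illustration of how a posterior-weighted Bhattacharyya coefficient could be used to score a candidate illumination input. The function names, the discrete-distribution setting, and the pairwise posterior weighting are assumptions for illustration; they are not the authors' exact "approximate weighted" formulation.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient between two discrete distributions.

    A value near 1 means the distributions are hard to tell apart;
    a value near 0 means an observation will separate them well.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def weighted_overlap_score(class_likelihoods, posterior):
    """Posterior-weighted sum of pairwise Bhattacharyya coefficients.

    class_likelihoods: one discrete observation distribution per candidate
        class, induced by a candidate illumination input (assumed model).
    posterior: current belief over the classes.
    A candidate input with a lower score is expected to discriminate
    better between the classes that are still plausible, so a controller
    could pick the lowest-scoring input subject to an energy budget.
    """
    score = 0.0
    n = len(class_likelihoods)
    for i in range(n):
        for j in range(i + 1, n):
            w = posterior[i] * posterior[j]
            score += w * bhattacharyya_coefficient(class_likelihoods[i],
                                                   class_likelihoods[j])
    return score
```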

Original language: English
Title of host publication: Proceedings Real-Time Image Processing and Deep Learning 2022
Editors: Nasser Kehtarnavaz, Matthias F. Carlsohn
Publisher: SPIE
Number of pages: 6
ISBN (Electronic): 9781510650800
DOIs
Publication status: Published - 2022
Event: Real-Time Image Processing and Deep Learning 2022 - Virtual, Online
Duration: 6 Jun 2022 – 12 Jun 2022

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 12102
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: Real-Time Image Processing and Deep Learning 2022
City: Virtual, Online
Period: 6/06/22 – 12/06/22

Keywords

  • Active fault diagnosis
  • Auxiliary signal design
  • Computational Tomography
  • Industry 4.0
  • Machine Vision
  • Medical imaging
