Mammadova K., ERDEM M. E., Erdem A.

22nd IEEE Signal Processing and Communications Applications Conference (SIU), Trabzon, Turkey, 23 - 25 April 2014, pp.536-539

  • Publication Type: Conference Paper / Full Text
  • Doi Number: 10.1109/siu.2014.6830284
  • City: Trabzon
  • Country: Turkey
  • Page Numbers: pp.536-539
  • Hacettepe University Affiliated: Yes


Exposure Fusion is a popular multi-exposure image fusion method which blends a set of differently exposed low dynamic range images of a scene to obtain a single low dynamic range but contrast-rich image. This approach carries out the blending process by using three local quality measures, namely contrast, saturation and well-exposedness. Our aim in this study is to extend the exposure fusion method by incorporating a novel visual saliency based quality measure. This new measure captures the parts of the scene that grab our attention and gives more prominence to these salient regions, which is otherwise not possible with the previous measures in use. Our experiments show that, as compared to the exposure fusion method, our saliency-guided approach gives more vivid results and leads to sharper boundaries in the output images.
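The weighting scheme described above can be sketched as follows. This is a minimal, assumption-laden illustration: it combines the three Exposure Fusion quality measures per pixel, multiplies in an externally supplied saliency map as a fourth measure (a stand-in for the paper's saliency measure, whose exact definition is not given here), and performs naive per-pixel blending rather than the multiresolution pyramid blending the original method uses to avoid seams.

```python
import numpy as np

def quality_weights(img, sigma=0.2, saliency=None):
    """Per-pixel weight map for one exposure.

    `img` has values in [0, 1] and shape (H, W, 3). Combines the three
    Exposure Fusion measures (contrast, saturation, well-exposedness);
    `saliency` is a hypothetical extra (H, W) map standing in for the
    paper's saliency-based measure.
    """
    gray = img.mean(axis=2)
    # Contrast: magnitude of a discrete Laplacian response.
    lap = np.abs(
        -4.0 * gray
        + np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0)
        + np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1)
    )
    # Saturation: standard deviation across the colour channels.
    sat = img.std(axis=2)
    # Well-exposedness: Gaussian centred at mid-grey, product over channels.
    expo = np.exp(-((img - 0.5) ** 2) / (2.0 * sigma ** 2)).prod(axis=2)
    w = lap * sat * expo + 1e-12  # epsilon avoids all-zero weights
    if saliency is not None:
        w = w * saliency
    return w

def fuse(images, saliencies=None):
    """Blend exposures with normalized per-pixel weights (no pyramid)."""
    if saliencies is None:
        saliencies = [None] * len(images)
    ws = np.stack([quality_weights(im, saliency=s)
                   for im, s in zip(images, saliencies)])
    ws = ws / ws.sum(axis=0, keepdims=True)  # weights sum to 1 per pixel
    return (ws[..., None] * np.stack(images)).sum(axis=0)
```

Because the normalized weights form a convex combination at each pixel, the fused output stays within the value range of the inputs.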