Saliency-Based Illumination Control for Guiding User Attention in 3D Scenes

Visual attention significantly affects how we perceive 3D environments, and saliency is the component of visual attention that expresses how strongly a scene or object draws our interest due to its visible features. Saliency depends on visual properties of objects such as shape, shading, and brightness. The illumination of a scene has a significant effect on these visual properties. This work aims to control the saliency of objects in a 3D scene by modifying its illumination parameters. To this end, given a realistic 3D scene, we investigate the light parameters that maximize the saliency of target objects intended to attract attention. In other words, we propose a method for goal-driven automatic lighting setup. In this study, 2D renderings of the 3D scene from specific viewpoints are considered, and the results are examined in terms of the saliency distribution under different light positions. Furthermore, different saliency estimation methods and computations are used in this process, and the results are evaluated with eye-tracking tests.

SALIENCY BASED ILLUMINATION CONTROL FOR GUIDING USER ATTENTION IN 3D SCENES

Visual attention has a major impact on how we perceive 3D environments, and saliency is a component of visual attention that expresses how likely a scene or object is to capture our attention due to its apparent features. Saliency relies on shape, shading, brightness, and other visual attributes of objects. The saliency distribution of a visual field is influenced by the illumination of a scene, which has a significant impact on these visual properties. This work aims to control saliency by manipulating the illumination parameters of a 3D scene. To this end, given a realistic 3D scene, the light parameters that provide maximum saliency for the objects of interest are investigated. In other words, we propose a method for task-aware automatic lighting setup. In this paper, 2D renderings of a 3D scene from various perspectives are considered, and the effects are analyzed in terms of the saliency distribution under various lighting conditions. In addition, different saliency estimation methods and calculations are investigated, and eye-tracker-based user experiments are conducted to verify the results.
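The search described above — render the scene under candidate light setups, estimate a saliency map for each rendering, and pick the setup that maximizes saliency over the target objects — can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the saliency estimators actually studied (e.g., Itti–Koch-style models) are replaced here by a simple center-surround contrast measure, and the "renderings" are synthetic arrays; `box_blur`, `best_light_setup`, and the toy scene are all hypothetical names introduced for this sketch.

```python
import numpy as np

def box_blur(img, k=9):
    """Mean filter via an integral image (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    csum = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    csum = np.pad(csum, ((1, 0), (1, 0)))
    h, w = img.shape
    window_sums = (csum[k:k + h, k:k + w] - csum[:h, k:k + w]
                   - csum[k:k + h, :w] + csum[:h, :w])
    return window_sums / (k * k)

def saliency_map(img, k=9):
    """Stand-in saliency: squared deviation from the local mean
    (a crude center-surround contrast, not an Itti-Koch model)."""
    return (img - box_blur(img, k)) ** 2

def best_light_setup(renders, target_mask):
    """Pick the rendering whose target region has the highest mean saliency."""
    scores = [saliency_map(r)[target_mask].mean() for r in renders]
    return int(np.argmax(scores)), scores

# Toy stand-ins for renderings of one scene under two light positions:
# under light 0 the target blends into the background; under light 1
# the target is lit more brightly and stands out.
mask = np.zeros((64, 64), dtype=bool)
mask[24:40, 24:40] = True
render_flat = np.full((64, 64), 0.5)
render_lit = np.full((64, 64), 0.5)
render_lit[mask] = 0.9
best, scores = best_light_setup([render_flat, render_lit], mask)
print(best)  # → 1: the light position under which the target is most salient
```

In the actual pipeline the candidate images would come from re-rendering the 3D scene with different light parameters, and the winning setup would be validated against eye-tracking data rather than a synthetic mask.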
