Materials are omnipresent. Recognizing materials helps us infer their physical and chemical properties, for instance whether they are compressible, slippery, or sweet and juicy. Yet in the literature, much less attention has been paid to material perception than to object perception. This dissertation presents studies on a method to systematically measure human visual perception of opaque materials and to test the influence of lighting and shape on material perception. In our studies, we applied multiple psychophysical methods, such as matching, discrimination, and perceptual scaling, to test the visual perception of materials by human observers. Beyond the matte and glossy variations commonly tested in the material perception literature, we included four canonical material modes to cover a wide range of materials, namely diffuse, asperity, forward, and mesofacet scattering for "matte", "velvety", "specular", and "glittery" modes, respectively. For lighting, we included three canonical lighting modes within a spherical harmonics and perception-based framework, namely "ambient", "focus", and "brilliance" lighting. Based on a spherical harmonics analysis of the global lighting environment, we quantified the "diffuseness" and "brilliance" of the light maps using Xia's diffuseness metric and a novel brilliance metric that we proposed. Combining the four material modes and three lighting modes, we presented a canonical set that, in combination with optical mixing, supports a painterly approach in which key image features can be varied directly. With this method we were able to test and predict light-material interactions using both photographs of real objects and computer-rendered stimuli. We first introduced a new type of non-spherical appearance probe, implementing the painterly approach.
Moreover, we developed an interactive interface that integrated the probe into an asymmetric matching task, in which observers adjusted sliders to vary each material mode in the probe. The interface was found to be intuitive for inexperienced users and allowed purely visual quantitative measurements. Performance was generally well above chance and robust across experiments and observers, validating the approach. We further developed the material probe and extended it to allow optical mixing of canonical lighting modes. In a light matching experiment and a four-category discrimination experiment, we found asymmetric perceptual confounds between judgments of material and lighting: observers were less sensitive to lighting changes than to material changes. Moreover, using this canonical approach, we were able to test and predict light-material interactions in two perceptual scaling experiments. To this aim, a novel spherical-harmonics-based metric was introduced to quantify "brilliance". Lastly, we compared results from our probing method with results from other psychophysical methods, namely perceptual scaling and discrimination, in which semantic information (material attribute labels) was involved. Robust, material-dependent effects of light, shape, and light-map orientation were found.
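The optical mixing behind the slider-based probe amounts to a per-pixel weighted superposition of the canonical-mode basis images, with the slider values as weights. A minimal Python sketch of this linear formulation follows; the function name, weight normalization, and toy images are illustrative assumptions, not the dissertation's actual implementation:

```python
import numpy as np

def optical_mix(mode_images, weights):
    """Mix canonical-mode images as a per-pixel weighted superposition.

    mode_images: list of HxWx3 float arrays, one per canonical material mode
    weights: slider values, one per mode (normalized here to sum to 1;
             this normalization is an assumption for the sketch)
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    stack = np.stack(mode_images, axis=0)   # shape: (modes, H, W, 3)
    return np.tensordot(w, stack, axes=1)   # weighted sum over the mode axis

# Hypothetical example: blend "matte" and "specular" basis images 50/50,
# with the "velvety" and "glittery" sliders at zero.
matte = np.full((2, 2, 3), 0.2)
velvety = np.zeros((2, 2, 3))
specular = np.full((2, 2, 3), 0.8)
glittery = np.zeros((2, 2, 3))
mixed = optical_mix([matte, velvety, specular, glittery], [1.0, 0.0, 1.0, 0.0])
print(mixed[0, 0, 0])  # 0.5 = 0.5*0.2 + 0.5*0.8
```

The same weighted-superposition idea extends to mixing the canonical lighting modes, since a rendered image is linear in the illumination.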
To conclude, our research mainly contributed: 1) the development of a novel probing method that fluently mixes image features of the proximal stimulus instead of varying the distal physical properties of the stimuli, together with a validation showing that it works and allows quantitative measurement of material perception and material-lighting interactions; 2) an understanding of the visual perception of opaque materials and material-light interactions across a wide ecological variety; 3) a validated model for predicting material-dependent lighting effects for matte, specular, velvety, and glittery materials; and 4) an interpretation of the material perception results in relation to shape and light. Our findings can be applied in many domains, such as industrial design, education, e-commerce, computer graphics, and future psychophysical studies.
Qualification: Doctor of Philosophy
Award date: 29 Oct 2019
Publication status: Published - 2019
- Visual perception
- Material perception