A preliminary study on visual estimation of taste appreciation

Idil Esen Zulfikar, Hamdi Dibeklioglu, Hazim Kemal Ekenel

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review


Individuals' appreciation of taste has long been a subject of curiosity, and there is increasing research interest in measuring it. Most previous work in this area consists of psychological studies that rely on manual coding of facial actions and/or emotional expressions, and consequently depends on human observation. We propose a preliminary study of an automatic visual analysis system that estimates individuals' taste liking. Our results show that the proposed system achieves 56.6% accuracy in classifying appreciation into liking, neutral, and disliking categories. To explore this result in detail, we also evaluate the classification of liking-level pairs: disliking-vs-liking, neutral-vs-liking, and neutral-vs-disliking. Our system achieves 72.5% accuracy in distinguishing disliking from liking; however, the classification accuracies for disliking-vs-neutral and liking-vs-neutral are lower. Our results suggest that reliable and fast automatic systems can be developed to estimate taste appreciation, yet, as indicated in previous studies, the liking and neutral classes are not easily separable.
Original language: English
Title of host publication: 2016 IEEE International Conference on Multimedia & Expo Workshops (ICMEW)
Place of publication: Piscataway
Number of pages: 6
ISBN (Electronic): 978-1-5090-1552-8
ISBN (Print): 978-1-5090-1551-1
Publication status: Published - 2016
Event: 2016 IEEE International Conference on Multimedia and Expo Workshops (ICMEW) - Seattle, WA, United States
Duration: 11 Jul 2016 - 15 Jul 2016


Conference: 2016 IEEE International Conference on Multimedia and Expo Workshops (ICMEW)
Abbreviated title: ICMEW
Country/Territory: United States
City: Seattle, WA


  • appreciation estimation
  • facial response
  • expression
  • peak frame
