Perceived similarity and visual descriptions in content-based image retrieval

Conference Paper


Abstract


  • The use of low-level feature descriptors is pervasive in content-based image retrieval tasks, and the answer to the question of how well these features describe users' intentions remains inconclusive. In this paper we devise experiments to gauge the degree of alignment between the description of target images by humans and that implicitly provided by low-level image feature descriptors. Data was collected on how humans perceive similarity in images. Using images judged by humans to be similar as ground truth, the performance of several MPEG-7 visual feature descriptors was evaluated. It is found that various descriptors play different roles in different queries and that their appropriate combination can improve the performance of retrieval tasks. This forms a basis for the development of adaptive weight assignment to features depending on the query and retrieval task. © 2007 Crown Copyright.
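The abstract's closing point, combining several low-level descriptors with query-dependent weights, can be illustrated with a minimal sketch. The descriptor names, the per-descriptor Euclidean distance, and the example weights below are assumptions for illustration only, not the distance measures or weighting scheme used in the paper.

```python
# Minimal sketch: query-dependent weighted combination of descriptor distances,
# in the spirit of the adaptive weight assignment the abstract points toward.
# Descriptor names ("colour_layout", "edge_histogram"), Euclidean distances,
# and the weights are illustrative assumptions, not the paper's method.
import numpy as np

def combined_distance(query_feats, image_feats, weights):
    """Weighted sum of per-descriptor distances between a query and an image."""
    total = 0.0
    for name, w in weights.items():
        q = np.asarray(query_feats[name], dtype=float)
        x = np.asarray(image_feats[name], dtype=float)
        total += w * np.linalg.norm(q - x)  # Euclidean distance per descriptor
    return total

def rank_images(query_feats, database, weights):
    """Return image ids sorted by combined distance to the query (closest first)."""
    scored = [(img_id, combined_distance(query_feats, feats, weights))
              for img_id, feats in database.items()]
    return [img_id for img_id, _ in sorted(scored, key=lambda s: s[1])]

# Toy usage: two hypothetical descriptors with query-dependent weights.
query = {"colour_layout": [0.2, 0.8], "edge_histogram": [0.1, 0.4, 0.5]}
database = {
    "img_a": {"colour_layout": [0.3, 0.7], "edge_histogram": [0.1, 0.4, 0.5]},
    "img_b": {"colour_layout": [0.9, 0.1], "edge_histogram": [0.5, 0.3, 0.2]},
}
weights = {"colour_layout": 0.6, "edge_histogram": 0.4}
print(rank_images(query, database, weights))  # img_a ranks above img_b
```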

Publication Date


  • 2007

Citation


  • Zhong, Y., Ye, L., Li, W., & Ogunbona, P. (2007). Perceived similarity and visual descriptions in content-based image retrieval. In Proceedings of the 9th IEEE International Symposium on Multimedia Workshops (ISM Workshops 2007) (pp. 173-178). doi:10.1109/ISMW.2007.4475967

Scopus EID


  • 2-s2.0-49649096451

Web of Science Accession Number


Start Page


  • 173

End Page


  • 178
