  • Research Article
  • Open access

Robust Object Categorization and Segmentation Motivated by Visual Contexts in the Human Visual System

Abstract

Categorizing visual elements is fundamentally important for autonomous mobile robots to acquire intelligence such as novel object learning and topological place recognition. The main difficulties of visual categorization are twofold: large internal variations caused by surface markings and large external variations caused by background clutter. In this paper, we present a new object categorization method that is robust to both surface markings and background clutter. A biologically motivated codebook selection method alleviates the surface marking problem, and introducing visual context into the codebook approach handles the background clutter issue. The visual contexts utilized are part-part context, part-whole context, and object-background context. An additional contribution is a statistical optimization method, termed boosted MCMC, which incorporates these three kinds of context into the codebook framework. The object category label and figure-ground information are estimated so as to best describe the input image. We experimentally validate the effectiveness and feasibility of object categorization in cluttered environments.
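The abstract names boosted MCMC but does not detail it. As a minimal, hypothetical sketch of the general idea (jointly searching over an object category label and a per-part figure-ground assignment so that the accepted hypothesis best explains the codebook evidence), a plain Metropolis-Hastings loop is shown below. The category set, part votes, scoring function, and proposal scheme are illustrative placeholders, not the paper's algorithm.

```python
# Hypothetical sketch: MCMC over (category label, figure-ground mask) given
# per-part codebook votes. Placeholders only; not the paper's boosted MCMC.
import math
import random

random.seed(0)

CATEGORIES = ["car", "cow", "mug"]   # assumed category set
NUM_PARTS = 12                       # codebook parts matched in the image

# Placeholder evidence: per-part, per-category vote strengths (these would come
# from codebook matching in a real system).
votes = [[random.random() for _ in CATEGORIES] for _ in range(NUM_PARTS)]

def score(category_idx, fg_mask):
    """Toy log-score: foreground parts should vote for the chosen category,
    background parts should not. Stands in for a contextual likelihood."""
    s = 0.0
    for p in range(NUM_PARTS):
        v = votes[p][category_idx]
        s += math.log(v + 1e-6) if fg_mask[p] else math.log(1.0 - v + 1e-6)
    return s

# Random initial hypothesis.
cat = random.randrange(len(CATEGORIES))
mask = [random.random() < 0.5 for _ in range(NUM_PARTS)]
cur = score(cat, mask)

for _ in range(2000):
    # Propose either a new category label or a flip of one part's label.
    if random.random() < 0.2:
        new_cat, new_mask = random.randrange(len(CATEGORIES)), mask
    else:
        new_cat, new_mask = cat, mask[:]
        j = random.randrange(NUM_PARTS)
        new_mask[j] = not new_mask[j]
    new = score(new_cat, new_mask)
    # Metropolis acceptance: keep improvements, occasionally accept worse moves.
    if new > cur or random.random() < math.exp(new - cur):
        cat, mask, cur = new_cat, new_mask, new

print("Estimated category:", CATEGORIES[cat])
print("Foreground parts:", [i for i, f in enumerate(mask) if f])
```

A boosted variant would bias the proposals toward parts and labels favored by strong codebook votes rather than proposing uniformly at random, which is what the uniform choices above stand in for.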

Publisher note

To access the full article, please see PDF.

Author information


Corresponding author

Correspondence to Sungho Kim.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 Generic License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Kim, S. Robust Object Categorization and Segmentation Motivated by Visual Contexts in the Human Visual System. EURASIP J. Adv. Signal Process. 2011, 101428 (2011). https://doi.org/10.1155/2011/101428



  • DOI: https://doi.org/10.1155/2011/101428
