Human Visual Perception Quantification: Toward Machine Perception

Abstract:

The human visual system processes environmental sensory information to create the perception of real-world objects such as vehicles, desks, and buildings; this process is referred to as visual perception. Although making meaning of visual information appears simple, the underlying processing carried out by the brain is highly complex, and a long-standing goal of machine vision research is to mimic human vision and to understand that processing. In this article, we present a methodology for quantifying human visual perception inspired by the Gestalt theory of perception. Our aim is to give visual perception a quantitative formulation so that it can be readily incorporated into image-processing tasks. To this end, we propose a framework based on Bayesian learning. We introduce a novel object model that quantifies the Gestalt law of proximity using a marked point process (MPP). Our approach benefits from the uncertainty handling of the stochastic framework and from the strong mathematical foundations of marked point processes. Tests on synthetic images demonstrate effective detection of perceptual groups in the presence of Gaussian noise.
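
Since the abstract only outlines the approach, the toy Python sketch below illustrates, under our own assumptions rather than the paper's actual model, how a Bayesian energy with a proximity-based prior over a marked point configuration might be set up and optimized: disks with a fixed radius mark are detected in a synthetic image corrupted by Gaussian noise, and pairs of points closer than a threshold are rewarded, loosely following the Gestalt law of proximity. All names and parameter values (render, energy, D_PROX, BETA, the birth/death sampler) are hypothetical choices made for this illustration.

```python
# Minimal illustrative sketch (not the authors' implementation): detect disk-shaped
# "marked points" in a noisy synthetic image with a Bayesian posterior energy whose
# prior rewards proximal points (Gestalt proximity). All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
H = W = 64          # image size
SIGMA = 0.3         # standard deviation of the additive Gaussian noise
R = 4               # disk radius (the "mark" of each point)
D_PROX = 15.0       # distance threshold of the proximity prior
BETA = 2.0          # reward for each pair of points closer than D_PROX

yy, xx = np.mgrid[0:H, 0:W]

def render(points):
    """Binary image containing a disk of radius R at each (row, col) point."""
    img = np.zeros((H, W))
    for r, c in points:
        img[(yy - r) ** 2 + (xx - c) ** 2 <= R ** 2] = 1.0
    return img

def energy(points, observed):
    """Posterior energy = Gaussian data term + proximity prior (lower is better)."""
    data = np.sum((observed - render(points)) ** 2) / (2 * SIGMA ** 2)
    prior = 0.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = np.hypot(points[i][0] - points[j][0], points[i][1] - points[j][1])
            if d < D_PROX:              # Gestalt proximity: reward nearby pairs
                prior -= BETA
    return data + prior

# Synthetic scene: two proximal groups of disks plus Gaussian noise.
truth = [(15, 15), (15, 25), (25, 20), (45, 45), (50, 52)]
observed = render(truth) + rng.normal(0.0, SIGMA, (H, W))

# Simple birth/death Metropolis sampler over point configurations.
config, e = [], energy([], observed)
best, best_e = config, e
for _ in range(5000):
    proposal = list(config)
    if not proposal or rng.random() < 0.5:                    # birth move
        proposal.append((int(rng.integers(H)), int(rng.integers(W))))
    else:                                                     # death move
        proposal.pop(int(rng.integers(len(proposal))))
    e_new = energy(proposal, observed)
    if e_new < e or rng.random() < np.exp(e - e_new):         # Metropolis rule
        config, e = proposal, e_new
        if e < best_e:
            best, best_e = config, e

print("lowest-energy configuration found:", sorted(best))
```

A full MPP formulation would typically also sample the marks themselves (e.g., radius or orientation) and use a more elaborate optimizer such as reversible-jump MCMC or multiple birth-and-death dynamics; the fixed-radius disks and plain Metropolis moves above are simplifications kept for brevity.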