Abstract:
For problems over continuous random variables, Markov random fields (MRFs) with large cliques pose a challenge for probabilistic inference. Difficulties in performing optimization efficiently have limited the probabilistic models explored in computer vision and other fields. One inference technique that handles large cliques well is Expectation Propagation (EP). EP offers run times independent of clique size, depending instead only on the rank, or intrinsic dimensionality, of the potentials. This property would be highly advantageous in computer vision. Unfortunately, for the grid-shaped models common in vision, traditional Gaussian EP requires quadratic space and cubic time in the number of pixels. Here, we propose a variation of EP that exploits regularities in natural scene statistics to achieve run times that are linear in both the number of pixels and the clique size. We test these methods on shape from shading, and we demonstrate strong performance not only for Lambertian surfaces, but also for arbitrary surface reflectance and lighting arrangements, which require highly non-Gaussian potentials. Finally, we use large, non-local cliques to exploit cast shadows, which are traditionally ignored in shape from shading.
Keywords:
Gaussian processes; computer vision; image resolution; inference mechanisms; learning (artificial intelligence); arbitrary surface reflectance arrangement; clique size; continuous random variables; cubic time; expectation propagation; grid-shaped models; intrinsic dimensionality; lighting arrangement; non-Gaussian potentials; non-Lambertian shape; probabilistic inference; probabilistic models; quadratic space; shading; shadow; whitened expectation propagation; approximation methods; computational modeling; covariance matrices; approximate inference; shape from X; shape from shading; shape from shadow