Multi-Stable Perception and Fitting in Computer Vision

Multi-stable Perception
Necker Cube
 
 
Spinning dancer illusion, Nobuyuki Kayahara
 
Feature Matching and Robust Fitting
Computer Vision
James Hays
Acknowledgment: Many slides from Derek Hoiem and Grauman&Leibe 2008 AAAI Tutorial
Read Szeliski 7.4.2 and 2.1
Project 2
This section: correspondence and alignment
Correspondence: matching points, patches, edges, or regions across images
Review: Local Descriptors
Most features can be thought of as templates, histograms (counts), or combinations
The ideal descriptor should be
Robust and Distinctive
Compact and Efficient
Most available descriptors focus on edge/gradient information
Capture texture information
Color rarely used
K. Grauman, B. Leibe
Can we refine this further?
 
 
 
Fitting: find the parameters of a model that best fit the data
Alignment: find the parameters of the transformation that best align matched points

Fitting and Alignment
Design challenges
Design a suitable goodness-of-fit measure
Similarity should reflect application goals
Encode robustness to outliers and noise
Design an optimization method
Avoid local optima
Find best parameters quickly
 
Fitting and Alignment: Methods
Global optimization / Search for parameters
Least squares fit
Robust least squares
Other parameter search methods
Hypothesize and test
Generalized Hough transform
RANSAC
Simple example: Fitting a line
 
Least squares line fitting
Data: (x1, y1), ..., (xn, yn)
Line equation: yi = m xi + b
Find (m, b) to minimize E = sum_i (yi - m xi - b)^2
In matrix form, with A the n x 2 matrix whose rows are [xi 1], p = [m b]^T, and y the vector of yi:
E = ||y - Ap||^2
Setting dE/dp = 2 A^T A p - 2 A^T y = 0 gives the normal equations, with solution
p = (A^T A)^(-1) A^T y
Matlab: p = A \ y;
Python: p = numpy.linalg.lstsq(A, y, rcond=None)[0]
Modified from S. Lazebnik
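As a concrete illustration of the slide above, a minimal numpy sketch of the least squares line fit (the data here are synthetic, made up for the example):

```python
import numpy as np

# Synthetic data near the line y = 2x + 1 (values chosen for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(50)

# Design matrix A with rows [x_i, 1], so A p approximates y for p = [m, b]
A = np.column_stack([x, np.ones_like(x)])

# Solve the normal equations (A^T A) p = A^T y via lstsq
p, *_ = np.linalg.lstsq(A, y, rcond=None)
m, b = p
```

`m` and `b` land near the true 2 and 1. Note that `np.linalg.lstsq` returns a tuple whose first element is the solution, which is why the result is unpacked.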
Problem with “vertical” least squares
Not rotation-invariant
Fails completely for vertical lines
Slide from S. Lazebnik
Total least squares
If a^2 + b^2 = 1, then the distance between point (xi, yi) and the line ax + by + c = 0 is |a xi + b yi + c|
Unit normal: N = (a, b)
proof: http://mathworld.wolfram.com/Point-LineDistance2-Dimensional.html
Slide modified from S. Lazebnik
Total least squares
If a^2 + b^2 = 1, then the distance between point (xi, yi) and the line ax + by + c = 0 is |a xi + b yi + c|
Find (a, b, c) to minimize the sum of squared perpendicular distances
Unit normal: N = (a, b)
Slide modified from S. Lazebnik
Total least squares
Find (a, b, c) to minimize the sum of squared perpendicular distances
E = sum_i (a xi + b yi + c)^2, subject to a^2 + b^2 = 1
Minimizing over c gives c = -(a * mean(x) + b * mean(y)); the solution for (a, b) is then the eigenvector corresponding to the smallest eigenvalue of A^T A, where each row of A is the centered point (xi - mean(x), yi - mean(y))
See details on the Rayleigh Quotient: http://en.wikipedia.org/wiki/Rayleigh_quotient
Slide modified from S. Lazebnik
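A minimal sketch of total least squares on points near a vertical line, exactly the case where ordinary least squares fails (numpy only; the data are made up for the example):

```python
import numpy as np

# Points near the vertical line x = 3 (where fitting y = mx + b breaks down)
rng = np.random.default_rng(1)
y = np.linspace(0.0, 10.0, 40)
x = 3.0 + 0.01 * rng.standard_normal(40)

# Center the points; rows of A are (xi - mean(x), yi - mean(y))
x_mean, y_mean = x.mean(), y.mean()
A = np.column_stack([x - x_mean, y - y_mean])

# (a, b) is the eigenvector of A^T A with the smallest eigenvalue;
# eigh returns eigenvalues in ascending order, so column 0 is the one we want
_, vecs = np.linalg.eigh(A.T @ A)
a, b = vecs[:, 0]
c = -(a * x_mean + b * y_mean)
```

The recovered line is approximately x = 3, i.e. (a, b, c) is close to (±1, 0, ∓3); the overall sign of an eigenvector is arbitrary.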
Recap: Two Common Optimization Problems
Least squares (global) optimization
Good
Clearly specified objective
Optimization is easy
Bad
May not be what you want to optimize
Sensitive to outliers (bad matches, extra points)
Doesn’t allow you to get multiple good fits (detecting multiple objects, lines, etc.)
Least squares: Robustness to noise
Least squares fit to the red points:
Least squares: Robustness to noise
Least squares fit with an outlier:
 
Problem: squared error heavily penalizes outliers
Fitting and Alignment: Methods
Global optimization / Search for parameters
Least squares fit
Robust least squares
Other parameter search methods
Hypothesize and test
Generalized Hough transform
RANSAC
Robust least squares (to deal with outliers)
General approach: minimize sum_i ρ(u_i(x_i, θ); σ)
u_i(x_i, θ) – residual of the ith point w.r.t. model parameters θ (for line fitting, u_i = y_i - m x_i - b)
ρ – robust function with scale parameter σ
The robust function ρ favors a configuration with small residuals and gives a constant penalty for large residuals
Slide from S. Savarese
Choosing the scale: Just right
The effect of the outlier is minimized
Choosing the scale: Too small
The error value is almost the same for every point and the fit is very poor
Choosing the scale: Too large
Behaves much the same as least squares
Robust estimation: Details
Robust fitting is a nonlinear optimization problem that must be
solved iteratively
Least squares solution can be used for initialization
Scale of robust function should be chosen adaptively based on
median residual
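The details above can be sketched as iteratively reweighted least squares (IRLS). This is one standard way to minimize a robust objective, not necessarily the exact method used in the lecture; the Cauchy weight function and the toy data are illustrative assumptions:

```python
import numpy as np

# Line data with one gross outlier
x = np.linspace(0.0, 10.0, 20)
y = 2.0 * x + 1.0
y[15] += 50.0  # corrupt a single point

A = np.column_stack([x, np.ones_like(x)])

# Initialize with the ordinary least squares solution
p, *_ = np.linalg.lstsq(A, y, rcond=None)

# Iteratively reweighted least squares with Cauchy weights
# w_i = 1 / (1 + (r_i / sigma)^2), sigma set adaptively from the median residual
for _ in range(20):
    r = y - A @ p
    sigma = 1.4826 * np.median(np.abs(r)) + 1e-9  # robust scale estimate
    w = 1.0 / (1.0 + (r / sigma) ** 2)
    W = np.sqrt(w)
    p, *_ = np.linalg.lstsq(A * W[:, None], y * W, rcond=None)

m, b = p
```

Each iteration solves a weighted least squares problem; the outlier's weight collapses toward zero, so the final (m, b) is close to the true (2, 1) despite the corrupted point.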
Fitting and Alignment: Methods
Global optimization / Search for parameters
Least squares fit
Robust least squares
Other parameter search methods
Hypothesize and test
Generalized Hough transform
RANSAC
Other ways to search for parameters (for when no closed-form solution exists)

Line search
1. For each parameter, step through values and choose the value that gives the best fit
2. Repeat (1) until no parameter changes

Grid search
1. Propose several sets of parameters, evenly sampled in the joint set
2. Choose the best (or top few) and sample joint parameters around the current best; repeat

Gradient descent
1. Provide an initial position (e.g., random)
2. Locally search for better parameters by following the gradient
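The grid search recipe above, sketched for the two-parameter line fit (a toy example with assumed data; the ranges and grid sizes are arbitrary choices):

```python
import numpy as np

# Assumed toy data on the line y = 2x + 1
x = np.linspace(0.0, 10.0, 30)
y = 2.0 * x + 1.0

def sse(m, b):
    """Sum of squared errors of the line (m, b) on the data."""
    return np.sum((y - (m * x + b)) ** 2)

# Start with a coarse grid over a broad range, then repeatedly
# shrink the search window around the current best cell
m_lo, m_hi, b_lo, b_hi = -10.0, 10.0, -10.0, 10.0
for _ in range(6):
    ms = np.linspace(m_lo, m_hi, 21)
    bs = np.linspace(b_lo, b_hi, 21)
    errs = [(sse(m, b), m, b) for m in ms for b in bs]
    _, m_best, b_best = min(errs)
    dm, db = (m_hi - m_lo) / 10, (b_hi - b_lo) / 10
    m_lo, m_hi = m_best - dm, m_best + dm
    b_lo, b_hi = b_best - db, b_best + db
```

Coarse-to-fine refinement keeps the number of evaluations small while still converging on (m, b) ≈ (2, 1).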
Fitting and Alignment: Methods
Global optimization / Search for parameters
Least squares fit
Robust least squares
Other parameter search methods
Hypothesize and test
Generalized Hough transform
RANSAC
Hypothesize and test

1. Propose parameters
Try all possible
Each point votes for all consistent parameters
Repeatedly sample enough points to solve for parameters

2. Score the given parameters
Number of consistent points, possibly weighted by distance

3. Choose from among the set of parameters
Global or local maximum of scores

4. Possibly refine parameters using inliers
Fitting and Alignment: Methods
Global optimization / Search for parameters
Least squares fit
Robust least squares
Other parameter search methods
Hypothesize and test
Generalized Hough transform
RANSAC
Hough Transform: Outline
1. Create a grid of parameter values
2. Each point votes for a set of parameters, incrementing those values in the grid
3. Find the maximum or local maxima in the grid
(figure: a line y = m x + b in image space (x, y) corresponds to a point in parameter space (m, b))
Hough transform
Given a set of points, find the curve or line that explains the data points best
P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959
(figure: image space (x, y) and Hough space (m, b))
Slide from S. Savarese
Hough transform
(figure: each point votes in a discretized (m, b) accumulator grid; peaks mark lines supported by many points)
Slide from S. Savarese
Hough transform
Issue: parameter space [m, b] is unbounded…
P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959
Slide from S. Savarese
Hough transform
Issue: parameter space [m, b] is unbounded…
Use a polar representation for the parameter space: ρ = x cos θ + y sin θ
Slide from S. Savarese
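The polar voting scheme can be sketched as follows (a numpy-only toy; production code would typically run on a Canny edge map, e.g. via OpenCV's `cv2.HoughLines`):

```python
import numpy as np

def hough_lines(points, rho_res=1.0, theta_res=np.pi / 180):
    """Accumulate votes in (theta, rho) space for a set of 2-D points."""
    pts = np.asarray(points, dtype=float)
    rho_max = np.hypot(np.abs(pts[:, 0]).max(), np.abs(pts[:, 1]).max())
    thetas = np.arange(0, np.pi, theta_res)
    rhos = np.arange(-rho_max, rho_max + rho_res, rho_res)
    H = np.zeros((len(thetas), len(rhos)), dtype=int)
    for x, y in pts:
        # Each point votes along a sinusoid rho = x cos(theta) + y sin(theta)
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rho - rhos[0]) / rho_res).astype(int)
        H[np.arange(len(thetas)), idx] += 1
    return H, thetas, rhos

# Points on the horizontal line y = 5 should peak near theta = 90 deg, rho = 5
pts = [(float(x), 5.0) for x in range(20)]
H, thetas, rhos = hough_lines(pts)
ti, ri = np.unravel_index(np.argmax(H), H.shape)
```

The accumulator peak lands near θ = 90° and ρ = 5, recovering the line from its votes.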
Hough transform - experiments
(figures: features and their votes)
Slide from S. Savarese
Hough transform - experiments: noisy data
(figures: features and votes) Need to adjust grid size or smooth
Slide from S. Savarese
Hough transform - experiments
(figures: features and votes) Issue: spurious peaks due to uniform noise
Slide from S. Savarese
1. Image → Canny
2. Canny → Hough votes
3. Hough votes → Edges
Find peaks and post-process
Hough transform example
http://ostatic.com/files/images/ss_hough.jpg
Incorporating image gradients
Recall: when we detect an edge point, we also know its gradient direction
But this means that the line is uniquely determined!
Modified Hough transform:
    For each edge point (x, y):
        θ = gradient orientation at (x, y)
        ρ = x cos θ + y sin θ
        H(θ, ρ) = H(θ, ρ) + 1
    end
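A sketch of this single-vote variant (gradient orientations are assumed given here; in practice they would come from, e.g., Sobel derivatives):

```python
import numpy as np

def hough_lines_oriented(points, orientations, rho_res=1.0, theta_res=np.pi / 180):
    """Each edge point casts one vote at (its gradient angle theta, its rho)."""
    n_theta = int(round(np.pi / theta_res))
    pts = np.asarray(points, dtype=float)
    rho_max = np.hypot(np.abs(pts[:, 0]).max(), np.abs(pts[:, 1]).max())
    rhos = np.arange(-rho_max, rho_max + rho_res, rho_res)
    H = np.zeros((n_theta, len(rhos)), dtype=int)
    for (x, y), theta in zip(pts, orientations):
        rho = x * np.cos(theta) + y * np.sin(theta)
        ti = int(round(theta / theta_res)) % n_theta
        ri = int(round((rho - rhos[0]) / rho_res))
        H[ti, ri] += 1  # a single vote instead of a whole sinusoid of votes
    return H, rhos

# Horizontal line y = 5: the gradient is vertical, so theta = pi/2 everywhere
pts = [(float(x), 5.0) for x in range(20)]
H, rhos = hough_lines_oriented(pts, [np.pi / 2] * 20)
ti, ri = np.unravel_index(np.argmax(H), H.shape)
```

Because every point votes once instead of along an entire sinusoid, the accumulator is far sparser and the peak sharper than in the unoriented version.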
Finding lines using Hough transform
 
Using m,b parameterization
Using r, theta parameterization
Using oriented gradients
Practical considerations
Bin size
Smoothing
Finding multiple lines
Finding line segments
Hough Transform
How would we find circles?
Of fixed radius
Of unknown radius
Of unknown radius but with known edge orientation
Hough transform for circles
Conceptually equivalent procedure: for each (x,y,r),
draw the corresponding circle in the image and
compute its “support”
Hough transform for circles
(figure: a point (x, y) in image space maps to a cone of candidate centers in (x, y, r) Hough parameter space)
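For a known, fixed radius the voting is two-dimensional over centers; a toy sketch with synthetic edge points and an assumed radius:

```python
import numpy as np

def hough_circles_fixed_r(points, r, shape, n_angles=90):
    """Vote for circle centers (a, b) given a known radius r."""
    H = np.zeros(shape, dtype=int)
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for x, y in points:
        # Each edge point votes for every center at distance r from it
        a = np.round(x - r * np.cos(angles)).astype(int)
        b = np.round(y - r * np.sin(angles)).astype(int)
        ok = (a >= 0) & (a < shape[0]) & (b >= 0) & (b < shape[1])
        np.add.at(H, (a[ok], b[ok]), 1)
    return H

# Synthetic edge points on a circle of radius 10 centered at (30, 30)
ts = np.linspace(0, 2 * np.pi, 60, endpoint=False)
pts = np.column_stack([30 + 10 * np.cos(ts), 30 + 10 * np.sin(ts)])
H = hough_circles_fixed_r(pts, 10, (64, 64))
center = np.unravel_index(np.argmax(H), H.shape)
```

With an unknown radius the same voting runs over a 3-D (a, b, r) accumulator, which is exactly why grid size, and hence memory, grows so quickly with the number of parameters.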
Hough transform conclusions
 
Good
Robust to outliers: each point votes separately
Fairly efficient (much faster than trying all sets of parameters)
Provides multiple good fits
 
Bad
Some sensitivity to noise
Bin size trades off between noise tolerance, precision, and
speed/memory
Can be hard to find sweet spot
Not suitable for more than a few parameters
grid size grows exponentially
 
Common applications
Line fitting (also circles, ellipses, etc.)
Object instance recognition (parameters are affine transform)
Object category recognition (parameters are position/scale)