					José Luis Gordillo
Tecnológico de Monterrey

Development of Autonomous Vehicles at the Tecnológico de Monterrey
SSIR 2009

Specialized Applications

Agenda
Introduction
Preliminaries: Mobile Robots' Collaboration
Features: Navigation of Autonomous Vehicles (AVs)
Autonomous Vehicles: Design, Construction and Analysis
AV Model, Architecture and Method for Transformation
State Estimation and Data Fusion
AV Trajectory Following with Visual Monitoring
Conclusions and Considerations

Basic Representation of a Trajectory Following Execution
[Figure: X-Y workspace with Start, Goal, Landmarks and Obstacles; the vehicle pose is (x, y, θ) with steering angle φ and velocity v. Left: basic task; right: planning and control.]

Navigation: Basic Task Definition
Navigation: control the velocity and steering of the vehicle along a given trajectory
Trajectory planning: kinematic and dynamic models, a model of the environment, heuristic planning methods
Trajectory following: vehicle automation, dynamic environment, intensive sensor use

Mobile Robots’ Collaboration

H.H. González-Baños, J.L. Gordillo, J.C. Latombe, D. Lin, A. Sarmiento and C. Tomasi, "The Autonomous Observer: A tool for remote experimentation in Robotics", 1999 SPIE Int. Symp. on Intelligent Systems and Advanced Manufacturing, Boston, U.S.A., Telemanipulator and Telepresence Technologies VI, Matthew Stein (Ed.), SPIE Proc. 3840, November 1999.
G. Hermosillo, A. Sarmiento and J.L. Gordillo, "Real-time differential geometry surface description in range images", Computación y Sistemas, Vol. 3, No. 3, Mexico, pp. 157-168, 2000.

Navigation with sensory primitives and landmarks

Autonomous Observer
The utilitarian robot performs the task (can collection); the observer monitors the correct execution of the task for the user.

Autonomous Vehicles and Mobile Robots
Autonomous Vehicles (AVs) and mobile robots are developed to carry out specific tasks autonomously. A mobile robot is designed from scratch for a specific task, whereas an AV is a commercial vehicle to which suitable sensory and control systems are added to make it autonomous.

Functional Autonomous Vehicle Model
[Diagram: the autonomous vehicle receives a task from task planning; its control system comprises velocity control, steering control, state estimation and feature extraction, supporting path following and obstacle avoidance; it interacts with the environment through actuation, internal sensing and perception sensing.]

Features of Autonomous Vehicles
Navigation in semi-structured and dynamic environments
Knowledge about the environment
Intensive sensor use
Collision detection and avoidance
Actuation in hostile working conditions
Control through a computer program
Autonomy during task execution

Agenda
Introduction
Preliminaries: Mobile Robots' Collaboration
Features: Navigation of Autonomous Vehicles (AVs)
Autonomous Vehicles: Design, Construction and Analysis
AV Model, Architecture and Method for Transformation
State Estimation and Data Fusion
AV Trajectory Following with Visual Monitoring
Conclusions and Considerations

Autonomous Vehicles:
Design, Construction and Analysis

J. Gutiérrez, J.L. Gordillo and I. López, "Configuration and construction of an Autonomous Vehicle for a mining task", Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2004), New Orleans, U.S.A., April 2004, IEEE Press, pp. 2010-2016.
J. Gutiérrez, C. Apostolopoulos and J.L. Gordillo, "Numerical comparison of steering geometries for robotic vehicles by modeling position error", Autonomous Robots, Springer, Vol. 23, No. 2, pp. 147-159, August 2007. http://www.springerlink.com/content/2j8050583n272528/

Topographic Reconstruction of a Mine
[Figure: 3-D reconstructed point cloud of a mine tunnel in x, y, z coordinates.]

Preliminaries: Feasibility Analysis
Analysis for "Servicios Industriales Peñoles S.A. de C.V."
Implementation of an Autonomous Vehicle for the 3-D topographic reconstruction of a mine
Review of the impact of Autonomous Vehicles on the mining industry
Low-cost analysis

Analysis Scope
Control forward and reverse motion, but not the steering
Control the vehicle velocity
3-D topographic reconstruction of the roof of the mine
Communication between the user and the vehicle

Design of a Modular Architecture
User controller (guest): task definition, planning and monitoring
Vehicle controller (host): autonomous task execution
[Diagram: guest and host controllers linked over an RF channel.]

Design of a Sequential Architecture
(basic)
[Diagram: the user controller (guest) and the vehicle controller (host) communicate over RF; on the host, an advance-control module drives the motors and reads the encoder and sonar, while a 3-D sensor module drives a laser telemeter mounted on a pan-tilt unit (PTU).]

Device Integration

[Photo: integrated devices - sonar, RF modem, Pentium PC, laser rangefinder, and pan-tilt unit with stepper motors.]

3-D Data from “Las Golodrinas”
(a Peñoles mine in Guanajuato)

Autonomy Analysis:
Steering Schemes
[Figure: three steering schemes, each with its instantaneous center of curvature (ICC), turning radius R, velocity V, reference point (x_r, y_r) and orientation θ: articulated (link lengths l_1, l_2, articulation angle φ), Ackermann (wheelbase L, steering angle φ) and explicit steering.]

Autonomy Analysis: Positioning (Ackermann)
[Figure: Ackermann kinematics with and without wheel slip; slip introduces the angles α and β and changes the turning radius from R to R_SLIP.]

No slip:
\dot{x}_r = V\cos\theta, \quad \dot{y}_r = V\sin\theta, \quad \dot{\theta} = \frac{V\sin\phi}{L\cos\phi}

With slip:
\dot{x}_r = V\cos(\theta+\alpha), \quad \dot{y}_r = V\sin(\theta+\alpha), \quad \dot{\theta} = \frac{V\sin(\beta-\alpha+\phi)}{L\cos(\phi+\beta)}

Autonomy Analysis: Positioning (articulated)
[Figure: articulated vehicle with link lengths l_1 and l_2, articulation angle φ, ICC, turning radius R (R_SLIP under slip) and slip angles α, β.]

No slip:
\dot{x}_r = V\cos\theta, \quad \dot{y}_r = V\sin\theta, \quad \dot{\theta} = \frac{V\sin\phi - l_1\dot{\phi}}{l_1 + l_2\cos\phi}

With slip:
\dot{x}_r = V\cos(\theta+\alpha), \quad \dot{y}_r = V\sin(\theta+\alpha), \quad \dot{\theta} = \frac{V\sin(\beta-\alpha+\phi) + l_1\dot{\phi}\cos\beta}{l_1\cos\beta + l_2\cos(\phi+\beta)}

Autonomy Analysis: Positioning (explicit)
[Figure: explicit (double) steering with both axles steered by φ, wheelbase L, ICC, turning radius R (R_SLIP under slip) and slip angles α, β.]

No slip:
\dot{x}_r = V\cos\theta, \quad \dot{y}_r = V\sin\theta, \quad \dot{\theta} = \frac{V\sin 2\phi}{L\cos\phi}

With slip:
\dot{x}_r = V\cos(\theta+\alpha), \quad \dot{y}_r = V\sin(\theta+\alpha), \quad \dot{\theta} = \frac{V\sin(\beta-\alpha+2\phi)}{L\cos(\phi+\beta)}

Autonomy Analysis: Positioning Error

Position error rates PER_AS, PER_RS and PER_ES of the three steering schemes, each of the form 1 - (numerator)/(denominator), with k_\beta = \beta/\phi, k_\alpha = \alpha/\phi and r = l_1/l_2:

1 - \frac{[\,r\cos(k_\beta\phi) + \cos(\phi[1+k_\beta])\,]\sin\phi}{[\,r+\cos\phi\,]\,\sin(\phi[k_\beta - k_\alpha + 1])}

1 - \frac{\cos(\phi[1+k_\beta])\,\sin\phi}{\cos\phi\,\sin(\phi[k_\beta - k_\alpha + 1])}

1 - \frac{\cos(\phi[1+k_\beta])\,\sin 2\phi}{\cos\phi\,\sin(\phi[k_\beta - k_\alpha + 2])}

Slip cases considered:

Case       k_α (%)   k_β (%)
(a)         66.19     00.00
(b)         00.00     16.62
(c) α > β   66.19     16.62
(d) α = β   16.62     16.62
(e)         16.62     00.00

Autonomy Analysis: Positioning
[Plots: simulated position error of the three steering schemes under the slip cases (a)-(e).]

Agenda
Introduction
Preliminaries: Mobile Robots' Collaboration
Features: Navigation of Autonomous Vehicles (AVs)
Autonomous Vehicles: Design, Construction and Analysis
AV Model, Architecture and Method for Transformation
State Estimation and Data Fusion
AV Trajectory Following with Visual Monitoring
Conclusions and Considerations

AV Model, Architecture and Method for Transformation


Scale Vehicles: “Paquito”, “Mayito”
Design, construction and experimentation of a low-cost AV architecture

Vehicle to Automate:
Super Truck Johnson Industries: “Johnny”


Vehicle Model and Version
Ackermann mechanical architecture (steering axis)
Simplified bicycle model
Non-holonomic constraints
Radio Shack Tempest; maximum speed: 6 m/s

Ackermann Mechanical Architecture: Bicycle Model
[Figure: bicycle model with rear reference point (x_p, y_p), front (directive) point (x_d, y_d), orientation θ, steering angle φ, wheelbase L and turning radius R.]

x_d = x + L\cos\theta, \quad y_d = y + L\sin\theta, \quad R = \frac{L}{\tan\phi}

Considerations: slow velocities, low lateral acceleration, no wheel slippage, Ackermann geometry.

Vehicle Model
θ: vehicle orientation with respect to the horizontal (x) axis
φ: wheel (steering) angle with respect to the vehicle axis
L: distance between the front and rear wheels
R: turning radius

R = \frac{L}{\tan\phi}

Kinematic Model: Increments Approach

\begin{bmatrix}\dot{x}_p \\ \dot{y}_p \\ \dot{\theta} \\ \dot{\phi}\end{bmatrix} =
\begin{bmatrix}\cos\theta \\ \sin\theta \\ \tan\phi / L \\ 0\end{bmatrix} v_1 +
\begin{bmatrix}0 \\ 0 \\ 0 \\ 1\end{bmatrix} v_2

\Delta x = R\,(\sin(\theta + \Delta\theta) - \sin\theta), \quad
\Delta y = R\,(\cos\theta - \cos(\theta + \Delta\theta)), \quad
\Delta\theta = \frac{\Delta d}{R} = \frac{\Delta d\,\tan\phi}{L}

v_1: translational velocity; v_2: steering velocity; Δd: traveled distance; Δθ: orientation change.
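A minimal dead-reckoning sketch of the increment equations above (illustrative only, not code from the slides): given a traveled distance Δd and a steering angle φ, it computes Δθ, Δx and Δy with the exact arc relations; the wheelbase L = 1.25 m is the value reported later for the experiments, the step values are made up.

```python
# Dead-reckoning step using the exact arc increments (straight-line fallback
# when the steering angle is essentially zero).
import math

def odometry_step(x, y, theta, delta_d, phi, L=1.25):
    """Propagate the pose (x, y, theta) by a traveled distance delta_d
    with steering angle phi (radians) and wheelbase L (meters)."""
    if abs(phi) < 1e-9:                      # straight segment
        return x + delta_d * math.cos(theta), y + delta_d * math.sin(theta), theta
    R = L / math.tan(phi)                    # turning radius, R = L / tan(phi)
    d_theta = delta_d / R                    # = delta_d * tan(phi) / L
    dx = R * (math.sin(theta + d_theta) - math.sin(theta))
    dy = R * (math.cos(theta) - math.cos(theta + d_theta))
    return x + dx, y + dy, theta + d_theta

# Usage: drive 100 steps of 0.2 m with a constant 1.125-degree steering angle.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = odometry_step(*pose, delta_d=0.2, phi=math.radians(1.125))
print(pose)
```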

Architecture Requirements
Programmability
Design based on a distributed and hierarchical architecture (modularity): parallel processing, centralized monitor
Application in dynamic environments (flexibility): real-time response, asynchronous communication
Scalable design and performance

LAAS Architecture
[Diagram: the LAAS control architecture.]

Control Architecture
[Diagram: the user supplies the trajectory as waypoints (x1, y1), (x2, y2), ..., (xn, yn). Trajectory control selects a control point, computes the heading error e_θ and the desired velocity v_d, and issues steering (φ_s) and velocity setpoints; the velocity and steering controllers (errors e_v, e_s; commands m_v, m_s) drive the velocity and steering plants. A reactive module reads the sonars, and a parameter-estimation block integrates the vehicle kinematics model to produce (x, y, θ, φ). [G. Palacios 2000]]

AV Architecture
[Diagram: modules - kinematic model, environment model, trajectory planning, dynamic trajectory planning, trajectory control (trajectory planning and trajectory following), position estimation, sensor processing, point-to-point control, obstacle avoidance, velocity control, steering control, and multiple sensors.]

AV Architecture (detail)
[Diagram: the trajectory planner combines the vehicle's kinematic model (state x, y, θ, φ; inputs v, R) with the environment model to produce the waypoint list P = {p1, p2, p3, ...} = {(x1, y1), (x2, y2), ...}. Trajectory control feeds point-to-point control and obstacle avoidance with the waypoints p_i = (x_i, y_i). State estimation filters the predicted state and predicted measurements (x, y, θ, φ) against the extracted features and the environment model to obtain the estimated state from sensor processing. At the low level, a velocity controller (digital controller and digital potentiometer) and a direction controller (digital controller) drive the velocity and direction plants through interfaces and a power board, with feedback from the speed motor controller, optical encoder, absolute sensors, security sensors, incremental sensors and other sensors.]

Control Architecture: Diverse Sensors
Internal sensors: odometry, accelerometer, potentiometer, gyroscope
External sensors: GPS, electronic compass
Estimated quantities: (x, y) position, θ orientation, V velocity

Control Architecture: Sensor Integration
Sensor fusion with an Extended Kalman Filter (EKF)
Redundant and competitive sensors
Imprecise and noisy sensors
Kalman filtering of the data
Statistical normalization for fusion
[Diagram: each measurement stream (orientation, velocity, position) is propagated through the kinematic model before fusion.]

Sensor Fusion
Fusion method properties: different sources, non-linearity, autocorrelation
[Diagram: a cascade of fusion stages combines the estimates (P_1, Cov_1) ... (P_5, Cov_5) from the different sources.]

Odometry (steering and velocity odometers): P_od, θ_od
Accelerometers: v_x = \int A_x\,dt, \; v_y = \int A_y\,dt, \; d_x = \int v_x\,dt, \; d_y = \int v_y\,dt
Gyroscope / compass: \theta = \int v_\theta\,dt
GPS: P_{GPS} = T_{GPS} D_{GPS}
External camera: P_{cam} = T_{cam} D_{cam}
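One common way to combine two redundant position estimates, and the weighting idea behind the EKF update and the "statistical normalization" above, is to weight each by its inverse covariance. A minimal NumPy sketch under an independence assumption (the numbers are made up, this is not the deck's implementation):

```python
# Inverse-covariance (information) fusion of two independent position estimates.
import numpy as np

def fuse(p_a, cov_a, p_b, cov_b):
    """Fuse estimates (p_a, cov_a) and (p_b, cov_b), assumed independent."""
    info_a, info_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov_f = np.linalg.inv(info_a + info_b)          # fused covariance
    p_f = cov_f @ (info_a @ p_a + info_b @ p_b)     # fused mean
    return p_f, cov_f

# Example: a drifting odometry estimate fused with a noisier but unbiased GPS fix.
p_odo, cov_odo = np.array([10.2, 4.9]), np.diag([4.0, 0.5])
p_gps, cov_gps = np.array([9.5, 5.3]), np.diag([1.0, 1.0])
print(fuse(p_odo, cov_odo, p_gps, cov_gps))
```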

Modular Architecture
Architecture features: generic, hierarchical, modular, adaptable, efficient
[Diagram: master boards, terminal boards and devices.]

Physical Architecture
[Diagram: the user and the trajectory control communicate with the onboard modules (steering control, velocity control, steering power, steering sensor, electrical-signals control) and with the sensors (GPS, compass, gyro).]

Utilitarian Real Scale Vehicle

Automation Methodology
[Flowchart: configuration input → analysis → characterization → design → simulation (with modifications) → tests and validations → refinements and corrections → implementation (with modifications) → lab tests and validations → refinements and corrections → field tests and validations → output: prototype.]

Automation Methodology
Analysis
Characterization: solid modeling and simulation; state of the art on systems and processing
Design and instrumentation: solid modeling and simulation
Instrumentation and modifications
Test and validation
Refinement and correction (cycle)

Vehicle to Automate:
Super Truck Johnson Industries: “Johnny”


Analysis: Specifications
• Rear traction motor, 5 hp
• 301 (length) x 134 (width) x 105 (height) cm
• Ackermann bar
• Right turning radius 1.17 ± 0.01 m
• Left turning radius 1.76 ± 0.01 m
• Required torque at the steering drive: 10 to 15 Nm

Analysis: Solid Modeling and Simulation
Mechanical and Geometrical Modeling


Modified Model
[Diagram: the modified vehicle with steering actuator, steering sensor, velocity sensor, GPS, compass and gyroscope, and the onboard software running the steering controller, velocity controller, path following and position estimation.]

Sludge: Robot

Agenda
Introduction
Preliminaries: Mobile Robots' Collaboration
Features: Navigation of Autonomous Vehicles (AVs)
Autonomous Vehicles: Design, Construction and Analysis
AV Model, Architecture and Method for Transformation
State Estimation and Data Fusion
AV Trajectory Following with Visual Monitoring
Conclusions and Considerations


State Estimation and Data Fusion

J.M. Mirats-Tur, J.L. Gordillo and C. Albores, "A Close form for expressing the uncertainty in odometry position estimate. Application to an Autonomous Vehicle", IEEE Transactions on Robotics, Vol. 21, No. 5, pp. 1017-1022, October 2005.
J.M. Mirats-Tur, C. Albores and J.L. Gordillo, "Communication to: A Close form for expressing the uncertainty in odometry position estimate. Application to an Autonomous Vehicle", accepted for publication, IEEE Transactions on Robotics, 2007.
C. Albores, J.M. Mirats-Tur and J.L. Gordillo, "Accurate position uncertainty estimation and propagation for Autonomous Vehicles", IEEE Robotics and Automation Magazine, Vol. 16, No. 2, pp. 82-90, June 2009.

AV Architecture (repeated from the earlier slide).

Estimation Principle
Prior estimate → sensed measurement → filtered, new estimate

The state-estimation element is a fundamental module of the architecture; a more precise estimation and error model can improve the vehicle's performance.

State Estimation and Data Fusion
Model Estimation: approximation models
Uncertainty Propagation: presented formulae (first approach, cross terms), Jacobian (EKF), Unscented Transformation (UKF)
Data Fusion: proposed scheme, Extended Kalman Filter, probabilistic approach (Pozo Ruz), Covariance Intersection Algorithm

Model Approximation: Odometry

Exact arc increments:
\Delta x = R(\sin(\theta+\Delta\theta) - \sin\theta), \quad
\Delta y = R(\cos\theta - \cos(\theta+\Delta\theta)), \quad
\Delta\theta = \frac{\Delta d}{R} = \frac{\Delta d\,\tan\phi}{L}

Δd: traveled distance in a sample time; Δθ: orientation change in a sample time.

Chord approximation, \Delta x = \Delta d\cos(\theta+\tau), \; \Delta y = \Delta d\sin(\theta+\tau), for three choices of τ:

No angle (τ = 0):
\Delta x = \Delta d\cos\theta, \quad \Delta y = \Delta d\sin\theta, \quad \Delta\theta = \frac{\Delta d\,\tan\phi}{L}

Half angle (τ = Δθ/2):
\Delta x = \Delta d\cos(\theta + \Delta\theta/2), \quad \Delta y = \Delta d\sin(\theta + \Delta\theta/2), \quad \Delta\theta = \frac{\Delta d\,\tan\phi}{L}

Full angle (τ = Δθ):
\Delta x = \Delta d\cos(\theta + \Delta\theta), \quad \Delta y = \Delta d\sin(\theta + \Delta\theta), \quad \Delta\theta = \frac{\Delta d\,\tan\phi}{L}

Expanding the half-angle form,
\Delta x = \Delta d(\cos\theta\cos(\Delta\theta/2) - \sin\theta\sin(\Delta\theta/2)), \quad
\Delta y = \Delta d(\sin\theta\cos(\Delta\theta/2) + \cos\theta\sin(\Delta\theta/2)),
and, for Δθ very small (\cos(\Delta\theta/2)\approx 1, \sin(\Delta\theta/2)\approx \Delta\theta/2),
\Delta x = \Delta d\left(\cos\theta - \frac{\Delta\theta}{2}\sin\theta\right), \quad
\Delta y = \Delta d\left(\sin\theta + \frac{\Delta\theta}{2}\cos\theta\right).

Approximation (Δθ = 20°)
[Plot: one simulated step compared for the half-angle, full-angle and no-angle approaches against the real arc.]
The half-angle approach and its approximation perform better than the full-angle and no-angle approaches.
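This comparison can be reproduced in a few lines. The sketch below (not the slides' code) measures the one-step error of each chord approximation against the exact arc for Δθ = 20°, using the error-percentage definition given on the following slide; the step length is arbitrary.

```python
# One-step error of the three chord approximations versus the exact arc.
import math

def exact(theta, d, dtheta):
    R = d / dtheta                                    # turning radius for this step
    return (R * (math.sin(theta + dtheta) - math.sin(theta)),
            R * (math.cos(theta) - math.cos(theta + dtheta)))

def chord(theta, d, tau):
    return d * math.cos(theta + tau), d * math.sin(theta + tau)

theta, d, dtheta = 0.0, 1.0, math.radians(20.0)
xe, ye = exact(theta, d, dtheta)
for name, tau in [("no angle", 0.0), ("half angle", dtheta / 2), ("full angle", dtheta)]:
    xa, ya = chord(theta, d, tau)
    err = 100.0 * math.hypot(xe - xa, ye - ya) / d    # error as % of traveled distance
    print(f"{name:10s}: {err:.3f} % of the traveled distance")
```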

Approximation Error

\mathrm{Error}\,\% = \frac{\sqrt{(x_{real} - x_{calc})^2 + (y_{real} - y_{calc})^2}}{\Delta d} \times 100

[Plots: error percentage of the no-angle, full-angle, half-angle and approximated half-angle updates as a function of Δθ; left, Δθ from 0° to 120°; right, detail for Δθ from 0° to 10°.]

State Estimation and Data Fusion
Model Estimation: approximation models
Uncertainty Estimation and Propagation: presented formulae (first approach, cross terms), Jacobian (EKF), Unscented Transformation (UKF)
Data Fusion: proposed scheme, Extended Kalman Filter, probabilistic approach (Pozo Ruz), Covariance Intersection Algorithm

Estimation with Uncertainty

\hat{P}(t) = \begin{bmatrix} 1 & 0 & 0 & \hat{x}(t-1) \\ 0 & 1 & 0 & \hat{y}(t-1) \\ 0 & 0 & 1 & \hat{\theta}(t-1) \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \Delta\hat{x}(t) \\ \Delta\hat{y}(t) \\ \Delta\hat{\theta}(t) \\ 1 \end{bmatrix}

\Delta\hat{x}(t) = \Delta\hat{d}(t)\left(\cos\hat{\theta}(t-1) - \frac{\Delta\hat{\theta}(t)}{2}\sin\hat{\theta}(t-1)\right)
\Delta\hat{y}(t) = \Delta\hat{d}(t)\left(\sin\hat{\theta}(t-1) + \frac{\Delta\hat{\theta}(t)}{2}\cos\hat{\theta}(t-1)\right)
\Delta\hat{\theta}(t) = \frac{\Delta\hat{d}(t)\,\tan\hat{\phi}(t)}{L}

\Delta\hat{d}(t) = \Delta d(t) + \varepsilon_d(t), \quad \hat{\phi}(t) = \phi(t) + \varepsilon_\phi(t)

Previous Works: Error Modeling and Uncertainty Propagation
Odometry errors can be classified as systematic or non-systematic; in [Borenstein, 1994] a calibration technique called the UMBmark test was developed to calibrate the system.
The uncertainty of a model is commonly given by its covariance matrix. [Kleeman, 1997] sums the noise analytically over the entire path length to produce simple closed-form expressions.
[Najku et al., 2004] propose a method that estimates the covariance matrix from empirical data; it is based on the PC method and on sensor-based navigation through the Generalized Voronoi Graph (GVG).
Kelly, A., "Linearized Error Propagation in Odometry", International Journal of Robotics Research, Vol. 23, No. 2, Feb. 2004, pp. 179-218.
Martinelli, A. and Siegwart, R., "Estimating the Odometry Error of a Mobile Robot during Navigation", Proceedings of the European Conference on Mobile Robots, 2003.
Doh, N. L. and Chung, W. K., "A Systematic Representation Method of the Odometry Uncertainty of Mobile Robots", Intelligent Automation and Soft Computing, 2005, Vol. 11, No. 10, pp. 1-13.

Covariance Propagation

Cov(\hat{P}(t)) = Cov(\hat{P}(t-1)) + Cov(\hat{P}(t-1), \Delta\hat{P}(t)) + Cov(\Delta\hat{P}(t), \hat{P}(t-1)) + Cov(\Delta\hat{P}(t))

Cov(\Delta\hat{P}(t)) = E[\Delta\hat{P}(t)\,\Delta\hat{P}(t)^T] - E[\Delta\hat{P}(t)]\,E[\Delta\hat{P}(t)]^T
\mathrm{cov}(\Delta\hat{x}(t)) = E[\Delta\hat{x}(t)\,\Delta\hat{x}(t)] - E[\Delta\hat{x}(t)]\,E[\Delta\hat{x}(t)]

Considering independence between the variables:
Cov(\hat{P}(t-1), \Delta\hat{P}(t)) = Cov(\Delta\hat{P}(t), \hat{P}(t-1)) = 0
Cov(\hat{P}(t)) = Cov(\hat{P}(t-1)) + Cov(\Delta\hat{P}(t))
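Under the independence assumption the propagation is just an accumulation of the increment covariance. A minimal sketch (illustrative per-step covariance values, not from the slides):

```python
# Accumulate pose covariance assuming the previous pose and the current
# increment are independent, i.e. Cov(P_t) = Cov(P_{t-1}) + Cov(dP_t).
import numpy as np

def propagate(cov_prev, cov_increment):
    """Independence assumption: the cross terms vanish."""
    return cov_prev + cov_increment

cov = np.zeros((3, 3))                        # initial pose known exactly
cov_step = np.diag([1e-4, 1e-4, 1e-6])        # illustrative per-step covariance in (x, y, theta)
for _ in range(2000):                         # 2000 steps, as in the experiments below
    cov = propagate(cov, cov_step)
print(np.diag(cov))                           # uncertainty grows linearly under this assumption
```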

Covariance Propagation

\Delta\hat{x} = \Delta\hat{d}\left(\cos\theta - \frac{\Delta\hat{\theta}}{2}\sin\theta\right), \quad
\Delta\hat{y} = \Delta\hat{d}\left(\sin\theta + \frac{\Delta\hat{\theta}}{2}\cos\theta\right), \quad
\Delta\hat{\theta} = \frac{\Delta\hat{d}\,\tan\hat{\phi}}{L}

\varepsilon_{\Delta d_t} \sim N(0, \sigma_{\Delta d_t}), \quad \varepsilon_{\phi_t} \sim N(0, \sigma_{\phi_t})

E[\Delta\hat{x}] = E\!\left[\Delta\hat{d}\left(\cos\theta - \frac{\Delta\hat{\theta}}{2}\sin\theta\right)\right]
= E[\Delta\hat{d}\cos\theta] + E\!\left[-\Delta\hat{d}\,\frac{\Delta\hat{\theta}}{2}\sin\theta\right]
= \Delta d\cos\theta - (\Delta d^2 + \sigma_d^2)\,\frac{\sin\theta}{2L}\tan(\phi + \varepsilon_\phi^{max})
= \Delta d\cos\theta - \frac{k_1\sin\theta}{2L}\tan(\phi + \varepsilon_\phi^{max})

E[\Delta P_t^{od}] = \begin{pmatrix}
\Delta d_t\cos\theta_{t-1} - \frac{k_1}{2L}\sin\theta_{t-1}\tan(\phi_t + \varepsilon_{max}^{\phi}) \\
\Delta d_t\sin\theta_{t-1} + \frac{k_1}{2L}\cos\theta_{t-1}\tan(\phi_t + \varepsilon_{max}^{\phi}) \\
\frac{\Delta d_t}{L}\tan(\phi_t + \varepsilon_{max}^{\phi})
\end{pmatrix}

k_1 = \Delta d_t^2 + \sigma_{\Delta d_t}^2, \quad
k_2 = \frac{1}{L}\left(\Delta d_t^3 + 3\Delta d_t\,\sigma_{\Delta d_t}^2\right)

E[\Delta\hat{x}_t\,\Delta\hat{x}_t^T] = k_1\cos^2\theta_{t-1} - k_2\sin\theta_{t-1}\cos\theta_{t-1}\tan(\phi_t + \varepsilon_{max}^{\phi})

Covariance Propagation: General Considerations
An error in θ is considered: \hat{\theta}_{t-1} = \theta_{t-1} + \varepsilon_{\theta_{t-1}}
Cross-covariance terms are developed
Closed expressions of trigonometric functions of Gaussian errors are obtained using a Taylor expansion

Particular consideration: tan(φ) approximation
\tan\hat{\phi}_t = \tan(\phi_t + \varepsilon_{\phi_t})
= \frac{\tan\phi_t + \tan\varepsilon_{\phi_t}}{1 - \tan\phi_t\tan\varepsilon_{\phi_t}}
= \tan\phi_t + \frac{\tan\varepsilon_{\phi_t}\sec^2\phi_t}{1 - \tan\phi_t\tan\varepsilon_{\phi_t}}
\approx \tan\phi_t + \tan\varepsilon_{\phi_t}\sec^2\phi_t

Application Example Without Considering Correlation

E[\Delta\hat{x}_t] = \left(\Delta d_t\cos\theta_{t-1} - \frac{k_1}{2L}\sin\theta_{t-1}\tan\phi_t\right)E[\cos\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{y}_t] = \left(\Delta d_t\sin\theta_{t-1} + \frac{k_1}{2L}\cos\theta_{t-1}\tan\phi_t\right)E[\cos\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{\theta}_t] = \frac{\Delta d_t\tan\phi_t}{L}

k_1 = \Delta d_t^2 + \sigma_{\Delta d_t}^2, \quad
k_2 = \frac{\Delta d_t^3 + 3\Delta d_t\,\sigma_{\Delta d_t}^2}{L}, \quad
k_3 = \Delta d_t^4 + 6\Delta d_t^2\,\sigma_{\Delta d_t}^2 + 3\sigma_{\Delta d_t}^4, \quad
k_4 = \tan^2\phi_t + \sec^4\phi_t\,\sigma_{\phi_t}^2, \quad
k_5 = \frac{k_1}{2} + \frac{k_3 k_4}{8L^2}

E[\Delta\hat{x}_t\Delta\hat{x}_t] = k_5 + \left(\cos(2\theta_{t-1})(k_1 - k_5) - \frac{k_2}{2}\tan\phi_t\sin(2\theta_{t-1})\right)E[\cos 2\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{y}_t\Delta\hat{y}_t] = k_5 + \left(\cos(2\theta_{t-1})(k_5 - k_1) + \frac{k_2}{2}\tan\phi_t\sin(2\theta_{t-1})\right)E[\cos 2\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{x}_t\Delta\hat{y}_t] = \left[(k_1 - k_5)\sin(2\theta_{t-1}) + \frac{k_2}{2}\cos(2\theta_{t-1})\tan\phi_t\right]E[\cos 2\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{x}_t\Delta\hat{\theta}_t] = \left[\frac{k_1}{L}\cos\theta_{t-1}\tan\phi_t - \frac{k_2 k_4}{2L}\sin\theta_{t-1}\right]E[\cos\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{y}_t\Delta\hat{\theta}_t] = \left[\frac{k_1}{L}\sin\theta_{t-1}\tan\phi_t + \frac{k_2 k_4}{2L}\cos\theta_{t-1}\right]E[\cos\varepsilon_{\theta_{t-1}}]
E[\Delta\hat{\theta}_t\Delta\hat{\theta}_t] = \frac{k_1 k_4}{L^2}

E[\sin\varepsilon] = 0, \quad
E[\cos\varepsilon_{\theta_{t-1}}] = \exp\!\left(-\frac{\sigma_{\theta_{t-1}}^2}{2}\right), \quad
E[\cos 2\varepsilon_{\theta_{t-1}}] = \exp\!\left(-2\sigma_{\theta_{t-1}}^2\right)
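The last two moments are the key closed forms; they can be checked numerically in a couple of lines (a quick sampling check, not part of the original derivation; the 3-degree standard deviation matches the experimental σ_φ used later):

```python
# Numerical check of E[cos(e)] = exp(-s^2/2) and E[cos(2e)] = exp(-2 s^2)
# for a zero-mean Gaussian error e with standard deviation s.
import numpy as np

rng = np.random.default_rng(0)
s = np.radians(3.0)                            # 3-degree heading noise
e = rng.normal(0.0, s, size=1_000_000)
print(np.cos(e).mean(), np.exp(-s**2 / 2))     # both close to 0.99863
print(np.cos(2 * e).mean(), np.exp(-2 * s**2)) # both close to 0.99453
```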

Covariance (y, y) Without Considering Correlation
[Plots: Cov(y, y) versus traveled distance (0-200 m), comparing the formulas presented above with an odometry Monte Carlo simulation; without the cross terms the two curves differ noticeably, motivating the crossed covariance terms below.]

\mathrm{Cov}(\hat{P}_t) = \mathrm{Cov}(\hat{P}_{t-1}) + \mathrm{Cov}(\Delta\hat{P}_t) + \mathrm{Cov}(\hat{P}_{t-1}, \Delta\hat{P}_t^T) + \mathrm{Cov}(\Delta\hat{P}_t, \hat{P}_{t-1}^T)

Crossed Covariance Terms

\mathrm{Cov}(\hat{P}_t) = \mathrm{Cov}(\hat{P}_{t-1}) + \mathrm{Cov}(\Delta\hat{P}_t) + \mathrm{Cov}(\hat{P}_{t-1}, \Delta\hat{P}_t^T) + \mathrm{Cov}(\Delta\hat{P}_t, \hat{P}_{t-1}^T)

\hat{P}_{t-1} = P_{t-1} + \varepsilon_{P_{t-1}}, \quad
\varepsilon_{P_{t-1}} = [\varepsilon_x \;\; \varepsilon_y \;\; \varepsilon_\theta]^T, \quad
\mathrm{Cov}(\hat{P}_{t-1}) = \mathrm{Cov}(\varepsilon_{P_{t-1}})

A hypothesis is made to represent this error vector: three independent, zero-mean, Gaussian and orthogonal errors ε_a, ε_b, ε_c, multiplied by suitable constants, equal the error in each of the considered axes x, y, θ.

Since Cov(\hat{P}_{t-1}) is a positive semi-definite Hermitian matrix, it can be decomposed as

\mathrm{Cov}(\hat{P}_{t-1}) = Q\,\lambda\,Q^T

where Q is the eigenvector matrix and λ is the diagonal matrix containing the eigenvalues (σ_a, σ_b, σ_c):

Q = \begin{bmatrix} A & B & C \\ D & E & F \\ G & H & I \end{bmatrix}, \quad
\lambda = \mathrm{cov}\!\left([\varepsilon_a \;\; \varepsilon_b \;\; \varepsilon_c]^T, [\varepsilon_a \;\; \varepsilon_b \;\; \varepsilon_c]^T\right)

\varepsilon_{P_{t-1}} = Q\,[\varepsilon_a \;\; \varepsilon_b \;\; \varepsilon_c]^T, \quad \text{e.g. } \varepsilon_x = A\varepsilon_a + B\varepsilon_b + C\varepsilon_c
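The decomposition is a plain eigendecomposition of the pose covariance. A short NumPy sketch (the covariance values are illustrative, not from the slides) showing how independent errors recombine into the correlated pose error:

```python
# Decompose a pose covariance as Q diag(l) Q^T and express the correlated
# pose error as a mix of three independent, zero-mean errors e_a, e_b, e_c.
import numpy as np

cov_prev = np.array([[0.30, 0.05, 0.01],
                     [0.05, 0.20, 0.02],
                     [0.01, 0.02, 0.05]])      # illustrative Cov(P_{t-1}) in (x, y, theta)

eigvals, Q = np.linalg.eigh(cov_prev)           # symmetric PSD: Cov = Q diag(eigvals) Q^T
assert np.allclose(Q @ np.diag(eigvals) @ Q.T, cov_prev)

rng = np.random.default_rng(1)
e_abc = rng.normal(0.0, np.sqrt(eigvals), size=(100_000, 3))   # independent errors
e_xyz = e_abc @ Q.T                             # correlated errors e_x, e_y, e_theta
print(np.cov(e_xyz.T))                          # sample covariance approximates cov_prev
```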

Application Example: Crossed Covariance Terms

\mathrm{cov}(\varepsilon_{x_{t-1}}, \Delta x_t^T)
= -E[\varepsilon_{x_{t-1}}\sin\varepsilon_{\theta_{t-1}}]\;E\!\left[\Delta\hat{d}_t\sin\!\left(\theta_{t-1} + \frac{\Delta\hat{\theta}_t}{2}\right)\right]
= -\mathrm{cov}(x_{t-1}, \theta_{t-1}^T)\exp\!\left(-\frac{\sigma_{\theta_{t-1}}^2}{2}\right)\left(\Delta d_t\sin\theta_{t-1} + \frac{k_1}{2L}\cos\theta_{t-1}\tan\phi_t\right)
= -\mathrm{cov}(x_{t-1}, \theta_{t-1}^T)\,E[\Delta\hat{y}_t]

Similarly:
E[\varepsilon_{x_{t-1}}, \Delta y_t^T] = \mathrm{cov}(x_{t-1}, \theta_{t-1}^T)\,E[\Delta\hat{x}_t]
E[\varepsilon_{x_{t-1}}, \Delta\theta_t^T] = 0
E[\varepsilon_{y_{t-1}}, \Delta x_t^T] = -\mathrm{cov}(y_{t-1}, \theta_{t-1}^T)\,E[\Delta\hat{y}_t]
E[\varepsilon_{y_{t-1}}, \Delta y_t^T] = \mathrm{cov}(y_{t-1}, \theta_{t-1}^T)\,E[\Delta\hat{x}_t]
E[\varepsilon_{y_{t-1}}, \Delta\theta_t^T] = 0
E[\varepsilon_{\theta_{t-1}}, \Delta x_t^T] = -\mathrm{cov}(\theta_{t-1}, \theta_{t-1}^T)\,E[\Delta\hat{y}_t]
E[\varepsilon_{\theta_{t-1}}, \Delta y_t^T] = \mathrm{cov}(\theta_{t-1}, \theta_{t-1}^T)\,E[\Delta\hat{x}_t]
E[\varepsilon_{\theta_{t-1}}, \Delta\theta_t^T] = 0
Simulations and Experiments
A Monte Carlo simulation of the path followed by the vehicle was performed. The chosen path is a closed circle with a total length of 400 m. The Jacobian estimation (EKF), the unscented-transformation (UT) estimation (UKF) and the presented formulation are also simulated and compared. The values used in this run are Δd = 0.2 m, σ_Δd = 0.01 m, φ = 1.125°, σ_φ = 3°, L = 1.25 m; the number of steps was 2,000.

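A Monte Carlo run of this kind takes only a few lines; the sketch below (not the slides' code) uses the half-angle odometry model with the parameters above, and the fact that 2,000 steps of 0.2 m at a 1.125-degree steering angle and L = 1.25 m close a 400 m circle. The number of Monte Carlo runs is an assumption.

```python
# Monte Carlo estimate of the final pose covariance over a closed circular path.
import numpy as np

rng = np.random.default_rng(0)
L, dd, s_dd = 1.25, 0.2, 0.01
phi, s_phi = np.radians(1.125), np.radians(3.0)
runs, steps = 500, 2000

poses = np.zeros((runs, 3))                       # (x, y, theta) per run
for _ in range(steps):
    dd_n = dd + rng.normal(0.0, s_dd, runs)       # noisy traveled distance
    phi_n = phi + rng.normal(0.0, s_phi, runs)    # noisy steering angle
    dth = dd_n * np.tan(phi_n) / L
    th = poses[:, 2]
    poses[:, 0] += dd_n * (np.cos(th) - 0.5 * dth * np.sin(th))   # half-angle model
    poses[:, 1] += dd_n * (np.sin(th) + 0.5 * dth * np.cos(th))
    poses[:, 2] += dth

print(np.cov(poses.T))                            # Monte Carlo pose covariance
```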

Increment Covariance (Jacobian and UT)
[Plots: cov(Δx, Δx), cov(Δx, Δy), cov(Δy, Δy), cov(Δx, Δθ), cov(Δy, Δθ) and cov(Δθ, Δθ) along the path. Bold gray line: Monte Carlo simulation; dotted line: presented formulae; gray line: Jacobian (first figure) / UT (second figure).]

Increment and Total Covariance (Jacobian, UT)
[Plots: increment covariances cov(Δx, Δx), cov(Δx, Δy), cov(Δy, Δy) and total covariances cov(x, x), cov(x, y), cov(y, y). Bold gray line: Monte Carlo simulation; dotted line: presented formulae; gray line: Jacobian (above) / UT (below).]

State Estimation and Data Fusion
Model Estimation: approximation models
Uncertainty Propagation: presented formulae (first approach, cross terms), Jacobian (EKF), Unscented Transformation (UKF)
Data Fusion: proposed scheme, Extended Kalman Filter, probabilistic approach (Pozo Ruz), Covariance Intersection Algorithm

Fusion Principle
Sensor 1 + Sensor 2 → fused, new estimate

To limit the errors accumulated by odometry, redundant sensors are desirable; fusing them improves both the estimate and its uncertainty.

Data Fusion
We are seeking to develop a fusion method able to work with different sources, considering autocorrelation and cross-correlation.
[Diagram: the odometry estimate (P_od, θ_od) from the steering and velocity odometers is fused in stages, with the gyroscope and compass (θ = ∫v_θ dt) and with GPS (P_GPS = T_GPS D_GPS) via the Pozo Ruz probabilistic approach, and the intermediate estimates (P_1, cov_1) ... (P_5, cov_5) are combined with the Covariance Intersection Algorithm (CIA).]
GPS Considerations (Pozo Ruz)
• This algorithm first characterizes the errors associated with the measurements provided by the vehicle's different onboard sensors, as well as the dependency relations among them.
• By optimizing the joint probability function of the odometry and GPS errors, the best position estimate is obtained at every instant.

Covariance Intersection Algorithm
Allows fusing two measurements with unknown cross-covariances while maintaining consistency of the estimates.
Given covariances P_aa and P_bb with means a and b, a consistent fused covariance P_cc and mean c can be found using a parameter ω ∈ [0, 1]:

P_{cc}^{-1} = \omega P_{aa}^{-1} + (1-\omega)P_{bb}^{-1}
P_{cc}^{-1}c = \omega P_{aa}^{-1}a + (1-\omega)P_{bb}^{-1}b
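A direct transcription of those two equations into code, with ω chosen here by a simple grid search that minimizes the trace of the fused covariance (an illustrative choice of criterion; the example estimates are made up and this is not the deck's implementation):

```python
# Covariance Intersection: fuse two estimates with unknown cross-correlation.
import numpy as np

def covariance_intersection(a, Paa, b, Pbb):
    Ia, Ib = np.linalg.inv(Paa), np.linalg.inv(Pbb)
    best = None
    for omega in np.linspace(0.01, 0.99, 99):
        Pcc = np.linalg.inv(omega * Ia + (1 - omega) * Ib)
        c = Pcc @ (omega * Ia @ a + (1 - omega) * Ib @ b)
        if best is None or np.trace(Pcc) < np.trace(best[1]):
            best = (c, Pcc)
    return best

a, Paa = np.array([10.0, 5.0]), np.diag([4.0, 1.0])   # e.g. odometry estimate
b, Pbb = np.array([10.5, 4.6]), np.diag([1.0, 3.0])   # e.g. GPS estimate
c, Pcc = covariance_intersection(a, Paa, b, Pbb)
print(c)
print(Pcc)
```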

Covariance Intersection Algorithm
[Figure: covariance ellipses cov_A and cov_B, their fusion computed with CIA, and their fusion computed assuming no correlation.]

Experiments
A 60 x 40 m rectangular trajectory was followed clockwise and counter-clockwise. The trajectories were driven manually and estimated using different sensor combinations. Finally, a video of an autonomous execution is shown and analyzed.

Trajectory Comparison
[Plots: estimated trajectories for the different sensor combinations, clockwise and counter-clockwise.]

Trajectory Comparison: Goal Achievement

Clockwise trajectory:
Sensors                    x         y          θ
Odo                      4.4762   -0.3171    81.8197
Odo GPS                  1.0282   -0.5679    89.0079
Odo GPS Compass          0.4489   -0.6463    91.6764
Odo GPS Compass Gyro     0.4424   -0.6509    91.6392

Counter-clockwise trajectory:
Sensors                    x         y          θ
Odo                      2.9906   -4.9483   -78.0098
Odo GPS                  1.9559    1.5659   -89.4485
Odo GPS Compass          0.7658    0.8815   -91.3700
Odo GPS Compass Gyro     0.5294    0.8246   -91.4013

Video

Total Covariance (Jacobian)
[Plots: cov(x, x), cov(x, y), cov(y, y), cov(x, θ), cov(y, θ) and cov(θ, θ). Bold gray line: Monte Carlo simulation; dotted line: presented formulae; gray line: Jacobian.]

Eigenvector Errors
[Plot: eigenvector-based error of the Jacobian, UT and presented formulation versus time step (0 to 1,200); the error axis runs from 0 to 12.]
Agenda
Introduction
Preliminaries: Mobile Robots' Collaboration
Features: Navigation of Autonomous Vehicles (AVs)
Autonomous Vehicles: Design, Construction and Analysis
AV Model, Architecture and Method for Transformation
State Estimation and Data Fusion
AV Trajectory Following with Visual Monitoring
Conclusions and Considerations


AV Trajectory Following with Visual Monitoring

Proposed Architecture
[Diagram: a decisional level (path planning, environment model and environment modeling) and an executive level (path following, obstacle avoidance, position estimation, feature extraction, global positioning, position following, sensor processing and other tools) sit above an interface to the actuators, internal sensors and external sensors that interact with the environment.]

Trajectory Planning and Following
Trajectory planning: non-holonomic constraints, continuous curvature, collision-free
Following: hierarchical structure (sensors-actuators, electronics, control, ...)
Internal and external feedback: robust and deterministic low level, heuristic high-level monitoring

General Schema
[Diagram: a mobile (pan-tilt) camera with orientation (θ1, θ2) observes the AV inside its visual area while the vehicle follows the trajectory; an off-board computer monitors the execution.]

Task Distribution
[Diagram: the off-board computer performs trajectory planning, visual AV tracking, visual pose estimation and data fusion, sending the waypoints P = {p1, p2, ..., pn} and the fused estimate (x̂, ŷ, θ̂); the AV computer performs trajectory control, pose estimation, point-to-point control, steering control and velocity control, using the odometry measurements (Δd, φ).]

Architecture
[Diagram: on the AV computer - vehicle kinematic model, environment model, trajectory planning (dynamic path planning, trajectory planning and following), trajectory control, point-to-point control, obstacle avoidance, velocity control and steering control; on the off-board computer - AV position estimation, data fusion, visual-system data processing and offline camera calibration; the two computers communicate and share the task locations.]

Control Architecture
[Diagram: planned trajectory P = {p1, p2, ..., pn} → trajectory control → point-to-point control → velocity control and steering control; the AV position estimate is fused (data fusion) with the visual position estimate computed off-board, yielding the estimated trajectory.]

Vision System Architecture
[Diagram: the camera delivers the image I_t; AV tracking finds the vehicle position (x, y)_img; a data transformation uses the camera position (θ1, θ2) and the known camera parameters to obtain the pose (a, b), (x, y), R, which feeds the fusion stage; camera control estimates the next movement and commands the new camera position through the communication protocol.]

Camera model
Pinhole model: m = A (R t) M
It projects a 3-D point M in the real world to a 2-D point m in the image.
[Figure: object, projection center, focal length and image plane.]
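A minimal sketch of the pinhole projection m ~ A [R | t] M, with illustrative intrinsics and an assumed camera pose (these are not the deck's calibration values):

```python
# Pinhole projection of a 3-D world point into pixel coordinates.
import numpy as np

A = np.array([[800.0,   0.0, 320.0],     # fx, skew, cx   (illustrative)
              [  0.0, 800.0, 240.0],     # fy, cy
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 5.0])   # camera 5 m in front of the world origin

M = np.array([0.5, -0.2, 0.0, 1.0])           # homogeneous 3-D world point
P = A @ np.hstack([R, t.reshape(3, 1)])       # 3x4 projection matrix A [R | t]
m = P @ M
u, v = m[:2] / m[2]                           # pixel coordinates
print(u, v)
```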

Camera Parameters
The camera's internal parameters are obtained using [Zhang 99].
A homography matrix, estimated from correspondence points during calibration, relates the real world (ground plane) to the image:

H = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix}

Intrinsic parameters:
A = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}

Lens distortion coefficients: (k1, k2, p1, p2)

Vehicle Visual Tracking
Correlation based on the mean-shift vector [Comaniciu 99]: given a pattern of interest, it searches the next image for the area with the most similar pattern.
Step 1: select the vehicle area. Step 2: obtain the vehicle pattern.
[Flowchart: steps 3-10 - for each new image I_t the pattern is searched and compared (≈) against the stored one; if it matches, the vehicle position is updated, otherwise the search continues.]
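A rough equivalent of this mean-shift correlation tracker can be put together with OpenCV's histogram back-projection and built-in mean shift. The sketch below is illustrative only; the video file name, the initial window and the histogram bin count are assumptions, not values from the slides.

```python
# Mean-shift tracking sketch with OpenCV (illustrative parameters).
import cv2

cap = cv2.VideoCapture("vehicle.avi")            # hypothetical video file
ok, frame = cap.read()
x, y, w, h = 200, 150, 60, 40                    # Step 1: initial vehicle window (made up)
roi = frame[y:y+h, x:x+w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [32], [0, 180])   # Step 2: vehicle pattern
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
window = (x, y, w, h)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    prob = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)   # similarity map
    _, window = cv2.meanShift(prob, window, term)               # Steps 3-10: track
    print("vehicle image position:", window[:2])
```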

Moving Camera
[Figure: a point M projects to m_i and m_j in two images taken from the same optical center O under rotations R_i and R_j.]

m_i = A_i R_i M, \quad m_j = A_j R_j M, \quad m_j = H_{ij} m_i
H_{ij} = A_j R_j R_i^{-1} A_i^{-1} = A_j R_{ij} A_i^{-1}
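A short numerical check of the inter-image homography for a purely rotating camera; the intrinsics and relative rotation below are illustrative, not calibration data from the slides.

```python
# Rotation-only homography: H_ij = A_j R_ij A_i^{-1} (same intrinsics assumed).
import numpy as np

def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

A = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])        # same intrinsics for both views
R_ij = rot_z(5.0)                          # relative rotation between the views
H_ij = A @ R_ij @ np.linalg.inv(A)

m_i = np.array([400.0, 208.0, 1.0])        # a pixel in image i (homogeneous)
m_j = H_ij @ m_i
print(m_j[:2] / m_j[2])                    # corresponding pixel in image j
```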

Pose estimation

[Pipeline: image position (x, y) → corrected image position (x, y) using the intrinsic parameters and distortion coefficients → homography matrix and camera position (θ1, θ2) → real-world position (X, Y).]
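The image-to-world part of this pipeline, sketched with OpenCV; the intrinsics, distortion coefficients and ground-plane homography used here are made-up placeholders, not the deck's calibration.

```python
# Image position -> corrected image position -> world position via homography.
import cv2
import numpy as np

A = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
dist = np.array([-0.25, 0.08, 0.001, 0.0005])      # (k1, k2, p1, p2)
H = np.array([[0.01,  0.0, -3.2],                  # image plane -> ground plane (made up)
              [0.0, -0.01,  2.4],
              [0.0,   0.0,  1.0]])

m = np.array([[[400.0, 208.0]]], dtype=np.float64)            # raw image position
m_corr = cv2.undistortPoints(m, A, dist, P=A)                 # corrected image position
u, v = m_corr[0, 0]
w = H @ np.array([u, v, 1.0])                                 # homography to the world
X, Y = w[:2] / w[2]
print(X, Y)                                                   # real-world position (X, Y)
```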

Experiments


Dead Reckoning
Error (m²) for three steering fits:

Trajectory   First fit φ1   First fit φ2   First fit φ3
Left         2.1820         2.5586         1.9173
Right        1.1823         1.2477         0.9642

With Feedback (Static Camera)
[Plots: planned trajectory versus the odometry, camera and fused estimates for the left and right runs.]

Error (m²):
Trajectory   Odometry   Camera   Fusion
Left         3.2091     0.9750   0.8271
Right        2.4720     1.0613   0.9520

Angle θ
[Plot: estimated heading θ over samples 150-230, comparing odometry, compass, camera and fusion.]

With Feedback (Pan-Tilt Camera)
[Plots: planned trajectory versus the odometry, vision and fused estimates for a figure-eight and a rectangular path.]

Error (m²):
Trajectory     Odometry   Camera   Fusion
Figure-eight   12.2884    4.4008   3.8876
Rectangle       5.0451    4.5851   3.9138

Videos

Agenda
Introduction
Preliminaries: Mobile Robots' Collaboration
Features: Navigation of Autonomous Vehicles (AVs)
Autonomous Vehicles: Design, Construction and Analysis
AV Model, Architecture and Method for Transformation
State Estimation and Data Fusion
AV Trajectory Following with Visual Monitoring
Conclusions and Considerations


Conclusions
The design is robust and flexible thanks to the architecture features.
A generation of AVs has been developed at the Center for IS of the Tecnológico de Monterrey.
We produced a very low-cost (scale) AV.
AVs will reduce the risk of executing tasks inside mines and in other harsh work environments.
We are currently developing a general methodology for designing AVs for mining purposes: sample collection, excavation and so on.

Next Steps
Synthesize the AV configuration from task and environment specifications
Coherent AV workspace integration (planning, sensing, map building, ...)
Collaborative navigation (catastrophe relief)
Navigation in dynamic environments
Marsupial or nursery navigation

Participants
Coordinator: Dr. José Luis Gordillo
MC. Isidro López-Ávalos
MC. Patricia Mora
MC. Aldo Díaz Prado
Ing. Adriana Cantú
Ing. Manuel Olvera
José Luis Bersunza
Ing. Alfredo Cruz
Ing. Gerardo Palacios
Ing. Carlos Albores
Sergio Aguirre
Hugo Rossano
Ing. Luis Fernando Hernández
Ing. José Manuel Vazquez
Ing. Brian Ponce
Ing. Carlos Enrique Aguilar
Ing. Elizabeth Guevara
Ing. Jorge Humberto Moreno-S
Ing. Rodolfo Muñoz
Ing. Fernando Von Borstel
Ing. F. Gerardo González
Ing. Fernando Rivero
Ing. Gilberto González

Dr. José Luis Gordillo
Professor, Center for Intelligent Computing and Robotics (CCIR)
Tecnológico de Monterrey
Av. Eugenio Garza Sada 2501, 64849 Monterrey, N.L., Mexico
Tel: +52 (81) 8328-4423, 8328-4379; Fax: +52 (81) 8328-4189
email: JLGordillo@itesm.mx
http://RobVis.mty.itesm.mx/~gordillo


Acknowledgements
The course ftp page can be found at:
http://RobVis.mty.itesm.mx/~gordillo/Cursos/Robotica

Summer School on Images and Robotics:
http://www-sop.inria.fr/icare/WEB/SSIR04/ http://www.fimee.ugto.mx/ssir05/ http://www.image-and-robotics.org/~ssir2006/ http://www.image-and-robotics.org/~ssir2007/ http://www.image-and-robotics.org/~ssir2008/ http://www.image-and-robotics.org/ssir2009/

Grupo Visión 2003


Grupo Visión 2007
