Lidar, computer simulation blend in avionics to help helicopter pilots land safely in dust
By John Keller,
Military & Aerospace Electronics

OTTAWA, 30 April 2009. Helicopter avionics specialists at two Canadian aerospace companies
are blending lidar technology and computer simulation to create a virtual environment for aircraft
avionics to enable pilots to land safely in zero-visibility dust, smoke, and fog — especially at night.

Lidar stands for light detection and ranging, and uses infrared laser beams as primary sensors for
image capture of the surrounding area. This technology also is referred to as ladar, which is short
for laser radar. Neptec's military avionics technology is called OPAL, which is short for obscurant-
penetrating autosynchronous lidar.

Engineers at the Neptec Design Group in Ottawa and CAE in Montreal are using military laser
technology to create imagery of visually obscured areas, which they overlay with terrain database
information designed for military training and simulation to create an enhanced, computer-
generated out-the-window view that is updated in real time to help helicopter pilots land where
dust, smoke, fog, or other obscurants blot out the pilot's view of the outside.

"Landing helicopters in brown-out conditions is the major application for this technology," explains
Neptec President Dr. Iain Christie. "This could help alleviate operational constraints where pilots
can't land in dusty areas at night. Lidar sees as well at night as it does during the day."

CAE and Neptec demonstrated the OPAL sensor integrated into the CAE AVS at the Yuma
Proving Ground, Ariz., when they penetrated dust clouds generated by a UH-1 test helicopter.
The system helped pilots see through brownout conditions opaque to the human eye to
differentiate between rocks, bushes, sloping terrain, utility poles, ground vehicles, and wires at
distances greater than 200 meters.

"Our job is to have a very efficient sensor system for the CAE AVS with updates to the database
that it uses to create the simulated environment," Neptec's Christie explains. "Pilots need to find
out how different the landing zone is from what the system thinks it is. Combine the sensor and
simulation and you have a very accurate synthetic environment."
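The fusion Christie describes — overlaying live lidar returns on a stored terrain database and flagging where the landing zone differs from what the system expects — can be sketched in a few lines. This is purely an illustrative toy, not Neptec or CAE code; the grid, threshold, and return format are all assumptions:

```python
# Illustrative sketch (not Neptec/CAE code): overlay live lidar elevation
# returns on a prior terrain database and flag cells that disagree.

GRID = 4  # hypothetical 4x4 landing-zone grid, one elevation (m) per cell

# Hypothetical prior terrain database (e.g., loaded from a CDB-style store)
terrain_db = [[10.0] * GRID for _ in range(GRID)]

# Hypothetical lidar returns: (row, col, measured_elevation_m)
lidar_returns = [(0, 0, 10.1), (1, 2, 12.5), (3, 3, 9.9)]

def fuse(db, returns, threshold_m=1.0):
    """Overlay lidar measurements on the database; return the fused grid
    plus the cells whose measured elevation deviates beyond threshold."""
    fused = [row[:] for row in db]
    anomalies = []
    for r, c, z in returns:
        if abs(z - db[r][c]) > threshold_m:
            anomalies.append((r, c))  # landing zone differs from database
        fused[r][c] = z  # trust the live sensor over the stored value
    return fused, anomalies

fused, anomalies = fuse(terrain_db, lidar_returns)
print(anomalies)  # cell (1, 2) deviates from the database by 2.5 m
```

In a real system the "anomaly" cells are what the pilot most needs rendered faithfully — obstacles the database does not know about — while unchanged cells can be drawn from the simulated environment.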

The CAE AVS combines OPAL with other sensors such as forward-looking infrared (FLIR).
Sensor information blends with the CAE-developed common database (CDB), originally
developed for U.S. Special Operations Command.

Further development of the AVS technology is part of CAE's Project Falcon research and
development to expand modeling and simulation technologies and develop new applications.
For more information, contact Neptec or CAE online.