					                       WEATHER DATA VISUALIZATION -
                       DECISION AID TOOLS FOR ARMY C4I

                                  Donald W. Hoock 1 *

                                           and

                                    John C. Giever 2
              1 U.S. Army Research Laboratory, Information Science and
                Technology Directorate, Battlefield Environment Division
                      White Sands Missile Range, NM 88002-5501
                   (505) 678-5430; (DSN 258-); FAX: (505) 678-3385
                               E-MAIL: dhoock@arl.mil
              2 Physical Science Laboratory, New Mexico State University
                              Las Cruces, NM 88003-0002
                   (505) 678-3280; (DSN 258-); FAX: (505) 678-3385
                            E-MAIL: jgiever@psl.nmsu.edu


                                      ABSTRACT

The Integrated Meteorological System (IMETS) is the Army’s battlefield C4I tactical
weather system. It produces mesoscale forecasts, weather warnings and a complete 3-D
Gridded Meteorological Data Base (GMDB) populated from Air Force MM5, Navy
NOGAPS and Army BFM forecast models. From these data a variety of decision aid
products are generated on the IMETS Weather Effects Workstation (WEW) to inform
commanders of weather impacts on battlefield systems for mission planning and
execution. Current visualization tools provide 2-D weather data and impact overlays
that can be displayed on Army Tactical Command and Control System (ATCCS)
displays using the Joint Mapping Toolkit (JMTK). In addition, the 60+ variables in the
GMDB are viewable on the WEW display using the interactive Vis5D viewer. In this
paper we show various visualization capabilities being explored as potential new
planning decision aids. These include route displays in Vis5D for showing weather data
along a 3-D path, a tactical "weather feature" for the Common Tactical Picture (CTP),
and automating data ingest to the flight weather briefings. In addition, the future use of
common visualization tools across the ATCCS and in particular with the Corps of
Engineers terrain analysis systems, will allow weather to be better integrated into terrain
and mobility decision aids. We also discuss some of the technology improvements that
we expect the IMETS will evolve to as the Army tactical weather systems are made
lighter and more versatile, with tailored weather applications executed or displayed on
fully distributed C4I system displays at different echelons.
                                         OBJECTIVE

        With recent history and projected future requirements in mind, the U.S. Army has
begun to implement a new vision for a “full spectrum” combat force. Focusing on highly
flexible “Brigade and Below” force structures, the new Combat Brigades will be better able to
participate not only in a conventional Major Theater War (MTW), but also in regional Stability
and Security Operations (SASO). Rapid response, reduced logistics, and on-scene planning
with reach back to division and above assets, will optimize the capabilities for successful
employment in Small Scale Contingency Operations (SSCO), particularly in complex and
urban terrain, and for confronting low-end and mid-range threats that may employ both
conventional and asymmetric capabilities. Upcoming changes will impact both materiel
acquisition and Army doctrine, including Brigade Combat Team (BCT) Weather Support
functions.

        The purpose of this paper is to begin to look at some of the various weather
information and decision aid products that may be appropriate as extensions to the current
Integrated Meteorological System (IMETS). The IMETS is the Army’s tactical meteorological
forecast and weather-effects decision aid system, currently fielded into Command and Control
functions at “Division and Above” Tactical Operations Centers (TOC’s), and into certain
aviation brigades. In the next section we briefly outline the current capabilities of the IMETS
system and the planned concepts for future lighter, smaller IMETS as shown in Fig. 1. The
succeeding sections then look at various types of data visualization products that can be
implemented in the near term to provide weather planning and ‘nowcast’ support to individual
soldiers, squads, companies, battalions and brigades. These proposed products generally have
no formal written requirements identified as yet, but they do build on many of the new
requirements-driven capabilities that we are implementing this year for weather C4I into the
Army’s First Digitized Division.

        Figure 1. Evolution of IMETS and the Weather Effects Workstation (WEW) to
smaller processors.

       The final sections explore the use of “3-D” or stereo-visualization approaches to pack
more information into “2-D” image products for remote sensing and weather forecast decision
aids. The practical justification for this research is that the display of 2-D image products is
less demanding on the individual soldier in the field, who is now identified as a legitimate end
user of weather information, than are manually-interactive, virtual 3-D visualization displays.
Therefore, for this paper and the symposium presentation we have chosen an “anaglyph” (red-
blue “3-D glasses”) format for these figures, although other stereo projection and observation
formats are possible. This turned out to be not only interesting and rewarding, but also a
“fun” exercise to liven up a technical presentation on the last afternoon of a long conference. It
does require, however, that the reader of the paper obtain a pair of “red-left, blue-right”
cellophane “glasses” to view a few of the figures. Sources are readily available on the internet.
Attendees of the presentation will be provided inexpensive, cardboard versions to take home.

                             OVERVIEW OF CURRENT IMETS

        The IMETS is developed and fielded through the Army’s PD-IMETS office under the
PM-INTEL FUSION and PEO-C3S. R&D for IMETS’ numerical weather prediction models,
the information databases, and the weather impact Tactical Decision Aids (TDA’s) is provided
by the Army Research Laboratory. New applications are implemented by the Physical Science
Laboratory of New Mexico State University under an ARL contract, and are then integrated
into the Army’s Battlefield Command System (ABCS) by Logicon Advanced Technologies, a
PD-IMETS contractor. The goal of the latest IMETS releases is to provide the many C4I
Battlefield Functional Areas (BFA’s) across all the Army Tactical Command and Control
Systems (ATCCS) with access to IMETS forecast and weather effects decision aids. And,
IMETS workstations provide the Army’s Combat Weather Teams, manned by Air Force
Officers, with decision making tools that include IMETS-generated products and specially
tailored weather information products transmitted via satellite to IMETS in the field from
centralized Air Force Weather hubs.

        Figure 2 shows a schematic of the basic components of the IMETS. This paper
focuses on products derived from software tools and models on the Weather Effects Workstation




                       Figure 2. Schematic of IMETS work stations
(WEW). These systems are fielded in battlefield TOC’s ranging from Echelons above Corps
(EAC) down to Division level, and to certain Aviation Brigades. The “top-down” flow of
regional forecasts arrives via the commercial Tactical Very Small Aperture Terminal (T-
VSAT). These meteorological data grids originate from Navy Operational Global Atmospheric
Prediction System (NOGAPS) model runs at 1 degree spacing and from Air Force
Meteorological Model 5 (MM5) mesoscale meteorological forecasts at 45 km (and potentially
on 15 and 5 km horizontal grids in future). The T-VSAT also provides current surface
observations and vertical profile observations from around the world. As shown in Fig. 3, these
data can be refined on the WEW using the Army Battlescale Forecast Model (BFM).




                     Figure 3. WEW weather forecast processing steps

        BFM produces a detailed, terrain-coupled forecast at 10 km spacing (potentially down
to 2 km), providing basic meteorological data from very near the ground up to 7 km in a
terrain-conforming grid system. The current version of BFM for the Army’s First Digitized
Division produces a 24-hour forecast. While the BFM can run by itself with just local surface
and vertical profile observations (providing only a local 3 to 12 hour forecast in those cases),
normally it also initializes on the NOGAPS or MM5 forecast grid data. Then, because BFM is
designed to “nudge” the forecast to always reproduce the NOGAPS or MM5 forecast at their
input data grid locations, BFM will normally do at least as well as those model
forecasts. Numerous studies of the BFM have been made, including results presented at this
conference (Haines, 2000; Henmi, 2000). The MM5 and NOGAPS forecasts are longer range,
extending out to 36 and 96 hours respectively.
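The “nudging” described above amounts to Newtonian relaxation toward the driving model. A minimal sketch in Python (the relaxation coefficient, time step, and array layout are illustrative assumptions, not the actual BFM formulation):

```python
import numpy as np

def nudge(field, driver, mask, g=1.0e-4, dt=60.0):
    """One Newtonian-relaxation ("nudging") step: relax the fine-grid model
    field toward the driving forecast (e.g. NOGAPS or MM5) wherever driver
    data exist.  g (1/s) and dt (s) are illustrative values."""
    return field + g * dt * mask * (driver - field)

# Toy example: driver data exist only on the diagonal of a 3x3 grid
u = np.zeros((3, 3))
driver = np.full((3, 3), 10.0)
mask = np.eye(3)
for _ in range(1000):
    u = nudge(u, driver, mask)
# Diagonal points are pulled close to the driver value of 10;
# points without driver data are untouched.
```

Repeated application drives the model state to reproduce the driver forecast at the driver’s grid locations, which is why BFM normally does at least as well as the model it initializes on.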

      The data from NOGAPS, MM5 and BFM are stored as three-dimensional arrays in a
Gridded Meteorological Database (GMDB). In addition, a number of weather features and
               Figure 4. The IMETS Gridded Meteorological Database (GMDB)
weather hazard parameters are post-processed from the gridded forecasts in the Atmospheric
Sounding Program (ASP) (Passner, 1999). These fill out the GMDB to a set of more than 60
variables as shown in Fig. 4.

        The other products from the IMETS system are information and decision aid tools, and
limited remote sensing products that will be expanded in the future. Some of these tools are
for use by the Air Force Combat Weather meteorologists who operate the IMETS. These
include skew T-log P charts, plots of profiles from the GMDB, thermodynamic parameters and
input observation profiles (as in Fig. 5). The IMETS also provides various meteorological
data editors, etc.

                    Figure 5. Some of the IMETS forecaster tools

        The weather data are accessible by the various command and control functional area
workstations in the TOC. GMDB meteorological data can be displayed as overlays on common
Army maps of the Joint Mapping Toolkit on these systems. Figure 6 shows an example overlay
displaying a 2-D plan view of meteorological data. These overlays can include line and area
contours, wind streams, etc.

        The Integrated Weather Effects Decision Aid (IWEDA) produces red-amber-green
weather impact warnings for many military systems and operations. The impact “rules”,
along with the critical value thresholds that trigger moderate or severe impact warnings, can
be modified for particular applications (Hoock, Torres and Sauter, 1998a) using the IMETS
DIRECT weather impact editor. An example is shown in Fig. 6.

        Currently only simple met-sat images and loops are produced on the IMETS. As
shown in Fig. 7, the analyst can annotate images and post them to an IMETS web page on the
TOC network. In future the metsat data processing and product capability will be enhanced to
include a number of products. These will show current clouds, precipitation, state of ground,
temperatures and vegetation index. They will also provide derived quantities to support
forecast and TDA model inputs, such as surface albedo and derived visibilities or aerosol
events such as blowing dust. Examples of some of these products are shown in Fig. 8. An Air
Force Small Tactical Terminal (STT) will feed an IMETS met-sat workstation (shown in Fig. 2
as an RT-GIS upgrade to the current low-res System West WEFAX). This will make IMETS
environmental processing consistent with the RT-GIS capability currently used by the Corps of
Engineers on the C4I Digital Terrain Support System (DTSS).

                         Figure 6. Weather and IWEDA overlays

                    Figure 7. IMETS Homepage - annotated image
                            Figure 8. Future metsat derived products

        Finally, the manually interactive Vis5D viewer is used on the WEW to browse and
display any of the data in the 3-D Gridded Meteorological Database (GMDB). Outputs, such
as those shown in Fig. 9, can be viewed on the WEW or posted to the IMETS tactical web
page by the IMETS operator. While it is possible for external X-windows compatible
workstations or emulators to do a UNIX rlogin to the IMETS and to execute Vis5D, in practice
this process is data intensive and slow. Furthermore, Vis5D is not an Army DII/COE graphics
standard, unlike other IMETS 2-D overlays.

                      Figure 9. Standard Vis5D Viewer on IMETS
                                RESEARCH ACCOMPLISHED

        The capabilities summarized above currently provide an excellent basis for weather
intelligence at division echelons and above. Until recently, conventional assumptions for
weather C4I have been that: (1) mission planning will occur mainly at division and above; (2)
planning will focus on time frames of 24 to 96 hours; and (3) global forecasts augmented by
available surface observations and vertical profile observations will be the main input data for
forecasts over otherwise data-denied and data-sparse tactical areas.

        While these assumptions and rationale will probably continue to be valid for Major
Theater Wars, the new Army Vision calls for more mobile Combat Brigades. Organic data
collection and mission planning may demand 20 minute decision cycles and focus on
immediate objectives 3 to 6 hours in the future. Furthermore, the traditional expectation that
the tactical battlefield will be data sparse may be too restrictive. Access to a variety of robotic,
aerial, remote and personnel sensors for direct and inferred met observations is likely to
become feasible soon. Thus, the new focus for weather products may be products developed
for or at the lower echelons, perhaps down to individual soldiers, whose physiological and
environmental state information may itself become available from a web of linked sensors.

        The latest IMETS Operational Requirements Document recognizes an objective goal to
ingest these non-conventional met observations and process them in near real-time. These will
include vehicle and UAV mounted data that are neither conventional surface met observations,
nor vertical profiles; incomplete meteorological data sets that include only some parameters but
not others; and data taken at irregular times. Certainly a variety of remote imaging sensors
from ground to space, active radars and passive radiometers observe environmental data, and
one goal will be to make use of this information. And even anecdotal reports such as “It’s
beginning to rain where we are” should also be considered as valuable, but perishable nowcast
information. All this needs to be merged into a conceptual 4-dimensional database of arrays of
meteorological and weather effects parameters, tailored unit or operation weather forecasts,
weather warnings, performance impacts and inputs needed for CB defense and other models.

    WEATHER REQUIREMENTS OF THE INITIAL / INTERIM COMBAT BRIGADES

        A Brigade Combat Team (BCT) in excess of 3,000 personnel will probably be expected
to conduct sustained, autonomous combat operations for up to 240 hours in an initial Area of
Operations (AO) of 50 x 50 km, expandable up to 100 x 100 km. While it will rely heavily on
“reach-back” to division TOC’s or centralized information hubs, the BCT will have a need for
local decision making and generation of locally-tailored intelligence products by on-site
personnel. Inside the decision making process loop at the BCT-level, combat weather
personnel will recommend alternative ingress, egress, or courses of action to exploit weather
intelligence as a force multiplier.

       The need for quantitative nowcasts that are as consistent with ground truth as possible
(and able to easily adjust to known ground truth) seems to be emerging. But the Army is just
beginning a process to define weather requirements for the Initial (1-2 years), Interim (3-5
years) and Objective (10-15 years) Full Spectrum force. Some of the minimum
meteorologically-driven information requirements being identified for the various company and
battalion functions within the initial BCT are listed, for example, in Table 1. New
technological capabilities will need to be developed to meet the spatial and time resolutions
associated with this list. New weather products for route planning are particularly needed.

       Table 1. Draft minimum weather requirements considered by TRADOC for BCT
   1. Onsite weather briefing capabilities
   2. Pre-deployment weather forecasts
   3. NBC effective downwind messages
   4. Resource protection/weather warnings
   5. High-resolution metsat imagery updates
   6. Weather Products for ABCS BFA’s
   7. Space weather effects on comms
   8. Weather inputs to trafficability
   9. Thermal stress indices for personnel
   10. UAV mission following/METwatch for dynamic tasking and alternative
       ingress/egress
   11. Medevac helicopter weather forecasts
   12. Back-brief forecasts for APOE/SPOE and enroute air and sea operations
   13. Cloud cover amount / height
   14. Density and humidity profiles
   15. Illumination (Light, NVG)
   16. Precipitation, rain/snow/thunderstorms
   17. Temperature profiles
   18. Visibility and Tgt Acq/Lock-on ranges
   19. Surface winds and upper air profiles
   20. Icing and turbulence hazards
   21. Cloud free line-of-sight forecasts
   22. Automated surface observations from/for UAV launch site; take-off and
       recovery forecasts
   23. UAV “pilot report” weather ingest
   24. Radar propagation forecasts

                  WEATHER OVERLAYS AND GMDB DISTRIBUTION

        Figure 10 shows the current “top down” flow of mesoscale MM5 and NOGAPS data,
Air Force weather information products, and low resolution metsat images from the Air Force
central weather hubs. The IMETS processes these data and makes the weather forecasts and
weather impact data available from the Gridded Met Database (GMDB) on the local Tactical
Operations Center (TOC) network. As part of the current Army Battle Command System
(ABCS 6.x), ARL and PSL have developed the Weather Overlay Provider and its interface to
the Joint Mapping Toolkit (JMTK). This allows the other BFA’s in the TOC to display
weather contours and red-amber-green impacts over their own data on their common map
displays.

        Under agreement with the Combat Terrain Information System (CTIS), the IMETS
GMDB will share the terrain database servers provided by the CTIS. Since these databases
will be provided across echelons, this is potentially one way for IMETS weather information
and decision aids to be made available to lower echelons where a full-up IMETS is not
deployed. Figure 11 shows examples of these weather overlays. One simple extension to
support route planning would be to plot segments between way points (shown here in yellow
with the current way point segment in red).

                       Figure 10. Weather data flow in the TOC

             Figure 11. IMETS Overlays showing a way-point route plan view

              COMMON TACTICAL PICTURE (CTP) WEATHER FEATURE

        A new “Weather Feature” is being developed by ARL and by PSL for the Common
Tactical Picture (CTP) in ABCS 6.1. The CTP is a specific type of common display for Army
Tactical Command and Control Systems. It accesses the Joint Common Database (JCDB).
The JCDB will be replicated across echelons for use by all C4I systems. IMETS has currently
been allocated room in the JCDB for a limited database of 8 basic weather parameters as shown
in Fig. 12. A standard military weather symbol can be moved and clicked on to show text
weather warnings, unit forecasts, and meteorological data at that symbol. For en route weather
planning the symbol could be moved over a route-overlay as part of this application.
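As an illustration only, such a compact replicated record might resemble the following. The field names below are assumptions made for the sketch; the actual eight parameters are those defined in the JCDB allocation shown in Fig. 12.

```python
from dataclasses import dataclass

@dataclass
class WeatherFeatureRecord:
    """Hypothetical compact weather record replicated through the JCDB.
    Field names are illustrative, not the actual IMETS allocation."""
    lat: float
    lon: float
    valid_time: str            # e.g. "2000-03-30T18:00Z"
    temperature_c: float
    dewpoint_c: float
    wind_dir_deg: float
    wind_speed_mps: float
    visibility_km: float
    cloud_cover_eighths: int
    precip_rate_mmhr: float
    pressure_hpa: float

rec = WeatherFeatureRecord(44.0, 18.5, "2000-03-30T18:00Z",
                           12.0, 7.0, 270.0, 5.0, 8.0, 5, 0.0, 1013.0)
```

A record this small is what makes replication across echelons and display at a clicked symbol practical on limited tactical bandwidth.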




            Figure 12. The Weather Feature on the Common Tactical Picture
                           AUTOMATED WEATHER BRIEFINGS

        Probably the best example of automating a route weather briefing would be to address
standard DD Form 175, the “Military Flight Plan” and DD Form 175-1, the “Flight Weather
Briefing”. The DD-175 defines the mission in terms of true airspeed, point of departure,
departure time (Z), altitude and route of flight descriptions. The latter briefing form (part of
which is shown in Fig. 13) is then traditionally filled out manually by a trained forecaster.
Meteorological data for three of the five parts of the form could be automatically pulled from
the IMETS gridded met database (GMDB). Part I, Mission Takeoff Data could be manually
entered or automated to utilize the local met measurements at the air field or to use the forecast
for the estimated time of departure. Part II, En route Data (as in Fig. 13), could be filled with
data from the current IMETS GMDB, although a few of the variables will require development
of additional meteorological analysis models (for example, for “visibility at flight level outside
clouds”). Part III, Terminal Forecasts, could also be pulled from the GMDB, based on the
location of the destination and estimated time of arrival. Note that we do not recommend
eliminating the requirement for face-to-face weather briefing from the trained meteorologist.
Human review and interpretation of the data will probably always be required for Part IV,
Comments/Remarks, and Part V, Briefing Record signatures. However, automation of data
look-up and form generation could save the forecaster valuable time for analysis, discussion
and contingency planning.
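The Part II and Part III look-ups described above amount to sampling the gridded database by location and forecast hour. A hypothetical sketch follows; the field names, array layout, and function are assumptions, not the actual GMDB interface:

```python
import numpy as np

def enroute_leg_data(fields, leg_xy, eta_hour):
    """Fill one en-route (Part II) row: read each gridded forecast field at
    a leg's grid location and the forecast hour nearest its estimated time.
    `fields` maps a variable name to an array indexed (hour, y, x)."""
    x, y = leg_xy
    return {name: float(grid[eta_hour, y, x]) for name, grid in fields.items()}

# Toy 24-hour forecast on a 5 x 5 grid: temperature equals the forecast hour
temps = np.tile(np.arange(24.0)[:, None, None], (1, 5, 5))
winds = np.full((24, 5, 5), 12.0)
row = enroute_leg_data({"temperature": temps, "wind_speed": winds},
                       leg_xy=(2, 3), eta_hour=6)
# row -> {"temperature": 6.0, "wind_speed": 12.0}
```

Repeating this for each leg of the DD-175 route, and for the destination at the estimated time of arrival, would fill Parts II and III mechanically, leaving the forecaster to review and annotate.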




               Figure 13. Part II of DD Form 175-1 - Flight weather briefing

                       ROUTE PLANNING ADDITIONS INTO VIS5D

        Vis5D is the interactive virtual 3-D data display on the IMETS WEW. To support route
planning, a number of enhancements to Vis5D have been proposed and implemented, although
not all have been accepted yet for field use. First, a menu-driven method to zoom the display
has been added, and a prototype demonstrated that allows the database to be sectioned into
quarters. This allows the user to focus on any specific quadrant in the data set. Second, a route
display has been added that allows any route specified as straight-line paths between user-
defined “way points” to be rendered (Figs. 14 and 15). A “snap” capability for vertical slice
displays in Vis5D has been implemented so that the vertical plane can be automatically placed
tangent to the current way point path segment (Fig. 14). Finally, a “probe” display mode for
route segments has been written. This mode allows the user to move a slider interactively. As
the slider is moved, a point is correspondingly moved along the way point segment and the met
data in the GMDB corresponding to that point are printed on the display in text format.

      Figure 14. Vertical data plane "snapped" to route segment (red segment of yellow path)

                       Figure 15. Way point routing in Vis5D
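The probe behavior can be sketched as follows. This is a hypothetical illustration of the slider-to-path mapping and grid lookup, not the actual Vis5D or GMDB code; the nearest-cell read stands in for proper interpolation:

```python
import numpy as np

def probe_point(waypoints, t):
    """Map a slider value t in [0, 1] to a point along the piecewise-linear
    way-point path (waypoints: (n, 2) array in grid coordinates)."""
    seg = np.diff(waypoints, axis=0)
    seg_len = np.hypot(seg[:, 0], seg[:, 1])
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    d = t * cum[-1]                       # distance travelled along the path
    i = min(np.searchsorted(cum[1:], d), len(seg) - 1)
    frac = (d - cum[i]) / seg_len[i]
    return waypoints[i] + frac * seg[i]

def probe_value(grid, waypoints, t):
    """Read the met field at the probe point (nearest cell for brevity;
    the real viewer would interpolate)."""
    x, y = probe_point(waypoints, t)
    return grid[int(round(y)), int(round(x))]

# Toy 2-D "temperature" field where value = x + y, and a two-segment route
grid = np.add.outer(np.arange(5.0), np.arange(5.0))
route = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 4.0]])
mid = probe_value(grid, route, 0.5)   # halfway along the total path length
```

The text readout on the display is then simply a formatted print of the values returned for the current slider position.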

           PUSHING 3-D INFORMATION INTO 2-D IMAGES AND DISPLAYS

        Information users at the lower echelons may not have the time or training to use manual
virtual data browsers such as Vis5D. Furthermore, bandwidth considerations may dictate that
text messages (perhaps displayed graphically at the end user’s interface), small data objects and
2-D image products will be the smaller bandwidth information formats of choice. With this in
mind, we considered how the very nice capability in Vis5D to rotate the information in three
dimensions might still be captured in 2-D products generated by the Combat Weather Team or
higher echelon IMETS.

        There are several technologies, both old and new, that can be used to insert 3-D
information into a flat image. The one chosen for this paper and presentation is the “anaglyph”
or red-blue composite image viewed with colored filters (red for the left eye, blue for the right).
Admittedly this has its drawbacks since the user may be color blind or may not want to obstruct
vision with colored filters. Polarized images, flicker glasses and even stereoscopic pair viewers
are alternatives. Here we are interested only in whether there is value added. In the
following figures we show both Vis5D and meteorological satellite imagery that have been
enhanced by conversion to 3-D anaglyphs. The reader will need to obtain a pair of red-blue
glasses, available at some toy stores, Edmund Scientific, or a number of good on-line stores
(search for “anaglyph” and “glasses”). Cheap pairs costing less than 50 cents each will be
handed out at the presentation of this paper.

         Figure 16. Probe along the route (3D image, requires red-blue glasses)

        Figure 16 shows the route planning display in “probe” mode. The met data readout is
associated with the white crosshairs on this route along the coastal Yugoslav mountains. The
common map is shown draped over the terrain data that was used in the BFM forecast run.
Figure 17 is similar, but the map has been turned off and pseudo colors represent the shaded
terrain. In viewing both figures, the reader should allow the eye-brain to relax and take in the
whole image before focusing on any one element. Zooming in the page view (assuming that a
color printer is not sufficient) will enhance the effects.

                           Figure 17. Probe view, Bosnia (3D)
       Figure 18. Extreme view of the route along the Yugoslav mountains (3D)
        Figure 18 shows a rather extreme view of the path and displays iso-surfaces of liquid
water content representing low stratus or fog (blue-white regions). In this case the eye must
strain to try to converge the flight path lines, but the 3-D effect is striking. Figure 19 shows
wind vectors for a model run over the Colorado region. Ft. Collins is near the top, just east of
the front range. Denver is near the center of the image. Not only does one see the turning of
the wind with height better, but also one can better visualize the effects of the mountains using
the anaglyph view. This forecast corresponds to satellite imagery shown later.

        Before proceeding with the other examples, let us document the simple process that
was used to create these images. First, one must select image pairs that more or less represent
slightly different viewpoints, corresponding, for example, to viewpoints from the left and right
eye. The left and right images are then joined into one using the appropriate procedure below.

        For two greyscale (8 bit black and white) images: Simply copy the left image into the
red channel, and copy the right image into both the green and blue channels of a new 24-bit
RGB image. This can be easily done manually using various paint programs; PaintShop Pro
4.x was used for the figures in this paper.

                         Figure 19. Winds over Colorado (3D)
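For readers who prefer a scripted version of this manual procedure, the channel copy can be sketched in Python, here representing images as nested lists of 8-bit values rather than any particular image library:

```python
def grey_anaglyph(left, right):
    """Anaglyph from two greyscale images of the same size, represented as
    nested lists of 0-255 ints: the left image becomes the red channel and
    the right image becomes both the green and blue channels."""
    return [[(l, r, r) for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

# Tiny 2 x 2 example
left  = [[10, 20], [30, 40]]
right = [[50, 60], [70, 80]]
out = grey_anaglyph(left, right)   # out[0][0] == (10, 50, 50)
```

Any paint program or image library that supports channel splitting and merging performs the same operation.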
       For two color images: Convert the images to 24-bit color if not already in this mode.
Separate the red, green and blue component channels of both the left and right color images.
Then form a final new image made up of the red channel from the left image, the green channel
from the right image, and the blue channel from the right image.
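The color-image procedure is the same channel selection, now applied to RGB inputs; again a sketch on nested-list images:

```python
def color_anaglyph(left, right):
    """Anaglyph from two 24-bit color images (nested lists of (r, g, b)
    tuples): red channel from the left image, green and blue channels
    from the right image."""
    return [[(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

left  = [[(200, 10, 10)]]
right = [[(10, 150, 90)]]
out = color_anaglyph(left, right)   # -> [[(200, 150, 90)]]
```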

        For color graphics images or images with large bright blue or red regions: Color
images of line graphics and data plots are particularly troublesome if the data are plotted as red
lines or blue lines. A similar problem occurs if the image has bright blue or red regions. In
these cases, try the following method to partially cross-mix the image pair colors and reduce
saturation. Convert both the left and right images to 24 bit color if not already in this mode.
Separate the red, green and blue component channels of both the left and right color images.
Next, take a copy of the original left image and convert it directly to an 8-bit greyscale image.
Average this 8-bit greyscale left image with the red channel (also now 8-bit) that was stripped
out of the left color image. Place the resulting averaged image into the red channel of the final
output image. Next, average the red and green channels of the right image; and place the
resulting average 8-bit image into the green channel of the final output image. Finally, average
the red and blue channels of the right image; and place the resulting 8-bit image into the blue
channel of the final output image. The output image will have a shifted hue and will be less
saturated than the original image. However, any red or blue lines in the original images should
now show up in the 3-D composite.
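The cross-mix procedure can likewise be sketched in Python. Note the equal-weight greyscale conversion below is a simplification; paint programs typically use perceptual luminance weights:

```python
def crossmix_anaglyph(left, right):
    """Saturation-reducing anaglyph for images with strong red or blue
    content (nested lists of (r, g, b) tuples):
      red   <- average of greyscale(left) and left's red channel
      green <- average of right's red and green channels
      blue  <- average of right's red and blue channels
    """
    def grey(p):                       # simple equal-weight luminance
        return (p[0] + p[1] + p[2]) // 3
    return [[((grey(lp) + lp[0]) // 2,
              (rp[0] + rp[1]) // 2,
              (rp[0] + rp[2]) // 2)
             for lp, rp in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

out = crossmix_anaglyph([[(90, 30, 60)]], [[(100, 40, 20)]])
# out == [[(75, 70, 60)]]
```

Because each output channel is an average of two source channels, pure red or blue lines in the inputs survive into the composite instead of vanishing behind one filter.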




 Figure 20. Cloud cover 3/8, Colorado               Figure 21. Cloud Cover 5/8, Colorado

        Figures 20 and 21 show clouds moving through the Colorado Rockies on this date (30
March 2000). On the 29th a light dusting of snow was deposited and the skies cleared. On the
30th, snow and partial cloud cover persisted throughout much of the area. In these figures, Ft.
Collins is to the right (North) behind the low lying cloud areas. The data plotted are cloud
cover in eighths forecast from the IMETS BFM (3/8ths on the left and 5/8ths on the right) for
approximately noon local time.
        Figure 22 now shows a composite satellite image in the visible band (courtesy of the
UCAR web page) showing mostly clear conditions over the Colorado Rockies on 29 March,
but with areas of snow on the mountains easily discerned. The two combined images are not
simultaneous. Rather, they are 15 minutes apart, with the earlier image associated with the
right eye (blue) and the later image associated with the left eye (red). This choice was made
so that the geopolitical (state) boundary lines could be allowed to line up. As a result, the
surface and clouds appear to be below the effective level of the map boundary. However, if
one looks long enough, then the thin clouds moving in from the southeast can be seen, and
several convective clouds streaming to the northeast off the mountain ridges can also be
picked out.

                      Figure 22. Snow covered Colorado Mtns

        (If the resulting anaglyph images
show the red and blue tinted areas as too
weak to produce a good effect, then try
saturating the color of the entire image. This
was done in this case.)

        Figure 23 is from the CSU/CIRA “Chill project” satellite image web page. It shows
the conditions on 30 March at about 2 hours later than the cloud cover plots of Figs. 20 and
21. Again the image pairs are about 15 minutes apart. In anaglyph format, the image shows
the distinction between the open cloud free areas in the center of the state, and one can make
out somewhat the heights of the clouds above the surface.

                       Figure 23. Slant Path – Colorado, Vis

        On 29 March the weather was also active in the Southeast. A tornado watch was in effect in southwestern Alabama. Figures 24 and 25 (next page) show the region. One can make out strong convective storms in south Louisiana. Both figures are for the same image pairs. Like the previous figures, the image on the left uses coincident state boundary lines, and the cloud formations appear to be below the map boundaries. In Fig. 25, however, we have shifted the images by a number of pixels roughly equal to the distance that interesting cloud features have moved. This co-locates the clouds rather than the boundaries. By associating the image with the leftmost boundary lines with the right eye (blue) and the image with the rightmost boundary lines with the left eye (red), one can now perceive the "map" to be at ground level, with the clouds above it. (Note: it takes a while to see this effect at first. Look for the "map" to appear to be a flat surface a few "inches" into the figure.)

Figure 24. Fixed borders, Alabama.        Figure 25. Shifted borders.
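The pixel-shift variant can be sketched as follows (again our illustration, with a hypothetical function name; the shift dx would be chosen by eye to match the apparent cloud displacement between frames):

```python
import numpy as np

def shifted_anaglyph(right_frame, left_frame, dx):
    """Build a red-blue anaglyph after shifting the left-eye frame
    horizontally by dx pixels (positive = right), so that the tracked
    cloud features, rather than the fixed map boundaries, line up
    between the two frames."""
    shifted = np.roll(left_frame, dx, axis=1)
    rgb = np.zeros(right_frame.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = shifted       # red channel: left eye
    rgb[..., 2] = right_frame   # blue channel: right eye
    return rgb
```

With the clouds co-located, the residual parallax falls on the boundary lines instead, which is why the "map" then reads as a flat surface below the clouds.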

        Figure 25 is also "greener" than its neighbor. This is intentional. Some red-blue glasses are darker than others, making images like Fig. 24 appear quite bluish. Figure 25 has been compensated for that type of glasses by boosting the green component. (The darker blue does have one advantage: it "bleeds" less into the red channel than a hue shifted to emphasize the green.)
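One way to implement the green boost is sketched below. This is an assumption on our part about the mechanism, not the authors' exact procedure: it derives a green component from the blue (right-eye) channel, shifting that channel toward cyan, with a gain value that would be tuned by eye for a particular pair of glasses:

```python
import numpy as np

def greener(rgb, green_gain=0.6):
    """Compensate for darker red-blue glasses, which render pure-blue
    anaglyphs quite bluish, by adding a green component proportional
    to the blue (right-eye) channel.  green_gain is an assumed value,
    tuned by eye."""
    out = rgb.astype(float)
    out[..., 1] = np.clip(out[..., 2] * green_gain, 0, 255)
    return out.astype(np.uint8)
```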

        Finally, Fig. 26 shows a true stereo pair derived from University of Wisconsin images taken from two different satellites at the same time. This is obviously the preferred situation when possible, since it provides a truer three-dimensional perspective. However, co-registering the regions and perspectives from different satellite views is a much more involved process. For that reason, it is less likely that Combat Weather personnel would have the time to capture and manually process image pairs in this way, except perhaps in a pre-scripted mode.
                                     Figure 26. True stereo - 2 satellites at the same time
                        CONCLUSIONS AND RECOMMENDATIONS

         In this paper we have described the current capabilities of the IMETS and Weather
Effects Workstation. The “top down” paradigm of providing weather forecasts from central
hubs down to division level may need to be augmented to support the new Army “full
spectrum” brigade structures, where decision making may be pushed to lower echelons. The
shorter decision cycle and nearer-term planning of operations will also tend to change the types
of weather products to be more “now-cast” oriented. Local weather observations from a
variety of sources may become available, including remote sensing, robotic vehicle weather
sensors, UAV imagery, soldier environmental monitors, etc. Therefore, we have begun looking
at the types of new capabilities that will need to be developed to support these forces. It is clear
that forecast models will need a new ability to ingest non-conventional observations. And, it is
clear that new weather products will be needed that are tailored for use all the way down to the
individual soldier level. In this spirit, we have examined here the use of Weather Features on the ABCS Common Tactical Picture and overlays. We have looked at automating route weather planning and at displaying route weather data in virtual 3-D environments (Vis5D). And we have looked at the potential for using enhanced 2-D images that contain 3-D information.
