TECHNICAL SEMINAR TOPIC

TELE-IMMERSION

P.V. Raja Shekar, 06S11F0031, MCA III Year


CONTENTS

1. INTRODUCTION
   1.1. Virtual Reality
   1.2. The Holographic Environments
2. HISTORY
3. WHAT IS TELE-IMMERSION
   3.1. System Description
4. HOW IT WORKS
5. APPLICATIONS
6. ADVANTAGES
7. DISADVANTAGES
8. FUTURE SCOPE
9. CONCLUSION
10. REFERENCES


INTRODUCTION

Virtual Reality
Virtual reality (VR) is a technology that allows a user to interact with a computer-simulated environment, whether real or imagined. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical and gaming applications. Users can interact with a virtual environment or a virtual artifact (VA) either through standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove, the Polhemus boom arm, and the omnidirectional treadmill. The simulated environment can be similar to the real world, as in simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. In practice, it is currently very difficult to create a high-fidelity virtual reality experience, due largely to technical limitations on processing power, image resolution and communication bandwidth. However, these limitations are expected to be overcome as processor, imaging and data communication technologies become more powerful and cost-effective over time.


Virtual reality is often used to describe a wide variety of applications commonly associated with immersive, highly visual, 3D environments. The development of CAD software, graphics hardware acceleration, head-mounted displays, datagloves and miniaturization have helped popularize the notion. In the book The Metaphysics of Virtual Reality, Michael Heim identifies seven different concepts of virtual reality: simulation, interaction, artificiality, immersion, telepresence, full-body immersion, and network communication. The definition still has a certain futuristic romanticism attached to it, and people often identify VR with head-mounted displays and data suits.


The Holographic Environments
If you've ever watched "Star Trek," you may remember seeing the crew of the Starship Enterprise live out their fantasies in a room called the holodeck. The holodeck was a giant, holographic projection room that allowed the crew to touch and interact with projections as if they were inside a big video game. Scientists today are developing a new communications technology that will allow you and your friends to interact inside a simulated environment even if you are thousands of miles apart. Most of the basic network components needed for tele-immersion are already in place. Tele-immersion is the scientific community's answer to the holodeck.


Formula 1 driver Lewis Hamilton of Great Britain stands next to a 3D hologram during the Reebok launch of their new Smooth Fit technology.

By combining cameras and Internet telephony, videoconferencing has allowed the real-time exchange of more information than ever without physically bringing each person into one central room. Tele-immersion takes videoconferencing to the next level: it will create a central, simulated environment in which everyone can come together in one virtual room without leaving their physical location.


HISTORY

During the early years of research on tele-immersion by the National Tele-Immersion Initiative (NTII) and the University of North Carolina, users were required to wear a head device and special goggles that tracked the user's eye movements, much like virtual reality headgear.

To recreate the environment and its depth, video cameras at the other end tracked movements and captured light patterns to calculate distances in the room. The images were then polarized and divided in order to present a different image to each eye, at a rate of three frames per second. This is somewhat similar to the 3-D glasses used in cinemas.

However, because of the low refresh rate, the image appeared somewhat jerky. The project was nevertheless successful: in May 2000, researchers at the University of North Carolina were able to communicate with researchers from the University of Pennsylvania and Advanced Networks and Services using this technology. The two groups were able to share a room with lifelike, three-dimensional representations of colleagues more than a hundred miles away.


WHAT IS TELE-IMMERSION
Tele-immersion, or the use of holographic environments, is the next step beyond Internet video conferencing. The technology aims to produce a computer-generated central environment in which participants from anywhere in the world can interact as if they were in the same room. It goes far beyond the current combination of telephony and the Internet used in videoconferencing to exchange data.

Tele-immersion will be able to address the line-of-sight limitations of current video conferencing technology. With the use of holographic environments, users will have an unrestricted view of the other person's environment.

Creating a holographic environment requires several pieces of equipment. The first is a computer that recognizes and tracks the movements and presence of objects and people inside a room. This is essential for projecting the same images and movements to the other party onto a stereo-immersive display surface.

The creation of holographic environments may seem a lot like virtual reality, but it is actually quite different: virtual reality allows the user to move and manipulate objects in the virtual environment, while tele-immersion by itself does not. By merging these two technologies, users will be able to manipulate objects in a holographic environment without having to wear virtual reality headgear, making for a very realistic computer-generated environment.
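As a rough illustration of the tracking component just described, the following Python sketch detects the presence and movement of objects or people in front of a single camera using background subtraction in OpenCV. It is only an analogy: the camera index, area threshold, and overall approach are assumptions for illustration, and the real tele-immersion systems described later rely on synchronized multi-camera stereo reconstruction rather than a single webcam.

```python
# Illustrative sketch only (OpenCV 4.x): single-camera movement/presence
# detection via background subtraction. The real tele-immersion rigs use
# synchronized multi-camera stereo clusters instead of this approach.
import cv2

def detect_motion(camera_index=0, min_area=500):
    cap = cv2.VideoCapture(camera_index)              # hypothetical camera index
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                # foreground = moving regions
        mask = cv2.medianBlur(mask, 5)                # suppress sensor noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        moving = [c for c in contours if cv2.contourArea(c) > min_area]
        if moving:
            print(f"{len(moving)} moving region(s) detected")
        if cv2.waitKey(1) == 27:                      # Esc to quit
            break
    cap.release()

if __name__ == "__main__":
    detect_motion()
```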


System Description

The tele-immersion project at Penn has built a 60-camera cluster using IEEE 1394 cameras. Much of the system was built from off-the-shelf products; the mounting hardware, however, had to be custom made in the workshop. Each camera is a Point Grey Research Dragonfly (640x480). The three black-and-white cameras are responsive in the infrared spectrum (CCD: ICX084AL) and have no IR-blocking filter; the color camera, however, blocks IR with an IR-blocking filter.

The Dragonfly cameras have an external-trigger capability. Each camera is always synchronized to the other cameras on the same bus, but this application requires that all cameras grab images simultaneously and remain synchronized. This was achieved using external triggering and Point Grey Research sync units. Each camera unit is connected to a Dell Precision 530 workstation (dual 2.4 GHz Pentium 4 with an integrated 1394 controller). The cameras can all be calibrated together.

These cameras are used for acquiring the entire room in 3D; they can also be used to capture image data sets to disk. All the machines are connected to a gigabit switch, which provides a connection to Abilene. The machines dual-boot Linux and Windows 2000, and the streaming video server runs under Linux. More details on the actual software architecture will be made available soon.
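The hardware-triggered synchronization described above has no exact software equivalent, but the Python sketch below shows the closest software-level analogy: latching a frame on every camera before decoding any of them, so the capture instants stay as close together as possible. The camera indices are placeholder assumptions, and the actual system relies on external triggers and Point Grey sync units rather than anything like this.

```python
# Software-level approximation of simultaneous multi-camera capture (OpenCV).
# The real rig uses hardware external triggers and sync units; here we only
# call grab() on every camera first, then retrieve(), to minimize the time
# spread between capture instants.
import cv2

CAMERA_INDICES = [0, 1, 2, 3]   # placeholder: one four-camera unit

def open_cameras(indices):
    caps = [cv2.VideoCapture(i) for i in indices]
    if not all(c.isOpened() for c in caps):
        raise RuntimeError("not all cameras could be opened")
    return caps

def grab_synchronized_frames(caps):
    # Phase 1: latch a frame on every camera (fast, no decoding).
    for cap in caps:
        cap.grab()
    # Phase 2: decode the latched frames.
    frames = []
    for cap in caps:
        ok, frame = cap.retrieve()
        frames.append(frame if ok else None)
    return frames

if __name__ == "__main__":
    cams = open_cameras(CAMERA_INDICES)
    frames = grab_synchronized_frames(cams)
    print([f.shape for f in frames if f is not None])
    for cam in cams:
        cam.release()
```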


The camera units are mounted on an 80/20 aluminium frame attached to a truss. Lighting for the setup is provided by fluorescent lights from Brightlines. To add more structure to the scene, invisible structured light is projected by five modified Kodak Ektagraphic projectors, which can be mounted on top of the truss.

Figure 1: The four-camera unit, with the color camera on top and the three black-and-white cameras in the bottom row


Figure 2: The truss and the 80/20 frame used to hold the cameras

Figure 3: The machines

Hardware and software technologies needed:
- 3D real-time system for acquisition of dynamic, real objects (UPenn)
- static scene acquisition (UNC)
- rendering and stereo display architecture (UNC)
- high-precision head-tracking system (UNC)
- modeling and manipulation of virtual objects (Brown)
- multi-person interaction and collaborative architecture (Brown)


HOW IT WORKS

The tele-immersion system has 360-degree stereo capturing capability, which allows full-body 3D reconstruction of people and objects. The data is captured in real time and projected into a virtual environment as a point cloud, which can be combined with virtual objects and scenes. The current apparatus includes 48 cameras arranged in 12 stereo clusters. The images from each cluster are processed by a stereo reconstruction program running in parallel on 12 computers. The acquired data can be sent over a gigabit Internet2 connection to another computer to be rendered into a three-dimensional scene, and it can be displayed using passive stereo projection to increase the perception of depth. The system also includes hardware for capturing and playing sound from four microphones and four speakers, and eight infrared lights that project patterns to facilitate depth detection.
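The per-cluster stereo reconstruction step can be illustrated in miniature with OpenCV. The sketch below assumes a single rectified image pair on disk (left.png and right.png are hypothetical filenames) and a reprojection matrix Q that would normally come from stereo calibration (the values shown are made-up placeholders). It computes a disparity map and reprojects it into a 3D point cloud, which is the kind of data each of the 12 clusters contributes to the shared virtual scene; the real system does this in real time with its own software, not this code.

```python
# Minimal sketch of one stereo cluster's job: rectified image pair ->
# disparity map -> 3D point cloud. Filenames and the Q matrix are
# placeholders; the real system runs its own pipeline in real time.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical inputs
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
assert left is not None and right is not None, "rectified pair not found"

# Semi-global block matching on the rectified pair.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,      # must be a multiple of 16
    blockSize=5,
)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Q normally comes from cv2.stereoRectify; placeholder values keep the
# sketch self-contained (assumed 500 px focal length, 0.1 m baseline).
Q = np.array([[1, 0, 0, -320],
              [0, 1, 0, -240],
              [0, 0, 0,  500],
              [0, 0, 1 / 0.1, 0]], dtype=np.float32)

points_3d = cv2.reprojectImageTo3D(disparity, Q)   # H x W x 3 coordinates
valid = disparity > 0                              # keep pixels with a match
point_cloud = points_3d[valid]
print(f"{point_cloud.shape[0]} 3D points recovered")
```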


APPLICATIONS

Applications in high-energy physics, space exploration, environmental hydrology, cosmology, nanotechnology, molecular biology, manufacturing, and chemical engineering are expected to use technologies such as remote instrumentation control, tele-immersion, real-time client-server systems, multimedia, tele-teaching, and digital video, as well as distributed computing and high-throughput, high-priority data transfers.


ADVANTAGES


DISADVANTAGES


FUTURE SCOPE
The STAR TAP project supports high-end collaborations with networking and applications assistance, and provides stable configurations of emerging networking technology, with enough switching capacity for today's needs and the future. The principal contribution of STAR TAP, now being developed, is the design and enabling of a truly integrated approach to the management, performance measurement, scheduling, and consumption of geographically distributed network, computing, storage, and display resources, with a specific focus on advanced computational science and engineering applications.


CONCLUSION

In order to encourage applications, a better level of communication among network engineers, application programmers, and scientists needs to be supported. The nomenclature and styles of networking access and engineering are almost entirely disjoint from the way application programmers and computational scientists write programs and use computers. Security and Acceptable Use Policy (AUP) considerations are typically idiosyncratic at computing sites, yet they need to be uniform, or at least interoperable, for networks to support applications. Most important, support for network engineering, applications programming assistance, and Web documentation for US scientific researchers and their international partners is key to forming long-lasting, productive international research relationships. We encourage NSF CISE's support of applications-driven integration testbeds to stress-test constantly changing network technology.


REFERENCES


				