                                   Imperial College London

                                  Department of Computing




Virtual Natural Orifice Transluminal Endoscopic Surgery Simulator

                                Instrument Modelling
                                              by

                             Przemyslaw Korzeniowski (PK1309)




Submitted in partial fulfilment of the requirements for the MSc Degree in Computing of Imperial
                                          College London




                                       September 2010
2
ABSTRACT


SUPERVISORS:

Dr Fernando Bello
Dr Vincent Luboz




                    3
ACKNOWLEDGEMENTS




              4
CONTENTS
1.     INTRODUCTION .......................................................................................................................................... 8
     1.1      Project overview ................................................................................................................................. 8
     1.2      Aims of the project ............................................................................................................................. 8
     1.3      Previous research ............................................................................................................................... 8
     1.4      Outline of the thesis ........................................................................................................................... 9
2.     MEDICAL BACKGROUND........................................................................................................................... 10
     2.1      Modern surgical procedures ............................................................................................................ 10
     2.2      Endoscope ........................................................................................................................................ 11
     2.3      NOTES vs. traditional laparoscopy.................................................................................................... 12
     2.4      Limitations of NOTES ........................................................................................................................ 13
3.     TECHNICAL BACKGROUND ....................................................................................................................... 14
     3.1      Virtual surgery .................................................................................................................................. 14
     3.2      Soft body modelling.......................................................................................................................... 14
       Heuristic models ....................................................................................................................................... 15
       Continuum mechanics approach .............................................................................................................. 16
       Hybrid approaches ................................................................................................................................... 16
     3.3      Collision detection ............................................................................................................................ 17
       Bounding volume hierarchy ..................................................................................................................... 17
       Distance fields and ray-traced collision detection ................................................................................... 18
     3.4      Simulation frameworks .................................................................................................................... 18
4.     DESIGN...................................................................................................................................................... 19
     4.1      Tools ................................................................................................................................................. 19
       Visualization Toolkit (VTK) ........................................................................................................................ 19
       OpenGL ..................................................................................................................................................... 19
       GLSL .......................................................................................................................................................... 19
       Fast Light Toolkit (FLTK) ............................................................................................................................ 20
       Microsoft Visual Studio 2008 (MSVS9) ..................................................................................................... 20
       Microsoft DirectX...................................................................................................................................... 20
       Autodesk 3ds max 2009 ........................................................................................................................... 20
       SOFA ......................................................................................................................................................... 20
     4.2      Functional design.............................................................................................................................. 21
                                                                                   5
     4.3        Object oriented design ..................................................................................................................... 23
        Auxiliary classes ........................................................................................................................................ 23
        Instruments .............................................................................................................................................. 24
        Collision detection .................................................................................................................................... 25
        System ...................................................................................................................................................... 26
        GUI and VTK .............................................................................................................................................. 27
        User input/output .................................................................................................................................... 27
        Simulation loop......................................................................................................................................... 28
        Project timetable ...................................................................................................................................... 29
5.      IMPLEMENTATION.................................................................................................................................... 30
     5.1        Instruments model ........................................................................................................................... 30
     5.2        Instruments manipulation ................................................................................................................ 31
     5.3        Shaft translation ............................................................................................................................... 31
     5.4        Shaft rotation ................................................................................................................................... 34
     5.5        Collision detection ............................................................................................................................ 37
     5.6        External force ................................................................................................................................... 38
     5.7        Spring force ...................................................................................................................................... 40
     5.8        2D Bending force .............................................................................................................................. 41
     5.9        3D Bending force .............................................................................................................................. 43
     5.10       Force propagation ............................................................................................................................ 46
     5.11       Force feedback? ............................................................................................................................... 49
     5.12       Actuators .......................................................................................................................................... 51
     5.13       Realistic per-pixel lighting ............................................................................................................... 53
6.      INTEGRATION ........................................................................................................................................... 53
7.      EVALUATION ............................................................................................................................................. 54
8.      CONCLUSION ............................................................................................................................................ 54
     8.1        Achievements ................................................................................................................................... 54
     8.2        Future work ...................................................................................................................................... 54
USER MANUAL .................................................................................................................................................. 55
APPENDIX ......................................................................................................................................................... 57
     Coding practices ........................................................................................................................................... 57
     Abbreviations ............................................................................................................................................... 57
     List of figures ................................................................................................................................................ 58
                                                                                   6
   List of code snippets ..................................................................................................................................... 59
BIBLIOGRAPHY .................................................................................................................................................. 60




                                                                                7
1. INTRODUCTION

This chapter presents a broad overview of the project. It summarizes the aims of the project and the
motivation behind it. The chapter ends with a description of how the thesis is structured, together with a
brief description of each of the following chapters.



1.1 Project overview

Natural Orifice Transluminal Endoscopic Surgery (NOTES)(1)(2) is an emerging experimental surgical
technique that eliminates external abdominal incisions. It is considered (3)(4)(5)(6) to be the next stage in
the development of Minimal Access Surgery. The foundations of NOTES were laid by recent improvements in
flexible endoscopy. Because this technique is at an early stage, very few efficient training programs are
available for clinicians. Computer simulation of a flexible endoscope would contribute to surgical training
without putting patients at risk, popularizing NOTES and keeping practitioners up to date with new methods.



1.2 Aims of the project

The aim of the project is to create a virtual NOTES simulator. The main focus is on modelling the next-generation
flexible endoscope used in NOTES. This instrument is composed of a flexible shaft in which a camera, a light
source, and two smaller actuators used for grasping and poking the tissues during the intervention are
embedded. Modelling this complex endoscope includes a mathematical model, as well as graphical
representation, control and haptic feedback. If time allows, the project will be interfaced with the MSc
project of Mr Danial Sheikh, who will explore tissue modelling using the SOFA simulation framework.



1.3 Previous research

There are a number of commercial surgical simulators (7)(8)(9) available for traditional laparoscopy training.
Most of them come bundled with specialized haptic devices, which imitate the behaviour of rigid
endoscopes. To my knowledge there is no commercial or free simulator aimed specifically at NOTES and
flexible endoscopes.

The project will be a modification of a prior MSc project supervised by Dr Fernando Bello - “Virtual
Catheterisation Simulator (VCSim)” by Mr Rafal Blazewski (10). The mass-spring model used by Mr Blazewski
for catheter and guidewire simulation seems a promising entry point for endoscope modelling. The collision
detection algorithm based on Axis Aligned Bounding Box (AABB) trees can also be taken into consideration.
Because of the differences between a flexible instrument such as a catheter and a more rigid instrument
such as the NOTES endoscope, several modifications will need to be implemented. Furthermore, new small
actuators and their interactions with the surrounding tissues and instruments will be added.

                                                      8
1.4 Outline of the thesis

The report consists of eight main chapters followed by the user's manual, appendix, bibliography and lists of
figures and code snippets. The chapters are organized as follows:

Chapter 1, this chapter, presents a broad overview of the project. It summarizes the aims of the project and
the motivation behind it, and ends with a description of how the thesis is structured.

Chapter 2 provides background information on NOTES as a part of minimally invasive surgery. It compares
NOTES with modern surgical procedures and describes endoscopes. It ends with a section devoted to the
limitations of NOTES.

Chapter 3 focuses on the technical background of the project, including research on virtual surgery, soft-body
simulation and collision detection. The chapter ends with a section describing simulation frameworks.

Chapter 4 describes the concept design of the simulator and provides its detailed functional and
object-oriented design. The chosen technical toolkits are also explained.

Chapter 5 provides the implementation details of the simulator. It starts with the analysis of how the shaft
and the actuators are modelled and how they interact with the organ walls.

Chapter 6 describes the integration of the instrument model with the SOFA simulation framework and the tissue model.

Chapter 7 presents the evaluation of the flexible endoscope simulation as well as the performance of the
simulator. Further, feedback from clinicians is given.

Chapter 8 concludes the whole project, summarising all achievements, discussing its results, and giving
recommendations for future work.

The appendix contains information peripheral to the main body of the report, such as the user's manual and
the coding practices used during implementation, as well as lists of abbreviations, figures and code snippets.




                                                     9
2. MEDICAL BACKGROUND

This chapter provides background information on NOTES as a part of minimally invasive surgery. It compares
NOTES with modern surgical procedures and describes endoscopes. It ends with a section devoted to the
limitations of NOTES.

2.1 Modern surgical procedures

Historically, abdominal access required a laparotomy, a procedure involving an incision through the
abdominal wall to gain access to the abdominal cavity (Figure 1). The laparotomy caused incision-related
complications such as pain, infections, hernias or adhesions. Currently, in Minimal Access Surgery (MAS),
operations within the abdominal cavity are performed through small portals (laparoscopic surgery), which are
usually 0.5 – 1.5 cm long. Smaller incisions not only decrease hospital time, postoperative pain, morbidity
and mortality, but also improve the overall cosmetic result. Over the last twenty years MAS has become the
standard for most abdominal surgery. The foundations of MAS were laid by more than two centuries
of development in the field of endoscopy.

Recently, surgeons have experimentally managed to perform abdominal surgeries without any external incisions.
This emerging technique, called Natural Orifice Transluminal Endoscopic Surgery (NOTES)(1)(2), allows
surgeons to gain access to the abdomen through natural orifices such as the mouth, anus or vagina, and then
through an internal incision in the stomach, bladder or colon. This so-called “scarless” procedure eliminates all
problems related to postoperative wounds. It is considered (3)(4)(5)(6) to be the next stage in MAS
development. The first incisionless operation on a human was carried out in April 2007 at the University Hospital
of Strasbourg by Professor Jacques Marescaux and his team (Operation “Anubis”).




      Figure 1: Cholecystectomy (gallbladder removal) using laparotomy (right) and laparoscopic (left) procedures to access abdomen




                                                                  10
2.2 Endoscope

The key elements of laparoscopic surgeries are endoscopes (laparoscopes in the case of abdominal surgeries).
Endoscopes evolved from candle-powered rigid pipes in the 19th century, via electric light bulbs and optical
fibres, to sophisticated devices with high-resolution CCD cameras and LEDs. Endoscopes can be divided into two
types – those with a rigid tube and those with a flexible one. Flexible endoscopes (Figure 2), as opposed to rigid
ones, can be inserted through natural orifices and can reach larger parts of the body, often inaccessible to rigid
endoscopes. However, rigid rod-lens endoscopes offer better performance compared to flexible ones, and are
hence preferred by most surgeons.

A contemporary endoscope not only gives a clear view of the inside of the body using a light source and a
camera or lenses. It can also be equipped with working channels which enable carrying additional instruments
such as cutters, graspers, staplers, biopsy devices, lasers etc. Because of the limited space within the endoscope,
these devices must be thin, long and flexible, which proves challenging for engineers. Figure 2
illustrates how complex the inside of an endoscope can be.




                             Figure 2: Flexible endoscope and cut through the endoscope shaft




                                                           11
2.3 NOTES vs. traditional laparoscopy

During laparoscopy, surgeons create multiple ports in the patient's body using trocars (sharp-ended cylinders).
The required instruments are inserted via these ports. Usually they consist of a rigid endoscope for
visualization and a set of rigid graspers, scissors, staplers etc. for tissue manipulation. In NOTES, surgeons
insert a single endoscope equipped with the required instruments via a natural orifice and make an internal
incision (the viscerotomy site) in the stomach wall, bladder, colon or vagina to pass the instruments to the point
of interest. This is shown in Figure 3. After the operation is finished, the instruments are removed and the internal
incision must be securely closed (11) in order to avoid a leakage which could lead to haemorrhage or
infection.




     Figure 3: Gallbladder removal through the mouth using NOTES (left) and a gallbladder viewed from the laparoscope's camera (right)




                                                                   12
2.4 Limitations of NOTES

Although NOTES is gaining popularity among surgeons and patients around the globe, it is still an emerging
and experimental technique, which struggles with several limitations. The Natural Orifice Surgery
Consortium for Assessment and Research (NOSCAR) established a list of potential barriers (12) which need
to be overcome before NOTES can be incorporated into routine practice. Specialists highlight difficulties with
locating optimal viscerotomy sites and with their creation and closure techniques. Since natural orifices are not
sterile, methods of bacteria reduction require further research.

In addition, surgical instruments in NOTES are inserted through the working channels of the endoscope. As a
result, instrument movement depends on endoscope movement, which significantly constrains articulation
possibilities. Working channels also limit the size of the instruments. Using one endoscope prevents multi-planar
triangulation with separate optics or other instruments. Figure 2 depicts how complicated the
instruments can be and Figure 4 shows different prototype devices.

Furthermore, an important point is the need for better training methods. Currently, a percentage of surgical
training programs is carried out on mannequins, cadavers or animals, which raises ethical issues about
animal rights and is expensive. Moreover, the biomechanical properties of mannequins, cadavers or animals
(e.g. swine organs) are not equivalent to those of live human tissue. Fortunately, thanks to technological
progress, some surgical procedure training can be simulated in virtual reality (13).




  Figure 4: Different prototypes of NOTES endoscopes. On the left, the Cobra device by USGI Medical. In the middle, a concept device by Olympus
                Medical Systems. On the right, a robotic system for NOTES proposed by Nanyang Technological University in Singapore.




                                                                    13
3. TECHNICAL BACKGROUND

This chapter focuses on the technical background of the project, including research on virtual surgery, soft-body
simulation and collision detection. The chapter ends with a section describing simulation frameworks.

3.1 Virtual surgery

Simulation has been available for training airplane pilots for the past four decades. It is now highly realistic
and pilots can develop their skills without even sitting in a real plane. As a consequence, the
improvements in training safety, cost savings, equipment and time are immense. Medical virtual reality (VR)
simulators have been expected since the 1990s (14) to become as important for surgery as flight simulators
are for aviation. The most recent reviews (15) and validation studies (16) show that, although the expected
progress in the field has been made and VR simulation is successfully used in many areas of surgery, ranging
from training to planning, navigation and surgical assistance, there is still an enormous potential to be
developed. Minimal access surgery is one of the main applications of simulation. Currently, there are several
virtual surgery simulators commercially available on the market. Some of them are entirely software based,
allowing a highly detailed view of the anatomy (7), superseding textbooks or multimedia presentations. Others
are fully interactive and come with additional haptic devices, ranging from three-degree-of-freedom arm-like
devices to instruments perfectly imitating the real medical device, e.g. laparoscopes inserted into the
human body (7)(8)(9). The main field of application of such software-hardware bundles is training and
education. Simulation offers easy implementation of different training scenarios, e.g. unpredictable
complications, which can be replayed countless times. Simulation software can also record performance,
which may be used for assessment and credentialing. Moreover, a tutor's presence is not necessary; students
can train on their own, whenever they want. Experts can also benefit from using simulators to maintain
their skills or to explore new ways of performing a surgery. Finally, some simulators can read patient-specific
data obtained from medical images (e.g. CT or MRI) and help to plan a surgery to avoid potential
complications.

Virtual surgery is a combination of visualization, simulation and haptic force-feedback rendering.
Visualization, thanks to advances in computer graphics, is nowadays no longer an issue. Even low-end,
modern computers can handle real-time display of medical data in high definition or even 3D. Haptic
rendering is also a well-known field: engineers are able to translate the behaviour of surgical tools into
numbers which can be fed into a simulator to reproduce the interactions with the user. Unfortunately, the same
cannot be said about the simulation part. Due to their anisotropic and viscoelastic behaviour, tissues are complex
objects to model and simulate. Moreover, for simulation purposes, tissues must interact with various
surgical instruments, which can be rigid or deformable. These interactions include not only deformations
but also incision, tearing, puncturing, suturing etc. Surgical instruments used in NOTES present a wide range
of interactions and are therefore complex to model, especially in real time.



3.2 Soft body modelling


                                                      14
A considerable amount of effort has been put into research on modelling soft-body physics. Although many
deformable models have been proposed over the last two decades, all of them compromise between real-time
interactivity and (bio)mechanical realism. An excellent survey (17) by Meier et al. from 2004 classifies
and compares known solutions for real-time deformable models used in medical simulations. Meier states
that in these models, due to high computational complexity, interactivity is set above realism. Meier divided
deformable models into three groups: (1) those based on heuristics, (2) those based on simplified continuum
mechanics, and (3) hybrids of the two prior approaches.

Heuristic models
The most representative model of the heuristic approach is the mass-spring model. It is a widespread
deformable model in the simulation field, not only in medical simulation. It is based on the simplification
that the virtual object and its haptic rendering are limited to the surface of the object. Mass-spring models
use a set of mass points connected by springs, which is “stretched over” the surface of an object. Mass
points and springs follow Newton's laws of motion. To simulate the deformation of the system, a set of
differential equations needs to be solved. This is done through a time discretization, usually using backward
Euler or Runge-Kutta iterative methods for the approximation of solutions of ordinary differential
equations.
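
To make the mass-spring idea concrete, the following is a minimal sketch of a single time step, here using
semi-implicit Euler for brevity rather than the backward Euler or Runge-Kutta methods mentioned above; the
structures and names are illustrative and not taken from the project's code.

  #include <cmath>
  #include <vector>

  struct Vec3 {
      float x, y, z;
      Vec3(float a = 0, float b = 0, float c = 0) : x(a), y(b), z(c) {}
      Vec3 operator+(const Vec3& o) const { return Vec3(x + o.x, y + o.y, z + o.z); }
      Vec3 operator-(const Vec3& o) const { return Vec3(x - o.x, y - o.y, z - o.z); }
      Vec3 operator*(float s)       const { return Vec3(x * s, y * s, z * s); }
      float length() const { return std::sqrt(x * x + y * y + z * z); }
  };

  struct Particle { Vec3 position, velocity, force; float mass; };
  struct Spring   { int a, b; float restLength, kS, kD; };   // stiffness kS, damping kD

  // One semi-implicit Euler step of a mass-spring system.
  void StepMassSpring(std::vector<Particle>& p, const std::vector<Spring>& springs, float dt)
  {
      for (size_t i = 0; i < p.size(); ++i) p[i].force = Vec3();          // clear accumulated forces

      for (size_t s = 0; s < springs.size(); ++s) {
          Particle& pa = p[springs[s].a];
          Particle& pb = p[springs[s].b];
          Vec3 d = pb.position - pa.position;
          float len = d.length();
          if (len < 1e-6f) continue;
          Vec3 dir = d * (1.0f / len);

          // Hooke's law plus a damping term along the spring direction.
          float stretch = len - springs[s].restLength;
          Vec3 relVel = pb.velocity - pa.velocity;
          float damping = (relVel.x * dir.x + relVel.y * dir.y + relVel.z * dir.z) * springs[s].kD;
          Vec3 f = dir * (springs[s].kS * stretch + damping);

          pa.force = pa.force + f;       // equal and opposite forces on the two spring ends
          pb.force = pb.force - f;
      }

      for (size_t i = 0; i < p.size(); ++i) {                             // integrate velocities and positions
          Vec3 accel = p[i].force * (1.0f / p[i].mass);
          p[i].velocity = p[i].velocity + accel * dt;
          p[i].position = p[i].position + p[i].velocity * dt;
      }
  }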

The mass-spring model has many advantages. It is relatively simple to implement and cooperates well with the
graphical representation, i.e. it is easy to render. It reflects large deformations with satisfying realism and
allows changes in mesh topology, such as incisions, which is very important in surgical simulation. The main
drawback of this model is related to its hollow nature and the fact that the displacement of a mass point
is distributed only to neighbouring points within each iteration step. This causes noticeable delays in
deformation propagation and in volume preservation, and consequently in realism. The effect can be minimized
by introducing additional springs, which link arbitrarily placed points inside the volume and speed up force
propagation. However, this solution complicates the system of equations and causes problems when
modifying the topology. Another disadvantage of the mass-spring model is its instability. It can indeed be
long and tedious to find the adequate properties needed to tune the springs in order to stabilize their
deformation and relaxation and to avoid unwanted behaviour. Despite these disadvantages, efficient and
scalable applications in tissue simulation have been presented (18). A mass-spring model was also successfully
used for guidewire modelling by Luboz et al. (19).

In order to improve the volumetric behaviour of heuristic models, a complete discretization of the volume
has been proposed (20). The linked-volume model uses this approach and can be seen as a volumetric
extension of the mass-spring idea. The mass points are evenly distributed inside the volume in addition to
its surface. Thanks to the increased connectivity, which obviously increases the computational cost, this
technique maintains volume better than the basic mass-spring model. On the other hand, the increased
number of mass points results in slower deformation propagation.

Another interesting approach is to link elements not with springs but with rigid links (20)(21) (such as the links
of a chain), which allows some limited movement of a link without influencing its neighbours. The chain-like
structure of the model guarantees an approximately constant volume and quick propagation of deformations.
Topological modifications, as in the linked-volume model, can be easily introduced. Unfortunately, the

                                                     15
calculation of the many contact points between the links requires an excessive amount of computation, which
has hindered the popularization of this model.

Continuum mechanics approach
The second group of deformable models is based on the continuum mechanics approach. The approach can be
simplified by assuming a linearly elastic material, slow deformations and neglecting small internal forces.
However, the obtained differential equations (the second-order Navier equation) cannot be solved analytically
and numerical methods are required. The best known of these is the finite element method (FEM).

FEM tries to model the mechanical properties (stress-strain relations) of an object. In FEM, the entire volume of
a deformable object is discretized into a finite number of elements, usually tetrahedra. Thereafter, the
deformation of the entire object is obtained by displacing the nodes of every element. FEM offers more realistic
results than heuristic methods do, but it is computationally expensive and not suitable for real-time
applications. Although intensive research efforts have been undertaken to accelerate it, the resulting methods
are currently not suitable for the real-time NOTES simulator aimed at in this project.

Hybrid approaches
The deformation techniques described above are not particularly suitable for modelling thin elastic objects
such as wires, sutures, catheters, endoscopes etc. In order to realistically represent properties such as the
twisting of thin objects, the mass-spring model requires a large number of mass points and springs, while FEM
requires very fine meshes. Possible solutions (22)(23)(24) based on the Cosserat model have recently been
presented. A Cosserat rod can model phenomena like bending, torsion, extension and shearing. Its solution
requires solving ordinary differential equations, hence it is suitable for real-time interactions. Currently, it
is widely used for modelling anything from telephone cables to DNA helices.

Several other approaches try to take the best of both techniques described in the previous sections to offer
the best balance between real-time performance and accuracy. Because they will not be used in this project, they
are not described here. Nevertheless, (17) gives an extensive review of such methods.




                                                        16
3.3 Collision detection

The basic brute-force approach to finding collisions is to check every element of an object against every
element of every other object in the scene, including the object itself. This needs to be repeated at every time
step. To speed up this process a number of algorithms (25)(26) have been developed.
Collision detection algorithms can be classified in terms of a broad phase and a narrow phase. During the
broad phase, potentially colliding objects are identified. This usually includes spatial
partitioning using binary space partitioning (BSP) or octrees and/or comparing bounding volumes of objects
by exhaustive search or by more sophisticated methods like sweep and prune or hierarchical hash tables. The
narrow phase eliminates unnecessary collision detection checks and determines the exact collisions. In this
phase, more detailed information about the occurring collision can be computed, e.g. penetration depth or force
feedback. In comparison to well-developed rigid body collision detection, deformable body collision
detection must take more factors into consideration. Self-collisions must be checked for, which is
neglected in rigid body algorithms. Pre-processing of spatial data structures is heavily used to accelerate
rigid body algorithms; for deformable objects these structures need to be efficiently updated at every
iteration. Moreover, more precise information such as penetration depth is necessary in order to respond
correctly to the deformations.

Bounding volume hierarchy
Although significantly different collision detection approaches for deformable bodies exist, such as
feature-based (FB), simplex-based (SB), image-space-based (ISB), depth fields or bounding volume
hierarchies (BVH), the last is especially frequently used in dynamics and deformation simulation. Contrary to
the others (with the exception of depth fields), BVH can handle collisions of open objects, which very often
occur during surgery.

A bounding volume (BV) is a closed geometry that completely contains a whole object or part of it (Figure 5).
The volume is usually a sphere, a convex hull, an oriented bounding box (OBB) or an axis-aligned bounding box
(AABB), which is a special case of the discrete oriented polytopes (k-DOPs) (27). During pre-processing, the
geometry partition of an object is assigned to a hierarchical data structure (usually a binary tree, quad tree or
octree) using various assumptions. The nodes of the tree correspond to bounding volumes and the actual
geometry is embedded in the leaves, i.e. nodes which do not have any children. Collisions are detected by
traversing this tree from top to bottom. When the leaf-leaf level is reached, the embedded geometry is
processed using the standard brute-force approach. Because only a small amount of surface information is
stored in each leaf, the brute-force pass does not slow down the collision search. For deformable objects, the
main goal is to create algorithms which can efficiently refit the bounding volumes after the deformations take
place (26).

Figure 5: An example mesh bounded using axis aligned bounding boxes
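
As an illustration of the top-down traversal described above, the sketch below tests a bounding sphere (for
example, one instrument particle) against an AABB tree and collects the candidate triangles; the structures
are simplified assumptions and not the project's actual classes.

  #include <vector>

  struct AABB   { float min[3], max[3]; };
  struct Sphere { float centre[3]; float radius; };

  struct BVHNode {
      AABB box;
      BVHNode* left;               // internal nodes store children...
      BVHNode* right;
      std::vector<int> triangles;  // ...leaves store indices of the enclosed triangles
  };

  // Squared distance from the sphere centre to the box, clamped per axis.
  bool SphereIntersectsAABB(const Sphere& s, const AABB& b)
  {
      float d2 = 0.0f;
      for (int i = 0; i < 3; ++i) {
          float v = s.centre[i];
          if (v < b.min[i]) d2 += (b.min[i] - v) * (b.min[i] - v);
          if (v > b.max[i]) d2 += (v - b.max[i]) * (v - b.max[i]);
      }
      return d2 <= s.radius * s.radius;
  }

  // Walk the tree top-down; exact sphere-triangle tests are then run only on
  // the triangles collected from the reached leaves.
  void Traverse(const BVHNode* node, const Sphere& s, std::vector<int>& candidates)
  {
      if (node == 0 || !SphereIntersectsAABB(s, node->box)) return;
      if (node->left == 0 && node->right == 0) {                          // leaf node
          candidates.insert(candidates.end(), node->triangles.begin(), node->triangles.end());
          return;
      }
      Traverse(node->left, s, candidates);
      Traverse(node->right, s, candidates);
  }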




                                                      17
                                      Figure 6: Different types of bounding volumes




Distance fields and ray-traced collision detection
Other algorithms which can be applied to deformable bodies include distance fields (28). Each point of one
object is tested against the distance field of the other. If the distance between a point and the other object
is smaller than a certain threshold, then a collision is detected. Distance fields allow deep intersections, but
building them is computationally intensive. This is the reason why their computation is done during a
pre-processing phase. For deformable bodies, distance fields need to be recalculated at every time step, which
limits real-time applications.
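
A minimal sketch of the point-versus-distance-field test described above, assuming a pre-computed regular
grid of distances; the grid layout and names are illustrative.

  #include <cmath>
  #include <vector>

  struct DistanceField
  {
      int nx, ny, nz;              // grid resolution
      float spacing;               // cell size
      float origin[3];             // world position of the grid corner (0,0,0)
      std::vector<float> d;        // pre-computed (signed) distances, size nx*ny*nz

      float Sample(const float p[3]) const
      {
          int i = (int)std::floor((p[0] - origin[0]) / spacing);
          int j = (int)std::floor((p[1] - origin[1]) / spacing);
          int k = (int)std::floor((p[2] - origin[2]) / spacing);
          if (i < 0 || j < 0 || k < 0 || i >= nx || j >= ny || k >= nz)
              return 1e9f;                          // outside the field: treat as far away
          return d[(k * ny + j) * nx + i];          // nearest-cell lookup (no interpolation)
      }
  };

  // A vertex collides when its sampled distance falls below a threshold.
  bool Collides(const DistanceField& field, const float vertex[3], float threshold)
  {
      return field.Sample(vertex) < threshold;
  }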

An interesting solution that does not depend on pre-computation has been presented in (29). The
algorithm, called ray-traced collision detection, is based on shooting rays from the vertices along the opposite of
their normal vectors. When a ray intersects the inward-facing surface of another body, a collision is detected. The
authors have shown that this novel approach, especially in the case of deformable objects, is faster and more
robust, therefore allowing bigger time steps.



3.4 Simulation frameworks

As stated in the previous sections, medical simulation is a complex multi-disciplinary field which requires tight
integration between visualization, mathematical modelling, collision detection and haptic feedback. In order
to allow researchers to focus on developing new, cutting-edge algorithms and to avoid “reinventing the wheel”,
the need for suitable tools has appeared.

Over recent years several open source simulation frameworks have been developed (VRASS, GIPSI,
SPRING). One of the most recent, and one constantly gaining acclaim, is SOFA (http://www.sofa-framework.org)
(30), created by research groups from CIMIT and INRIA. The SOFA architecture relies on multi-model
representations (visualization, deformation, collision) of an object. These representations are mapped
together and organized in a scene graph. SOFA offers a wide range of built-in algorithms, the possibility to add
new ones, and ways to combine them all together. The parameters of a simulation can be easily edited and
different setups compared. Moreover, SOFA is designed with parallelization in mind, to make the most of modern
multi-core architectures and GPUs.




                                                          18
4. DESIGN

This chapter describes the concept design of the simulator and provides its detailed functional and
object-oriented design. The chosen technical toolkits are also explained.

4.1 Tools

Visualization Toolkit (VTK)
VTK is an open source, freely available software system for 3D computer graphics, image processing, and
visualization. VTK consists of a C++ class library and several interpreted interface layers including Tcl/Tk,
Java, and Python. Among the advantages of VTK are its support for a wide variety of visualization algorithms,
including scalar, vector, tensor, texture, and volumetric methods, and for advanced modelling techniques such
as implicit modelling, polygon reduction, mesh smoothing, cutting, contouring, and Delaunay triangulation. It
comes with several readers/importers and writers/exporters to exchange data with other applications.
Hundreds of data processing filters are available to operate on these data, ranging from image convolution to
Delaunay triangulation. VTK is independent of any windowing system and its graphics model forms an abstract
layer above OpenGL to ensure cross-platform portability.

In this project VTK is used to read organ data into the pipeline, draw the particles of the instruments as 3D
tubes, visualize helper objects such as force vectors, provide camera control and define complex materials
for the meshes.
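
As an example, a minimal VTK pipeline of the kind used here could render a chain of particle positions as a 3D
tube; the positions and parameters below are placeholders and the project's actual pipeline is more elaborate.

  #include <vtkSmartPointer.h>
  #include <vtkPoints.h>
  #include <vtkCellArray.h>
  #include <vtkPolyData.h>
  #include <vtkTubeFilter.h>
  #include <vtkPolyDataMapper.h>
  #include <vtkActor.h>
  #include <vtkRenderer.h>
  #include <vtkRenderWindow.h>
  #include <vtkRenderWindowInteractor.h>

  int main()
  {
      // A polyline through five dummy particle positions.
      vtkSmartPointer<vtkPoints>    points = vtkSmartPointer<vtkPoints>::New();
      vtkSmartPointer<vtkCellArray> lines  = vtkSmartPointer<vtkCellArray>::New();
      lines->InsertNextCell(5);
      for (int i = 0; i < 5; ++i)
      {
          points->InsertNextPoint(0.0, 0.0, i * 1.0);
          lines->InsertCellPoint(i);
      }

      vtkSmartPointer<vtkPolyData> polyData = vtkSmartPointer<vtkPolyData>::New();
      polyData->SetPoints(points);
      polyData->SetLines(lines);

      // Sweep a tube around the polyline to give the shaft a solid 3D appearance.
      vtkSmartPointer<vtkTubeFilter> tube = vtkSmartPointer<vtkTubeFilter>::New();
      tube->SetInput(polyData);            // VTK 5.x style (project era); newer VTK uses SetInputData
      tube->SetRadius(0.2);
      tube->SetNumberOfSides(16);

      vtkSmartPointer<vtkPolyDataMapper> mapper = vtkSmartPointer<vtkPolyDataMapper>::New();
      mapper->SetInputConnection(tube->GetOutputPort());

      vtkSmartPointer<vtkActor> actor = vtkSmartPointer<vtkActor>::New();
      actor->SetMapper(mapper);

      vtkSmartPointer<vtkRenderer> renderer = vtkSmartPointer<vtkRenderer>::New();
      renderer->AddActor(actor);

      vtkSmartPointer<vtkRenderWindow> window = vtkSmartPointer<vtkRenderWindow>::New();
      window->AddRenderer(renderer);

      vtkSmartPointer<vtkRenderWindowInteractor> interactor =
          vtkSmartPointer<vtkRenderWindowInteractor>::New();
      interactor->SetRenderWindow(window);

      window->Render();
      interactor->Start();
      return 0;
  }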

OpenGL
OpenGL (Open Graphics Library) is a standard specification defining a cross-language, cross-platform API for
writing applications that produce 2D and 3D computer graphics. The interface consists of over 250 different
function calls which can be used to draw complex three-dimensional scenes from simple primitives. OpenGL
was developed by Silicon Graphics Inc. and is currently managed by a non-profit technology consortium, the
Khronos Group. Although OpenGL is used implicitly, as the rendering layer of VTK, a strong understanding of
its mechanisms is crucial to developing robust applications in VTK.

GLSL
GLSL (OpenGL Shading Language) is a high level shading language based on the C programming language. It
was created to give developers more direct control of the graphics pipeline without having to use assembly
language or hardware-specific languages. In this project GLSL is used to implement realistic per-pixel
lightning.




                                                     19
Fast Light Toolkit (FLTK)
FLTK is a cross-platform C++ GUI toolkit which provides modern GUI functionality without the bloat and
supports 3D graphics via OpenGL and its built-in GLUT emulation.

Microsoft Visual Studio 2008 (MSVS9)
Visual Studio was used as the integrated development environment (IDE). It includes a code editor
supporting auto-completion and an advanced debugger. It was extended with plug-ins for version control and
profiling.

Microsoft DirectX
DirectX is an API for handling tasks related to multimedia, especially graphics, sound and input. In the
project only DirectInput was used to collect input from a game controller. DirectX is supported only on
Microsoft platforms.

Autodesk 3ds max 2009
3ds max is a 3D modelling, animation and rendering package. It was used to create the 3D polygonal models of
the grasper and the poker, as well as test meshes of internal organs.

SOFA
SOFA is an open source framework made primarily for real-time simulation using different solvers (e.g.
conjugate gradient, Euler, Runge-Kutta) and deformation models (e.g. FEM, mass-spring, tensor-mass). It is well
suited to medical surgery simulation. The algorithms within the library are general, advanced graphics and
visualization techniques. They can be interfaced within a simulator to solve complex physical behaviour and
display the interactions between the components of the simulation, e.g. a medical instrument and the organ to
be treated. SOFA is hierarchical and its XML scene graph is easy to use: a root node is created to contain the
mesh/object, and different topologies and elements (e.g. solvers, deformable models, collision detection
models, force fields, mass) are added to it. The visual “skin” is applied at the outer level, on top of the mesh.




                                                    20
4.2 Functional design

The main functionality of the simulator is to enable the user to manipulate the shaft and actuators (Figure 7) of
the endoscope in order to navigate inside the organs of the patient. This includes translating the shaft
forward/backward and rotating it at the insertion point along its axis. The tip of the endoscope must be
bendable to match real endoscopes. The three actuators can be protruded from the shaft tip and can be
rotated along two axes. The whole body of an actuator bends. One of the actuators consists of a camera
and a light. The view angle of the camera can be changed (zoom in/out), and the user should also be able to
control the amount of light emitted. For the two other actuators, the project assumes that the endoscope can
be equipped with three different types: grasper, cutter, or poker. The first type ends with two jaws. When the
jaws are opened or closed while intersecting with an object, the object should be grabbed or
released. The second type is similar to the first one, but instead of grasping it cuts through the object. The
poking actuator is equipped with a cone-like tip, which allows poking and deforming the object.




                                Figure 7: Virtual NOTES Endoscope - the shaft with actuators




When the simulator starts, default settings should load a predefined organ model to interact with, for
example a surface representing the stomach. The user should also be able to load patient-specific data
obtained from medical imaging devices (CT or MRI) to interact with, provided the data were appropriately
pre-processed beforehand (segmented and a surface generated). If time allows and the integration with SOFA is
successful, the user will be able to define a whole complex system of organs which interact with each other
and with the endoscope, enabling poking, grasping and cutting of their tissues. Figure 8 depicts the UML use
case diagram.




                                                            21
Figure 8: UML use case diagram (DRAFT)




                22
4.3 Object oriented design

The overall object oriented design must take into consideration not only the current functional
requirements, but must also take advantage of the VCSim project, i.e. reuse, where possible, its
architecture and/or components. Moreover, to allow further integration with SOFA or other simulation
frameworks, it requires a clear separation between the simulation layer, the graphical representation and the
user interface. Additionally, it needs to provide support for input from mouse, keyboard,
joysticks/joypads and, in the future, haptic devices.

Auxiliary classes
Every 3D application requires a standard set of auxiliary classes which represent points, primitives, models
and transformations.

The CVec3f class defines three floating point variables which represent a point or a vector in Cartesian
space. It contains a number of mathematical functions such as addition, subtraction, multiplication,
normalization and length calculation, as well as dot and cross products. Most of them are implemented as
C/C++ operators. Additionally, there are functions for array-style access to the members and an equality
check.
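
A sketch of what such a class could look like; the actual CVec3f in the project may differ in naming and detail.

  #include <cmath>

  class CVec3f
  {
  public:
      CVec3f(float x = 0, float y = 0, float z = 0) { v[0] = x; v[1] = y; v[2] = z; }

      CVec3f operator+(const CVec3f& o) const { return CVec3f(v[0] + o.v[0], v[1] + o.v[1], v[2] + o.v[2]); }
      CVec3f operator-(const CVec3f& o) const { return CVec3f(v[0] - o.v[0], v[1] - o.v[1], v[2] - o.v[2]); }
      CVec3f operator*(float s)         const { return CVec3f(v[0] * s, v[1] * s, v[2] * s); }
      bool   operator==(const CVec3f& o) const { return v[0] == o.v[0] && v[1] == o.v[1] && v[2] == o.v[2]; }

      float& operator[](int i)       { return v[i]; }    // array-style access to the members
      float  operator[](int i) const { return v[i]; }

      float  Dot(const CVec3f& o) const { return v[0] * o.v[0] + v[1] * o.v[1] + v[2] * o.v[2]; }
      CVec3f Cross(const CVec3f& o) const
      {
          return CVec3f(v[1] * o.v[2] - v[2] * o.v[1],
                        v[2] * o.v[0] - v[0] * o.v[2],
                        v[0] * o.v[1] - v[1] * o.v[0]);
      }
      float Length() const { return std::sqrt(Dot(*this)); }
      void  Normalize() { float l = Length(); if (l > 0) { v[0] /= l; v[1] /= l; v[2] /= l; } }

  private:
      float v[3];
  };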




                                           Figure 9: CVec3f - UML class diagram




The CTriangle class defines a triangle. It has two arrays storing the vertex positions and the indices of each
vertex in the model. It also stores a normal vector, which can be updated when needed by a corresponding
method, a plane parameter and the midpoint of the triangle.

The CModel class is responsible for loading and storing the data of an organ model, represented as a polygonal
surface made of triangles. It has a vector containing the vertex positions, their normals and, implicitly, the
indices, by storing pointers to the triangles; this is modelled by the aggregation on the diagram (Figure 10).




                                    Figure 10: CTriangle and CModel - UML class diagram




                                                           23
Instruments
The instrument's shaft and actuators will be modelled as a mass-spring system. The abstract
CMassSpringSystem class stores the positions of the particles. We assume that, when the system is in a rest
state, the distances between every two neighbouring particles and the radii of the particles are constant. To
model this we use just two floating point numbers, springLength and particleRadius. Every mass-spring
system must have its origin. In the case of the shaft, the origin is absolute and corresponds to the insertion
point. For the actuators, the origin is relative to the end of the shaft's tip. It defines the translation of the
actuator's first particle from the shaft's tip. The abstract Update() method will be invoked by the simulation
loop and must be implemented by the deriving classes.
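
An illustrative sketch of this abstract base class, with member names taken from the description above; the
details are assumptions rather than the project's actual code.

  #include <vector>

  class CMassSpringSystem
  {
  public:
      virtual ~CMassSpringSystem() {}

      // Invoked once per simulation loop iteration; implemented by the shaft and actuator classes.
      virtual void Update() = 0;

  protected:
      std::vector<CVec3f> particles;   // particle positions
      CVec3f origin;                   // absolute (shaft) or relative to the shaft tip (actuator)
      float  springLength;             // constant rest length between neighbouring particles
      float  particleRadius;           // constant radius of every particle
  };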

In the shaft model we distinguish particles belonging to the tip from those belonging to the body. The tip has a
constant number of particles. It can be bent by the user and its physical properties (spring coefficient kS and
bending coefficients kB and kB3D) are different from the properties of the body. The number of particles
building the body depends on the depth of insertion. The Resize() method is responsible for adding and
removing body particles. There are five methods for manipulating the shaft. Three are public and two are
private, as can be seen in Figure 11. The thetaB vector of floats is used for the bending behaviour. It stores the
desired angles between neighbouring particles of the tip. The composition relationship with specified
cardinality ensures that the endoscope can have, at most, three actuators.

The actuator is modelled along the same lines, as a shaft composed only of a tip. It has a constant number of
particles and it bends. Its set of methods is different (Figure 11). The actuator can be bent, protruded from the
tip of the shaft, and rotated around two different axes. Its origin moves and rotates relative to the
endoscope tip. As stated in the functional requirements, the project assumes four different actuator types
(grasping, cutting, poking, and camera). The deriving classes have their own specific sets of fields and methods.
Figure 11 shows the UML diagram.




                                     Figure 11: The instruments – UML class diagram




                                                          24
Collision detection
The collision detection model is responsible for returning the polygons (in our case triangles) of the model
which collide with the endoscope. The collision detection algorithm uses a Bounding Volume Hierarchy (BVH),
more precisely Axis Aligned Bounding Boxes, to encapsulate the organ surfaces. The algorithm is inherited
from (10) and its UML class diagram is presented in Figure 12. Most probably, during the integration stage,
collision detection will be moved to the SOFA side, to make use of recent state-of-the-art solutions.

While the organ model is being loaded, the BVH tree of AABBs is created. The tree consists of nodes which
contain either pointers to triangles (leaf nodes) or pointers to other nodes (their children), but never both. Each
node also contains a pointer to its corresponding bounding box and boolean variables which help to traverse
the tree. The bounding box class stores the coordinates of the bounds of the box and a pointer to the node
it belongs to.




                                 Figure 12: The AABB tree - UML class diagram from (10)




                                                          25
As the shaft and the actuators are modelled as a set of particles, it is easiest to encapsulate each of their
particles inside a bounding sphere. When those spheres intersect with the AABB tree enclosing the organs, a
collision is detected. Consequently, most of the collision checks will be sphere-AABB and sphere-triangle (a
leaf of the AABB tree) tests. There will also be sphere-sphere collisions between actuators, as self-collision
will be taken into account in the project. The CSphere class contains three overloaded methods to
check whether a collision has occurred (Figure 13).
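
For illustration, the simplest of these checks, sphere against sphere, could look as follows (a sketch only; the
project's CSphere methods may differ). The sphere-AABB case can reuse the clamped squared-distance test
sketched in Chapter 3, and the sphere-triangle case reduces to finding the closest point on the triangle to the
sphere centre.

  struct SphereSketch
  {
      CVec3f centre;
      float  radius;

      bool Collides(const SphereSketch& other) const
      {
          CVec3f d = other.centre - centre;
          float sumR = radius + other.radius;
          return d.Dot(d) <= sumR * sumR;   // compare squared distances, no square root needed
      }
  };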




                                       Figure 13: CSphere – UML class diagram




System
The CSystem class (Figure 14) stores the pointers to the main subsystems of the application, such as the GUI,
graphics, organ model, shaft and actuators and, in the future, SOFA. A pointer to the system is stored in every
subsystem class to enable two-way communication between the subsystems.




                                       Figure 14: CSystem – UML class diagram




                                                        26
GUI and VTK
The simulator uses FLTK for the GUI and the Visualization Toolkit (VTK) for the 3D graphics. Both libraries are
briefly described in the Tools section at the beginning of the current chapter. The CVTK class provides two
methods responsible for initializing the whole pipeline and rendering a single frame. It contains three
renderers. The first renderer draws the scene viewed from the endoscope's camera, just as a surgeon sees it
during real surgery. The second and third renderers define scenes viewed from freely set helper cameras,
usually from the side and from the top of the scene. Those renderers can be switched off. CVTK also stores a
number of specific VTK algorithms and data readers. The final output of those algorithms and readers are
meshes stored in vtkActor objects.

The CFLTK class, apart from the functionality responsible for initializing, resizing, minimizing and closing a
window, stores a number of flags which affect the VTK display. It is worth mentioning that these two
classes are in an aggregation relationship. This allows simple separation of the 3D graphics engine from the GUI
if needed later on.




                                     Figure 15: CFLTK and CVTK – UML class diagram


User input/output




                                  Figure 16: IODevice and CJoystick – UML class diagram




                                                          27
Simulation loop
The input/output subsystem makes use of the robust Command/Observer design pattern. The VTK
architecture implies inheritance from the vtkCommand class for generic function callbacks. Specific
functionalities are implemented in Execute() methods. The deriving classes, CVtkSystemLoppCallback and
CVtkKeyboardCallback, also store a pointer to the system object, therefore enabling access from a callback
function to every other subsystem. The keyboard callback is invoked when a key is pressed. The system
callback handles user input/output, simulation and rendering. It is invoked by VTK, by default, as fast as
possible. This behaviour can be changed if the performance of the system is not satisfactory, for example
to render at a constant frame rate by skipping some simulation steps or by parallelizing physics and graphics
calculations on multi-core processors. Figure 17 presents the UML diagram of the subsystem.
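
A minimal sketch of such a callback class is given below; the registration via a repeating timer event is an
assumption, as the project may hook a different VTK event to drive the loop.

  #include <vtkCommand.h>
  #include <vtkObject.h>

  class CSystem;   // the main system class described earlier

  // Illustrative system-loop callback; the project's actual class may differ in detail.
  class CSystemLoopCallbackSketch : public vtkCommand
  {
  public:
      static CSystemLoopCallbackSketch* New() { return new CSystemLoopCallbackSketch; }

      virtual void Execute(vtkObject* caller, unsigned long eventId, void* callData)
      {
          // One iteration of the simulation loop:
          // 1. read user input, 2. advance the instrument physics, 3. redraw the scene.
          // system->HandleInput();
          // system->Update();
          // system->Render();
      }

      CSystem* system;   // back-pointer to the system object
  };

  // Typical registration (assumed):
  //   callback->system = theSystem;
  //   interactor->AddObserver(vtkCommand::TimerEvent, callback);
  //   interactor->CreateRepeatingTimer(10);   // period in milliseconds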




                                  Figure 17: Callback mechanism – UML class diagram




                                                        28
Project timetable
The project is estimated to take six months. The first and last two weeks are allocated for background
reading and the write-up of the final report. One month is allocated for research into technologies and tools and
the selection of the most appropriate ones. The particular steps of the implementation process are estimated to
take roughly two to three weeks each, depending on the predicted complexity. Figure 18 depicts the
estimated timetable.




                                        Figure 18: Timetable for the project




                                                        29
5. IMPLEMENTATION

This chapter describes the implementation details of the simulator. It starts with an analysis of how the shaft
and the actuators are modelled and how they interact with the organ walls.

5.1 Instruments model

The shaft and the actuators are modelled using a mass-spring model. The model is built from a set of
particles (X0..Xn) connected by springs of equal length λ. The mass is equally distributed along the
instruments, i.e. all particles have the same mass. The springs are non-bendable and all have equal properties
(length, spring factor, damping factor). Each particle forms an angle Θ with the previous particle.




                                       Figure 19: Mass-spring model of the shaft

In real flexible endoscopes the tip is rigid and bendable. The angle of its bend is controlled by the surgeon.
Thus, in the shaft model we distinguish between particles forming the tip and particles forming the body of
the shaft. This separation allows us to apply different algorithms and properties to each part of the shaft. In
the actuator model such a distinction is unnecessary and we can think of the actuator simply as a tip with
greater manoeuvrability.




                                                         30
5.2 Instruments manipulation

The precise manipulation of the endoscope, its shaft, camera and two actuators is key in every medical
simulator. In this project we control 17 degrees of freedom in total. Due to the lack of access to a
specialized device which could handle such complex control, a standard gaming pad (“joypad”, Figure 20)
was used to provide the interface with the user and the input data. The joypad is equipped with two sticks,
which offer precise, analog control in two degrees of freedom each, a directional pad and twelve buttons.

To read the data from the device, the Microsoft DirectX library was used. The library is briefly described in the
Tools section (4.1). DirectX offers plug-and-play support for almost every gaming device available on the
market, thus any device which has a similar set of sticks and buttons can be used to control the endoscope.
Unfortunately, DirectX is supported only on Microsoft systems, which causes the loss of cross-platform
support. On the other hand, the simulator framework was developed in a way which allows quick and easy
extension to support other controllers, including haptic devices.




  Figure 20: Logitech Dual Action pad (left) used during the project. Due to use of DirectX a wide range of controllers is supported, for example
                                     Microsoft X-Box 360 controller (middle) or traditional joysticks (right).


5.3 Shaft translation

The translation is controlled by moving the left analog stick forward for pushing or backward for pulling the
shaft. The larger the tilt of the stick, the larger the translation t applied. The crucial issue is how the
translation is propagated along the shaft. Intuitively, for pushing, the forces should be propagated from the
insertion point (last particle of the body, bottom of the shaft) to the end of the tip (first tip particle, top of
the shaft). This approach, as stated in (10), may result in unpredictable behaviour in some peripheral cases
due to mass-spring instabilities. The bottom-to-top approach can be improved by simultaneously applying a
top-to-bottom propagation pass. Unfortunately, this solution is computationally intensive and prevents
real-time simulation. Taking into consideration both realism and efficiency, the top-to-bottom approach was
used for pushing. There is no instability problem when pulling the instrument and therefore a bottom-to-top
approach was chosen for pulling.




                                              Figure 21: Shaft translation

In order to obtain the new position of a particle in a subsequent simulation step while pushing the shaft, we
need to find the direction in which the shaft is being translated. The direction vector is computed by
subtracting the position of the first body particle from the position of the last tip particle. After
normalization it is multiplied by the translation value t to obtain the shift added to each particle's
coordinates. We repeat this process for every particle, starting from the first particle of the tip and
finishing at the last particle of the body.

  if(t > 0) //PUSHING
  {
        DirectionVector = LastTipParticle - FirstBodyParticle
        Normalize(DirectionVector)
        Shift = DirectionVector * t

            //FROM THE FIRST TIP PARTICLE (INDEX 0) TO THE LAST BODY PARTICLE
            for(int i = 0; i < NumberOfShaftParticles; i++)
            {
                  NewParticles[i] = CurrentParticles[i] + Shift
            }
  }


Code snippet 1: Shaft translation - pushing

In the case of pulling, we use different direction vectors for the body and for the tip. To translate the body
particles we use the vector calculated by subtracting the last particle of the instrument from the second-last
one (the two particles closest to the insertion point). The translation is then applied starting from the last
particle of the body. When the tip is reached, the second direction vector is calculated in the same way as for
pushing. As t is negative, the multiplication results in a backward shift of the tip.




  if(t < 0) //PULLING
  {
        //BODY
        DirectionVector = SecondLastShaftParticle - LastShaftParticle
        Normalize(DirectionVector)
        BodyShift = DirectionVector * t

            //FROM THE LAST BODY PARTICLE DOWN TO THE FIRST BODY PARTICLE
            for(int i = NumberOfShaftParticles - 1; i >= NumberOfTipParticles; i--)
            {
                  NewParticles[i] = CurrentParticles[i] + BodyShift
            }

            //TIP
            DirectionVector = LastTipParticle - FirstBodyParticle
            Normalize(DirectionVector)
            TipShift = DirectionVector * t

            //FROM THE LAST TIP PARTICLE DOWN TO THE FIRST TIP PARTICLE (INDEX 0)
            for(int i = NumberOfTipParticles - 1; i >= 0; i--)
            {
                  NewParticles[i] = CurrentParticles[i] + TipShift
            }
  }


Code snippet 2: Shaft translation - pulling





5.4 Shaft rotation

When rotating the shaft we can use one of two approaches. The first, simpler one assumes ideal torsion control
over the body of the shaft. In other words, we do not apply rotation to the body at all; we only rotate the tip.
The rotation axis is obtained, again, by subtracting the position of the first body particle from the position
of the last tip particle. Each tip particle is then rotated by calculating the vector from the first body
particle to the particle we want to rotate and rotating that vector around the axis. The angle of the rotation
depends on the left/right tilt of the left analog stick. Because the tip is rigid, the rotation is blocked when
a collision with the organ's walls occurs.




                                                  Figure 22: Shaft rotation – ideal torsion control




  OldParticles = CurrentParticles //SAVE THE PARTICLE POSITIONS
  AxisVector = LastTipParticle - FirstBodyParticle //CALCULATE THE ROTATION AXIS
  CollisionOccurred = false

  //FROM THE LAST TIP PARTICLE DOWN TO THE FIRST TIP PARTICLE (INDEX 0)
  for(int i = NumberOfTipParticles - 1; i >= 0; i--)
  {
        TempVector = CurrentParticles[i] - FirstBodyParticle
        RotationShift = rotate TempVector around AxisVector by Angle
        NewParticles[i] = FirstBodyParticle + RotationShift

            if(NewParticles[i] collides with the organs)
            {
                  CollisionOccurred = true
                  break //INTERRUPT THE LOOP
            }
  }

  if(CollisionOccurred)
        NewParticles = OldParticles //RESTORE THE OLD POSITIONS

Code snippet 3: Shaft rotation – ideal torsion control
Code snippet 3: Shaft rotation – ideal torsion control




Unfortunately, the above solution is not realistic enough. The alternative algorithm does not assume ideal
torsion control over the body. In principle, it works the same way as the previous one, except that the
rotation starts at the bottom of the body. The main difference lies in what happens when a collision occurs.
The rotation is no longer blocked; instead, new axes are calculated around which we rotate, and the
rotations are combined. This is best demonstrated by the following example from (10):

     “An example of this second approach to rotation is given in Figure 23. First, the Xd particle, which
    does not collide with the vasculature, is rotated along the axis, of a φ angle and arrives in X’d. The
    next rotated point X’c collides with the vasculature during the rotation. According to external forces
    described in the section 5.4, the particle is shifted back inside the vasculature to cancel the collision.
    We store the new position (X’c) of the rotated particle after the collision response as the second
    reference point for the rotation of the next particles of the instrument. At that stage, we compute
    the second axis vector by subtracting the current point from the one before (a2 vector in Figure 23).
    For the next point Xb, we first perform the rotation along the a1 axis and according to the first
    reference point and then we perform the rotation along the a2 axis and according to second
    reference point (X’c). We repeat these steps for the next particles. At every stage of rotation, we
    rotate by the same φ angle. In the case of the instrument tip, we may assume that it is either so stiff
    that we do not rotate it if any of its particles collides with the vasculature or we rotate it as any
    other particle of the body.”




                                    Figure 23: Shaft rotation – non-ideal torsion control




//SAVE THE PARTICLE POSITIONS AND INITIALISE THE FIRST ROTATION REFERENCE
OldParticles = CurrentParticles
RotationPoints.Add( LastBodyParticle )
RotationAxes.Add( LastTipParticle - FirstBodyParticle )

for(int i = LastBodyParticleIndex; i >= FirstBodyParticleIndex; i--)
{
      NewParticles[i] = OldParticles[i]

      //APPLY ALL ACCUMULATED ROTATIONS, ONE AFTER ANOTHER
      for(int j = 0; j < CurrentNumberOfRotationPoints; j++)
      {
            TempVector = NewParticles[i] - RotationPoints[j]
            RotationShift = rotate TempVector around RotationAxes[j] by Angle
            NewParticles[i] = RotationPoints[j] + RotationShift
      }

      if(NewParticles[i] collides with the organs)
      {
            NewParticles[i] += ExternalShift //COLLISION RESPONSE (SECTION 5.7)
            RotationPoints.Add(NewParticles[i])
            RotationAxes.Add(NewParticles[i] - NewParticles[i+1])
      }
}


                       Code snippet 4: Shaft rotation – non-ideal torsion control




5.5 Collision detection

The algorithms used for collision detection were transferred from (10), which is why they are described only
briefly here. In fact, they will be replaced during integration by the state-of-the-art algorithms from the
SOFA framework, which are presented in detail in (31).

The axis-aligned bounding box (AABB) tree is one of the most robust bounding volume hierarchies. Thanks to
its simplicity, it is efficient to build, traverse and update. In this case a top-down approach was used: we
start with the bounding box which bounds all of the object's triangles (the root node) and then add child
nodes to it by dividing the box into smaller ones. The process is then repeated recursively for every new node
until the termination conditions are met. Figure 24 shows a cross section through the organ walls (red lines)
with the bounding boxes and the colliding sphere of an instrument particle marked.

Figure 24: The cross section through the organ with bounding boxes

The division of a box, i.e. the creation of a lower level of the tree, may be based on space or on volume. In
the first case we divide the larger box (parent node) into equally sized smaller boxes (child nodes). The
number of boxes created affects the structure of the resulting tree, i.e. its performance in terms of building,
traversing and updating: the broader the tree, the fewer steps we need to traverse it. The most common
solutions are to divide the box into two (binary trees, BSP), four (quadtrees) or eight (octrees) parts. The
disadvantage of this solution is that we can end up with boxes containing just a few or even no triangles,
while others bound an excessive number. This can result in a sudden performance drop when checking for
collisions in regions where the triangles are densely distributed.

To tackle this problem and obtain a tree with triangles evenly distributed, volume partitioning was used
instead of space partitioning. In the algorithm proposed in (10), the triangles are approximated by their
midpoints, which resolves the ambiguity when a triangle spans more than one box. The division of a box is
based on a plane placed at the average value of all midpoints along a selected direction (x, y or z). Compared
with the previous approach, the building stage takes considerably more time, but it results in a well-balanced
tree which is fast to traverse and is therefore more stable in performance.
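
A rough C++ sketch of such a midpoint-based, top-down construction is given below. It is only an illustration of the idea under assumed types (Vec3, Triangle, AABBNode) and is not the code transferred from (10).

  #include <algorithm>
  #include <cstddef>
  #include <memory>
  #include <vector>

  struct Vec3     { float x, y, z; };
  struct Triangle { Vec3 a, b, c;
                    Vec3 Midpoint() const { return { (a.x + b.x + c.x) / 3.0f,
                                                     (a.y + b.y + c.y) / 3.0f,
                                                     (a.z + b.z + c.z) / 3.0f }; } };

  struct AABBNode
  {
      Vec3 boxMin{}, boxMax{};               // bounds of all triangles below this node
      std::vector<Triangle> triangles;       // filled only in the leaves
      std::unique_ptr<AABBNode> left, right;
  };

  static float AxisValue(const Vec3& v, int axis) { return axis == 0 ? v.x : (axis == 1 ? v.y : v.z); }

  // Top-down build: bound the triangles, then split them by comparing each triangle
  // midpoint with the mean midpoint along the chosen axis (volume partitioning).
  std::unique_ptr<AABBNode> Build(std::vector<Triangle> tris, int axis = 0, std::size_t leafSize = 8)
  {
      auto node = std::make_unique<AABBNode>();
      if (tris.empty()) return node;

      node->boxMin = node->boxMax = tris.front().a;
      for (const Triangle& t : tris)
          for (const Vec3& v : { t.a, t.b, t.c })
          {
              node->boxMin.x = std::min(node->boxMin.x, v.x);  node->boxMax.x = std::max(node->boxMax.x, v.x);
              node->boxMin.y = std::min(node->boxMin.y, v.y);  node->boxMax.y = std::max(node->boxMax.y, v.y);
              node->boxMin.z = std::min(node->boxMin.z, v.z);  node->boxMax.z = std::max(node->boxMax.z, v.z);
          }

      if (tris.size() <= leafSize) { node->triangles = std::move(tris); return node; }

      float mean = 0.0f;
      for (const Triangle& t : tris) mean += AxisValue(t.Midpoint(), axis);
      mean /= static_cast<float>(tris.size());

      std::vector<Triangle> lower, upper;
      for (const Triangle& t : tris)
          (AxisValue(t.Midpoint(), axis) < mean ? lower : upper).push_back(t);

      if (lower.empty() || upper.empty()) { node->triangles = std::move(tris); return node; }  // degenerate split

      node->left  = Build(std::move(lower), (axis + 1) % 3, leafSize);
      node->right = Build(std::move(upper), (axis + 1) % 3, leafSize);
      return node;
  }

Splitting at the mean midpoint rather than at the geometric centre of the box keeps the triangle counts of the two children roughly balanced, which is what makes the traversal time stable.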

As the endoscope model is built from particles which may be represented as spheres, and the organ is
represented as a polygonal surface (a set of triangles) encapsulated in an AABB tree, we need methods to
check for intersections between those primitives. The details of the sphere-box and sphere-triangle collision
tests are discussed in (10).
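
For completeness, a commonly used sphere-box overlap test (a variant of Arvo's method) is sketched below; it only illustrates the kind of primitive test required and is not necessarily the exact formulation used in (10).

  #include <algorithm>

  struct Vec3 { float x, y, z; };

  // A sphere (centre c, radius r) intersects the box [boxMin, boxMax] if the point
  // of the box closest to the centre lies within one radius of it.
  bool SphereIntersectsAABB(const Vec3& c, float r, const Vec3& boxMin, const Vec3& boxMax)
  {
      const float px = std::clamp(c.x, boxMin.x, boxMax.x);
      const float py = std::clamp(c.y, boxMin.y, boxMax.y);
      const float pz = std::clamp(c.z, boxMin.z, boxMax.z);

      const float dx = c.x - px, dy = c.y - py, dz = c.z - pz;
      return dx * dx + dy * dy + dz * dz <= r * r;
  }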

5.6 Collision response

The forces which influence the instrument particles are a crucial part of every mass-spring model. In order to
obtain a realistic model, four different types of force are applied to every particle at each simulation step.
The external force keeps the particles inside the organ model. The spring force governs the instrument's
compressibility, i.e. it keeps the particles at a given distance from each other. The 2D and 3D bending forces
are responsible for straightening the instrument and bending its tip. The final force applied to a particle is
the sum of the above forces. Due to the linear nature of the instrument model, the propagation direction and
the coefficients of the forces play an important role in the model. The forces and how they are applied are
described in more detail in the following sections.
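
Conceptually, the resulting per-particle update can be thought of as accumulating these four contributions, as in the purely illustrative C++ sketch below (the names are hypothetical; the actual order and coefficients used are discussed in section 5.11).

  struct Vec3 { float x, y, z; };

  // Illustrative only: one update of a single particle position, combining the four
  // displacement contributions described in sections 5.7 - 5.10.
  Vec3 UpdateParticle(const Vec3& position,
                      const Vec3& externalShift,   // keeps the particle inside the organ (5.7)
                      const Vec3& springShift,     // keeps the distance to the neighbours (5.8)
                      const Vec3& bending2DShift,  // straightens the shaft / bends the tip (5.9)
                      const Vec3& bending3DShift)  // keeps the tip particles in one plane (5.10)
  {
      return { position.x + externalShift.x + springShift.x + bending2DShift.x + bending3DShift.x,
               position.y + externalShift.y + springShift.y + bending2DShift.y + bending3DShift.y,
               position.z + externalShift.z + springShift.z + bending2DShift.z + bending3DShift.z };
  }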

5.7 External force

The external force is responsible for keeping the endoscope's particles inside the organ model. It is applied
when a particle collides with the walls of an organ. During a simulation step, a particle's encapsulating
sphere often collides with more than one triangle, so all normal vectors of the colliding triangles must be
included in the external force response. These vectors are averaged, weighting each one by how deeply the
corresponding triangle penetrates the sphere. The resulting direction vector, after normalization, is
multiplied by the penetration depth at the most colliding triangle, which is the closest one.




                                                    Equation 1

    \hat{n} = \frac{\sum_{i=1}^{c} \hat{d}_i \, \vec{n}_i}{\left\lVert \sum_{i=1}^{c} \hat{d}_i \, \vec{n}_i \right\rVert}, \qquad \hat{d}_i = \frac{d_i}{\sum_{j=1}^{c} d_j}

                                                    Equation 2

    \vec{s}_{ext} = k_E \, (r - d_c + \varepsilon) \, \hat{n}

\hat{n} - normalized average normal vector of the colliding triangles
\hat{d}_i – normalized distance to the i-th colliding triangle (its weight in the average)
\vec{n}_i – normal vector of the i-th colliding triangle
d_c - distance from the closest (most deeply colliding) triangle
\varepsilon - minimal value by which the node needs to be moved outside the collision
c – number of colliding triangles
k_E – external force coefficient
r – radius of the particle's sphere

Figure 25 presents a sphere, with its centre at point S, colliding with three yellow triangles whose normal
vectors are shown as black arrows. The vector n represents the direction of the combined external force
and d2 represents its magnitude.




                                             Figure 25: External forces

The external force alone may sometimes not be enough to push the sphere out of collision, or it might push
the particle and its sphere into a new collision, for example in narrow places with triangles on the opposite
side of the organ. To avoid leaving a particle in a colliding state, the external force is applied repeatedly as
long as a collision is detected, or until a limit on the number of iterations is reached. Code snippet 5
presents the pseudo-code of the algorithm.

 iterationCounter = 0
 while(Sphere is colliding)
 {
       SumOfDistances = 0
       MinimumDistance = INFINITY
       ExternalShift = (0, 0, 0)

       //SUM ALL DISTANCES AND FIND THE SMALLEST ONE
       for(int i = 0; i < NumberOfCollidingTriangles; i++)
       {
             SumOfDistances += DistancesToCollidingTriangles[i]
             if(DistancesToCollidingTriangles[i] < MinimumDistance)
                   MinimumDistance = DistancesToCollidingTriangles[i]
       }

       //FIND THE DIRECTION OF THE FORCE
       for(int i = 0; i < NumberOfCollidingTriangles; i++)
       {
             ExternalShift += CollidingTrianglesNormals[i] *
                              DistancesToCollidingTriangles[i] / SumOfDistances
       }

       //PUSH THE SPHERE OUT OF COLLISION
       Normalize(ExternalShift)
       ExternalShift *= -kE * (SphereRadius - MinimumDistance + 0.2)
       NewSpherePosition = CurrentSpherePosition + ExternalShift

       iterationCounter++
       if(iterationCounter == MAX_ITERATIONS)
             break
 }


                                          Code snippet 5: External forces
5.8 Spring force

The spring force is responsible for keeping the instrument particles at the same distance from each other
and is defined by the following equation:




                                                    Equation 3

    \vec{F}_{s} = k_S \, (l - \lambda) \, \hat{v}

k_S - spring coefficient
l - current distance between the particle and its neighbour
\vec{v} - vector between the particle and its neighbour (\hat{v} = \vec{v} / l)
\lambda - desired distance between particles

Code snippet 6 presents the pseudo-code of the algorithm in two versions. The first one propagates the
force from the end of the tip to the start of the body (top to bottom); the second one works the opposite way
(bottom to top).

 //TOP TO BOTTOM PROPAGATION
 for(int i = 0; i < NumberOfParticles - 1; i++)
 {
       //VECTOR FROM THE PARTICLE BEING CORRECTED (i+1) TOWARDS ITS NEIGHBOUR (i)
       DirectionToNeighbour = Particles[i] - Particles[i+1]
       Distance = Length(DirectionToNeighbour)
       Normalize(DirectionToNeighbour)

       Particles[i+1] += kS * (Distance - Lambda) * DirectionToNeighbour
 }

 //BOTTOM TO TOP PROPAGATION
 for(int i = NumberOfParticles - 1; i > 0; i--)
 {
       DirectionToNeighbour = Particles[i] - Particles[i-1]
       Distance = Length(DirectionToNeighbour)
       Normalize(DirectionToNeighbour)

       Particles[i-1] += kS * (Distance - Lambda) * DirectionToNeighbour
 }


                                          Code snippet 6: Spring forces




5.9 2D Bending force

The 2D bending force is responsible for straightening the body of the shaft and keeping the tip bent at a
given angle. In order to achieve this while maintaining the distance between neighbouring particles set by
the spring forces, the following approach is used:




                                                         Equation 4

    \vec{F}_{b} = k_{B} \, (\Theta_i - \bar{\Theta}_i) \, \hat{w}

k_{B} - 2D bending force coefficient
\Theta_i - current angle between the particle i and its two neighbours
\bar{\Theta}_i - bias angle, i.e. the desired angle at this particle
\hat{w} - unit vector perpendicular to the v1 vector, lying in the same plane as the vector v2




                                                Figure 26: 2D Bending force

The shift caused by the \vec{F}_{b} vector is the consequence of a rotation by an angle equal to
k_{B}(\Theta_i - \bar{\Theta}_i) around the axis passing through particle Xi and perpendicular to the v1 and
v2 vectors. The bias angle of the body particles is equal to zero, as they only need to be straightened. For
the tip, the bias angles are controlled by the user, forcing the tip of the shaft to bend in the direction the
user wants to look. Code snippet 7 shows the pseudo-code implementation of the above algorithm.




  //WE SKIP THE FIRST AND THE LAST PARTICLE, AS EACH HAS ONLY ONE NEIGHBOUR
  for(int i = 1; i < NumberOfParticles - 1; i++)
  {
        Vector1 = Particles[i] - Particles[i-1]
        Vector2 = Particles[i+1] - Particles[i]

            ThetaAngle = BendingAngle( Vector1, Vector2 )

            RotationAxis = Vector1 CrossProduct Vector2
            Normalize(RotationAxis)

            RotationAngle = kB * (ThetaAngle - BiasThetaAngles[i])
            Shift = rotate Vector2 around RotationAxis by RotationAngle

            Particles[i+1] = Particles[i] + Shift
  }




Code snippet 7: 2D Bending force




5.10 3D Bending force

The spring and 2D bending forces guarantee that the particles are positioned at the proper distance and angle
from each other. They do not, however, guarantee that the particles lie in the same plane. This is not an
issue for the body, but it is crucial for the tip since, in real life, the endoscope tip is fairly rigid
and tends to stay in the shape set by the clinician. In order to keep the tip particles in the same plane we
introduce a fourth force, defined by the following equation:




                                                 Equation 5: 3D Bending

    \vec{F}_{b3D} = k_{B3D} \, \alpha \, \hat{w}

k_{B3D} – 3D bending force coefficient
\alpha - angle between the base plane and the current plane
\hat{w} - unit vector whose direction is the same as the plane \pi_2 normal, but attached to particle i-1

The 3D bending force is explained in more detail in (10) as follows:

    “Figure 27 presents the 3D bending force and the computation of its direction. We obtain the plane
    normals by applying the cross product to vectors u1 and u2 for the tip plane, π1, and to v1 and v2 for
    the current plane, π2. The tip plane is effectively determined by the first two particles of the tip
    (Xtlength and Xtlength-1) and the first body particle (Xtlength+1). To obtain the α angle, we find the signed
    angle between the plane normals n1 and n2. Because the spring forces are restraining the length of
    the u vector and 2D bending forces are restraining the angle between v1 and u, the w force will rotate
    the i-1 particle in 3D according to Xi and Xi+1. The α angle is a signed angle, thus we may effectively
    denote w as w+ and w- and α as α+ and α- depending on the sign of the angle. To correctly
    approximate the 3D bending force, it is enough to rotate the u vector by the α angle according to the
    axis defined by the vector.” (10)




                                               Figure 27: 3D Bending force



The Code snippet 8 shows the pseudo code of the algorithm:

 //DETERMINING THE NORMAL VECTOR OF THE BASE (TIP) PLANE
 //THE PLANE IS DEFINED BY THE FIRST BODY PARTICLE AND THE TWO TIP PARTICLES NEXT TO IT
 VectorAB = Particles[NumberOfTipParticles - 1] - Particles[NumberOfTipParticles]
 VectorAC = Particles[NumberOfTipParticles - 2] - Particles[NumberOfTipParticles]
 BaseNormal = VectorAB CrossProduct VectorAC
 Normalize( BaseNormal )

 //ITERATE OVER THE TIP, FROM THE BODY TOWARDS THE FREE END OF THE TIP
 for(int i = NumberOfTipParticles - 1; i > 0; i--)
 {
       //DETERMINING THE CURRENT PLANE NORMAL
       VectorAB = Particles[i] - Particles[i+1]
       VectorAC = Particles[i-1] - Particles[i+1]
       CurrentNormal = VectorAB CrossProduct VectorAC
       Normalize( CurrentNormal )

       //ANGLE IN DEGREES BETWEEN THE TWO NORMALIZED PLANE NORMALS
       Theta = ArcCos( DotProduct( CurrentNormal, BaseNormal ) )

       for(int j = i; j > 0; j--)
       {
             RotationAxis = Particles[j] - Particles[j+1]
             Normalize( RotationAxis )

             VectorU = Particles[j-1] - Particles[j+1]

             //DETERMINING THE SIGN OF THE ROTATION
             SignTheta = ArcCos( DotProduct( Normalize(VectorU), BaseNormal ) )
             if(SignTheta > 90) sign = -1 else sign = 1

             RotationAngle = kB3D * Theta * sign
             Shift = rotate VectorU around RotationAxis by RotationAngle
             Particles[j-1] = Particles[j+1] + Shift
       }
 }


                                       Code snippet 8: 3D Bending force




5.11 Force propagation

The force coefficients and the order and direction in which the forces are propagated determine how well the
model reproduces the real instrument. The following coefficients were obtained experimentally and need to be
re-adjusted if the basic parameters of the endoscope model change, such as its width, the number of tip
particles or the distance between them.

The larger the k-coefficients, the faster the particles are positioned at their desired locations, which leads
to a more rigid instrument. However, making it too rigid may result in unstable behaviour, which shows up as
oscillations or jerky movement. Applying the same forces more than once with smaller k-coefficients, usually
in a different propagation direction, can minimise those drawbacks. Obviously, this comes at the cost of
computational complexity, and doing it for every particle significantly decreases the performance of the
whole simulation. Fortunately, it is often enough to do this just for the tip, as it usually collides the most
and dictates the movement of the whole instrument.

Applying the forces in two iterations in different directions additionally solves one more relevant issue.
Namely, when applying the forces only from top to bottom, the tip behaves as if it were pulled by its end.
This not only looks unrealistic, but may also affect the simulation. The defect disappears after applying the
second iteration to the tip. Code snippet 9 details the way the algorithm works for the tip translation:

  //TOP TO BOTTOM
  for(int i=0; i < NumberOfTipParticles; i++)
  {
        //THE PARAMETERS ARE THE FORCE COEFFICIENTS i.e. kS, kB,kE…
        ApplySpringForce(1.0)
        Apply2DBendingForce(1.2)
        TranslateParticle()
        ApplyExternalForce(1.0)
  }

  //BOTTOM TO TOP
  for(int i = NumberOfParticles - 1; i > 0 ; i--)
  {
        ApplySpringForce(0.8)
        Apply2DBendingForce(0.7)
        Apply3DBendingForce(1.0)
  }

  TranslateBody()

Code snippet 9: Force propagation during tip translation

After translating the tip, we translate the body, using opposite propagation directions depending on the sign
of the translation. That is to say, when the instrument is pulled the propagation is done from bottom to top,
and when it is pushed it is propagated from top to bottom. Note the absence of the 3D bending force.


  if(dt > 0)
  {
        //PUSHING – TOP TO BOTTOM
        for(int i = NumberOfTipParticles; i < NumberOfParticles; i++)
        {
              //THE PARAMETERS ARE THE FORCE COEFFICIENTS i.e. kS, kB,kE…
              TranslateParticle()
              ApplySpringForce(1.0)
              Apply2DBendingForce(0.4)
              ApplyExternalForce(1.0)
        }
  }
  if(dt < 0)
  {
        //PULLING – BOTTOM TO TOP
        for(int i = NumberOfParticles - 1; i >= NumberOfTipParticles ; i--)
        {
              TranslateParticle()
              ApplySpringForce(1.0)
              ApplyExternalForce(1.0)
        }
  }

Code snippet 10: Force propagation during body translation

During rotation, the forces are propagated in a bottom-to-top direction, as explained in section 5.4. After
rotating each particle, we check whether it collides; if it does, we block the rotation. Otherwise, we apply
the forces as shown in Code snippet 11. This is done in the “regular” manner, i.e. first we apply the spring
force, then 2D bending and, lastly, 3D bending.

  //BOTTOM TO TOP
  RotateBody()//REMOVING THIS WILL RESULT IN PERFECT TORSION CONTROL
  for(int i = NumberOfTipParticles - 1; i >= 0 ; i--)
  {
        RotateParticle()
        if(collision occurs)
              Restore previous position and return
        ApplySpringForce(0.9)
        Apply2DBendingForce(0.9)
        Apply3DBendingForce(1.0)
  }

Code snippet 11: Force propagation during tip rotation

If we use the realistic approach, which does not assume perfect torsion control, we rotate the body before
rotating the tip, as shown in the first line of Code snippet 11. The body rotation is done in the same fashion
as the tip rotation, following a bottom-to-top approach. Code snippet 12 presents how the forces are applied
during the propagation of the body rotation.




  //BOTTOM TO TOP
  for(int i = NumberOfParticles - 1; i >= NumberOfTipParticles ; i--)
  {
        RotateParticle()
        ApplySpringForce(1.0)
        Apply2DBendingForce(0.6)
        ApplyExternalForce(1.0)
  }

Code snippet 12: Force propagation during body rotation

It is worth mentioning that, due to the nature of analog control, we very rarely only translate or only rotate
the instrument. Usually both happen in parallel, i.e. when we translate there is usually a small amount of
rotation and vice versa. As a result, most of the time all of the above code is executed.

5.12 Force feedback

Due to the lack of access to a specialised haptic device, the following mechanisms are described here for
future reference. We can distinguish two kinds of force feedback: translational and rotational. The total
translational friction is the sum of the forces described by the following equations:




                                                           Equation 6: Damping force

    F_{d} = k_{d} \, v_{t}

k_{d} - damping coefficient
v_{t} - speed of the translation




                                                           Equation 7: Friction force

    F_{f} = k_{f} \, n_{c} + k_{r} \, n_{p}

k_{f} - friction coefficient
n_{c} - number of colliding particles
k_{r} - resistance factor
n_{p} - number of particles of the instrument




The rotational friction depends on the difference \delta between the desired rotation \beta and the actual
rotation \gamma, as shown in Figure 28.

                                             Equation 8: Rotational force

    F_{rot} = k_{rot} \, \delta, \qquad \delta = \beta - \gamma

k_{rot} - force coefficient
\delta - angle from the collision point




                                           Figure 28: Rotational friction




The joypad used for this project does not provide any force feedback mechanism. However, many gaming
devices can generate simple vibrations, which can give an approximate idea of how hard the instrument is
colliding with the organs. For this reason, the amount of vibration depends only on the friction force.
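
As a purely illustrative sketch, the translational feedback and its mapping to a normalised vibration level could be computed as below; the function and parameter names are assumptions, not the simulator's code.

  #include <algorithm>
  #include <cstddef>

  struct FeedbackCoefficients { float kDamping, kFriction, kResistance; };

  // Sum of the translational feedback terms (damping + friction + resistance).
  float TranslationalFeedback(const FeedbackCoefficients& k, float translationSpeed,
                              std::size_t collidingParticles, std::size_t totalParticles)
  {
      return k.kDamping    * translationSpeed
           + k.kFriction   * static_cast<float>(collidingParticles)
           + k.kResistance * static_cast<float>(totalParticles);
  }

  // Only the friction term drives the joypad vibration, clamped to [0, 1].
  float VibrationLevel(const FeedbackCoefficients& k, std::size_t collidingParticles,
                       float maxExpectedFriction)
  {
      const float friction = k.kFriction * static_cast<float>(collidingParticles);
      return std::clamp(friction / maxExpectedFriction, 0.0f, 1.0f);
  }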




5.13 Actuators




                   Figure 29: Actuators. In this example, the endoscope has a camera and two actuators: a grasper and a cutter.




Each actuator has its origin positioned at the end of the tip (the first tip particle, i.e. X0). The z-axis of
this origin depends on the base plane of the tip, while the y-axis points forwards (a). An actuator stores a
translation vector, which allows it to be placed at the desired location on the tip's cross-section.

The actuator can protrude from the shaft (b) by translating the actuator origin along its y-axis. The actuator
can also be rotated around the y (c) and z (e) axes of the origin. It can also be bent (d) at selected point(s)
to allow the manipulation of tissues or objects.

Figure 30 shows the three actuators available in the simulator. They can be used for different tasks:
grasping, cutting and poking. Figure 31 displays an example of how the grasper and the cutter open and
close to interact with an object. Figure 32 shows…




Figure 30: Three different actuators can be set at the end of the endoscope: a grasper (left), a cutter (middle) and a poker (right). All are displayed
                                                         as textured meshes with wireframe



     Figure 31: grasper (far) and the cutter (close), closed and opened.




Figure 32: The triangulated actuators (left) and folded for movement (right)




5.14 Realistic per-pixel lighting

Per-pixel lighting is a technique where all the lighting calculations are done at the pixel level. At first, a
standard per-vertex OpenGL graphics pipeline was implemented in GLSL (Gouraud shading). Later, the
pipeline was modified to work at the per-pixel level (Phong shading) and finally extended to support normal
and specular mapping.

During lighting, the normal vectors are read from the mesh model for every vertex. With per-vertex lighting,
the colour of each vertex is calculated and the colours are then interpolated across the remaining pixels.
With the per-pixel approach, we do not interpolate colours; instead we interpolate the normal vectors and
calculate the final colour of each pixel using its own normal. At this stage, we can modify the values of
these interpolated vectors using data read from a normal map. In this way we obtain realistic rendering
without needing high-polygon models.

Per-pixel lighting techniques require normal vectors, and sometimes other height information, declared at
each pixel, which means we have one normal vector per texel (texture element). If these normal vectors were
declared in the world-space coordinate system, we would have to rotate them every time the object is rotated.
As lights and camera are defined in world space, this would require thousands or even millions of
object-to-world matrix transformations. That is why we declare the normals in the tangent-space coordinate
system. Tangent space (vertex space) is a frame of reference attached to each vertex in which the position of
the vertex is [0, 0, 0] and the coordinates of the normal vector at the vertex are [0, 0, 1]. The other
vectors forming this frame are called the tangent vector [1, 0, 0] and the bi-normal vector [0, 1, 0]. The
matrices required for the transformation from world space to tangent space can be pre-calculated, just like
the normals. Transforming the lights and the camera into tangent space and doing the calculations there saves
orders of magnitude in computation.
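
A minimal C++ sketch of pre-calculating such a tangent frame per triangle from its positions and texture coordinates is shown below (a standard derivation given for illustration, not code from the simulator); together with the vertex normal, the tangent and bi-normal form the rows of the world-to-tangent matrix used to move lights and camera into tangent space.

  #include <cmath>

  struct Vec2 { float u, v; };
  struct Vec3 { float x, y, z; };

  static Vec3 Sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
  static Vec3 Scale(const Vec3& a, float s)     { return { a.x * s, a.y * s, a.z * s }; }
  static Vec3 Normalize(const Vec3& a)
  {
      const float len = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
      return len > 0.0f ? Scale(a, 1.0f / len) : a;
  }

  // Tangent and bi-normal of a triangle (p0, p1, p2) with texture coordinates (t0, t1, t2):
  // solve  edge1 = du1*T + dv1*B  and  edge2 = du2*T + dv2*B  for T and B.
  void TangentFrame(const Vec3& p0, const Vec3& p1, const Vec3& p2,
                    const Vec2& t0, const Vec2& t1, const Vec2& t2,
                    Vec3& tangent, Vec3& binormal)
  {
      const Vec3  e1  = Sub(p1, p0), e2 = Sub(p2, p0);
      const float du1 = t1.u - t0.u, dv1 = t1.v - t0.v;
      const float du2 = t2.u - t0.u, dv2 = t2.v - t0.v;

      const float det = du1 * dv2 - du2 * dv1;
      const float r   = (det != 0.0f) ? 1.0f / det : 0.0f;  // degenerate UVs -> zero frame

      tangent  = Normalize(Scale(Sub(Scale(e1, dv2), Scale(e2, dv1)), r));
      binormal = Normalize(Scale(Sub(Scale(e2, du1), Scale(e1, du2)), r));
  }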

The keystone of normal mapping is introducing small deviations to the normal vector in tangent space. This
can be done procedurally or by using pre-calculated values, for example read from normal maps. A normal
map is a texture whose texels are not interpreted as an RGB colour but as the XYZ direction of a vector. The
vector represents the normal, perpendicular to the surface (triangle) at the considered texel, expressed in
tangent space. If the vector has the direction [0, 0, 1] it does not introduce any deviation, which explains
why normal maps tend towards blue when opened in an image viewer. If we modify the vector, for example to
[0.2, 0.3, 0.93], we modify the normal of that texel. If this is done in a controlled manner, we obtain
realistic rendering without needing high-polygon models. Normal maps are usually obtained by processing a
diffuse map or by generating them from a high-polygon version of the 3D model.

The use of such normal mapping therefore results in a more detailed scene without losing the real-time
performance so important in simulation. Figure… gives an example of the difference between a scene of the
simulator without and with texture. In our case, the texture comes from…




6. INTEGRATION


7. EVALUATION

This chapter presents the evaluation of the flexible endoscope simulation as well as the performance of the
simulator. Feedback from clinicians is also reported.




8. CONCLUSION

This chapter concludes the project, summarising its achievements, discussing the results and giving
recommendations for future work.



8.1 Achievements

8.2 Future work




USER MANUAL




Figure 33: LOGITECH GAMING PAD


A + B = hold button B while manipulating A


SHAFT CONTROLS:
left stick up/down:                    move forward/backward
left stick left/right:                 rotate
right stick up/down:                   bend tip

LEFT ACTUATOR CONTROLS:
buttons 5/7:                           move forward/backward
left stick up/down + 1:                rotate 1
left stick up/down + 2:                rotate 2
left stick up/down + 1 + 2:            bend left
left stick up/down + 5 + 7:            increase/decrease grasper angle

RIGHT ACTUATOR CONTROLS:

buttons 6/8:                  move forward/backward
left stick up/down + 4:       rotate 1
left stick up/down + 3:       rotate 2
left stick up/down + 3 + 4:   bend
left stick up/down + 6 + 8:   increase/decrease grasper angle

CAMERA AND LIGHT CONTROLS:
dpad up/down:                 move forward/backward
dpad left/right:              rotate
dpad up/down + 1:             rotate 1
dpad up/down + 2:             rotate 2
dpad up/down + 1 + 2:         bend

9/10:                         increase/decrease camera view angle
9/10 + 1:                     increase/decrease light cone angle
9/10 + 2:                     increase/decrease light exponent




APPENDIX

The appendix contains information peripheral to the main body of the report, such as the user manual, the
coding practices used during implementation, and lists of abbreviations, figures and code snippets.

Coding practices

Hungarian notation:

                          Type                           Indicator            Example
                         Global                             g_             g_uiRefCount
                         Member                             m_             m_uiRefCount
                       Enumeration                           e               eStateOK
                      Signed integer                     i or n               iXPos,
                                                                            nLoopCount
                          Float                             f               fDiameter
                         Double                             d                 dTime
                         Boolean                            b              bNeedToDraw
                  Zero terminated string                    sz               szOutput
                       Signed char                          ch              chKeyInput
                        Unsigned                            u              uchInputVal
                        Parameter                           _               _rbSuccess


Abbreviations

2D      –      Two Dimensional
3D      –      Three Dimensional
AABB    –      Axis Aligned Bounding Box
BVH     –      Bounding Volume Hierarchy
CT      –      Computed Tomography
FEM     –      Finite Element Method
FPS     -      Frames per second
GUI     –      Graphical User Interface
IO      –      Input /Output
LMB     –      Left Mouse Button
MAS     -      Minimal Access Surgery
MMB     –      Middle Mouse Button
MRA     –      Magnetic Resonance Angiography
NOTES   -      Natural Orifice Transluminal Endoscopic Surgery
OO      –      Object Oriented
OOA     –      Object Oriented Analysis
OOD     –      Object Oriented Design
OOP     –      Object Oriented Programming
RMB     –      Right Mouse Button
STL     –      Standard Template Library
VCSIM   -      Virtual Catheterisation Simulator
VE      -      Virtual Environment
VR          -            Virtual Reality


List of figures

Figure 1: Cholecystectomy (gallbladder removal) using laparotomy (right) and laparoscopic (left) procedures
to access abdomen ........................................................................................................................................... 10
Figure 2: Flexible endoscope and cut through the endoscope shaft ............................................................... 11
Figure 3: Gallbladder removal through the mouth using NOTES (left) and a gallbladder viewed from the
laparoscope's camera (right) ............................................................................................................................ 12
Figure 4: Different prototypes of NOTES endoscopes. On the left, Cobra device by USGI medical. In the
middle, a concept device by Olympus Medical Systems. On the right, a robotic system for NOTES proposed
by Nanyang Technical University in Singapore. ............................................................................................... 13
Figure 5: An example mesh bounded using axis aligned bounding boxes ....................................................... 17
Figure 6: Different types of bounding volumes ................................................................................................ 18
Figure 7: Virtual NOTES Endoscope - the shaft with actuators ........................................................................ 21
Figure 8: UML use case diagram (DRAFT) ...................................................................................................... 22
Figure 9: CVec3f - UML class diagram .............................................................................................................. 23
Figure 10: CTriangle and CModel - UML class diagram .................................................................................... 23
Figure 11: The instruments – UML class diagram ............................................................................................ 24
Figure 12: The AABB tree - UML class diagram FROM (7) ................................................................................ 25
Figure 13: CSphere – UML class diagram ......................................................................................................... 26
Figure 14: CSystem – UML class diagram ......................................................................................................... 26
Figure 15: CFLTK and CVTK – UML class diagram ............................................................................................. 27
Figure 16: IODevice and CJoystick – UML class diagram .................................................................................. 27
Figure 17: Callback mechanism – UML class diagram ...................................................................................... 28
Figure 18: Timetable for the project ................................................................................................................ 29
Figure 19: Mass-spring model of the shaft....................................................................................................... 30
Figure 20: Logitech Dual Action pad (left) used during the project. Due to use of DirectX a wide range of
controllers is supported, for example Microsoft X-Box 360 controller (middle) or traditional joysticks (right).
.......................................................................................................................................................................... 31
Figure 21: Shaft translation .............................................................................................................................. 32
Figure 22: Shaft rotation – ideal torsion control .............................................................................................. 34
Figure 23: Shaft rotation – non-ideal torsion control ...................................................................................... 35
Figure 24: The cross section through the organ with bounding boxes ............................................................ 37
Figure 25: External forces ................................................................................................................................. 39
Figure 26: 2D Bending force ............................................................................................................................. 41
Figure 27: 3D Bending force ............................................................................................................................. 43
Figure 28: Rotational friction ........................................................................................................................... 48
Figure 29: Actuators. In this example, the endoscope has a camera and two actuators: a grasper and a
cutter. ............................................................................................................................................................... 49
Figure 30: Three different actuators can be set at the end of the endoscope: a grasper (left), a cutter
(middle) and a poker (right). All are displayed as textured meshes with wireframe....................................... 49
Figure 31: grasper (far) and the cutter (close), closed and opened. ................................................................ 50
Figure 32: The triangulated actuators (left) and folded for movement (right) ................................................ 50
Figure 33: LOGITECH GAMING PAD .................................................................................................................. 53



List of code snippets

Code snippet 1: Shaft translation - pushing ..................................................................................................... 32
Code snippet 2: Shaft translation - pulling ....................................................................................................... 33
Code snippet 3: Shaft rotation – ideal torsion control ..................................................................................... 34
Code snippet 4: Shaft rotation – non-ideal torsion control ............................................................................. 36
Code snippet 5: External forces ........................................................................................................................ 39
Code snippet 6: Spring forces ........................................................................................................................... 40
Code snippet 7: 2D Bending force .................................................................................................................... 42
Code snippet 8: 3D Bending force .................................................................................................................... 45
Code snippet 9: Force propagation during tip translation ............................................................................... 46
Code snippet 10: Force propagation during body translation ........................................................................ 47
Code snippet 11: Force propagation during tip rotation ................................................................................. 48
Code snippet 12: Force propagation during body rotation.............................................................................. 48




BIBLIOGRAPHY

1. Rattner, D. Introduction to NOTES White Paper. Surgical Endoscopy. 2006.

2. McGee, A. A Primer on Natural Orifice Transluminal Endoscopic Surgery: Building a New Paradigm .
Surgical Innovation Vol. 13. 2006, pp. 86-93.

3. Spivak, H and Hunter, JG. Endoluminal surgery. Surgical Endoscopy. 1997, pp. 321-325.

4. MacFadyen, BV. Endoluminal surgery 2005;. Surgical Endoscopy. 2005.

5. Ponsky, JL. Endoluminal surgery: past, present and future. Surgical Endoscopy. 2006, pp. 500-2.

6. Richards, WO and Rattner, DW. Endoluminal and transluminal surgery: no longer if, but when. Surgical
Endoscopy. 2005, pp. 19: 461-3.

7. Blazewski, R. Virtual Catheterisation Simulator. 2008.

8. Allard, J, et al. SOFA – an Open Source Framework for Medical Simulation.

9. Sclabas, G, Swain, P and Swanstrom, L. Endoluminal Methods for Gastrotomy Closure in Natural Orifice
TransEnteric Surgery. Surgical Innovation. 2006, pp. 13: 23-30.

10. Rittner, D and Kalloo, A. ASGE/SAGES Working Group on Natural Orifice Translumenal Endoscopic
Surgery. Surgical Endoscopy. 2006, pp. 20: 329–333.

11. Simulation and Virtual Reality in Surgical Education. Real or Unreal? Gorman, PJ, Meier, AH and
Krummel, TM. 1999, Archives of Surgery, pp. 134:1203-1208.

12. Virtual reality surgical simulator. The first steps. Satava, RM. 1993, Surgical Endoscopy, pp. 7:203-205.

13. Simulation in surgery: opportunity or threat? Gallagher, A G and Traynor, O. 2008, Irish Journal of
Medical Sciences, pp. 177:283–287.

14. LapSim Validation Studies. [Online] 2010. [Cited: 22 03                     2010.]    http://www.surgical-
science.com/_files/files/download.cfm?file=LapSimValidation2010.pdf.

15. Fried, NP, et al. The Use of Surgical Simulators to Reduce Errors.

16. Real-time deformable models for surgery simulation: a survey. Meier, U, et al. 2005, pp. 183-197.

17. STRANDS: Interactive Simulation of Thin Solids using Cosserat Models, . Pai, Dinesh K. 2002,
EUROGRAPHICS Volume 21, Number 3.

18. CORDE: Cosserat Rod Elements for the Dynamic Simulation of One-Dimensional Elastic Objects.
Spillmann, J and Teschner, M.

19. Lang, H, Linn, J and Arnold, M. Multibody dynamics simulation of geometrically exact Cosserat rods.

20. Collision detection: A survey; . Kockara, S, et al. 2007. Systems, Man and Cybernetics - IEEE International
Conference.

21. Teschner, M, et al. Collision detection for deformable objects.

22. Hermann, E, Faure, F and Raffin, B. Ray-traced collision detection for deformable bodies.

23. Bartz, D, Klosowski, J T and Staneker, D. k-DOPs as Tighter Bounding Volumes for Better Occlusion
Performance.

24. Adaptively sampled distance fields: A general representation of shape for computer graphics. . Frisken, S
F, et al. 2000. Siggraph, Computer Graphics Proceedings. pp. 249–254.

25. An efficient and scalable deformable model for virtual reality-based medical applications. Choi, Kup-Sze,
Sun, Hanqiu and Heng, Pheng-Ann. 2004, Artificial Intelligence in Medicine 32, pp. 51—69.

26. Real-time guidewire simulation in complex vascular models. Luboz, V, et al. 2009, The Visual Computer,
Volume 25, Number 9.

27. 3D ChainMail: a fast algorithm for reforming volumetric objects, . Frisken-Gibson, Sarah F. 1997.
SIGGRAPH - Proceedings of the Symposium on Interactive 3D Graphics. pp. 149—154.

28. Park, Jinah, et al. Shape Retaining Chain Linked Model for Real-time Volume Haptic Rendering.

29. Frisken-Gibson, Sarah F. Using Linked Volumes to Model Object Collisions, Deformation, Cutting,
Carving, and Joining. . IEEE transactions on visualization and computer graphics, vol. 5. no. 4,.

30. Avril, Q, Gouranton, V and Arnaldi, B. New trends in collision detection performance, . hal-00412870,
version 1 - 2. 2009.

31. Development of advanced endoscopes for Natural Orifice Transluminal Endoscopic Surgery. Bardaro, S
and Swanstrom, L. 2006, Minimally Invasive Therapy, pp. 378–383.



