Table of Contents
Cover
Half Title Page
Title Page
Copyright Page
About the Author
Table of Contents
Preface
Chapter 1 Introduction
1.1 Random Variables
1.2 Stages of The Simulation
1.3. Monte Carlo Simulation Type
1.4 Theory of Tails
1.5 Models of Waiting Lines
Chapter 2 Introduction To System Dynamics Simulations
2.1 Stages In Building Models
2.2 Causal Diagrams
2.3 Levels Diagrams – Flow
2.4 Mathematical Structure
Chapter 3 Implementation of System Dynamics For Urban Planning In A Municipality
3.1 Introduction and Objectives
3.2 Conceptualization And Formalizing Different Submodels
3.3 Construction Submodel
3.4 Submodel Economic Activity
3.5 Analysis Feedback Loops
3.6 Study of Model Behavior Under Different Scenarios
3.7 Conclusions
Chapter 4 Dynamic Integrated Framework For Improving Software Processes
4.1 Simulation Software Development Process
4.2 Justification
4.3 Development of DIFSPI
4.4 Using The Simulation at Different Levels of Maturity
4.5 Conclusions
Chapter 5 Vehicle Aerodynamic Analysis Using CFD Simulation
5.1 Introduction
5.2 Mathematical Modeling
5.3 Approximations Aerodynamic CFD
5.4. Potential Flow Model
5.5. Simulation Details
5.6 Simulation Results
Chapter 6 Parallelization in Hydraulic Simulations
6.1. Introduction
6.2. Parallel Architecture Features
6.3 Computer Concept Parallelism
6.4 Hydraulic Simulation Application
6.5 Conclusions
Chapter 7 Learning and Computational Analysis Technologies
7.1. Introduction
7.2. Problem Statement
7.3. Case Study
7.4. Work Methodology
7.5. Conclusion
Chapter 8 Simulation Processes For Intercommunication Devices In an Intelligent Environment
8.1. Introduction
8.2. Cyclical Instability
8.3. Buzz Box
8.4. Implementation
8.5. Results
8.6. Conclusions
Chapter 9 Numerical Simulations of Systems and Models
9.1 A Brief Introduction to Systems
9.2 Evolution of Mathematical Models
9.3 The Models as Approximations of Reality
9.4 Classification Models Based on Purpose
9.5 Model Building
9.6 Modeling And Simulation
9.7. The Simulation Model
Chapter 10 Transient Simulations In Fluid Mechanics
10.1 Introduction
10.2 Description Of The Problem
10.3 Oscillations Which Are Frictionless
10.4 Oscillations With Friction
10.5 Gravity Flow
10.6 Oscillations of A U-Tube Which Are Frictionless
10.7 Turbulent Friction
Chapter 11 Numerical Analysis of Efforts and Contact Areas in a Scorpio Stryker For Designing Customized to the PTR Phenotype
11.1. Introduction
11.2. Preliminary Analysis Validation
11.3 Preliminary Analysis
11.4 Case Study With Numerical Analysis
11.5. Results
11.6. Discussion
11.7 Conclusions
Chapter 12 Lapbot Positioning In A Three-Dimensional Virtual Environment Using Simulated Interface
12.1 Introduction
12.2. Surgical Simulators
12.3. Surgical Robotics
12.4 Haptic Interfaces
12.5 Using Positioning Lapbot Novint Falcon Interface
12.6 Haptic Libraries
12.7 Positioning An Object Using Novint Falcon
12.8 Positioning LapBot
12.9 Collision Detection And Force Feedback In A Virtual Three-
Dimensional Environment
12.10 NxOgre
12.11 Collision Detection And Force Feedback
Chapter 13 Conclusion And Final Considerations
Chapter 14 Conclusion
References
Index


COMPUTATIONAL AND NUMERICAL SIMULATIONS

COMPUTATIONAL AND NUMERICAL SIMULATIONS

Ivan Stanimirović

Arcler Press

www.arclerpress.com

Computational and Numerical Simulations Ivan Stanimirović

Arcler Press
2010 Winston Park Drive, 2nd Floor
Oakville, ON L6H 5R7
Canada
www.arclerpress.com
Tel: 001-289-291-7705, 001-905-616-2116
Fax: 001-289-291-7601
Email: [email protected]

e-book Edition 2019
ISBN: 978-1-77361-585-1 (e-book)

This book contains information obtained from highly regarded resources. Sources of reprinted material are indicated, and copyright remains with the original owners. Copyright for images and other graphics remains with the original owners, as indicated. A wide variety of references are listed. Reasonable efforts have been made to publish reliable data. The authors, editors and publisher are not responsible for the accuracy of the information in the published chapters or for the consequences of its use. The publisher assumes no responsibility for any damage or grievance to persons or property arising from the use of any materials, instructions, methods or ideas in the book. The authors or editors and the publisher have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission has not been obtained. If any copyright holder has not been acknowledged, please write to us so that we may rectify it.

Notice: Registered trademarks of products or corporate names are used only for explanation and identification, without intent to infringe.

© 2019 Arcler Press
ISBN: 978-1-77361-385-7 (Hardcover)

Arcler Press publishes a wide variety of books and eBooks. For more information about Arcler Press and its products, visit our website at www.arclerpress.com

ABOUT THE AUTHOR

Ivan Stanimirović gained his PhD from the University of Niš, Serbia, in 2013. His work spans from multi-objective optimization methods to applications of generalized matrix inverses in areas such as image processing, computer graphics and visualization. He is currently working as an assistant professor at the Faculty of Sciences and Mathematics, University of Niš, on the computation of generalized matrix inverses and their applications.


PREFACE

Simulation is an area of study within Operations Research that is applied in practically every field. It allows a system to be studied without performing experiments on the real system, which has many advantages, discussed below. It is not, however, the only way to study a system. Another possibility is to construct an analytical model, a set of equations (usually differential) that represents the system and is then solved for different situations; or to pose an optimization model that seeks the best strategy for the system according to some measure of performance defined in an "objective function", while satisfying the conditions of the problem expressed as "constraints". Models obtained as sets of equations, that is, differential-equation or optimization models, are usually called analytical models. Constructing an analytical model often has serious drawbacks, among them:

• The difficulty of finding the equations that represent the real system, and
• The difficulty of solving the model.
Moreover, the people on such a team usually need a great deal of training and skill, so these teams are often costly. In contrast, teams building simulation models can be formed by people with less specialized qualifications, so coordinating them is generally simpler and often cheaper. This is not to say that analytical models are useless: for certain types of problems, both the way to build the model and how to construct an efficient algorithm to solve it are well known. Simulation is a word familiar to professionals in every discipline, and even to people who have never studied a profession, so its meaning almost explains itself. Among the meanings ordinary people attach to the word "simulate" are: "imitate reality", "emulate a system", and "give the appearance or effect of a real system or situation". Many definitions of simulation have been proposed; here are two:

"A simulation is an imitation of the operation of a real-world process over a certain time."
"The behavior of a system over a given time can be studied by means of a simulation model. This model usually takes its shape from a set of assumptions about the operation of the actual system."
The first definition implies a system containing a process (possibly made up of several threads), that is, a system that changes over time. Note that it does not indicate whether the relations between the system variables are discrete or continuous; that depends on the model chosen to represent the real system. This division does not always exist in reality: it is we human beings who have separated the discrete from the continuous to make study easier, just as we have divided nature into physics, biology, mathematics, and so on. However it is divided, nature will probably remain unique and indivisible. A model is a representation of an object of interest. Although the object is unique, the number of possible representations is usually very large, so the number of models of a real-world system is too; since there are as many representations as there are conceptions of reality, the number of models is, in practice, unlimited. Having more than one simulation model for a real system should not worry us too much: finding a simulation model is almost always easy, while finding an analytical model is often an arduous task, and for many problems an analytical model simply does not exist. Note that the second definition places its emphasis on the model, suggesting the possibility of different models, which is completely natural given the multiplicity of models for the same real-world object. Note also that it states an objective of simulation: to study real systems through a model. We could add that the purpose of studying real systems is to understand the interaction of the processes involved, in order to modify them and obtain some benefit. This definition also implies that:
• A simulation model represents a set of assumptions (or postulates) about the operation of a real system.
• The postulates of a simulation model can be expressed as relationships between the entities or objects of interest in the system in the form of mathematical expressions, which would lead to an analytical model. Fortunately, these mathematical expressions can be replaced, and the values of the variables of interest calculated, by means of probability distribution functions.


For queueing problems, analytical models aim to represent the average results obtained from those probability distribution functions, and Markov models point in the same direction. Discrete-event (or Monte Carlo type) simulation models, on the other hand, use the distribution functions to carry out an experiment whose results, after a suitable number of trials, approach those that would be obtained in the actual system. Such simulation models have the advantage that they can be built for many types of problems, not only waiting-line problems. There are also models from control theory that incorporate probability distribution functions and what is known as system stability, more recently referred to as chaos theory, which can likewise be applied to a variety of problems; these stability models, however, are usually hard to build and validate. There are, in addition, models that combine optimization with distribution functions to study real-world systems in some way, such as neural networks and genetic algorithms, and other techniques such as Petri nets and regression models. This course studies only queueing models, Markov models, and discrete-event simulation of the Monte Carlo type. In what follows, the term "simulation" refers to discrete-event simulation of the Monte Carlo type.


CHAPTER 1

INTRODUCTION

CONTENTS
1.1 Random Variables
1.2 Stages of The Simulation
1.3 Monte Carlo Simulation Type
1.4 Theory of Tails
1.5 Models of Waiting Lines


Simulation enables the study of, and experimentation with, the internal interactions of a real system, or the interactions between a subsystem and one or more other systems, where the relationships are stochastic in nature. Simulation is useful when:

• We need to analyze different changes in the information and their effect.
• We want to experiment with different designs or policies.
• We want to verify analytical solutions.
• An analytical model is impossible or difficult to build.
• We want to study a real system for which experimenting on the system itself is dangerous or costly, and doing so through an analytical model is impossible or inconvenient.
In addition, simulation can be used as a teaching tool, to reinforce analytical methodologies and to determine which variables of a system model are the most important. In this way a refined model of the real system can be built, and different models can be prepared for simulation. Computers play a decisive role here, because the complexity of the calculations exceeds human computing power. History marks the 1940s as the birth of modern numerical methods, when three essential elements came together: the development of programmable electronic computers, the development of modern mathematical analysis, and the availability of, and need to solve, complex problems in science and technology. Areas such as fluid mechanics, the study of the electromagnetic properties of materials, and the analysis of complex mechanical systems relied heavily on these developments. A supercomputer has high computing power and stability. It is used for calculation-intensive tasks such as computing quantum processes, analyzing particle-physics results, weather forecasting, climate-change effects, molecular kinetics, aeronautical or automotive simulations, and wind processes in fusion and fission reactors. The first supercomputers were introduced in the sixties, designed by Seymour Cray at Control Data Corporation until Cray founded his own company, Cray Research; since then the supercomputing market has grown exponentially. In their 1947 article on the inversion of matrices of high order, von Neumann and Goldstine established the possibility of employing computers to perform, in reasonable times, complex numerical calculations that until then had not been affordable to humans.


Various problems that appear together in physical and technological analysis share the same characteristics:

• The solution of large systems of linear equations representing complex systems
• The solution of systems of nonlinear equations
• Optimization problems with a large number of variables and constraints
• The problem of fitting functions to a set of experimental points
• The theory of approximations, which seeks to approximate the value of a complex function (such as sin(x), ln(x), e^x, etc.) by one based only on the basic operations (addition, subtraction, multiplication and division)
• The development of Fourier analysis of complex signals
• Integration and automatic differentiation of complex functions by computer systems
• The solution of ordinary and partial differential equations and of integral equations
These problems appeared jointly once the advent of programmable computers propelled the development of numerical techniques for analyzing physical phenomena, complex systems and their applications. Increasingly complex problems arose, with ever more variables, competing with the growing capacity of available computers and the need for increasingly accurate results. One example may suffice. Automotive design usually demands quality standards and strict predictability of moving parts on the order of millimeters, but the development of the aviation industry made it necessary to establish micrometric predictability in its standards, while nuclear science reduced these values by up to six further orders of magnitude. Even on the most powerful computers, the complicated calculation processes were slow, so precision and speed began to wage a fine battle over how to combine them properly on the computers available without compromising accuracy. The stability analysis of the results and of the numerical methods developed became as important as the development of the numerical methods themselves.
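As an illustration of one of the tasks listed above, the approximation of a complex function using only the four basic operations, the following minimal sketch (not taken from the book) evaluates sin(x) by the partial sums of its Taylor series:

#include <cstdio>
#include <cmath>

// Approximate sin(x) with the first n terms of its Taylor series:
// sin(x) = x - x^3/3! + x^5/5! - ...
// Only addition, subtraction, multiplication and division are used.
double taylor_sin(double x, int n) {
    double term = x;   // current term, starts at x
    double sum = x;
    for (int k = 1; k < n; ++k) {
        // each term is the previous one times -x^2 / ((2k)(2k+1))
        term *= -x * x / ((2.0 * k) * (2.0 * k + 1.0));
        sum += term;
    }
    return sum;
}

int main() {
    double x = 1.0;
    std::printf("Taylor: %.10f  library: %.10f\n", taylor_sin(x, 10), std::sin(x));
    return 0;
}

With ten terms the partial sum already agrees with the library value to many decimal places for moderate x; for large x the series requires range reduction, which is precisely the kind of issue the stability analysis mentioned above studies.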


The impact and simulation of the propagation of an earthquake has been a remarkable job carried out on CRAY computers by UNAM, as the company that produces them proudly announces on its website. If we study the history of statistical numerical methods, we find that in all the areas described above about 90% of the methods have been developed over the last 70 years. Much of the numerical methods studied in this book do not correspond to those developed since 1940 but to the remaining 10%, which emerged earlier as mathematical curiosities of little applicability at the time; they are, however, currently sufficient to solve most generic simulation problems, while the methods developed over the last 70 years each apply to highly specific problems. The early development of numerical methods can hardly be traced in history. Understanding the use of the numerical methods available today requires three skills associated with the elements mentioned above: the ability to understand how a numerical method represents a mathematical concept and how it can be manipulated as an alternative when the known analytical methods have major limitations; the ability and knowledge to implement these methods by programming a computer; and, finally, the ability to understand and visualize a physical or technological problem through these methods, using the mathematical concepts learned in analytical study. This book aims to introduce students to these three aspects. One may be skeptical of this approach and its importance. Whoever has not used numerical methods to solve a complex problem has so far had two well-known paths: the analytical approach, which solves by mathematical analysis a simple problem without too many complications, as is done in most theoretical science and introductory engineering courses; and the path of experimental and technological development, which allows a process or prototype to be designed without major prior calculation. We could remain there in comfort and without major surprises, but modern science and technology only rarely pose simple problems that can be solved analytically. Nobody builds an airplane without first simulating it, or a particle accelerator without numerical work before construction. Thus the development of numerical techniques as mathematical alternatives to analytical ones constitutes the body of study of numerical methods, and the quantification of their efficiency, numerical analysis. For each element of mathematical knowledge, alternative techniques have been developed


that address the problems from a purely numerical perspective and seek to approach the predictive power of analytical methods. While we could base the study of these techniques on a descriptive approach, complemented by an analytical approach to their behavior, the reality of professional practice also involves learning to implement them on more or less developed, programming-based technology platforms. The reason is simple: increased accuracy is typically achieved by applying each method a large number of times, which scales the number of associated operations and the complexity of carrying them out with elementary calculation techniques; the use of computational automata therefore becomes necessary and essential. This work thus has three orientations. The first is to give an introduction to the typical numerical methods and basic problems, especially those associated with algebra and calculus, which are basic elements in many of the analytical applications of science and technology. The second is to convey not only an understanding of numerical methods but also the way they are implemented in programming languages, mastering some elements of that process. The third is the possibility of using specialized software that simplifies the work, even though a degree of mastery of numerical methods is still needed to build the numerical analysis of a problem. For these reasons, the book evolves from a basic understanding of some simple and recursive methods; it is primarily oriented to learning programming in languages such as C++ and Python, akin to the language built into Mathematica, but with a brief emphasis on Excel so that readers initially understand the methodology behind the classical numerical methods. The use of the Mathematica tool has a dual purpose: it provides tools for visualization and simulation, which will be useful in the integration projects in the book, and it supports the Problem-Based Learning solution scenarios integrated into each chapter. Working with Mathematica also teaches the commands of modern software for simulation and mathematical calculation, which in daily life can synthesize future applications of the concepts in the book, without ignoring the need for programming to display and perform engineering calculations. The field of numerical methods is very broad and seeks to understand alternative procedures for solving mathematical problems that are simple to state but cannot be solved analytically: for example, the solution of an equation, the calculation of an integral, the solution of an optimization problem, or the solution of a differential equation.
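As a small illustration of the first of these tasks, solving an equation numerically, the following sketch (not taken from the book) applies the bisection method to f(x) = x^3 - 2:

#include <cstdio>
#include <cmath>

// Bisection: find a root of f in [a, b], assuming f(a) and f(b) have opposite signs.
double bisect(double (*f)(double), double a, double b, double tol) {
    double fa = f(a);
    while ((b - a) / 2.0 > tol) {
        double m = (a + b) / 2.0;
        double fm = f(m);
        if (fa * fm <= 0.0) {   // the root lies in [a, m]
            b = m;
        } else {                // the root lies in [m, b]
            a = m;
            fa = fm;
        }
    }
    return (a + b) / 2.0;
}

double f(double x) { return x * x * x - 2.0; }  // root is the cube root of 2

int main() {
    double r = bisect(f, 1.0, 2.0, 1e-10);
    std::printf("root ~ %.10f (cbrt(2) = %.10f)\n", r, std::cbrt(2.0));
    return 0;
}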


Moreover, there is a variety of specific numerical methods for each type of problem, which differ not only in their relative efficiency but in the specific mathematical elements involved. Yet another aspect is the particular problem to be solved, since integrating different numerical methods in a given sequence imposes requirements on the inputs and on the form of the generated results, which restricts their use. Some applications of simulation that we can mention are:

• Maintenance.
• Traffic simulation of a system (teleprocessing, air and land transport, telecommunications, telephony, ...).
• Changes in system configuration.
• Economic simulation.
• Military strategies.
• Inventory control.
• Production lines, etc.

1.1 RANDOM VARIABLES
A random variable is a variable X that can take a set of values {x0, x1, x2, ..., xn-1} with probabilities {p0, p1, p2, ..., pn-1}. For example, in the experiment of tossing a coin, the possible outcomes are {heads, tails} and their probabilities are {1/2, 1/2}. In the experiment of throwing a die, the possible outcomes are {1, 2, 3, 4, 5, 6} and their respective probabilities are {1/6, 1/6, 1/6, 1/6, 1/6, 1/6}. Now consider the experiment of spinning a roulette wheel and noting the number of the sector that the arrow points to. In the roulette wheel on the left of the figure the possible results are {0, 1, 2, 3, 4, 5, 6, 7}, and the probability of each outcome is 1/8. In the roulette wheel on the right of the figure the possible results are {0, 1, 2, 3}, and the respective probabilities are {1/4, 1/2, 1/8, 1/8}, proportional to the angle of each sector. In the first three examples the random variable X is said to be uniformly distributed, since all results are equally likely; in the last example, however, the random variable X is not uniformly distributed. The crucial problem in applying Monte Carlo methods is to find the values of a random variable (discrete or continuous) with a probability distribution given by the function p(x), starting from the values of a random variable uniformly distributed on [0, 1) provided by the computer or by a built-in program routine.


To simulate a physical process or to find the solution of a mathematical problem it is necessary to use large quantities of random numbers. The mechanical method of the roulette wheel would be very slow, and in addition any real physical device generates random variables whose distributions differ, at least slightly, from the ideal uniform distribution. Tables of uniformly distributed random numbers, thoroughly checked with special statistical tests, can also be used; they are useful only when the Monte Carlo calculation is carried out by hand, which nowadays is unimaginable. In practice it is more convenient to employ so-called pseudo-random numbers: numbers obtained deterministically, one from another, starting from an initial number called a seed.
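As a minimal sketch of this idea (not from the book), a linear congruential generator produces a sequence of pseudo-random numbers in [0, 1) from a seed; the constants below are those of the classic Park–Miller "minimal standard" generator:

#include <cstdio>
#include <cstdint>

// Park–Miller "minimal standard" linear congruential generator:
// x_{k+1} = (a * x_k) mod m, with a = 16807 and m = 2^31 - 1.
// Dividing by m maps the state to a pseudo-random number in [0, 1).
struct Lcg {
    std::uint64_t state;
    explicit Lcg(std::uint64_t seed) : state(seed % 2147483647ULL) {
        if (state == 0) state = 1;           // the seed must not be 0
    }
    double next() {
        state = (16807ULL * state) % 2147483647ULL;
        return static_cast<double>(state) / 2147483647.0;
    }
};

int main() {
    Lcg gen(12345);                          // the seed determines the whole sequence
    for (int i = 0; i < 5; ++i)
        std::printf("%f\n", gen.next());
    return 0;
}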

1.1.1 Discrete Random Variable
To simulate the event we proceed as follows: the probability of each outcome, proportional to the angle of its sector, is written in the second column, and the total must sum to one. In the third column the cumulative probabilities are written.

Result    Probability    Cumulative probability
0         0.25           0.25
1         0.5            0.75
2         0.125          0.875
3         0.125          1

A random number uniformly distributed in the interval [0, 1) is drawn. On the X axis the different results, which we have named x0, x1, x2, x3, are placed. On the vertical axis the probabilities are drawn as vertical segments of length equal to the probability pi of each result, and these segments are stacked one above the other over their respective result xi. This produces a step function. When a random number is drawn, a horizontal line is traced at that ordinate; the result sought is the one whose abscissa is the intersection of the horizontal line with a vertical segment, as indicated by the arrows in the figure. If the random number is between 0.25 and 0.75, the result called x1 is obtained.

The table describes the drawing of a discrete random variable from a random variable uniformly distributed in the interval [0, 1).
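A minimal sketch of this table lookup in code (not from the book), using the same four outcomes and probabilities as above: a uniform number in [0, 1) is compared against the cumulative probabilities, and the first interval that contains it gives the result.

#include <cstdio>
#include <random>

// Draw a discrete random variable from its cumulative probabilities,
// exactly as in the table: 0 -> 0.25, 1 -> 0.5, 2 -> 0.125, 3 -> 0.125.
int draw(double u, const double* cumulative, int n) {
    for (int i = 0; i < n; ++i)
        if (u < cumulative[i]) return i;    // first step of the staircase above u
    return n - 1;
}

int main() {
    const double cumulative[4] = {0.25, 0.75, 0.875, 1.0};
    std::mt19937 gen(42);                               // seeded pseudo-random generator
    std::uniform_real_distribution<double> uniform(0.0, 1.0);

    int counts[4] = {0, 0, 0, 0};
    for (int k = 0; k < 100000; ++k)
        ++counts[draw(uniform(gen), cumulative, 4)];

    for (int i = 0; i < 4; ++i)                         // frequencies approach 0.25, 0.5, 0.125, 0.125
        std::printf("result %d: %.3f\n", i, counts[i] / 100000.0);
    return 0;
}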

Novint Falcon specifications (table fragment):
Resolution: > 400 dpi
Handle: quick-disconnect
Force capability: 2 lb (8.896 N)
Haptics and graphics run in separate loops; single-point interaction
The device must maintain stability; haptic wall (F = kx) with force feedback
Degrees of freedom: 3 DOF

Table 12.5: Inputs and outputs of the Novint Falcon system [2]
Inputs: digital (4): buttons 1, 2, 3 and 4; analog (3): encoders 1, 2 and 3
Outputs: digital (3): LEDs 1, 2 and 3; analog (3): motors 1, 2 and 3

12.5.4 3D Simulation Engine
The 3D simulation engine is Ogre3D (Object-Oriented Graphics Rendering Engine). Ogre is an open-source, object-oriented graphics engine written in C++, licensed under the GNU Lesser General Public License (LGPL), which allows free use with some minor restrictions. It was designed with the aim of facilitating the work of developers who produce applications based on hardware-accelerated 3D graphics [20]. Since 2001 OGRE has grown to become one of the most popular open-source graphics rendering engines, and it has been used in a large number of projects in areas as diverse as games, simulators, educational software, interactive art, scientific visualization, and others [21]. It uses the OpenGL graphics library and Microsoft's Direct3D, allowing code portability to platforms such as Linux, Windows and Mac OS. It has a highly active community which, through its forum, develops plugins that integrate Ogre with many graphical tools and provides continuous support to both beginning and expert developers, which certainly makes it an attractive option for building the simulator.


Ogre has a large number of classes and subclasses with which it performs its job. To understand broadly how Ogre works, a UML diagram incorporating the most important classes is useful [20]. The instance of the Root class, i.e. the "Root" object, is the entry point to the Ogre system. It should always be the first object created and the last one destroyed. It configures the system and selects (or lets the user select) the system for rendering the objects in a scene, that is, the OpenGL or Direct3D libraries, depending on which exist and/or on user preference. It also initializes the SceneManager and has a method called startRendering() which starts the rendering of objects and keeps rendering each frame in a loop until the order to finish, that is, to end the application, is given. A frame can be considered a single image; a succession of frames gives the feeling of motion or animation, and the fluidity of the motion depends on how many frames can be produced in a second. Traditional film generates 24 frames per second to give the human eye the impression of fluid movement. The frames per second that a computer application achieves depend on the processing and/or calculations that have to be performed in each frame (in each iteration of the loop) plus the hardware capabilities (processor, video accelerator) [20]. The RenderSystem class defines the interfaces between Ogre and the underlying 3D API (OpenGL, Direct3D). Once the system is initialized by the "Root" object, it selects the 3D API and tells RenderSystem which API has been chosen for rendering [20]. The SceneManager is considered the second most important class after Root. It organizes all the elements of a scene to be rendered; a scene can be defined as the sum of all the objects to be displayed on screen. The SceneManager object creates the cameras, lights and objects (entities) of a scene and keeps track of all of them so that they can be accessed and manipulated as required. When the time comes to render a scene, the SceneManager sends all the objects to be displayed to the RenderSystem. There are different types of SceneManager: some handle scenes in enclosed spaces better, in rooms or hallways as in first-person games such as Doom, while others are optimized for outdoor scenes. There may be more than one active scene manager at the same time for the same application.
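A minimal sketch of this startup sequence (not from the book, assuming an Ogre 1.6-style application with the usual plugins.cfg and ogre.cfg files):

#include <Ogre.h>

int main() {
    // Root is created first and destroyed last; it selects the render system.
    Ogre::Root* root = new Ogre::Root("plugins.cfg", "ogre.cfg", "ogre.log");
    if (!root->restoreConfig() && !root->showConfigDialog())
        return 1;                                           // user cancelled the config dialog

    // Create the window, a scene manager and a camera with a viewport.
    Ogre::RenderWindow* window = root->initialise(true, "Minimal Ogre");
    Ogre::SceneManager* sceneMgr = root->createSceneManager(Ogre::ST_GENERIC);
    Ogre::Camera* camera = sceneMgr->createCamera("MainCamera");
    camera->setPosition(Ogre::Vector3(0, 0, 200));
    camera->lookAt(Ogre::Vector3(0, 0, 0));
    Ogre::Viewport* viewport = window->addViewport(camera);
    viewport->setBackgroundColour(Ogre::ColourValue(0.2f, 0.2f, 0.2f));
    camera->setAspectRatio(
        Ogre::Real(viewport->getActualWidth()) / Ogre::Real(viewport->getActualHeight()));

    // Enter the render loop; a FrameListener (as in Section 12.7.3) is normally added to end it.
    root->startRendering();
    delete root;
    return 0;
}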


The SceneManagerEnumerator class keeps track of all the available and active SceneManagers [20]. The Mesh class represents a discrete model, a self-contained piece of geometry that is generally smaller than the world it forms part of. A Mesh object is used to represent movable objects in the scene; meshes are generally created in 3D modeling tools and then exported to an Ogre .mesh file used to recreate the object and display it on screen, although they can also be created directly in Ogre through manual method calls. Objects of the Entity class are instances of a movable object in the scene; an entity can be, for example, a person or a dog. An entity can be anything, and it is based on a set of geometry, that is, on a Mesh object: a vehicle modeled in a 3D tool and exported to a .mesh file is treated by Ogre as a Mesh object, and each copy placed in the scene is an Entity [20]. The SceneNode class: all entities, and optionally cameras and lights, are linked or attached to a node, and the way to work with an entity or scene object is through the node to which it is linked. Transformations such as rotation, translation, orientation and motion are not applied directly to the entity but to its node: it is the node that is rotated or moved to a certain position within three-dimensional space. Nodes have a hierarchical structure managed by the SceneManager class, which creates and destroys them; a node can have one parent node (only one) and many child nodes. When the system is initialized through Root and a SceneManager object is created, a node called the "root node" is automatically created with it; it is the main node from which the rest of the nodes, to which the different objects of a scene are attached, branch off. Figure 12.6 clarifies this hierarchy [20]. An entity can be linked to only a single node, but a node can have more than one entity linked to it. What is ultimately rendered is the content of the nodes (or SceneNodes), which together make up the structure of the scene graph that is sent to the SceneManager class, which in turn sends it to the RenderSystem to be finally displayed on screen. Just as an object can be linked to a node, it can also be detached from it, in which case the object will not be rendered [20]. The Camera class, as its name describes, defines the attributes and properties of the cameras that deliver the point of view from which the rendered scene is observed. Cameras can be rotated and moved, either directly or through a node to which they can be attached [20].
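A small sketch of the node hierarchy described above (not from the book; the mesh name is only illustrative): transformations are applied to the nodes, and the entity follows the node it is attached to.

// Assumes an initialized Ogre::SceneManager* mSceneMgr, as in the listings later in this chapter.
using namespace Ogre;

// A child of the root node, and a grandchild hanging from it.
SceneNode* parent = mSceneMgr->getRootSceneNode()->createChildSceneNode("Parent");
SceneNode* child  = parent->createChildSceneNode("Child", Vector3(10, 0, 0));

// The entity is created from a mesh and attached to the child node.
Entity* robot = mSceneMgr->createEntity("Robot", "robot.mesh");   // illustrative mesh name
child->attachObject(robot);

// Rotating or moving the parent moves the whole branch, entity included.
parent->yaw(Degree(45));
parent->translate(0, 5, 0);
child->setScale(2.0, 2.0, 2.0);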


The Material class controls everything related to the appearance of objects, independently of their shape. An object of the Material class controls how the objects of the scene are rendered from the point of view of appearance: it specifies the basic properties of the surface of objects, such as reflected color, brightness, texture layers and applied effects. Materials may be applied programmatically, by calling the createMaterial method of the SceneManager class, or they can be loaded by the application at runtime through a script containing all the necessary material information; this script is a file, in an intuitive language, with the .material extension. Generally, when a 3D object is modeled in a modeling tool the materials (color, texture) are also defined there. When the 3D model is exported to a format Ogre can interpret, two files are produced: one with the .mesh extension, containing the geometry of the 3D object, and another with the .material extension, containing the information about the object's materials and how they are to be applied. Thus, when Ogre creates an "entity" based on a mesh, the materials read from the .material file are automatically loaded along with it [20]. Ogre provides an SDK (software development kit) for developing applications based on its graphics engine, which can be downloaded free from its website [21]. Ogre uses the X and Z axes in the horizontal plane and the Y axis as the vertical axis. On the monitor, the X axis runs from the left side to the right side of the screen, the right side being the positive direction of X; the Y axis runs from the bottom to the top of the monitor, the top being the positive direction of Y; and the Z axis runs from the inside to the outside of the screen, the outside being the positive direction of Z (Figure 12.7).

Figure 12.7: The X, Y and Z axes.


12.6 HAPTIC LIBRARIES
A summary is presented of the different haptic libraries studied for communication between the 3D simulation engine and the Novint Falcon.

12.6.1 Novint HDAL
The main objective of HDAL is to provide a uniform interface for all supported device types. The application programmer is freed from the responsibility of knowing how each device starts, how its data are retrieved and delivered, and by what means the force calculations are initialized. The SDK contains all the documentation and software files necessary to develop applications for the Novint Falcon haptic device from the HDAL (Haptic Device Abstraction Layer) abstraction layer with the C and C++ programming languages, since the latest version is fully compatible with C. To use it, it is assumed that the Novint Falcon drivers and the USB port have already been installed, and that the DLLs, the Novint *.BIN files, the HDAL files and the application projects have already been included. The examples in the SDK were developed to work in DirectX and OpenGL graphics programming environments. HDAL provides the ability to build programming interfaces for the Novint Falcon, including initialization tasks, status reading (position, speed, buttons, etc.), force calculation and force application, through a callback function executed at a 1 kHz rate in order to achieve a high degree of haptic fidelity. The HDAL abstraction levels are arranged so as to allow proper synchronization between the graphics application and the force feedback, as shown in Figure 12.8. Communication between the simulation and the HDAL haptics-layer functions is performed through a callback function invoked from within HDAL a thousand times per second (the "servo tick"). Within this function the user reads the position of the end effector of the Novint Falcon, calculates the force levels and sends them to be applied by the device. Although for now it only works with the Novint Falcon, Novint Inc. plans to build other versions, so HDAL includes the management not only of one but of a whole family of haptic device controllers, each of which will have its own SDK managed from an HDAL control layer. Physical communication with the haptic device (USB for the Novint Falcon) is the responsibility of the SDK itself, activated in the previous layer [22].


Figure 12.6: Novint HDAL SDK.
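As a sketch of what such a servo-tick callback typically computes (not from the HDAL documentation; readToolPosition and sendToolForce are hypothetical stand-ins for the device read/write calls of whichever haptic API is used), the haptic wall F = kx mentioned in the device specifications can be rendered like this:

#include <algorithm>
#include <cstdio>

// Hypothetical stand-ins for the haptic API's device read/write calls.
static double toolPos[3] = {0.0, 0.0, -0.004};          // pretend the tool is 4 mm inside the wall
void readToolPosition(double pos[3]) { for (int i = 0; i < 3; ++i) pos[i] = toolPos[i]; }
void sendToolForce(const double f[3]) { std::printf("force z = %.3f N\n", f[2]); }

// Called about 1000 times per second (the "servo tick") by the haptic layer.
// Renders a stiff virtual wall at z = 0: while the tool penetrates the wall
// (z < 0) a spring force F = k * x pushes it back out.
void servoTick() {
    const double k = 800.0;                              // wall stiffness in N/m (illustrative)
    double pos[3];
    readToolPosition(pos);

    double force[3] = {0.0, 0.0, 0.0};
    double penetration = -pos[2];                        // depth of the tool inside the wall
    if (penetration > 0.0)
        force[2] = std::min(k * penetration, 8.896);     // clamp to the ~2 lb device limit
    sendToolForce(force);
}

int main() { servoTick(); return 0; }                    // single tick, just to exercise the sketch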

12.6.2 Libnifalcon
Libnifalcon is a software development library for the Novint Falcon haptic interface and is a cross-platform, open-source alternative to the Novint HDAL SDK. It provides the communications and the kinematic model for the interface controller, along with the basic functionality needed to connect to the haptic interface [23]. The main design goal of this library is to be a driver as flexible as the hardware itself: the Novint Falcon is an expandable interface offering features such as loading firmware and changing the grip, and libnifalcon was designed with the same idea [23]. We can access the Novint Falcon through four behaviors:

• Communications: how to talk to the hardware.
• Firmware: how the integrated hardware works and communicates.
• Kinematics: the ability to obtain the position of the end effector and apply forces.
• Grasp: access to all the features of the installed grip.
Libnifalcon has a stable set of classes that implements the four behaviors. Each of these behaviors can be changed when necessary through the FalconDevice class, which means the library can be used both to access the Novint Falcon haptic interface and for research on new firmware, grip hardware and device kinematics [23].


Some possible applications of libnifalcon include:

• Mechanical engineering education (dynamics).
• Computer engineering (hardware drivers, firmware/DSP development).
• Computer science (hardware drivers, software organization).
• Human-machine interaction (touch, grip development).
It is important to note that, if you are new to the field of haptics, libnifalcon is simply a driver for a particular piece of haptic hardware; it is not a haptic rendering engine. This means that, although there are examples of collision with some simple geometry, it is not really meant for haptics research unless you are looking for something very specific to the kinematics of this device [23].

12.6.3 H3DAPI
H3DAPI is an open-source platform for software development using the open standards OpenGL and X3D, with a unified scene graph to handle graphics and haptics. Unlike most scene-graph interfaces, H3DAPI is designed primarily to support rapid development. By combining X3D, C++ and the Python scripting language, H3DAPI offers three modes of application programming that give the best of both worlds: speed of execution where performance is critical, and speed of development where performance is less critical [24]. H3DAPI is written in C++ and is designed to be extensible, ensuring that developers have the freedom and the means to customize it and add any haptic or graphical feature their applications need. A wide range of H3DAPI applications has been developed in areas such as medicine, industry and visualization [24]. H3DAPI is dual-licensed, open source and commercial, and has a scene-graph API available for download [24]. Animated virtual worlds can be created from applications containing X3D syntax, and animations and more advanced behaviors can be created from Python. These worlds contain both graphic scenes and touch feedback.


Through the X3D scene-graph concept, virtual worlds can be defined easily, and it is always easy to get an overview of how the virtual world is defined. Users with little experience in low-level programming can configure simple scenes that could be used for experimental research [24]. H3DAPI comes with a number of simple examples showing the characteristics of the library. H3DAPI uses the X3D syntax and the concept of nodes to build the virtual world; a node provides a particular feature in the scene. There are nodes for rendering geometry in the virtual world, for creating graphic and haptic properties, and nodes that can be used for animation [24]. H3DAPI is independent of the haptic device (via HAPI) and supports several devices commercially available today, including the Novint Falcon; using the AnyDevice interface, any compatible connected device can be used [24]. The library contains a couple of different ways to generate forces from 3D surface meshes. To generate force from 3D graphics, a haptic processor must be chosen; through X3D, different geometries can be felt depending on which surface node is being used and how the node's properties are adjusted [24].

12.6.4 OgreHaptics v2.0
OgreHaptics is a software development library written in C++ for integrating haptic devices available on the market with Ogre 3D, allowing the user to manipulate virtual 3D environments with force feedback. It is designed to make it easier and more intuitive for developers to produce demos and games that use 3D graphics and haptic devices. Haptic rendering, or the rendering of forces, can be done in different ways. Surface rendering, in which a force generated by the collision between the end effector and a virtual object is displayed to the user, is one of the most interesting forms of haptic interaction. Another way is the rendering of environmental force effects, such as force fields or the viscosity of the virtual space through which the user moves the end effector. The current version of OgreHaptics only implements force-effect rendering; surface rendering is planned for the next version [25]. The library implements safe mechanisms for synchronizing data between the client and the haptic device threads, ensuring that the data used by the device, as well as by the client, can be used safely.


Figure 12.8 presents a class diagram showing the most important classes of the OgreHaptics library:

Figure 12.8: OgreHaptics class diagram.

The following is a short description of the main objects that make the haptic rendering shown possible. The System object is the main entry point to the OgreHaptics system. This object should be the first to be created and the last to be destroyed; it plays the same role as the Ogre::Root class on which it is modeled. Through plugins, other haptic devices can be added to the System object for use in applications [25]. The RenderSystem class is an abstract base class that describes a haptic interface API. It is currently implemented for the Phantom haptic interface and can also be used with the Novint Falcon. A typical application does not communicate directly with this object unless it needs access to more complex methods used by the haptic thread [25]. Through the Device object the input and output of a haptic device can be controlled: adjustments are made so that the haptic space is mapped to the device space, the current state of the device is queried, and the event log generated by the haptic interface is accessed [25]. The ForceEffect object provides the means to create environmental forces such as viscosity, springs and vibration. These forces are environmental because they are not tied to any entity or shape in the virtual environment.


A force effect consists of one or more of the subclasses of force algorithms, which implement the actual algorithms for rendering forces [25].

12.7 POSITIONING AN OBJECT USING NOVINT FALCON
In this project Microsoft Visual Studio 2005 was used as the platform, with Visual C++ as the programming language. The virtual three-dimensional environment uses the open-source graphics rendering engine OGRE 3D v1.6.4. After carefully studying the libraries described above, OgreHaptics v2.0 was chosen because it can be used directly with OGRE 3D, the software used to create the graphical interface on which LapBot is built. The OgreHaptics v2.0 software development kit provides two demos from which the programmer can start a project; in this case the Effects demo and the Weapons demo were used. Exploring the application programming interface (API), the OgreHaptics getPosition() function was found, which returns the position of the device in its workspace coordinates. The first test was to move a ball through Ogre 3D with the haptic interface (see Figure 12.9).

Figure 12.9: First test.

Next, a flowchart shows how the program that positions the sphere through the haptic interface was developed.

12.7.1 Initialization and Calibration of the Haptic Device
Within this file, the following call initializes the interface:

mDevice = mSystem->initialise(true);


The following file includes the function necessary for calibrating the device:

virtual void calibrationStateChanged(const OgreHaptics::DeviceEvent& evt);

12.7.2 Creating the Graphic Scene and Environment
A simple scene consisting of a floor and a sphere was created in the Ogre 3D rendering engine. The following lines of code create the scene:

Entity* ent = mSceneMgr->createEntity("Floor", "FloorPlane");
ent->setMaterialName("OgreHaptics/Examples/FloorPlane");
ent->setCastShadows(false);
SceneNode* node = mSceneMgr->getRootSceneNode()->createChildSceneNode();

gCursor = mSceneMgr->createEntity("Cursor", "sphere.mesh");
gCursor->setMaterialName("Plain/White");
node->attachObject(gCursor);
node->setScale(0.5, 0.5, 0.5);

First an entity (createEntity) containing the mesh of the body (sphere.mesh) is created; then the desired material for that body is assigned (setMaterialName), and a movement node with a parent-child relationship to the scene's root node is created (createChildSceneNode). The entity is attached to the node so that they move together in the simulation (attachObject). Shadows can be enabled or disabled for the object (setCastShadows), and the object can also be scaled (setScale) through its node.

12.7.3 Call to the FrameListener
The following function makes a call to the EffectsListener class so that it is updated in each frame:

void createFrameListener(void)
{
    EffectsListener* effectsListener = new EffectsListener(mWindow, mCamera, mSystem, mDevice);

    mFrameListener = effectsListener;
    mFrameListener->showDebugOverlay(true);
    mRoot->addFrameListener(mFrameListener);
}


12.7.4 Position Update of the Haptic Device
The call to the FrameListener reaches the EffectsListener class, within which all components of the environment are updated, including the position of the haptic device. In this case, the position of the sphere is updated with the following line of code:

gCursor->getParentSceneNode()->setPosition(mDevice->getWorldPosition());

The position of the haptic interface is acquired and then passed to the sphere. To end the program, the ESC key is pressed.

12.7.5 Results
Below, pictures taken at different time points are shown (Figure 12.10), in which the sphere can be seen positioned through the Novint Falcon in different parts of the scene.

Figure 12.10: Positioning of an object.

Figure 12.11 shows the movement of the sphere in the Z plane of the haptic interface:


Figure 12.11: Movement in the Z plane of the haptic device.

12.8 POSITIONING LAPBOT
After the first test, carried out to establish communication between the haptic interface and Ogre3D, the operating room, the couch, the camera, the lights, the simulated patient's abdomen, and the LapBot robot for laparoscopic surgery were included in the scene.

12.8.1 Including the Surgical Scene in the 3D Environment
First, the operating room in which the robot, the stretcher, and the dummy abdomen are located is built in Ogre 3D with the following lines of code. For the room:

// Room
MaterialPtr mpiso = MaterialManager::getSingleton().create("Mpiso", ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);
TextureUnitState* tpiso = mpiso->getTechnique(0)->getPass(0)->createTextureUnitState("wood_15.jpg");
tpiso->setTextureScale(0.05, 1);
Entity* floor = mSceneMgr->createEntity("floor", mSceneMgr->PT_CUBE);
floor->setMaterialName("Mpiso");
SceneNode* npiso = nodoRoom->createChildSceneNode(Vector3(1, -3.35, 0));
npiso->attachObject(floor);
npiso->scale(0.2, 0.0015, 0.2);
// Generating the walls
MaterialPtr mpared = MaterialManager::getSingleton().create("MPARED", ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);


TextureUnitState* tpared = mpared->getTechnique(0)->getPass(0)->createTextureUnitState("marble_7.jpg");
Entity* wall = mSceneMgr->createEntity("wall", mSceneMgr->PT_CUBE);
wall->setMaterialName("MPARED");
SceneNode* npared = npiso->createChildSceneNode(Vector3(50, 4000, 0));
npared->attachObject(wall);
npared->roll(Degree(90));
npared->scale(0.6, 1, 1);
Entity* pared2 = wall->clone("pared2");
SceneNode* npared2 = npiso->createChildSceneNode(Vector3(-50, 4000, 0));
npared2->attachObject(pared2);
npared2->roll(Degree(90));
npared2->scale(0.6, 1, 1);
Entity* pared3 = wall->clone("pared3");
SceneNode* npared3 = npiso->createChildSceneNode(Vector3(0, 4000, -50));
npared3->attachObject(pared3);
npared3->pitch(Degree(90));
npared3->scale(1, 1, 0.6);
Entity* pared4 = wall->clone("pared4");
SceneNode* npared4 = npiso->createChildSceneNode(Vector3(0, 4000, 50));
npared4->attachObject(pared4);
npared4->pitch(Degree(90));
npared4->scale(1, 1, 0.6);
// Generating the roof
Entity* roof = wall->clone("roof");
SceneNode* ntecho = npiso->createChildSceneNode(Vector3(0, 8000, 0));


ntecho->attachObject(roof);

As can be seen, the floor is created first, then the walls, and finally the roof; each entity can be assigned a different material to form the desired design of the room. For this project, hardwood flooring and colored marble walls were selected. Materials are built with these commands [4]:

MaterialPtr mpared = MaterialManager::getSingleton().create("MPARED", ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);
TextureUnitState* tpared = mpared->getTechnique(0)->getPass(0)->createTextureUnitState("marble_7.jpg");

Three lights were also placed so that the objects can be seen; otherwise everything would be black. The command to create a light is:

Light* l1 = mSceneMgr->createLight("light1");

Afterwards, a stretcher of standard dimensions (0.6 m wide, 2 m long, 0.6 m above the floor) and a box-style body representing the insufflated abdomen of the patient, with the corresponding incisions, are included. The C++ code is shown below:

// Table
Entity* Mesa = mSceneMgr->createEntity("Mesa", "mesa.mesh");
Mesa->setMaterialName("MatMesa");
SceneNode* nodoMesa = nodoRoom->createChildSceneNode(Vector3(1325, -2.65, 0));
nodoMesa->attachObject(Mesa);
nodoMesa->pitch(Degree(-90));
Entity* Abdomen = mSceneMgr->createEntity("Abdomen", "Mesh.mesh");
Abdomen->setMaterialName("MatAbdomen");
SceneNode* nodoAbdomen = mSceneMgr->getRootSceneNode()->createChildSceneNode(Vector3(10.475, -2.65, 8.8));
nodoAbdomen->attachObject(Abdomen);


The robots are added at the end as child nodes of the table, and the coordinate axes are added as well. For the construction of the robot the following code is used, for example for joint r1:

Entity* r1 = mSceneMgr->createEntity("r1", "r1.mesh");
r1->setMaterialName("MatExterno");
nodor1 = r1s->createChildSceneNode(Vector3(0, 0, 0));
nodor1->attachObject(r1);

For the robot arm:

Entity* Arm = mSceneMgr->createEntity("arm", "Brazo.mesh");
Arm->setMaterialName("MatExterno");
nodoBrazo = nodor1->createChildSceneNode(Vector3(0, 0, 17.12));
nodoBrazo->attachObject(Arm);

For the forearm:

Entity* Forearm = mSceneMgr->createEntity("Forearm", "Antebrazo.mesh");
Forearm->setMaterialName("MatExterno");
nodoAntebrazo = nodoBrazo->createChildSceneNode(Vector3(9.84, 0, -0.3));
nodoAntebrazo->attachObject(Forearm);

First, an entity (createEntity) containing the mesh of the body (r1.mesh) is created; then the desired material for that mesh is assigned (setMaterialName), and a movement node with a parent-child relationship to the node of the previous body is created (createChildSceneNode). The new node can be located at a specific desired position (Vector3) relative to the parent node. Finally, the entity is attached to the node so that they move together in the simulation (attachObject). The bodies must be linked in this parent-child way so that, when a parent joint moves, its children also move, as in reality; in this way the robot becomes a single, jointed, articulated piece. This procedure is performed with each body until the robot is fully assembled. For some bodies it is also necessary to rotate the node using the pitch, yaw, or roll commands, depending on the axis about which the rotation is made [4]. Figure 12.12 shows the inclusion of the surgical scene in the Ogre environment; the room, the table, the abdomen, and the LapBot robot can be seen.


Figure 12.12: 3D environment.

Figure 12.13 shows various views of the surgical scene designed for this application:


Figure 12.13: Views of the surgical scene.

12.8.2 Initialization and Calibration of the Haptic Device
When the executable is run, the Ogre window requests calibration of the haptic interface. If the device is calibrated correctly, a confirmation message is displayed.

12.8.3 Call to the FrameListener
Ogre can register a class to receive notification before and after a frame is rendered to the screen; this class is known as an Ogre FrameListener. The FrameListener interface declares three functions that may be used to receive frame events:

virtual bool frameStarted(const FrameEvent& evt);
virtual bool frameRenderingQueued(const FrameEvent& evt);
virtual bool frameEnded(const FrameEvent& evt);

In this case, frameRenderingQueued is used to update the corresponding devices, and frameEnded is also used.


12.8.4 Reading the Keyboard and Mouse and Generating the Desired Action
The keyboard and mouse are read using the OIS library, which allows the use of the keyboard and mouse in an application. With this library, the FrameListener constantly reads and updates these devices. The keyboard and mouse are read with the following commands:

mMouse->capture();
mKeyboard->capture();

For user interaction with these devices, the following functions inherited from the Ogre3D ExampleFrameListener class are used. The processUnbufferedMouseInput function allows movement of the camera with the mouse:

virtual bool processUnbufferedMouseInput(const FrameEvent& evt)
{
    // Rotation factors; not used if the second mouse button is pressed
    // 2nd mouse button - slide, otherwise rotate
    const OIS::MouseState& ms = mMouse->getMouseState();
    if (ms.buttonDown(OIS::MB_Right))
    {
        mTranslateVector.x += ms.X.rel * 0.13;
        mTranslateVector.y -= ms.Y.rel * 0.13;
    }
    else
    {
        mRotX = Degree(-ms.X.rel * 0.13);
        mRotY = Degree(-ms.Y.rel * 0.13);
    }
    return true;
}

The processUnbufferedKeyInput function checks whether any key has been pressed by the user and executes a predetermined action; for example, if the user presses the Escape key, the function terminates the program. With the W, S, A, and D keys the camera can be moved around the scene.


virtual bool processUnbufferedKeyInput(const FrameEvent& evt)
{
    if (mKeyboard->isKeyDown(OIS::KC_A))
        mTranslateVector.x = -mMoveScale;   // Move camera left

    if (mKeyboard->isKeyDown(OIS::KC_D))
        mTranslateVector.x = mMoveScale;    // Move camera right

    if (mKeyboard->isKeyDown(OIS::KC_UP) || mKeyboard->isKeyDown(OIS::KC_W))
        mTranslateVector.z = -mMoveScale;   // Move camera forward

    if (mKeyboard->isKeyDown(OIS::KC_DOWN) || mKeyboard->isKeyDown(OIS::KC_S))
        mTranslateVector.z = mMoveScale;    // Move camera backward

    if (mKeyboard->isKeyDown(OIS::KC_PGUP))
        mTranslateVector.y = mMoveScale;    // Move camera up

    if (mKeyboard->isKeyDown(OIS::KC_PGDOWN))
        mTranslateVector.y = -mMoveScale;   // Move camera down

    if (mKeyboard->isKeyDown(OIS::KC_RIGHT))
        mCamera->yaw(-mRotScale);

    if (mKeyboard->isKeyDown(OIS::KC_LEFT))
        mCamera->yaw(mRotScale);

    if (mKeyboard->isKeyDown(OIS::KC_ESCAPE) || mKeyboard->isKeyDown(OIS::KC_Q))
        return false;

    // Return true to continue rendering
    return true;
}


12.8.5 Reading the Position of the Haptic Interface
The position of the haptic interface is read through the OgreHaptics application programming interface (API) using the getPosition() function, which returns the device position in the coordinates of its workspace. In the next line of code, the haptic device position is obtained and stored in a vector:

Vector3 devicePos = mDevice->getPosition();

12.8.6 Implementation of the MGI
The LapBot robotic arm has nine joints that must be moved harmonically according to the movement of the end effector while respecting the incision in the simulated patient's abdomen. This effect is achieved through the implementation of the Inverse Geometric Model (MGI). The MGI calculates the values of the variables θj (rotational joint variable [rad]) and rj (translational joint variable [m]) associated with the joints, as a function of the desired orientation and location of the robot's end effector in Cartesian space (x, y, z). The MGI of the robot had to be modified, because it had been written in Matlab and it was necessary to port it to Visual C++; the functions used for the mathematical calculations, which differ from those provided by Matlab, were changed accordingly. Once the position of the haptic interface is read with mDevice->getPosition() and stored in a Vector3 devicePos, the x, y, z positions stored in the vector are extracted so that they can be used in the MGI calculations, as follows:

x1 = devicePos.x;
y1 = devicePos.y;
z1 = -devicePos.z;

These values were scaled because the workspace of the haptic interface is different from the one used by the Ogre 3D virtual world. The scaling is shown below:

x = x1 + 0.345 / 400;
y = 0.1952 + z1 / 357;
z = 0.2 + y1 / 450;

Note that it was necessary to swap the Y and Z axes so that they coincide with those of the virtual world.


With the scaling in place, the calculations of the variables θj and rj attached to the robot joints are performed to generate the movement. Finally, it was necessary to make changes to the orientation matrix of the robot, because the end effector appeared face-up in the Ogre3D simulation and the orientation of the clamp needs to point downwards.

12.8.7 Generation of Movement from the MGI
In the following lines of code, the robot motion is generated from the haptic interface using the calculations of the MGI:

// for r1
nodor1->setPosition(0, 0, r1 * 40);
// for t2
nodoBrazo->setOrientation(Quaternion(Radian(t2), Vector3(0, 0, 1)));
// for t3
nodoAntebrazo->setOrientation(Quaternion(Radian(t3), Vector3(0, 0, 1)));
// t4 is fixed
// for t5
nodot5->setOrientation(Quaternion(Radian(t5), Vector3(0, 0, 1)));
// for t6
nodot6->setOrientation(Quaternion(Radian(t6), Vector3(0, 0, 1)));
// for t7
nodot7->setOrientation(Quaternion(Radian(t7), Vector3(0, 0, 1)));
// for t8
nodot8->setOrientation(Quaternion(Radian(t8), Vector3(0, 0, 1)));
// for t9
nodot9->setOrientation(Quaternion(Radian(t9), Vector3(0, 0, 1)));

Since r1 is the only translational joint, the setPosition function is used to locate it in Cartesian coordinates (0, 0, z), as it moves along the z axis. For the other joints, the setOrientation function is used.


This function receives an angle in radians, calculated in the MGI for each joint, together with the axis vector (0, 0, 1), since all the joints rotate about the z axis. After the first tests, singularities for which the system had no solution were identified, and a study was needed to solve this problem. New equations were implemented that considered solutions not taken into account before. For example, when a square root is calculated there are two possible solutions, one positive and one negative; initially only the positive solution was considered, and then a condition was included to activate the negative solution in certain cases, as shown below:

if (z > tz)
    r1 = -sqrt(((pow(R2, 2)) * N) / M) - K;
else
    r1 = sqrt(((pow(R2, 2)) * N) / M) - K;

The added equations can be found in Annex C, where the file Iteracion.h is shown.

12.8.8 Results
Figure 12.14 shows LapBot along different paths while respecting the passage through the trocar:

Figure 12.14: Passage through the trocar.


Figure 12.15 shows how the joints of the robotic arm are moved to bring the end effector to the position indicated by the user via the haptic interface.

Figure 12.15: Movement of the joints.

12.9 COLLISION DETECTION AND FORCE FEEDBACK IN A VIRTUAL THREE-DIMENSIONAL ENVIRONMENT
After achieving the positioning of the LapBot laparoscopic surgery robot in the virtual three-dimensional environment with the haptic interface, collision detection between the robot end effector (the clamp) and the simulated patient abdomen is performed. Once collisions are detected, force feedback is sent to the haptic interface to simulate, for the surgeon, the constraint imposed by the abdominal wall.

12.9.1 Collision Detection Algorithms
The main objective of these algorithms is to calculate the geometric interactions between objects, regardless of their number and complexity.


Traditionally, these algorithms have required a large number of geometric intersection tests, verifying whether any polygon of the surface of one object intersects the surface of another object, and thereby determining whether the two objects collide. The key to real-time collision detection is a method to detect as quickly as possible when two objects do not collide [2]. Most researchers in the area have proposed algorithms that have already proven effective and that reduce the number of calls that verify the intersection of two geometric primitives. These techniques use bounding volumes organized in a hierarchical structure, thus avoiding the direct verification of pairs of geometric primitives [2]. For collision detection in a virtual environment with force feedback, three techniques may be used: hierarchical spatial subdivision, hierarchical object subdivision, and incremental distance calculation. Hierarchical spatial subdivision is a recursive partitioning technique that divides the space into segments, with the advantage that if an object changes its position, the collision only has to be checked in the new region where it is located [2]. Hierarchical structures used in collision detection include trees of cones, spheres, or boxes, which perform quick rejection tests well when the objects are sufficiently separated. These structures can in turn be divided into two groups: hierarchies that subdivide space and hierarchies that subdivide the object [2]. Object subdivision encloses the solids in a hierarchy of bounding volumes such as spheres or parallelepipeds; a collision occurs when the bounding volumes are detected to intersect each other [2]. Depending on the selected volume, techniques such as Oriented Bounding Box (OBB), Bounding Sphere (BS), or Axis Aligned Bounding Box (AABB) can be applied, as shown in Figure 12.16. Some of these techniques are explained below:

Figure 12.16: Classes of bounding volumes [2].


12.9.2 Bounding Sphere Technique
In the Bounding Sphere (BS) technique, the objects are wrapped in collidable spheres whose radius is determined by the maximum coordinate of the solid object. Collision detection between spheres is easy to determine, since it is enough to compare the distance between the centers of the spheres with the sum of their radii: if this distance is less than the sum, a collision is detected (see Figure 12.17). Another advantage of this structure is that the only processing required for the enveloping sphere is the translation of the object, without needing to apply rotations, which makes it faster than the AABB or OBB techniques. However, it can generate erroneous collision reports if the objects are not completely enclosed by their spheres [2].

Figure 12.17: Bounding Sphere technique [2].
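As an illustration of the comparison just described (an example, not code from the project), the test between two bounding spheres can be written in a few lines; squared distances are compared to avoid the square root:

#include <OgreVector3.h>

// Two spheres collide when the distance between their centres is
// smaller than the sum of their radii.
bool spheresCollide(const Ogre::Vector3& centreA, float radiusA,
                    const Ogre::Vector3& centreB, float radiusB)
{
    float radiusSum = radiusA + radiusB;
    return centreA.squaredDistance(centreB) < radiusSum * radiusSum;
}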

12.9.3 Axis Aligned Bounding Box (AABB) Technique
AABB stands for Axis Aligned Bounding Box, that is, boxes or enclosures aligned with the axes. The technique creates, around each collidable object, a box aligned with the x, y, and z axes of the coordinate system; the dimensions of this box depend on the maximum and minimum values of the object's edges [2]. Although false reports may still occur, the AABB technique reduces their occurrence compared with the BS technique. The algorithm is easy to implement: assuming two objects A and B are contained in two aligned boxes, as in Figure 12.18, collision detection is performed by comparing the positions along the coordinate axes with respect to the center of each object, so that in one dimension [2]:


•	If the maximum X position of A is less than the minimum X position of B, there is no collision.
•	If the maximum X position of B is less than the minimum X position of A, there is no collision.
•	The process is repeated for each axis.
•	If neither condition is true, then there is a collision between the two objects.

Figure 12.18: Axis Aligned Bounding Box (AABB) technique [2].
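The per-axis comparison listed above can be sketched as follows (an illustrative example, not code from the project); the boxes are declared separated as soon as one axis shows no overlap:

#include <OgreVector3.h>

struct Aabb { Ogre::Vector3 min, max; };   // axis-aligned box in world coordinates

bool aabbCollide(const Aabb& a, const Aabb& b)
{
    if (a.max.x < b.min.x || b.max.x < a.min.x) return false;   // separated on X
    if (a.max.y < b.min.y || b.max.y < a.min.y) return false;   // separated on Y
    if (a.max.z < b.min.z || b.max.z < a.min.z) return false;   // separated on Z
    return true;   // overlap on every axis: collision
}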

A disadvantage of the AABB technique is that it can create false reports if the enclosed objects are thin and are tilted, rotated, or deformed (see Figure 12.19), so a regular update of the information of the enveloping boxes is required.

Figure 12.19: False report with the Axis Aligned Bounding Box (AABB) technique [2].


12.9.4 OBB Technique: Oriented Bounding Box
In the OBB (Oriented Bounding Box) technique, the geometric model that may collide is enclosed within a box aligned with the maximum dimensions of the object (Figure 12.20); the existence of collisions is then estimated against the box, instead of processing every point of the object. This kind of volume is especially suited to objects that fit rectangular shapes and, although errors occur with curved geometries, OBB is the most efficient of these methods. For reliable collision detection, the transformations of the figure must be continuously applied to the bounding box [2].

Figure 12.20: OBB, Oriented Bounding Box [2].

12.9.5 Libraries for Collision Detection
A physics engine can be defined as a computer program that simulates Newtonian physical models, using variables such as mass, velocity, friction, weight, and force. It can simulate and predict effects under different conditions approximating what happens in reality. The main tasks of a physics engine are collision detection, collision resolution and other constraints, and providing updated world information for all objects. This section summarizes some of the libraries studied for performing collision detection.

12.9.5.1 V-Collide
V-Collide is a collision detection library developed by the GAMMA (Geometric Algorithms for Modeling, Motion, and Animation) group at the University of North Carolina. It is written in C++ and was designed for use in environments containing large numbers of geometric objects.


These objects are formed by triangle meshes, and collision detection between pairs of objects is performed in two stages [2]:
•	Construction of OBBs (Hierarchical Oriented Bounding Boxes) for each object, in order to find pairs of triangles that possibly intersect (this is done with RAPID).
•	Verification that the pairs of triangles detected in Stage 1 actually intersect.
These stages rely on a V-Collide component called RAPID, which is also an independent collision detection library. The differences between the two libraries are as follows [2]:
•	V-Collide retains information about where the objects are in the environment, so that if they do not move their locations need not be recomputed with RAPID.
•	V-Collide allows the verification of many objects simultaneously, while RAPID only allows two.
•	RAPID reports exactly which pairs of triangles collided, while V-Collide only reports collisions between objects.
Some tests with V-Collide were performed in OpenGL, for which it was necessary to create an .obj file with the shape of the abdomen, with satisfactory results; the problem arose when trying to include .obj files in Ogre 3D. This library was therefore not selected because of format incompatibilities with the Ogre 3D graphics engine: V-Collide works with .obj files while Ogre works with .mesh files.

12.9.6 Bullet
Bullet Physics is an open-source library for collision detection and rigid- and soft-body dynamics. The library is free for commercial use under the Zlib license and can be used on all platforms, including PlayStation 3, Xbox 360, Wii, PC, Linux, Mac OS X, and iPhone [2, 3]. It performs simulation of rigid and soft bodies with continuous and discrete collision detection. Among its bounding shapes it includes concave and convex collision meshes, triangle meshes, and all the basic primitives such as spheres, cones, cylinders, and boxes. It features support for soft bodies such as cloth, rope, and deformable objects, and it also has an extensive set of constraints for soft and rigid bodies, with limits and motors [26]. Bullet has been designed to be customizable and modular (see Figure 12.21).


The developer can use only the collision detection component, or the rigid-body dynamics component without the soft-body component.

Figure 12.21: Bullet Physics [26].
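As a hedged sketch of the modularity mentioned above (generic Bullet usage, not code from this project, which ultimately discarded the library), the collision component can be used on its own through btCollisionWorld, without the rigid- or soft-body dynamics components:

#include <btBulletCollisionCommon.h>

int main()
{
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;

    // Collision detection only: no dynamics world, no constraint solver.
    btCollisionWorld world(&dispatcher, &broadphase, &config);

    // ... btCollisionObject instances would be added to the world here ...

    world.performDiscreteCollisionDetection();

    // Each manifold holds the contact points of one colliding pair.
    int numManifolds = dispatcher.getNumManifolds();
    return numManifolds >= 0 ? 0 : 1;
}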

To use Bullet with Ogre 3D, BtOgre [27] was studied; it is a software library that wraps the most important Bullet functions and classes and works as an intermediary between the two engines. Due to the lack of documentation and developer support, and to the problems encountered while developing an application that would detect collisions between objects with it, this library was discarded.

12.9.7 NVIDIA PhysX
PhysX is the name given to a chip and an SDK (software development kit) developed by AGEIA [28] to perform complex physics calculations. PhysX is a commercial product that is free for use in non-commercial projects; it is written in C++ and was initially supported on Windows, but it has recently expanded its portability to Linux. It is capable of simulating rigid bodies, fluid dynamics, cloth, soft bodies, and volumes, among others. The PhysX software has been adopted for the creation of more than 150 games and is used by more than 10,000 developers worldwide; it is used commercially by Sony in its PlayStation 3 game console. To give some clarity about the most significant elements (classes) of the engine's operation, a very simplified class diagram was drawn (see Figure 12.22), retrieving the most significant classes on the basis of the experience acquired; it is not intended as a formal class diagram of PhysX.


Figure 12.22: Diagram of the main classes of PhysX [20].

The "Nx" prefix used in all class names comes from the fact that the SDK was previously known under the name "Novodex", from which it inherited the prefix. Like Ogre, PhysX has an entry point to initialize the engine; this entry point is the NxPhysxSDK class. Before performing a simulation, the first step is to create an instance of the NxPhysxSDK class to initialize the system; through this object, parameters can be set, such as the option of displaying the axes (XYZ) of an actor to know how it is oriented or where a force is applied. Once the system is initialized and adjusted as appropriate to the simulation, the NxPhysxSDK object (instance) is responsible for creating an instance of the NxScene class [20]. The NxScene class, similarly to Ogre, defines a scene in which the objects, in this case the actors, live in a three-dimensional environment, making it possible to observe their behavior according to the physical conditions to which they are subjected. When creating a scene through NxPhysxSDK, the gravity that affects the objects in the scene must be specified; this is done through a scene descriptor, an instance of the NxSceneDesc class. Descriptors are widely used by PhysX and are structures that contain all the information about the characteristics the objects should have when they are created [20]. The NxActor class comprises the basic elements of a scene; that is, instances of the NxActor class, the actors, are the objects that interact with each other within the scene. Actors are created by the scene itself, that is, by the instance of the NxScene class that defines the scene. PhysX uses three types of actors: static, dynamic, and kinematic. A static actor, as its name implies, will never move; it can be used to represent the ground, mountains, an intersection, etc.


These actors are considered to have infinite mass and are not affected by forces. A dynamic actor will move and act under the influence of the different situations to which it is subjected, forces and velocities, and has a finite mass. Kinematic actors are movable static objects: they will not be moved by the application of a force or an impact, but they can be positioned within the scene as appropriate. Dynamic objects can be converted into kinematic ones and vice versa, but not into static ones, whose type cannot be modified. As with a scene, when creating an actor all the features the actor should have (its type, body, shape, density) must be specified; this is done through an actor descriptor, an instance of the NxActorDesc class [20]. An actor is composed of a rigid body, which PhysX calls the "Body", and a collision shape, the "Shape", which, as its name suggests, is the form the actor takes and according to which it behaves. That is, if an actor is created with the shape of a sphere, it will behave as a sphere does; if another actor is created as a cuboid (box), it will behave as such. As previously mentioned, all this information is contained within the actor descriptor; if an actor is not assigned a rigid body, it is considered to be of static type. A rigid body is assigned to an actor by creating a body descriptor, that is, an instance of the NxBodyDesc class [20]. The NxShape class defines the shapes PhysX works with: planes, boxes, spheres, capsules, wheels, and triangle meshes. To create any of these shapes, features such as dimensions and materials must be specified; this is done, just as in the previous cases, through descriptors, which in this case are instances of the NxShapeDesc class. An actor can have multiple associated shapes, for example a vehicle that corresponds to a single actor but has one shape to represent the chassis and four other shapes to represent the wheels. The "triangle mesh" shape gives the possibility of creating more complex shapes, especially in modeling tools, from which they can subsequently be exported to a format that can be read by PhysX [20]. The NxMaterial class: unlike Ogre, where materials are related mainly to appearance, in PhysX they are considered from the physical point of view. Materials are therefore the physical substance of which objects (actors or, more directly, shapes) are made; they define internal properties and surface properties such as restitution and friction. Materials are created by the scene object, and their values of restitution, static friction, and dynamic friction need to be adjusted. Giving a high restitution value (the maximum is 1) means that on impact the object loses little energy and will bounce strongly.


In the opposite case, if a low coefficient of restitution is assigned, for example 0.15, the object will lose a lot of energy on impact and bounce back only slightly. If a material is adjusted with a high coefficient of static friction, the object will have difficulty starting to move from rest; this difficulty decreases as the coefficient of friction decreases. A small dynamic friction coefficient allows a moving object to slide smoothly, while a high one tends to stop it faster or more abruptly. A material created by the scene can be used, stopped being used, and replaced at any time [20]. Figure 12.23 shows the working environment used by PhysX in the tutorials available for learning the engine; in the picture, a scene can be seen in which a sphere-shaped actor falls under the action of gravity, hitting a pile of box-shaped actors.

Figure 12.23: PhysX working environment.
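The descriptor-based workflow described above can be summarized in a short hedged sketch that follows the pattern of the PhysX 2.8 training programs; it is not code from this project, and the scene pointer gScene is assumed to have been created beforehand through the SDK object as the text explains:

#include <NxPhysics.h>

NxActor* createBoxActor(NxScene* gScene)
{
    // Shape descriptor: a box with half-extents of 0.5 in each axis.
    NxBoxShapeDesc boxDesc;
    boxDesc.dimensions = NxVec3(0.5f, 0.5f, 0.5f);

    // Body descriptor: giving the actor a body makes it dynamic;
    // omitting actorDesc.body would create a static actor.
    NxBodyDesc bodyDesc;

    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&boxDesc);
    actorDesc.body = &bodyDesc;
    actorDesc.density = 10.0f;
    actorDesc.globalPose.t = NxVec3(0.0f, 5.0f, 0.0f);   // initial position

    // The scene creates the actor from its descriptor, as the text describes.
    return gScene->createActor(actorDesc);
}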

Of the physics engines studied that integrate with applications developed with Ogre, PhysX stood out for its speed and robustness, according to user feedback gathered in the Ogre forums themselves [29]. The choice of PhysX as the physics engine for the development of the simulator was based primarily on its compatibility with Ogre 3D and the ability to work with NxOgre; this first approach to physics simulation was achieved thanks to the recommendation of expert Ogre users and to the availability of learning support through tutorials and an active and cooperative forum.

12.10 NXOGRE
To use the PhysX physics engine with Ogre 3D, a physics "wrapper" called NxOgre was used; it is a software library that wraps all the PhysX functions and classes and acts as an intermediary. It is an open-source library under the LGPL license. There are currently two versions of NxOgre, BloodyMess and Bleeding, the first being the most recent version of the library.


Although the library has Ogre in its name, it can work with any 3D or 2D rendering engine. NxOgre is a wrapper created by Robin Southern that integrates Ogre with PhysX. Thus, when working with NxOgre, both engines, graphic and physical, can be accessed through it, giving the possibility of developing applications with the full potential of each. As with Ogre and PhysX, NxOgre is written in the C++ programming language, is object-oriented, and is licensed as open source, but it is limited by the use of PhysX which, as mentioned above, is a paid product that is free for non-commercial projects [20]. NxOgre has many classes that involve classes defined by Ogre and by PhysX. In order to give a clearer picture of how NxOgre works, a class diagram was developed (see Figure 12.24) with the most relevant and significant features needed to understand the wrapper.

Figure 12.24: NxOgre class diagram [20].

As mentioned previously, both Ogre and PhysX need their systems initialized, which is achieved through the Root and NxPhysxSDK classes, respectively. NxOgre performs this work through the NxOgre_World class: the NxOgre_World object takes as a parameter the Root object that initializes Ogre and is responsible for initializing the PhysX SDK by creating an NxPhysxSDK object. Through the NxOgre_World object, all the methods available to the initializing objects of both engines can be accessed; in addition, the NxOgre_World object is responsible for creating the NxOgre scene [20]. Creating an instance of the NxOgre_Scene class implies the creation of a scene both in Ogre, through an object of the SceneManager class, and in PhysX, by creating an object of the NxScene class. As with the NxOgre_World object, the NxOgre scene provides access to all the methods available to the SceneManager and NxScene instances, so that all the functionality of the Ogre and PhysX scenes can be exploited.


As shown here, the operation of the two engines is performed in parallel, with NxOgre being the link between them. Ogre defines the objects within its scene as "Entities"; PhysX, on its side, calls them "Actors". Through the NxOgre_Body class, NxOgre integrates the two concepts of objects within a scene, both Ogre's and PhysX's, and gives them the name "Bodies". When NxOgre creates a Body within its scene, it creates at the same time an actor for PhysX and an entity for Ogre; it thus manages to perform physical simulations with the actor and to represent them graphically through the entity created for Ogre. These can also be called the "collision model" and the "graphical model", respectively, where the collision model corresponds to the shape on which the physical computations are performed and the graphical model corresponds to the entity or mesh on which the graphical representation of the object is based [20]. Through an NxOgre_Body object, all the methods of the Entity class and all the methods of the NxActor class can be accessed. Also, when an NxOgre_Body object is created, the entity created for Ogre is bound to a node of the SceneNode class, which allows control over the entity. NxOgre uses, through its own scene (the NxOgre_Scene class), the SceneManager scene to work with the entities and the NxScene scene to work with the actors. Every move made by an actor (collision model) is replicated by the entity (mesh); in this way, the behavior of objects within a physical simulation performed by PhysX can be seen graphically through Ogre 3D. Finally, if a "Body" is created with the shape of a sphere but with the graphic appearance (entity or mesh) of a giraffe, this Body will not behave as a giraffe but as a sphere, and the giraffe will most likely roll instead of walking. The development of the simulator is made on the basis of the classes defined by NxOgre, so that the project manages to integrate the Ogre3D graphics engine and the PhysX physics engine. Any functionality not covered by NxOgre can be obtained by interacting directly with either engine, which is an advantage of NxOgre [20].

12.11 COLLISION DETECTION AND FORCE FEEDBACK
This section describes how the collision detection between the end effector of the robot and the simulated patient's abdomen was performed with the software tools mentioned above.


The PhysX SDK version used for the development of the project is 2.8.3; the latest released version of the SDK is 2.8.4, but version 2.8.3 was the latest version released by Ageia when training with this tool began. NxOgre version 1.5.5 BloodyMess is used.

12.11.1 Creation of the World in NxOgre
To work with NxOgre, as mentioned earlier, the first thing to be created is the world, then the scene with its description, some physical values required for the simulation, the render system, and the time controller.

// Create the world
mWorld = NxOgre::World::createWorld();
// Create the scene description
NxOgre::SceneDescription sceneDesc;
sceneDesc.mGravity = NxOgre::Vec3(0, 0, 0);
sceneDesc.mName = "sceners";
// Create the scene
mScene = mWorld->createScene(sceneDesc);
// Set some physical scene values
mScene->getMaterial(0)->setStaticFriction(0.5);
mScene->getMaterial(0)->setDynamicFriction(0.5);
mScene->getMaterial(0)->setRestitution(0.1);
// Create the render system
mRenderSystem = new OGRE3DRenderSystem(mScene);
// Create the time controller
mTimeController = NxOgre::TimeController::getSingleton();

12.11.2 Creation of the Bounding Volume for the Abdomen
Work initially focused on the collision between the simulated patient's abdomen and an arbitrary object. For this, the BloodyMess tutorials [30] on creating complex physical shapes for collision detection applications were studied. Due to the shape of the abdomen, it was necessary to build a triangle-based body that could detect collisions at every point of the abdomen.


The abdomen used in this application was built in Solid Edge and exported as an .stl file, then imported into Blender [31] to be exported as a .mesh file, because Ogre 3D works with these files. A mesh is essentially a complex 3D polygon represented by a set of vertices, and a triangulated body makes it possible to represent the physical shape of a mesh exactly. To build the triangulated body it was necessary to use the Flour tool [32], which converts .mesh files into .nxs files, the format in which triangulated bodies are represented. The body generated by Flour is identical in shape and size to the one contained in the .mesh file, but it is the one needed to work with NxOgre and PhysX. Once the body has been created with Flour, it must be loaded in Ogre in order to work with it. The following lines of code load the body:

NxOgre::ResourceSystem::getSingleton()->openArchive("media", "file:C:/OgreSDK/media");
NxOgre::Mesh* triangleMesh = NxOgre::MeshManager::getSingleton()->load("media:Mesh.nxs");

Now a "SceneGeometry" that generates the triangulated body in the scene must be created; so that it can be displayed, an entity is created and assigned the same position in the scene:

NxOgre::TriangleGeometry* triangleGeometry = new NxOgre::TriangleGeometry(triangleMesh);
Entity* Abdomen = mSceneMgr->createEntity("Abdomen", "Mesh.mesh");
Abdomen->setMaterialName("MatAbdomen");
SceneNode* nodoAbdomen = mSceneMgr->getRootSceneNode()->createChildSceneNode(Vector3(10.475, -2.65, 8.8));
nodoAbdomen->attachObject(Abdomen);
mScene->createSceneGeometry(triangleGeometry, NxOgre::Matrix44(NxOgre::Vec3(10.475, -2.65, 8.8)));

12.11.3 Creating a Volume for Collision Detection
Studying the NxOgre tutorials, the author proposes a method for detecting collisions that consists of creating a volume with the same shape as the abdomen, which can then detect a collision anywhere on the abdomen. A volume is an invisible physical object of arbitrary shape; it does not move and cannot be seen. The volume is used here as a trigger: as soon as an object enters it, is completely inside it, or leaves it, a special event is triggered.


This event can be used to perform a certain action within the program, in this case collision detection. It is necessary for the main program class to inherit from NxOgre::Callback in order to handle the events generated when an object comes into contact with the created volume. Where the surgical scene is created, a volume is now created with the following characteristics:
•	The shape is the same triangleGeometry as the simulated abdomen.
•	The position is the same as that of the abdomen.
•	The Callback class and the inheritance indicated above.
•	The behavior, which in this case is the method called onVolumeEvent(), which reacts when a body enters or leaves the created volume.
The following lines of code show the creation of the volume:

mVolume = mScene->createVolume(triangleGeometry, NxOgre::Matrix44(NxOgre::Vec3(10.475, -2.65, 8.8)), this, NxOgre::Enums::VolumeCollisionType_All);

The last thing to do to detect the collision is to create the function that will be executed when an object comes into contact with the volume:

void onVolumeEvent(NxOgre::Volume* volume, NxOgre::Shape* volumeShape, NxOgre::RigidBody* rigidBody, NxOgre::Shape* rigidBodyShape, unsigned int collisionEvent)
{
    if (collisionEvent == NxOgre::Enums::VolumeCollisionType_OnEnter)
    {
        // The actions to take when a collision is detected are programmed in this condition.
    }
}

A barrel-shaped object with its own bounding volume was created to test the collision between two objects in the virtual world, and the results shown in Figure 12.25 were obtained:


Figure 12.25: Collision between two objects.

12.11.4 Creation of the Bounding Volume for the Clamp
After creating the volume for collision detection, a sphere-shaped bounding body was created to cover the clamp and thus enable collision detection with the simulated patient's abdomen. The sphere is created with the following line of code:

Body = mRenderSystem->createBody(new NxOgre::Sphere(0.18), NxOgre::Vec3(10.475, 2, 8.8), "pinza.mesh");

The haptic cursor position must be assigned to this body so that it follows the movements of the clamp.

12.11.5 Collision Detection
To perform the collision detection, as mentioned above, the volume and the onVolumeEvent() method are used; within this method, the condition if (collisionEvent == NxOgre::Enums::VolumeCollisionType_OnEnter) tells whether some body has come into contact with the volume, in this case whether the robot end effector has come into contact with the simulated abdomen.


If this condition is met, a Boolean flag is set to indicate that a collision has been detected, so that force feedback at the end effector can be performed as the next step. Once the collision between the end effector and the abdomen is detected, the message "--- COLLISION DETECTED ---" is shown on screen to the user, indicating that the walls of the abdomen are being touched.

12.11.5.1 Results for Collision Detection
Figure 12.26 shows the results obtained in the detection of collisions between the end effector and the simulated patient's abdomen; the red oval in the figure highlights the message indicating the detection of the collision:


Figure 12.26: Collision detection.

Figure 12.27 shows the collision detection viewed from inside the patient's abdomen:


Figure 12.27: Collision detection within the abdomen.

12.11.6 Force Feedback
When a collision is detected, a force is sent to the haptic device that tells the user that one of the walls of the abdomen is being touched. The force algorithm called Spring in OgreHaptics was used in this case [33]; it corresponds to the equation F = -KX, Hooke's law (when a spring is deformed, it exerts a force opposing the deformation that is proportional to the amount of deformation), where X is the difference between the current position and the previous position of the haptic device, F is the force to be generated by the device, and K is the force constant. The following lines of code show the algorithm and the functions used to send the feedback force to the haptic device. The first thing to do is obtain the difference between the position before and after the collision, store this value in a vector, and then multiply it by a force constant.

Vector3 X = pactual - pante;
Vector3 F = -100 * X;

force[0] = F.x;
force[1] = F.y;
force[2] = F.z;

To apply the forces to the haptic device, two functions from two different libraries, recommended by Ogre experts, were studied. The first corresponds to an OgreHaptics function called _applyForces(), which applies a force to the haptic interface in device coordinates.


The second corresponds to a function of the HDAL haptic library called hdlSetToolForce(), which sets the force to be generated by the haptic device in newtons.

mDevice->_applyForces(F, Vector3::ZERO);
hdlSetToolForce(force);

Of the two functions mentioned, the latter gave better results, since a greater feedback force is obtained when touching the walls of the simulated patient's abdomen.
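The collision flag mentioned in Section 12.11.5 and the spring force above can be tied together in a minimal sketch; this is not the project's exact code, and the names gCollisionDetected and gPreviousPosition are assumed helpers introduced only for illustration.

bool gCollisionDetected = false;     // set to true inside onVolumeEvent()
Ogre::Vector3 gPreviousPosition;     // clamp position stored before the collision

void applyFeedbackIfColliding(OgreHaptics::Device* device)
{
    if (!gCollisionDetected)
        return;

    // Hooke's law: oppose the penetration into the abdominal wall (K = 100).
    Ogre::Vector3 X = device->getPosition() - gPreviousPosition;
    Ogre::Vector3 F = -100.0f * X;

    double force[3] = { F.x, F.y, F.z };
    hdlSetToolForce(force);          // the HDAL call that gave the best results
}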

CHAPTER

13

CONCLUSION AND FINAL CONSIDERATIONS

In this final chapter, some recommendations are given that will serve as support to continue the proposed future work and to help students minimize working and research time. The recommendations concern the software tools selected for the development of this project.
•	Connecting another haptic device: to work with the Phantom haptic device, uncomment (delete the #) the line of the HapticsPlugin.cfg file that enables OpenHaptics for handling the Phantom:

Plugin = HapticsRenderSystem_OpenHaptics

To add the Phantom haptic interface in addition to the Novint Falcon, for example to position a second arm in the application, create a new instance of the Device class so that the system recognizes the new device and can work with this interface. A few lines of code that can be tested in the application are presented below; do not forget to activate the plugins for each interface.

DeviceInitInfo initInfo;


initInfo.api = "Falcon HapticsRenderSystem";
initInfo.initName = "DEFAULT";
Device* falconDevice = System::getSingleton().createDevice("Falcon", initInfo);
initInfo.api = "OpenHaptics HapticsRenderSystem";
initInfo.initName = "DefaultPHANToM";
Device* phantomDevice = System::getSingleton().createDevice("Phantom", initInfo);
System::getSingleton().startSchedulers();

•	Creating composite bounding volumes for detection with organs: to create bounding volumes for the other joints of LapBot, it is recommended to use PhysX directly, since the NxOgre wrapper only involves the major classes of the physics engine and not all of them; it is therefore necessary to learn how to handle this tool from its documentation. When the PhysX physics engine is installed, the directory "C:\Program Files\NVIDIA Corporation\NVIDIA PhysX SDK\v2.8.3" is created. To create a bounding volume that detects the collision of the clamp and the joints of the robot that enter the abdomen, the training program called Lesson103, which comes with the PhysX physics engine, should be studied; it explains how composite-shape bounding volumes can be created, which would be very useful for the LapBot arm. To include the organs that go inside the abdomen, it is recommended to study the training programs Lesson1101 to Lesson1106, which create soft bodies and show the methods the PhysX physics engine offers for collision detection with deformable objects. In particular, it is advisable to study a very interesting training program, Lesson115, which creates a scene where collisions can be generated between soft bodies and rigid bodies, much like the robot's body inside the abdomen. The implementation of these tutorials in the current application should be made in the Escena.h file for creating the bounding volumes; if the tutorial requires updating a position or a change in the volume, this should be done in the Iteracion.h file, which is the file run each frame.
•	Disconnecting the Novint Falcon haptic interface: the haptic device cannot simply be disconnected from the application, because the application would stop working; the first thing run is OgreHaptics, which is the library that handles the interface functions. To make the program work without a haptic interface, it is necessary to create a new class that inherits the methods of this library.


If the Falcon device is not available, the alternative is to connect the Phantom haptic interface and uncomment the line where the order to generate force with the applyForces function is sent.
•	Force feedback: to improve the force feedback sent to the user, it is suggested to create an algorithm that recognizes the axis along which a collision is detected and sends the force along that same axis. It is also advisable to study the OgreHaptics plugins for applying force effects, which can be found in the manual installed with the library.
•	Including dynamic models of the robot in the application: for the inclusion of the LapBot dynamic models, it is necessary to create a program that performs the same operations as the Matlab Simulink model, but in C++, that is:

•	Read the joint variables generated by the MGI.
•	Implement the 'trocar' program in C++ to calculate the values of the passive variables, their velocities, and their accelerations.
•	With these data, the PD control generates a correction signal that should go into the MDI, as implemented in the Simulink subsystem.
•	Implement the dynamic models in C++.
•	Ensure that the feedback works in C++.
For this work, the following Matlab files, which are attached to the application, are needed:
•	Lapbot2DinCart_mgi_simple.mdl: the Simulink file.
•	trocar_opt.m: for calculating the variables.
•	MDD.m and MDI.m: the dynamic models of the robot.
•	Converting Matlab files to C++: two tools were studied in this case, Matlab Compiler and the Matlab C++ Math Library; the latter was discarded because Simulink functions cannot be implemented with it. Matlab Compiler allows a huge variety of file conversions with three main objectives:

•	Integrating Matlab instructions into other languages such as C or Basic, thereby greatly facilitating complex mathematical coding, graphics operations, etc.




•	Creating libraries of Matlab functions that can be called from any application.
•	Making the compiled code independent of the Matlab environment, that is, executing code without the Matlab application, creating standalone applications.
In order to convert these files, a C++ compiler (such as Visual Studio) and Matlab must be installed. It should be checked that Matlab Compiler is installed; in this case, tests were made with Compiler version 4.11. The first step is to open Matlab, go to the MATLAB command window, and verify that it recognizes the C++ compiler installed on the computer. The mbuild -setup command should be written, which reports which compilers the computer recognizes. Once the compiler has been identified, go to Start -> Matlab -> Matlab Compiler -> Deployment Tool (Figure 13.1).

Figure 13.1: Matlab Compiler

After opening this tool, a window appears where the type of application to create can be chosen; Figure 13.2 shows the different types of applications that the tool offers for conversion:

Figure 13.2: Conversion Tool.

For this test, C++ Shared Library was selected, as it was necessary to convert the Matlab code to C++.


Once this option is selected, a Matlab Compiler window for the conversion opens. In the Build tab, under Exported Functions, the m-files to be converted to C++ must be added; once these files are loaded, the Build icon is clicked. The tool then generates different files that can be used in a C++ compiler without Matlab. The files generated are:
•	readme.txt
•	Untitled.dll
•	Untitled.h
•	Untitled.lib
Unfortunately, Matlab Compiler does not support the conversion of Simulink files to C++ code or to any other programming language or application type. To accomplish this task, it is advisable to study the Matlab tool called Real Time Workshop; it can be found by opening the Simulink file and, once opened, going to the Tools tab of the menu and clicking on Real Time Workshop. The help of the tool should be reviewed to find the configuration that must be done before compiling any code.

CHAPTER

14

CONCLUSION

The goal of computational simulation is to solve theoretical models in their full complexity, using the numerical solution of the equations involved and making intensive (and extensive) use of computers. Nowadays, with a good computational model, not only can laboratory experiments be reproduced but, because the parameters used can be varied freely, existing theoretical models can be tested (or falsified) in parameter ranges that are impossible to reach experimentally; a graphic image of the process in question is also obtained. This project has a lot to offer our country: we propose the creation of a laboratory to study the behavior of materials in fire. With the necessary equipment, it would be possible to interact directly with the construction sector and with the industry that produces building materials for this country, improving both simultaneously. In addition, the market and the economic scope of such a study are so broad that work could be done in any area of industry, such as paints, paper, coatings, adhesives, and interior installations of public facilities such as hospitals and government buildings, among many other areas, opening a new door to the future, towards progress and sustainable development. Although the example given is simplified in certain respects, it describes the most important aspects of, and relations between, the main variables of the urban system of any municipality. Because of this, it would yield an accurate estimate of the evolution of the state variables of the system under different hypotheses and alternative forecast scenarios.


This would allow those responsible for the urban planning of a municipality to make more consistent decisions, with less risk and with sufficient advance notice, about the municipality's response to the future needs that population growth could cause, for example in housing, water supply, sanitation, and infrastructure. We have discussed the main features of a dynamic integrated framework for improving software processes. The combination of traditional or conventional software engineering with dynamic modeling techniques and software process simulation techniques pursues the fundamental objective of providing a framework from which it is possible, among other things, to assess the consequences of different process improvement actions, to test different scenarios, and to promote training and the acquisition of experience in project management. Regarding the main advantage of the proposed framework, it is important to note that the process of creating the simulation models is itself used as the main driver of another process, oriented to the definition of a metrics collection program that allows the instantiation of historical databases within organizations. The analysis of real and simulated data will offer two advantages: the ability to validate the simulation models, which will increase the accuracy of the quantitative results, and the determination of the actual effects of process improvement in the organization. The feasibility of implementing optimization algorithms for the treatment of cyclical instabilities within an intelligent environment was shown, using a high-tech and innovative tool that proved to be versatile and suitable for experimentation. The results guarantee that the system controls the instability effectively; work continues to improve the strategy, in particular to minimize the number of blocked agents. Today, finite element analyses are commonly used in the study of prostheses and are very reliable, according to findings reported in several of the works mentioned in this research, with the particularity that they must be operated by personnel experienced in contact problems, because any error could lead to erroneous results; moreover, they are supported by experimental analysis. The performance curves determined from the results of this study provide important information for evaluating the performance of the prosthesis over the complete cycle of motion, which can be used to modify the sagittal and coronal geometries of the femoral component and the tibial insert, to propose changes in conformity, and to evaluate the results.


Regarding the work reported by Fregly et al. (2011) mentioned in the introduction of this section, which reviews about 19 investigations carried out over at least the past decade based on numerical models aimed at determining the total load on the joint and its distribution between the lateral and medial areas only, as a function of the contribution of each of the muscles involved in the gait cycle, it was verified that more comprehensive studies are needed to determine the patterns of stress and strain in a specific design of total knee replacement (PTR). As mentioned in the introduction, the position and load conditions used in other works have not been established to match the gait cycle, and some of the studies cited in that same work are not directly focused on reproducing the kinematic and dynamic conditions that occur once the implant is placed in the patient. Based on the stress and strain patterns found, it is possible to determine modified PTR geometries for customizing many commercial models in order to improve performance and wear resistance, applied in this specific case to the phenotype under study. From our perspective, over the medium term it should be possible to design and manufacture prostheses customized to the patient.

In this book, the positioning of an assistant robot for laparoscopic surgery called LapBot was performed with the Novint Falcon haptic interface, after a prior study of the haptic libraries that allow communication between the haptic device and the Ogre 3D rendering engine. This study concluded that OgreHaptics is one of the best options for working with Ogre 3D, given its compatibility with the graphics engine and with the Novint Falcon, allowing the user to manipulate virtual 3D environments with force feedback; it is designed to make it easier and more intuitive for developers to produce demos and games that use 3D graphics and haptic devices. By exploring the OgreHaptics library, a program was developed in which an object is positioned in an Ogre 3D virtual environment using the haptic device. Subsequently, a study was made of the physics engines available for simulating collision detection between two virtual objects; it concluded that the best option for this project is the NVIDIA PhysX physics engine. PhysX was selected as the physics engine for the project mainly because of its compatibility with Ogre 3D and the possibility of working with the NxOgre wrapper.


This first approach to physics simulation was adopted on the recommendation of experienced Ogre users and because of the learning support available through tutorials and an active, cooperative forum. NxOgre is an excellent alternative for collision detection: it gives the developer the ability to work with triangle-based envelopes to represent complex volumes, thereby allowing collisions to be detected at every point of the figure, and it can also detect when an object enters, leaves, or remains inside a volume. Finally, two methods, from the OgreHaptics and HDAL libraries, were studied for returning force feedback to the haptic interface; better results were achieved with the HDAL function, since a stronger force feedback is obtained in the device. As future work, the positioning of the second robot arm with a new haptic device, including collision detection and force feedback, is proposed, together with the construction of virtual bodies that give the simulator greater realism. The aim is also to implement the deformation of objects, initially the abdomen and later all the bodies included in the simulation.
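The positioning and force-feedback loop described above can be summarized in a short sketch. This is illustrative only and is written in Python rather than the project's C++/Ogre stack: read_device_position() and send_device_force() are hypothetical placeholders for whatever calls the haptic API (HDAL or OgreHaptics) actually provides, and the simple spring-based contact response stands in for the collision handling performed by PhysX/NxOgre.

    # Illustrative sketch only; device I/O functions are hypothetical stand-ins.
    WORKSPACE_TO_SCENE = 40.0   # scale factor: device workspace -> scene units
    STIFFNESS = 600.0           # virtual spring constant for the contact force
    FLOOR_Y = 0.0               # hypothetical rigid plane the tool can press against

    def read_device_position():
        """Placeholder: would return the (x, y, z) tool position from the device."""
        return (0.01, -0.005, 0.02)

    def send_device_force(force):
        """Placeholder: would command the device motors with an (fx, fy, fz) force."""
        pass

    def haptic_step(scene_object):
        # 1. Map the device tool position into the virtual scene (positioning).
        dx, dy, dz = read_device_position()
        scene_object["position"] = (dx * WORKSPACE_TO_SCENE,
                                    dy * WORKSPACE_TO_SCENE,
                                    dz * WORKSPACE_TO_SCENE)
        # 2. Simple collision response: if the tool penetrates the plane,
        #    push back with a spring force proportional to the penetration depth.
        penetration = FLOOR_Y - scene_object["position"][1]
        fy = STIFFNESS * penetration if penetration > 0.0 else 0.0
        send_device_force((0.0, fy, 0.0))

    if __name__ == "__main__":
        lapbot_tip = {"position": (0.0, 0.0, 0.0)}
        haptic_step(lapbot_tip)   # in practice this runs in a high-rate servo loop
        print(lapbot_tip["position"])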

REFERENCES

1. Abdel-Hamid, T., Madnick, S., 1991. Software Project Dynamics: An Integrated Approach. Prentice-Hall, Englewood Cliffs, NJ.
2. Christie, A. M., 1999. Simulation in support of CMM-based process improvement. The Journal of Systems and Software, 46, 107-112.
3. Paulk, M., et al., 1993. Key Practices of the Capability Maturity Model, Version 1.1. Technical Report CMU/SEI-93-TR-25. Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA.
4. Putnam, L. H., 1992. Measures for Excellence: Reliable Software, on Time, within Budget. Prentice-Hall, New York, NY.
5. Rodrigues, A., Bowers, B., 1996. System dynamics in project management: a comparative analysis with traditional methods. System Dynamics Review, 12(2).
6. Ruiz, M., Ramos, I., Toro, M., 2002. A simplified model of software project dynamics. The Journal of Systems and Software, 59(3), 299-309.
7. Abbas, A., 2004. Grid Computing: A Practical Guide to Technology and Applications.
8. Andersen, D., Neilsen, M., Singh, G., Kalita, P., 2003. Domain-specific metaware for hydrologic applications. International Journal of Parallel and Distributed Systems and Networks, January 2003 (also in the Proceedings of the IASTED International Conference on Parallel and Distributed Computing and Systems (PDCS 2002), pp. 416-421, Cambridge, MA, November 4-6, 2002).
9. COGrid: The Colorado Grid Computing Initiative, 2003. http://cogrid.colostate.edu/downloads/COGridDescription.pdf. Accessed 4/15/2003.
10. Flynn, M., 1966. Very high speed computing systems. Proceedings of the IEEE, 54, 1901-1909.
11. Foster, I. and Kesselman, C., 1998a. Computational grids. In: The Grid: Blueprint for a Future Computing Infrastructure, I. Foster and C. Kesselman (Eds.), Morgan Kaufmann Publishers, pp. 1-29.
12. Foster, I., Kesselman, C., 1998b. The Globus Project: a status report. Proc. IPPS/SPDP '98 Heterogeneous Computing Workshop, pp. 4-18.
13. Foster, I., 2003. The Grid: computing without boundaries. Scientific American, April issue.
14. Grid.Org, 2004. http://www.grid.org/about/gc/. Accessed 4/15/2004.
15. Gross, L. J. and DeAngelis, D. L., 2001. Multimodeling: new approaches for linking ecological models. In: Predicting Species Occurrences: Issues of Scale and Accuracy (Scott, J. M., Heglund, P. J., Morrison, M., Raphael, M., Haufler, J., Wall, B., editors). Island Press, Covelo, CA.
16. Mutka, M. and Livny, M., 1991. The available capacity of a privately owned workstation environment. Performance Evaluation, 12(4), 269-284.
17. Santo Orcero, David, 2002. Parallel Simulation and Optimization of Silicon Clusters. Dissertation written to obtain the M.Sc. degree in Computer Science, Málaga (Spain).
18. Sepúlveda, E. S., Zazueta, F. S., Holzapfel, E. A., and Bucklin, R. A., 2004. Complex microirrigation network hydraulics equations for grid computing. ASAE Section Meeting Presentation, Section Number FL04-1008, 12 p.
19. Borko, H. (2004). Teacher professional development and learning: mapping the terrain. Educational Researcher, 33(8), 3-15.
20. Buckingham, D. (2008). Beyond Technology. Buenos Aires: Manantial.
21. Burbules, N. and Callister, T. (2001). Risks and Promises of New Information Technologies. Buenos Aires: Granica.
22. Cochran-Smith, M. (1981). American Educational Research Association: Annual Meeting 1982 Call for Proposals, New York City, March 19-23. Educational Researcher, 10(5), 1-24.
23. Coll, C. (2011). Learning and teaching with ICT: expectations, reality and potential. In Carneiro, R., Toscano, J. C., Diaz, T. (Coords.) (2011), The Challenges of ICT for Educational Change. Educational Goals 2021 Collection. OEI and Santillana Foundation.
24. Collins, A. and Halverson, R. (2009). Rethinking Education in the Age of Technology. New York: Teachers College Press.
25. Federal Council of Culture and Education (CFCE) (2004). Priority learning cores. Ministry of Education, Science and Technology.
26. Dede, C., Richards, J. (2012). Digital Teaching Platforms: Customizing Classroom Learning for Each Student. New York: Teachers College Press.
27. Donnelly, D., McGarr, O., & O'Reilly, J. (2011). A framework for the integration of ICT into teachers' classroom practice. Computers & Education, 54, 1469-1483.
28. Duarte, J., Bos, M. S., & Moreno, M. (2009). Inequity in school learning in Latin America. IDB Technical Note No. 4, Inter-American Development Bank, Washington, DC.
29. Fenstermacher, G. (1989). "Three aspects of the philosophy of research on teaching." In Wittrock, M. C., & American Educational Research Association (1989), Research on Teaching: Approaches, Theories and Methods. Barcelona: Paidós.
30. Furman, M., & Podesta, M. E. (2009). The Adventure of Teaching Natural Sciences. Buenos Aires: Aique.
31. Furman, M., & Podesta, M. E. (2013). Good practices in science teacher education for schools in disadvantaged areas: the value of inquiry-based science in classroom lesson modeling. The International Journal of Science, Mathematics and Technology Learning, 19(2), 1-13.
32. Furman, M., & Podesta, M. E. (2014). Evaluating the impact of a school improvement program in science student learning: the case of "Bicentennial Schools." Buenos Aires.
33. Gerard, L. F., Spitulnik, M. & Linn, M. C. (2009). Teacher use of evidence to customize inquiry science instruction. Journal of Research in Science Teaching, 47(9), 1037-1063.
34. Light, D., Pierson, E. (2012). Highlighting changes in the classroom of a successful one-to-one program in rural Argentina: case studies of All Kids on the Net in San Luis. Center for Children and Technology, Education Development Center Inc.
35. Linn, M. & Slotta, J. (2009). WISE Science: Web-Based Inquiry in the Classroom. New York: Teachers College Press, Columbia University.
36. Ministry of Education (2007). Classroom Notebooks: Natural Sciences 6. Buenos Aires: Ministry of Education, Science and Technology of the Nation.
37. Ministry of Education (n.d.). National Evaluation 2010: Census of Completion of Secondary Education. Results report. Available at http://diniece.me.gov.ar/images/stories/diniece/evaluacion_educativa/nacionales/resultados/Resultados%20Censo%20ONE%202010.pdf
38. OECD (2012). PISA in Focus: What kinds of careers do boys and girls aspire to? Available at http://www.oecd.org/pisa/pisaproducts/pisainfocus/PISA%20in%20Focus-n%C2%B014%20ESP.pdf
39. Organization of Ibero-American States (OEI) (1999). Budapest Declaration: Declaration on Science and the Use of Scientific Knowledge. World Conference on Science for the Twenty-First Century: A New Commitment, Budapest (Hungary), 26 June - 1 July 1999.
40. Organization of Ibero-American States (OEI) (2014). Perspectives on Education in Latin America: Advances in Educational Goals 2021. Available at http://oei.es/xxivcie/Miradas2014Web.pdf
41. Organization for Economic Co-operation and Development (OECD) (2013). PISA 2012 Results: What Students Know and Can Do: Student Performance in Reading, Mathematics and Science. Available at http://www.oecd.org/pisa/pisaproducts/48852548.pdf
42. Osborne, J., Simon, S. & Collins, S. (2003). Attitudes towards science: a review of the literature and its implications. International Journal of Science Education, 25(9), 1049-1079.
43. Polino, C. (2012). Science in the classroom and interest in scientific and technological careers: an analysis of the expectations of students at the secondary level in Latin America. Ibero-American Journal of Education, 58, 167-191.
44. Porlán, R. (1998). Past, present and future of science education. Science Education Journal, 16(1), 175-185.
45. Pozo, J. and Gomez, M. (2001). Learn and Teach Science. Ed. Morata, Madrid. Third edition.
46. Raes, A., Schellens, T. & De Wever, B. (2013). Web-based collaborative inquiry to bridge gaps in secondary science education. Journal of the Learning Sciences. DOI: 10.1080/10508406.2013.836656
47. Rizzi, C. (2012). Modeling school activities with ICT. In Perez, P., Libedinsky, M., Garzon, M., Telleria, X. & López, N. (2012), School Activities with ICT. Buenos Aires: New Educational.
48. Torres, M. (2010). The traditional teaching of science versus new technologies. Educare Electronic Journal, XIV(1), January-June 2010, 131-142. Universidad Nacional, Costa Rica.
49. UNESCO (2013a). UNESCO Science for Peace and Sustainable Development. Available at http://unesdoc.unesco.org/images/0021/002197/219756e.pdf
50. UNESCO (2013b). Strategic Approaches to ICT in Education in Latin America and the Caribbean. Available at http://www.unesco.org/new/fileadmin/MULTIMEDIA/FIELD/Santiago/images/ticsesp.pdf
51. UNESCO (2009). Contributions to the teaching of natural sciences: Second Regional Comparative and Explanatory Study (SERCE). Santiago de Chile: Regional Office of Education of UNESCO for Latin America and the Caribbean.
52. Valverde, G., & Näslund-Hadley, E. (2010). The condition of education in mathematics and science in Latin America and the Caribbean. Inter-American Development Bank, Education Division.
53. Wilensky, U. and Reisman, K. (2006). "Thinking like a wolf, a sheep, or a firefly: learning biology through constructing and testing computational theories - an embodied modeling approach." Cognition and Instruction, 24(2), 171-209.
54. Bartel, D. L., Burstein, A. H., Edwards, D. L., 1985. The effect of conformity and plastic thickness on contact stress in metal-backed plastic implants. Journal of Biomechanical Engineering, 107, 193-199.
55. Bartel, D. L., Bicknell, V. L., Wright, T. M., 1986. The effect of conformity, thickness, and material on stresses in ultra-high molecular weight components for total joint replacement. Journal of Bone and Joint Surgery (American), 68-A(7), 1041-1051.
56. Bartel, D. L., Rawlinson, J. J., Burstein, A. H., Ranawat, C. S., Flynn, F. W., 1995. Stresses in polyethylene components of contemporary total knee replacements. Clinical Orthopaedics and Related Research, 317, 76-82.
57. Bei, Y., Fregly, B. J., Sawyer, W. G., Banks, S. A., Kim, N. H., 2004. The relationship between contact pressure, insert thickness, and mild wear in total knee replacements. Computer Modeling in Engineering and Sciences, 6(2), 145-152.
58. Chillag, K. J., Barth, E., 1991. An analysis of polyethylene thickness in modular total knee components. Clinical Orthopaedics and Related Research, 273, 261-263.
59. Deen, E. M., Garcia, F. M., Jin, Z. M., 2006. Effect of ultra-high molecular weight polyethylene thickness on contact mechanics in total knee replacement. Journal of Engineering in Medicine, 220(H), 733-742.
60. Fregly, B. J., Bessier, T. F., Lloyd, D. G., Delp, S. L., Banks, S. A., Pandy, M. G., D'Lima, D. D., 2011. Grand challenge competition to predict in vivo knee loads. Journal of Orthopaedic Research, 2011, 1-11.
61. Hills, D. A., Nowell, D., Sackfield, A., 1993. Mechanics of Elastic Contacts. Butterworth-Heinemann Ltd., Great Britain, pp. 45-71.
62. Hills, D. A., Urriolagoitia, S. G., 1999. Origins of partial slip in fretting - a review of known and potential solutions. The Journal of Strain Analysis for Engineering Design, 34(3), 175-181.
63. Kurtz, S. M., Jewett, C. W., Bergström, J. S., Foulds, J. R., Edidin, A. A., 2002. Miniature specimen shear punch test for UHMWPE used in total joint replacements. Biomaterials, 23, 1907-1919.
64. Zamudio, V., 2009. "Understanding and Preventing Periodic Behavior in Ambient Intelligence." Ph.D. thesis, University of Essex.
65. Kennedy, J. and Eberhart, R., 1995. "Particle swarm optimization." Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, pp. 1942-1948.
66. Goldberg, D., 1989. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
67. Sosa, A., 2013. "Stabilization strategies in dynamic environments using artificial intelligence techniques." M.Sc. thesis, Instituto Tecnologico de Leon, Leon, Gto.
68. Callaghan, V. "Putting the Buzz Back into Computer Science Education." Workshop Proceedings of the 9th International Conference on Intelligent Environments. doi: 10.3233/978-1-61499-286-8-454
69. Callaghan, V. and Zamudio, V., 2008. "Facilitating the ambient intelligence vision: a theorem, representation and solution for instability in rule-based multi-agent systems." Special Section on Agent Based System Challenges for Ubiquitous and Pervasive Computing, International Transactions on System Science and Applications, May 2008, 2(4), 108-121.
70. Romero, L., Zamudio, V., Sotelo, M., Baltazar, E. M. and Mezura, 2012. "A comparison between metaheuristics as strategies for minimizing cyclic instability in ambient intelligence." Sensors, vol. 12.
71. Sosa, A., Zamudio, V., Baltazar, R., Lin, C., Casillas, M. and Sotelo, M., 2012. "PSO and WEIGHT algorithms applied to the problem of dynamic instability in nomadic multi-agent systems." International ROPEC.
72. Harik, G., Lobo, F. and Goldberg, D., 1999. "The compact genetic algorithm." IEEE Transactions on Evolutionary Computation, 3(4), 287-297, Nov. 1999.
73. Engel, Heino, 2003. Structure Systems. Third edition. Editorial Gustavo Gili.
74. Vassigh, Shahin, 2005. Interactive Structures (CD). University at Buffalo, USA.
75. Kolarevic, Branko, 2003. Architecture in the Digital Age: Design and Manufacturing. Taylor & Francis.
76. Moussavi, Farshid. The Function of Form. Actar and Harvard Graduate School of Design.
77. Bechtold, Martin, 2008. Innovative Surface Structures: Technologies and Applications. Taylor & Francis.
78. Moore, Fuller, 2000. Understanding Structures in Architecture. McGraw-Hill, Mexico.

INDEX

A Accurate simulation engineering 50 Achieving higher accuracy 160 Acquire skills 86 Actual behavior 155 Additional transport equation 60 Aerodynamic analysis 56 Analytical model 2 Application fields 86 Application of dynamic simulation 49 Applications incorporating software 160 Applied linear operator 61 Architectural design 84, 86, 87, 96, 97 Architectural spaces from sustainability 85 Artificial satellite 106 Asymptotic evolution 109 Automatic derivation 3

Automotive design 3 Automotive simulations 2 B Blood sugar after injection 112 Bullet Physics 204, 205 C Chimneys pressure compensation 131 Circumstances convection dominates 65 Commercial program 143, 145 Communication networks 15 Complex architectural 84, 85 Complex dynamic system 40 Compressive stresses normal 142 Compressive stresses occur 142 Computational automata 5 Computed aided manufacturing 84 Computer graphics simulation 96 Computer program 85


Console incorporates stereoscopic glasses 167 Context of systems theory 106 Context of teaching architecture 85 Control and operational management 41 Conventional software engineering 226 Cyclical instabilities 104 Cyclic instability 101 D Data generator mechanism 111 Decision-making process 41, 51 Decomposition previously commented 49 Deduces specific information 121 Develop management models 50 Development of alternative mathematical numerical techniques 4 Digital system laparoscopic 166 Digital technologies 88 Dimensional object 94 Dissemination of culture 86 Dissimilar materials 145 Dynamic behavior 40, 45 Dynamic environment 101 Dynamic Integrated Framework for Software Process Improvement (DIFSPI) 43 Dynamic viscosity 60 E Economic activity 32, 34 Elastic modulus 142, 148 Electrical circuit 106

Excellent incremental framework 44 Excessive deformation 69 F Facilitate decision-making 42 Facilitate operational decision making 41 Feasibility of implementing optimization algorithms 104 Finite element formulation 64 Finite element method 56, 58, 64 Flexible electrical network 109 Fluctuations speed 62 Fluid Mechanics 126 G Glucose tolerance 112 Goal of computational simulation 97, 225 Good computational model 97, 225 Graphic and physical 209 Graphic comparison 124 Great alternative 228 Grid computing 76 H Heat exchanger sodium-sodium 113 Hybrid methods 60 I Implementing optimization algorithms 226 Inherent approximation 110 Initial model information 48 Insert Femoral (IF) 144


Interesting training program 221 Internal functional relationship 106 Internal interaction 2 K Kinematic viscosity 60 L Laparoscopic instruments 167 Laparoscopic surgery 161, 162, 163, 166, 167, 171 Laparoscopy 162, 163, 164, 165, 166 Larger deformation 145 Learning implementation 5 Learn project management 42 M Manner of arrangement 106 Mathematical assumption 79 Mathematical model 57, 64, 108, 109, 110 Mathematical problem 7 Mathematical regularity 71 Mathematical relationship 109 Mathematical structure 26, 27, 29 Maximum compressive stresses 149 Measurable attribute 106 Measure functional 71 Medical interventions 160 Metaheuristic optimization algorithms 101 Metal component 142 Metric system component 45 Metric tensor eigenvalues 70 Micrometric Predictability 3 Minimally invasive surgery (MIS) 168


Mini surgery Invasive (MIS) 162 Modeled system 36 Model etymological 108 Modeling synovial fluids 155 Modern mathematical analysis 2 Moment resulting system 73 Monte Carlo method 14 MonteCarlo simulation system 16 N Nonlinear partial differential 57 Novint Falcon haptic device 174 Numerical analysis 141, 143, 144 Numerical computation intensive 78 Numerical data 36 Numerical implementation 4 Numerical methods available 4 Numerical simulation 126 Numerical simulations predictions 126 O Observation concerns 111 Optimization approach 101 Orientation period 9 P Parallel computing 77, 78 Parallel software 60 Parameter calculation produces optimal parameter 123 Particular organization 42 Perform complex numerical calculation 2 Performs a parametric analysis 129 Phantom haptic device 220 Phantom haptic interface 220, 221


Physical phenomena 3 Physical substance 207 Polyethylene 142, 143, 148, 149, 150, 151, 155, 156 Population grows parallel 36 Possessing predictive analytical methods 5 Potential mathematical model 56 Practical cases oscillation 136 Pressure coefficient calculation 73 Probability distribution 6, 12 Problem analytically determine 79 Problem including friction 140 Processes of structural analysis 84 Process modeling project management 49 Process of architectural professional work 84 Project management process 46 Q Qualitative structure 26, 27 Quantitative yield 52 R Reference volume 71 Replication structure 49 Respective explanatory 73 S Satisfy scientific curiosity 111 Scientific instruments 76 Seymour Cray at Control Data Corporation 2 Share computing resource 76 Simulated patient’s abdomen 161, 211, 215, 218 Simulate the structural reality 85 Simulating standard rigid bodies 205

Simulation data 124 Simulation-based training 42, 51 Simulation method 12 Simulation model 40, 41, 42, 49 Software simulation process 41 Spectral analysis 124 Stability requirements 51 Static processes traditional 51 Stochastic simulations 51 Strategic management 41 Strict predictability 3 Structural analysis 84, 89, 93, 94 Substantive function 86 Subtraction causing problem 95 Supportive environment 44 Surface modeling 88, 90, 91, 92 Surgical Simulators 160 Surgical techniques 161, 162 System approach nonlinear 71 System Dynamics 26 System information 114 T Tactile feedback 160, 166 Teaching didactics-Learning 96 Technological analysis 3 Thermal conditions and neutron flux found 113 Training scenarios training 166 Transformation 70 Transitional until obtaining 127 V Verified maximum pressure 155 Virtual architectural models 96 Virtual laparoscopic simulator 166 Virtual reality system 167