Deformable haptic simulation demo:
6-DoF haptic rendering of distributed contact between deformable
and/or rigid objects with complex geometry

Version 1.6
August 2007

Introduction
============

This is a haptics demo that demonstrates 6-DoF (forces and torques)
haptic rendering of distributed contact between deformable and/or
rigid objects with complex geometry. Distributed contact refers to
contact where there can be several simultaneous contact sites between
the two objects, each distributed over a certain non-zero surface area
(as opposed to the objects just touching at a single point). In other
words, this demo supports arbitrary contact configurations between the
two 3D objects.

The deformation model is reduced nonlinear FEM [Barbic and James,
SIGGRAPH 2005]. This model can handle large deformations and supports
deformable dynamics. It is very fast; however, the deformations are
limited to a certain low-dimensional basis of motion (we designed
algorithms to identify quality low-dimensional bases automatically,
based on the object's mesh and material parameters).

Supported haptic devices
========================

The demo is designed for Sensable's 6-DoF Phantom Premium 1.5
force-feedback device. We were also able to run it on the 3-DoF
Phantom Premium 1.5 and the Phantom Omni (these last two devices don't
support torques, but torques are still computed and displayed on the
screen). Other haptic devices are currently unsupported/untested.

Computer requirements
=====================

You need a dual-core PC with an Intel Pentium processor to run the
demo. We use some SSE2 machine code instructions in critical parts of
the code, and we did not test our demo on AMD processors (however, it
would likely work just fine). The demo was designed for the Windows
2000/XP operating system.

We have been able to run the demo on an older Pentium 4 (manufactured
in 2003) dual-core 2.0 GHz computer with 2 GB of memory.
Usually, we run our demo on a Pentium 4 dual-core 3.0 GHz computer
(manufactured in 2004), or on a newer Intel Core Duo 2.66 GHz computer
(manufactured in 2007). 2 GB of memory is sufficient. We recommend
using as fast a computer as available, as faster computers will give
you a better force/torque signal, provided you increase the "maximum
number of processed points" setting (see the Graceful degradation
section).

Our demo uses GPU shaders for graphical rendering (not for haptics).
The shaders are only supported on Nvidia GeForce 6000 series graphics
cards or higher. Even if your graphics card does not support our
shaders, you can still run the demo: the demo will detect whether your
card supports our shaders, and if not, it will fall back to standard
rendering. Even with standard rendering, the performance was still
very good; and in either case, this only affects graphics, not the
haptic signal.

Warning
=======

If you set some of the user interface settings incorrectly, you can
potentially damage the device. For example, setting high stiffness
values from the UI will result in device instabilities. You are
running the demo at your own risk. We never encountered any particular
problems (after several months of testing the demo), and our settings
worked fine for our purposes. In addition to us, about 20 visitors to
our lab have already tried the demo and there were never any problems.
However, the demo is research code and we cannot guarantee
problem-free operation under all possible circumstances. Our demo does
continuously monitor the device's motor temperatures, and it will shut
forces/torques off if temperatures reach dangerous values.

Required: Phantom device drivers v4.2 or higher
===============================================

This demo needs Phantom device drivers version 4.2 or higher. It will
not run with earlier device drivers (reason: it uses the OpenHaptics
API, which only works with device drivers 4.2 or higher).
You can download the drivers (for free) at: http://www.sensable.com

Required: OpenHaptics DLL
=========================

The demos use OpenHaptics, a haptic API from Sensable Technologies.
OpenHaptics is not a free API, neither for developers nor for end
users, even if they have purchased a device from Sensable
Technologies. For this reason, we cannot distribute OpenHaptics DLLs
with our demos. If you are a qualified academic institution, you can
obtain an academic copy of the DLLs (free of charge as of 2007,
courtesy of Sensable Technologies) on Sensable's website. To do so,
you need to log in to the Sensable Developer Support Center (you can
find it by following the link for OpenHaptics support; first, you need
to make an account). You only need one file: hd.dll. Place that file
into the main demo folder or somewhere on the system path.

Installing the demo
===================

There is no specific installer. Just unzip the files into a folder and
you are ready to go. You need to unzip both the demo core and the
individual demo data files into the same folder; appropriate
subdirectories are stored in the zip archives and will be created
during unzipping (demo data files for each demo will be placed into
separate subfolders). When unzipping, you need to enable the option to
generate appropriate subfolders. Under Unix/Linux, this happens by
default. In WinZip, check the "Use folder names" option when
unzipping.

Setting the haptic device name
==============================

By default, the demo uses the haptic device named "Default PHANToM"
(without the quotes), which is the default name given to Phantom
force-feedback devices. If your device is named differently (under
Windows, you can find the name of your device in Control
Panel/Phantom), you need to alter the "configFiles/phantomName.txt"
text file accordingly. The demo won't run unless the device name is
set correctly.
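For example, if Control Panel/Phantom lists your device under the
hypothetical name "MyPhantom", configFiles/phantomName.txt would
contain just that name (this assumes the file holds nothing but the
device name on a single line):

```text
MyPhantom
```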
Running the demo
================

Double-click the appropriate .bat file (in the main demo folder).

Note: You can also run the demo from a Windows Command Prompt. If the
demo for some reason doesn't start, this will allow you to see any
error messages.

The wireframe overlay
=====================

Two meshes are displayed for the haptically manipulated object. The
wireframe mesh gives the current position of the haptic manipulandum.
The solid triangle mesh gives the current position of the virtual
object, which will follow the wireframe object in free space (unless
max velocities are exceeded). If the user moves the haptic
manipulandum (wireframe object) into contact, the solid object will
maintain surface contact, and the user will feel forces/torques. This
is a form of a virtual proxy (God object), as is standard in haptics.
You can disable displaying the wireframe mesh on the UI (as the extra
mesh can sometimes be visually distracting).

Mouse
=====

Middle + drag: zoom in/out (you must press the middle button or press
  the wheel down, not scroll the wheel)
Right + drag: move camera
(note: the mouse scroll wheel is assigned no function)

You can zoom/move the camera while the simulation is running, for
example, to explore the scene from a different angle. If you change
the camera view, the motion of the haptic manipulandum will always be
interpreted according to the current camera view. For example, moving
the manipulandum left will mean left with respect to the current
camera view, which is different from left under another camera view.

Keyboard
========

Note: The main simulation window *must have focus* for keys to work.
Initially, this window does NOT have focus (the 'Controls' UI window
does). (I am unable to change this behavior in GLUI; maybe it's an
inherent limitation of Windows; if somebody knows a workaround in
GLUI, I'd be happy to hear about it.)

ESC ... exit program
\   ... reset camera view to default
c   ... calibrate Phantom (this also happens automatically at the
        start of the simulation, unless disabled in the configuration
        file)
i   ... print out camera position information
a   ... on/off: render coordinate system axes
k   ... on/off: display traversed points (blue) and points in contact
        (red)
L   ... (capital L) lock the scene and enter "camera mode": haptics
        stops (object position freezes) and you can freely reposition
        the camera. This is good for having a look at the contact
        site from another angle. Exit this mode by pressing 'L' or
        'l'. Haptic simulation then resumes from the original camera
        view.
p   ... on/off: display pointshell (will slow down the simulation)
n   ... on/off: when displaying the pointshell, this toggles
        displaying the normals; normals determine the direction of
        the penalty force; note that normals dynamically change with
        deformation
q   ... on/off: display the manipulated object
e   ... on/off: display the other (non-manipulated) object
w   ... on/off: display the wireframe mesh for the non-manipulated
        object (note: this mesh plays no significant role in the
        simulation)

Note for the 'p' key: the pointshell is only displayed up to the level
specified by the "Max rendered node level" edit box on the UI. By
default, this is the deepest level, so you can see the entire
pointshell.

The button on the haptic handle
===============================

You can use the button to control workspace indexing: when the button
is pressed down, you can freely reposition the haptic handle (or
change the handle orientation), without this having any effect on the
position/orientation of the virtual handle (i.e., the wireframe
overlay mesh) in the simulation. This is useful if you reach the limit
of the Phantom workspace. Press and hold the button, and move the
physical haptic handle to the origin (you can also un-rotate the
handle to a neutral rotation). Then let go of the button - and you can
explore the parts of the scene that would otherwise not be reachable
due to workspace limitations.
Note: you will typically press the button AND then hold it while
repositioning the device. Just pressing it and immediately releasing
it is not very useful.

The Controls window
===================

A note:
-------
If you change a value on the UI (for example, using the keyboard, by
editing a text box), you need to click on an empty space in the UI
window to complete the input. Otherwise, the particular control is
still accepting input, and won't send the new value to the application
until it loses focus (I am unable to change this behavior in GLUI; if
somebody knows a workaround, please let me know).

Another note:
-------------
Some edit boxes are too short to display the entire numerical value,
and I am unable to make them longer. For example, a GD threshold of
2000 is displayed as "200", because the last 0 is hidden; it appears
if you click on the edit box and scroll left/right with the cursor
keys.

Virtual coupling stiffness kVC
------------------------------
This controls the stiffness of the virtual coupling spring between the
physical position of the device and the virtual position of the
simulation object. The value is to be interpreted in actual physical
workspace device units, in N/mm. This value typically needs to be less
than 0.6 N/mm (a hardware stiffness limitation of the device).

Environment stiffness kENV
--------------------------
This is just like the virtual coupling stiffness, except it controls
the stiffness of each individual penalty spring applied to points in
contact. This should typically be set to a higher value than kVC,
e.g. to 3 or 4 times kVC. For example, a common setting would be
kVC = 0.4 N/mm, kENV = 1.2 N/mm. Note: it is possible to increase the
stiffness of our demos by increasing kVC and kENV, but care must be
taken when approaching the hardware stiffness limits of the device;
see below.
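The coupling spring and the contact penalty springs act in series, so
the stiffness felt at the device is lower than either value alone. A
minimal Python sketch of the effective stiffness (the formula is the
one given in the note that follows; the function name and the n = 0
guard are our own additions):

```python
def effective_stiffness(kVC, kENV, n):
    """Effective stiffness k (N/mm) felt at the device when n points
    are in contact: the virtual coupling spring (kVC) in series with
    the parallel combination of up to 10 penalty springs (kENV)."""
    if n <= 0:
        return 0.0  # free space: no contact springs engaged
    return 1.0 / (1.0 / kVC + 1.0 / (min(n, 10) * kENV))

# With the common settings from above (kVC = 0.4, kENV = 1.2),
# five points in contact give k = 0.375 N/mm, safely under the
# 0.6 N/mm hardware limit:
k = effective_stiffness(kVC=0.4, kENV=1.2, n=5)
```

Note how k can never exceed kVC, no matter how many points come into
contact; this is why kENV can safely be set several times higher than
kVC.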
Note: the actual stiffness felt by the haptic device is:

  k = 1 / (1/kVC + 1/(min(n,10) * kENV)) <= 1 / (1/kVC + 1/(10 * kENV)),

where n is the current number of points in contact. This stiffness k
is also always displayed at the bottom of the UI (as "effective
stiffness").

Note: in some demos from Sensable, the maximum stiffness is set to
0.3 N/mm for a 6-DoF Phantom, and to 0.6 N/mm for a 3-DoF Phantom
device. Exceeding these values (for k) will result in device
instabilities. Also, instabilities will appear if the ratio kENV / kVC
is set too high, e.g. much higher than 3 or 4. To some extent, you can
reach higher stiffnesses by increasing the servo loop rate to 2000 Hz.

Torque-force balance
--------------------
This controls how strong the simulation torques are compared to the
simulation forces.

Rendered torque stiffness
-------------------------
This controls the ratio between the simulation torques and the torques
that are haptically rendered to the device. For example, you can
decrease/increase this parameter to make the torques weaker/stronger
(which will make the torque motors heat up more slowly/faster).

Static damping
--------------
Static damping (0 <= alpha < 1) adds extra dissipation to the system,
which can improve stability. Large values of alpha (close to 1) lead
to a surface stickiness effect. We set alpha = 0.5 as the default.
When static damping is disabled (alpha = 0), the virtual object always
snaps to the first-order linearized rest configuration under the
current virtual coupling/contact forces and torques. For a positive
value of alpha, the object only travels (1 - alpha) of the distance to
the rest configuration in every haptic cycle.

Virtual coupling saturation
---------------------------
If enabled, the virtual coupling spring will only be linear up to a
certain threshold. This prevents pop-through with thin geometry.
Virtual coupling saturation serves as a virtual proxy: even if force
rendering is turned off and the user is free to position the
manipulandum in any way they want, the manipulated object cannot
penetrate deep into the other object. Beyond the linear region, the
spring force will exponentially saturate to a constant value equaling
twice the linear threshold. The threshold is given in terms of how
deep into the voxel layer the object can penetrate under the maximum
value of the virtual coupling force. The force rendered to the haptic
device is linear (it does not saturate), regardless of whether the
saturation is enabled or not.

Deformable object
-----------------
It is possible to control the "softness" (compliance) of the
deformable object. This effectively linearly scales the Young's
modulus for the entire deformable object. Also, you can control the
lowest natural vibration frequency of the object (all the other
frequencies are rescaled so that the ratio of any frequency to the
lowest frequency stays the same). You can also disable the
deformations and feel the rigid object (the simulation will be
significantly stiffer).

Max velocity / max angular velocity
-----------------------------------
These control the maximum velocities that are allowed for the virtual
object in the simulation. If you move the haptic manipulandum faster
than the limit, the object will lag behind and you will feel a braking
force (and also a braking torque when exceeding the rotation limits,
if using a 6-DoF device). This prevents the user from propelling the
object extremely fast into the other object, which would lead to
haptic instabilities. We set the limits to a reasonably high value so
that they don't impede normal demo operation. These settings also
affect the temporal coherence module, which internally uses the max
velocity settings to estimate the time before a certain point can
enter contact. Temporal coherence can be more aggressive for lower
velocity limits.
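The braking effect can be thought of as a per-cycle cap on how far the
virtual object may travel toward the manipulandum. A minimal sketch of
the idea (this is our own illustration, not the demo's actual
implementation; the function name and arguments are hypothetical):

```python
def step_toward(virtual_pos, target_pos, v_max, dt):
    """Move the virtual object toward the manipulandum target, but
    never faster than v_max (workspace units per second); dt is the
    duration of one haptic cycle in seconds."""
    step = [t - v for v, t in zip(virtual_pos, target_pos)]
    dist = sum(s * s for s in step) ** 0.5
    max_step = v_max * dt  # furthest the object may travel this cycle
    if dist <= max_step:
        return list(target_pos)  # free motion: object tracks the handle
    # Limit exceeded: object lags behind; the remaining stretch of the
    # coupling spring is what the user feels as a braking force.
    scale = max_step / dist
    return [v + s * scale for v, s in zip(virtual_pos, step)]
```

At a 1000 Hz servo rate (dt = 0.001 s), a manipulandum jump of 10 mm in
one cycle with v_max = 1000 mm/s would be limited to a 1 mm step of the
virtual object.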
Max rendered node level
-----------------------
The pointshell will only be haptically rendered up to and including
the level specified. By default, this is set to the finest pointshell
level (i.e., the entire pointshell). This setting also affects the
maximum level used by graceful degradation, if graceful degradation is
enabled.

Graceful degradation
--------------------
This enables the simulation to maintain a fixed update rate, even if
the contact configuration becomes so involved that not all pointshell
points can be traversed within one haptic cycle. If enabled (it is
enabled by default), the simulation will only haptically render the
pointshell up to a level where a specified maximum number of processed
tree points (i.e., BD-Tree nodes) has been reached. You can control
this maximum number of points on the UI. Since the time it takes to
complete a haptic cycle is approximately proportional to the number of
tree points traversed, this allows you to control the maximum amount
of computation time for each haptic cycle. The deepest level rendered
is continuously displayed in the main application window (LOD: n). The
algorithm internally ensures that LOD switches are not too frequent
(by using two activation thresholds, cold and warm - they are also
displayed in the main application window).

If you have a fast computer, you can set the max number of processed
points to a higher setting, e.g. 1200 or higher. You can use the
microseconds counter in the main application window as a guide to what
number of points your computer can handle. Higher thresholds will give
you more points in contact and an improved force/torque signal. Also
note that for a fixed max number of traversed points, the contact
forces and torques computed will be identical on any computer.

IMPORTANT: When trying the demo for the first time on a new computer,
you will most likely need to set the fixed max number of traversed
points appropriately.
Select the largest value that still maintains sufficient haptic
performance (for example, around 800 microseconds per frame).

Scheduler rate
--------------
The default rate is 1000 Hz, but you can also change it to the
"alternative rate" at any time during the simulation. The alternative
rate is usually 2000 Hz. Some devices don't support 2000 Hz - in this
case, you can modify the *alternativeSchedulerRate entry in the config
file to another value. Also, we have seen computer/haptic device
combinations where the alternative scheduler rate didn't work
(OpenHaptics fails to change the servo rate), no matter the
alternative rate chosen.

Use voxmap instead of distance field
------------------------------------
This makes the simulation use a quantized (8-bit) voxmap
representation for the rigid object, as opposed to a continuous
(trilinearly interpolated) distance field. It allows you to compare
the quality of the force/torque signals under the two different
representations.

Enable BD-Tree
--------------
Disabling this results in a flat, tree-less pointshell traversal,
which will be very slow. The entire pointshell is always rendered,
regardless of the "Max rendered node level" setting.

Half-voxel force
----------------
This is the force that the user would feel if a single point
penetrated half a voxel (of the distance field) into contact. Higher
distance field resolutions will lower this value.
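To see why higher resolutions lower this value: for a fixed object
size, doubling the distance field resolution halves the voxel size,
and with it the half-voxel penetration depth. A rough sketch, under
our simplifying assumption that the penalty force is simply the
penetration depth times the contact stiffness kENV (the function and
its parameters are illustrative, not the demo's actual code):

```python
def half_voxel_force(bbox_size_mm, resolution, kENV):
    """Approximate force (N) on a single point penetrating half a
    voxel, assuming force = kENV * penetration depth (in mm)."""
    voxel_size = bbox_size_mm / resolution  # edge length of one voxel
    return kENV * (voxel_size / 2.0)

# Doubling the resolution halves the half-voxel force:
f_coarse = half_voxel_force(256.0, 64, kENV=1.2)   # coarse field
f_fine = half_voxel_force(256.0, 128, kENV=1.2)    # finer field
```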
Performance indicators (main application window)
================================================

Displayed are:
- the number of microseconds it took to compute the current haptic
  cycle (averaged over the last 100 cycles),
- the current haptic and graphic frame rates,
- the current LOD rendered (see the Graceful degradation section),
- the current number of points in contact (not averaged, sampled at
  the graphics rate),
- the current number of tree nodes traversed (not averaged, sampled
  at the graphics rate),
- the cold and warm graceful degradation thresholds (the warm
  threshold matches the max visited tree nodes for graceful
  degradation),
- the total number of nodes in the BD-Tree.

Motor temperatures
==================

These are continuously displayed in the title bar, and also in the
top-right corner of the screen. Temperatures are normalized to a scale
from 0 to 99, where 0 = room temperature and 99 = overheating. If a
temperature exceeds 85, the demo will exponentially reduce the
stiffness until all temperatures are below 65.

Troubleshooting graphics
========================

In order to use GPU-accelerated graphical rendering, your graphics
card and driver must support the vertex texture fetch functionality,
pbuffers, and render-to-texture extensions. These extensions are used
in a GP-GPU-like fashion for the purposes of graphical rendering, such
as computing the model vertex displacements at graphical rates. They
are not used for haptics, which runs entirely on the CPU. At startup,
the demo will check whether your graphics card supports the features
necessary for GPU-accelerated graphical rendering; if not, it will
fall back to CPU graphical rendering. In most cases, this should still
give you fast haptic performance, with perhaps a lower graphical frame
rate (especially on older computers). We used Nvidia GeForce 6800
graphics cards or newer. The demo has not been tested with ATI cards.
GPU-accelerated rendering uses vertex texture fetches, and as such
will only be available on Nvidia's GeForce 6000 series or newer.
Vertex texture fetches are currently (2007) not supported on Nvidia's
Quadro FX cards. In these cases, the application will fall back to CPU
rendering.

If the application fails to start due to some graphics-card-related
problem (this never happened on our test computers), you can try
updating your graphics card driver. If that fails to resolve the
problem, you can try disabling GPU rendering by setting the
*renderOnGPU setting to 0 in the appropriate config file (the default
is 1, i.e. enabled). This will disable GPU-accelerated rendering, and
the demo won't even attempt to detect GPU support.

Cool things to try in the demo
==============================

Insert the dinosaur's tail into a hole on the top shelf of the bridge.

Drive the bridge by hooking the dinosaur's tail against the small
feature on the back of the windpump. Ride the bridge and watch the LOD
indicator (graceful degradation). Ride the bridge, then feel the
torques as you try to rotate the dinosaur along the vertical axis.

Turn deformations off and feel the rigid structure.

Enable the display of the BD-Tree. See how the spheres deform with the
geometry, and how they get subdivided into smaller spheres at the
contact sites. Disable the tree and see how much it slows down (it
will get REALLY slow).

Turn on/off the manipulandum display (shown as a wireframe mesh).

Press 'p' and see the pointshell. Deform the object and see the
pointshell follow. Then, decrease the level of detail rendered and you
will see the pointshell at coarser levels. In addition, press 'n' to
see the normals. Deform the object and see how the normals deform too.

Get into an interesting contact configuration, then press 'L'. The
simulation will freeze. Now you can change the camera view. Show/hide
the two objects with 'q' and 'e'. Press 'l' to resume the simulation.
The camera view will reset back to the view when 'L' was pressed.
Really cool: Press 'k' to enter (or leave) the contact display mode.
Now, all traversed tree nodes (except those in contact) are colored
blue, and all nodes in contact are red. Go into contact, and hide the
haptic object ('q'). You will see the contact sites in red, and you
will also see the blue points concentrated close to the contact sites.
Also note how the blue points are flickering; this is due to using
temporal coherence. Many points far away from the contact sites are
put to sleep by temporal coherence. Turn temporal coherence off and
the nodes will stop flickering (they will become steady). Note that
our L1 pointshells (i.e., the coarsest level) have many points
(e.g. 1000); those are always traversed when temporal coherence is
off.

Try making a large contact area and watch the performance indicators
and the LOD level rendered. Making a large contact area is easier if
you disable virtual coupling saturation, as you can then penetrate
deeper.

Enable/disable virtual coupling saturation and see how saturation
prevents pop-through. With saturation on, disable forces and torques,
then reposition the haptic handle very far into contact. See how the
force and torque grow to huge numbers (not rendered to the device, of
course; but you can still watch them in the top-right part of the
screen), yet the two objects don't inter-penetrate more than a shallow
distance.

Switch the manipulated object's representation to the VPS voxmap and
compare the forces and torques to those under the distance field.

Change the deformable object's compliance and base frequency.

Use workspace indexing! Without it, you are limited in what you can
explore.

Note
====

The alpha path planning puzzle would benefit from torque saturation
(not implemented in our demo).
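Torque saturation could work analogously to the force saturation
described earlier: let the rendered torque grow linearly up to a
threshold, then cap its magnitude while preserving its direction. This
is purely our sketch of the idea - nothing like it exists in the demo,
and a simple hard clamp is used here instead of the exponential
saturation that the demo applies to forces:

```python
def saturate_torque(torque, tau_max):
    """Clamp the magnitude of a 3D torque vector to tau_max,
    preserving its direction."""
    mag = sum(t * t for t in torque) ** 0.5
    if mag <= tau_max:
        return list(torque)  # below the threshold: pass through unchanged
    scale = tau_max / mag
    return [t * scale for t in torque]
```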
Credits
=======

Programming and this readme.txt:
  Jernej Barbic, jernej.barbic@gmail.com
  Carnegie Mellon University

Research:
  Jernej Barbic, jernej.barbic@gmail.com
  Carnegie Mellon University

  Doug James, djames.cs@gmail.com
  Cornell University

Citation:

Jernej Barbic and Doug L. James: Six-DoF Haptic Rendering of Contact
between Geometrically Complex Reduced Deformable Models, IEEE
Transactions on Haptics, 1(1), 2008, pp. 39-52

@article{Barbic:2008:SHR,
  author = "Jernej Barbi\v{c} and Doug L. James",
  journal = "IEEE Transactions on Haptics",
  title = "Six-DoF Haptic Rendering of Contact between Geometrically
           Complex Reduced Deformable Models",
  year = "2008",
  volume = "1",
  number = "1",
  pages = "39--52",
}

Jernej Barbic and Doug L. James: Time-critical distributed contact
for 6-DoF haptic rendering of adaptively sampled reduced deformable
models, Symposium on Computer Animation (SCA) 2007, San Diego, CA,
August 2007

@inproceedings{Barbic:2007:TDC,
  author = "Jernej Barbi\v{c} and Doug L. James",
  title = "Time-critical distributed contact for 6-DoF haptic
           rendering of adaptively sampled reduced deformable models",
  year = "2007",
  month = aug,
  booktitle = "2007 ACM SIGGRAPH / Eurographics Symposium on Computer
               Animation",
  pages = "171--180",
}

This demo uses OpenHaptics, a commercial haptic API from Sensable
Technologies.

It also uses GLUI, an LGPL-licensed GLUT-based UI library:
http://glui.sourceforge.net/
http://www.cs.unc.edu/~rademach/glui/