Abstract: A guide system for use by a user who performs an operation in a defined three-dimensional region is disclosed. The system includes a data processing apparatus for generating images of the subject of the operation in co-registration with the subject, a display for presenting the images to the user, a probe whose position is visible to the user, and a tracking unit for tracking the location of the probe and transmitting that location to the data processing apparatus. When the user moves the probe to a selection region outside and surrounding the defined region, the data processing apparatus generates one or more virtual buttons, each associated with a corresponding instruction to the system. The data processing apparatus registers a selection by the user of any of the virtual buttons, the selection including positioning of the probe in relation to the apparent position of that virtual button, and modifies the computer-generated images accordingly.
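As an illustration only, the following is a minimal sketch of the selection logic described in the abstract above, assuming spherical regions in tracker space, a per-update callback, and hypothetical Button/GuideSystem names that are not part of the patent:

```python
import numpy as np

# Hypothetical sketch: virtual buttons appear when the tracked probe leaves the
# defined operation region and enters the surrounding selection region.
# Class names, radii, and callbacks are illustrative assumptions.

class Button:
    def __init__(self, name, center, radius, instruction):
        self.name = name
        self.center = np.asarray(center, dtype=float)
        self.radius = radius            # apparent size of the button in tracker space
        self.instruction = instruction  # callback executed on selection

class GuideSystem:
    def __init__(self, region_center, region_radius, selection_radius, buttons):
        self.region_center = np.asarray(region_center, dtype=float)
        self.region_radius = region_radius        # defined operation region
        self.selection_radius = selection_radius  # surrounding selection region
        self.buttons = buttons
        self.buttons_visible = False

    def update(self, probe_position):
        """Called on every tracker update with the probe's current location."""
        p = np.asarray(probe_position, dtype=float)
        d = np.linalg.norm(p - self.region_center)
        # Probe is outside the defined region but inside the selection shell:
        self.buttons_visible = self.region_radius < d <= self.selection_radius
        if not self.buttons_visible:
            return None
        for button in self.buttons:
            # Selection = probe positioned at the button's apparent position.
            if np.linalg.norm(p - button.center) <= button.radius:
                button.instruction()  # e.g. modify the computer-generated images
                return button
        return None
```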
Abstract: A computer system for permitting user interaction with a three-dimensional computer model defines an initial correspondence between the computer model and a real-world workspace. An editing volume of the workspace is also defined, and a stereoscopic image of the section of the computer model within the editing volume is displayed. Using a first input device, a user can translate and/or rotate the model, and rotate the editing volume, so as to bring different portions of the model into the editing volume, and thus into the user's view. The user operates a second input device to indicate changes to be made to the model. The first and second input devices can be operated with the user's respective hands. Since only the portion of the model within the editing volume need be displayed, the processing and display requirements are reduced in comparison to displaying the entire model.
Type: Grant
Filed: August 28, 2001
Date of Patent: January 13, 2009
Assignee: Volume Interactions Pte., Ltd.
Inventors: Luis Serra, Chee Keong Eugene Lee, Hern Ng
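A minimal sketch of the editing-volume idea from the abstract above, assuming an axis-aligned box in workspace coordinates and a 4x4 homogeneous model transform; the function and parameter names are illustrative, not the patented implementation:

```python
import numpy as np

def clip_to_editing_volume(vertices, model_transform, box_min, box_max):
    """Return the subset of model vertices (Nx3) lying inside the editing volume.

    vertices        : Nx3 array in model coordinates
    model_transform : 4x4 homogeneous transform placing the model in the workspace
    box_min/box_max : opposite corners of the editing volume in workspace coordinates
    """
    vertices = np.asarray(vertices, dtype=float)
    homo = np.hstack([vertices, np.ones((vertices.shape[0], 1))])
    world = (model_transform @ homo.T).T[:, :3]
    inside = np.all((world >= box_min) & (world <= box_max), axis=1)
    return world[inside]

# First input device: update model_transform (translate/rotate the model) or rotate
# the editing volume; second input device: edit the clipped portion that is displayed.
```

Because only the clipped subset is passed to the renderer, the display cost scales with the editing volume rather than with the whole model, which is the reduction the abstract describes.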
Abstract: A system and method for displaying 3D data are presented. The method involves subdividing a 3D display region into two or more display subregions and assigning a set of display rules to each display subregion. Visible portions of a 3D data set in each display subregion are displayed according to the rules assigned to that subregion. In an exemplary embodiment of the present invention, the boundaries of the display subregions, the display rules for each subregion, and the 3D data sets assigned to be displayed in each subregion can be set by a user and interactively modified during the display. In exemplary embodiments of the invention, the same 3D data can be displayed in each display subregion, albeit using different display rules. Alternatively, in other exemplary embodiments of the present invention, a different 3D data set can be displayed in each display subregion.
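A rough sketch of the per-subregion display rules described above; the DisplayRules/Subregion fields and the crop/draw_fn interfaces are assumptions made for illustration, not the patent's data structures:

```python
from dataclasses import dataclass, field

@dataclass
class DisplayRules:
    colormap: str = "gray"
    opacity: float = 1.0
    render_mode: str = "volume"   # e.g. "volume", "mip", "surface"

@dataclass
class Subregion:
    bounds: tuple                 # (xmin, xmax, ymin, ymax, zmin, zmax)
    rules: DisplayRules = field(default_factory=DisplayRules)
    dataset_id: str = "main"      # which 3D data set is shown in this subregion

def render(subregions, datasets, draw_fn):
    # The same data set may appear in every subregion under different rules,
    # or each subregion may show a different data set entirely.
    for region in subregions:
        data = datasets[region.dataset_id]
        visible = data.crop(region.bounds)   # assumed method: clip data to the subregion
        draw_fn(visible, region.rules)       # draw using that subregion's rules
```

Interactive modification then amounts to letting the user edit a Subregion's bounds, rules, or dataset_id between render calls.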
Abstract: A method and system for the dynamic display of three-dimensional ultrasound images are presented. In exemplary embodiments according to the present invention, the method includes acquisition of a plurality of ultrasound images with a probe whose position is tracked. Using the positional information of the probe, the images are volumetrically blended using a pre-determined, time-dependent dissolving process. In exemplary embodiments according to the present invention, a color look-up table can be used to filter each image prior to its display, resulting in real-time segmentation of greyscale values and visualization of the three-dimensional shape of structures of interest.
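A hedged sketch of the blending step described above, assuming each tracked frame maps to one voxel per pixel and that the dissolve is a simple per-frame decay; the decay factor and look-up-table handling are illustrative assumptions, not the patent's exact process:

```python
import numpy as np

def blend_frame(volume, frame, voxel_indices, lut, decay=0.95):
    """Blend one tracked ultrasound frame into the accumulated volume.

    volume        : 3D float array holding the accumulated visualization
    frame         : 2D uint8 greyscale ultrasound image
    voxel_indices : (N, 3) integer array, one voxel per frame pixel,
                    derived from the tracked probe position for this frame
    lut           : 256-entry float array mapping greyscale to display intensity
                    (real-time segmentation of greyscale ranges of interest)
    decay         : time-dependent dissolve applied to previously blended data
    """
    volume *= decay                               # older frames dissolve over time
    values = lut[frame.ravel()]                   # look-up-table filtering
    x, y, z = voxel_indices.T
    volume[x, y, z] = np.maximum(volume[x, y, z], values)
    return volume
```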