USER INTERFACE FOR ULTRASOUND SYSTEM

- General Electric

A user interface for an ultrasound system is provided. The ultrasound system includes the user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display. A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to and the benefit of the filing date of U.S. Provisional Application No. 60/914,893, filed Apr. 30, 2007 for “PORTABLE 3D/4D ULTRASOUND,” which is hereby incorporated by reference in its entirety.

BACKGROUND OF INVENTION

This invention relates generally to ultrasound systems and, more particularly, to a user interface for controlling ultrasound imaging systems, especially portable ultrasound medical imaging systems.

Ultrasound systems typically include ultrasound scanning devices, such as ultrasound probes having transducers that allow for performing various ultrasound scans (e.g., imaging a volume or body). The ultrasound probes are typically connected to an ultrasound system for controlling the operation of the probes. The ultrasound system usually includes a control portion (e.g., a control console or portable unit) that provides interfaces for interacting with a user, such as receiving user inputs. For example, different buttons, knobs, etc. can be provided to allow a user to select different options and control the scanning of an object using the connected ultrasound probe.

When using volume probes, for example three-dimensional (3D) or four-dimensional (4D) probes, certain procedures may require multiple steps and adjustments that can be controlled by different controllers, for example, using several rotatable control members (commonly referred to as rotaries) to adjust different settings. As a result, numerous control members of each of several different types can be included as part of the control portion. The control members are often mode dependent such that each of the control members controls a different function or allows adjusting a different setting based on the mode of operation, for example, a visualization or rendering mode of operation.

As the size of ultrasound systems continues to decrease, the space available for the various controllers on the control portion is limited. Moreover, as processing power continues to increase, portable ultrasound systems, which have increasingly smaller footprints, often include an entire ultrasound system (e.g., processing components, etc.) embodied within a housing having the dimensions of a typical laptop computer or smaller. Thus, the same functionality is often now available in portable systems as in larger systems. However, with the reduced space available in a compact unit, the reduced number of available control members can make it difficult or complex to control certain procedures or adjust different parameters. In some instances, these portable ultrasound systems may not have enough controls to allow a user to control all of the operations that would otherwise be available on a larger system, but that are still desirable in portable systems.

BRIEF DESCRIPTION OF INVENTION

In accordance with one embodiment, an ultrasound system is provided that includes a user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display. A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.

In accordance with another embodiment, an ultrasound system is provided that includes an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data and a portable control unit having a user interface and a display. The ultrasound volume probe is connected to the portable control unit and wherein manipulation of one of the 3D ultrasound data and 4D ultrasound data is provided by a single user control member of the user interface.

In accordance with yet another embodiment, a method for controlling an ultrasound probe using a portable ultrasound system is provided. The method includes receiving a user input selecting one of a plurality of virtual display elements on a display on the portable ultrasound system. The method further includes configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.

BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS

FIG. 1 is a block diagram of an ultrasound system formed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 2 is a block diagram of the ultrasound processor module of FIG. 1 formed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 3 is a top perspective view of a portable ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.

FIG. 4 is a top plan view of a user interface of the portable ultrasound imaging system of FIG. 3.

FIG. 5 is an elevation view of a backend of the portable ultrasound imaging system of FIG. 3.

FIG. 6 is a side elevation view of the portable ultrasound imaging system of FIG. 3.

FIG. 7 is a perspective view of a case for the portable ultrasound imaging system of FIG. 3.

FIG. 8 is a perspective view of a movable cart that is capable of supporting the portable ultrasound imaging system of FIG. 3.

FIG. 9 is a top view of a hand carried or pocket-sized ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.

FIG. 10 is a screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 11 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 12 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 13 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 14 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 15 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 16 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 17 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 18 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 19 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

FIG. 20 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.

DETAILED DESCRIPTION OF VARIOUS PREFERRED EMBODIMENTS

The foregoing summary, as well as the following detailed description of certain embodiments of the inventive arrangements, will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division of hardware circuitry. Thus, for example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like). Similarly, the programs may be stand-alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.

As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the inventive arrangements are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property.

It should be noted that although the various embodiments may be described in connection with an ultrasound system, the methods and systems described herein are not limited to ultrasound imaging. In particular, the various embodiments may be implemented in connection with different types of medical imaging, including, for example, magnetic resonance imaging (MRI) and computed-tomography (CT) imaging. Further, the various embodiments may be implemented in other non-medical imaging systems, for example, non-destructive testing systems.

Exemplary embodiments of ultrasound systems provide a user interface for an ultrasound system. A plurality of virtual display elements (e.g., display icons) are selectable by a user to change the function controlled by a particular user control member. The selection of the virtual display elements reconfigures one or more of the user control members for controlling certain parameters, settings, etc. based on the selected virtual display element.

FIG. 1 illustrates a block diagram of an ultrasound system 20 formed in accordance with various embodiments of the inventive arrangements. The ultrasound system 20 includes a transmitter 22 that drives an array of elements 24 (e.g., piezoelectric crystals) within a transducer 26 to emit pulsed ultrasonic signals into a body or volume. A variety of geometries may be used, and the transducer 26 may be provided as part of, for example, different types of ultrasound probes. For example, the ultrasound probe may be a volume probe such as a three-dimensional (3D) probe or a four-dimensional (4D) probe wherein the array of elements 24 can be mechanically moved. The array of elements 24 may be swept or swung about an axis powered by a motor 25. In these embodiments, movement of the array of elements 24 is controlled by a motor controller 27 and motor driver 29. However, it should be noted that the ultrasound system 20 may have connected thereto an ultrasound probe that is not capable of mechanical movement of the array of elements 24. In such embodiments, the motor controller 27 and motor driver 29 may or may not be provided and/or may be deactivated. Accordingly, the motor controller 27 and motor driver 29 are optionally provided.

The emitted pulsed ultrasonic signals are back-scattered from structures in a body, for example, blood cells or muscular tissue, to produce echoes that return to any of the elements 24. The echoes are received by a receiver 28. The received echoes are provided to a beamformer 30 that performs beamforming and outputs an RF signal. The RF signal is then provided to an RF processor 32 that processes the RF signal. Alternatively, the RF processor 32 may include a complex demodulator (not shown) that demodulates the RF signal to form IQ data pairs representative of the echo signals. The RF or IQ signal data may then be provided directly to a memory 34 for storage (e.g., temporary storage).
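The complex demodulation mentioned above can be sketched in a few lines. This is an illustrative reconstruction only, not the RF processor's actual signal chain; the function name, the moving-average low-pass filter, and all parameters are hypothetical. Mixing the real RF line down by the center frequency and low-pass filtering leaves the complex envelope, i.e., the IQ data pairs:

```python
import numpy as np

def rf_to_iq(rf, fs, f0, taps=64):
    """Demodulate a real RF line into complex IQ samples (baseband).

    rf: 1-D array of RF samples; fs: sampling rate (Hz); f0: transducer
    center frequency (Hz). Mixing down by f0 shifts the echo band to DC;
    a crude moving-average FIR then rejects the image at 2*f0.
    """
    n = np.arange(rf.size)
    mixed = rf * np.exp(-2j * np.pi * f0 * n / fs)  # shift f0 to DC
    h = np.ones(taps) / taps                        # simple low-pass FIR
    i = np.convolve(mixed.real, h, mode="same")     # in-phase component
    q = np.convolve(mixed.imag, h, mode="same")     # quadrature component
    return i + 1j * q
```

For a pure tone at the center frequency, the magnitude of the resulting IQ signal settles at half the tone's amplitude, which is the expected complex-envelope scaling for a real input.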

The ultrasound system 20 also includes a processor module 36 to process the acquired ultrasound information (e.g., RF signal data or IQ data pairs) and prepare frames of ultrasound information for display on a display 38. The processor module 36 is adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the acquired ultrasound information. Acquired ultrasound information may be processed in real-time during a scanning session as the echo signals are received. Additionally or alternatively, the ultrasound information may be stored temporarily in the memory 34 during a scanning session and processed in less than real-time in a live or off-line operation. An image memory 40 is included for storing processed frames of acquired ultrasound information that are not scheduled to be displayed immediately. The image memory 40 may comprise any known data storage medium, for example, a permanent storage medium, removable storage medium, etc.

The processor module 36 is connected to a user interface 42 that controls operation of the processor module 36 as explained below in more detail and is configured to receive inputs from an operator. The display 38 includes one or more monitors that present patient information, including diagnostic ultrasound images to the user for review, diagnosis, and/or analysis. The display 38 may automatically display, for example, one or more planes from a 3D ultrasound data set stored in the memory 34 or 40. One or both of the memories 34, 40 may store 3D data sets of the ultrasound data, where such 3D data sets are accessed to present 2D and 3D images. For example, a 3D ultrasound data set may be mapped into the corresponding memory 34 or 40, as well as one or more reference planes. The processing of the data, including the data sets, is based, at least in part, on user inputs, for example, user selections received at the user interface 42.

The display 38 also may display one or more virtual display elements 49 that are selectable by a user, as described in more detail below. Based on the selection of a virtual display element 49, one or more corresponding controls of the user interface 42, for example, the operations controlled by a trackball and/or the like (not shown), may be reconfigured.

In operation, the ultrasound system 20 acquires data, for example, volumetric data sets by various techniques (e.g., 3D scanning, real-time 3D imaging, volume scanning, 2D scanning with transducers having positioning sensors, freehand scanning using a voxel correlation technique, scanning using 2D or matrix array transducers, etc.). The data may be acquired by mechanically moving the array of elements 24 of the transducer 26, for example, by performing a sweeping type of scan. The transducer 26 also may be moved manually, such as along a linear or arcuate path, while scanning a region of interest (ROI). At each linear or arcuate position, the transducer 26 obtains scan planes that are stored in the memory 34.

FIG. 2 illustrates an exemplary block diagram of the processor module 36 of FIG. 1. The processor module 36 is illustrated conceptually as a collection of sub-modules, but it may also be implemented utilizing any combination of dedicated hardware boards, digital signal processors (DSPs), processors, etc. Alternatively, the sub-modules of FIG. 2 may be implemented utilizing an off-the-shelf PC with a single processor or multiple processors, with functional operations distributed between the processors. As a further option, the sub-modules of FIG. 2 may also be implemented utilizing a hybrid configuration, in which certain modular functions are performed utilizing dedicated hardware, while the remaining modular functions are performed utilizing an off-the-shelf PC and/or the like. The sub-modules also may be implemented as software modules within a processing unit.

The operations of the sub-modules illustrated in FIG. 2 may be controlled by a local ultrasound controller 50 or by the processor module 36. The sub-modules 52-68 perform mid-processor operations. The ultrasound processor module 36 may receive ultrasound data 70 in one of several forms. In the embodiment of FIG. 2, for example, the received ultrasound data 70 constitutes IQ data pairs representing the real and imaginary components associated with each data sample. The IQ data pairs are provided, for example, to one or more of a color-flow sub-module 52, a power Doppler sub-module 54, a B-mode sub-module 56, a spectral Doppler sub-module 58, and an M-mode sub-module 60. Other sub-modules may also be included, such as an Acoustic Radiation Force Impulse (ARFI) sub-module 62, a strain sub-module 64, a strain rate sub-module 66, a Tissue Doppler (TDE) sub-module 68, among others.

Each of the sub-modules 52-68 is configured to process the IQ data pairs in a corresponding manner to generate color-flow data 72, power Doppler data 74, B-mode data 76, spectral Doppler data 78, M-mode data 80, ARFI data 82, echocardiographic strain data 84, echocardiographic strain rate data 86, and tissue Doppler data 88, all of which may be stored in a memory 90 (or memory 34 or image memory 40 shown in FIG. 1) temporarily before subsequent processing. The data 72-88 may be stored, for example, as sets of vector data values, where each set defines an individual ultrasound image frame. The vector data values are generally organized based on the polar coordinate system.

A scan converter sub-module 92 accesses and obtains from the memory 90 the vector data values associated with an image frame and converts the set of vector data values to Cartesian coordinates to generate an ultrasound image frame 93 formatted for display. The ultrasound image frames 93 generated by the scan converter sub-module 92 may be provided back to the memory 90 for subsequent processing or may be provided to the memory 34 or image memory 40.
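The polar-to-Cartesian conversion performed by the scan converter can be sketched as follows. This is a minimal illustration under simplifying assumptions (nearest-neighbour lookup, a sector probe with the apex at the top centre of the image); the function name, grid sizes, and all parameters are hypothetical, not the sub-module's actual implementation:

```python
import numpy as np

def scan_convert(vectors, r_max, angles, nx=200, nz=200):
    """Nearest-neighbour scan conversion of a sector of ultrasound vectors.

    vectors: 2-D array [n_beams, n_samples] of data stored in polar form
    (beam angle x range); angles: beam steering angles in radians;
    r_max: maximum imaging depth. Returns a Cartesian image; pixels
    outside the scanned sector are left at zero.
    """
    n_beams, n_samples = vectors.shape
    xs = np.linspace(-r_max, r_max, nx)
    zs = np.linspace(0.0, r_max, nz)
    x, z = np.meshgrid(xs, zs)
    r = np.hypot(x, z)                    # range of each output pixel
    th = np.arctan2(x, z)                 # angle from the central beam axis
    beam = np.round(np.interp(th, angles, np.arange(n_beams))).astype(int)
    samp = np.round(r / r_max * (n_samples - 1)).astype(int)
    valid = (r <= r_max) & (th >= angles[0]) & (th <= angles[-1])
    img = np.zeros((nz, nx))
    img[valid] = vectors[beam[valid], samp[valid]]
    return img
```

A production scan converter would typically use bilinear interpolation between neighbouring beams and range samples rather than nearest-neighbour lookup; the geometry, however, is the same.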

Once the scan converter sub-module 92 generates the ultrasound image frames 93 associated with the data, the image frames may be re-stored in the memory 90 or communicated over a bus 96 to a database (not shown), the memory 34, the image memory 40, and/or to other processors (not shown).

A 2D video processor sub-module 94 may be used to combine one or more of the frames generated from the different types of ultrasound information. For example, the 2D video processor sub-module 94 may combine different image frames by mapping one type of data to a gray map and mapping the other type of data to a color map for video display. In the final displayed image, the color pixel data is superimposed on the gray scale pixel data to form a single multi-mode image frame 98 that is again re-stored in the memory 90 or communicated over the bus 96. Successive frames of images may be stored as a cine loop in the memory 90 or memory 40 (shown in FIG. 1). The cine loop represents a first in, first out circular image buffer to capture image data that is displayed in real-time to the user, such as one or more heart cycles. The user may freeze the cine loop by entering a freeze command at the user interface 42. The user interface 42 may include, for example, a keyboard, mouse, trackball, and/or other input controls associated with inputting information into the ultrasound system 20 (shown in FIG. 1), which input controls may be reconfigured automatically based on selection of a virtual display element 49 (shown in FIG. 1) by the user.
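The first-in, first-out circular image buffer behind the cine loop can be sketched as a small class. This is an illustrative data structure only, with hypothetical names; the actual buffer implementation is not described in the text:

```python
class CineLoop:
    """First-in, first-out circular image buffer, as used for a cine loop.

    Holds the most recent `capacity` frames; once full, each new frame
    overwrites the oldest one. freeze() returns the retained frames in
    acquisition order so the operator can review, e.g., the last heart
    cycles.
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.frames = [None] * capacity
        self.head = 0            # next write position
        self.count = 0           # frames stored so far, capped at capacity

    def push(self, frame):
        self.frames[self.head] = frame
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def freeze(self):
        if self.count < self.capacity:
            return self.frames[:self.count]
        # once wrapped, the oldest frame sits at the write position
        return self.frames[self.head:] + self.frames[:self.head]
```

For example, pushing six frames into a four-frame loop and freezing it yields the last four frames in the order they were acquired.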

A 3D processor sub-module 100 is also controlled by the user interface 42 and accesses the memory 90 to obtain spatially consecutive groups of ultrasound image frames (that may be acquired, for example, by a sweeping ultrasound scan) and to generate three dimensional image representations thereof, such as through volume rendering or surface rendering algorithms, as are known. The three-dimensional images may be generated utilizing various imaging techniques, such as ray-casting, maximum intensity pixel projection, and/or the like. Additionally, the three-dimensional images may be displayed over time, thereby providing four-dimensional operation, as is known.
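Two of the projection techniques named above, maximum intensity pixel projection and a simple ray-cast compositing, can be sketched briefly. These are simplified stand-ins, not the 3D processor sub-module's actual algorithms, and the function names and the uniform opacity value are hypothetical:

```python
import numpy as np

def max_intensity_projection(volume, axis=0):
    """Maximum-intensity-pixel projection of an ultrasound volume.

    volume: 3-D array of voxel intensities (e.g., spatially consecutive
    scan planes stacked along `axis`). Each output pixel keeps the
    brightest voxel encountered along the viewing direction.
    """
    return np.asarray(volume).max(axis=axis)

def composite(volume, alpha=0.05, axis=0):
    """Front-to-back alpha compositing along the viewing axis.

    Each plane contributes its intensity weighted by a uniform opacity
    `alpha` and the transparency accumulated in front of it, a minimal
    form of the ray-casting used for volume rendering.
    """
    vol = np.moveaxis(np.asarray(volume, dtype=float), axis, 0)
    out = np.zeros(vol.shape[1:])
    trans = np.ones(vol.shape[1:])   # accumulated transparency per ray
    for plane in vol:
        out += trans * alpha * plane
        trans *= (1.0 - alpha)
    return out
```

Real renderers add gradient-based shading, per-voxel opacity transfer functions, and arbitrary view directions, but the accumulation loop is the core of the technique.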

Various embodiments of the inventive arrangements can also be implemented in a miniaturized ultrasound imaging system, for example, a portable ultrasound imaging system 110 as shown in FIG. 3. The portable ultrasound imaging system 110 may be, for example, a Voluson i compact 4D ultrasound system available from G.E. Healthcare in Waukesha, Wis. The portable ultrasound imaging system 110 controls a probe (not shown) connected to the portable ultrasound imaging system 110 via a probe connector 112 that may be locked to the portable ultrasound imaging system 110 using a probe locking handle 114. The user interface 42 includes a plurality of user inputs and/or controls, which may be of different types, and are configured to receive commands from a user or operator. For example, the user interface 42 may include a plurality of “soft” buttons 116, for example, toggle buttons and a keyboard 118, for example, an alphanumeric keyboard. Additionally, a functional keyboard portion 120 may be provided that includes other user selectable buttons and controls. Other user controls also may be provided, such as a trackball 122 having a trackball ring 124 and a plurality of associated buttons 126, which may be activated by the fingers of a user when operating the trackball 122. A plurality of sliding control members 128 (e.g., time control gain potentiometers) may also be provided, for example, adjacent the keyboard 118.

The portable ultrasound imaging system 110 also includes a display 130, for example, an integrated LCD display with a display latch 132 provided to lock the display 130 to the user interface 42. A power button 134 is provided to power on and off the portable ultrasound imaging system 110. The portable ultrasound imaging system 110, with the user interface 42 and the display 130, defines a portable control unit.

It should be noted that as used herein, “miniaturized” generally means that the ultrasound system 110 is a handheld or hand-carried device and/or is configured to be carried in a person's hand, pocket, briefcase-sized case, backpack, and/or the like. For example, the ultrasound system 110 may be a hand-carried device having a size of a typical laptop computer, for instance, having dimensions of approximately 2.5 inches in depth, approximately 14 inches in width, and approximately 12 inches in height. The ultrasound system 110 may weigh about ten pounds or less, and is thus easily portable by the operator. The display 130 is configured to display, for example, a medical image and virtual display elements, as described below.

It further should be noted that ultrasonic data from the portable ultrasound imaging system 110 may be sent to an external device (not shown), such as a printer or display, via a wired or wireless network (or direct connection, for example, via a serial or parallel cable or USB port). In some embodiments, the external device may be a computer or a workstation having a display. Alternatively, the external device may be a separate external display or a printer capable of receiving image data from the portable ultrasound imaging system 110 and of displaying or printing images that may have greater resolution than the display 130.

With particular reference to the user interface 42, and as shown in more detail in FIG. 4, a plurality of user controls may be provided as part of the user interface 42. For example, the “soft” buttons 116 may include a first menu button 140, a second menu button 142, a third menu button 144, and a fourth menu button 146, each capable of movement in four directions. A plurality of imaging buttons 148 may also be provided to select different imaging functions or operations. A plurality of mode selection buttons 136 also may be provided to select different scanning modes, for example, 2D, 4D, pulsed wave Doppler (PW), color flow mode (CFM), etc. The functional keyboard portion 120 also includes other user selectable buttons and controls, such as buttons that allow for obtaining saved information, storing information, manipulating information or displayed images, calculating measurements relating to displayed images, changing a display format, etc.

The portable ultrasound imaging system 110 also includes internal and external connections on a back end 160 as shown in FIG. 5 and on a side portion 170 as shown in FIG. 6. For example, the back end 160 may include a VGA connector 162 (for connection, for example, to an external monitor), an RGB connector 164 (for connection, for example, to a printer) and a power supply input 166. A network connector 168, for example, an Ethernet LAN input/output also may be provided, and one or more USB connectors 169 may be provided. On the side portion 170, for example, a probe connection 172 for connection to a probe may be provided, as well as the probe locking handle 114. It should be noted that different or additional connectors may be provided as desired or known, for example, based on the scanning applications for the portable ultrasound imaging system 110.

The portable ultrasound imaging system 110 also may be transported, stored, or operated in a case 180, as shown in FIG. 7. The case 180 may be, for example, a padded case to protect the portable ultrasound imaging system 110.

The portable ultrasound imaging system 110 also may be configured for mounting to or to be supported by a movable base 190, for example, a movable cart as shown in FIG. 8. The movable base 190 includes a support portion 192 for receiving and supporting the portable ultrasound imaging system 110 and a tray portion 194 that may be used, for example, to store peripherals. The movable base 190 also may include one or more probe holders 196 for supporting and holding therein one or more ultrasound probes, for example, one probe connected to the portable ultrasound imaging system 110 and other probes configured to be connected to the portable ultrasound imaging system 110. A foot rest 198 also may be provided. Accordingly, the portable ultrasound imaging system 110 may be configured to appear like a console-based type ultrasound imaging system.

However, it should be noted that the various embodiments may be implemented in connection with ultrasound systems having different sizes and shapes. For example, a hand carried or pocket-sized ultrasound imaging system 200 may be provided as shown in FIG. 9. In such a system 200, the display 130 and user interface 42 can form a single unit. By way of example, the pocket-sized ultrasound imaging system 200 may be a pocket-sized or hand-sized ultrasound system approximately 2 inches wide, approximately 4 inches in length, and approximately ½ inch in depth and/or weigh less than 3 ounces. The display 130 may be, for example, a 320×320 pixel color LCD display (on which a medical image 210 can be displayed). A typewriter-like keyboard 202 of buttons 203 may optionally be included in the user interface 42. It should be noted that the various embodiments may be implemented in connection with a pocket-sized ultrasound system 200 having different dimensions, weights, and/or power consumption.

Multi-function controls 204 may each be assigned functions in accordance with the mode of system operation. Therefore, each of the multi-function controls 204 may be configured to provide a plurality of different actions. Label display areas 206 associated with the multi-function controls 204 may be included as necessary on the display 130. The system 200 may also have additional keys and/or controls 208 for special purpose functions, which may include, but are not limited to “freeze,” “depth control,” “gain control,” “color-mode,” “print,” and “store.”

Various embodiments of the inventive arrangements provide virtual display elements (e.g., display icons) that are selectable by a user to change the function controlled by a particular user control. The selection of the virtual display elements reconfigures one or more of the user controls for controlling certain parameters, settings, etc. based on the selected virtual display element. In general, and as shown in FIG. 10, a user is presented with a plurality of virtual display elements 220a-220e that may be displayed, for example, on a screen 222, such as the display 38 (shown in FIG. 1). The virtual display elements 220a-220e are displayed on the screen 222 adjacent (e.g., surrounding) or proximate an image that is selected by a user. For example, when a user places a virtual pointer 224 (e.g., virtual cross-hairs) over a particular image 225, the virtual display elements 220a-220e are displayed on the screen 222. It should be noted that the virtual display elements 220a-220e disappear once the image 225 is no longer selected, for example, when the virtual pointer 224 is moved away from the image 225, when another image 226 is selected or the virtual pointer 224 is moved over that image, or when another user control member is activated (e.g., depressed). However, the virtual display elements 220a-220e may continue to be displayed in connection with the image 225 for a predetermined period of time (e.g., 2 seconds) even after the image 225 is no longer selected.
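The show-on-hover behavior with a linger period, as described above, can be sketched as a small state holder. This is an illustrative model only; the class and method names are hypothetical and the text does not describe an implementation:

```python
class ElementOverlay:
    """Tracks whether the virtual display elements should be visible.

    The elements are shown while the pointer is over an image and remain
    visible for `linger` seconds (e.g., 2 seconds) after the pointer
    leaves or the image is deselected.
    """
    def __init__(self, linger=2.0):
        self.linger = linger
        self.over = False      # pointer currently over the image?
        self.left_at = None    # time the pointer last left the image

    def pointer(self, over_image, now):
        if over_image:
            self.over, self.left_at = True, None
        elif self.over:
            # record the moment of leaving; later calls must not reset it
            self.over, self.left_at = False, now

    def visible(self, now):
        if self.over:
            return True
        return self.left_at is not None and now - self.left_at < self.linger
```

With the default two-second linger, the elements stay on screen for 1.5 seconds after deselection but are hidden by 3.5 seconds.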

With the virtual display elements 220a-220e displayed on the screen 222, a user may select one of the virtual display elements 220a-220e. Upon selecting one of the virtual display elements 220a-220e, the corresponding function represented by that virtual display element 220a-220e is now adjusted or controlled by one of the controls of the user interface 42 (shown in FIG. 4), for example, the trackball 122. Accordingly, when a virtual display element 220a-220e is selected, the operation of the trackball 122 is reconfigured and the control thereof remapped, for example, as shown in Table 1 below.

TABLE 1

Screen Ctrl Icon       Meaning
Rot X                  Rotate around X-axis when in ref image A (respective axis in B, C, 3D).
Rot Y                  Rotate around Y-axis when in ref image A (respective axis in B, C, 3D).
Rot Z                  Rotate around Z-axis when in ref image A (respective axis in B, C, 3D).
Parallel Shift         Shift in Z-direction.
Curved Render Start    Move curved render start.
Move                   Move the data around.
Borders of Renderbox   Can be selected to resize the Renderbox.
Home3D                 Switch 3D rendered image back to initial 3D position.
Home3Dflat             Toggle between 3D view and flat 3D view of the rendered image.

Accordingly, and for example, if a particular virtual display element 220a is selected, which may be done by using the trackball 122 to move the virtual pointer 224 over the virtual display element 220a and pressing one of the buttons 126 (shown in FIG. 4), then the trackball 122 is reconfigured to control or adjust the parameter, function, etc. corresponding to that virtual display element 220a, which, in the embodiment shown in Table 1, is to control rotation around the X-axis of the image 225. Thus, the operation of the trackball 122 is reconfigured to control or adjust the X-axis rotation. A user may then click one of the buttons 126 to return the trackball 122 to controlling movement of the virtual pointer 224 and allowing selection of one of the other virtual display elements 220b-220e. Alternatively, another one of the buttons of the user interface 42 may deselect the operation corresponding to a virtual display element 220a and allow the selection of one of the other virtual display elements 220b-220e.
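The remapping of a single physical control onto the functions of Table 1 can be sketched as a lookup of motion handlers keyed by the selected icon. All names here are hypothetical, and only a few of the Table 1 functions are modeled; this is an illustrative sketch of the reconfiguration idea, not the system's control software:

```python
# Map a selected screen control icon to a handler that interprets
# trackball motion (dx, dy) for that function (subset of Table 1).
HANDLERS = {
    "Rot X": lambda state, dx, dy: state.update(rot_x=state["rot_x"] + dy),
    "Rot Y": lambda state, dx, dy: state.update(rot_y=state["rot_y"] + dx),
    "Parallel Shift": lambda state, dx, dy: state.update(z=state["z"] + dy),
}

class Trackball:
    """One physical control whose meaning follows the selected element."""
    def __init__(self, state):
        self.state = state
        self.handler = None            # default mode: moves the pointer

    def select_element(self, icon):
        self.handler = HANDLERS[icon]  # reconfigure on icon selection

    def deselect(self):
        self.handler = None            # back to pointer movement

    def move(self, dx, dy):
        if self.handler:
            self.handler(self.state, dx, dy)
```

For instance, after selecting “Rot X” the same trackball motion that previously moved the pointer now accumulates into the X-axis rotation of the rendered image, and selecting “Parallel Shift” redirects it to the Z-direction shift.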

It should be noted that the virtual display elements 220a-220e may be configured as different icons and correspond to different functions or operations than those illustrated in Table 1. It also should be noted that the selection of one of the virtual display elements 220a-220e may, instead of reconfiguring the trackball 122, reconfigure another user control of the user interface 42 or an external user control (e.g., a connected mouse).

Moreover, other information or selectable elements may be displayed on the screen 222. For example, a plurality of selectable elements 230 may be provided to allow for the selection of a particular visualization mode.

Referring now to FIGS. 11-20, which illustrate exemplary screenshots 232 including the virtual display elements 220a-220e, a render visualization mode (which may be selected using the selectable elements 230) for a 4D realtime acquisition is shown. Specifically, as shown in FIG. 11, the virtual pointer 224, illustrated as a mouse pointer, is moved, for example, using the trackball 122 (shown in FIG. 4), over a side 240 of a render box 242 (identifying the region of the image 244 to be rendered). The side 240 may be highlighted (e.g., highlighted by a color) when the virtual pointer 224 is placed over the side 240. When the side 240 is selected, the virtual display elements 220a-220d and 220f (e.g., icons) are displayed. It should be noted that the virtual display element 220e is not displayed in this screenshot, but it may be displayed in some embodiments.

As shown in FIG. 12, the virtual pointer 224 has now been placed over the virtual display element 220f (the icon shaped as a dot), which corresponds to a curved render start function. When the virtual pointer 224 is moved over the virtual display element 220f, the virtual display element 220f may be highlighted (e.g., highlighted or shadowed in yellow). Once the virtual display element 220f is selected, the trackball 122 is reconfigured to adjust the curved render start function as shown in FIG. 13, the virtual display element 220f may be highlighted differently (e.g., highlighted in a different color, such as red), and a curved render start portion 244 of the render box 242 is displayed. It should be noted that once the virtual display element 220f is selected, the other virtual display elements 220a-220d disappear, and when the trackball 122 is moved, the curved render start portion 244 is changed, for example, curved as adjusted by the trackball 122 instead of straight as shown in FIG. 12. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the other virtual display elements 220a-220d appear again.
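The highlight and visibility behavior described above (hover highlighting, a distinct selected color, and hiding of the other elements while one is selected) may be sketched as follows, for purposes of illustration only; the colors are examples from the text and the function names are assumptions:

```python
# Sketch of the highlight/visibility behavior of the virtual display
# elements. Colors follow the examples in the text; the API is hypothetical.

def element_color(hovered, selected):
    """Color of a virtual display element given its interaction state."""
    if selected:
        return "red"      # selected: highlighted in a distinct color
    if hovered:
        return "yellow"   # hovered: highlighted or shadowed
    return "default"

def visible_elements(all_elements, selected):
    """While one element is selected, the others disappear."""
    return [e for e in all_elements if selected is None or e == selected]
```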

It also should be noted that a virtual representation 246 of the trackball 122 may be displayed on the display 130 and indicate the functions corresponding to the trackball 122 and the buttons 126 in the current active display mode.

As shown in FIG. 14, another side 248 (or border) of the render box 242 may be selected, which reconfigures the functionality of the trackball 122 to allow adjustment of the size of the render box 242. The side 248 may be highlighted (e.g., highlighted in red) and all of the virtual display elements 220a-220d and 220f disappear. When the trackball 122 is now moved, the size of the render box 242 is changed. For example, the render box 242 is smaller in FIG. 14 than in FIGS. 11-13. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220a-220d and 220f appear again.

In the screenshot 232 of FIG. 15, the virtual display element 220a has been selected and the other virtual display elements 220b-220d and 220f have disappeared. The selected virtual display element 220a corresponds to rotation about the y-axis, which now may be adjusted by the trackball 122, which is reconfigured to control this operation. The selected virtual display element 220a may be highlighted (e.g., highlighted in red) and when the trackball 122 is moved, the displayed volume data is rotated about the y-axis, with the content of the displayed images 244 and 245 changing accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220b-220d and 220f appear again.
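For purposes of illustration only, the rotation adjustment described above may be sketched as mapping trackball motion to a rotation angle and applying a standard y-axis rotation to the volume coordinates; the gain value and function names are illustrative assumptions:

```python
import math

def trackball_to_angle(dx, sensitivity=0.01):
    """Map horizontal trackball motion to a rotation angle (assumed gain)."""
    return dx * sensitivity

def rotate_about_y(point, angle_rad):
    """Rotate a 3D point about the y-axis (a sketch of the Rot Y function)."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)
```

Applying `rotate_about_y` to every voxel (or to the view transform) as the trackball moves would produce the continuous rotation of the displayed volume data described above.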

In FIG. 16, the virtual pointer 224 has been moved over the image 245. When the virtual pointer 224 is moved over the image 245, a different set of virtual display elements 220a-220d and now 220g appears. In particular, the virtual display element 220g now appears and is configured as a "house" icon. It should be noted that a render box 241 may now appear on the image 245. When selected, the virtual display element 220g changes the 3D display. In particular, as shown in FIG. 17, the rendered image, specifically, the image 245, is now displayed at an angle, the render box 241 is displayed as a three-dimensional box, and the virtual display element 220g changes shape, for example, the "house" icon is rotated. If the virtual display element 220g is selected again, the image 245 will again appear as shown in FIG. 16 and the virtual display element 220g will return to the "house" icon shape.

FIG. 18 illustrates a quad-view mode in which the visualization mode now shows sectional planes. The virtual pointer 224 is now shown as moved over a center dot 252 in the image 250 and the dot is marked, for example, with a marker 254, such as cross-hairs that may be highlighted, for example, in yellow. The user may then select the marker 254, which may, for example, change color to red, and a move center dot function is then assigned to the trackball 122. All of the other virtual display elements 220a-220d also disappear. When the trackball 122 is moved, the center dot 252 is moved and the content of the image 250, as well as the images 260 and 262, changes accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220a-220d appear again.

As shown in FIG. 20, the virtual pointer 224 has been moved over the image 260 and not over any of the virtual display elements 220a-220d. The virtual pointer 224 now has a different shape, for example, a hand instead of an arrow or pointer. In this mode, if one of the buttons 126 is selected (e.g., pressed by a user), a move image functionality is selected and assigned to the trackball 122. When the trackball 122 is moved, the image 260 is moved on the display.

Thus, a single user control member can be used to manipulate, for example, 3D or 4D ultrasound data. For example, by assigning different operations to the single user control member based on selecting from a plurality of virtual display elements, the single user control member is reconfigured to control different operations or adjust different settings, parameters, etc.

It should be noted that some (or all) of the virtual display elements 220a-220g may be displayed in different imaging modes, for example, a tomographic ultrasound image (TUI) mode or a SonoVCAD mode. However, different virtual display elements corresponding to different operations or functions may be displayed in addition to or instead of some or all of the virtual display elements 220a-220g. Also, it should be noted that in some modes, only a specific image or images can be adjusted and, accordingly, the virtual display elements appear only when the virtual pointer 224 is moved over those images. It also should be noted that only a single image may be displayed instead of the multiple images as illustrated.

Accordingly, the various embodiments automatically reconfigure the operation of a user control member (e.g., a trackball) based on a selected virtual display element such that the control operations performed by the user control member are remapped. The user control member is thereby used to adjust or control different functions based on the virtual display element selected. In one embodiment, based on the selected virtual display element corresponding to a particular function, setting, parameter, etc., the movement of the user control member is remapped to, for example, allow the particular function, setting, parameter, etc. to be adjusted or changed based on the movement of the user control member that has been remapped. For example, a table or database is accessed and the corresponding motion of the user control member is mapped for the particular function, setting, parameter, etc. Thereafter, the relative movement of the user control member adjusts the particular function, setting, parameter, etc. corresponding to the selected virtual display element.
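The table or database lookup described above may, for purposes of illustration, record for each function which trackball axis drives it and a gain; relative trackball motion is then translated into an adjustment of the selected function. The concrete entries and names below are assumptions for illustration only:

```python
# Minimal sketch of the motion-mapping table the passage describes:
# for each function, the table records which trackball axis drives it
# and an adjustment gain. The entries are illustrative assumptions.

MOTION_MAP = {
    "rotate_x":       {"axis": "y", "gain": 0.5},
    "rotate_y":       {"axis": "x", "gain": 0.5},
    "parallel_shift": {"axis": "y", "gain": 1.0},
}

def remapped_delta(function_name, dx, dy):
    """Translate raw trackball motion into an adjustment of the function
    selected via a virtual display element, using the lookup table."""
    entry = MOTION_MAP[function_name]
    raw = dx if entry["axis"] == "x" else dy
    return raw * entry["gain"]
```

Storing the mapping in a table, rather than hard-coding it per mode, is what allows the same physical control member to be remapped on the fly as different virtual display elements are selected.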

At least one technical effect of the various embodiments of the inventive arrangements is automatically changing the control function or operation of a user control member based on the selection of a virtual display element. The user control member is reconfigured or reassigned to control or adjust a different operation or function based on the selected virtual display element.

Some embodiments of the inventive arrangements provide a machine-readable medium or media having instructions recorded thereon for a processor or computer to operate an imaging apparatus to perform one or more embodiments of the methods described herein. The medium or media may be any type of CD-ROM, DVD, floppy disk, hard disk, optical disk, flash RAM drive, and/or other type of computer-readable medium, and/or a combination thereof.

The various embodiments and/or components, for example, the processors, or components and controllers therein, may also be implemented as part of one or more computers or processors. Such a computer or processor may include a computing device, an input device, a display unit, and/or an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and/or Read Only Memory (ROM). The computer or processor may further include a storage device, which may be a hard disk drive or a removable storage drive, such as a floppy disk drive, optical disk drive, and/or the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.

As used herein, the term “computer” may include any processor-based or microprocessor-based system, including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only and thus not intended to limit in any way the definition and/or meaning of the term “computer.”

The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired and/or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.

The set of instructions may include various commands that instruct the computer or processor as a processing machine to perform specific operations, such as the methods and processes of the various embodiments of the inventive arrangements. The set of instructions may be in the form of a software program. The software may be in various forms, such as system software or application software. In addition, the software may be in the form of a collection of separate programs, a program module within a larger program, or a portion of a program module. The software may also include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to user commands, or in response to results of previous processing, or in response to a request made by another processing machine.

As used herein, the terms “software” and “firmware” are interchangeable and include any computer program stored in memory for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and/or non-volatile RAM (NVRAM) memory. The above memory types are exemplary only and are thus not limiting as to the types of memory usable for storage of a computer program.

It is to be understood that the above description is intended to be illustrative and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the inventive arrangements without departing from their scope. For example, the steps recited in a method need not be performed in a particular order unless explicitly stated or implicitly required (e.g., one step requires the results or a product of a previous step to be available). While some of the dimensions and types of materials described herein are intended to define the parameters of the inventive arrangements, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing and understanding the above description. The scope of the inventive arrangements should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein." Moreover, in the following claims, the terms "first," "second," and "third," etc. are used merely as labels, and they are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase "means for" followed by a statement of function void of further structure.

This written description uses examples to disclose the inventive arrangements, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices and/or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.

Claims

1. An ultrasound system, comprising:

a user interface having at least one user control member; and
a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display and wherein a function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.

2. The ultrasound system of claim 1, wherein the at least one user control member is configured to be operated to select one of the plurality of virtual display elements using the virtual pointer.

3. The ultrasound system of claim 1, wherein the at least one user control member comprises a trackball.

4. The ultrasound system of claim 1, wherein the display is configured to display only the selected one of the virtual display elements.

5. The ultrasound system of claim 4, wherein the display is configured to display the other display elements when one of (i) an adjustment using the at least one user control member is completed and (ii) a button corresponding to the user control member is activated.

6. The ultrasound system of claim 1, wherein the plurality of virtual display elements comprise icons representative of the corresponding controlled function.

7. The ultrasound system of claim 1, wherein the plurality of virtual display elements are displayed only when the virtual pointer is positioned over an image that can be changed.

8. The ultrasound system of claim 1, wherein the plurality of virtual display elements are changed based on one of a mode of operation and a mode of visualization.

9. The ultrasound system of claim 1, wherein the display automatically displays the plurality of virtual display elements when the virtual pointer is positioned over the image.

10. The ultrasound system of claim 1, wherein the virtual display elements are displayed adjacent the image.

11. The ultrasound system of claim 1, wherein the selected one of the plurality of virtual display elements is highlighted.

12. The ultrasound system of claim 1, wherein the function controlled by the at least one user control member comprises an adjustment.

13. The ultrasound system of claim 1, wherein an icon representing the selected one of the plurality of virtual display elements changes based on an input from the at least one user control member.

14. The ultrasound system of claim 1, wherein the display is configured to display a virtual representation of the at least one user control member along with at least one indicated function corresponding to at least one user control member and one or more buttons associated with the at least one user control member.

15. The ultrasound system of claim 1, further comprising:

a portable ultrasound unit including the user interface and the display.

16. An ultrasound system, comprising:

an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data; and
a portable control unit having a user interface and a display, the ultrasound volume probe connected to the portable control unit, and wherein manipulation of one of the 3D ultrasound data and 4D ultrasound data is provided by a single user control member of the user interface.

17. The ultrasound system of claim 16, wherein the single user control member comprises a trackball.

18. The ultrasound system of claim 16, wherein the display is configured to display a plurality of selectable virtual display elements and a type of manipulation provided by the single user control member is determined based on a selected one of the plurality of selectable virtual display elements.

19. The ultrasound system of claim 16, wherein the user interface does not include rotary controls and the manipulation is performed without the use of the rotary controls.

20. A method for controlling an ultrasound probe using a portable ultrasound system, comprising:

receiving a user input selecting one of a plurality of virtual display elements on a display on the portable ultrasound system; and
configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.
Patent History
Publication number: 20090012394
Type: Application
Filed: Apr 30, 2008
Publication Date: Jan 8, 2009
Applicant: GENERAL ELECTRIC COMPANY (Schenectady, NY)
Inventors: Petra Hobelsberger (Vocklabruck), Walter Duda (Obendorf)
Application Number: 12/112,946
Classifications
Current U.S. Class: Ultrasonic (600/437)
International Classification: A61B 8/00 (20060101);