Patents by Inventor Mark Morita

Mark Morita has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20080166070
    Abstract: Embodiments of the presently described technology provide a method for adapting a hanging protocol based on an efficiency of use. The method includes monitoring usage information for a hanging protocol, determining a productivity factor based on an efficiency of a first user during a reading of an imaging study, and recommending at least one of a hanging protocol selection and a hanging protocol change to a second user based on the productivity factor. The usage information includes at least one of a selection of a hanging protocol and a change to the hanging protocol by a first user during the reading of the imaging study.
    Type: Application
    Filed: January 4, 2007
    Publication date: July 10, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Murali Kumaran Kariathungal, Mark Morita, Prakash Mahesh
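
The workflow this abstract outlines — monitor protocol usage, derive a productivity factor, recommend a protocol to another reader — can be sketched in a few lines of Python. This is only an illustration of that flow, not the patented method; the `UsageRecord` fields and the scoring formula are assumptions.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class UsageRecord:
    """One reading session: the hanging protocol used, whether the reader
    changed the layout mid-read, and how long the read took."""
    protocol: str
    changed_layout: bool
    reading_seconds: float


class ProtocolRecommender:
    """Hypothetical recommender: score protocols by reading efficiency."""

    def __init__(self):
        self._records = defaultdict(list)  # protocol name -> list of UsageRecord

    def monitor(self, record: UsageRecord) -> None:
        self._records[record.protocol].append(record)

    def productivity_factor(self, protocol: str) -> float:
        # Assumed metric: faster reads and fewer layout changes => higher factor.
        records = self._records[protocol]
        if not records:
            return 0.0
        avg_seconds = sum(r.reading_seconds for r in records) / len(records)
        change_rate = sum(r.changed_layout for r in records) / len(records)
        return (1.0 / avg_seconds) * (1.0 - 0.5 * change_rate)

    def recommend(self):
        """Recommend the highest-scoring protocol to a second user."""
        scored = {p: self.productivity_factor(p) for p in self._records}
        return max(scored, key=scored.get) if scored else None


recommender = ProtocolRecommender()
recommender.monitor(UsageRecord("chest_2x2", changed_layout=False, reading_seconds=95))
recommender.monitor(UsageRecord("chest_1x1", changed_layout=True, reading_seconds=140))
print(recommender.recommend())  # -> chest_2x2
```
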
  • Publication number: 20080163070
    Abstract: Certain embodiments relate to providing automated, custom displays of images and tools for manipulating the images. Certain embodiments include a method for providing an automated user interface for healthcare workers. Certain embodiments of the method employ a network to execute the steps of retrieving an image from an image archive, processing the image to generate image-specific information, identifying data related to the image, mining the data related to the image, and displaying the image along with a custom selection of user interface tools.
    Type: Application
    Filed: January 3, 2007
    Publication date: July 3, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Prakash Mahesh, Murali Kariathungal, Mark Morita
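
As a rough illustration of the retrieve → process → mine → display pipeline described above (not the patented implementation), the following sketch uses an in-memory archive and a hypothetical modality-to-tool mapping.

```python
def retrieve_image(archive: dict, image_id: str) -> dict:
    """Retrieve an image record from a (toy, in-memory) image archive."""
    return archive[image_id]


def process_image(image: dict) -> dict:
    # Stand-in for processing that yields image-specific information.
    return {"modality": image["modality"], "body_part": image["body_part"]}


def mine_related_data(history: list, info: dict) -> list:
    # "Mining" here is just filtering prior records for the same body part.
    return [record for record in history if record["body_part"] == info["body_part"]]


TOOLS_BY_MODALITY = {  # hypothetical mapping of modality -> UI tools
    "CT": ["window_level", "mpr", "measure"],
    "XR": ["zoom", "invert", "annotate"],
}


def build_display(archive: dict, history: list, image_id: str) -> dict:
    image = retrieve_image(archive, image_id)
    info = process_image(image)
    related = mine_related_data(history, info)
    tools = TOOLS_BY_MODALITY.get(info["modality"], ["zoom"])
    return {"image": image_id, "related": related, "tools": tools}


archive = {"img-1": {"modality": "CT", "body_part": "chest"}}
history = [{"study": "prior CT chest", "body_part": "chest"}]
print(build_display(archive, history, "img-1"))
```
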
  • Publication number: 20080120576
    Abstract: Certain embodiments of the present invention provide methods and systems for hanging protocol generation using gesture recognition. Certain embodiments provide a method for creating a hanging protocol based on gesture input in a clinical environment. The method includes specifying a hanging protocol specification using gesture-based input. The method also includes translating the gesture-based input into a hanging protocol. The method further includes facilitating display of clinical information based on the hanging protocol. Certain embodiments provide a gesture detection system. The system includes a sensor surface configured to detect gesture-based input made on the sensor surface. The gesture-based input specifies a hanging protocol layout. The system also includes a processor configured to identify the gesture-based input and translate the gesture to a corresponding hanging protocol definition for display of image and clinical data.
    Type: Application
    Filed: November 22, 2006
    Publication date: May 22, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Murali Kumaran Kariathungal, Prakash Mahesh, Mark Morita, Stephen P. Roehm
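
The translation step described above (gesture input → hanging protocol definition) could look roughly like the following sketch; the gesture labels and layouts are hypothetical, and recognition of the gesture on the sensor surface is assumed to happen upstream.

```python
GESTURE_TO_LAYOUT = {  # hypothetical gesture vocabulary
    "horizontal_line": {"rows": 1, "cols": 2},  # side-by-side comparison
    "cross": {"rows": 2, "cols": 2},            # four-up layout
    "single_tap": {"rows": 1, "cols": 1},       # one large viewport
}


def translate_gesture(gesture_label: str) -> dict:
    """Translate a recognized gesture into a hanging protocol definition."""
    layout = GESTURE_TO_LAYOUT.get(gesture_label)
    if layout is None:
        raise ValueError(f"no hanging protocol mapped to gesture {gesture_label!r}")
    return {"layout": layout, "series_order": ["current", "prior"]}


print(translate_gesture("cross"))
```
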
  • Publication number: 20080120141
    Abstract: Certain embodiments of the present invention provide methods and systems for hanging protocol generation using eye tracking and/or voice command and control. Certain embodiments provide a method for creating a hanging protocol based on at least one of eye tracking and voice command and control input in a clinical environment. The method includes specifying a hanging protocol specification using input including at least one of eye tracking and voice command and control. The method also includes translating the input into a hanging protocol. The method further includes facilitating display of clinical information based on the hanging protocol.
    Type: Application
    Filed: May 16, 2007
    Publication date: May 22, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Murali Kumaran Kariathungal, Prakash Mahesh, Mark Morita, Stephen P. Roehm
  • Publication number: 20080120548
    Abstract: A system and method for processing data on user interactions with a workstation. The system comprises an information system that includes a data storage device. An audio microphone capable of capturing workstation user voice data is linked to the information system. An eye-tracking device capable of capturing workstation user eye-movement data is linked to the information system. A display screen capture routine capable of capturing video display data from a workstation display is linked to the information system. A user input capture routine capable of capturing input data entered into the workstation by the workstation user is linked to the information system. The voice data, eye-movement data, video display data and input data for the workstation user are captured simultaneously and the data are recorded on the data storage device with time information that allows synchronization of the data.
    Type: Application
    Filed: November 22, 2006
    Publication date: May 22, 2008
    Inventors: Mark Morita, Prakash Mahesh, Murali Kumaran Kariathungal, Jeffrey James Whipple, Denny Wingchung Lau, Eliot Lawrence Siegel, Khan Mohammad Siddiqui
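
The core idea above — capturing several input streams against one clock so they can be synchronized later — can be sketched as below. This is a minimal illustration, not the patented system; the stream names and the `Sample` structure are assumptions.

```python
import time
from dataclasses import dataclass


@dataclass
class Sample:
    """One captured sample from a single input stream, stamped with a shared clock."""
    stream: str        # "voice", "eye", "screen", or "input"
    timestamp: float   # seconds on a common monotonic clock
    payload: object


class SessionRecorder:
    """Hypothetical recorder: every stream writes to one log with a common time
    base, so the streams can be replayed in synchrony later."""

    def __init__(self):
        self.log = []

    def record(self, stream: str, payload: object) -> None:
        self.log.append(Sample(stream, time.monotonic(), payload))

    def synchronized(self):
        # Replaying the log in timestamp order lines the streams up again.
        return sorted(self.log, key=lambda sample: sample.timestamp)


recorder = SessionRecorder()
recorder.record("eye", {"x": 512, "y": 384})
recorder.record("voice", "show prior study")
recorder.record("input", {"key": "PageDown"})
for sample in recorder.synchronized():
    print(sample.stream, sample.payload)
```
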
  • Publication number: 20080120138
    Abstract: Certain embodiments of the present invention provide a method for automatic prioritizing and ranking of patients in a medical center. The method includes acquiring medical information associated with a patient, prioritizing and ranking the patient based on the acquired medical information, and routing the patient's medical information to an appropriate practitioner based on the priority and rank of the patient.
    Type: Application
    Filed: November 22, 2006
    Publication date: May 22, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Mark Morita, Prakash Mahesh, Murali Kumaran Kariathungal
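
A minimal sketch of the prioritize → rank → route flow described above follows; the acuity rules and the routing table are invented purely for illustration.

```python
PRACTITIONER_BY_PRIORITY = {  # invented routing table
    "stat": "on-call radiologist",
    "urgent": "staff radiologist",
    "routine": "general reading queue",
}


def prioritize(patient: dict) -> str:
    """Assign a priority from (hypothetical) acquired medical information."""
    if patient.get("trauma") or patient.get("heart_rate", 0) > 140:
        return "stat"
    if patient.get("inpatient"):
        return "urgent"
    return "routine"


def rank_and_route(patients: list) -> list:
    """Rank patients by priority and pair each with a destination practitioner."""
    order = {"stat": 0, "urgent": 1, "routine": 2}
    ranked = sorted(patients, key=lambda p: order[prioritize(p)])
    return [(p["name"], prioritize(p), PRACTITIONER_BY_PRIORITY[prioritize(p)])
            for p in ranked]


patients = [{"name": "A", "inpatient": True}, {"name": "B", "trauma": True}, {"name": "C"}]
for name, priority, destination in rank_and_route(patients):
    print(name, priority, "->", destination)
```
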
  • Publication number: 20080118119
    Abstract: Certain embodiments of the present invention provide a method for automatic prioritization and routing of exams for patients in a medical center based on image classification. The method includes capturing an image and digitizing it, automatically classifying the image based on its content, prioritizing the image based on its classification, and routing the image based on its prioritization and classification to the appropriate medical practitioner.
    Type: Application
    Filed: November 22, 2006
    Publication date: May 22, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Prakash Mahesh, Mark Morita, Murali Kumaran Kariathungal
  • Publication number: 20080114808
    Abstract: Various embodiments of the presently described invention provide a method for visually representing associations among data and/or events presented on one or more timelines. A user is provided with the ability to select a filter that can be used to determine a plurality of data/events that are associated or related to one another according to the rule(s) defined by the filter. Once the associated data/events are determined, the association among the data/events is graphically presented to the user.
    Type: Application
    Filed: November 10, 2006
    Publication date: May 15, 2008
    Inventors: Mark Morita, Prakash Mahesh, Murali Kumaran Kariathungal
  • Publication number: 20080114614
    Abstract: Certain embodiments of the present invention provide methods and systems for clinical workflow using gesture recognition. Certain embodiments provide a method for gesture-based interaction in a clinical environment. The method includes detecting a gesture made on a sensor surface. The method also includes determining a pressure applied to make the gesture. The method further includes mapping the gesture and the pressure to a healthcare application function. The pressure modifies the healthcare application function corresponding to the gesture. Certain embodiments provide a gesture detection system including a sensor surface configured to detect a gesture made. The system further includes a pressure sensor configured to detect a pressure applied when making the gesture on the sensor surface. The system also includes a processor configured to identify the gesture and translate the gesture to a healthcare application function.
    Type: Application
    Filed: November 15, 2006
    Publication date: May 15, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Prakash Mahesh, Mark Morita, Stephen P. Roehm, Murali Kumaran Kariathungal
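
The mapping described above, where a gesture selects a healthcare application function and the applied pressure modifies it, might be sketched as follows; the gesture vocabulary and the pressure-to-magnitude scaling are assumptions.

```python
GESTURE_TO_FUNCTION = {  # hypothetical gesture vocabulary
    "swipe_up": "scroll_series",
    "pinch": "zoom",
    "circle": "window_level",
}


def handle_gesture(gesture: str, pressure: float) -> dict:
    """Map a gesture and its applied pressure to an application function call."""
    function = GESTURE_TO_FUNCTION.get(gesture)
    if function is None:
        return {"function": None}
    # Assumed behavior: pressure in [0, 1] scales the magnitude of the function.
    clamped = max(0.0, min(pressure, 1.0))
    return {"function": function, "magnitude": round(1 + 9 * clamped, 1)}


print(handle_gesture("swipe_up", pressure=0.2))  # gentle scroll
print(handle_gesture("swipe_up", pressure=0.9))  # fast scroll
```
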
  • Publication number: 20080114615
    Abstract: Certain embodiments of the present invention provide methods and systems for improved clinical workflow using gesture recognition. Certain embodiments provide a method for gesture-based interaction in a clinical environment. The method includes detecting a gesture in a thin-air display space. The method also includes identifying the detected gesture. The method further includes translating the identified gesture to a corresponding healthcare application function. Certain embodiments provide a gesture detection system. The system includes a thin-air display space defined by at least one sensor configured to detect a gesture performed in the thin-air display space. The system also includes a processor configured to identify the gesture and translate the gesture to a corresponding healthcare application function.
    Type: Application
    Filed: November 15, 2006
    Publication date: May 15, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Prakash Mahesh, Mark Morita, Stephen P. Roehm, Murali Kumaran Kariathungal
  • Publication number: 20080104547
    Abstract: Application workflows can be improved using gesture recognition. Interpreting non-functional attributes of gestures, such as relative sizes and/or positions and/or locations, can indicate relative degrees of functionality of the gesture. Thus, gesture inputs trigger proportionate functionality at an application, whereby the gesture input can include a gesture component and at least one of a size component and/or a position component modifying the gesture component.
    Type: Application
    Filed: October 25, 2006
    Publication date: May 1, 2008
    Applicant: GENERAL ELECTRIC COMPANY
    Inventors: Mark Morita, Murali Kumaran Kariathungal, Steven Phillip Roehm, Prakash Mahesh
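
A toy illustration of the "proportionate functionality" idea above: the gesture names the action, its size scales the amount, and its position selects the target. The viewport selection rule is invented for the example.

```python
def interpret(gesture: str, size: float, position: tuple, viewports: list) -> dict:
    """gesture names the action; size (0.0-1.0, relative to the screen) scales the
    amount; the normalized (x, y) position selects which viewport is targeted."""
    column = min(int(position[0] * len(viewports)), len(viewports) - 1)
    return {
        "action": gesture,
        "amount": round(size * 100),   # e.g. a larger "zoom" gesture zooms further
        "target": viewports[column],
    }


viewports = ["current_study", "prior_study"]
print(interpret("zoom", size=0.25, position=(0.8, 0.5), viewports=viewports))
```
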
  • Publication number: 20070185730
    Abstract: Certain embodiments of the present invention provide a system for exam prioritization including a priority indicator and a database. The priority indicator is assigned a priority level selected from at least three available priority levels. The at least three available priority levels represent categories of patient acuity. The database is adapted to store an association of the priority indicator and a medical exam. In certain embodiments, the priority indicator is adapted to be dynamically adjusted.
    Type: Application
    Filed: February 6, 2006
    Publication date: August 9, 2007
    Inventors: Prakash Mahesh, Mark Morita
  • Publication number: 20070166672
    Abstract: Certain embodiments of the present invention provide a method for just-in-time training in software applications including tracking usage of an application by a user, determining a task the user is attempting to perform based at least in part on the tracked usage, and offering training content to the user based at least in part on the task. The training content is offered substantially when the user is attempting to perform the task.
    Type: Application
    Filed: January 3, 2006
    Publication date: July 19, 2007
    Inventors: Mark Morita, Prakash Mahesh
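
The track usage → infer task → offer training loop described above could be sketched as follows; the action names, inference rules, and training catalog are hypothetical.

```python
TRAINING_CONTENT = {  # invented training catalog
    "create_hanging_protocol": "Tutorial: saving a custom hanging protocol",
    "compare_priors": "Tip: link series to scroll current and prior together",
}


def infer_task(recent_actions: list):
    """Guess what the user is attempting from their tracked actions (toy rules)."""
    if recent_actions[-2:] == ["drag_viewport", "drag_viewport"]:
        return "create_hanging_protocol"
    if "load_prior" in recent_actions and "manual_sync_scroll" in recent_actions:
        return "compare_priors"
    return None


def offer_training(recent_actions: list):
    """Offer content for the inferred task, roughly when the user attempts it."""
    task = infer_task(recent_actions)
    return TRAINING_CONTENT.get(task) if task else None


print(offer_training(["open_layout_editor", "drag_viewport", "drag_viewport"]))
```
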
  • Publication number: 20070129626
    Abstract: Certain embodiments of the present invention provide a method for facilitating surgery including: tracking a position of at least a portion of a surgical implement in a volume of interest; recognizing a surgical plan corresponding to at least a portion of the volume of interest; and providing feedback based on a correspondence between the position of the at least a portion of the surgical implement and the surgical plan. In an embodiment, the feedback is provided in real-time. In an embodiment, the feedback includes at least one of: haptic feedback, thermal feedback, optical feedback, and auditory feedback. In an embodiment, the surgical plan includes a previously generated radiological image. In an embodiment, the surgical plan includes at least one trajectory for the at least a portion of the surgical implement.
    Type: Application
    Filed: November 23, 2005
    Publication date: June 7, 2007
    Inventors: Prakash Mahesh, Mark Morita
  • Publication number: 20070116336
    Abstract: Certain embodiments of the present invention provide a system for determining patient acuity including an analysis component, an acuity database, and a processing component. The analysis component is capable of generating analysis data based at least in part on a medical image. The acuity database is capable of associating the analysis data with an acuity value. The processing component is capable of generating an acuity level based at least in part on the acuity value.
    Type: Application
    Filed: November 22, 2005
    Publication date: May 24, 2007
    Inventors: Prakash Mahesh, Murali Kariathungal, Mark Morita
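
The chain described above (analysis data → acuity value → acuity level) is illustrated by this sketch; the findings, acuity table, and thresholds are assumptions, not values from the patent.

```python
ACUITY_TABLE = {  # hypothetical mapping of analysis finding -> acuity value
    "intracranial_hemorrhage": 10,
    "pneumothorax": 8,
    "nodule": 4,
    "no_finding": 1,
}


def analyze(image_metadata: dict) -> list:
    # Stand-in for an image-analysis component; here we read precomputed findings.
    return image_metadata.get("findings", ["no_finding"])


def acuity_level(image_metadata: dict) -> str:
    """Look up each finding's acuity value and derive an overall acuity level."""
    value = max(ACUITY_TABLE.get(finding, 1) for finding in analyze(image_metadata))
    if value >= 9:
        return "critical"
    if value >= 5:
        return "high"
    return "routine"


print(acuity_level({"findings": ["pneumothorax"]}))  # -> high
```
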
  • Publication number: 20070118101
    Abstract: Certain embodiments of the present invention provide a method for performing surgical planning including: providing for display a data set including a representation of a volume of interest of a patient; providing an interactive tool for use in conjunction with the data set; allowing an interaction with the interactive tool and a portion of the data set; and forming a prediction for an effect on a portion of the data set based at least in part on the interaction. In an embodiment, the method further includes selecting the interactive tool from a plurality of tool types. In an embodiment, the plurality of tool types includes a plurality of tool tips. In an embodiment, the providing an interactive tool for use in conjunction with the data set is performable automatically. In an embodiment, allowing an interaction with the interactive tool and a portion of the data set is performable automatically. In an embodiment, the prediction is storable for later retrieval.
    Type: Application
    Filed: November 23, 2005
    Publication date: May 24, 2007
    Inventors: Prakash Mahesh, Mark Morita
  • Publication number: 20070118401
    Abstract: Certain embodiments of the present invention provide a real-time healthcare business decision support system including a plurality of information sources, a processing component, and a user interface component. Each information source includes resource information for a resource in a healthcare environment. The healthcare environment includes a plurality of resources. The processing component aggregates resource information from the plurality of information sources. The processing component is capable of generating performance information based at least in part on the aggregated resource information in substantially real-time. The performance information corresponds at least in part to the performance of at least one of the plurality of resources. The user interface component is capable of displaying the performance information.
    Type: Application
    Filed: December 7, 2005
    Publication date: May 24, 2007
    Inventors: Prakash Mahesh, Mark Morita
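
A minimal sketch of the aggregation described above, assuming each information source can be polled as a dictionary of resource utilization figures; the performance metric and the 90% overload threshold are invented.

```python
def poll_sources(sources: list) -> dict:
    """Aggregate resource utilization reported by several information sources."""
    aggregated = {}
    for source in sources:
        for resource, utilization in source.items():
            aggregated.setdefault(resource, []).append(utilization)
    return aggregated


def performance(aggregated: dict) -> dict:
    """Hypothetical metric: mean utilization per resource, flagged if over 90%."""
    report = {}
    for resource, values in aggregated.items():
        mean = sum(values) / len(values)
        report[resource] = {"utilization": round(mean, 2), "overloaded": mean > 0.9}
    return report


sources = [{"CT-1": 0.95, "MR-1": 0.40}, {"CT-1": 0.92, "reading_room": 0.70}]
print(performance(poll_sources(sources)))
```
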
  • Publication number: 20070118100
    Abstract: Certain embodiments of the present invention provide methods and systems for improved tumor ablation. Certain embodiments include determining a distance between a tumor and at least one of a plurality of landmarks bounding the tumor in at least one acquired image of an area of interest for a patient, obtaining positional data for an ablation instrument, and displaying a position of the ablation instrument with respect to the tumor. Additionally, a position of the ablation instrument may be dynamically displayed on the fluoroscopic image and/or the at least one acquired image during tumor ablation. Furthermore, a location cursor on the fluoroscopic image may be linked with a location cursor on the one or more acquired images. In certain embodiments, a location of a tip of the ablation instrument may be dynamically displayed with respect to a starting location and an ending location of the tumor.
    Type: Application
    Filed: November 22, 2005
    Publication date: May 24, 2007
    Inventors: Prakash Mahesh, Mark Morita
  • Publication number: 20070118400
    Abstract: Certain embodiments of the present invention provide methods and systems for improved clinical workflow using gesture recognition. Certain embodiments include establishing a communication link between an interface and a remote system, and utilizing gesture input to transmit data to, retrieve data from, and/or trigger functionality at the remote system via the communication link. Additionally, the method may include using the gesture input to perform data acquisition, data retrieval, order entry, dictation, data analysis, image review, and/or image annotation, for example. In certain embodiments, a response from the remote system is displayed. In certain embodiments, the gesture input corresponds to a sequence of healthcare application commands for execution at the remote system. In certain embodiments, the interface includes a default translation between gestures and functionality.
    Type: Application
    Filed: November 22, 2005
    Publication date: May 24, 2007
    Inventors: Mark Morita, Steven Roehm
  • Publication number: 20070106501
    Abstract: Certain embodiments of the present invention provide a medical workflow system including a subvocal input device, an impulse processing component, and an information management system. The subvocal input device is capable of sensing nerve impulses in a user. The impulse processing component is in communication with the subvocal input device. The impulse processing component is capable of interpreting nerve impulses as dictation data and/or a command. The information management system is in communication with the impulse processing component. The information management system is capable of processing dictation data and/or a command from the impulse processing component.
    Type: Application
    Filed: November 7, 2005
    Publication date: May 10, 2007
    Inventors: Mark Morita, Prakash Mahesh, Thomas Gentles