Patents by Inventor Sundar Murugappan
Sundar Murugappan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20230093342
  Abstract: A facilitation system facilitates remote presentation of a physical world that includes a first object and an operating environment of the first object. The facilitation system includes a processing system configured to obtain an image frame depicting the physical world, identify a depiction of the first object in the image frame, and obtain a first spatial registration registering an object model with the first object in the physical world. The object model is of the first object. The processing system is further configured to obtain an updated object model corresponding to the object model updated with a current state of the first object, and generate a hybrid frame using the image frame, the first spatial registration, and the updated object model. The hybrid frame includes the image frame with the depiction of the first object replaced by a depiction of the updated object model.
  Type: Application
  Filed: March 30, 2021
  Publication date: March 23, 2023
  Applicant: Intuitive Surgical Operations, Inc.
  Inventors: Sundar Murugappan, Danilo Gasques Rodrigues, Govinda Payyavula, Simon DiMaio
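The compositing step this abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: plain nested lists stand in for image buffers, `object_mask` for the identified object pixels, and `rendered_model` for a rendering of the updated object model already placed via the spatial registration.

```python
def generate_hybrid_frame(image_frame, object_mask, rendered_model):
    """Composite the updated-model rendering over the object's masked pixels.

    Pixels where object_mask is True are taken from rendered_model;
    all other pixels are kept from the original image frame.
    """
    return [
        [rendered_model[r][c] if object_mask[r][c] else image_frame[r][c]
         for c in range(len(image_frame[0]))]
        for r in range(len(image_frame))
    ]
```

In practice the mask would come from the object-detection step and the rendering from a 3D renderer driven by the updated object model; only the final replacement is shown here.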
- Patent number: 10540898
  Abstract: A dynamic human machine interface system includes a mission commander application (MCA) unit including a control processor, the MCA active on one vehicle of a plurality of mission member vehicles, the MCA unit in communication with a data store, the control processor accessing executable instructions that cause the control processor to direct operations of components of the MCA unit, an alternate scenario evaluation unit accessing at least one of mission parameter records and flight member data records in the data store to recalculate mission parameters, a dynamic video interface unit to render the recalculated mission parameters on a mission control dashboard (MCD), the MCD presented to the mission commander on a display unit of the one vehicle, the MCD including a plurality of display pane areas selectable by a user interaction with an interactive interface, and each display area configurable by the user interaction to change content of the display pane.
  Type: Grant
  Filed: July 21, 2017
  Date of Patent: January 21, 2020
  Assignee: GENERAL ELECTRIC COMPANY
  Inventors: So Young Kim, Aaron Williams, Mark Snyder, Christopher Scott Sensing, Samuel Levulis, Jeffrey Robert Winters, Sundar Murugappan, Courtney Albers, Jennifer Ruth Cooper, Michael Eric Figard, Alexander Kaber Carroll
- Patent number: 10252815
  Abstract: A method of monitoring a cockpit of an aircraft includes receiving, by one or more controllers, an image depicting an operator manipulated input device located within the cockpit. The method can include determining, by the one or more controllers, an observed state of the operator manipulated input device. In particular, the observed state can be based on the image. The method can include determining, by the one or more controllers, a sensed state of the operator manipulated input device. In particular, the sensed state can be based on data from a sensor. The method can include determining, by the one or more controllers, a mismatch between the observed and sensed states of the operator manipulated input device.
  Type: Grant
  Filed: April 28, 2017
  Date of Patent: April 9, 2019
  Assignee: General Electric Company
  Inventors: Sundar Murugappan, Alexander Kaber Carroll, Norman Leonard Ovens, Sharon Ann Green, Jennifer Ruth Cooper, So Young Kim, Michael Eric Figard, Masaki Merritt Akiyama, Bernardo Pires
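The core check in this abstract, comparing a vision-derived state against a sensor-derived state, can be sketched as follows. The function name, the numeric tolerance, and the state encodings are illustrative assumptions; the patent does not specify them.

```python
def detect_mismatch(observed_state, sensed_state, tolerance=0.0):
    """Flag a mismatch between an observed state (inferred from a cockpit
    image) and a sensed state (reported by the input device's sensor).

    Numeric states (e.g. a lever angle) are compared within a tolerance;
    categorical states (e.g. "up"/"down") must match exactly.
    """
    if isinstance(observed_state, (int, float)) and isinstance(sensed_state, (int, float)):
        return abs(observed_state - sensed_state) > tolerance
    return observed_state != sensed_state

# A lever seen in the "down" position while its sensor reports "up"
# would be flagged to the one or more controllers.
```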
- Publication number: 20190057548
  Abstract: The example embodiments are directed to a system for self-learning augmented reality for use with industrial operations (e.g., manufacturing, assembly, repair, cleaning, inspection, etc.) performed by a user. For example, the method may include receiving data captured of the industrial operation being performed, identifying a current state of the manual industrial operation based on the received data, determining a future state of the manual industrial operation that will be performed by the user based on the current state, and generating one or more augmented reality (AR) display components based on the future state of the manual industrial operation, and outputting the one or more AR display components to an AR device of the user for display based on a scene of the manual industrial operation. The augmented reality display components can identify a future path of the manual industrial operation for the user.
  Type: Application
  Filed: August 16, 2017
  Publication date: February 21, 2019
  Inventors: Baljit SINGH, Zhiguang WANG, Jianbo YANG, Sundar MURUGAPPAN, Jason NICHOLS
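The current-state-to-future-state step in this abstract can be sketched with a transition table. In the patent this mapping would be learned from captured operation data; the hard-coded states and table below are purely illustrative assumptions.

```python
# Illustrative transition table: in a self-learning system this would be
# learned from data captured of past operations, not hard-coded.
STATE_TRANSITIONS = {
    "position_part": "insert_fasteners",
    "insert_fasteners": "torque_fasteners",
    "torque_fasteners": "inspect_joint",
}

def predict_future_state(current_state):
    """Look up the step most likely to follow the identified current state."""
    return STATE_TRANSITIONS.get(current_state)

def generate_ar_components(future_state):
    """Build AR display components that identify the predicted future path."""
    if future_state is None:
        return []
    return [{"type": "path_highlight", "target_step": future_state}]
```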
- Publication number: 20190027047
  Abstract: A dynamic human machine interface system includes a mission commander application (MCA) unit including a control processor, the MCA active on one vehicle of a plurality of mission member vehicles, the MCA unit in communication with a data store, the control processor accessing executable instructions that cause the control processor to direct operations of components of the MCA unit, an alternate scenario evaluation unit accessing at least one of mission parameter records and flight member data records in the data store to recalculate mission parameters, a dynamic video interface unit to render the recalculated mission parameters on a mission control dashboard (MCD), the MCD presented to the mission commander on a display unit of the one vehicle, the MCD including a plurality of display pane areas selectable by a user interaction with an interactive interface, and each display area configurable by the user interaction to change content of the display pane.
  Type: Application
  Filed: July 21, 2017
  Publication date: January 24, 2019
  Inventors: So Young KIM, Aaron WILLIAMS, Mark SNYDER, Christopher Scott SENSING, Samuel LEVULIS, Jeffrey Robert WINTERS, Sundar MURUGAPPAN, Courtney ALBERS, Jennifer Ruth COOPER, Michael Eric FIGARD, Alexander Kaber CARROLL
- Publication number: 20180312272
  Abstract: A method of monitoring a cockpit of an aircraft includes receiving, by one or more controllers, an image depicting an operator manipulated input device located within the cockpit. The method can include determining, by the one or more controllers, an observed state of the operator manipulated input device. In particular, the observed state can be based on the image. The method can include determining, by the one or more controllers, a sensed state of the operator manipulated input device. In particular, the sensed state can be based on data from a sensor. The method can include determining, by the one or more controllers, a mismatch between the observed and sensed states of the operator manipulated input device.
  Type: Application
  Filed: April 28, 2017
  Publication date: November 1, 2018
  Inventors: Sundar Murugappan, Alexander Kaber Carroll, Norman Leonard Ovens, Sharon Ann Green, Jennifer Ruth Cooper, So Young Kim, Michael Eric Figard, Masaki Merritt Akiyama, Bernardo Pires
- Publication number: 20180108178
  Abstract: A method for inspecting a component includes generating measurement data of the component, using a measurement device coupled to an optical marker device. The method further includes generating co-ordinate data of the measurement device, using the optical marker device and at least one camera. The method includes generating synchronized measurement data based on the measurement data and the co-ordinate data. The method further includes retrieving pre-stored data corresponding to the synchronized measurement data, from a database. The method also includes generating feedback data based on the pre-stored data and the synchronized measurement data, using an augmented reality technique. The method includes operating the measurement device based on the feedback data to perform one or more measurements to be acquired from the component.
  Type: Application
  Filed: October 13, 2016
  Publication date: April 19, 2018
  Inventors: Sundar Murugappan, Arvind Rangarajan, Joseph William Bolinger, Francesco Balsamo, Lorenzo Bianchi
- Publication number: 20170345318
  Abstract: A system is provided that includes a controller including one or more processors disposed onboard an aircraft. The controller is configured to be operably connected to multiple subsystems on the aircraft. The controller receives operating parameters from one or more of the subsystems during a flight of the aircraft. The controller is configured to analyze the operating parameters to determine an abnormal operating condition of the aircraft. The controller is further configured to transmit a display message to a display device onboard the aircraft. The display message provides multiple responsive actions to the abnormal operating condition. The responsive actions are prioritized on the display device to indicate to the flight crew that one or more of the responsive actions are recommended over one or more other responsive actions in the display message.
  Type: Application
  Filed: May 25, 2016
  Publication date: November 30, 2017
  Inventors: So Young Kim, Alexander Kaber Carroll, Norman Leonard Ovens, Sharon Ann Green, Jennifer Ruth Cooper, Michael Eric Figard, Sundar Murugappan, Boris Soliz, Masaki Merritt Akiyama
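The prioritization described in this abstract reduces to ranking candidate actions before display. A minimal sketch follows; the field names and the numeric priority scheme are assumptions for illustration, not the patent's method.

```python
def prioritize_actions(actions):
    """Order responsive actions so recommended ones (lower priority value)
    appear first in the display message."""
    return sorted(actions, key=lambda a: a["priority"])

# Hypothetical responsive actions to an abnormal operating condition:
display_message = prioritize_actions([
    {"action": "reduce altitude", "priority": 2},
    {"action": "engage backup pump", "priority": 1},
    {"action": "notify ATC", "priority": 3},
])
# The first entry is the recommended action shown to the flight crew.
```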
- Patent number: 9753546
  Abstract: A system and method for selective gesture interaction using spatial volumes is disclosed. The method includes processing data frames that each include one or more body point locations of a collaborating user that is interfacing with an application at each time interval, defining a spatial volume for each collaborating user based on the processed data frames, detecting a gesture performed by a first collaborating user based on the processed data frames, determining the gesture to be an input gesture performed by the first collaborating user in a first spatial volume, interpreting the input gesture based on a context of the first spatial volume that includes a role of the first collaborating user, a phase of the application, and an intersection volume between the first spatial volume and a second spatial volume for a second collaborating user, and providing an input command to the application based on the interpreted input gesture.
  Type: Grant
  Filed: August 29, 2014
  Date of Patent: September 5, 2017
  Assignee: General Electric Company
  Inventors: Habib Abi-Rached, Jeng-Weei Lin, Sundar Murugappan, Arnold Lund, Veeraraghavan Ramaswamy
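The spatial-volume gating in this abstract can be sketched with an axis-aligned bounding volume per user. The box construction, the margin, and the containment test are illustrative assumptions; the patent's volumes need not be axis-aligned boxes.

```python
from dataclasses import dataclass

@dataclass
class Volume:
    """Axis-aligned spatial volume for one collaborating user."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def contains(self, point):
        x, y, z = point
        return (self.x_min <= x <= self.x_max and
                self.y_min <= y <= self.y_max and
                self.z_min <= z <= self.z_max)

def volume_from_body_points(points, margin=0.2):
    """Bounding box around a user's tracked body points, padded by a margin."""
    xs, ys, zs = zip(*points)
    return Volume(min(xs) - margin, max(xs) + margin,
                  min(ys) - margin, max(ys) + margin,
                  min(zs) - margin, max(zs) + margin)

def is_input_gesture(gesture_point, user_volume):
    """Treat a gesture as input only if performed inside the user's own volume."""
    return user_volume.contains(gesture_point)
```

A gesture detected outside the first user's volume (or inside the intersection with another user's volume) could then be interpreted differently or ignored, per the context rules the abstract describes.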
- Publication number: 20170210484
  Abstract: Systems and methods for controlling operations specified by an operations checklist of an aircraft are provided. A method can include providing a user interface for display on a display screen. The user interface can include a checklist having one or more checklist items and can present an interactive virtual element in conjunction with each checklist item. Each checklist item and virtual element can be associated with a task to be performed for operation of an aircraft. The virtual elements can be visual representations of physical control interfaces associated with the task. The method can include receiving data indicative of a user interaction with at least one virtual element. In response, the method can further include sending one or more command signals to one or more aircraft control devices to perform at least a portion of the task associated with the at least one virtual element.
  Type: Application
  Filed: January 25, 2016
  Publication date: July 27, 2017
  Inventors: Michael Eric Figard, Alexander K. Carroll, Norman Leonard Ovens, Sharon Ann Green, Jennifer Cooper, So Young Kim, Sundar Murugappan, Boris A. Soliz, Kristian Thibault
- Publication number: 20160147433
  Abstract: A method and system for improving user interface efficiency through muscle memory and a radial menu are disclosed. A computer device stores a list of reference commands. The computer device receives a first input component from a user. The computer device then determines whether the first input component matches a first component of at least one reference command in the list of reference commands. In accordance with a determination that the first input component matches the first component of the at least one reference command in the list of reference commands, the computer device continues to monitor user input without displaying a radial menu. In accordance with a determination that the first input component does not match the first component of the at least one reference command in the list of reference commands, the computer device displays the radial menu to the user.
  Type: Application
  Filed: November 26, 2014
  Publication date: May 26, 2016
  Inventors: Jeng-Weei Lin, Sundar Murugappan, Jeong Eon Kim, Arnold Lund, Veeraraghavan Ramaswamy
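The matching logic in this abstract is essentially a prefix test against a stored command list: while the input so far still matches the start of some reference command, the expert user is allowed to complete the stroke from muscle memory; only on divergence is the radial menu shown. A sketch under those assumptions (the command encoding below is hypothetical):

```python
# Hypothetical reference commands, each a sequence of input components
# (e.g. stroke directions on a radial menu).
REFERENCE_COMMANDS = [
    ["up", "right"],    # e.g. "copy"
    ["up", "left"],     # e.g. "paste"
    ["down", "down"],   # e.g. "undo"
]

def should_display_radial_menu(input_components, reference_commands):
    """Return True when no reference command starts with the input so far."""
    for command in reference_commands:
        if command[:len(input_components)] == input_components:
            return False  # still a valid prefix: keep monitoring silently
    return True           # input diverged from every command: show the menu
```

A user who begins with "up" matches two commands, so the menu stays hidden; an input that matches nothing triggers the menu as a visual fallback.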
- Publication number: 20160113592
  Abstract: A system and method for acquisition setup and anatomy landmarking for MRI systems are described. A gesture sensing input device generates user motion data. A gesture application identifies a gesture based on the user motion data. A display device displays patient setup information in response to the gesture. An acquisition application generates a setup for an MRI system for a patient based on the gesture and the user motion data.
  Type: Application
  Filed: October 28, 2014
  Publication date: April 28, 2016
  Inventors: Sundar Murugappan, Adrian Jeremy Knowles, Alexander Kaber Carroll
- Publication number: 20160062469
  Abstract: A system and method for selective gesture interaction using spatial volumes is disclosed. The method includes processing data frames that each include one or more body point locations of a collaborating user that is interfacing with an application at each time interval, defining a spatial volume for each collaborating user based on the processed data frames, detecting a gesture performed by a first collaborating user based on the processed data frames, determining the gesture to be an input gesture performed by the first collaborating user in a first spatial volume, interpreting the input gesture based on a context of the first spatial volume that includes a role of the first collaborating user, a phase of the application, and an intersection volume between the first spatial volume and a second spatial volume for a second collaborating user, and providing an input command to the application based on the interpreted input gesture.
  Type: Application
  Filed: August 29, 2014
  Publication date: March 3, 2016
  Inventors: Habib Abi-Rached, Jeng-Weei Lin, Sundar Murugappan, Arnold Lund, Veeraraghavan Ramaswamy