Patents by Inventor Shi-Ping Hsu

Shi-Ping Hsu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10296168
    Abstract: A method for a multi-step selection interface is provided, including: receiving a multi-step selection indication; causing, using processing circuitry, a first selection menu extending in a first direction to be rendered on a display; receiving a first selection indication based on the first selection menu; in response to receiving the first selection indication, causing a second selection menu to be rendered on the display, extending in the first direction in substantially the same position on the display as the first selection menu; and causing a rendering of at least a portion of the first menu to be displaced in a direction substantially perpendicular to the first direction.
    Type: Grant
    Filed: June 25, 2015
    Date of Patent: May 21, 2019
    Assignee: Northrop Grumman Systems Corporation
    Inventors: Adrian Kaehler, Shi-Ping Hsu, Sam Leventer, Fred Zyda
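The cascading behavior the abstract describes can be sketched as follows; this is a minimal illustrative model (the class and field names are hypothetical, not from the patent), showing how each new menu takes the anchor position while earlier menus are displaced perpendicular to the direction in which the menus extend:

```python
from dataclasses import dataclass, field

@dataclass
class Menu:
    items: list      # labels rendered along the first direction (e.g. horizontally)
    offset: int = 0  # displacement along the perpendicular direction

@dataclass
class MultiStepSelector:
    menus: list = field(default_factory=list)  # most recently rendered menu is last
    step: int = 40                             # perpendicular shift per selection step

    def render(self, items):
        # The new menu occupies the anchor position; every earlier menu is
        # displaced perpendicular to the menus' direction of extension.
        for menu in self.menus:
            menu.offset += self.step
        self.menus.append(Menu(list(items)))

    def select(self, index, next_items):
        # A selection on the current menu triggers rendering of the next one.
        chosen = self.menus[-1].items[index]
        self.render(next_items)
        return chosen
```

After one selection, the first menu sits one `step` away while the second menu occupies the first menu's original position, matching the displacement described in the claim language.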
  • Patent number: 9946584
    Abstract: Systems and methods are provided for extracting application-relevant data from messages. In one embodiment, a system can comprise a message parser that parses messages and builds a message tree having one or more objects, one or more data type templates that define a given data type based on one or more data elements, and a comparison engine that matches data elements in the one or more objects with data elements in the one or more data type templates. The comparison engine groups data elements in the one or more objects that match data elements in the one or more data type templates as a specific data type corresponding to the matched data type template.
    Type: Grant
    Filed: September 30, 2008
    Date of Patent: April 17, 2018
    Assignee: Northrop Grumman Systems Corporation
    Inventors: Adrian Kaehler, Shi-Ping Hsu
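The comparison-engine idea can be illustrated with a small sketch; here a parsed message object is simplified to a flat dict of named elements and a template to a list of required element names (the field and type names are assumptions for illustration, not the patent's):

```python
def match_templates(message_obj, templates):
    """Group message elements under each data type template they satisfy."""
    results = {}
    for type_name, required in templates.items():
        # A template matches only if every element it defines is present.
        if all(elem in message_obj for elem in required):
            results[type_name] = {e: message_obj[e] for e in required}
    return results

# Hypothetical data type templates and a parsed message object.
templates = {
    "geo_position": ["lat", "lon"],
    "contact": ["name", "phone"],
}
msg = {"lat": 34.05, "lon": -118.24, "name": "ops"}
```

Running `match_templates(msg, templates)` groups `lat` and `lon` as a `geo_position`, while `contact` fails to match because the message lacks a `phone` element.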
  • Patent number: 9696808
    Abstract: One embodiment of the invention includes a method of providing device inputs. The method includes illuminating hand gestures performed via a bare hand of a user in a foreground of a background surface with at least one infrared (IR) light source. The method also includes generating a first plurality of silhouette images associated with the bare hand based on an IR light contrast between the bare hand and the background surface and generating a second plurality of silhouette images associated with the bare hand based on an IR light contrast between the bare hand and the background surface. The method also includes determining a plurality of three-dimensional features of the bare hand relative to the background surface based on a parallax separation of the bare hand in the first plurality of silhouette images relative to the second plurality of silhouette images.
    Type: Grant
    Filed: December 17, 2008
    Date of Patent: July 4, 2017
    Assignee: Northrop Grumman Systems Corporation
    Inventors: H. Keith Nishihara, Shi-Ping Hsu, Adrian Kaehler, Lars Jangaard
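The parallax step can be sketched with the standard stereo relation; assuming two IR cameras a baseline apart, a silhouette feature seen at horizontal pixel positions `x1` and `x2` in the two image sets lies at depth `Z = f * B / d` for disparity `d = x1 - x2`. The function and numbers below are illustrative, not taken from the patent:

```python
def depth_from_parallax(x1, x2, focal_px, baseline_m):
    """Distance of a silhouette feature from the camera pair, via the
    standard stereo relation Z = f * B / disparity."""
    disparity = x1 - x2
    if disparity <= 0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_px * baseline_m / disparity
```

For example, a hand point at pixel 320 in one silhouette image and 300 in the other, with an 800-pixel focal length and a 10 cm baseline, resolves to 4 m.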
  • Patent number: 9658765
    Abstract: One embodiment of the invention includes a computer interface system. The system comprises a user interface screen configured to display visual content and an input system configured to detect a presence of an input object within a threshold distance along a normal axis of the user interface screen. The system further comprises a graphical controller configured to magnify a portion of the visual content that is located at an approximate location of a base of the normal axis on the user interface screen.
    Type: Grant
    Filed: July 31, 2008
    Date of Patent: May 23, 2017
    Assignee: Northrop Grumman Systems Corporation
    Inventors: H. Keith Nishihara, Shi-Ping Hsu, Adrian Kaehler, Eric Gradman, Kjerstin Williams
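One way the graphical controller's behavior could work is a zoom factor driven by the input object's distance along the normal axis; this is a hedged sketch of that idea (the linear ramp and parameters are assumptions, not specified by the abstract):

```python
def magnification(distance, threshold, max_zoom=3.0):
    """Zoom factor for content at the base of the normal axis: no zoom when
    the input object is outside the threshold distance, ramping linearly to
    max_zoom as the object approaches the screen."""
    if distance >= threshold:
        return 1.0
    return 1.0 + (max_zoom - 1.0) * (1.0 - distance / threshold)
```

A finger at or beyond the threshold leaves the content unmagnified; halfway in it is magnified 2x, and at the screen surface it reaches the full 3x.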
  • Publication number: 20170094227
    Abstract: A three-dimensional spatial-awareness vision system includes video sensor system(s) mounted to a monitoring platform and having a field of view to monitor a scene of interest and provide real-time video data corresponding to real-time video images. A memory stores model data associated with a rendered three-dimensional virtual representation of the monitoring platform. An image processor combines the real-time video data and the model data to generate image data comprising the rendered three-dimensional virtual representation of the monitoring platform and the real-time video images of the scene of interest superimposed at a field of view relative to the rendered three-dimensional virtual representation of the monitoring platform.
    Type: Application
    Filed: September 25, 2015
    Publication date: March 30, 2017
    Applicant: Northrop Grumman Systems Corporation
    Inventors: Kjerstin Irja Williams, Brandon M. Booth, Christopher M. Cianci, Aaron J. Denney, Shi-Ping Hsu, Adrian Kaehler, Jeffrey Steven Kranski, Jeremy David Schwartz
  • Publication number: 20160378273
    Abstract: A method for a multi-step selection interface is provided, including: receiving a multi-step selection indication; causing, using processing circuitry, a first selection menu extending in a first direction to be rendered on a display; receiving a first selection indication based on the first selection menu; in response to receiving the first selection indication, causing a second selection menu to be rendered on the display, extending in the first direction in substantially the same position on the display as the first selection menu; and causing a rendering of at least a portion of the first menu to be displaced in a direction substantially perpendicular to the first direction.
    Type: Application
    Filed: June 25, 2015
    Publication date: December 29, 2016
    Inventors: Adrian Kaehler, Shi-Ping Hsu, Sam Leventer, Fred Zyda
  • Patent number: 9377874
    Abstract: A system and method are provided for a gesture recognition interface system. The system comprises a projector configured to project colorless light and visible images onto a background surface. The projection of the colorless light can be interleaved with the projection of the visible images. The system also comprises at least one camera configured to receive a plurality of images based on a reflected light contrast difference between the background surface and a sensorless input object during projection of the colorless light. The system further comprises a controller configured to determine a given input gesture based on changes in relative locations of the sensorless input object in the plurality of images, and being further configured to initiate a device input associated with the given input gesture.
    Type: Grant
    Filed: November 2, 2007
    Date of Patent: June 28, 2016
    Assignee: Northrop Grumman Systems Corporation
    Inventors: Kenneth W. Aull, H. Keith Nishihara, Shi-Ping Hsu
  • Patent number: 9019239
    Abstract: Creative design systems and methods are disclosed. In one embodiment, a creative design system is provided. The creative design system comprises a high resolution display, an interactive stylus that includes a transmitter at a tip of the interactive stylus that transmits an encoded signal associated with the interactive stylus and a plurality of sensors that track movement of the interactive stylus over the high resolution display by capturing and decoding the transmitted encoded signal as the stylus moves over the high resolution display. A creative design controller is configured to display sketches of context in response to the tracking of the movement of the interactive stylus over the high resolution display.
    Type: Grant
    Filed: November 29, 2010
    Date of Patent: April 28, 2015
    Assignee: Northrop Grumman Systems Corporation
    Inventors: H. Keith Nishihara, Shi-Ping Hsu, Donald G. Lariviere
  • Patent number: 8972902
    Abstract: One embodiment of the invention includes a method for executing and interpreting gesture inputs in a gesture recognition interface system. The method includes detecting and translating a first sub-gesture into a first device input that defines a given reference associated with a portion of displayed visual content. The method also includes detecting and translating a second sub-gesture into a second device input that defines an execution command for the portion of the displayed visual content to which the given reference refers.
    Type: Grant
    Filed: August 22, 2008
    Date of Patent: March 3, 2015
    Assignee: Northrop Grumman Systems Corporation
    Inventors: H. Keith Nishihara, Shi-Ping Hsu, Adrian Kaehler, Bran Ferren, Lars Jangaard
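The two-stage interpretation in the abstract can be modeled as a tiny state machine; the sub-gesture names (`"point"`, `"grab"`) and the class below are illustrative assumptions, showing only how a reference-setting sub-gesture must precede an execution sub-gesture:

```python
class GestureInterface:
    """Two-stage gesture interpretation: a first sub-gesture establishes a
    reference to a portion of displayed visual content; a second sub-gesture
    executes a command on the content that reference refers to."""

    def __init__(self):
        self.reference = None
        self.executed = []

    def handle(self, sub_gesture, target=None):
        if sub_gesture == "point":      # first device input: set the reference
            self.reference = target
        elif sub_gesture == "grab":     # second device input: execute on it
            if self.reference is None:
                raise RuntimeError("no reference established yet")
            self.executed.append(("execute", self.reference))
```

Translating the first sub-gesture binds the reference; the second then resolves against it, so an execution gesture with no prior reference is rejected.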
  • Patent number: 8958916
    Abstract: A robotic arm module includes a chassis having at least one arm pod. At least one arm connected to the chassis is movable between a stowed position within the at least one arm pod and a deployed position extending from the at least one arm pod. Each arm has a gripping mechanism for gripping articles of work. An attachment structure is configured to allow a host robot to grip and manipulate the robotic arm module. An electrical interface is configured to receive electronic signals in response to a user moving remote manipulators. The electronic signals cause the at least one arm to mimic the movement of the user moving the remote manipulators.
    Type: Grant
    Filed: May 31, 2013
    Date of Patent: February 17, 2015
    Assignee: Northrop Grumman Systems Corporation
    Inventors: Mark Setrakian, Peter Abrahamson, Randall Adam Yates, Shi-Ping Hsu
  • Publication number: 20130325182
    Abstract: A robotic arm module includes a chassis having at least one arm pod. At least one arm connected to the chassis is movable between a stowed position within the at least one arm pod and a deployed position extending from the at least one arm pod. Each arm has a gripping mechanism for gripping articles of work. An attachment structure is configured to allow a host robot to grip and manipulate the robotic arm module. An electrical interface is configured to receive electronic signals in response to a user moving remote manipulators. The electronic signals cause the at least one arm to mimic the movement of the user moving the remote manipulators.
    Type: Application
    Filed: May 31, 2013
    Publication date: December 5, 2013
    Applicant: Northrop Grumman Systems Corporation
    Inventors: Mark Setrakian, Peter Abrahamson, Randall Adam Yates, Shi-Ping Hsu
  • Patent number: 8589824
    Abstract: A system and method are provided for a gesture recognition interface system. The interface system may comprise a first and second light source positioned to illuminate a background surface. The interface system may also comprise at least one camera operative to receive a first plurality of images based on a first reflected light contrast difference between the background surface and a sensorless input object caused by the first light source and a second plurality of images based on a second reflected light contrast difference between the background surface and the sensorless input object caused by the second light source. The interface system may further comprise a controller operative to determine a given input gesture based on changes in relative locations of the sensorless input object in the first plurality of images and the second plurality of images. The controller may further be operative to initiate a device input associated with the given input gesture.
    Type: Grant
    Filed: July 13, 2006
    Date of Patent: November 19, 2013
    Assignee: Northrop Grumman Systems Corporation
    Inventors: William Daniel Hillis, H. Keith Nishihara, Shi-Ping Hsu
  • Patent number: 8281324
    Abstract: A system is provided for linking software applications. The system comprises a message backplane configured to link software applications by sharing messages associated with events occurring in a respective linked software application with one or more other linked software applications, and a controller configured to instruct the message backplane to link software applications.
    Type: Grant
    Filed: September 30, 2008
    Date of Patent: October 2, 2012
    Assignee: Northrop Grumman Systems Corporation
    Inventors: Adrian Kaehler, Shi-Ping Hsu
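The backplane concept resembles a publish/subscribe hub; the sketch below is a hypothetical minimal version (names and structure are illustrative, not from the patent) in which a controller establishes links and events raised in one application are shared with each application linked to it:

```python
class MessageBackplane:
    """Shares event messages among linked software applications."""

    def __init__(self):
        self.links = {}  # source app name -> set of linked destination apps

    def link(self, src, dst):
        # The controller instructs the backplane to link two applications.
        self.links.setdefault(src, set()).add(dst)

    def publish(self, src, event, inboxes):
        # Forward the event only to applications linked to the source.
        for dst in self.links.get(src, ()):
            inboxes[dst].append((src, event))
```

For example, after linking a map application to a chat application, a pan event in the map is delivered to the chat inbox but not echoed back to the map's own inbox.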
  • Publication number: 20120154511
    Abstract: Systems and methods for providing geographically distributed creative design are disclosed. In one embodiment, a system is provided for providing directional audio in a video teleconference meeting. The system comprises a high resolution display, an interactive stylus and a plurality of sensors that track movement of the interactive stylus over the high resolution display. A creative design controller is configured to display detailed text and figures and display sketches of context in response to movement of the interactive stylus over the high resolution display.
    Type: Application
    Filed: December 20, 2010
    Publication date: June 21, 2012
    Inventors: Shi-Ping Hsu, Neil G. Siegel, Bran Ferren, W. Daniel Hillis
  • Publication number: 20120133616
    Abstract: Creative design systems and methods are disclosed. In one embodiment, a creative design system is provided. The creative design system comprises a high resolution display, an interactive stylus that includes a transmitter at a tip of the interactive stylus that transmits an encoded signal associated with the interactive stylus and a plurality of sensors that track movement of the interactive stylus over the high resolution display by capturing and decoding the transmitted encoded signal as the stylus moves over the high resolution display. A creative design controller is configured to display sketches of context in response to the tracking of the movement of the interactive stylus over the high resolution display.
    Type: Application
    Filed: November 29, 2010
    Publication date: May 31, 2012
    Inventors: H. Keith Nishihara, Shi-Ping Hsu, Donald G. Lariviere
  • Patent number: 8180114
    Abstract: One embodiment of the invention includes a gesture recognition interface system. The system may comprise a substantially vertical surface configured to define a gesture recognition environment based on physical space in a foreground of the substantially vertical surface. The system may also comprise at least one light source positioned to provide illumination of the gesture recognition environment. The system also comprises at least two cameras configured to generate a plurality of image sets based on the illumination being reflected from an input object in the gesture recognition environment. The system further comprises a controller configured to determine a given input gesture based on changes in relative locations of the input object in each of the plurality of image sets. The controller may further be configured to initiate a device input associated with the given input gesture.
    Type: Grant
    Filed: June 5, 2008
    Date of Patent: May 15, 2012
    Assignee: Northrop Grumman Systems Corporation
    Inventors: H. Keith Nishihara, Shi-Ping Hsu
  • Patent number: 8000482
    Abstract: Apparatus and a corresponding method for processing speech signals in a noisy reverberant environment, such as an automobile. An array of microphones (10) receives speech signals from a relatively fixed source (12) and noise signals from multiple sources (32) reverberated over multiple paths. One of the microphones is designated a reference microphone, and the processing system includes adaptive frequency impulse response (FIR) filters (24) enabled by speech detection circuitry (21) and coupled to the other microphones to align their output signals with the reference microphone output signal. The filtered signals are then combined in a summation circuit (18). Signal components derived from the speech signal combine coherently in the summation circuit, while noise signal components combine incoherently, resulting in a composite output signal with an improved signal-to-noise ratio. The composite output signal is further processed in a speech conditioning circuit (20) to reduce the effects of reverberation.
    Type: Grant
    Filed: August 5, 2005
    Date of Patent: August 16, 2011
    Assignee: Northrop Grumman Systems Corporation
    Inventors: Russell H. Lambert, Shi-Ping Hsu, Karina L. Edmonds
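The SNR improvement from coherent summation can be demonstrated numerically: once the channels are time-aligned, speech amplitudes add (power scales as N squared) while independent noise powers add (power scales as N), so the SNR improves by roughly a factor of N. The simulation below is a simplified illustration of that principle only, not the patented processing chain:

```python
import math
import random

def average_power(x):
    return sum(v * v for v in x) / len(x)

def array_snr_gain(num_mics, samples=4000, seed=1):
    """Estimate the SNR improvement from summing num_mics aligned channels:
    speech adds coherently (amplitude x N), independent unit-variance noise
    adds incoherently (power x N)."""
    rng = random.Random(seed)
    speech = [math.sin(2 * math.pi * 200 * t / 8000) for t in range(samples)]
    noise_one = [rng.gauss(0.0, 1.0) for _ in range(samples)]          # single mic
    noise_sum = [sum(rng.gauss(0.0, 1.0) for _ in range(num_mics))
                 for _ in range(samples)]                              # summed mics
    snr_single = average_power(speech) / average_power(noise_one)
    snr_summed = average_power([num_mics * s for s in speech]) / average_power(noise_sum)
    return snr_summed / snr_single
```

With four microphones the measured gain comes out close to 4 (about 6 dB), consistent with the coherent/incoherent combination argument in the abstract.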
  • Patent number: 7763969
    Abstract: An embedded semiconductor chip structure and a method for fabricating the same are proposed. The structure comprises: a carrier board having a plurality of through openings formed therein and through trenches surrounding the through openings; and a plurality of semiconductor chips received in the through openings of the carrier board. Subsequently, cutting is performed via the through trenches. Thus, the space usage of the circuit board and the layout design are more efficient. Moreover, shaping time is also shortened.
    Type: Grant
    Filed: October 27, 2006
    Date of Patent: July 27, 2010
    Assignee: Phoenix Precision Technology Corporation
    Inventors: Zhao-Chong Zeng, Shi-Ping Hsu
  • Patent number: 7701439
    Abstract: A gesture recognition simulation system and method is provided. In one embodiment, a gesture recognition simulation system includes a three-dimensional display system that displays a three-dimensional image of at least one simulated object having at least one functional component. A gesture recognition interface system is configured to receive an input gesture associated with a sensorless input object from a user. The gesture recognition simulation system further comprises a simulation application controller configured to match a given input gesture with a predefined action associated with the at least one functional component. The simulation application controller could invoke the three-dimensional display system to display a simulated action on at least a portion of the at least one simulated object associated with an input gesture and predefined action match.
    Type: Grant
    Filed: July 13, 2006
    Date of Patent: April 20, 2010
    Assignee: Northrop Grumman Corporation
    Inventors: William Daniel Hillis, H. Keith Nishihara, Shi-Ping Hsu, Neil Siegel
  • Publication number: 20100050133
    Abstract: One embodiment of the invention includes a method for executing and interpreting gesture inputs in a gesture recognition interface system. The method includes detecting and translating a first sub-gesture into a first device input that defines a given reference associated with a portion of displayed visual content. The method also includes detecting and translating a second sub-gesture into a second device input that defines an execution command for the portion of the displayed visual content to which the given reference refers.
    Type: Application
    Filed: August 22, 2008
    Publication date: February 25, 2010
    Inventors: H. Keith Nishihara, Shi-Ping Hsu, Adrian Kaehler, Bran Ferren, Lars Jangaard