Patents by Inventor Héctor H. González-Baños

Héctor H. González-Baños has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230349693
    Abstract: A system and method for generating input data from a pose estimate for a pose (position and orientation) of a manipulated object operated in a three-dimensional environment that offers optical features. The manipulated object has an on-board photodetector for providing light data and an on-board auxiliary motion detection component for providing relative motion data indicative of a change in an orientation, a change in position or both (relative change in pose). A processor in communication with the on-board photodetector and auxiliary motion detection component uses light data to determine an absolute pose estimate at times t_i and relative motion data to determine a relative pose change. The processor deploys a technique that combines the absolute pose estimate from the light data and the relative pose change from the relative motion data to provide the pose estimate at an application request time t_r.
    Type: Application
    Filed: May 16, 2023
    Publication date: November 2, 2023
    Applicant: Electronic Scripting Products, Inc.
    Inventors: Michael J. Mandella, Hector H. Gonzalez-Banos, Marek Alboszta
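
As a concrete illustration of the fusion step described in the abstract above, the following is a minimal Python sketch, not the patented method itself: it keeps the most recent absolute pose fix computed from light data at a time t_i and forward-integrates the relative motion increments to answer a query at request time t_r. The class and method names are hypothetical.

```python
# Hypothetical sketch; the dead-reckoning scheme and all names are assumptions,
# not details taken from the patent.
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    t: float          # timestamp (seconds)
    R: np.ndarray     # 3x3 rotation matrix (orientation)
    p: np.ndarray     # 3-vector position

class PoseFuser:
    def __init__(self):
        self.last_fix = None   # latest absolute pose estimated from light data
        self.deltas = []       # (t, dR, dp) increments from the motion sensor

    def on_absolute_fix(self, pose):
        """Absolute pose recovered from the on-board photodetector at time t_i."""
        self.last_fix = pose
        # keep only increments newer than the fix
        self.deltas = [d for d in self.deltas if d[0] > pose.t]

    def on_relative_motion(self, t, dR, dp):
        """Relative change in pose reported by the auxiliary motion component."""
        self.deltas.append((t, dR, dp))

    def pose_at(self, t_r):
        """Pose estimate at the application request time t_r."""
        assert self.last_fix is not None, "need at least one absolute fix"
        R, p = self.last_fix.R.copy(), self.last_fix.p.copy()
        for t, dR, dp in self.deltas:
            if t <= t_r:
                p = p + R @ dp    # translate in the current body frame
                R = R @ dR        # compose the incremental rotation
        return Pose(t_r, R, p)
```
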
  • Patent number: 11609134
    Abstract: A system for monitoring injuries comprising a plurality of wearable user input devices and a wireless transceiver. Each of the plurality of wearable user input devices may be configured to detect motion patterns of a user. Each of the plurality of wearable user input devices may be configured as performance equipment. The wireless transceiver may be configured to communicate the motion patterns to a user device. The user device may be configured to (i) develop and store reference patterns related to impacts, (ii) compare the detected motion patterns with the reference patterns, (iii) estimate a location and direction of an impact based on the comparison, (iv) accumulate data from the estimated impact with previously suffered impact data, (v) aggregate results based on the accumulated impact data and context information and (vi) generate feedback for the user based on the aggregated results.
    Type: Grant
    Filed: December 1, 2021
    Date of Patent: March 21, 2023
    Assignee: Invent.ly, LLC
    Inventors: Stephen J. Brown, Hector H. Gonzalez-Banos
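
The comparison step, matching detected motion patterns against stored reference impact patterns, could look roughly like the Python sketch below. The trace format, the correlation measure, and the threshold are assumptions for illustration, not details from the patent.

```python
# Illustrative only: format, similarity measure, and threshold are assumptions.
import numpy as np

def match_impact(motion, references, threshold=0.8):
    """motion: (N, 3) accelerometer trace from one wearable device.
    references: {impact_label: (N, 3) reference trace of equal length}.
    Returns (best_label, score), with best_label None if nothing matches."""
    m = (motion - motion.mean(axis=0)) / (motion.std(axis=0) + 1e-9)
    best_label, best_score = None, threshold
    for label, ref in references.items():
        r = (ref - ref.mean(axis=0)) / (ref.std(axis=0) + 1e-9)
        score = float(np.mean(m * r))     # normalized correlation per axis
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score
```
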
  • Patent number: 11577159
    Abstract: The present invention discloses systems and methods for both viewing and interacting with a virtual reality (VR), an augmented reality (AR) or a mixed reality (MR). More specifically, the systems and methods allow the user to interact with aspects of such realities including virtual items presented in such realities or within such environments by manipulating a control device that has an inside-out camera mounted on-board. The apparatus or system uses two distinct representations including a reduced representation in determining the pose of the control device and uses these representations to compute an interactive pose portion of the control device to be used for interacting with the virtual item. The reduced representation is consonant with a constrained motion of the control device.
    Type: Grant
    Filed: December 11, 2020
    Date of Patent: February 14, 2023
    Assignee: Electronic Scripting Products, Inc.
    Inventors: Michael J. Mandella, Hector H. Gonzalez-Banos, Marek Alboszta
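
The phrase "reduced representation consonant with a constrained motion" in the abstract above can be pictured with a toy example: if the control device is assumed to slide on a tabletop, the full 6-DOF pose collapses to planar position plus heading, and only that portion drives interaction. The planar constraint and the function names below are assumptions.

```python
# Toy illustration of a reduced pose representation; the constraint is assumed.
import numpy as np

def reduce_pose(R, p):
    """Full pose (3x3 rotation R, 3-vector position p) -> (x, y, yaw).
    The reduction assumes motion constrained to a z = const plane."""
    yaw = np.arctan2(R[1, 0], R[0, 0])   # rotation about the plane normal
    return np.array([p[0], p[1], yaw])

def interactive_pose(R, p):
    """Portion of the pose used for interacting with a virtual item."""
    x, y, yaw = reduce_pose(R, p)
    return {"position": (x, y), "heading": yaw}
```
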
  • Patent number: 11442521
    Abstract: A system comprising a plurality of self-powered devices and at least one remote device. The plurality of self-powered devices may be configured to perform instructions. The plurality of self-powered devices may be configured to select one of a plurality of modes of operation. The remote device may be configured to store scheduling data. The remote device may be configured to communicate with the self-powered devices. The self-powered devices may select one of the plurality of modes of operation based on the scheduling data.
    Type: Grant
    Filed: October 23, 2019
    Date of Patent: September 13, 2022
    Assignee: Invent.ly, LLC
    Inventors: Stephen J. Brown, Daylyn M. Meade, Timothy P. Flood, Clive A. Hallatt, Holden D. Jessup, Hector H. Gonzalez-Banos
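
One way to read the mode-selection step is as a lookup of the current time against scheduling data received from the remote device, sketched below. The schedule format and the mode names are illustrative assumptions.

```python
# Illustrative schedule format and mode names; both are assumptions.
from datetime import datetime, time

# Hypothetical scheduling data as (start, end, mode) windows in local time.
SCHEDULE = [
    (time(22, 0), time(6, 0), "low_power"),   # overnight window wraps midnight
    (time(6, 0), time(22, 0), "active"),
]

def select_mode(now=None, schedule=SCHEDULE):
    t = (now or datetime.now()).time()
    for start, end, mode in schedule:
        if start <= end and start <= t < end:
            return mode
        if start > end and (t >= start or t < end):   # wrap-around window
            return mode
    return "standby"
```
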
  • Patent number: 11398253
    Abstract: The disclosure includes a system and method for decomposing a video to salient fragments and synthesizing a video composition based on the salient fragments. A computer-implemented method receives a first set of salient fragments and a first set of clusters extracted from a video, where each cluster includes related salient fragments connected by a connectivity graph. The method determines a weight associated with each of the salient fragments and each of the clusters based on an activity level associated with the respective salient fragment or cluster, and determines a permissible zone of activity. The method determines a spatial-temporal distortion to be applied to each salient fragment and cluster and synthesizes a video composition based on the first set of salient fragments, the first set of clusters and non-salient portions of the video using weighted editing.
    Type: Grant
    Filed: June 10, 2020
    Date of Patent: July 26, 2022
    Assignee: Ricoh Company, Ltd.
    Inventors: Hector H. Gonzalez-Banos, Ramya Narasimha
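
The weighting step can be illustrated as follows: each fragment or cluster gets a weight from its activity level, and the weight is turned into a temporal compression factor for the later synthesis. The linear mapping and the data layout are assumptions, not the patented editing scheme.

```python
# Illustrative weighting; the linear mapping and data layout are assumptions.
from dataclasses import dataclass

@dataclass
class Fragment:
    frames: list        # e.g. per-frame bounding boxes of the moving region
    activity: float     # e.g. mean motion energy over the fragment

def weighted_editing_plan(fragments, max_speedup=4.0):
    """Return (fragment, weight, playback_speedup) triples; low-weight
    fragments are played back faster in the synthesized composition."""
    total = sum(f.activity for f in fragments) or 1.0
    plan = []
    for f in fragments:
        w = f.activity / total
        speedup = 1.0 + (1.0 - w) * (max_speedup - 1.0)
        plan.append((f, w, speedup))
    return plan
```
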
  • Patent number: 11263472
    Abstract: The disclosure includes a system and method for providing visual analysis focalized on a salient event. A video processing application receives a data stream from a capture device, determines an area of interest over an imaging area of the capture device, detects a salient event from the data stream, determines whether a location of the detected salient event is within the area of interest, and in response to the location of the salient event being within the area of interest, identifies a portion of the data stream, based on the salient event, on which to perform an action.
    Type: Grant
    Filed: February 7, 2020
    Date of Patent: March 1, 2022
    Assignee: Ricoh Co., Ltd.
    Inventors: Manuel Martinello, Hector H. Gonzalez-Banos
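
The gating logic (act only when the detected salient event lies inside the configured area of interest) reduces to a containment test like the one below. The rectangular region and the two-second clip window are assumptions; the patent does not restrict either.

```python
# Illustrative containment test; region shape and window size are assumptions.
def in_area_of_interest(event_xy, aoi):
    """event_xy: (x, y) in image coordinates; aoi: (x0, y0, x1, y1) rectangle."""
    x, y = event_xy
    x0, y0, x1, y1 = aoi
    return x0 <= x <= x1 and y0 <= y <= y1

def portion_to_process(event, aoi, window=2.0):
    """Return the (start, end) time span of the stream to act on, or None."""
    if in_area_of_interest(event["location"], aoi):
        return (event["time"] - window, event["time"] + window)
    return None
```
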
  • Patent number: 11193840
    Abstract: A system for monitoring injuries comprising a plurality of wearable user input devices and a wireless transceiver. Each of the plurality of wearable user input devices may be configured to detect motion patterns of a user. Each of the plurality of wearable user input devices may be configured as performance equipment. The wireless transceiver may be configured to communicate the motion patterns to a user device. The user device may be configured to (i) develop and store reference patterns related to impacts, (ii) compare the detected motion patterns with the reference patterns, (iii) estimate a location and direction of an impact based on the comparison, (iv) accumulate data from the estimated impact with previously suffered impact data, (v) aggregate results based on the accumulated impact data and context information and (vi) generate feedback for the user based on the aggregated results.
    Type: Grant
    Filed: November 17, 2020
    Date of Patent: December 7, 2021
    Assignee: Invent.ly, LLC
    Inventors: Stephen J. Brown, Hector H. Gonzalez-Banos
  • Publication number: 20210283496
    Abstract: The present invention discloses systems and methods for both viewing and interacting with a virtual reality (VR), an augmented reality (AR) or a mixed reality (MR). More specifically, the systems and methods allow the user to interact with aspects of such realities including virtual items presented in such realities or within such environments by manipulating a control device that has an inside-out camera mounted on-board. The apparatus or system uses two distinct representations including a reduced representation in determining the pose of the control device and uses these representations to compute an interactive pose portion of the control device to be used for interacting with the virtual item. The reduced representation is consonant with a constrained motion of the control device.
    Type: Application
    Filed: December 11, 2020
    Publication date: September 16, 2021
    Applicant: Electronic Scripting Products, Inc.
    Inventors: Michael J. Mandella, Hector H. Gonzalez-Banos, Marek Alboszta
  • Patent number: 10956494
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; building a database of the plurality of salient fragments; receiving a keyword; identifying a time anchor when the keyword appears in an audio track associated with the video; retrieving one or more salient fragments of the video from the database of the plurality of salient fragments based on the time anchor; generating a focalized visualization based on the one or more salient fragments of the video; tagging a human subject in the focalized visualization with a unique identifier; analyzing the focalized visualization based on the time anchor and the unique identifier to generate a behavior score; and providing the behavior score via the user device.
    Type: Grant
    Filed: April 16, 2019
    Date of Patent: March 23, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
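
This entry and the several that follow share the same pipeline and differ mainly in where the time anchor comes from (a spoken keyword, a media event, a machine event, and so on). The anchoring and retrieval steps can be sketched as below; the transcript format and the five-second window are assumptions.

```python
# Illustrative anchoring and retrieval; data formats and window are assumptions.
def keyword_anchor(transcript, keyword):
    """transcript: list of (word, start_seconds) pairs from the audio track."""
    for word, start in transcript:
        if word.lower() == keyword.lower():
            return start
    return None

def fragments_near(anchor, fragments, window=5.0):
    """fragments: dicts with 'start' and 'end' times; return those that
    overlap [anchor - window, anchor + window]."""
    return [f for f in fragments
            if f["start"] <= anchor + window and f["end"] >= anchor - window]
```
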
  • Patent number: 10956773
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; building a database of the plurality of salient fragments; associating a time anchor with a media event; retrieving one or more salient fragments of the video from the database of the plurality of salient fragments based on the time anchor; generating a focalized visualization based on the one or more salient fragments of the video; tagging a human subject in the focalized visualization with a unique identifier; analyzing the focalized visualization based on the time anchor and the unique identifier to generate a behavior score; and providing the behavior score via the user device.
    Type: Grant
    Filed: April 16, 2019
    Date of Patent: March 23, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10956495
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; building a database of the plurality of salient fragments; associating a time anchor with a machine event; retrieving one or more salient fragments of the video from the database of the plurality of salient fragments based on the time anchor; generating a focalized visualization based on the one or more salient fragments of the video; tagging a human subject in the focalized visualization with a unique identifier; analyzing the focalized visualization based on the time anchor and the unique identifier to generate a behavior score; and providing the behavior score via the user device.
    Type: Grant
    Filed: April 16, 2019
    Date of Patent: March 23, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10949463
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; associating a time anchor with an utterance of a first keyword in an audio track associated with the video; generating a focalized visualization, based on the time anchor, from one or more of the plurality of salient fragments of the video; tagging a human subject in the focalized visualization with a unique identifier; and analyzing behavior of the human subject, using the focalized visualization, to generate a behavior score associated with the unique identifier and the first keyword.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: March 16, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10949705
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; building a database of the plurality of salient fragments; generating a focalized visualization, based on a time anchor, from the one or more salient fragments of the video; tagging a human subject in the focalized visualization with a unique identifier; analyzing the focalized visualization, based on the unique identifier, to generate a behavior score; and providing the behavior score via the user device.
    Type: Grant
    Filed: April 16, 2019
    Date of Patent: March 16, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10943122
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; generating a focalized visualization, based on a time anchor, from one or more of the plurality of salient fragments of the video; tagging a human subject in the focalized visualization with a unique identifier; and analyzing behavior of the human subject, using the focalized visualization, to generate a behavior score associated with the unique identifier and the time anchor.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: March 9, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10929707
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; associating a time anchor with a presentation of a first media content to a human subject; generating a focalized visualization, based on the time anchor, from one or more of the plurality of salient fragments of the video; tagging the human subject in the focalized visualization with a unique identifier; and analyzing behavior of the human subject, using the focalized visualization, to generate a behavior score associated with the unique identifier and the first media content.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: February 23, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10929685
    Abstract: A system and method for analyzing behavior in a video is described. The method includes extracting a plurality of salient fragments of a video; associating a time anchor with an occurrence of a first machine event of a machine operated by a human subject; generating a focalized visualization, based on the time anchor, from one or more of the plurality of salient fragments of the video; tagging the human subject in the focalized visualization with a unique identifier; and analyzing behavior of the human subject, using the focalized visualization, to generate a behavior score associated with the unique identifier and the first machine event.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: February 23, 2021
    Assignee: Ricoh Company, Ltd.
    Inventors: Ramya Narasimha, Hector H. Gonzalez-Banos
  • Patent number: 10876912
    Abstract: A system for monitoring injuries comprising a plurality of wearable user input devices and a wireless transceiver. Each of the plurality of wearable user input devices may be configured to detect motion patterns of a user. Each of the plurality of wearable user input devices may be configured as performance equipment. The wireless transceiver may be configured to communicate the motion patterns to a user device. The user device may be configured to (i) develop and store reference patterns related to impacts, (ii) compare the detected motion patterns with the reference patterns, (iii) estimate a location and direction of an impact based on the comparison, (iv) accumulate data from the estimated impact with previously suffered impact data, (v) aggregate results based on the accumulated impact data and context information and (vi) generate feedback for the user based on the aggregated results.
    Type: Grant
    Filed: December 11, 2019
    Date of Patent: December 29, 2020
    Assignee: Invent.ly, LLC
    Inventors: Stephen J. Brown, Hector H. Gonzalez-Banos
  • Publication number: 20200335134
    Abstract: The disclosure includes a system and method for decomposing a video to salient fragments and synthesizing a video composition based on the salient fragments. A computer-implemented method receives a first set of salient fragments and a first set of clusters extracted from a video, where each cluster includes related salient fragments connected by a connectivity graph. The method determines a weight associated with each of the salient fragments and each of the clusters based on an activity level associated with the respective salient fragment or cluster, and determines a permissible zone of activity. The method determines a spatial-temporal distortion to be applied to each salient fragment and cluster and synthesizes a video composition based on the first set of salient fragments, the first set of clusters and non-salient portions of the video using weighted editing.
    Type: Application
    Filed: June 10, 2020
    Publication date: October 22, 2020
    Applicant: Ricoh Company, Ltd.
    Inventors: Hector H. Gonzalez-Banos, Ramya Narasimha
  • Patent number: 10719552
    Abstract: The disclosure includes a system and method for creating, storing, and retrieving a focalized visualization related to a location, an event or a subject of interest. A visualization server receives a query for creating a visualization from a client device, identifies and retrieves one or more segments of a video stream satisfying the query, and generates the visualization based on the one or more segments of the video stream.
    Type: Grant
    Filed: March 9, 2018
    Date of Patent: July 21, 2020
    Assignee: Ricoh Co., Ltd.
    Inventors: Jorge Moraleda, Hector H. Gonzalez-Banos, Marco Antonio Covarrubias
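
The retrieval path of such a visualization server can be pictured as a filter over an index of stored segments followed by ordering the matches, as in the sketch below. The record fields and the query keys are assumptions for illustration.

```python
# Illustrative retrieval; record fields and query keys are assumptions.
def retrieve_segments(index, query):
    """index: list of records {'camera', 'subject', 'start', 'end', 'uri'}."""
    return [s for s in index
            if query.get("camera") in (None, s["camera"])
            and query.get("subject") in (None, s["subject"])
            and s["end"] >= query.get("after", 0.0)]

def build_visualization(index, query):
    """Return segment URIs ordered by time, ready to be stitched or played."""
    matches = sorted(retrieve_segments(index, query), key=lambda s: s["start"])
    return [s["uri"] for s in matches]
```
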
  • Patent number: 10720182
    Abstract: The disclosure includes a system and method for decomposing a video to salient fragments and synthesizing a video composition based on the salient fragments. A video decomposition application extracts non-salient portions of a video, extracts a plurality of salient fragments of the video, builds a database of the plurality of salient fragments, receives a query, retrieves, from the database of the plurality of salient fragments, a set of salient fragments based on the query, and synthesizes a video composition based on the set of salient fragments and the non-salient portions of the video.
    Type: Grant
    Filed: March 2, 2017
    Date of Patent: July 21, 2020
    Assignee: Ricoh Company, Ltd.
    Inventors: Hector H. Gonzalez-Banos, Ramya Narasimha
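
The decomposition step, separating salient (moving) fragments from the non-salient background, could be prototyped with a stock background subtractor as below. OpenCV's MOG2 model and the area threshold stand in for whatever detector an actual implementation of this patent would use.

```python
# Prototype decomposition using OpenCV (opencv-python); the detector choice and
# the area threshold are assumptions, not the patented method.
import cv2

def decompose(video_path, min_area=500):
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    fragments = []          # list of (frame_index, [bounding boxes])
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                 # foreground mask
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        boxes = [cv2.boundingRect(c) for c in contours
                 if cv2.contourArea(c) >= min_area]
        if boxes:
            fragments.append((idx, boxes))
        idx += 1
    cap.release()
    return fragments
```
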