Patents by Inventor Guanhang Wu

Guanhang Wu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11080562
    Abstract: A method includes obtaining training samples that include images that depict objects and annotations of annotated key point locations for the objects. The method also includes training a machine learning model to determine estimated key point locations for the objects and key point uncertainty values for the estimated key point locations by minimizing a loss function that is based in part on a key point localization loss value that represents a difference between the annotated key point locations and the estimated key point locations and is weighted by the key point uncertainty values. (A hedged code sketch of such an uncertainty-weighted loss appears after this listing.)
    Type: Grant
    Filed: June 14, 2019
    Date of Patent: August 3, 2021
    Assignee: Apple Inc.
    Inventors: Shreyas Saxena, Wenda Wang, Guanhang Wu, Nitish Srivastava, Dimitrios Kottas, Cuneyt Oncel Tuzel, Luciano Spinello, Ricardo da Silveira Cabral
  • Publication number: 20200257957
    Abstract: A set of sensor information conveyed by sensor output signals may be accessed. The sensor output signals may be generated by a set of sensors. The set of sensor information may characterize an event monitored by the set of sensors. A multi-feature convolutional neural network may be trained using a branch-loss function. The branch-loss function may include individual loss functions for individual sensor information and one or more combined loss functions for combined sensor information. The set of sensor information may be processed through the multi-feature convolutional neural network. A classification of the event may be obtained from the multi-feature convolutional neural network based on the set of sensor information. (A hedged code sketch of such a branch-loss function appears after this listing.)
    Type: Application
    Filed: February 16, 2017
    Publication date: August 13, 2020
    Inventors: Daniel Tse, Guanhang Wu, Desmond Chik
  • Patent number: 10534966
    Abstract: Systems and methods of identifying activities and/or events represented in a video are presented herein. An activity and/or event may be represented in a video by virtue of one or both of an entity moving with a capture device during capture of the video performing the activity and/or event, or the video portraying one or more entities performing the activity and/or event. Activity types may be characterized by one or more of common movements, equipment, spatial context, and/or other features. Events may be characterized by one or both of individual movements and/or sets of movements that may routinely occur during performance of an activity. The identification of activities and/or events represented in a video may be based on one or more spectrogram representations of sensor output signals of one or more sensors coupled to a capture device. (A hedged code sketch of such a spectrogram representation appears after this listing.)
    Type: Grant
    Filed: February 2, 2017
    Date of Patent: January 14, 2020
    Assignee: GoPro, Inc.
    Inventors: Daniel Tse, Desmond Chik, Guanhang Wu
  • Patent number: 10185895
    Abstract: An image including a visual capture of a scene may be accessed. The image may be processed through a convolutional neural network. The convolutional neural network may generate a set of two-dimensional feature maps based on the image. The set of two-dimensional feature maps may be processed through a contextual long short-term memory unit. The contextual long short-term memory unit may generate a set of two-dimensional outputs based on the set of two-dimensional feature maps. A set of attention-masks for the image may be generated based on the set of two-dimensional outputs and the set of two-dimensional feature maps. The set of attention-masks may define dimensional portions of the image. The scene may be classified based on the set of two-dimensional outputs. (A hedged code sketch of such a contextual-LSTM attention module appears after this listing.)
    Type: Grant
    Filed: March 23, 2017
    Date of Patent: January 22, 2019
    Assignee: GoPro, Inc.
    Inventors: Daniel Tse, Desmond Chik, Guanhang Wu
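
The abstract of patent 11080562 describes a key point localization loss weighted by predicted key point uncertainty values. The sketch below shows one common way such a weighting can be realized (a heteroscedastic-style penalty in which a larger predicted uncertainty down-weights the squared error and a log-uncertainty term discourages inflating it); the function name, tensor shapes, and exact form are illustrative assumptions, not the patented formulation.

```python
import torch

def uncertainty_weighted_keypoint_loss(pred_xy, target_xy, log_sigma):
    """Key point localization loss weighted by per-key-point uncertainty.

    pred_xy:   (N, K, 2) estimated key point locations
    target_xy: (N, K, 2) annotated key point locations
    log_sigma: (N, K)    predicted log-uncertainty for each key point

    Illustrative heteroscedastic weighting: a larger predicted sigma
    down-weights the squared localization error, while the +log_sigma
    term keeps the model from inflating sigma arbitrarily.
    """
    sq_err = ((pred_xy - target_xy) ** 2).sum(dim=-1)              # (N, K)
    weighted = sq_err / (2.0 * torch.exp(2.0 * log_sigma)) + log_sigma
    return weighted.mean()
```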
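Publication 20200257957 describes training a multi-feature convolutional neural network with a branch-loss function that combines individual losses for individual sensor information with one or more losses on the combined sensor information. The sketch below assumes a classification setting with one logits tensor per sensor branch plus fused logits; the cross-entropy choice, the branch_weight parameter, and the function name are illustrative assumptions.

```python
import torch.nn.functional as F

def branch_loss(branch_logits, fused_logits, labels, branch_weight=0.5):
    """Branch-loss: per-sensor-branch losses plus a combined-feature loss.

    branch_logits: list of (N, C) logits, one per sensor branch
    fused_logits:  (N, C) logits computed from the combined features
    labels:        (N,)   event class indices

    branch_weight is an illustrative assumption; the publication only
    states that individual and combined losses are used together.
    """
    per_branch = sum(F.cross_entropy(logits, labels) for logits in branch_logits)
    combined = F.cross_entropy(fused_logits, labels)
    return combined + branch_weight * per_branch
```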
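Patent 10534966 identifies activities and events from spectrogram representations of sensor output signals generated by sensors coupled to a capture device. The sketch below shows one conventional way to compute such a representation for a single sensor channel with SciPy; the window length, overlap, and log scaling are illustrative defaults rather than details taken from the patent.

```python
import numpy as np
from scipy.signal import spectrogram

def sensor_spectrogram(signal, sample_rate_hz, window_s=1.0, overlap=0.5):
    """Log-magnitude spectrogram of a 1-D sensor output signal
    (e.g., one accelerometer axis) for an activity/event classifier.

    Window length and overlap are illustrative defaults only.
    """
    nperseg = int(window_s * sample_rate_hz)
    noverlap = int(nperseg * overlap)
    freqs, times, power = spectrogram(signal, fs=sample_rate_hz,
                                      nperseg=nperseg, noverlap=noverlap)
    return freqs, times, np.log(power + 1e-10)
```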
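Patent 10185895 describes processing CNN feature maps through a contextual long short-term memory unit and combining the LSTM outputs with the feature maps to form attention masks used for scene classification. The sketch below is a rough PyTorch interpretation that treats the spatial grid as a sequence for the LSTM and derives a single soft attention mask; the module name, layer sizes, and the exact way the mask is combined with the features are assumptions rather than the patented design.

```python
import torch
import torch.nn as nn

class ContextualAttention(nn.Module):
    """Feature maps -> contextual LSTM over spatial locations -> attention
    mask -> scene classification. All hyperparameters are illustrative."""

    def __init__(self, channels, hidden=256, num_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)
        self.classify = nn.Linear(channels, num_classes)

    def forward(self, feat):                      # feat: (N, C, H, W)
        n, c, h, w = feat.shape
        seq = feat.flatten(2).transpose(1, 2)     # (N, H*W, C) spatial sequence
        ctx, _ = self.lstm(seq)                   # contextual LSTM outputs
        attn = torch.softmax(self.score(ctx), dim=1)   # (N, H*W, 1) attention mask
        pooled = (attn * seq).sum(dim=1)          # attention-weighted features
        return self.classify(pooled), attn.view(n, 1, h, w)
```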