Patents by Inventor Kenrick Cheng-kuo Kin

Kenrick Cheng-kuo Kin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11409364
    Abstract: Disclosed herein are a system and a method for controlling a virtual reality based on a physical object. In one aspect, a shape of a hand of a user corresponding to a surface or a structure of a physical object is detected. In one aspect, according to the detected shape of the hand, an interactive feature for the surface or the structure of the physical object is generated in a virtual reality or augmented reality application. In one aspect, a user interaction with the interactive feature is detected. In one aspect, an action of the virtual reality or augmented reality application is initiated in response to detecting the user interaction with the interactive feature.
    Type: Grant
    Filed: June 4, 2020
    Date of Patent: August 9, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Qian Zhou, Kenrick Cheng-Kuo Kin
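The abstract for patent 11409364 above describes mapping a detected hand shape on a physical object to an interactive feature, then triggering an application action when the user interacts with that feature. Below is a minimal, hypothetical Python sketch of that flow; the hand-shape classes, feature kinds, and function names are assumptions for illustration, not the patented implementation.

```python
# Hypothetical sketch of the flow in patent 11409364's abstract: a detected
# hand shape on a physical surface yields an interactive feature, and a
# detected interaction with that feature triggers a VR/AR action.
from dataclasses import dataclass
from enum import Enum, auto

class HandShape(Enum):
    FLAT_PALM = auto()   # palm resting on a flat surface
    EDGE_GRIP = auto()   # fingers wrapped around an edge or structure

@dataclass
class InteractiveFeature:
    kind: str            # e.g., "button" or "slider" (assumed feature kinds)
    anchor: tuple        # position on the physical object's surface

def feature_for_shape(shape: HandShape, contact_point: tuple) -> InteractiveFeature:
    """Generate an interactive feature for the surface or structure the hand shape matches."""
    if shape is HandShape.FLAT_PALM:
        return InteractiveFeature(kind="button", anchor=contact_point)
    return InteractiveFeature(kind="slider", anchor=contact_point)

def on_user_interaction(feature: InteractiveFeature, touched: bool) -> None:
    """Initiate an action of the VR/AR application when interaction is detected."""
    if touched:
        print(f"Triggering action bound to {feature.kind} at {feature.anchor}")

# Example: a flat palm on a tabletop produces a virtual button the user can press.
feature = feature_for_shape(HandShape.FLAT_PALM, contact_point=(0.2, 0.0, 0.5))
on_user_interaction(feature, touched=True)
```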
  • Patent number: 11256341
    Abstract: The disclosed computer-implemented method may include tracking (1) a position of a primary real-world object within a real-world environment via a primary tracking method, and (2) a position of a secondary real-world object within the real-world environment via a secondary tracking method. The method may further include presenting (1) a primary virtual object at a position within an artificial environment corresponding to the tracked position of the primary real-world object, and (2) a secondary virtual object at a position within the artificial environment corresponding to the tracked position of the secondary real-world object. The method may further include (1) detecting an interaction of the primary real-world object with the secondary real-world object, and (2) transitioning to tracking the position of the primary real-world object via the secondary tracking method. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: October 6, 2020
    Date of Patent: February 22, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Kenrick Cheng-Kuo Kin, Maxime Ouellet
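As a rough illustration of the tracking hand-off described in patent 11256341's abstract, the sketch below switches the primary object (e.g., a camera-tracked hand) over to the secondary tracking method (e.g., a self-tracked controller) once an interaction such as a grab is detected. The distance-based interaction test, threshold, and names are assumptions, not the claimed method.

```python
# Illustrative hand-off between tracking methods: once the primary object
# interacts with the secondary object, its position follows the secondary tracker.
from dataclasses import dataclass

Vec3 = tuple  # hypothetical stand-in for an (x, y, z) vector type

@dataclass
class TrackedObject:
    name: str
    position: Vec3

def distance(a: Vec3, b: Vec3) -> float:
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def update_primary_position(primary: TrackedObject, secondary: TrackedObject,
                            use_secondary_tracking: bool,
                            grab_threshold: float = 0.05) -> bool:
    """Return True once tracking of the primary object has transitioned to the
    secondary tracking method (e.g., after the hand grabs the controller)."""
    if not use_secondary_tracking and distance(primary.position, secondary.position) < grab_threshold:
        use_secondary_tracking = True          # interaction detected: switch methods
    if use_secondary_tracking:
        primary.position = secondary.position  # hand now follows the controller's tracker
    return use_secondary_tracking

hand = TrackedObject("hand", (0.00, 1.20, 0.40))              # primary method: cameras
controller = TrackedObject("controller", (0.02, 1.21, 0.41))  # secondary method: controller sensors
print(update_primary_position(hand, controller, use_secondary_tracking=False))  # True: hand-off occurs
```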
  • Publication number: 20210406529
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
    Type: Application
    Filed: September 13, 2021
    Publication date: December 30, 2021
    Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
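The two-part "cast" gesture in publication 20210406529's abstract (a pinch plus backward pull, then a forward motion ending in release) can be sketched as a simple check over hand-tracking frames. The following is an illustrative approximation under assumed axis conventions, frame data, and thresholds, not the filed method.

```python
# Hypothetical two-phase "cast" detector: pinch + backward motion arms the cast,
# forward motion with release fires it, and the target is projected from the pull.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HandFrame:
    position: tuple       # (x, y, z) of the pinching hand
    pinching: bool        # whether the grab/pinch pose is currently held

def detect_cast(frames: List[HandFrame]) -> Optional[tuple]:
    """Return a target location once a pinch plus backward pull is followed by a
    forward motion that ends with the pinch being released."""
    pulled_back = False
    trajectory = []
    for prev, cur in zip(frames, frames[1:]):
        dz = cur.position[2] - prev.position[2]   # assumed convention: +z is toward the body
        if cur.pinching and dz > 0:
            pulled_back = True                    # first portion: grab and move backward
            trajectory.append(cur.position)
        elif pulled_back and not cur.pinching and dz < 0:
            return infer_target(trajectory)       # second portion: forward motion and release
    return None

def infer_target(trajectory: List[tuple]) -> tuple:
    """Project a target location from the backward-motion trajectory (greatly simplified)."""
    start, end = trajectory[0], trajectory[-1]
    return tuple(s - (e - s) for s, e in zip(start, end))  # mirror the pull past its start point

frames = [HandFrame((0.0, 1.0, 0.0), True), HandFrame((0.0, 1.0, 0.1), True),
          HandFrame((0.0, 1.0, 0.2), True), HandFrame((0.0, 1.0, -0.1), False)]
print(detect_cast(frames))  # a point projected forward of where the pull began
```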
  • Patent number: 11157725
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
    Type: Grant
    Filed: March 16, 2020
    Date of Patent: October 26, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
  • Publication number: 20210081050
    Abstract: Disclosed herein are a system and a method for controlling a virtual reality based on a physical object. In one aspect, a shape of a hand of a user corresponding to a surface or a structure of a physical object is detected. In one aspect, according to the detected shape of the hand, an interactive feature for the surface or the structure of the physical object is generated in a virtual reality or augmented reality application. In one aspect, a user interaction with the interactive feature is detected. In one aspect, an action of the virtual reality or augmented reality application is initiated in response to detecting the user interaction with the interactive feature.
    Type: Application
    Filed: June 4, 2020
    Publication date: March 18, 2021
    Inventors: Qian Zhou, Kenrick Cheng-Kuo Kin
  • Patent number: 10896545
    Abstract: A system includes a near eye display (NED) comprising a substantially transparent electronic display that is configured to display images in accordance with display instructions, and an imaging device configured to capture one or more images of a portion of a local area surrounding the NED. The system further includes a controller configured to determine a position of an object within the local area using the captured one or more images and location information associated with the object. The controller accesses supplemental information regarding the object, and updates the display instructions to cause the substantially transparent electronic display to display at least a portion of the supplemental information about the object. The display of the at least a portion of the supplemental information is positioned within a threshold distance of the determined position of the object in an augmented reality environment as presented via the substantially transparent electronic display.
    Type: Grant
    Filed: November 29, 2017
    Date of Patent: January 19, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
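Patent 10896545's abstract specifies that the supplemental information is displayed within a threshold distance of the object's determined position. A minimal sketch of such a placement constraint, with an assumed threshold value and purely illustrative geometry, might look like the following.

```python
# Clamp a label's display position so it stays within a threshold distance
# of the recognized object's determined position (illustrative only).
import math

def place_label(object_pos: tuple, desired_pos: tuple, threshold: float = 0.25) -> tuple:
    """Keep the label within `threshold` meters of the object's determined position."""
    offset = [d - o for d, o in zip(desired_pos, object_pos)]
    dist = math.sqrt(sum(c * c for c in offset))
    if dist <= threshold:
        return desired_pos
    scale = threshold / dist
    return tuple(o + c * scale for o, c in zip(object_pos, offset))

# A label requested 0.5 m above a recognized object is pulled back to 0.25 m away.
print(place_label(object_pos=(1.0, 1.5, 2.0), desired_pos=(1.0, 2.0, 2.0)))  # (1.0, 1.75, 2.0)
```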
  • Patent number: 10824244
    Abstract: The disclosed computer-implemented method may include tracking (1) a position of a primary real-world object within a real-world environment via a primary tracking method, and (2) a position of a secondary real-world object within the real-world environment via a secondary tracking method. The method may further include presenting (1) a primary virtual object at a position within an artificial environment corresponding to the tracked position of the primary real-world object, and (2) a secondary virtual object at a position within the artificial environment corresponding to the tracked position of the secondary real-world object. The method may further include (1) detecting an interaction of the primary real-world object with the secondary real-world object, and (2) transitioning to tracking the position of the primary real-world object via the secondary tracking method. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Grant
    Filed: February 15, 2019
    Date of Patent: November 3, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Kenrick Cheng-Kuo Kin, Maxime Ouellet
  • Patent number: 10783712
    Abstract: A near eye display (NED) system is configured to present an augmented reality environment to a user. In addition, the NED system uses an imaging device to detect individuals within a local area, and identify positions of hands of the individuals from which the NED system may be able to detect one or more performed gestures. In response to detecting a predetermined gesture, the NED system displays, to the user of the NED, one or more virtual objects corresponding to the identified gesture at a location associated with a predetermined portion of the individual's body. The displayed virtual objects may be referred to as “visual flair,” and are used to accentuate, stylize, or provide emphasis to gestures performed by individuals within the local area.
    Type: Grant
    Filed: June 27, 2018
    Date of Patent: September 22, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Albert Peter Hwang, Kenrick Cheng-Kuo Kin
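A hypothetical sketch of the "visual flair" idea in patent 10783712's abstract: a gesture recognized on another person in the local area is mapped to a virtual effect anchored to a predetermined portion of their body. The gesture names, effect names, and lookup table below are invented purely for illustration.

```python
# Illustrative mapping from a detected gesture to a "visual flair" effect
# anchored to a body part of the person who performed it (names assumed).
from typing import Optional, Tuple

FLAIR_BY_GESTURE = {
    "thumbs_up": ("sparkles", "hand"),
    "wave": ("ribbon_trail", "wrist"),
}

def flair_for_gesture(gesture: str, body_positions: dict) -> Optional[Tuple[str, tuple]]:
    """Return (effect name, anchor position) for a detected gesture, if one is defined."""
    if gesture not in FLAIR_BY_GESTURE:
        return None
    effect, body_part = FLAIR_BY_GESTURE[gesture]
    return effect, body_positions.get(body_part)

# A thumbs-up from another person spawns "sparkles" at their hand's tracked position.
print(flair_for_gesture("thumbs_up", {"hand": (0.3, 1.4, 1.0), "wrist": (0.28, 1.35, 1.0)}))
```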
  • Publication number: 20200279104
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
    Type: Application
    Filed: March 16, 2020
    Publication date: September 3, 2020
    Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
  • Patent number: 10739861
    Abstract: A system includes a near eye display (NED) that comprises an optical assembly with an electronic display, an imaging sensor configured to capture images of a user's hands, and an eye imaging sensor configured to capture images of an eye of the user. The system also includes a controller configured to determine eye tracking information using the captured images of the eye, the eye tracking information indicating a gaze orientation, wherein the gaze orientation terminates at a first location. The controller determines that a pose of the user's hand indicates a pinch gesture based on the captured images of the user's hands. The controller also identifies an object in the local area that is at the first location, and updates the display instructions to cause the electronic display to display an indication of a selection of the object in an artificial reality environment.
    Type: Grant
    Filed: January 10, 2018
    Date of Patent: August 11, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
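Patent 10739861's abstract combines an eye-tracked gaze location with a pinch pose to select an object. The sketch below reduces that to a gaze hit plus a thumb-to-index fingertip distance test; the threshold, signatures, and object id are assumptions, not the claimed implementation.

```python
# Illustrative gaze-plus-pinch selection: the gaze ray supplies the location,
# the pinch pose (thumb and index tips close together) confirms the selection.
from typing import Optional

def select_object(gaze_hit: Optional[str], thumb_tip: tuple, index_tip: tuple,
                  pinch_threshold: float = 0.02) -> Optional[str]:
    """Return the id of the gazed-at object when the hand pose is a pinch."""
    gap = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)) ** 0.5
    if gaze_hit is not None and gap < pinch_threshold:
        return gaze_hit   # the display would then show a selection indication
    return None

# The gaze ray terminates on "lamp"; thumb and index tips are ~1 cm apart, so it is selected.
print(select_object("lamp", thumb_tip=(0.10, 1.20, 0.40), index_tip=(0.11, 1.20, 0.40)))
```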
  • Patent number: 10739869
    Abstract: An artificial-reality apparatus may include a wearable band dimensioned to be worn around a portion of a user's hand. The artificial-reality apparatus may also include a primary tactile-input location at an outer surface of the wearable band to facilitate inspecting an artificial-reality element when another hand of the user activates the primary tactile-input location. Additionally, the artificial-reality apparatus may include a secondary tactile-input location at the outer surface of the wearable band to facilitate manipulating the artificial-reality element when the user's other hand simultaneously activates the primary tactile-input location and the secondary tactile-input location. Furthermore, the artificial-reality apparatus may include a computing subsystem contained by the wearable band that communicatively couples the primary tactile-input location and the secondary tactile-input location to an artificial-reality system. Various other apparatuses, systems, and methods are also disclosed.
    Type: Grant
    Filed: May 1, 2018
    Date of Patent: August 11, 2020
    Assignee: Facebook Technologies, LLC
    Inventor: Kenrick Cheng-Kuo Kin
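The wristband in patent 10739869's abstract distinguishes activating the primary tactile-input location alone (inspect an artificial-reality element) from activating the primary and secondary locations simultaneously (manipulate it). A toy state function under those assumptions:

```python
# Hypothetical mapping from the band's two tactile-input locations to actions.
def band_action(primary_pressed: bool, secondary_pressed: bool) -> str:
    """Map the band's tactile inputs to an artificial-reality action."""
    if primary_pressed and secondary_pressed:
        return "manipulate_element"   # both locations activated simultaneously
    if primary_pressed:
        return "inspect_element"      # primary tactile-input location alone
    return "idle"

print(band_action(primary_pressed=True, secondary_pressed=False))  # inspect_element
print(band_action(primary_pressed=True, secondary_pressed=True))   # manipulate_element
```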
  • Patent number: 10712901
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user. In some embodiments, multiple users may be in a local area, each using a different NED. A first user may view virtual content using a first NED. The first NED may comprise an imaging device capable of capturing images of the local area, allowing the first NED to identify gestures performed by the first user and/or by other users in the local area. In some embodiments, the first NED may, in response to detecting one or more predetermined gestures performed by the first user, share virtual content displayed to the first user with a second user using a second NED, allowing the second user to view the virtual content through the second NED.
    Type: Grant
    Filed: June 27, 2018
    Date of Patent: July 14, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Albert Peter Hwang, Daniel Andersen, Kenrick Cheng-Kuo Kin
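As a rough illustration of the sharing behavior in patent 10712901's abstract, the sketch below marks a content item as visible to a second NED user once the first user performs a predetermined gesture. The gesture name and the visibility data structure are invented for illustration only.

```python
# Illustrative gesture-triggered content sharing between two NED users.
class SharedScene:
    """Tracks which users' NEDs are allowed to render each content item."""
    def __init__(self):
        self.visibility = {}   # content_id -> set of user ids permitted to view it

    def share_on_gesture(self, gesture: str, content_id: str, owner: str, recipient: str) -> None:
        """If the owner performs the predetermined share gesture, let the recipient's NED render the item."""
        if gesture == "offer_palm_up":   # assumed predetermined gesture
            self.visibility.setdefault(content_id, {owner}).add(recipient)

scene = SharedScene()
scene.share_on_gesture("offer_palm_up", content_id="photo_42", owner="user_a", recipient="user_b")
print(scene.visibility)  # e.g. {'photo_42': {'user_a', 'user_b'}}
```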
  • Publication number: 20200159337
    Abstract: The disclosed computer-implemented method may include tracking (1) a position of a primary real-world object within a real-world environment via a primary tracking method, and (2) a position of a secondary real-world object within the real-world environment via a secondary tracking method. The method may further include presenting (1) a primary virtual object at a position within an artificial environment corresponding to the tracked position of the primary real-world object, and (2) a secondary virtual object at a position within the artificial environment corresponding to the tracked position of the secondary real-world object. The method may further include (1) detecting an interaction of the primary real-world object with the secondary real-world object, and (2) transitioning to tracking the position of the primary real-world object via the secondary tracking method. Various other methods, systems, and computer-readable media are also disclosed.
    Type: Application
    Filed: February 15, 2019
    Publication date: May 21, 2020
    Inventors: Kenrick Cheng-Kuo Kin, Maxime Ouellet
  • Patent number: 10635895
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
    Type: Grant
    Filed: June 27, 2018
    Date of Patent: April 28, 2020
    Assignee: Facebook Technologies, LLC
    Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
  • Publication number: 20200005539
    Abstract: A near eye display (NED) system is configured to present an augmented reality environment to a user. In addition, the NED system uses an imaging device to detect individuals within a local area, and identify positions of hands of the individuals from which the NED system may be able to detect one or more performed gestures. In response to detecting a predetermined gesture, the NED system displays, to the user of the NED, one or more virtual objects corresponding to the identified gesture at a location associated with a predetermined portion of the individual's body. The displayed virtual objects may be referred to as “visual flair,” and are used to accentuate, stylize, or provide emphasis to gestures performed by individuals within the local area.
    Type: Application
    Filed: June 27, 2018
    Publication date: January 2, 2020
    Inventors: Albert Peter Hwang, Kenrick Cheng-Kuo Kin
  • Publication number: 20200005026
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
    Type: Application
    Filed: June 27, 2018
    Publication date: January 2, 2020
    Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
  • Publication number: 20200004401
    Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user. In some embodiments, multiple users may be in a local area, each using a different NED. A first user may view virtual content using a first NED. The first NED may comprise an imaging device capable of capturing images of the local area, allowing the first NED to identify gestures performed by the first user and/or by other users in the local area. In some embodiments, the first NED may, in response to detecting one or more predetermined gestures performed by the first user, share virtual content displayed to the first user with a second user using a second NED, allowing the second user to view the virtual content through the second NED.
    Type: Application
    Filed: June 27, 2018
    Publication date: January 2, 2020
    Inventors: Albert Peter Hwang, Daniel Andersen, Kenrick Cheng-Kuo Kin
  • Publication number: 20190212828
    Abstract: A system includes a near eye display (NED) that is configured to display images in accordance with display instructions. The system also includes an imaging sensor configured to capture images. The system further includes a controller configured to identify an object in the captured images using one or more recognition patterns, and to determine a pose of a user's hand based on the captured images, with the determined pose indicating a touch gesture with the identified object. The controller also updates the display instructions to cause the electronic display to display a virtual menu in an artificial reality environment, with the virtual menu positioned within a threshold distance of the position of the object in the artificial reality environment.
    Type: Application
    Filed: January 10, 2018
    Publication date: July 11, 2019
    Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
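Publication 20190212828's abstract places a virtual menu within a threshold distance of a recognized object once a touch gesture on it is detected. A minimal sketch of that placement rule, using an assumed fixed offset rather than any method from the filing:

```python
# Illustrative placement of a virtual menu near a recognized, touched object.
from typing import Optional

def menu_placement(object_pos: Optional[tuple], touch_detected: bool,
                   offset: tuple = (0.0, 0.15, 0.0)) -> Optional[tuple]:
    """Place a virtual menu a small, fixed offset away from the touched object."""
    if object_pos is None or not touch_detected:
        return None
    return tuple(p + o for p, o in zip(object_pos, offset))

# Touching a recognized object at (0.5, 1.0, 0.8) opens a menu 15 cm above it.
print(menu_placement((0.5, 1.0, 0.8), touch_detected=True))  # (0.5, 1.15, 0.8)
```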
  • Publication number: 20190212827
    Abstract: A system includes a near eye display (NED) that comprises an optical assembly with an electronic display, an imaging sensor configured to capture images of a user's hands, and an eye imaging sensor configured to capture images of an eye of the user. The system also includes a controller configured to determine eye tracking information using the captured images of the eye, the eye tracking information indicating a gaze orientation, wherein the gaze orientation terminates at a first location. The controller determines that a pose of the user's hand indicates a pinch gesture based on the captured images of the user's hands. The controller also identifies an object in the local area that is at the first location, and updates the display instructions to cause the electronic display to display an indication of a selection of the object in an artificial reality environment.
    Type: Application
    Filed: January 10, 2018
    Publication date: July 11, 2019
    Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
  • Patent number: 10261595
    Abstract: A system includes an electronic display configured to display one or more simulated objects in accordance with display instructions, an imaging sensor configured to capture images of a user's hands, and a console. The console is configured to receive the captured images from the imaging sensor, extract joint information of the user's hands from the captured images, and determine one or more poses based on the extracted joint information. In response to the determined poses indicating the user's index finger positioned orthogonally to the user's thumb, and the thumb within a minimum distance of the index finger, the console detects a directional pad display gesture, and updates the display instructions to cause the electronic display to generate a simulated directional pad adjacent to the user's thumb in a simulated environment that is presented to the user via the electronic display.
    Type: Grant
    Filed: May 19, 2017
    Date of Patent: April 16, 2019
    Assignee: Facebook Technologies, LLC
    Inventor: Kenrick Cheng-kuo Kin
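Patent 10261595's abstract gives a fairly concrete trigger: the index finger positioned orthogonally to the thumb, with the thumb within a minimum distance of the index finger. Below is a hypothetical pose test along those lines; the tolerances, distance value, and input representation are assumptions for illustration.

```python
# Illustrative check for the directional-pad display gesture: index finger
# roughly orthogonal to the thumb, with the two within a minimum distance.
import math

def dpad_gesture(thumb_dir: tuple, index_dir: tuple, thumb_tip: tuple, index_base: tuple,
                 angle_tol_deg: float = 15.0, min_distance: float = 0.06) -> bool:
    """Return True when the pose matches the directional-pad display gesture."""
    dot = sum(a * b for a, b in zip(thumb_dir, index_dir))
    norm = math.sqrt(sum(a * a for a in thumb_dir)) * math.sqrt(sum(b * b for b in index_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    gap = math.sqrt(sum((a - b) ** 2 for a, b in zip(thumb_tip, index_base)))
    return abs(angle - 90.0) <= angle_tol_deg and gap <= min_distance

# An L-shaped thumb/index pose with a ~3 cm gap would trigger the simulated d-pad.
print(dpad_gesture((1, 0, 0), (0, 1, 0), (0.02, 0.0, 0.0), (0.0, 0.02, 0.0)))  # True
```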