Patents by Inventor Kenrick Cheng-kuo Kin
Kenrick Cheng-kuo Kin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11409364
Abstract: Disclosed herein are a system and a method for controlling a virtual reality based on a physical object. In one aspect, a shape of a hand of a user corresponding to a surface or a structure of a physical object is detected. In one aspect, according to the detected shape of the hand, an interactive feature for the surface or the structure of the physical object is generated in a virtual reality or augmented reality application. In one aspect, a user interaction with the interactive feature is detected. In one aspect, an action of the virtual reality or augmented reality application is initiated in response to detecting the user interaction with the interactive feature.
Type: Grant
Filed: June 4, 2020
Date of Patent: August 9, 2022
Assignee: Facebook Technologies, LLC
Inventors: Qian Zhou, Kenrick Cheng-Kuo Kin
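The flow this abstract describes — detect a hand shape on a physical object, generate a matching interactive feature, then trigger an application action on interaction — can be sketched as a small dispatch table. The shape labels, feature names, and actions below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical mapping from a detected hand shape on a physical surface
# to the interactive feature overlaid in the VR/AR application.
SHAPE_TO_FEATURE = {
    "flat_palm_on_surface": "virtual_touchpad",
    "grip_around_edge": "virtual_handle",
    "fingertip_on_corner": "virtual_button",
}


def generate_interactive_feature(detected_shape):
    """Return the interactive feature to attach for a detected hand shape."""
    return SHAPE_TO_FEATURE.get(detected_shape)


def on_user_interaction(feature, interacted):
    """Initiate an application action when the user interacts with the feature."""
    if interacted and feature == "virtual_button":
        return "trigger_button_action"
    return "no_action"
```

A real system would feed `detected_shape` from a hand-tracking model; here it is a plain string to keep the control flow visible.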
-
Patent number: 11256341
Abstract: The disclosed computer-implemented method may include tracking (1) a position of a primary real-world object within a real-world environment via a primary tracking method, and (2) a position of a secondary real-world object within the real-world environment via a secondary tracking method. The method may further include presenting (1) a primary virtual object at a position within an artificial environment corresponding to the tracked position of the primary real-world object, and (2) a secondary virtual object at a position within the artificial environment corresponding to the tracked position of the secondary real-world object. The method may further include (1) detecting an interaction of the primary real-world object with the secondary real-world object, and (2) transitioning to tracking the position of the primary real-world object via the secondary tracking method. Various other methods, systems, and computer-readable media are also disclosed.
Type: Grant
Filed: October 6, 2020
Date of Patent: February 22, 2022
Assignee: Facebook Technologies, LLC
Inventors: Kenrick Cheng-Kuo Kin, Maxime Ouellet
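The tracking hand-off the abstract describes — the primary object starts on its own tracking method and transitions to the secondary object's method once the two interact — can be sketched as follows. The class, the distance-based interaction test, and the method names are assumptions for illustration only.

```python
class TrackedObject:
    """A real-world object with a current tracking method and 3D position."""

    def __init__(self, name, method):
        self.name = name
        self.method = method  # e.g. "optical" (camera) or "imu" (controller)
        self.position = (0.0, 0.0, 0.0)


def detect_interaction(primary, secondary, threshold=0.05):
    """Treat the objects as interacting when their positions nearly coincide."""
    dist = sum((a - b) ** 2 for a, b in zip(primary.position, secondary.position)) ** 0.5
    return dist < threshold


def update_tracking(primary, secondary):
    """Transition the primary object to the secondary tracking method on contact."""
    if detect_interaction(primary, secondary):
        primary.method = secondary.method
    return primary.method
```

For example, a camera-tracked hand that grabs an IMU-tracked controller would, under this sketch, switch to being tracked via the controller's IMU.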
-
Publication number: 20210406529
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
Type: Application
Filed: September 13, 2021
Publication date: December 30, 2021
Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
-
Patent number: 11157725
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
Type: Grant
Filed: March 16, 2020
Date of Patent: October 26, 2021
Assignee: Facebook Technologies, LLC
Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
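The two-part “cast” gesture in this family of filings (pinch plus backward motion, then forward motion with release, with the target taken from the backward trajectory) can be sketched as a simple check over a time series of hand samples. The coordinate convention, thresholds, and function name are assumptions for illustration.

```python
def detect_cast(samples, pinch_flags):
    """Detect a slingshot-style cast gesture.

    samples: list of (x, y, z) hand positions over time, with z increasing
             toward the user's body (an assumed convention).
    pinch_flags: whether the hand was pinching at each sample.
    Returns the cast direction vector, or None if no cast occurred.
    """
    if len(samples) < 3:
        return None
    # First portion: pinching while the hand moves backward (z increasing).
    back = pinch_flags[0] and samples[1][2] > samples[0][2]
    # Second portion: released while the hand moves forward (z decreasing).
    fwd = not pinch_flags[-1] and samples[-1][2] < samples[-2][2]
    if not (back and fwd):
        return None
    # Target direction opposes the backward pull, like releasing a slingshot.
    dx = samples[0][0] - samples[1][0]
    dy = samples[0][1] - samples[1][1]
    dz = samples[0][2] - samples[1][2]
    return (dx, dy, dz)
```

A production system would intersect the returned direction with the scene to find the target location; this sketch stops at the direction itself.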
-
Publication number: 20210081050
Abstract: Disclosed herein are a system and a method for controlling a virtual reality based on a physical object. In one aspect, a shape of a hand of a user corresponding to a surface or a structure of a physical object is detected. In one aspect, according to the detected shape of the hand, an interactive feature for the surface or the structure of the physical object is generated in a virtual reality or augmented reality application. In one aspect, a user interaction with the interactive feature is detected. In one aspect, an action of the virtual reality or augmented reality application is initiated in response to detecting the user interaction with the interactive feature.
Type: Application
Filed: June 4, 2020
Publication date: March 18, 2021
Inventors: Qian Zhou, Kenrick Cheng-Kuo Kin
-
Patent number: 10896545
Abstract: A system includes a near eye display (NED) comprising a substantially transparent electronic display that is configured to display images in accordance with display instructions, and an imaging device configured to capture one or more images of portions of a local area surrounding the NED. The system further includes a controller configured to determine a position of an object within the local area using the captured one or more images and location information associated with the object. The controller accesses supplemental information regarding the object, and updates the display instructions to cause the substantially transparent electronic display to display at least a portion of the supplemental information about the object. The display of the at least a portion of the supplemental information is positioned within a threshold distance of the determined position of the object in an augmented reality environment as presented via the substantially transparent electronic display.
Type: Grant
Filed: November 29, 2017
Date of Patent: January 19, 2021
Assignee: Facebook Technologies, LLC
Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
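The placement constraint in this abstract — supplemental information anchored within a threshold distance of the object's determined position — can be sketched as a clamp onto a sphere around the object. The default offset and threshold values are arbitrary assumptions.

```python
import math


def place_supplemental_info(object_pos, threshold=0.3, offset=(0.2, 0.1, 0.0)):
    """Place an info panel near object_pos, clamped inside the threshold sphere.

    object_pos: (x, y, z) of the object in the AR environment.
    threshold: maximum allowed panel-to-object distance.
    offset: preferred displacement of the panel from the object.
    """
    panel = tuple(o + d for o, d in zip(object_pos, offset))
    dist = math.dist(panel, object_pos)
    if dist > threshold:
        # Pull the panel back onto the threshold sphere around the object.
        scale = threshold / dist
        panel = tuple(o + d * scale for o, d in zip(object_pos, offset))
    return panel
```

When the preferred offset already lies within the threshold, the panel keeps its natural position; otherwise it is scaled radially inward.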
-
Patent number: 10824244
Abstract: The disclosed computer-implemented method may include tracking (1) a position of a primary real-world object within a real-world environment via a primary tracking method, and (2) a position of a secondary real-world object within the real-world environment via a secondary tracking method. The method may further include presenting (1) a primary virtual object at a position within an artificial environment corresponding to the tracked position of the primary real-world object, and (2) a secondary virtual object at a position within the artificial environment corresponding to the tracked position of the secondary real-world object. The method may further include (1) detecting an interaction of the primary real-world object with the secondary real-world object, and (2) transitioning to tracking the position of the primary real-world object via the secondary tracking method. Various other methods, systems, and computer-readable media are also disclosed.
Type: Grant
Filed: February 15, 2019
Date of Patent: November 3, 2020
Assignee: Facebook Technologies, LLC
Inventors: Kenrick Cheng-Kuo Kin, Maxime Ouellet
-
Patent number: 10783712
Abstract: A near eye display (NED) system is configured to present an augmented reality environment to a user. In addition, the NED system uses an imaging device to detect individuals within a local area, and identifies positions of hands of the individuals from which the NED system may be able to detect one or more performed gestures. In response to detecting a predetermined gesture, the NED system displays to a user of the NED one or more virtual objects corresponding to the identified gesture at a location associated with a predetermined portion of the individual's body. The displayed virtual objects may be referred to as “visual flair,” and are used to accentuate, stylize, or provide emphasis to gestures performed by individuals within the local area.
Type: Grant
Filed: June 27, 2018
Date of Patent: September 22, 2020
Assignee: Facebook Technologies, LLC
Inventors: Albert Peter Hwang, Kenrick Cheng-Kuo Kin
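The “visual flair” behavior reduces to a lookup from a recognized gesture to an effect anchored at a body part. The gesture names, effects, and anchor points below are invented for illustration; the patent does not enumerate them.

```python
# Hypothetical table: recognized gesture -> (flair effect, body anchor).
FLAIR_EFFECTS = {
    "thumbs_up": ("sparkles", "hand"),
    "clap": ("burst", "hands"),
    "wave": ("trail", "wrist"),
}


def flair_for_gesture(gesture, person_id):
    """Return a render command for the flair tied to a detected gesture,
    or None when the gesture has no associated flair."""
    if gesture not in FLAIR_EFFECTS:
        return None
    effect, anchor = FLAIR_EFFECTS[gesture]
    return {"person": person_id, "effect": effect, "anchor": anchor}
```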
-
Publication number: 20200279104
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
Type: Application
Filed: March 16, 2020
Publication date: September 3, 2020
Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
-
Patent number: 10739861
Abstract: A system includes a near eye display (NED) that comprises an optical assembly with an electronic display, an imaging sensor configured to capture images of a user's hands, and an eye imaging sensor configured to capture images of an eye of the user. The system also includes a controller configured to determine eye tracking information using the captured images of the eye, the eye tracking information indicating a gaze orientation, wherein the gaze orientation terminates at a first location. The controller determines that a pose of the user's hand indicates a pinch gesture based on the captured images of the user's hands. The controller also identifies an object in the local area that is at the first location, and updates the display instructions to cause the electronic display to display an indication of a selection of the object in an artificial reality environment.
Type: Grant
Filed: January 10, 2018
Date of Patent: August 11, 2020
Assignee: Facebook Technologies, LLC
Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
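The selection logic combines two signals: the gaze ray's termination point nominates a candidate object, and a pinch pose confirms the selection. A minimal sketch, assuming a dictionary of object positions and a distance tolerance (both invented for illustration):

```python
import math


def select_object(gaze_point, objects, is_pinching, tolerance=0.15):
    """Return the id of the object nearest the gaze termination point,
    but only while the hand pose indicates a pinch.

    gaze_point: (x, y, z) where the gaze orientation terminates.
    objects: {object_id: (x, y, z)} positions in the local area.
    """
    if not is_pinching:
        return None
    best_id, best_dist = None, tolerance
    for obj_id, pos in objects.items():
        d = math.dist(gaze_point, pos)
        if d < best_dist:
            best_id, best_dist = obj_id, d
    return best_id
```

Gaze alone hovers; the pinch acts as the commit, which is why the function returns nothing while `is_pinching` is false.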
-
Patent number: 10739869
Abstract: An artificial-reality apparatus may include a wearable band dimensioned to be worn around a portion of a user's hand. The artificial-reality apparatus may also include a primary tactile-input location at an outer surface of the wearable band to facilitate inspecting an artificial-reality element when another hand of the user activates the primary tactile-input location. Additionally, the artificial-reality apparatus may include a secondary tactile-input location at the outer surface of the wearable band to facilitate manipulating the artificial-reality element when the user's other hand simultaneously activates the primary tactile-input location and the secondary tactile-input location. Furthermore, the artificial-reality apparatus may include a computing subsystem contained by the wearable band that communicatively couples the primary tactile-input location and the secondary tactile-input location to an artificial-reality system. Various other apparatuses, systems, and methods are also disclosed.
Type: Grant
Filed: May 1, 2018
Date of Patent: August 11, 2020
Assignee: Facebook Technologies, LLC
Inventor: Kenrick Cheng-Kuo Kin
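The band's two-input logic as described is a small state mapping: the primary location alone puts the element in an inspect mode, and activating both locations simultaneously puts it in a manipulate mode. A minimal sketch with assumed mode names:

```python
def band_action(primary_pressed, secondary_pressed):
    """Map the band's two tactile-input locations to interaction modes.

    Per the described behavior, the secondary location only has effect
    while the primary location is also active.
    """
    if primary_pressed and secondary_pressed:
        return "manipulate_element"
    if primary_pressed:
        return "inspect_element"
    return "idle"
```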
-
Patent number: 10712901
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user. In some embodiments, multiple users may be in a local area, each using a different NED. A first user of a first NED may view virtual content using a first NED. The first NED may comprise an imaging device capable of capturing images of the local area, allowing the first NED to identify gestures performed by the first user and/or by other users in the local area. In some embodiments, the first NED may, in response to detecting one or more predetermined gestures performed by the first user, share virtual content displayed to the first user with a second user using a second NED, allowing the second user to view the virtual content through the second NED.
Type: Grant
Filed: June 27, 2018
Date of Patent: July 14, 2020
Assignee: Facebook Technologies, LLC
Inventors: Albert Peter Hwang, Daniel Andersen, Kenrick Cheng-Kuo Kin
-
Publication number: 20200159337
Abstract: The disclosed computer-implemented method may include tracking (1) a position of a primary real-world object within a real-world environment via a primary tracking method, and (2) a position of a secondary real-world object within the real-world environment via a secondary tracking method. The method may further include presenting (1) a primary virtual object at a position within an artificial environment corresponding to the tracked position of the primary real-world object, and (2) a secondary virtual object at a position within the artificial environment corresponding to the tracked position of the secondary real-world object. The method may further include (1) detecting an interaction of the primary real-world object with the secondary real-world object, and (2) transitioning to tracking the position of the primary real-world object via the secondary tracking method. Various other methods, systems, and computer-readable media are also disclosed.
Type: Application
Filed: February 15, 2019
Publication date: May 21, 2020
Inventors: Kenrick Cheng-Kuo Kin, Maxime Ouellet
-
Patent number: 10635895
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
Type: Grant
Filed: June 27, 2018
Date of Patent: April 28, 2020
Assignee: Facebook Technologies, LLC
Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
-
Publication number: 20200005539
Abstract: A near eye display (NED) system is configured to present an augmented reality environment to a user. In addition, the NED system uses an imaging device to detect individuals within a local area, and identifies positions of hands of the individuals from which the NED system may be able to detect one or more performed gestures. In response to detecting a predetermined gesture, the NED system displays to a user of the NED one or more virtual objects corresponding to the identified gesture at a location associated with a predetermined portion of the individual's body. The displayed virtual objects may be referred to as “visual flair,” and are used to accentuate, stylize, or provide emphasis to gestures performed by individuals within the local area.
Type: Application
Filed: June 27, 2018
Publication date: January 2, 2020
Inventors: Albert Peter Hwang, Kenrick Cheng-Kuo Kin
-
Publication number: 20200005026
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user and for manipulating displayed content items based upon gestures performed by users of the NED system. A user of the NED system may perform a gesture simulating the throwing of an object to “cast” a content item to a target location in an artificial reality (AR) environment displayed by the NED system. The gesture may comprise a first portion in which the user's hand “grabs” or “pinches” a virtual object corresponding to the content item and moves backwards relative to their body, and a second portion in which the user's hand moves forwards relative to their body and releases the virtual object. The target location may be identified based upon a trajectory associated with the backwards motion of the first portion of the gesture.
Type: Application
Filed: June 27, 2018
Publication date: January 2, 2020
Inventors: Daniel Andersen, Albert Peter Hwang, Kenrick Cheng-Kuo Kin
-
Publication number: 20200004401
Abstract: Embodiments are directed to a near eye display (NED) system for displaying artificial reality content to a user. In some embodiments, multiple users may be in a local area, each using a different NED. A first user of a first NED may view virtual content using a first NED. The first NED may comprise an imaging device capable of capturing images of the local area, allowing the first NED to identify gestures performed by the first user and/or by other users in the local area. In some embodiments, the first NED may, in response to detecting one or more predetermined gestures performed by the first user, share virtual content displayed to the first user with a second user using a second NED, allowing the second user to view the virtual content through the second NED.
Type: Application
Filed: June 27, 2018
Publication date: January 2, 2020
Inventors: Albert Peter Hwang, Daniel Andersen, Kenrick Cheng-Kuo Kin
-
Publication number: 20190212828
Abstract: A system includes a near eye display (NED) that is configured to display images in accordance with display instructions. The system also includes an imaging sensor configured to capture images. The system further includes a controller configured to identify an object in the captured images using one or more recognition patterns, and to determine a pose of the user's hand based on the captured images, with the determined pose indicating a touch gesture with the identified object. The controller also updates the display instructions to cause the electronic display to display a virtual menu in an artificial reality environment, with the virtual menu within a threshold distance of the position of the object in the artificial reality environment.
Type: Application
Filed: January 10, 2018
Publication date: July 11, 2019
Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
-
Publication number: 20190212827
Abstract: A system includes a near eye display (NED) that comprises an optical assembly with an electronic display, an imaging sensor configured to capture images of a user's hands, and an eye imaging sensor configured to capture images of an eye of the user. The system also includes a controller configured to determine eye tracking information using the captured images of the eye, the eye tracking information indicating a gaze orientation, wherein the gaze orientation terminates at a first location. The controller determines that a pose of the user's hand indicates a pinch gesture based on the captured images of the user's hands. The controller also identifies an object in the local area that is at the first location, and updates the display instructions to cause the electronic display to display an indication of a selection of the object in an artificial reality environment.
Type: Application
Filed: January 10, 2018
Publication date: July 11, 2019
Inventors: Kenrick Cheng-kuo Kin, Albert Peter Hwang
-
Patent number: 10261595
Abstract: A system includes an electronic display configured to display one or more simulated objects in accordance with display instructions, an imaging sensor configured to capture images of a user's hands, and a console. The console is configured to receive the captured images from the imaging sensor, extract joint information of the user's hands from the captured images, and determine one or more poses based on the extracted joint information. In response to the determined poses indicating the user's index finger positioned orthogonally to the user's thumb, and the thumb within a minimum distance of the index finger, the console detects a directional pad display gesture, and updates the display instructions to cause the electronic display to generate a simulated directional pad adjacent to the user's thumb in a simulated environment that is presented to the user via the electronic display.
Type: Grant
Filed: May 19, 2017
Date of Patent: April 16, 2019
Assignee: Facebook Technologies, LLC
Inventor: Kenrick Cheng-kuo Kin
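The pose test this abstract describes — index finger roughly orthogonal to the thumb, with the thumb within a minimum distance of the index finger — can be sketched with a dot product for the angle and a Euclidean distance for the proximity check. The direction-vector inputs, angle tolerance, and distance threshold are assumptions for illustration.

```python
import math


def detect_dpad_gesture(index_dir, thumb_dir, thumb_tip, index_base,
                        angle_tol_deg=15.0, min_distance=0.04):
    """Detect the directional-pad display pose from extracted joint data.

    index_dir / thumb_dir: unit direction vectors of the two digits.
    thumb_tip / index_base: 3D joint positions used for the proximity check.
    """
    # Angle between the digits via the dot product, clamped for acos safety.
    dot = sum(a * b for a, b in zip(index_dir, thumb_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    orthogonal = abs(angle - 90.0) <= angle_tol_deg
    # Thumb must be within the minimum distance of the index finger.
    close = math.dist(thumb_tip, index_base) <= min_distance
    return orthogonal and close
```

When the function returns true, the system would render the simulated directional pad adjacent to the thumb; that rendering step is outside this sketch.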