Patents by Inventor Simon Charles Tickner
Simon Charles Tickner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12099693
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
Type: Grant
Filed: April 12, 2022
Date of Patent: September 24, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
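A minimal sketch of how the pinch-and-pull gesture this abstract describes might be detected from per-frame hand-tracking data. The HandFrame fields, distance thresholds, and detector class are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass
from math import dist

PINCH_DIST = 0.02   # meters between fingertips that counts as a pinch (assumed)
PULL_DIST = 0.10    # meters of travel while pinched that counts as a pull (assumed)

@dataclass
class HandFrame:
    thumb_tip: tuple   # (x, y, z) in meters from a hand-tracking source (assumed)
    index_tip: tuple

class PinchPullDetector:
    """Fires when two fingers form a pinch and then pull while still pinched."""
    def __init__(self):
        self.pinch_start = None  # midpoint where the pinch began

    def update(self, frame: HandFrame) -> bool:
        midpoint = tuple((a + b) / 2 for a, b in zip(frame.thumb_tip, frame.index_tip))
        pinched = dist(frame.thumb_tip, frame.index_tip) < PINCH_DIST
        if not pinched:
            self.pinch_start = None          # pinch released: reset
            return False
        if self.pinch_start is None:
            self.pinch_start = midpoint      # pinch just formed: remember where
            return False
        # still pinched: fire once the hand has pulled far enough from the start
        return dist(midpoint, self.pinch_start) >= PULL_DIST
```

A UI engine would generate the input element on the frame where `update` first returns True, as the abstract's gesture detector / UI engine split suggests.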
-
Patent number: 11422669
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises a head-mounted display configured to output artificial reality content; a stylus; a stylus action detector configured to detect movement of the stylus, detect a stylus selection action, and after detecting the stylus selection action, detect further movement of the stylus; a UI engine configured to generate stylus movement content in response to detecting movement of the stylus, and generate a UI input element in response to detecting the stylus selection action; and a rendering engine configured to render the stylus movement content and the UI input element as overlays to the artificial reality content, and update the stylus movement content based on the further movement of the stylus.
Type: Grant
Filed: June 7, 2019
Date of Patent: August 23, 2022
Assignee: FACEBOOK TECHNOLOGIES, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
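A rough sketch of the staged stylus flow this abstract describes: movement rendered as stylus movement content, a selection action that produces a UI input element, and further movement that keeps updating the content. The class, method names, and the button-press reading of the "selection action" are assumptions for illustration.

```python
class StylusFlow:
    """Staged stylus interaction: track movement, react to a selection action,
    then keep updating the rendered content with further movement."""
    def __init__(self):
        self.path = []              # stylus positions rendered as movement content
        self.input_element = None   # created when the selection action is detected

    def on_stylus_move(self, position):
        # before or after the selection action, movement extends the rendered content
        self.path.append(position)

    def on_stylus_selection(self):
        # e.g. a press of a stylus button (assumed); the UI engine would anchor
        # the generated input element at the current stylus position
        if self.input_element is None and self.path:
            self.input_element = {"anchor": self.path[-1]}
```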
-
Publication number: 20220244834
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
Type: Application
Filed: April 12, 2022
Publication date: August 4, 2022
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 11334212
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
Type: Grant
Filed: June 7, 2019
Date of Patent: May 17, 2022
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 11086475
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device configured to capture image data representative of a physical environment, a head-mounted display (HMD) configured to output artificial reality content, a gesture detector configured to identify, from the image data, a gesture comprising a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that a thumb of the hand and at least one other finger form approximately a circle or approximately a circular segment, a user interface (UI) engine to generate a UI element in response to the identified gesture, and a rendering engine to render the UI element as an overlay to the artificial reality content.
Type: Grant
Filed: June 7, 2019
Date of Patent: August 10, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
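One way the stationary "thumb and finger form approximately a circle" trigger could be approximated: a dwell timer plus a thumb-to-fingertip proximity check standing in for true circle/segment classification. The thresholds and the proximity heuristic are assumptions, not the patent's method.

```python
import time
from math import dist

HOLD_SECONDS = 0.5      # how long the hand must stay substantially still (assumed)
STILL_TOLERANCE = 0.02  # meters of allowed drift while "stationary" (assumed)
TOUCH_DIST = 0.025      # thumb tip near another fingertip closes the "circle" (assumed)

class CircleGestureDetector:
    """Fires when the hand is roughly stationary for a threshold period while the
    thumb tip and another fingertip nearly touch, approximating a circle."""
    def __init__(self):
        self.anchor = None      # (palm position, timestamp) when the hold started

    def update(self, palm_pos, thumb_tip, finger_tip, now=None) -> bool:
        now = time.monotonic() if now is None else now
        if self.anchor is None or dist(palm_pos, self.anchor[0]) > STILL_TOLERANCE:
            self.anchor = (palm_pos, now)      # hand moved: restart the hold timer
            return False
        held_long_enough = now - self.anchor[1] >= HOLD_SECONDS
        circle_formed = dist(thumb_tip, finger_tip) < TOUCH_DIST
        return held_long_enough and circle_formed
```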
-
Patent number: 11043192
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector identifies, from the image data, a gesture including a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that an index finger and a thumb of the hand form approximately a right angle. The UI engine generates a UI element in response to the identified gesture. The rendering engine renders the UI element as an overlay to the artificial reality content.
Type: Grant
Filed: June 7, 2019
Date of Patent: June 22, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
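A sketch of the "index finger and thumb form approximately a right angle" check from this abstract, using the angle between two direction vectors. The tolerance value and the separate handling of the stationarity check are assumptions.

```python
import math

ANGLE_TOLERANCE_DEG = 15.0   # how close to 90 degrees counts as "approximately" (assumed)

def angle_between(u, v) -> float:
    """Angle in degrees between two 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def is_corner_pose(thumb_dir, index_dir) -> bool:
    """True when the thumb and index finger directions form roughly a right angle.
    The abstract's stationarity-over-a-threshold-time condition would be checked
    separately, e.g. with a dwell timer over the hand position."""
    return abs(angle_between(thumb_dir, index_dir) - 90.0) <= ANGLE_TOLERANCE_DEG
```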
-
Patent number: 11003307
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a user interface (UI) engine, and a rendering engine. The image capture device is configured to capture image data representative of a physical environment. The HMD is configured to output artificial reality content including a representation of a wrist. The rendering engine is configured to render a user interface (UI) element. The gesture detector is configured to identify a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at the location of the UI element, and a pulling motion away from the wrist while in the gripping configuration.
Type: Grant
Filed: June 7, 2019
Date of Patent: May 11, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
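A sketch of the grip-at-element-then-pull-away-from-wrist detection described above. The grip radius, pull distance, and the way the "gripping" state is supplied by the caller are assumptions for illustration.

```python
from math import dist

GRIP_RADIUS = 0.04   # digits must close this near the UI element to grip it (assumed)
PULL_DIST = 0.12     # how much farther from the wrist the grip must travel (assumed)

class GripPullDetector:
    """Grip formed at the UI element's location, then pulled away from the wrist."""
    def __init__(self, ui_element_pos):
        self.ui_element_pos = ui_element_pos
        self.grip_start_wrist_dist = None

    def update(self, gripping: bool, grip_pos, wrist_pos) -> bool:
        if not gripping:
            self.grip_start_wrist_dist = None     # grip released: reset
            return False
        if self.grip_start_wrist_dist is None:
            # only start tracking if the grip formed on the UI element itself
            if dist(grip_pos, self.ui_element_pos) > GRIP_RADIUS:
                return False
            self.grip_start_wrist_dist = dist(grip_pos, wrist_pos)
            return False
        # fire once the gripped hand has moved far enough away from the wrist
        return dist(grip_pos, wrist_pos) - self.grip_start_wrist_dist >= PULL_DIST
```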
-
Patent number: 10990240
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs artificial reality content. The artificial reality system renders a container that includes application content items as an overlay to the artificial reality content. The artificial reality system identifies, from the image data, a selection gesture comprising a configuration of a hand that is substantially stationary for a threshold period of time at a first location corresponding to a first application content item within the container, and a subsequent movement of the hand from the first location to a second location outside the container. The artificial reality system renders the first application content item at the second location in response.
Type: Grant
Filed: June 7, 2019
Date of Patent: April 27, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
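A sketch of the dwell-to-select and drag-out-of-container interaction this abstract describes. The dwell time, hit radius, and the container-membership callable are assumptions; the real system works from image data rather than pre-resolved positions.

```python
import time
from math import dist

HOLD_SECONDS = 0.4   # dwell time over an item to select it (assumed)
HIT_RADIUS = 0.05    # how close the hand must be to an item to select it (assumed)

class ContainerDragDetector:
    """Dwell on an item inside the container, then move the hand outside the
    container to place the item at the new location."""
    def __init__(self, item_positions, container_contains):
        self.item_positions = item_positions          # {item_id: (x, y, z)}
        self.container_contains = container_contains  # callable: position -> bool
        self.hold_start = None                        # (item_id, dwell start time)
        self.selected = None

    def update(self, hand_pos, now=None):
        now = time.monotonic() if now is None else now
        if self.selected is None:
            hit = next((i for i, p in self.item_positions.items()
                        if dist(hand_pos, p) < HIT_RADIUS), None)
            if hit is None:
                self.hold_start = None                 # not over any item: reset dwell
            elif self.hold_start is None or self.hold_start[0] != hit:
                self.hold_start = (hit, now)           # started dwelling on a new item
            elif now - self.hold_start[1] >= HOLD_SECONDS:
                self.selected = hit                    # dwell complete: item is selected
            return None
        if not self.container_contains(hand_pos):
            placed, self.selected = (self.selected, hand_pos), None
            return placed                              # (item_id, position outside container)
        return None
```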
-
Patent number: 10955929
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs the artificial reality content. The artificial reality system identifies, from the image data, a gesture comprising a motion of a first digit of a hand and a second digit of the hand to form a pinching configuration a particular number of times within a threshold amount of time. The artificial reality system assigns one or more input characters to one or more of a plurality of digits of the hand and processes a selection of a first input character of the one or more input characters assigned to the second digit of the hand in response to the identified gesture.
Type: Grant
Filed: June 7, 2019
Date of Patent: March 23, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
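A simplified sketch of the pinch-count text-entry scheme this abstract describes: the number of thumb pinches against a given digit within a time window picks one of the characters assigned to that digit. The character assignment, window length, and the fact that it returns the currently implied character on every pinch (rather than committing when the window lapses) are all assumptions.

```python
import time

TAP_WINDOW = 0.8   # seconds within which repeated pinches count together (assumed)

# illustrative character assignment: each non-thumb digit carries a few characters
DIGIT_CHARS = {
    "index":  ["a", "b", "c"],
    "middle": ["d", "e", "f"],
    "ring":   ["g", "h", "i"],
    "pinky":  ["j", "k", "l"],
}

class PinchCountTyper:
    """Selects a character by how many times the thumb pinches a given digit
    within the time window."""
    def __init__(self):
        self.taps = []   # (digit_name, timestamp) of recent pinch events

    def on_pinch(self, digit: str, now=None) -> str:
        now = time.monotonic() if now is None else now
        # keep only recent taps on the same digit; other digits start a new sequence
        self.taps = [(d, t) for d, t in self.taps
                     if d == digit and now - t <= TAP_WINDOW]
        self.taps.append((digit, now))
        chars = DIGIT_CHARS[digit]
        return chars[(len(self.taps) - 1) % len(chars)]
```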
-
Patent number: 10921879
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content, the artificial reality content including an assistant element. The gesture detector identifies, from the image data, a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at a location that corresponds to the assistant element, and subsequent to the gripping motion, a throwing motion of the hand with respect to the assistant element. The UI engine generates a UI element in response to identifying the gesture.
Type: Grant
Filed: June 7, 2019
Date of Patent: February 16, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
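A sketch of the grip-then-throw gesture described above, reading the "throwing motion" as a release while the hand is moving fast. The grip radius, speed threshold, and the caller-supplied gripping flag are assumptions.

```python
from math import dist

GRIP_RADIUS = 0.05    # grip must form this close to the assistant element (assumed)
THROW_SPEED = 1.0     # release speed in m/s that counts as a throw (assumed)

class GripThrowDetector:
    """Grip formed at the assistant element, then a throwing motion: a release
    while the hand is moving quickly."""
    def __init__(self, assistant_pos):
        self.assistant_pos = assistant_pos
        self.holding = False
        self.last = None   # (position, timestamp) from the previous frame

    def update(self, gripping: bool, hand_pos, t: float) -> bool:
        speed = 0.0
        if self.last is not None and t > self.last[1]:
            speed = dist(hand_pos, self.last[0]) / (t - self.last[1])
        self.last = (hand_pos, t)
        if gripping:
            if not self.holding and dist(hand_pos, self.assistant_pos) <= GRIP_RADIUS:
                self.holding = True      # grip started on the assistant element
            return False
        threw = self.holding and speed >= THROW_SPEED
        self.holding = False             # grip released
        return threw                     # True would prompt the UI engine to respond
```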
-
Patent number: 10890983
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system can include a menu that can be activated and interacted with using one hand. In response to detecting a menu activation gesture performed using one hand, the artificial reality system can cause a menu to be rendered. A menu sliding gesture (e.g., horizontal motion) of the hand can be used to cause a slidably engageable user interface (UI) element to move along a horizontal dimension of the menu while horizontal positioning of the UI menu is held constant. Motion of the hand orthogonal to the menu sliding gesture (e.g., non-horizontal motion) can cause the menu to be repositioned. The implementation of the artificial reality system does not require use of both hands or use of other input devices in order to interact with the artificial reality system.
Type: Grant
Filed: June 7, 2019
Date of Patent: January 12, 2021
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
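One way to realize the slide-versus-reposition split this abstract describes: decompose the hand's motion into the component along the menu's sliding axis (drives the slidably engageable element) and the orthogonal remainder (repositions the menu). Vector representation and function name are assumptions.

```python
import math

def split_menu_motion(delta, menu_axis):
    """Splits a hand motion `delta` into the scalar component along the menu's
    sliding axis and the orthogonal remainder. Both arguments are (x, y, z) tuples;
    `menu_axis` must be non-zero."""
    norm = math.sqrt(sum(a * a for a in menu_axis))
    axis = tuple(a / norm for a in menu_axis)
    along = sum(d * a for d, a in zip(delta, axis))            # amount to slide the UI element
    ortho = tuple(d - along * a for d, a in zip(delta, axis))  # offset to reposition the menu
    return along, ortho

# usage sketch: slide, offset = split_menu_motion(hand_delta, menu_horizontal_axis)
# apply `slide` to the slider position and `offset` to the menu's placement
```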
-
Publication number: 20200387286
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a user interface (UI) engine, and a rendering engine. The image capture device is configured to capture image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector is configured to identify, from the image data, a gesture including a configuration of a wrist that is substantially stationary for at least a threshold period of time and positioned such that a normal from the wrist is facing the HMD. The UI engine is configured to generate a UI element in response to the identified gesture. The rendering engine renders the UI element overlaid on an image of the wrist.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
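A sketch of the "wrist normal facing the HMD" condition from this abstract, using the cosine of the angle between the wrist normal and the direction to the HMD. The threshold is an assumption, and the stationarity check would be layered on top.

```python
import math

FACING_THRESHOLD = 0.8   # cosine of the wrist-normal-to-HMD angle that counts as "facing" (assumed)

def wrist_faces_hmd(wrist_pos, wrist_normal, hmd_pos) -> bool:
    """True when the wrist's normal points roughly toward the HMD. The abstract's
    stationary-for-a-threshold-period condition would be checked separately."""
    to_hmd = tuple(h - w for h, w in zip(hmd_pos, wrist_pos))
    n_len = math.sqrt(sum(a * a for a in wrist_normal))
    d_len = math.sqrt(sum(a * a for a in to_hmd))
    cos_angle = sum(a * b for a, b in zip(wrist_normal, to_hmd)) / (n_len * d_len)
    return cos_angle >= FACING_THRESHOLD
```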
-
Publication number: 20200387228
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system can include a menu that can be activated and interacted with using one hand. In response to detecting a menu activation gesture performed using one hand, the artificial reality system can cause a menu to be rendered. A menu sliding gesture (e.g., horizontal motion) of the hand can be used to cause a slidably engageable user interface (UI) element to move along a horizontal dimension of the menu while horizontal positioning of the UI menu is held constant. Motion of the hand orthogonal to the menu sliding gesture (e.g., non-horizontal motion) can cause the menu to be repositioned.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Publication number: 20200387213
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content, the artificial reality content including an assistant element. The gesture detector identifies, from the image data, a gesture that includes a gripping motion of two or more digits of a hand to form a gripping configuration at a location that corresponds to the assistant element, and subsequent to the gripping motion, a throwing motion of the hand with respect to the assistant element. The UI engine generates a UI element in response to identifying the gesture.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Publication number: 20200388247
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a gesture detector, a user interface (UI) engine, and a rendering engine. The image capture device captures image data representative of a physical environment. The HMD outputs artificial reality content. The gesture detector identifies, from the image data, a gesture including a configuration of a hand that is substantially stationary for at least a threshold period of time and positioned such that an index finger and a thumb of the hand form approximately a right angle. The UI engine generates a UI element in response to the identified gesture. The rendering engine renders the UI element as an overlay to the artificial reality content.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Publication number: 20200387214
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment, renders artificial reality content and a virtual keyboard with a plurality of virtual keys as an overlay to the artificial reality content, and outputs the artificial reality content and the virtual keyboard. The artificial reality system identifies, from the image data, a gesture comprising a first digit of a hand being brought in contact with a second digit of the hand, wherein a point of the contact corresponds to a location of a first virtual key of the plurality of virtual keys of the virtual keyboard. The artificial reality system processes a selection of the first virtual key in response to the identified gesture.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
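A sketch of mapping the digit-to-digit contact point onto a virtual key, as this abstract describes. The contact threshold, key radius, and the pre-resolved key layout dictionary are assumptions for illustration.

```python
from math import dist

CONTACT_DIST = 0.015   # fingertip separation treated as contact (assumed)
KEY_RADIUS = 0.03      # contact must land this close to a key to select it (assumed)

def select_virtual_key(first_tip, second_tip, key_positions):
    """Returns the label of the virtual key selected when the first digit touches
    the second digit and the contact point lies on a key; `key_positions` maps a
    key label to its rendered (x, y, z) location and must be non-empty."""
    if dist(first_tip, second_tip) > CONTACT_DIST:
        return None                          # digits not in contact
    contact = tuple((a + b) / 2 for a, b in zip(first_tip, second_tip))
    label, pos = min(key_positions.items(), key=lambda kv: dist(contact, kv[1]))
    return label if dist(contact, pos) <= KEY_RADIUS else None
```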
-
Publication number: 20200387229
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system captures image data representative of a physical environment and outputs the artificial reality content. The artificial reality system identifies, from the image data, a gesture comprising a motion of a first digit of a hand and a second digit of the hand to form a pinching configuration a particular number of times within a threshold amount of time. The artificial reality system assigns one or more input characters to one or more of a plurality of digits of the hand and processes a selection of a first input character of the one or more input characters assigned to the second digit of the hand in response to the identified gesture.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Publication number: 20200387287
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. In one example, an artificial reality system comprises an image capture device configured to capture image data representative of a physical environment; a head-mounted display (HMD) configured to output artificial reality content; a gesture detector configured to identify, from the image data, a gesture comprising a motion of two fingers from a hand to form a pinching configuration and a subsequent pulling motion while in the pinching configuration; a user interface (UI) engine configured to generate a UI input element in response to identifying the gesture; and a rendering engine configured to render the UI input element as an overlay to at least some of the artificial reality content.
Type: Application
Filed: June 7, 2019
Publication date: December 10, 2020
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
-
Patent number: 10852839
Abstract: An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system includes an image capture device, a head-mounted display (HMD), a user interface (UI) engine, and a rendering engine. The head-mounted display (HMD) outputs artificial reality content, the artificial reality content including a display element that appears superimposed on and attached to an arm. The gesture detector identifies, from the image data, a gesture that includes a gripping motion of a hand with respect to the display element. The UI engine is configured to, in response to the identification of the gesture, (i) update the display element to appear detached from and separate from the arm, and (ii) generate a UI element that appears detached from the arm.
Type: Grant
Filed: June 7, 2019
Date of Patent: December 1, 2020
Assignee: Facebook Technologies, LLC
Inventors: Jonathan Ravasz, Jasper Stevens, Adam Tibor Varga, Etienne Pinchon, Simon Charles Tickner, Jennifer Lynn Spurlock, Kyle Eric Sorge-Toomey, Robert Ellis, Barrett Fox
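A sketch of the arm-attached display element whose grip detaches it from the arm, per this abstract. The class, grip radius, and position-handling convention are assumptions, not the patent's implementation.

```python
from math import dist

GRIP_RADIUS = 0.05   # grip must form this close to the display element (assumed)

class ArmDisplayElement:
    """Display element that appears attached to the arm until a gripping motion
    at its location detaches it."""
    def __init__(self):
        self.attached = True

    def position(self, arm_pos, detached_pos):
        # while attached, the element follows the arm; once detached it is placed freely
        return arm_pos if self.attached else detached_pos

    def on_grip(self, grip_pos, element_pos) -> bool:
        if self.attached and dist(grip_pos, element_pos) <= GRIP_RADIUS:
            self.attached = False    # gripping the element detaches it from the arm
            return True              # the UI engine would also generate the separate UI element
        return False
```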