Patents by Inventor Timo Arnall
Timo Arnall has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11531459
Abstract: Systems and methods of providing gesture-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a separate physical area from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
Type: Grant
Filed: April 12, 2021
Date of Patent: December 20, 2022
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
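The two-zone interaction model this abstract describes can be sketched roughly as follows. All names, zone boundaries, and mode labels here are illustrative assumptions, not taken from the patent itself:

```python
# Hypothetical sketch of the abstract's two-zone model: the control
# article's distance from the device selects which UI mode is displayed.

def select_ui_mode(distance_m: float,
                   interactive_zone: tuple = (0.0, 0.5),
                   presentation_zone: tuple = (0.5, 2.0)) -> str:
    """Map the control article's distance (meters) to a UI mode.

    The interactive zone is a separate, closer physical area than the
    presentation zone, mirroring the abstract's two proximity zones.
    """
    lo, hi = interactive_zone
    if lo <= distance_m < hi:
        return "interactive"   # provide interactive data for display
    lo, hi = presentation_zone
    if lo <= distance_m < hi:
        return "presentation"  # provide presentation data for display
    return "idle"              # no control article in either zone

print(select_ui_mode(1.2))  # presentation
print(select_ui_mode(0.3))  # interactive
```

The key design point the abstract turns on is that the two zones are physically disjoint, so a single distance reading unambiguously selects one mode.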
-
Publication number: 20210232303
Abstract: Systems and methods of providing gesture-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a separate physical area from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
Type: Application
Filed: April 12, 2021
Publication date: July 29, 2021
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
-
Patent number: 11003345
Abstract: Systems and methods of providing control-article-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a separate physical area from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
Type: Grant
Filed: December 7, 2016
Date of Patent: May 11, 2021
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
-
Patent number: 10936085
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Grant
Filed: January 16, 2020
Date of Patent: March 2, 2021
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
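One technique this abstract names is using detected distance so that the same motion maps to different operations. A minimal sketch of that dispatch idea, with entirely invented motion names, distance bands, and operations:

```python
# Hypothetical sketch: the same detected motion (e.g. a swipe) triggers
# different operations depending on how far the hand is from the sensor,
# as the abstract describes. All names and thresholds are illustrative.

OPERATIONS = {
    # (motion, distance band) -> operation
    ("swipe", "near"): "scroll_list",
    ("swipe", "far"): "change_channel",
}

def distance_band(distance_m: float, threshold_m: float = 0.3) -> str:
    """Quantize a radar-style distance reading into a coarse band."""
    return "near" if distance_m < threshold_m else "far"

def dispatch(motion: str, distance_m: float) -> str:
    """Resolve a motion plus its distance band to an operation."""
    return OPERATIONS.get((motion, distance_band(distance_m)), "ignore")

print(dispatch("swipe", 0.1))  # scroll_list
print(dispatch("swipe", 1.0))  # change_channel
```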
-
Patent number: 10789781
Abstract: Systems and methods for providing an interactive augmented experience using prerecorded video include: creating a scene model based on an image of a physical environment; generating a fantasy object; integrating a position of the fantasy object onto the scene model; determining a state of the fantasy object; selecting, using a type of meta data, one or more frames of a pre-recorded video of the physical environment associated with a desired physical camera, such that each of the frames is associated with a frame number and acquired with a physical camera; synchronizing a virtual camera with the desired physical camera; and projecting, using a first video player or a second video player, the one or more frames onto the scene model to position the scene model relative to the fantasy object, such that the projecting alternates between the first video player and the second video player.
Type: Grant
Filed: August 30, 2019
Date of Patent: September 29, 2020
Assignee: Playdeo Limited
Inventors: Jack Schulze, Timo Arnall, Nicholas Ludlam
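The final step of this abstract alternates frame projection between two video players, a common pattern that lets one player seek or decode while the other is displaying. A rough sketch of just that alternation, with placeholder player objects that are not from the patent:

```python
# Rough sketch of alternating two video players when projecting
# pre-recorded frames onto a scene model, as the abstract outlines.
# The player names and string output are simplified placeholders.
import itertools

class FrameProjector:
    """Alternates two players so one can prepare while the other projects."""

    def __init__(self):
        self._players = itertools.cycle(["player_1", "player_2"])

    def project(self, frame_number: int) -> str:
        player = next(self._players)
        # A real system would decode the selected frame and project it
        # onto the scene model from the synchronized virtual camera.
        return f"{player} projects frame {frame_number}"

proj = FrameProjector()
print(proj.project(10))  # player_1 projects frame 10
print(proj.project(11))  # player_2 projects frame 11
```

The double-buffering-style handoff hides per-frame seek latency, which is plausibly why the claimed method alternates between two players rather than using one.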
-
Publication number: 20200264765
Abstract: Systems and methods of providing control-article-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a separate physical area from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
Type: Application
Filed: December 7, 2016
Publication date: August 20, 2020
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
-
Publication number: 20200150776
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Application
Filed: January 16, 2020
Publication date: May 14, 2020
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
-
Publication number: 20200074741
Abstract: Described herein are systems and methods for providing an interactive augmented experience using prerecorded video, including: creating a scene model based on an image of a physical environment; generating a fantasy object; integrating a position of the fantasy object onto the scene model; determining a state of the fantasy object; selecting, using a type of meta data, one or more frames of a pre-recorded video of the physical environment associated with a desired physical camera, wherein each of the one or more frames is associated with a frame number, wherein each of the one or more frames is acquired with a physical camera; synchronizing a virtual camera with the desired physical camera; and projecting, using a first video player or a second video player, the one or more frames onto the scene model to position the scene model relative to the fantasy object, wherein the projecting alternates between the first video player and the second video player.
Type: Application
Filed: August 30, 2019
Publication date: March 5, 2020
Applicant: Playdeo Limited
Inventors: Jack Schulze, Timo Arnall, Nicholas Ludlam
-
Patent number: 10572027
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Grant
Filed: January 2, 2019
Date of Patent: February 25, 2020
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
-
Publication number: 20190138109
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Application
Filed: January 2, 2019
Publication date: May 9, 2019
Applicant: Google LLC
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
-
Patent number: 10203763
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Grant
Filed: October 5, 2015
Date of Patent: February 12, 2019
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
-
Patent number: 10088908
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
Type: Grant
Filed: September 23, 2015
Date of Patent: October 2, 2018
Assignee: Google LLC
Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
-
Publication number: 20160349845
Abstract: Gesture detection haptics and virtual tools are described. In one example, movements are detected that involve contact in three-dimensional space, such as through use of radio waves, camera-based techniques, and so forth. The contact provides haptic feedback to the user as part of making the movements. In another example, movements are detected that are used to identify both a virtual tool and a gesture that corresponds to the virtual tool. From these movements, gestures are identified that are used to initiate operations of a computing device.
Type: Application
Filed: May 26, 2016
Publication date: December 1, 2016
Applicant: Google Inc.
Inventors: Ivan Poupyrev, Timo Arnall, Carsten C. Schwesig, Jack Schulze
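This abstract's second example, where one detected movement identifies both a virtual tool and a gesture for that tool, can be sketched as a two-stage lookup. The tool names, movements, and operations below are invented for illustration and are not from the publication:

```python
# Hypothetical two-stage lookup: a detected movement first resolves to a
# (virtual tool, gesture) pair, which then resolves to a device operation.
# All movement, tool, and operation names are illustrative assumptions.

VIRTUAL_TOOLS = {
    # movement -> (virtual tool, gesture on that tool)
    "pinch_and_turn": ("dial", "turn"),
    "thumb_slide": ("slider", "slide"),
}

TOOL_OPERATIONS = {
    # (virtual tool, gesture) -> operation to initiate on the device
    ("dial", "turn"): "adjust_volume",
    ("slider", "slide"): "seek_playback",
}

def interpret(movement: str) -> str:
    """Resolve a movement to an operation via its virtual tool."""
    tool_gesture = VIRTUAL_TOOLS.get(movement)
    if tool_gesture is None:
        return "no_op"  # movement matches no known virtual tool
    return TOOL_OPERATIONS[tool_gesture]

print(interpret("pinch_and_turn"))  # adjust_volume
```

Factoring the mapping through a tool, rather than mapping movements straight to operations, is what lets the same physical gesture mean different things on different virtual tools.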