Patents by Inventor Timo Arnall

Timo Arnall has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11531459
    Abstract: Systems and methods of providing gesture-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user computing device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a physical area separate from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
    Type: Grant
    Filed: April 12, 2021
    Date of Patent: December 20, 2022
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
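The two-zone behavior described in the abstract above can be sketched as a simple distance-to-mode mapping. This is an illustrative sketch only; the zone boundaries, mode names, and the `mode_for_distance` helper are assumptions for demonstration, not details from the patent:

```python
from enum import Enum

class Mode(Enum):
    HIDDEN = "hidden"              # control article outside both zones
    PRESENTATION = "presentation"  # farther zone: passive display
    INTERACTIVE = "interactive"    # nearer zone: accepts input

# Zone boundaries in centimetres from the device (illustrative values).
# The two zones cover separate physical areas, so at most one applies.
SECOND_ZONE_CM = (0.0, 30.0)
FIRST_ZONE_CM = (30.0, 100.0)

def mode_for_distance(distance_cm: float) -> Mode:
    """Map the control article's distance from the device to a UI mode."""
    lo, hi = SECOND_ZONE_CM
    if lo <= distance_cm < hi:
        return Mode.INTERACTIVE
    lo, hi = FIRST_ZONE_CM
    if lo <= distance_cm < hi:
        return Mode.PRESENTATION
    return Mode.HIDDEN

# Example: a hand approaching the device steps through the modes.
for d in (120.0, 80.0, 20.0):
    print(d, mode_for_distance(d).value)
```

Because the zones are disjoint bands, the mapping is unambiguous: any single distance yields exactly one mode.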
  • Publication number: 20210232303
    Abstract: Systems and methods of providing gesture-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user computing device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a physical area separate from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
    Type: Application
    Filed: April 12, 2021
    Publication date: July 29, 2021
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
  • Patent number: 11003345
    Abstract: Systems and methods of providing control-article-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user computing device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a physical area separate from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
    Type: Grant
    Filed: December 7, 2016
    Date of Patent: May 11, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
  • Patent number: 10936085
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Grant
    Filed: January 16, 2020
    Date of Patent: March 2, 2021
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
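The first example in the abstract above, where one hand sets a context that determines how the other hand's gesture is interpreted, can be sketched as a two-key lookup. The pose names, motion names, and operation labels here are invented for illustration and do not come from the patent:

```python
from typing import Optional

# One hand's pose selects a context; the other hand's motion is then
# interpreted within that context. All labels are illustrative.
CONTEXT_BINDINGS = {
    ("fist", "swipe_left"): "previous_track",
    ("fist", "swipe_right"): "next_track",
    ("flat", "swipe_left"): "scroll_up",
    ("flat", "swipe_right"): "scroll_down",
}

def resolve_gesture(context_pose: str, gesture_motion: str) -> Optional[str]:
    """Return the operation bound to a two-handed gesture, or None.

    The same motion (e.g. "swipe_left") maps to different operations
    depending on the context set by the other hand.
    """
    return CONTEXT_BINDINGS.get((context_pose, gesture_motion))
```

The point of the table shape is that an identical motion resolves to different operations under different contexts, mirroring how the abstract distinguishes otherwise-identical movements.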
  • Patent number: 10789781
    Abstract: Systems and methods for providing an interactive augmented experience using prerecorded video include: creating a scene model based on an image of a physical environment; generating a fantasy object; integrating a position of the fantasy object onto the scene model; determining a state of the fantasy object; selecting, using a type of metadata, one or more frames of a prerecorded video of the physical environment associated with a desired physical camera, such that each of the frames is associated with a frame number and acquired with a physical camera; synchronizing a virtual camera with the desired physical camera; and projecting, using a first video player or a second video player, the one or more frames onto the scene model to position the scene model relative to the fantasy object, such that the projecting alternates between the first video player and the second video player.
    Type: Grant
    Filed: August 30, 2019
    Date of Patent: September 29, 2020
    Assignee: Playdeo Limited
    Inventors: Jack Schulze, Timo Arnall, Nicholas Ludlam
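The alternation between two video players described in the abstract above resembles double buffering: while one player projects a frame, the other is free to seek or buffer. A minimal sketch of that alternation, with a stand-in `VideoPlayer` class whose name and interface are assumptions rather than anything from the patent:

```python
from itertools import cycle
from typing import List, Tuple

class VideoPlayer:
    """Minimal stand-in for a video player that projects frames."""
    def __init__(self, name: str):
        self.name = name
        self.projected: List[int] = []

    def project(self, frame_number: int) -> None:
        self.projected.append(frame_number)

def project_frames(frames, players) -> List[Tuple[str, int]]:
    """Project each selected frame, alternating between the players.

    Alternating lets the idle player prepare its next frame while the
    other one is displaying, in the spirit of double buffering.
    """
    alternator = cycle(players)
    order = []
    for frame_number in frames:
        player = next(alternator)
        player.project(frame_number)
        order.append((player.name, frame_number))
    return order

# Example: three selected frames alternate A, B, A.
players = [VideoPlayer("A"), VideoPlayer("B")]
schedule = project_frames([10, 11, 12], players)
```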
  • Publication number: 20200264765
    Abstract: Systems and methods of providing control-article-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user computing device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a physical area separate from the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
    Type: Application
    Filed: December 7, 2016
    Publication date: August 20, 2020
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Carsten C. Schwesig, Jack Schulze, Timo Arnall
  • Publication number: 20200150776
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Application
    Filed: January 16, 2020
    Publication date: May 14, 2020
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Publication number: 20200074741
    Abstract: Described herein are systems and methods for providing an interactive augmented experience using prerecorded video, including: creating a scene model based on an image of a physical environment; generating a fantasy object; integrating a position of the fantasy object onto the scene model; determining a state of the fantasy object; selecting, using a type of metadata, one or more frames of a prerecorded video of the physical environment associated with a desired physical camera, wherein each of the one or more frames is associated with a frame number and acquired with a physical camera; synchronizing a virtual camera with the desired physical camera; and projecting, using a first video player or a second video player, the one or more frames onto the scene model to position the scene model relative to the fantasy object, wherein the projecting alternates between the first video player and the second video player.
    Type: Application
    Filed: August 30, 2019
    Publication date: March 5, 2020
    Applicant: Playdeo Limited
    Inventors: Jack Schulze, Timo Arnall, Nicholas Ludlam
  • Patent number: 10572027
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Grant
    Filed: January 2, 2019
    Date of Patent: February 25, 2020
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Publication number: 20190138109
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Application
    Filed: January 2, 2019
    Publication date: May 9, 2019
    Applicant: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Patent number: 10203763
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Grant
    Filed: October 5, 2015
    Date of Patent: February 12, 2019
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Patent number: 10088908
    Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
    Type: Grant
    Filed: September 23, 2015
    Date of Patent: October 2, 2018
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Carsten Schwesig, Jack Schulze, Timo Arnall, Durrell Grant Bevington Bishop
  • Publication number: 20160349845
    Abstract: Gesture detection haptics and virtual tools are described. In one example, movements are detected that involve contact in three-dimensional space, such as through use of radio waves, camera-based techniques, and so forth. The contact provides haptic feedback to the user as part of making the movements. In another example, movements are detected that are used to identify both a virtual tool and a gesture that corresponds to the virtual tool. From these movements, gestures are identified that are used to initiate operations of a computing device.
    Type: Application
    Filed: May 26, 2016
    Publication date: December 1, 2016
    Applicant: Google Inc.
    Inventors: Ivan Poupyrev, Timo Arnall, Carsten C. Schwesig, Jack Schulze
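The virtual-tool example in the abstract above, where a single detected movement identifies both a tool and the gesture performed with it, can be sketched as one lookup that yields a pair. The movement patterns, tool names, and gesture labels below are invented for illustration, not taken from the filing:

```python
from typing import Optional, Tuple

# A detected movement pattern maps to both a virtual tool and the
# gesture performed with that tool. All labels are illustrative.
MOVEMENT_TABLE = {
    "pinch_and_twist": ("dial", "turn"),
    "pinch_and_slide": ("slider", "drag"),
    "tap_and_hold": ("button", "press"),
}

def interpret_movement(pattern: str) -> Tuple[Optional[str], Optional[str]]:
    """Return (virtual_tool, gesture) for a movement pattern.

    Unrecognized patterns yield (None, None), i.e. no tool and no
    gesture, so callers can fall back to other input handling.
    """
    return MOVEMENT_TABLE.get(pattern, (None, None))
```

The resolved (tool, gesture) pair would then drive a device operation, e.g. turning a dial to adjust volume.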