Patents by Inventor Tim Psiaki

Tim Psiaki has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9710130
    Abstract: A user input for a near-eye, see-through display device is disclosed. Hands-free user input in an augmented reality environment is provided. A user can provide input by changing the orientation of their head; for example, the user could rotate their head. In one aspect, a user can provide input by moving their eye gaze along a direction. In one aspect, when the user directs their attention at a user interface symbol, a handle extends away from the user interface symbol. The handle may serve as a type of selection device: if the user directs their attention along the handle, away from the user interface symbol, a selection can be made. As one example, the selection causes a spoke menu to appear, from which the user can select by rotating their head such that the system determines the user is looking along a spoke away from a central hub.
    Type: Grant
    Filed: June 12, 2013
    Date of Patent: July 18, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Kathleen Mulcahy, Aaron Burns, Todd Omotani, Felicia Williams, Jeff Cole, Tim Psiaki
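    The spoke-menu selection described in this abstract can be sketched as a gaze-alignment test: a selection fires when the head-gaze direction lines up with a spoke pointing away from the central hub. This is an illustrative sketch only; the function names, vector representation, and cosine threshold are assumptions, not details from the patent.

    ```python
    import math

    def normalize(v):
        # Scale a 3D vector to unit length.
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)

    def select_spoke(gaze_dir, spokes, cos_threshold=0.95):
        """Return the index of the spoke the gaze is aligned with, or None.

        gaze_dir: head-gaze direction vector (from head orientation or eye gaze).
        spokes: direction vectors pointing from the central hub outward.
        cos_threshold: minimum cosine similarity to count as "looking along"
            a spoke (an assumed tuning value).
        """
        g = normalize(gaze_dir)
        best, best_cos = None, cos_threshold
        for i, spoke in enumerate(spokes):
            cos = sum(a * b for a, b in zip(g, normalize(spoke)))
            if cos >= best_cos:
                best, best_cos = i, cos
        return best

    # Four spokes: right, left, up, down relative to the hub.
    spokes = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0)]
    print(select_spoke((0.99, 0.05, 0.0), spokes))  # gaze nearly along spoke 0 → 0
    print(select_spoke((0.5, 0.5, 0.7), spokes))    # aligned with no spoke → None
    ```

    A real implementation would also require the gaze to dwell along the spoke for some time before committing the selection, to avoid accidental triggers.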
  • Patent number: 9445048
    Abstract: Systems and methods are disclosed for gesture-initiated actions in videoconferences. In one implementation, a processing device receives one or more content streams as part of a communication session. The processing device identifies, within the one or more content streams, a request for feedback. The processing device processes, based on the identification of the request for feedback, the one or more content streams to identify a presence of one or more gestures within at least one of the one or more content streams. The processing device initiates, based on an identification of the presence of one or more gestures within at least one of the one or more content streams, an action with respect to the communication session.
    Type: Grant
    Filed: July 29, 2014
    Date of Patent: September 13, 2016
    Assignee: GOOGLE INC.
    Inventors: Mehul Nariyawala, Rahul Garg, Navneet Dalal, Thor Carpenter, Greg Burgess, Tim Psiaki, Mark Chang, Antonio Bernardo Monteiro Costa, Christian Plagemann, Chee Chew
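    The abstract describes a three-step pipeline: spot a request for feedback in the session's content streams, only then scan the streams for gestures, and finally initiate an action. A minimal sketch of that control flow follows; the detector functions are stubs standing in for the speech/vision models a real system would use, and every name here is an illustrative assumption.

    ```python
    def contains_feedback_request(stream):
        # Stub: e.g. a speech recognizer spotting "raise your hand if..."
        return "raise your hand" in stream.get("audio_transcript", "").lower()

    def detect_gestures(stream):
        # Stub: e.g. a vision model finding gestures in video frames.
        return stream.get("gestures", [])

    def process_session(streams):
        """Return actions to initiate, following the abstract's three steps."""
        # Step 1: identify a request for feedback within the content streams.
        if not any(contains_feedback_request(s) for s in streams):
            return []  # no request for feedback, so no gesture scan
        # Step 2: scan the streams for gestures; step 3: map gestures to actions.
        actions = []
        for stream in streams:
            for gesture in detect_gestures(stream):
                if gesture == "raised_hand":
                    actions.append(("tally_vote", stream["participant"]))
        return actions

    streams = [
        {"participant": "presenter",
         "audio_transcript": "Raise your hand if you agree."},
        {"participant": "alice", "gestures": ["raised_hand"]},
        {"participant": "bob", "gestures": []},
    ]
    print(process_session(streams))  # [('tally_vote', 'alice')]
    ```

    Gating the gesture scan on a detected feedback request, as the claim does, avoids running gesture recognition continuously on every stream.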
  • Patent number: 8957858
    Abstract: Systems and methods for multi-platform motion interactivity are provided. The system includes a motion-sensing subsystem, a display subsystem including a display, a logic subsystem, and a data-holding subsystem containing instructions executable by the logic subsystem. The system is configured to display a displayed scene on the display; receive a dynamically-changing motion input from the motion-sensing subsystem that is generated in response to movement of a tracked object; generate, in real time, a dynamically-changing 3D spatial model of the tracked object based on the motion input; and control, based on the movement of the tracked object and using the 3D spatial model, motion within the displayed scene. The system is further configured to receive, from a secondary computing system, a secondary input; and control the displayed scene in response to the secondary input to visually represent interaction between the motion input and the secondary input.
    Type: Grant
    Filed: May 27, 2011
    Date of Patent: February 17, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Dan Osborn, Christopher Willoughby, Brian Mount, Vaibhav Goel, Tim Psiaki, Shawn C. Wright, Christopher Vuchetich
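    The interaction loop this abstract claims has two input paths into one scene: a motion input that updates a 3D spatial model of a tracked object, and a secondary input from another computing system. The sketch below reduces the "3D spatial model" to a single tracked point to show the data flow; the dataclasses and field names are assumptions for illustration, not the patented design.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class SpatialModel:
        # Trivial stand-in for a dynamically-changing 3D model: one tracked point.
        position: tuple = (0.0, 0.0, 0.0)

    @dataclass
    class Scene:
        avatar_position: tuple = (0.0, 0.0, 0.0)
        effects: list = field(default_factory=list)

    def on_motion_input(model, scene, delta):
        """Advance the 3D model from a motion-sensor delta, then drive the scene."""
        model.position = tuple(p + d for p, d in zip(model.position, delta))
        scene.avatar_position = model.position  # scene motion mirrors tracked object

    def on_secondary_input(scene, command):
        """Apply input from a secondary computing system (e.g. a phone or tablet)."""
        scene.effects.append(command)

    model, scene = SpatialModel(), Scene()
    on_motion_input(model, scene, (1.0, 0.0, 0.5))   # tracked object moves
    on_secondary_input(scene, "spawn_obstacle")      # second platform intervenes
    print(scene.avatar_position, scene.effects)      # (1.0, 0.0, 0.5) ['spawn_obstacle']
    ```

    Because both handlers mutate the same `Scene`, the displayed scene visually represents the interaction between the two inputs, which is the point of the final claim element.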
  • Publication number: 20140372944
    Abstract: A user input for a near-eye, see-through display device is disclosed. Hands-free user input in an augmented reality environment is provided. A user can provide input by changing the orientation of their head; for example, the user could rotate their head. In one aspect, a user can provide input by moving their eye gaze along a direction. In one aspect, when the user directs their attention at a user interface symbol, a handle extends away from the user interface symbol. The handle may serve as a type of selection device: if the user directs their attention along the handle, away from the user interface symbol, a selection can be made. As one example, the selection causes a spoke menu to appear, from which the user can select by rotating their head such that the system determines the user is looking along a spoke away from a central hub.
    Type: Application
    Filed: June 12, 2013
    Publication date: December 18, 2014
    Inventors: Kathleen Mulcahy, Aaron Burns, Todd Omotani, Felicia Williams, Jeff Cole, Tim Psiaki
  • Publication number: 20120299827
    Abstract: Systems and methods for multi-platform motion interactivity are provided. The system includes a motion-sensing subsystem, a display subsystem including a display, a logic subsystem, and a data-holding subsystem containing instructions executable by the logic subsystem. The system is configured to display a displayed scene on the display; receive a dynamically-changing motion input from the motion-sensing subsystem that is generated in response to movement of a tracked object; generate, in real time, a dynamically-changing 3D spatial model of the tracked object based on the motion input; and control, based on the movement of the tracked object and using the 3D spatial model, motion within the displayed scene. The system is further configured to receive, from a secondary computing system, a secondary input; and control the displayed scene in response to the secondary input to visually represent interaction between the motion input and the secondary input.
    Type: Application
    Filed: May 27, 2011
    Publication date: November 29, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Dan Osborn, Christopher Willoughby, Brian Mount, Vaibhav Goel, Tim Psiaki, Shawn C. Wright, Christopher Vuchetich