Patents by Inventor Mark Mihelich

Mark Mihelich has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240107258
    Abstract: Systems and methods for rendering spatial audio in accordance with embodiments of the invention are illustrated. One embodiment includes a spatial audio system including a primary network connected speaker, which includes a plurality of sets of drivers, where each set of drivers is oriented in a different direction, a network interface, a processor system, and memory containing an audio player application, wherein the audio player application configures the processor system to obtain an audio source stream from an audio source via the network interface, spatially encode the audio source stream, and decode the spatially encoded audio source to obtain driver inputs for the individual drivers in the plurality of sets of drivers, where the driver inputs cause the drivers to generate directional audio.
    Type: Application
    Filed: June 22, 2023
    Publication date: March 28, 2024
    Applicant: SYNG, Inc.
    Inventors: Christopher John Stringer, Afrooz Family, Fabian Renn-Giles, David Narajowski, Joshua Phillip Song, John Moreland, Pooja Patel, Pere Aizcorbe Arrocha, Nicholas Knudson, Nathan Hoyt, Marc Carino, Mark Rakes, Ryan Mihelich, Matthew Brown, Bas Ording, Robert Tilton, Jay Sterling Coggin, Lasse Vetter, Christos Kyriakakis, Matthew Robbetts, Matthias Kronlachner, Yuan-Yi Fan
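
The encode-then-decode pipeline in the abstract above can be illustrated with a minimal first-order ambisonics sketch. This is not the patented method: the function names, the horizontal-only (W, X, Y) encoding, and the basic projection decoder are all assumptions for illustration.

```python
import math

def encode_first_order(sample, azimuth):
    """Encode a mono sample into first-order ambisonics (W, X, Y)
    for a source at `azimuth` radians in the horizontal plane."""
    w = sample / math.sqrt(2)        # omnidirectional component
    x = sample * math.cos(azimuth)   # front-back component
    y = sample * math.sin(azimuth)   # left-right component
    return (w, x, y)

def decode_to_drivers(wxy, driver_azimuths):
    """Decode the encoded field to one feed per driver using a
    simple sampling (projection) decoder at each driver's azimuth."""
    w, x, y = wxy
    return [w / math.sqrt(2) + x * math.cos(az) + y * math.sin(az)
            for az in driver_azimuths]
```

For a source straight ahead, a driver facing forward receives the strongest feed and the driver facing backward receives a negative (out-of-phase) one, which is the sense in which the per-driver inputs "generate directional audio."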
  • Patent number: 9161012
    Abstract: Optical sensor information captured via one or more optical sensors imaging a scene that includes a human subject is received by a computing device. The optical sensor information is processed by the computing device to model the human subject with a virtual skeleton, and to obtain surface information representing the human subject. The virtual skeleton is transmitted by the computing device to a remote computing device at a higher frame rate than the surface information. Virtual skeleton frames are used by the remote computing device to estimate surface information for frames that have not been transmitted by the computing device.
    Type: Grant
    Filed: November 17, 2011
    Date of Patent: October 13, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Mark Mihelich, Kevin Geisner, Mike Scavezze, Stephen Latta, Daniel McCulloch, Brian Mount
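
The bandwidth-saving scheme in the abstract above (transmit the lightweight virtual skeleton every frame, the heavier surface data less often, and let the receiver estimate the gaps) can be sketched as follows. The packet layout and the reuse-the-last-surface estimator are illustrative assumptions; the patent's receiver would presumably use the skeleton frames to warp the surface rather than simply reuse it.

```python
def sender_frames(frames, surface_interval=4):
    """Yield (skeleton, surface_or_None) packets: the skeleton goes out
    every frame, the heavier surface only every `surface_interval` frames."""
    for i, (skeleton, surface) in enumerate(frames):
        yield skeleton, (surface if i % surface_interval == 0 else None)

def receiver(packets):
    """Rebuild per-frame (skeleton, surface) pairs; frames that arrived
    without surface data fall back to an estimate derived from the last
    received surface (here, trivially, the surface itself)."""
    last_surface = None
    out = []
    for skeleton, surface in packets:
        if surface is not None:
            last_surface = surface
        out.append((skeleton, last_surface))
    return out
```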
  • Publication number: 20140380254
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is then parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs.
    Type: Application
    Filed: September 4, 2014
    Publication date: December 25, 2014
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
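
A gesture filter with a tunable parameter, as described in the abstract above, might look like the minimal sketch below. The `GestureFilter` class, the speed-threshold criterion, and the gesture name are hypothetical stand-ins, not taken from the patent.

```python
class GestureFilter:
    """One filter per user-performed gesture, with a tunable parameter
    that a gesture tool could adjust at runtime (here, a hypothetical
    minimum hand speed for an 'arm raise')."""
    def __init__(self, name, min_speed):
        self.name = name
        self.min_speed = min_speed

    def matches(self, hand_speeds):
        # Satisfied when any observed hand speed reaches the threshold.
        return max(hand_speeds) >= self.min_speed

def recognize(hand_speeds, filters):
    """Return the names of all gestures whose filters the data satisfies."""
    return [f.name for f in filters if f.matches(hand_speeds)]
```

Because the filter object holds its threshold as mutable state, a tool that changes `min_speed` immediately changes what the recognizer reports, which mirrors the feedback loop the abstract describes.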
  • Patent number: 8856691
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is then parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs.
    Type: Grant
    Filed: May 29, 2009
    Date of Patent: October 7, 2014
    Assignee: Microsoft Corporation
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls
  • Publication number: 20130141419
    Abstract: A head-mounted display device is configured to visually augment an observed physical space to a user. The head-mounted display device includes a see-through display and is configured to receive augmented display information, such as a virtual object with occlusion relative to a real world object from a perspective of the see-through display.
    Type: Application
    Filed: December 1, 2011
    Publication date: June 6, 2013
    Inventors: Brian Mount, Stephen Latta, Daniel McCulloch, Kevin Geisner, Jason Scott, Jonathan Steed, Arthur Tomlin, Mark Mihelich
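
The occlusion rule described above can be sketched per pixel: a virtual pixel is drawn only where the virtual surface lies in front of the sensed real-world surface, and everywhere else the see-through display simply lets the real world show through. The dict-based depth buffers below are an illustrative simplification, not the device's rendering pipeline.

```python
def occlude(virtual_layer, real_depth_map):
    """virtual_layer: pixel -> (depth, color) for the rendered virtual object.
    real_depth_map: pixel -> sensed depth of the real-world surface.
    Returns only the pixels to draw; undrawn pixels stay transparent
    on the see-through display, so the real object occludes the virtual one."""
    drawn = {}
    for px, (depth, color) in virtual_layer.items():
        # Draw only where the virtual surface is closer than the real one;
        # pixels with no sensed depth are treated as unobstructed.
        if depth < real_depth_map.get(px, float("inf")):
            drawn[px] = color
    return drawn
```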
  • Publication number: 20130127994
    Abstract: Optical sensor information captured via one or more optical sensors imaging a scene that includes a human subject is received by a computing device. The optical sensor information is processed by the computing device to model the human subject with a virtual skeleton, and to obtain surface information representing the human subject. The virtual skeleton is transmitted by the computing device to a remote computing device at a higher frame rate than the surface information. Virtual skeleton frames are used by the remote computing device to estimate surface information for frames that have not been transmitted by the computing device.
    Type: Application
    Filed: November 17, 2011
    Publication date: May 23, 2013
    Inventors: Mark Mihelich, Kevin Geisner, Mike Scavezze, Stephen Latta, Daniel McCulloch, Brian Mount
  • Publication number: 20100306713
    Abstract: Systems, methods and computer readable media are disclosed for a gesture tool. A capture device captures user movement and provides corresponding data to a gesture recognizer engine and an application. The data is then parsed to determine whether it satisfies one or more gesture filters, each filter corresponding to a user-performed gesture. The data and the information about the filters are also sent to a gesture tool, which displays aspects of the data and filters. In response to user input corresponding to a change in a filter, the gesture tool sends an indication of such to the gesture recognizer engine and application, where that change occurs.
    Type: Application
    Filed: May 29, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Kevin Geisner, Stephen Latta, Gregory N. Snook, Relja Markovic, Arthur Charles Tomlin, Mark Mihelich, Kyungsuk David Lee, David Jason Christopher Horbach, Matthew Jon Puls