Patents by Inventor Alex Kipman

Alex Kipman has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240185579
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of computer vision and speech algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has computer vision and speech capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and computer vision and speech algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Application
    Filed: January 12, 2024
    Publication date: June 6, 2024
    Inventors: Michael EBSTYNE, Pedro Urbina ESCOS, Yuri PEKELNY, Jonathan Chi Hang CHAN, Emanuel SHALEV, Alex KIPMAN, Mark FLICK
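
The experiment-pipeline components named in the abstract above (experiment generator, experiment runner, candidate sensor configurations, synthetic environments and motions) can be pictured with a minimal Python sketch. Everything below is an illustrative assumption for orientation only; the class and function names are hypothetical and do not come from the patent or from any Microsoft API.

```python
from dataclasses import dataclass
from itertools import product
from typing import Callable, Dict, List

@dataclass(frozen=True)
class SensorConfig:
    """A candidate hardware configuration (hypothetical fields)."""
    camera_resolution: tuple
    frame_rate_hz: int
    microphone_count: int

@dataclass(frozen=True)
class Scenario:
    """One synthetic experiment: an environment plus a scripted motion."""
    environment: str   # e.g. a procedurally generated room
    motion_path: str   # e.g. a scripted head trajectory

def generate_experiments(configs: List[SensorConfig],
                         environments: List[str],
                         motions: List[str]) -> List[tuple]:
    """Experiment generator: cross every configuration with every scenario."""
    return [(cfg, Scenario(env, mot))
            for cfg, env, mot in product(configs, environments, motions)]

def run_experiment(config: SensorConfig, scenario: Scenario,
                   algorithm: Callable[[SensorConfig, Scenario], float]) -> Dict:
    """Experiment runner: score one (configuration, scenario) pair with the
    supplied callable, which stands in for rendering the synthetic scene and
    evaluating the real computer-vision or speech algorithm."""
    score = algorithm(config, scenario)
    return {"config": config, "scenario": scenario, "score": score}

def dummy_vision_algorithm(config: SensorConfig, scenario: Scenario) -> float:
    # Placeholder for the algorithm under evaluation.
    return config.frame_rate_hz / 120.0

if __name__ == "__main__":
    configs = [SensorConfig((640, 480), 30, 2), SensorConfig((1280, 720), 60, 4)]
    experiments = generate_experiments(configs, ["office", "warehouse"], ["walk", "turn"])
    results = [run_experiment(c, s, dummy_vision_algorithm) for c, s in experiments]
    best = max(results, key=lambda r: r["score"])
    print(f"best configuration: {best['config']} ({best['score']:.2f})")
```
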
  • Patent number: 11928856
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of computer vision and speech algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has computer vision and speech capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and computer vision and speech algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Grant
    Filed: May 5, 2022
    Date of Patent: March 12, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Ebstyne, Pedro Urbina Escos, Yuri Pekelny, Jonathan Chi Hang Chan, Emanuel Shalev, Alex Kipman, Mark Flick
  • Publication number: 20240062528
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of localization algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has localization capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and localization algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Application
    Filed: October 30, 2023
    Publication date: February 22, 2024
    Inventors: Michael EBSTYNE, Pedro Urbina ESCOS, Emanuel SHALEV, Alex KIPMAN, Yuri PEKELNY, Jonathan Chi Hang CHAN
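
The localization variant above evaluates candidate localization algorithms against synthetic ground truth. One way such an experiment could be scored, offered purely as an assumption and not as the patent's method, is to compare an estimated trajectory with the ground-truth trajectory produced by the motion generator, for example with a mean translation error:

```python
import math
from typing import List, Tuple

Pose = Tuple[float, float, float]  # (x, y, z) position in metres

def mean_translation_error(ground_truth: List[Pose],
                           estimated: List[Pose]) -> float:
    """Average Euclidean distance between matched ground-truth and
    estimated positions (a simple localization-quality score)."""
    if len(ground_truth) != len(estimated):
        raise ValueError("trajectories must be sampled at the same timestamps")
    errors = [math.dist(gt, est) for gt, est in zip(ground_truth, estimated)]
    return sum(errors) / len(errors)

if __name__ == "__main__":
    truth = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
    estimate = [(0.05, 0.0, 0.0), (1.10, 0.02, 0.0), (1.95, -0.03, 0.0)]
    print(f"mean error: {mean_translation_error(truth, estimate):.3f} m")
```
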
  • Patent number: 11842529
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of localization algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has localization capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and localization algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Grant
    Filed: July 8, 2021
    Date of Patent: December 12, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Ebstyne, Pedro Urbina Escos, Emanuel Shalev, Alex Kipman, Yuri Pekelny, Jonathan Chi Hang Chan
  • Publication number: 20220261516
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of computer vision and speech algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has computer vision and speech capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and computer vision and speech algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Application
    Filed: May 5, 2022
    Publication date: August 18, 2022
    Inventors: Michael EBSTYNE, Pedro Urbina ESCOS, Yuri PEKELNY, Jonathan Chi Hang CHAN, Emanuel SHALEV, Alex KIPMAN, Mark FLICK
  • Patent number: 11354459
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of computer vision and speech algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has computer vision and speech capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and computer vision and speech algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Grant
    Filed: September 21, 2018
    Date of Patent: June 7, 2022
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Ebstyne, Pedro Urbina Escos, Yuri Pekelny, Jonathan Chi Hang Chan, Emanuel Shalev, Alex Kipman, Mark Flick
  • Publication number: 20210334601
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of localization algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has localization capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and localization algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Application
    Filed: July 8, 2021
    Publication date: October 28, 2021
    Inventors: Michael EBSTYNE, Pedro Urbina ESCOS, Emanuel SHALEV, Alex KIPMAN, Yuri PEKELNY, Jonathan Chi Hang CHAN
  • Patent number: 11087176
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of localization algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has localization capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and localization algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Grant
    Filed: May 8, 2018
    Date of Patent: August 10, 2021
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Michael Ebstyne, Pedro Urbina Escos, Emanuel Shalev, Alex Kipman, Yuri Pekelny, Jonathan Chi Hang Chan
  • Patent number: 10573085
    Abstract: A mixed-reality display device comprises an input system, a display, and a graphics processor. The input system is configured to receive a parameter value, the parameter value being one of a plurality of values of a predetermined range receivable by the input system. The display is configured to display virtual image content that adds an augmentation to a real-world environment viewed by a user of the mixed-reality display device. The graphics processor is coupled operatively to the input system and to the display; it is configured to render the virtual image content so as to variably change the augmentation, to variably change a perceived realism of the real world environment in correlation to the parameter value.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: February 25, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex Kipman, Purnima M. Rao, Rebecca Haruyama, Shih-Sang Carnaven Chiu, Stuart Mayhew, Oscar E. Murillo, Carlos Fernando Faria Costa
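
The display described above renders virtual image content so that the strength of the augmentation, and with it the perceived realism of the real-world view, tracks a single input parameter from a predetermined range. A minimal sketch of that mapping, assuming a hypothetical normalized dial and a simple per-pixel alpha blend rather than the patented rendering pipeline:

```python
def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def augmentation_opacity(parameter: float,
                         param_min: float = 0.0,
                         param_max: float = 10.0) -> float:
    """Map a parameter value from its predetermined range to an augmentation
    opacity in [0, 1]: 0 leaves the real-world view untouched, 1 fully
    overlays the virtual image content."""
    span = param_max - param_min
    return (clamp(parameter, param_min, param_max) - param_min) / span

def blend_pixel(real: float, virtual: float, opacity: float) -> float:
    """Per-channel alpha blend of a real-world pixel with a virtual pixel."""
    return (1.0 - opacity) * real + opacity * virtual

if __name__ == "__main__":
    for dial in (0.0, 2.5, 5.0, 10.0):
        a = augmentation_opacity(dial)
        print(f"dial={dial:4.1f} -> opacity={a:.2f}, blended={blend_pixel(0.8, 0.2, a):.2f}")
```
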
  • Patent number: 10486065
    Abstract: A system to present the user a 3-D virtual environment as well as non-visual sensory feedback for interactions that user makes with virtual objects in that environment is disclosed. In an exemplary embodiment, a system comprising a depth camera that captures user position and movement, a three-dimensional (3-D) display device that presents the user a virtual environment in 3-D and a haptic feedback device provides haptic feedback to the user as he interacts with a virtual object in the virtual environment. As the user moves through his physical space, he is captured by the depth camera. Data from that depth camera is parsed to correlate a user position with a position in the virtual environment. Where the user position or movement causes the user to touch the virtual object, that is determined, and corresponding haptic feedback is provided to the user.
    Type: Grant
    Filed: July 27, 2011
    Date of Patent: November 26, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex Kipman, Kudo Tsunoda, Todd Eric Holmdahl, John Clavin, Kathryn Stone Perez
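
The system above correlates a depth-camera-tracked position with a position in the virtual environment and triggers haptic feedback when the user touches a virtual object. A compact sketch under those assumptions follows; the coordinate mapping and the haptic call are hypothetical placeholders, not the actual device interfaces.

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    """A virtual object approximated by a bounding sphere."""
    cx: float
    cy: float
    cz: float
    radius: float

def camera_to_virtual(x: float, y: float, z: float, scale: float = 1.0) -> tuple:
    """Map a depth-camera-space position (metres) into virtual-world space.
    A real system would apply a calibrated transform; identity * scale here."""
    return (x * scale, y * scale, z * scale)

def touching(hand: tuple, obj: Sphere) -> bool:
    dx, dy, dz = hand[0] - obj.cx, hand[1] - obj.cy, hand[2] - obj.cz
    return (dx * dx + dy * dy + dz * dz) ** 0.5 <= obj.radius

def send_haptic_pulse(strength: float) -> None:
    # Stand-in for a real haptic-device API.
    print(f"haptic pulse, strength={strength:.2f}")

if __name__ == "__main__":
    ball = Sphere(0.0, 1.2, 1.5, radius=0.15)
    for frame, hand_cam in enumerate([(0.4, 1.2, 1.5), (0.2, 1.2, 1.5), (0.05, 1.2, 1.5)]):
        hand_virtual = camera_to_virtual(*hand_cam)
        if touching(hand_virtual, ball):
            send_haptic_pulse(strength=1.0)
        else:
            print(f"frame {frame}: no contact")
```
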
  • Publication number: 20190347372
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of computer vision and speech algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has computer vision and speech capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and computer vision and speech algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Application
    Filed: September 21, 2018
    Publication date: November 14, 2019
    Inventors: Michael EBSTYNE, Pedro Urbina ESCOS, Yuri PEKELNY, Jonathan Chi Hang CHAN, Emanuel SHALEV, Alex KIPMAN, Mark FLICK
  • Publication number: 20190347369
    Abstract: A synthetic world interface may be used to model digital environments, sensors, and motions for the evaluation, development, and improvement of localization algorithms. A synthetic data cloud service with a library of sensor primitives, motion generators, and environments with procedural and game-like capabilities, facilitates engineering design for a manufactural solution that has localization capabilities. In some embodiments, a sensor platform simulator operates with a motion orchestrator, an environment orchestrator, an experiment generator, and an experiment runner to test various candidate hardware configurations and localization algorithms in a virtual environment, advantageously speeding development and reducing cost. Thus, examples disclosed herein may relate to virtual reality (VR) or mixed reality (MR) implementations.
    Type: Application
    Filed: May 8, 2018
    Publication date: November 14, 2019
    Inventors: Michael EBSTYNE, Pedro Urbina ESCOS, Emanuel SHALEV, Alex KIPMAN, Yuri PEKELNY, Jonathan Chi Hang CHAN
  • Patent number: 10388076
    Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
    Type: Grant
    Filed: January 31, 2018
    Date of Patent: August 20, 2019
    Assignee: Telefonaktiebolaget LM Ericsson (publ)
    Inventors: Avi Bar-Zeev, Bob Crocco, Alex Kipman, John Lewis
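
The opacity-filter patents in this listing darken only the see-through pixels that sit behind the augmented-reality image, adjusted for eye tracking and expanded slightly so the silhouette fully covers the virtual content. A toy NumPy sketch of computing such a per-pixel mask; the array shapes, the eye-offset convention, and the wrap-around dilation are illustrative simplifications only:

```python
import numpy as np

def opacity_mask(ar_alpha: np.ndarray,
                 eye_offset: tuple = (0, 0),
                 grow: int = 1,
                 threshold: float = 0.1) -> np.ndarray:
    """Return a boolean mask of opacity-filter pixels to switch opaque.

    ar_alpha   -- alpha channel of the augmented-reality image (H x W, 0..1)
    eye_offset -- (rows, cols) shift derived from eye tracking
    grow       -- how many pixels to expand the mask so the opaque silhouette
                  fully covers the virtual image
    """
    mask = ar_alpha > threshold                    # pixels behind the AR image
    mask = np.roll(mask, eye_offset, axis=(0, 1))  # gaze shift (wraps at edges in this toy version)
    for _ in range(grow):                          # crude dilation
        mask = (mask
                | np.roll(mask, 1, 0) | np.roll(mask, -1, 0)
                | np.roll(mask, 1, 1) | np.roll(mask, -1, 1))
    return mask

if __name__ == "__main__":
    alpha = np.zeros((8, 8))
    alpha[3:5, 3:5] = 1.0  # a small square of virtual content
    print(opacity_mask(alpha, eye_offset=(0, 1)).astype(int))
```
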
  • Publication number: 20190197784
    Abstract: A mixed-reality display device comprises an input system, a display, and a graphics processor. The input system is configured to receive a parameter value, the parameter value being one of a plurality of values of a predetermined range receivable by the input system. The display is configured to display virtual image content that adds an augmentation to a real-world environment viewed by a user of the mixed-reality display device. The graphics processor is coupled operatively to the input system and to the display; it is configured to render the virtual image content so as to variably change the augmentation, to variably change a perceived realism of the real world environment in correlation to the parameter value.
    Type: Application
    Filed: November 19, 2018
    Publication date: June 27, 2019
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Alex Kipman, Purnima M. Rao, Rebecca Haruyama, Shih-Sang Carnaven Chiu, Stuart Mayhew, Oscar E. Murillo, Carlos Fernando Faria Costa
  • Patent number: 10169922
    Abstract: A mixed-reality display device comprises an input system, a display, and a graphics processor. The input system is configured to receive a parameter value, the parameter value being one of a plurality of values of a predetermined range receivable by the input system. The display is configured to display virtual image content that adds an augmentation to a real-world environment viewed by a user of the mixed-reality display device. The graphics processor is coupled operatively to the input system and to the display; it is configured to render the virtual image content so as to variably change the augmentation, to variably change a perceived realism of the real world environment in correlation to the parameter value.
    Type: Grant
    Filed: October 21, 2016
    Date of Patent: January 1, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Alex Kipman, Purnima M. Rao, Rebecca Haruyama, Shih-Sang Carnaven Chiu, Stuart Mayhew, Oscar E. Murillo, Carlos Fernando Faria Costa
  • Patent number: 10055888
    Abstract: A computing system and method for producing and consuming metadata within multi-dimensional data is provided. The computing system comprising a see-through display, a sensor system, and a processor configured to: in a recording phase, generate an annotation at a location in a three dimensional environment, receive, via the sensor system, a stream of telemetry data recording movement of a first user in the three dimensional environment, receive a message to be recorded from the first user, and store, in memory as annotation data for the annotation, the stream of telemetry data and the message, and in a playback phase, display a visual indicator of the annotation at the location, receive a selection of the visual indicator by a second user, display a simulacrum superimposed onto the three dimensional environment and animated according to the telemetry data, and present the message via the animated simulacrum.
    Type: Grant
    Filed: April 28, 2015
    Date of Patent: August 21, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jonathan Christen, John Charles Howard, Marcus Tanner, Ben Sugden, Robert C. Memmott, Kenneth Charles Ouellette, Alex Kipman, Todd Alan Omotani, James T. Reichert, Jr.
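
The annotation system above has a recording phase (an anchored location, a telemetry stream of the first user's movement, and a message) and a playback phase (a visual indicator and a simulacrum animated along the recorded telemetry). A compact sketch of that data model and flow, with hypothetical class names and print statements standing in for rendering:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Annotation:
    location: Vec3                                        # where the indicator is anchored
    message: str                                          # message left by the first user
    telemetry: List[Vec3] = field(default_factory=list)   # recorded positions

    def record_pose(self, pose: Vec3) -> None:
        """Recording phase: append one telemetry sample."""
        self.telemetry.append(pose)

    def play_back(self) -> None:
        """Playback phase: step a simulacrum along the telemetry,
        then present the recorded message."""
        for step, pose in enumerate(self.telemetry):
            print(f"simulacrum step {step}: move to {pose}")
        print(f"message at {self.location}: {self.message}")

if __name__ == "__main__":
    note = Annotation(location=(2.0, 1.5, 0.0), message="Check this valve.")
    for pose in [(0.0, 1.7, 0.0), (1.0, 1.7, 0.0), (2.0, 1.6, 0.0)]:
        note.record_pose(pose)
    note.play_back()
```
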
  • Publication number: 20180211448
    Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
    Type: Application
    Filed: January 31, 2018
    Publication date: July 26, 2018
    Inventors: Avi BAR-ZEEV, Bob CROCCO, Alex KIPMAN, John LEWIS
  • Patent number: 9977492
    Abstract: Embodiments that relate to presenting a mixed reality environment via a mixed reality display device are disclosed. For example, one disclosed embodiment provides a method for presenting a mixed reality environment via a head-mounted display device. The method includes using head pose data to generally identify one or more gross selectable targets within a sub-region of a spatial region occupied by the mixed reality environment. The method further includes specifically identifying a fine selectable target from among the gross selectable targets based on eye-tracking data. Gesture data is then used to identify a gesture, and an operation associated with the identified gesture is performed on the fine selectable target.
    Type: Grant
    Filed: December 6, 2012
    Date of Patent: May 22, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Peter Tobias Kinnebrew, Alex Kipman
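
The selection scheme above funnels three signals: head pose narrows attention to a sub-region of gross selectable targets, eye tracking picks out the fine target, and a gesture determines the operation performed on it. A schematic sketch of that funnel, with made-up angular thresholds and a trivial gesture table (neither taken from the patent):

```python
import math
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Target:
    name: str
    direction: tuple  # unit-length direction from the user to the target

def angle_between(a: tuple, b: tuple) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def gross_targets(targets: List[Target], head_dir: tuple, cone_deg: float = 30.0) -> List[Target]:
    """Head pose: keep targets inside a wide cone around the head direction."""
    return [t for t in targets if angle_between(t.direction, head_dir) <= cone_deg]

def fine_target(candidates: List[Target], gaze_dir: tuple) -> Optional[Target]:
    """Eye tracking: pick the candidate closest to the gaze ray."""
    return min(candidates, key=lambda t: angle_between(t.direction, gaze_dir), default=None)

GESTURE_OPS = {"air_tap": "activate", "pinch_drag": "move"}  # hypothetical mapping

if __name__ == "__main__":
    scene = [Target("menu", (0.0, 0.0, 1.0)),
             Target("lamp", (0.3, 0.0, 1.0)),
             Target("door", (1.0, 0.0, 0.2))]
    coarse = gross_targets(scene, head_dir=(0.1, 0.0, 1.0))
    chosen = fine_target(coarse, gaze_dir=(0.28, 0.0, 1.0))
    print(f"{GESTURE_OPS['air_tap']} -> {chosen.name if chosen else 'nothing'}")
```
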
  • Patent number: 9911236
    Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
    Type: Grant
    Filed: February 12, 2016
    Date of Patent: March 6, 2018
    Assignee: Telefonaktiebolaget LM Ericsson (publ)
    Inventors: Avi Bar-Zeev, Bob Crocco, Alex Kipman, John Lewis
  • Patent number: 9761057
    Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
    Type: Grant
    Filed: November 21, 2016
    Date of Patent: September 12, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
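
The final entry describes indicating positional information for objects that lie outside the user's field of view. A small sketch of one way such cues could be computed, assuming a simple horizontal field-of-view test; the thresholds and the text cues are illustrative assumptions, not the patented user interface:

```python
import math
from typing import List, Tuple

def bearing_deg(user_pos: Tuple[float, float], user_heading_deg: float,
                obj_pos: Tuple[float, float]) -> float:
    """Signed horizontal angle from the user's heading to the object (degrees),
    positive to the left, negative to the right."""
    dx, dy = obj_pos[0] - user_pos[0], obj_pos[1] - user_pos[1]
    absolute = math.degrees(math.atan2(dy, dx))
    return (absolute - user_heading_deg + 180.0) % 360.0 - 180.0

def out_of_view_indicators(user_pos: Tuple[float, float], heading_deg: float,
                           objects: List[Tuple[str, Tuple[float, float]]],
                           fov_deg: float = 90.0) -> List[str]:
    """For each object outside the field of view, emit a positional cue."""
    cues = []
    for name, pos in objects:
        rel = bearing_deg(user_pos, heading_deg, pos)
        if abs(rel) > fov_deg / 2.0:
            side = "left" if rel > 0 else "right"
            cues.append(f"turn {side} {abs(rel):.0f} degrees to reach {name}")
    return cues

if __name__ == "__main__":
    objs = [("hologram A", (10.0, 0.0)), ("hologram B", (0.0, 10.0)), ("hologram C", (-10.0, 0.0))]
    for cue in out_of_view_indicators((0.0, 0.0), heading_deg=0.0, objects=objs, fov_deg=90.0):
        print(cue)
```
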