Patents by Inventor Alex Kipman
Alex Kipman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9911236
Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
Type: Grant
Filed: February 12, 2016
Date of Patent: March 6, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Avi Bar-Zeev, Bob Crocco, Alex Kipman, John Lewis
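The opacity-mask idea in this abstract can be sketched as follows: a boolean mask over the LCD panel's pixels, opaque behind the augmented reality image and transmissive elsewhere, with an eye-tracking offset shifting the opaque region. The bounding-box representation and the `(row, col)` offset convention are illustrative assumptions, not the patent's actual method.

```python
import numpy as np

def opacity_mask(panel_shape, ar_bbox, eye_offset=(0, 0)):
    """Build a per-pixel opacity mask for a see-through LCD panel.

    Pixels behind the augmented-reality image are made opaque so the
    image appears more distinctly; all other pixels stay transmissive.
    `ar_bbox` is (top, left, height, width) of the AR image, and
    `eye_offset` shifts it based on eye tracking.
    """
    mask = np.zeros(panel_shape, dtype=bool)  # False = transmissive
    top, left, h, w = ar_bbox
    top += eye_offset[0]
    left += eye_offset[1]
    # Clamp the opaque region to the panel bounds.
    r0, r1 = max(0, top), min(panel_shape[0], top + h)
    c0, c1 = max(0, left), min(panel_shape[1], left + w)
    mask[r0:r1, c0:c1] = True  # True = opaque
    return mask

# 480x640 panel; AR image at (100, 200), shifted by eye tracking.
mask = opacity_mask((480, 640), (100, 200, 50, 80), eye_offset=(5, -10))
```

A real device would also handle the peripheral-cue case the abstract mentions, activating regions outside the image's footprint.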
-
Patent number: 9861886
Abstract: A virtual character such as an on-screen object, an avatar, an on-screen character, or the like may be animated using a live motion of a user and a pre-recorded motion. For example, a live motion of a user may be captured and a pre-recorded motion such as a pre-recorded artist-generated motion, a pre-recorded motion of the user, and/or a programmatically controlled transformation may be received. The live motion may then be applied to a first portion of the virtual character and the pre-recorded motion may be applied to a second portion of the virtual character such that the virtual character may be animated with a combination of the live and pre-recorded motions.
Type: Grant
Filed: July 14, 2014
Date of Patent: January 9, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kathryn Stone Perez, Alex A. Kipman, Jeffrey Margolis
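The split described above, live motion driving one portion of the character and pre-recorded motion driving the rest, can be sketched with a simple per-joint selection. The joint names and the flat `{joint: position}` layout are illustrative assumptions, not the patent's actual data format.

```python
def animate_character(live_pose, prerecorded_pose, live_joints):
    """Blend a live-captured pose with a pre-recorded motion.

    Joints named in `live_joints` (e.g. the arms) take the user's
    live motion; all other joints take the pre-recorded motion, so
    the character animates with a combination of both.
    """
    return {
        joint: (live_pose[joint] if joint in live_joints
                else prerecorded_pose[joint])
        for joint in prerecorded_pose
    }

live = {"arm": (1, 2), "leg": (0, 0)}       # captured this frame
recorded = {"arm": (9, 9), "leg": (5, 5)}   # artist-generated clip
pose = animate_character(live, recorded, live_joints={"arm"})
```

Here the arm follows the user while the leg plays the recorded clip; a production system would blend smoothly at the boundary rather than switching per joint.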
-
Patent number: 9761057
Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
Type: Grant
Filed: November 21, 2016
Date of Patent: September 12, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
-
Publication number: 20170236332
Abstract: A mixed-reality display device comprises an input system, a display, and a graphics processor. The input system is configured to receive a parameter value, the parameter value being one of a plurality of values of a predetermined range receivable by the input system. The display is configured to display virtual image content that adds an augmentation to a real-world environment viewed by a user of the mixed-reality display device. The graphics processor is coupled operatively to the input system and to the display; it is configured to render the virtual image content so as to variably change the augmentation, thereby variably changing the perceived realism of the real-world environment in correlation to the parameter value.
Type: Application
Filed: October 21, 2016
Publication date: August 17, 2017
Inventors: Alex Kipman, Purnima M. Rao, Rebecca Haruyama, Shih-Sang Carnaven Chiu, Stuart Mayhew, Oscar E. Murillo, Carlos Fernando Faria Costa
-
Patent number: 9734636
Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a first geo-located data item and provides a first visual information density level for the item to a head-mounted display device. When a spatial information density of geo-located data item information is below a threshold, the program provides a second visual information density level, greater than the first level, for a second geo-located data item that is displayed.
Type: Grant
Filed: March 7, 2017
Date of Patent: August 15, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
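The graduated delivery rule in this abstract reduces to a simple threshold decision: show the lower density level by default, and a greater level when the surrounding information is sparse enough to leave room for detail. The numeric levels and the density unit are illustrative assumptions.

```python
def choose_density_level(spatial_density, threshold, first_level=1):
    """Pick the visual information density level for a geo-located item.

    When the spatial density of geo-located item information around
    the user is below the threshold, a second, greater level is
    provided; otherwise the first level is kept.
    """
    return first_level + 1 if spatial_density < threshold else first_level

# Sparse surroundings: room for the richer second level.
sparse = choose_density_level(spatial_density=0.2, threshold=0.5)
# Cluttered surroundings: stay at the first level.
dense = choose_density_level(spatial_density=0.9, threshold=0.5)
```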
-
Patent number: 9734633
Abstract: A system and related methods for visually augmenting an appearance of a physical environment as seen by a user through a head-mounted display device are provided. In one embodiment, a virtual environment generating program receives eye-tracking information, lighting information, and depth information from the head-mounted display. The program generates a virtual environment that models the physical environment and is based on the lighting information and the distance of a real-world object from the head-mounted display. The program visually augments a virtual object representation in the virtual environment based on the eye-tracking information, and renders the virtual object representation on a transparent display of the head-mounted display device.
Type: Grant
Filed: January 27, 2012
Date of Patent: August 15, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Darren Bennett, Brian Mount, Stephen Latta, Alex Kipman, Ryan Hastings, Arthur Tomlin, Sebastian Sylvan, Daniel McCulloch, Jonathan Steed, Jason Scott, Mathew Lamb
-
Publication number: 20170216718
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Application
Filed: April 19, 2017
Publication date: August 3, 2017
Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
-
Publication number: 20170178412
Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a first geo-located data item and provides a first visual information density level for the item to a head-mounted display device. When a spatial information density of geo-located data item information is below a threshold, the program provides a second visual information density level, greater than the first level, for a second geo-located data item that is displayed.
Type: Application
Filed: March 7, 2017
Publication date: June 22, 2017
Applicant: Microsoft Technology Licensing, LLC
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
-
Publication number: 20170144067
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Application
Filed: September 16, 2016
Publication date: May 25, 2017
Inventors: Oscar E. Murillo, Andy D. Wilson, Alex A. Kipman, Janet E. Galore
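The calibration-based assignment described above can be sketched as a nearest-template match: each candidate dictionary carries a template for how its gestures look, and the dictionary closest to the user's captured calibration samples is assigned. The feature vectors and squared-distance metric are illustrative assumptions about the matching step, not the publication's actual algorithm.

```python
def assign_gesture_dictionary(calibration_samples, dictionaries):
    """Assign the gesture dictionary best matching how the user
    performed a set of short calibration gestures.

    `calibration_samples` maps gesture names to captured feature
    vectors; each candidate dictionary maps the same names to
    template vectors. The dictionary with the smallest total
    squared distance to the samples is assigned.
    """
    def total_distance(templates):
        return sum(
            sum((a - b) ** 2 for a, b in zip(templates[name], sample))
            for name, sample in calibration_samples.items()
        )
    return min(dictionaries, key=lambda name: total_distance(dictionaries[name]))

# Calibration: the user waves with a broad, fast motion.
captured = {"wave": (1.0, 0.9)}  # (amplitude, speed) - assumed features
chosen = assign_gesture_dictionary(
    captured,
    {"broad": {"wave": (0.9, 0.8)}, "subtle": {"wave": (0.2, 0.3)}},
)
```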
-
Patent number: 9656162
Abstract: A system recognizes human beings in their natural environment, without special sensing devices attached to the subjects, uniquely identifies them and tracks them in three dimensional space. The resulting representation is presented directly to applications as a multi-point skeletal model delivered in real-time. The device efficiently tracks humans and their natural movements by understanding the natural mechanics and capabilities of the human muscular-skeletal system. The device also uniquely recognizes individuals in order to allow multiple people to interact with the system via natural movements of their limbs and body as well as voice commands/responses.
Type: Grant
Filed: April 14, 2014
Date of Patent: May 23, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: R. Stephen Polzin, Alex A. Kipman, Mark J. Finocchio, Ryan Michael Geiss, Kathryn Stone Perez, Kudo Tsunoda, Darren Alexander Bennett
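The "multi-point skeletal model delivered in real-time" that this abstract presents to applications can be sketched as a small data structure: one skeleton per uniquely identified person, holding named 3-D joint positions. The field names and the two-joint example are illustrative assumptions; a real model would carry many more points.

```python
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    x: float
    y: float
    z: float  # position in three-dimensional space

@dataclass
class Skeleton:
    """Multi-point skeletal model, one per uniquely identified
    person, with no sensing devices attached to the subject."""
    person_id: int
    joints: dict  # joint name -> Joint

    def joint(self, name):
        return self.joints[name]

skeleton = Skeleton(
    person_id=7,
    joints={
        "head": Joint("head", 0.0, 1.7, 2.0),
        "right_hand": Joint("right_hand", 0.4, 1.1, 1.8),
    },
)
```

An application consuming the stream would receive one such skeleton per tracked person per frame and read joints by name, e.g. `skeleton.joint("right_hand")`.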
-
Patent number: 9619939
Abstract: Embodiments that relate to presenting a plurality of visual information density levels for a plurality of geo-located data items in a mixed reality environment are disclosed. For example, in one disclosed embodiment a graduated information delivery program receives information for a selected geo-located data item and provides a minimum visual information density level for the item to a head-mounted display device. The program receives via the head-mounted display device a user input corresponding to the selected geo-located data item. Based on the input, the program provides an increasing visual information density level for the selected item to the head-mounted display device for display within the mixed reality environment.
Type: Grant
Filed: July 31, 2013
Date of Patent: April 11, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda
-
Publication number: 20170069143
Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
Type: Application
Filed: November 21, 2016
Publication date: March 9, 2017
Applicant: Microsoft Technology Licensing, LLC
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
-
Patent number: 9519989
Abstract: Using facial recognition and gesture/body posture recognition techniques, a system can naturally convey the emotions and attitudes of a user via the user's visual representation. Techniques may comprise customizing a visual representation of a user based on detectable characteristics, deducing a user's temperament from the detectable characteristics, and applying attributes indicative of the temperament to the visual representation in real time. Techniques may also comprise processing changes to the user's characteristics in the physical space and updating the visual representation in real time. For example, the system may track a user's facial expressions and body movements to identify a temperament and then apply attributes indicative of that temperament to the visual representation. Thus, a visual representation of a user, which may be, for example, an avatar or fanciful character, can reflect the user's expressions and moods in real time.
Type: Grant
Filed: March 4, 2013
Date of Patent: December 13, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Kathryn Stone Perez, Alex Kipman, Nicholas D. Burton, Andrew Wilson
-
Patent number: 9501873
Abstract: Embodiments are disclosed that relate to operating a user interface on an augmented reality computing device comprising a see-through display system. For example, one disclosed embodiment includes identifying one or more objects located outside a field of view of a user, and for each object of the one or more objects, providing to the user an indication of positional information associated with the object.
Type: Grant
Filed: July 22, 2015
Date of Patent: November 22, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Thomas George Salter, Ben Sugden, Daniel Deptford, Robert Crocco, Jr., Brian Keane, Laura Massey, Alex Kipman, Peter Tobias Kinnebrew, Nicholas Kamuda, Zachary Quarles, Michael Scavezze, Ryan Hastings, Cameron Brown, Tony Ambrus, Jason Scott, John Bevis, Jamie B. Kirschenbaum, Nicholas Gervase Fajt, Michael Klucher, Relja Markovic, Stephen Latta, Daniel McCulloch
-
Patent number: 9495801
Abstract: An augmented reality device including a plurality of sensors configured to output pose information indicating a pose of the augmented reality device. The augmented reality device further includes a band-agnostic filter and a band-specific filter. The band-specific filter includes an error correction algorithm configured to receive pose information as filtered by the band-agnostic filter and reduce a tracking error of the pose information in a selected frequency band. The augmented reality device further includes a display engine configured to position a virtual object on a see-through display as a function of the pose information as filtered by the band-agnostic filter and the band-specific filter.
Type: Grant
Filed: May 1, 2014
Date of Patent: November 15, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Drew Steedly, Calvin Chan, Ethan Eade, Alex Kipman, Georg Klein
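The two-stage pipeline in this abstract, a band-agnostic filter followed by a band-specific error-correction pass, can be sketched on a one-dimensional pose signal. The moving average and the FFT notch below are illustrative stand-ins for the patent's actual filters, not its claimed algorithm.

```python
import numpy as np

def filter_pose(signal, band, fs):
    """Two-stage filtering of a pose-tracking signal.

    Stage 1 is band-agnostic: a simple moving average smooths the
    whole signal. Stage 2 is band-specific: tracking error inside
    one selected frequency band (lo, hi) in Hz is attenuated, here
    by zeroing those FFT bins.
    """
    # Band-agnostic stage: 3-tap moving average.
    smoothed = np.convolve(signal, np.ones(3) / 3.0, mode="same")
    # Band-specific stage: remove energy in the selected band.
    spectrum = np.fft.rfft(smoothed)
    freqs = np.fft.rfftfreq(len(smoothed), d=1.0 / fs)
    lo, hi = band
    spectrum[(freqs >= lo) & (freqs <= hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(smoothed))

# 1 Hz intended motion plus 10 Hz jitter, sampled at 100 Hz.
t = np.arange(100) / 100.0
pose = np.sin(2 * np.pi * 1 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)
cleaned = filter_pose(pose, band=(8.0, 12.0), fs=100.0)
```

After filtering, the 10 Hz jitter band is suppressed while the 1 Hz motion passes through nearly unchanged, which is the behavior the display engine relies on when positioning a virtual object.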
-
Publication number: 20160321841
Abstract: A computing system and method for producing and consuming metadata within multi-dimensional data is provided. The computing system comprises a see-through display, a sensor system, and a processor configured to: in a recording phase, generate an annotation at a location in a three dimensional environment, receive, via the sensor system, a stream of telemetry data recording movement of a first user in the three dimensional environment, receive a message to be recorded from the first user, and store, in memory as annotation data for the annotation, the stream of telemetry data and the message; and in a playback phase, display a visual indicator of the annotation at the location, receive a selection of the visual indicator by a second user, display a simulacrum superimposed onto the three dimensional environment and animated according to the telemetry data, and present the message via the animated simulacrum.
Type: Application
Filed: April 28, 2015
Publication date: November 3, 2016
Inventors: Jonathan Christen, John Charles Howard, Marcus Tanner, Ben Sugden, Robert C. Memmott, Kenneth Charles Ouellette, Alex Kipman, Todd Alan Omotani, James T. Reichert, Jr.
-
Patent number: 9468848
Abstract: Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space. In an example embodiment, the system may monitor a user's gestures and select a particular gesture dictionary in response to the manner in which the user performs the gestures. The gesture dictionary may be assigned in real time with respect to the capture of the data representative of a user's gesture. The system may generate calibration tests for assigning a gesture dictionary. The system may track the user during a set of short gesture calibration tests and assign the gesture dictionary based on a compilation of the data captured that represents the user's gestures.
Type: Grant
Filed: December 12, 2013
Date of Patent: October 18, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Oscar E. Murillo, Andy Wilson, Alex A. Kipman, Janet Galore
-
Patent number: 9400559
Abstract: Systems, methods and computer readable media are disclosed for gesture shortcuts. A user's movement or body position is captured by a capture device of a system, and is used as input to control the system. For a system-recognized gesture, there may be a full version of the gesture and a shortcut of the gesture. Where the system recognizes that either the full version of the gesture or the shortcut of the gesture has been performed, it sends an indication that the system-recognized gesture was observed to a corresponding application. Where the shortcut comprises a subset of the full version of the gesture, and both the shortcut and the full version of the gesture are recognized as the user performs the full version of the gesture, the system recognizes that only a single performance of the gesture has occurred, and indicates to the application as such.
Type: Grant
Filed: May 29, 2009
Date of Patent: July 26, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Stephen Latta, Kevin Geisner, John Clavin, Kudo Tsunoda, Kathryn Stone Perez, Alex Kipman, Relja Markovic, Gregory N. Snook
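The deduplication behavior in the last sentence, where a shortcut and the full gesture both fire during one performance but only a single performance is reported, can be sketched as a time-window collapse. The one-second window is an assumed tuning value, not from the patent.

```python
def report_gestures(events, window=1.0):
    """Collapse shortcut and full-version recognitions of one gesture.

    `events` is a time-ordered list of (timestamp, gesture, form),
    where form is "shortcut" or "full". Because the shortcut is a
    subset of the full gesture, both may fire during a single
    performance; recognitions of the same gesture within `window`
    seconds are reported to the application as one performance.
    """
    reported, last_time = [], {}
    for t, gesture, form in events:
        if gesture in last_time and t - last_time[gesture] <= window:
            last_time[gesture] = t
            continue  # same performance: already reported
        last_time[gesture] = t
        reported.append((t, gesture))
    return reported

observed = [
    (0.0, "throw", "shortcut"),  # shortcut fires first...
    (0.4, "throw", "full"),      # ...then the full version completes
    (3.0, "throw", "shortcut"),  # a separate, later performance
]
performances = report_gestures(observed)
```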
-
Publication number: 20160171779
Abstract: An optical see-through head-mounted display device includes a see-through lens which combines an augmented reality image with light from a real-world scene, while an opacity filter is used to selectively block portions of the real-world scene so that the augmented reality image appears more distinctly. The opacity filter can be a see-through LCD panel, for instance, where each pixel of the LCD panel can be selectively controlled to be transmissive or opaque, based on a size, shape and position of the augmented reality image. Eye tracking can be used to adjust the position of the augmented reality image and the opaque pixels. Peripheral regions of the opacity filter, which are not behind the augmented reality image, can be activated to provide a peripheral cue or a representation of the augmented reality image. In another aspect, opaque pixels are provided at a time when an augmented reality image is not present.
Type: Application
Filed: February 12, 2016
Publication date: June 16, 2016
Inventors: Avi Bar-Zeev, Bob Crocco, Alex Kipman, John Lewis
-
Patent number: 9361732
Abstract: Various embodiments relating to controlling a see-through display are disclosed. In one embodiment, virtual objects may be displayed on the see-through display. The virtual objects transition between having a position that is body-locked and a position that is world-locked based on various transition events.
Type: Grant
Filed: May 1, 2014
Date of Patent: June 7, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Michael John Ebstyne, Frederik Schaffalitzky, Stephen Latta, Paul Albert Lalonde, Drew Steedly, Alex Kipman, Ethan Eade
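The body-locked versus world-locked distinction above can be sketched as a choice between two ways of computing an object's displayed position. Using 2-D positions in place of full 6-DoF poses is a simplification for brevity; the specific transition events are not modeled here.

```python
def virtual_object_position(mode, world_position, body_offset, head_position):
    """Position a virtual object on a see-through display.

    A world-locked object keeps its fixed position in the world; a
    body-locked object sits at a fixed offset from the user's head
    and follows the user. A transition event (e.g. the user walking
    away) would flip `mode` between the two.
    """
    if mode == "world":
        return world_position
    # Body-locked: the object rides along at an offset from the head.
    return (head_position[0] + body_offset[0],
            head_position[1] + body_offset[1])

# Before a transition event: the object stays put in the world.
fixed = virtual_object_position("world", (5.0, 5.0), (1.0, 0.0), (2.0, 2.0))
# After transitioning to body-locked: it follows the user's head.
following = virtual_object_position("body", (5.0, 5.0), (1.0, 0.0), (2.0, 2.0))
```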