Patents by Inventor Forest Woodcroft Gouin
Forest Woodcroft Gouin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230401178
Abstract: A metafolder system manages a graph that identifies activities that are performed with respect to digital cards and that identifies relationships among the digital cards. The digital cards from a plurality of different locations in the de-centralized data storage system architecture can be aggregated for projection as coming from a single virtual location.
Type: Application
Filed: June 13, 2022
Publication date: December 14, 2023
Inventors: Salil Das, Peter Loren Engrav, Cezar Augusto Alevatto Guimaraes Neto, Daniel Gabriel Zarzar, Forest Woodcroft Gouin
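The aggregation this abstract describes, a graph of card activities and relationships projected as one virtual folder, can be sketched roughly as below. All class and method names (DigitalCard, MetafolderGraph, project_virtual_folder) are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalCard:
    card_id: str
    location: str   # where the card actually lives in the de-centralized store
    title: str

@dataclass
class MetafolderGraph:
    cards: dict = field(default_factory=dict)          # card_id -> DigitalCard
    activities: dict = field(default_factory=dict)     # card_id -> list of activities
    relationships: dict = field(default_factory=dict)  # (card_id, card_id) -> label

    def add_card(self, card: DigitalCard) -> None:
        self.cards[card.card_id] = card
        self.activities.setdefault(card.card_id, [])

    def record_activity(self, card_id: str, activity: str) -> None:
        self.activities[card_id].append(activity)

    def relate(self, a: str, b: str, label: str) -> None:
        self.relationships[(a, b)] = label

    def project_virtual_folder(self) -> list:
        # Aggregate cards scattered across storage locations so a client
        # sees them as if they came from a single virtual location.
        return sorted(self.cards.values(), key=lambda c: c.title)

graph = MetafolderGraph()
graph.add_card(DigitalCard("c1", "tenant-a/shard-3", "Quarterly plan"))
graph.add_card(DigitalCard("c2", "tenant-b/shard-9", "Design notes"))
graph.record_activity("c1", "edited")
graph.relate("c1", "c2", "references")
print([c.title for c in graph.project_virtual_folder()])
```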
-
Patent number: 10740971
Abstract: A wearable, head-mounted display system includes a near-eye display to display an augmented reality object perceivable at an apparent real world depth and an apparent real world location by a wearer of the head-mounted display system, and a controller to adjust the apparent real world location of the augmented reality object as a function of a field of view (FOV) of the wearer. The function is based on a bounding region of the augmented reality object and one or more overlap parameters between the bounding region of the augmented reality object and the FOV of the wearer.
Type: Grant
Filed: June 19, 2015
Date of Patent: August 11, 2020
Assignee: Microsoft Technology Licensing, LLC
Inventors: Scott Ramsby, Joe Thompson, Dan Osborn, Shawn Crispin Wright, Brian Kramp, Megan Saunders, Forest Woodcroft Gouin
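The overlap-based repositioning described here can be illustrated with a minimal 2-D sketch. The rectangle representation, the 0.5 overlap threshold, and the recenter-on-FOV policy are assumptions made for illustration, not the claimed method.

```python
def overlap_fraction(region, fov):
    # region, fov: (x_min, y_min, x_max, y_max) in the same angular space.
    # Returns the share of the bounding region that falls inside the FOV.
    w = max(0.0, min(region[2], fov[2]) - max(region[0], fov[0]))
    h = max(0.0, min(region[3], fov[3]) - max(region[1], fov[1]))
    area = (region[2] - region[0]) * (region[3] - region[1])
    return (w * h) / area if area else 0.0

def adjust_location(region, fov, min_overlap=0.5):
    # If the object's bounding region overlaps the wearer's FOV less than the
    # threshold, move it back into view; otherwise leave it where it is.
    if overlap_fraction(region, fov) >= min_overlap:
        return region
    w, h = region[2] - region[0], region[3] - region[1]
    cx, cy = (fov[0] + fov[2]) / 2, (fov[1] + fov[3]) / 2
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

fov = (-0.4, -0.3, 0.4, 0.3)   # wearer's current field of view
obj = (0.6, 0.0, 0.9, 0.2)     # bounding region that has drifted out of view
print(adjust_location(obj, fov))
```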
-
Patent number: 10409443
Abstract: A head mounted display device and method for contextual cursors, comprising a camera configured to capture a series of images, a display, and a processor configured to: display, to a user of the head mounted display device, a virtual cursor on the display so as to appear at a location in a three dimensional environment, determine whether motion of a hand of the user in the images is trackable, and in response to at least determining that motion of the hand is trackable, modify a visual appearance of the virtual cursor to indicate that motion of the hand is trackable, and in response to at least determining that motion of the hand is not trackable, modify the visual appearance of the virtual cursor to indicate that motion of the hand is not trackable.
Type: Grant
Filed: June 24, 2015
Date of Patent: September 10, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Shawn Crispin Wright, Dan Osborn, Joe Thompson, Scott Robert Ramsby, Forest Woodcroft Gouin, Megan Saunders, Aaron David Cottle
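A rough sketch of the appearance-switching behavior in this abstract, assuming a stand-in trackability test (the real device derives this from the camera images); the names and the three-frame rule are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class VirtualCursor:
    position: tuple          # apparent location in the three dimensional environment
    style: str = "ring"      # visual appearance shown on the display

def hand_motion_trackable(frames) -> bool:
    # Stand-in tracker: simply require enough recent frames flagged as
    # containing a hand; the actual device decides this from its images.
    return sum(1 for f in frames if f.get("hand_detected")) >= 3

def update_cursor(cursor: VirtualCursor, frames) -> VirtualCursor:
    # Switch the cursor's appearance so the user can tell whether hand
    # motion is currently trackable.
    cursor.style = "hand-ready ring" if hand_motion_trackable(frames) else "dot"
    return cursor

frames = [{"hand_detected": True}] * 3 + [{"hand_detected": False}]
print(update_cursor(VirtualCursor((0.0, 0.0, 2.0)), frames).style)
```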
-
Patent number: 10146303
Abstract: A method to provide visual feedback for gaze-based user-interface navigation includes presenting, on a display, a first image representing a digital object available for user interaction, recognizing a user gaze axis, and computing a point of intersection of the user gaze axis through the first image. An offset distance between the point of intersection and a reference position of the first image is then recognized, and a second image is presented on the display. The second image is presented displaced from the point of intersection by an amount dependent on the offset distance.
Type: Grant
Filed: May 20, 2015
Date of Patent: December 4, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Alberto Cerriteno, Aaron Chandler Jeromin, Megan Saunders, Dan Osborn, Adam Christopher Heaney, Forest Woodcroft Gouin, James Nakashima, Patrick Ryan
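The offset-dependent displacement can be sketched as a small geometric helper. The proportional "gain" policy and the function name are assumptions; the patent does not specify this exact mapping.

```python
import math

def feedback_position(intersection, reference, gain=0.5):
    # Offset between where the gaze axis crosses the first image and the
    # image's reference position (for example, its center).
    dx = intersection[0] - reference[0]
    dy = intersection[1] - reference[1]
    offset = math.hypot(dx, dy)
    if offset == 0.0:
        return intersection
    # Present the second image displaced from the intersection point along
    # the offset direction, by an amount that depends on the offset distance.
    ux, uy = dx / offset, dy / offset
    return (intersection[0] + gain * offset * ux,
            intersection[1] + gain * offset * uy)

print(feedback_position(intersection=(0.12, 0.05), reference=(0.0, 0.0)))
```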
-
Patent number: 10127917
Abstract: A conferencing system includes a display device that displays video received from a remote communication device of a communication partner. An audio stream is transmitted to the remote communication device. The audio stream includes real-world sounds produced by one or more real-world audio sources captured by a microphone array and virtual sounds produced by one or more virtual audio sources. A relative volume of sounds in the audio stream is selectively adjusted based, at least in part, on real-world positioning of corresponding audio sources, including real-world and/or virtualized audio sources.
Type: Grant
Filed: December 16, 2016
Date of Patent: November 13, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Shawn Crispin Wright, Dan Osborn, Joe Thompson, Hongwu Huai, Forest Woodcroft Gouin, Megan Saunders
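A minimal sketch of positioning-based relative volume adjustment, assuming a simple 1/(1 + d) distance rolloff; the rolloff law and the dictionary layout are illustrative stand-ins, not the claimed mixing method.

```python
import math

def mix_gains(sources, listener, rolloff=1.0):
    # sources: dicts with a 'name' and a 3-D 'position' (real-world or virtual).
    # Returns a per-source gain so nearer sources stay louder in the
    # transmitted audio stream and more distant ones are attenuated.
    gains = {}
    for s in sources:
        d = math.dist(s["position"], listener)
        gains[s["name"]] = 1.0 / (1.0 + rolloff * d)
    return gains

sources = [
    {"name": "speaker (real)", "position": (0.5, 0.0, 1.0)},
    {"name": "hologram (virtual)", "position": (3.0, 0.0, 4.0)},
]
print(mix_gains(sources, listener=(0.0, 0.0, 0.0)))
```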
-
Patent number: 9934614
Abstract: An example wearable display system includes a controller, a left display to display a left-eye augmented reality image with a left-eye display size at left-eye display coordinates, and a right display to display a right-eye augmented reality image with a right-eye display size at right-eye display coordinates, the left-eye and right-eye augmented reality images collectively forming an augmented reality object perceivable at an apparent real world depth by a wearer of the display system. The controller sets the left-eye display coordinates relative to the right-eye display coordinates as a function of the apparent real world depth of the augmented reality object. The function maintains an aspect of the left-eye and right-eye display sizes throughout a non-scaling range of apparent real world depths of the augmented reality object, and the function scales the left-eye and right-eye display sizes with changing apparent real world depth outside the non-scaling range.
Type: Grant
Filed: May 20, 2015
Date of Patent: April 3, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Scott Ramsby, Dan Osborn, Shawn Wright, Anatolie Gavriliuc, Forest Woodcroft Gouin, Megan Saunders, Jesse Rapczak, Stephen Latta, Adam G. Poulos, Daniel McCulloch, Wei Zhang
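The depth-dependent left-eye/right-eye offset and the non-scaling size range can be illustrated as below. The interpupillary distance, the range bounds, and the inverse-depth scaling law are assumed values for illustration, not figures from the patent.

```python
def stereo_layout(depth_m, ipd_m=0.064, non_scaling=(1.25, 5.0), base_size=1.0):
    near, far = non_scaling
    # Horizontal disparity between the left-eye and right-eye images shrinks
    # as the apparent real world depth of the object grows.
    disparity = ipd_m / depth_m
    # Display size is held constant across the non-scaling depth range and
    # scales with depth once the object moves outside that range.
    if near <= depth_m <= far:
        size = base_size
    elif depth_m < near:
        size = base_size * near / depth_m   # grows as the object comes closer
    else:
        size = base_size * far / depth_m    # shrinks as the object recedes
    left_x, right_x = +disparity / 2, -disparity / 2
    return left_x, right_x, size

print(stereo_layout(2.0))   # inside the non-scaling range
print(stereo_layout(8.0))   # beyond it: smaller disparity, scaled-down size
```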
-
Patent number: 9898865
Abstract: A method for operating a computing device is described herein. The method includes determining a user's gaze direction based on a gaze input, determining an intersection between the user's gaze direction and an identified environmental surface in a 3-dimensional environment, and generating a drawing surface based on the intersection within a user interface on a display.
Type: Grant
Filed: June 22, 2015
Date of Patent: February 20, 2018
Assignee: Microsoft Technology Licensing, LLC
Inventors: Joe Thompson, Dan Osborn, Tarek Hefny, Stephen G. Latta, Forest Woodcroft Gouin, James Nakashima, Megan Saunders, Anatolie Gavriliuc, Alberto E. Cerriteno, Shawn Crispin Wright
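The gaze/surface intersection underlying the drawing-surface placement reduces to a ray-plane test in the simplest case; this sketch assumes the environmental surface is a plane and uses made-up coordinates.

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    # Where the user's gaze ray meets an identified environmental surface
    # (modelled here as an infinite plane). Returns None if there is no hit.
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None                       # gaze is parallel to the surface
    t = sum((p - o) * n for p, o, n in
            zip(plane_point, origin, plane_normal)) / denom
    if t < 0:
        return None                       # surface is behind the user
    return tuple(o + t * d for o, d in zip(origin, direction))

hit = ray_plane_intersection(
    origin=(0.0, 1.6, 0.0),          # head position
    direction=(0.0, -0.2, 1.0),      # gaze direction (need not be normalized)
    plane_point=(0.0, 0.0, 3.0),     # a point on a wall 3 m ahead
    plane_normal=(0.0, 0.0, -1.0),
)
print("drawing surface anchored at", hit)
```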
-
Patent number: 9824499
Abstract: A head-mounted display includes a visible-light camera configured to collect a visible-light image of a physical space, a surface sensor configured to measure one or more surface parameters of the physical space, a see-through display configured to visually present an augmentation image while light from the physical space passes through the see-through display to a user eye, and an augmented-reality engine. The augmented-reality engine may be configured to identify a surface of the physical space from the one or more measured surface parameters, compose a mixed-reality image that includes the augmentation image overlaid on the visible-light image, and visually present, via the see-through display, the mixed-reality image in alignment with the identified surface.
Type: Grant
Filed: June 23, 2015
Date of Patent: November 21, 2017
Assignee: Microsoft Technology Licensing, LLC
Inventors: Shawn Crispin Wright, Dan Osborn, Alberto E. Cerriteno, Bofan Hsu, Evan Jones, Joe Thompson, Forest Woodcroft Gouin, Megan Saunders
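A toy sketch of composing a mixed-reality image by overlaying an augmentation image on a visible-light frame at the region of an identified surface; the nested-list "images" and the None-as-transparent convention are illustrative stand-ins for real image buffers.

```python
def compose_mixed_reality(visible, augmentation, top_left):
    # Overlay the augmentation image onto the visible-light image so that it
    # lands on the region where the identified surface was detected.
    out = [row[:] for row in visible]
    r0, c0 = top_left
    for r, row in enumerate(augmentation):
        for c, px in enumerate(row):
            if px is not None:            # None = transparent augmentation pixel
                out[r0 + r][c0 + c] = px
    return out

visible = [[0] * 6 for _ in range(4)]     # stand-in camera frame
augmentation = [[9, 9], [9, None]]        # stand-in hologram sprite
for row in compose_mixed_reality(visible, augmentation, (1, 2)):
    print(row)
```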
-
Publication number: 20170098453
Abstract: A conferencing system includes a display device that displays video received from a remote communication device of a communication partner. An audio stream is transmitted to the remote communication device. The audio stream includes real-world sounds produced by one or more real-world audio sources captured by a microphone array and virtual sounds produced by one or more virtual audio sources. A relative volume of sounds in the audio stream is selectively adjusted based, at least in part, on real-world positioning of corresponding audio sources, including real-world and/or virtualized audio sources.
Type: Application
Filed: December 16, 2016
Publication date: April 6, 2017
Applicant: Microsoft Technology Licensing, LLC
Inventors: Shawn Crispin Wright, Dan Osborn, Joe Thompson, Hongwu Huai, Forest Woodcroft Gouin, Megan Saunders
-
Publication number: 20160379660
Abstract: A conferencing system includes a near-eye display device that displays video received from a remote communication device of a communication partner. An audio stream is transmitted to the remote communication device. The audio stream includes real-world sounds produced by one or more real-world audio sources captured by a spatially-diverse microphone array and virtual sounds produced by one or more virtual audio sources. A relative volume of background sounds in the audio stream is selectively reduced based, at least in part, on real-world positioning of corresponding audio sources, including real-world and/or virtualized audio sources.
Type: Application
Filed: June 24, 2015
Publication date: December 29, 2016
Inventors: Shawn Crispin Wright, Dan Osborn, Joe Thompson, Hongwu Huai, Forest Woodcroft Gouin, Megan Saunders
-
Publication number: 20160379408
Abstract: A head-mounted display includes a visible-light camera configured to collect a visible-light image of a physical space, a surface sensor configured to measure one or more surface parameters of the physical space, a see-through display configured to visually present an augmentation image while light from the physical space passes through the see-through display to a user eye, and an augmented-reality engine. The augmented-reality engine may be configured to identify a surface of the physical space from the one or more measured surface parameters, compose a mixed-reality image that includes the augmentation image overlaid on the visible-light image, and visually present, via the see-through display, the mixed-reality image in alignment with the identified surface.
Type: Application
Filed: June 23, 2015
Publication date: December 29, 2016
Inventors: Shawn Crispin Wright, Dan Osborn, Alberto E. Cerriteno, Bofan Hsu, Evan Jones, Joe Thompson, Forest Woodcroft Gouin, Megan Saunders
-
Publication number: 20160378294
Abstract: A head mounted display device and method for contextual cursors, comprising a camera configured to capture a series of images, a display, and a processor configured to: display, to a user of the head mounted display device, a virtual cursor on the display so as to appear at a location in a three dimensional environment, determine whether motion of a hand of the user in the images is trackable, and in response to at least determining that motion of the hand is trackable, modify a visual appearance of the virtual cursor to indicate that motion of the hand is trackable, and in response to at least determining that motion of the hand is not trackable, modify the visual appearance of the virtual cursor to indicate that motion of the hand is not trackable.
Type: Application
Filed: June 24, 2015
Publication date: December 29, 2016
Inventors: Shawn Crispin Wright, Dan Osborn, Joe Thompson, Scott Robert Ramsby, Forest Woodcroft Gouin, Megan Saunders, Aaron David Cottle
-
Patent number: 9530426
Abstract: A conferencing system includes a near-eye display device that displays video received from a remote communication device of a communication partner. An audio stream is transmitted to the remote communication device. The audio stream includes real-world sounds produced by one or more real-world audio sources captured by a spatially-diverse microphone array and virtual sounds produced by one or more virtual audio sources. A relative volume of background sounds in the audio stream is selectively reduced based, at least in part, on real-world positioning of corresponding audio sources, including real-world and/or virtualized audio sources.
Type: Grant
Filed: June 24, 2015
Date of Patent: December 27, 2016
Assignee: Microsoft Technology Licensing, LLC
Inventors: Shawn Crispin Wright, Dan Osborn, Joe Thompson, Hongwu Huai, Forest Woodcroft Gouin, Megan Saunders
-
Publication number: 20160371886
Abstract: A method for operating a computing device is described herein. The method includes determining a user's gaze direction based on a gaze input, determining an intersection between the user's gaze direction and an identified environmental surface in a 3-dimensional environment, and generating a drawing surface based on the intersection within a user interface on a display.
Type: Application
Filed: June 22, 2015
Publication date: December 22, 2016
Inventors: Joe Thompson, Dan Osborn, Tarek Hefny, Stephen G. Latta, Forest Woodcroft Gouin, James Nakashima, Megan Saunders, Anatolie Gavriliuc, Alberto E. Cerriteno, Shawn Crispin Wright
-
Publication number: 20160210784
Abstract: A wearable, head-mounted display system includes a near-eye display to display an augmented reality object perceivable at an apparent real world depth and an apparent real world location by a wearer of the head-mounted display system, and a controller to adjust the apparent real world location of the augmented reality object as a function of a field of view (FOV) of the wearer. The function is based on a bounding region of the augmented reality object and one or more overlap parameters between the bounding region of the augmented reality object and the FOV of the wearer.
Type: Application
Filed: June 19, 2015
Publication date: July 21, 2016
Inventors: Scott Ramsby, Joe Thompson, Dan Osborn, Shawn Crispin Wright, Brian Kramp, Megan Saunders, Forest Woodcroft Gouin
-
Publication number: 20160209917
Abstract: A method to provide visual feedback for gaze-based user-interface navigation includes presenting, on a display, a first image representing a digital object available for user interaction, recognizing a user gaze axis, and computing a point of intersection of the user gaze axis through the first image. An offset distance between the point of intersection and a reference position of the first image is then recognized, and a second image is presented on the display. The second image is presented displaced from the point of intersection by an amount dependent on the offset distance.
Type: Application
Filed: May 20, 2015
Publication date: July 21, 2016
Inventors: Alberto Cerriteno, Aaron Chandler Jeromin, Megan Saunders, Dan Osborn, Adam Christopher Heaney, Forest Woodcroft Gouin, James Nakashima, Patrick Ryan
-
Publication number: 20150254905
Abstract: An example wearable display system includes a controller, a left display to display a left-eye augmented reality image with a left-eye display size at left-eye display coordinates, and a right display to display a right-eye augmented reality image with a right-eye display size at right-eye display coordinates, the left-eye and right-eye augmented reality images collectively forming an augmented reality object perceivable at an apparent real world depth by a wearer of the display system. The controller sets the left-eye display coordinates relative to the right-eye display coordinates as a function of the apparent real world depth of the augmented reality object. The function maintains an aspect of the left-eye and right-eye display sizes throughout a non-scaling range of apparent real world depths of the augmented reality object, and the function scales the left-eye and right-eye display sizes with changing apparent real world depth outside the non-scaling range.
Type: Application
Filed: May 20, 2015
Publication date: September 10, 2015
Inventors: Scott Ramsby, Dan Osborn, Shawn Wright, Anatolie Gavriliuc, Forest Woodcroft Gouin, Megan Saunders, Jesse Rapczak, Stephen Latta, Adam G. Poulos, Daniel McCulloch, Wei Zhang