Patents by Inventor Roger H. Hoole
Roger H. Hoole has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230115736
Abstract: An interactive environment image may be displayed in a virtual environment space, and interaction with the interactive environment image may be detected within a three-dimensional space that corresponds to the virtual environment space. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is displayed to provide a visual representation of an interactive environment image including one or more virtual objects, which may be spatially positioned. User interaction with the visualized representation in the virtual environment space may be detected and, in response to user interaction, the interactive environment image may be changed.
Type: Application
Filed: December 13, 2022
Publication date: April 13, 2023
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
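The detect-and-respond loop this abstract describes can be sketched roughly as follows. This is an illustrative sketch only; all class, method, and parameter names (`VirtualObject`, `InteractiveEnvironmentImage`, `object_near`, the 0.5 radius) are hypothetical and do not come from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    name: str
    position: tuple          # (x, y, z) in virtual-environment coordinates
    color: str = "white"

@dataclass
class InteractiveEnvironmentImage:
    objects: list = field(default_factory=list)

    def object_near(self, point, radius=0.5):
        # Find the first spatially positioned virtual object within
        # `radius` of a 3-D interaction point.
        for obj in self.objects:
            dist_sq = sum((a - b) ** 2 for a, b in zip(obj.position, point))
            if dist_sq <= radius ** 2:
                return obj
        return None

    def handle_interaction(self, point):
        # Change the interactive environment image in response to a
        # detected user interaction at `point`.
        obj = self.object_near(point)
        if obj is not None:
            obj.color = "highlighted"
        return obj
```

For example, interacting at (0.1, 0, 0) near a cube placed at the origin would return that cube with its color changed, while an interaction far from every object would return None and leave the image unchanged.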
-
Patent number: 11526238
Abstract: An interactive environment image may be displayed in a virtual environment space, and interaction with the interactive environment image may be detected within a three-dimensional space that corresponds to the virtual environment space. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is displayed to provide a visual representation of an interactive environment image including one or more virtual objects, which may be spatially positioned. User interaction with the visualized representation in the virtual environment space may be detected and, in response to user interaction, the interactive environment image may be changed.
Type: Grant
Filed: February 23, 2021
Date of Patent: December 13, 2022
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Publication number: 20210255728
Abstract: An interactive environment image may be displayed in a virtual environment space, and interaction with the interactive environment image may be detected within a three-dimensional space that corresponds to the virtual environment space. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is displayed to provide a visual representation of an interactive environment image including one or more virtual objects, which may be spatially positioned. User interaction with the visualized representation in the virtual environment space may be detected and, in response to user interaction, the interactive environment image may be changed.
Type: Application
Filed: February 23, 2021
Publication date: August 19, 2021
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Patent number: 10928958
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Grant
Filed: May 26, 2020
Date of Patent: February 23, 2021
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
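One piece of the scheme this family of projection patents describes — deciding whether a 3-D point detected over the projection surface counts as a touch, and mapping it to a pixel in the projected image — might be sketched as below. The function name, coordinate conventions, and the 2 cm touch threshold are illustrative assumptions, not details from the patent:

```python
def surface_point_to_pixel(point_3d, surface_origin, surface_size,
                           image_size, touch_height=0.02):
    """Map a 3-D point detected over a projection surface to image pixels.

    Returns None unless the point lies within `touch_height` metres of the
    surface (i.e. the user is touching the projection, not hovering) and
    falls inside the projected area.
    """
    x, y, z = point_3d
    ox, oy = surface_origin
    width, height = surface_size          # physical extent of the projection, metres
    if z > touch_height:
        return None                        # hovering in the 3-D space, not touching
    u = (x - ox) / width * image_size[0]
    v = (y - oy) / height * image_size[1]
    if 0 <= u < image_size[0] and 0 <= v < image_size[1]:
        return (int(u), int(v))
    return None
```

A real system would replace the linear scaling with a calibrated homography between the camera, the projector, and the surface, but the touch-versus-hover distinction in the vertical axis is the part specific to detecting interaction "within a three-dimensional space over the surfaces".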
-
Publication number: 20200285346
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Application
Filed: May 26, 2020
Publication date: September 10, 2020
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Patent number: 10664105
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Grant
Filed: July 23, 2019
Date of Patent: May 26, 2020
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Publication number: 20190346968
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Application
Filed: July 23, 2019
Publication date: November 14, 2019
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Publication number: 20190240567
Abstract: An apparatus for detecting user interaction or another input with a projected image includes a scanning mechanism. The scanning mechanism is capable of scanning at least one input region for any objects therein. An input identification mechanism may identify the input indicated by the object, movement of the object, or a change in the object's position. That information may be used to alter the projected display.
Type: Application
Filed: April 16, 2019
Publication date: August 8, 2019
Inventors: Mark L. Davis, John M. Black, Roger H. Hoole, Jeffrey Taylor, Kirby Bisline
-
Patent number: 10359888
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Grant
Filed: May 15, 2018
Date of Patent: July 23, 2019
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Patent number: 10258878
Abstract: An apparatus for detecting user interaction or another input with a projected display includes a boundary definition mechanism and a scanning mechanism. The boundary definition mechanism is capable of defining at least one input region. The scanning mechanism is capable of scanning the at least one input region for any objects therein. An input identification mechanism may identify the input indicated by the object, movement of the object, or a change in the object's position. That information may be used to alter the projected display.
Type: Grant
Filed: August 22, 2017
Date of Patent: April 16, 2019
Assignee: MEP Tech
Inventors: Mark L. Davis, John M. Black, Roger H. Hoole, Jeffrey Taylor, Kirby Bisline
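The boundary-definition and input-identification steps of this apparatus can be illustrated with a small sketch. The region layout, the function name, and the (x, y, width, height) convention are hypothetical choices for the example, not details taken from the patent:

```python
def classify_inputs(regions, detections):
    """Assign each scanned object to the input region containing it.

    `regions` maps a region name to an (x, y, width, height) boundary,
    as a boundary definition mechanism might produce; `detections` maps
    an object id to its scanned (x, y) position. Returns a list of
    (object_id, region_name) input events; objects outside every
    region produce no event.
    """
    events = []
    for obj_id, (x, y) in detections.items():
        for name, (rx, ry, rw, rh) in regions.items():
            if rx <= x < rx + rw and ry <= y < ry + rh:
                events.append((obj_id, name))
                break   # an object belongs to at most one region
    return events
```

For instance, with two side-by-side 10x10 regions, an object scanned at (12, 5) would be reported as an input in the second region, and that event could then drive a change to the projected display.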
-
Publication number: 20180260078
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Application
Filed: May 15, 2018
Publication date: September 13, 2018
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Patent number: 9971458
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Grant
Filed: January 24, 2017
Date of Patent: May 15, 2018
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Patent number: 9946333
Abstract: An accessory facilitating interaction with a projected image at least partially sourced by an image generation device. The accessory receives an input image from a device, and projects at least a derived image of the input image onto a surface on which the accessory sits. For instance, the accessory might project the input image itself, or perhaps some post-processed version of the input image. A camera system captures data representing user interaction with the projected image. Based on this data, an image input event is detected, and then communicated to the device. For instance, if the image generation device were a touch-sensitive device, the device may respond to the user contacting the projected image the same as it would if the user touched the image generation device at the same corresponding location. Embodiments described herein also relate to color compensation of a displayed image.
Type: Grant
Filed: April 19, 2016
Date of Patent: April 17, 2018
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Matthew Lund Stoker, William Lorenzo Swank, Donald Roy Mealing, Roger H. Hoole, Jeffrey F. Taylor
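The color-compensation idea mentioned at the end of this abstract — adjusting the projected pixels so the image looks right on a tinted surface — can be sketched with a simple per-channel model. The function name, the linear reflectance model, and the 0-255 ranges are assumptions made for this example, not the patent's method:

```python
def compensate_color(desired_rgb, surface_rgb):
    """Boost a projected pixel so it appears as `desired_rgb` on a tinted surface.

    Assumes a simple linear model in which the surface reflects each channel
    in proportion to surface_rgb / 255, so the projector must emit
    desired / reflectance, clipped to the displayable range.
    """
    out = []
    for desired, surface in zip(desired_rgb, surface_rgb):
        reflectance = max(surface / 255.0, 1e-3)   # guard against division by zero
        out.append(min(255, round(desired / reflectance)))
    return tuple(out)
```

On a pure white surface the pixel passes through unchanged; on a surface that absorbs half the green light, the green channel is roughly doubled (and saturates at 255 when full compensation is physically impossible).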
-
Publication number: 20180024420
Abstract: A projector that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose, but one example is to provide depth information regarding physical item(s) interacting with the projected visible image. The projector includes multiple projecting units (e.g., one for each pixel to be displayed), each including light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project the visible image and the non-visible image. A depth sensing module detects depth of surfaces within the scope of the non-visible image using a reflected portion of the non-visible image. Methods for projecting an image are also disclosed.
Type: Application
Filed: October 3, 2017
Publication date: January 25, 2018
Inventors: Donald Roy Mealing, Mark L. Davis, Roger H. Hoole, Matthew L. Stoker, W. Lorenzo Swank, Michael J. Bradshaw
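Depth sensing from a reflected non-visible (e.g. infrared) pattern is commonly done by triangulation: a feature of the projected pattern shifts in the camera image by a disparity that is inversely proportional to the surface depth. The sketch below shows that standard structured-light relation; the baseline and focal-length values are illustrative assumptions, not parameters from the patent:

```python
def depth_from_disparity(disparity_px, baseline_m=0.05, focal_px=600.0):
    """Triangulate surface depth from the shift of a projected pattern feature.

    Standard structured-light relation: depth = baseline * focal / disparity,
    where `baseline_m` is the emitter-to-sensor distance in metres and
    `focal_px` is the sensor's focal length in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px
```

With a 5 cm baseline and a 600-pixel focal length, a 30-pixel shift corresponds to a surface 1 m away; nearer surfaces produce larger shifts, which is what lets the depth sensing module resolve physical items interacting with the visible image.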
-
Publication number: 20170368453
Abstract: An apparatus for detecting user interaction or another input with a projected display includes a boundary definition mechanism and a scanning mechanism. The boundary definition mechanism is capable of defining at least one input region. The scanning mechanism is capable of scanning the at least one input region for any objects therein. An input identification mechanism may identify the input indicated by the object, movement of the object, or a change in the object's position. That information may be used to alter the projected display.
Type: Application
Filed: August 22, 2017
Publication date: December 28, 2017
Inventors: Mark L. Davis, John M. Black, Roger H. Hoole, Jeffrey Taylor, Kirby Bisline
-
Patent number: 9778546
Abstract: A projector that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose, but one example is to provide depth information regarding physical item(s) interacting with the projected visible image. The projector includes multiple projecting units (e.g., one for each pixel to be displayed), each including light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project the visible image and the non-visible image. A depth sensing module detects depth of surfaces within the scope of the non-visible image using a reflected portion of the non-visible image.
Type: Grant
Filed: August 15, 2013
Date of Patent: October 3, 2017
Assignee: MEP Tech, Inc.
Inventors: Donald Roy Mealing, Mark L. Davis, Roger H. Hoole, Matthew L. Stoker, W. Lorenzo Swank, Michael J. Bradshaw
-
Patent number: 9737798
Abstract: A scanning game input mechanism that includes a light-emitting mechanism that defines multiple input regions for a game in which there are multiple players. Each of the input regions is a portion of the playing surface in which a corresponding player subset is to provide physical input (such as rolling dice, playing cards, or placing game pieces) to affect game state. A scanning mechanism scans objects placed within the input regions, while a communication mechanism communicates information regarding the scanned objects. The information might, for example, be communicated to affect an electronic game state maintained in another device or distributed across multiple devices.
Type: Grant
Filed: June 15, 2012
Date of Patent: August 22, 2017
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, John M. Black, Roger H. Hoole, Jeffrey Taylor, Kirby Bisline
-
Publication number: 20170235430
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Application
Filed: January 24, 2017
Publication date: August 17, 2017
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Patent number: 9550124
Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual environment space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual environment space may be detected and, in response to user interaction, the projected visualized representation may be changed.
Type: Grant
Filed: August 19, 2014
Date of Patent: January 24, 2017
Assignee: MEP Tech, Inc.
Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
-
Publication number: 20160306418
Abstract: An accessory facilitating interaction with a projected image at least partially sourced by an image generation device. The accessory receives an input image from a device, and projects at least a derived image of the input image onto a surface on which the accessory sits. For instance, the accessory might project the input image itself, or perhaps some post-processed version of the input image. A camera system captures data representing user interaction with the projected image. Based on this data, an image input event is detected, and then communicated to the device. For instance, if the image generation device were a touch-sensitive device, the device may respond to the user contacting the projected image the same as it would if the user touched the image generation device at the same corresponding location. Embodiments described herein also relate to color compensation of a displayed image.
Type: Application
Filed: April 19, 2016
Publication date: October 20, 2016
Inventors: Mark L. Davis, Matthew Lund Stoker, William Lorenzo Swank, Donald Roy Mealing, Roger H. Hoole, Jeffrey F. Taylor