Patents Assigned to MEP Tech, Inc.
  • Patent number: 11526238
    Abstract: An interactive environment image may be displayed in a virtual environment space, and interaction with the interactive environment image may be detected within a three-dimensional space that corresponds to the virtual environment space. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is displayed to provide a visual representation of an interactive environment image including one or more virtual objects, which may be spatially positioned. User interaction with the visualized representation in the virtual environment space may be detected and, in response, the interactive environment image may be changed.
    Type: Grant
    Filed: February 23, 2021
    Date of Patent: December 13, 2022
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
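    The flow in this abstract lends itself to a short illustration. The sketch below is a hypothetical Python example, not the patented method: a tracked point in a physical three-dimensional volume is mapped into virtual-space coordinates, hit-tested against spatially positioned virtual objects, and any object it touches is changed. The names VirtualObject, map_to_virtual, and handle_interaction, as well as the spherical hit volumes, are illustrative assumptions.
```python
# Hypothetical sketch: map a tracked physical point into virtual space,
# hit-test it against spatially positioned virtual objects, and change
# any object the user touches. Not taken from the patent.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    position: tuple          # (x, y, z) centre in virtual-space units
    radius: float            # simple spherical hit volume
    highlighted: bool = False

def map_to_virtual(point_m, scale=100.0, origin=(0.0, 0.0, 0.0)):
    """Map a tracked physical point (metres) into virtual-space coordinates."""
    return tuple(scale * (p - o) for p, o in zip(point_m, origin))

def handle_interaction(point_m, objects):
    """Mark any virtual object whose hit volume contains the mapped point."""
    vx, vy, vz = map_to_virtual(point_m)
    changed = []
    for obj in objects:
        ox, oy, oz = obj.position
        if (vx - ox) ** 2 + (vy - oy) ** 2 + (vz - oz) ** 2 <= obj.radius ** 2:
            obj.highlighted = True      # the "change" to the interactive environment image
            changed.append(obj.name)
    return changed

objects = [VirtualObject("cube", (10.0, 0.0, 5.0), 3.0)]
print(handle_interaction((0.1, 0.0, 0.05), objects))   # -> ['cube']
```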
  • Patent number: 10928958
    Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response, the projected visualized representation may be changed.
    Type: Grant
    Filed: May 26, 2020
    Date of Patent: February 23, 2021
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
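    This abstract, shared by the continuation patents listed below, describes detecting interaction within the three-dimensional space over a projection surface. The following is a minimal, hypothetical Python sketch of one way such detection could work, assuming a depth sensor and a baseline depth map of the bare surface; it is not the method claimed in the patent, and detect_touch and its parameters are illustrative.
```python
# Hypothetical sketch (not the patented method): compare a live depth frame
# against a baseline depth map of the bare surface; anything hovering a few
# millimetres above the surface inside a projected region counts as a touch.
import numpy as np

def detect_touch(depth_mm, surface_mm, region, touch_band=(5.0, 25.0)):
    """Return True if something hovers 5-25 mm above the surface inside `region`.

    depth_mm, surface_mm : HxW arrays of distances from the sensor (mm)
    region               : (row_min, row_max, col_min, col_max) in depth-image pixels
    """
    r0, r1, c0, c1 = region
    height = surface_mm[r0:r1, c0:c1] - depth_mm[r0:r1, c0:c1]  # >0 means above surface
    near, far = touch_band
    return bool(np.any((height > near) & (height < far)))

surface = np.full((240, 320), 800.0)          # flat surface 800 mm from the sensor
frame = surface.copy()
frame[100:110, 150:160] = 790.0               # fingertip ~10 mm above the surface
print(detect_touch(frame, surface, (90, 120, 140, 170)))   # -> True
```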
  • Patent number: 10664105
    Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response, the projected visualized representation may be changed.
    Type: Grant
    Filed: July 23, 2019
    Date of Patent: May 26, 2020
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
  • Patent number: 10359888
    Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response, the projected visualized representation may be changed.
    Type: Grant
    Filed: May 15, 2018
    Date of Patent: July 23, 2019
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
  • Patent number: 9971458
    Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual space may be detected and, in response, the projected visualized representation may be changed.
    Type: Grant
    Filed: January 24, 2017
    Date of Patent: May 15, 2018
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
  • Patent number: 9946333
    Abstract: An accessory facilitating interaction with a projected image at least partially sourced by an image generation device. The accessory receives an input image from the device and projects at least a derived image of the input image onto a surface on which the accessory sits. For instance, the accessory might project the input image itself, or perhaps some post-processed version of the input image. A camera system captures data representing user interaction with the projected image. Based on this data, an image input event is detected and then communicated to the device. For instance, if the image generation device is a touch-sensitive device, it may respond to the user contacting the projected image the same as it would if the user touched the image generation device at the same corresponding location. Embodiments described herein also relate to color compensation of a displayed image.
    Type: Grant
    Filed: April 19, 2016
    Date of Patent: April 17, 2018
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Matthew Lund Stoker, William Lorenzo Swank, Donald Roy Mealing, Roger H. Hoole, Jeffrey F. Taylor
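    To illustrate the event path this abstract describes (the camera observes a touch on the projected image, and the device receives an input event at the corresponding location), here is a hypothetical Python sketch. It assumes the projected image appears as an axis-aligned rectangle in the camera frame; a real system would calibrate a full homography. CameraRect and to_device_event are illustrative names, not part of the patent.
```python
# Hypothetical sketch: convert a touch point seen in the accessory's camera
# frame into a touch event in the source device's screen coordinates.
from dataclasses import dataclass

@dataclass
class CameraRect:
    """Bounding box of the projected image within the camera frame (pixels)."""
    left: float
    top: float
    width: float
    height: float

def to_device_event(cam_x, cam_y, rect, device_w, device_h):
    """Map a camera-space touch point to device-screen coordinates."""
    u = (cam_x - rect.left) / rect.width       # normalised 0..1 across the image
    v = (cam_y - rect.top) / rect.height
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None                            # touch fell outside the projection
    return {"type": "touch", "x": round(u * device_w), "y": round(v * device_h)}

rect = CameraRect(left=120, top=80, width=400, height=300)
print(to_device_event(320, 230, rect, device_w=1080, device_h=810))
# -> {'type': 'touch', 'x': 540, 'y': 405}
```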
  • Patent number: 9778546
    Abstract: A projector that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose; one example is to provide depth information regarding physical item(s) interacting with the projected visible image. The projector includes multiple projecting units (e.g., one for each pixel to be displayed), each including light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project both the visible image and the non-visible image. A depth sensing module detects the depth of surfaces within the scope of the non-visible image using a reflected portion of the non-visible image.
    Type: Grant
    Filed: August 15, 2013
    Date of Patent: October 3, 2017
    Assignee: MEP Tech, Inc.
    Inventors: Donald Roy Mealing, Mark L. Davis, Roger H. Hoole, Matthew L. Stoker, W. Lorenzo Swank, Michael J. Bradshaw
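    The abstract describes recovering depth from a reflected non-visible (e.g., infrared) image. The patent does not specify the math, but a common structured-light approach recovers depth from the disparity between where a pattern feature is emitted and where an offset camera observes it, z = f·b/d. The sketch below is a hypothetical illustration of that relationship only, with example numbers chosen for readability.
```python
# Hypothetical triangulation sketch: depth from the disparity of a reflected
# non-visible pattern feature, given camera focal length and the
# projector-to-camera baseline. Not the specific method claimed in the patent.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (metres) from pixel disparity, camera focal length (pixels),
    and projector-to-camera baseline (metres): z = f * b / d."""
    if disparity_px <= 0:
        return float("inf")                 # feature at (or beyond) the reference plane
    return focal_px * baseline_m / disparity_px

# Example: 580-pixel focal length, 7.5 cm baseline, 29-pixel observed shift.
print(round(depth_from_disparity(29, 580, 0.075), 3))   # -> 1.5 (metres)
```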
  • Patent number: 9737798
    Abstract: A scanning game input mechanism that includes a light-emitting mechanism defining multiple input regions for a game in which there are multiple players. Each input region is a portion of the playing surface in which a corresponding player subset is to provide physical input (such as rolling dice, playing cards, or placing game pieces) to affect game state. A scanning mechanism scans objects placed within the input regions, while a communication mechanism communicates information regarding the scanned objects. The information might, for example, be communicated to affect an electronic game state maintained in another device or distributed across multiple devices.
    Type: Grant
    Filed: June 15, 2012
    Date of Patent: August 22, 2017
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, John M. Black, Roger H. Hoole, Jeffrey Taylor, Kirby Bisline
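    As a hypothetical illustration of the region-to-player routing this abstract describes, the Python sketch below assigns rectangular input regions of the playing surface to player subsets, resolves a scanned object to its region, and builds the message a communication mechanism might send. The region geometry, player assignment, and event format are all assumptions, not taken from the patent.
```python
# Hypothetical sketch: route a scanned object to the input region (and player
# subset) it was placed in, and build a game-state event to communicate.
import json

INPUT_REGIONS = [
    # (name, players, x_min, y_min, x_max, y_max) in playing-surface millimetres
    ("north", ["player1"], 0, 0, 600, 300),
    ("south", ["player2", "player3"], 0, 300, 600, 600),
]

def report_scanned_object(kind, x_mm, y_mm):
    """Build the message a communication mechanism would send for a scanned object."""
    for name, players, x0, y0, x1, y1 in INPUT_REGIONS:
        if x0 <= x_mm < x1 and y0 <= y_mm < y1:
            return json.dumps({"region": name, "players": players,
                               "object": kind, "pos_mm": [x_mm, y_mm]})
    return None                                # object outside every input region

print(report_scanned_object("die:5", 120, 450))
# -> {"region": "south", "players": ["player2", "player3"], "object": "die:5", "pos_mm": [120, 450]}
```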
  • Patent number: 9550124
    Abstract: An interactive environment image may be projected onto one or more surfaces, and interaction with the projected environment image may be detected within a three-dimensional space over the one or more surfaces. The interactive environment image may be a three-dimensional image, or it may be two-dimensional. An image is projected onto a surface to provide a visual representation of a virtual environment space including one or more virtual objects, which may be spatially positioned. User interaction with the projected visualized representation of the virtual environment space may be detected and, in response, the projected visualized representation may be changed.
    Type: Grant
    Filed: August 19, 2014
    Date of Patent: January 24, 2017
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
  • Patent number: 9317109
    Abstract: An accessory facilitating interaction with a projected image at least partially sourced by an image generation device. The accessory receives an input image from the device and projects at least a derived image of the input image onto a surface on which the accessory sits. For instance, the accessory might project the input image itself, or perhaps some post-processed version of the input image. A camera system captures data representing user interaction with the projected image. Based on this data, an image input event is detected and then communicated to the device. For instance, if the image generation device is a touch-sensitive device, it may respond to the user contacting the projected image the same as it would if the user touched the image generation device at the same corresponding location. Embodiments described herein also relate to color compensation of a displayed image.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: April 19, 2016
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Matthew Lund Stoker, William Lorenzo Swank, Donald Roy Mealing, Roger H. Hoole, Jeffrey F. Taylor
  • Patent number: 8808089
    Abstract: The projection of an interactive game environment image on one or more surfaces. The interactive game environment image may be a three-dimensional image, or it may be two-dimensional. Data is received that represents virtual objects that are spatially positioned in virtual space. An image that includes a visual representation of all or a portion of the virtual space, including one or more of the virtual objects, is then projected onto a substantially horizontal surface. The system may then detect user interaction with the projected visualized representation of the virtual space and, in response, change the projected visualized representation.
    Type: Grant
    Filed: July 12, 2012
    Date of Patent: August 19, 2014
    Assignee: MEP Tech, Inc.
    Inventors: Mark L. Davis, Timothy Alan Tabor, Roger H. Hoole, Jeffrey Taylor, John M. Black
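    The abstract above outlines a receive-project-interact-update loop. The sketch below is a hypothetical, simplified Python rendition of that loop, assuming a top-down orthographic mapping from virtual-space coordinates onto projector pixels for a horizontal surface; the object format and the PIXELS_PER_UNIT scale are illustrative assumptions, not details from the patent.
```python
# Hypothetical sketch: receive spatially positioned virtual objects, render a
# top-down view onto projector coordinates, and re-render after an interaction
# moves an object. Not the specific method claimed in the patent.

PIXELS_PER_UNIT = 20            # scale from virtual-space units to projector pixels

def project_top_down(objects):
    """Map each virtual object's (x, y, z) position to projector (u, v) pixels."""
    # A horizontal projection surface shows the x/z plane; height (y) is dropped.
    return {o["id"]: (int(o["pos"][0] * PIXELS_PER_UNIT),
                      int(o["pos"][2] * PIXELS_PER_UNIT)) for o in objects}

def apply_interaction(objects, touched_id, delta):
    """Change the virtual space in response to an interaction, then re-project."""
    for o in objects:
        if o["id"] == touched_id:
            o["pos"] = [p + d for p, d in zip(o["pos"], delta)]
    return project_top_down(objects)

scene = [{"id": "token", "pos": [3.0, 0.0, 4.0]}]
print(project_top_down(scene))                             # -> {'token': (60, 80)}
print(apply_interaction(scene, "token", [1.0, 0.0, 0.0]))  # -> {'token': (80, 80)}
```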