Patents by Inventor Hisham Bedri

Hisham Bedri is a named inventor on the patent filings listed below. The listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO). Short illustrative code sketches of the ideas behind the three underlying inventions appear after the listing.

  • Patent number: 12026350
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Grant
    Filed: March 25, 2023
    Date of Patent: July 2, 2024
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  • Publication number: 20230229282
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Application
    Filed: March 25, 2023
    Publication date: July 20, 2023
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  • Patent number: 11625140
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Grant
    Filed: June 1, 2020
    Date of Patent: April 11, 2023
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  • Publication number: 20200379627
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Application
    Filed: June 1, 2020
    Publication date: December 3, 2020
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  • Patent number: 10330610
    Abstract: An imaging system images near-field objects with focused microwave or terahertz radiation. Multiple antennas emit microwave or terahertz radiation, such that the radiation varies in frequency over time, illuminates a near-field object, reflects from the near-field object, and travels to a passive aperture. For example, the passive aperture may comprise a dielectric lens or a parabolic reflector. The passive aperture focuses, onto a spatial region, the microwave or terahertz radiation that reflected from the near-field object. One or more antennas take measurements, in the spatial region, of the microwave or terahertz radiation that reflected from the near-field object. A computer calculates, based on the measurements, an image of the near-field object and depth information regarding the near-field object.
    Type: Grant
    Filed: September 15, 2016
    Date of Patent: June 25, 2019
    Assignees: Massachusetts Institute of Technology, Board of Trustees of Michigan State University
    Inventors: Gregory Charvat, Andrew Temme, Micha Feigin-Almon, Ramesh Raskar, Hisham Bedri
  • Patent number: 10003725
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Grant
    Filed: January 3, 2018
    Date of Patent: June 19, 2018
    Assignee: Massachusetts Institute of Technology
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Publication number: 20180131851
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Application
    Filed: January 3, 2018
    Publication date: May 10, 2018
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Patent number: 9894254
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Grant
    Filed: May 10, 2016
    Date of Patent: February 13, 2018
    Assignee: Massachusetts Institute of Technology
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Publication number: 20170331990
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Application
    Filed: May 10, 2016
    Publication date: November 16, 2017
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Publication number: 20170212059
    Abstract: An imaging system images near-field objects with focused microwave or terahertz radiation. Multiple antennas emit microwave or terahertz radiation, such that the radiation varies in frequency over time, illuminates a near-field object, reflects from the near-field object, and travels to a passive aperture. For example, the passive aperture may comprise a dielectric lens or a parabolic reflector. The passive aperture focuses, onto a spatial region, the microwave or terahertz radiation that reflected from the near-field object. One or more antennas take measurements, in the spatial region, of the microwave or terahertz radiation that reflected from the near-field object. A computer calculates, based on the measurements, an image of the near-field object and depth information regarding the near-field object.
    Type: Application
    Filed: September 15, 2016
    Publication date: July 27, 2017
    Inventors: Gregory Charvat, Andrew Temme, Micha Feigin-Almon, Ramesh Raskar, Hisham Bedri
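
The configuration-system filings (patent 11625140, patent 12026350, and the related publications) describe stitching the output of several depth cameras into a single live 3D model of the capture space. The patents do not disclose source code; the following is a minimal sketch of that fusion step under stated assumptions: each camera supplies a depth image with known intrinsics and a camera-to-world transform, and the DepthFrame type, field names, and calibration values are hypothetical.

```python
# Minimal sketch (not from the patents): fusing several calibrated depth
# cameras into one live point cloud covering the capture volume.
from dataclasses import dataclass
import numpy as np

@dataclass
class DepthFrame:
    depth_m: np.ndarray      # (H, W) depth image in meters
    fx: float                # intrinsics: focal length x (pixels)
    fy: float                # intrinsics: focal length y (pixels)
    cx: float                # intrinsics: principal point x (pixels)
    cy: float                # intrinsics: principal point y (pixels)
    T_world_cam: np.ndarray  # (4, 4) extrinsics: camera frame -> shared world frame

def backproject(frame: DepthFrame) -> np.ndarray:
    """Turn one depth image into an (N, 3) point cloud in world coordinates."""
    h, w = frame.depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = frame.depth_m
    valid = z > 0
    x = (u - frame.cx) / frame.fx * z
    y = (v - frame.cy) / frame.fy * z
    pts_cam = np.stack([x[valid], y[valid], z[valid], np.ones(valid.sum())], axis=1)
    pts_world = (frame.T_world_cam @ pts_cam.T).T
    return pts_world[:, :3]

def fuse(frames: list[DepthFrame]) -> np.ndarray:
    """Merge every camera's cloud into one model of the capture space."""
    return np.concatenate([backproject(f) for f in frames], axis=0)

# Usage: two synthetic cameras observing the same volume from different poses.
rng = np.random.default_rng(0)
cam_a = DepthFrame(rng.uniform(0.5, 3.0, (480, 640)), 525.0, 525.0, 320.0, 240.0, np.eye(4))
T_b = np.eye(4)
T_b[0, 3] = 1.5  # second camera offset 1.5 m along x
cam_b = DepthFrame(rng.uniform(0.5, 3.0, (480, 640)), 525.0, 525.0, 320.0, 240.0, T_b)
print(fuse([cam_a, cam_b]).shape)  # (N, 3) world-frame points, refreshed per frame
```

A real system would refresh this fusion every frame and hand the merged cloud to the remote operator's viewer; the augmented-reality phone tracking and the 3D programming interface described in the abstracts are outside the scope of this sketch.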
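Patent 10330610 and publication 20170212059 describe illuminating a near-field object with frequency-swept microwave or terahertz radiation and computing an image and depth from the measured reflections. The sketch below is not the patented method; it only illustrates, assuming a stepped-frequency sweep and a single point reflector, how such a sweep encodes depth and how an inverse FFT recovers it. The passive-aperture focusing and image-formation steps are omitted.

```python
# Minimal sketch (assumed stepped-frequency model, not the patented method):
# how a frequency sweep encodes target depth and how an IFFT recovers it.
import numpy as np

c = 3.0e8                                  # speed of light, m/s
f = np.linspace(26e9, 30e9, 201)           # sweep: 26-30 GHz in 20 MHz steps
df = f[1] - f[0]
true_range = 0.42                          # meters, single point reflector

# Each frequency step measures a complex reflection whose phase advances
# with the round-trip delay 2*R/c.
measurement = np.exp(-1j * 2 * np.pi * f * 2 * true_range / c)

# An inverse FFT over the frequency axis turns the sweep into a range profile.
n_fft = 4096                               # zero-pad for a finer range grid
profile = np.abs(np.fft.ifft(measurement, n_fft))
range_axis = np.arange(n_fft) * c / (2 * n_fft * df)

estimate = range_axis[np.argmax(profile)]
print(f"estimated depth {estimate:.3f} m, true {true_range} m")
# Depth resolution is c / (2 * bandwidth), roughly 3.7 cm for this 4 GHz sweep.
```

In the patented system the transverse dimensions of the image come from the passive aperture's focusing and the receive antennas sampling the focal region; only the depth axis is sketched here.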
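The fiber-bundle filings (patents 10003725 and 9894254 and the related publications) calibrate an incoherent bundle by timing when pulsed light enters each front-end fiber, then use the recovered front-to-back map to de-shuffle the captured image. The sketch below mimics that idea on synthetic data; the sweep scheme, timing values, and variable names are assumptions rather than the patented procedure.

```python
# Minimal sketch (synthetic data, assumed calibration sweep, not the patented
# procedure): recover an incoherent bundle's front-to-back map from
# time-of-flight signatures, then de-shuffle a captured image with it.
import numpy as np

rng = np.random.default_rng(1)
n_fibers = 64 * 64

# Unknown wiring of the bundle: back-end pixel i carries front-end fiber perm[i].
perm = rng.permutation(n_fibers)

# Calibration: a pulse sweeps the front face so light enters front-end fiber k
# at a time proportional to k; the ToF sensor records that time at the back end.
arrival_time = perm * 1e-9 + rng.normal(0.0, 1e-11, n_fibers)  # seconds + jitter

# Ranking back-end pixels by arrival time recovers each one's front-end index.
recovered = np.argsort(np.argsort(arrival_time))

# De-shuffle a captured image by sending each back-end sample to its front position.
scene = rng.random(n_fibers)            # ground-truth radiance at the front end
captured = scene[perm]                  # what the camera sees: shuffled pixels
deshuffled = np.empty(n_fibers)
deshuffled[recovered] = captured
print(bool(np.allclose(deshuffled, scene)))  # True: image restored
```

Sorting suffices here because the assumed calibration makes arrival time strictly increasing with front-end position; any monotone encoding of position into time would serve the same purpose.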