Patents by Inventor Hisham Bedri
Hisham Bedri has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 12026350
  Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
  Type: Grant
  Filed: March 25, 2023
  Date of Patent: July 2, 2024
  Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  (A rough illustrative sketch of the multi-camera depth fusion described in this abstract appears after this listing.)
- Publication number: 20230229282
  Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
  Type: Application
  Filed: March 25, 2023
  Publication date: July 20, 2023
  Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
- Patent number: 11625140
  Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
  Type: Grant
  Filed: June 1, 2020
  Date of Patent: April 11, 2023
  Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
- Publication number: 20200379627
  Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, in order to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, a configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones which track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
  Type: Application
  Filed: June 1, 2020
  Publication date: December 3, 2020
  Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
- Patent number: 10330610
  Abstract: An imaging system images near-field objects with focused microwave or terahertz radiation. Multiple antennas emit microwave or terahertz radiation, such that the radiation varies in frequency over time, illuminates a near-field object, reflects from the near-field object, and travels to a passive aperture. For example, the passive aperture may comprise a dielectric lens or a parabolic reflector. The passive aperture focuses, onto a spatial region, the microwave or terahertz radiation that reflected from the near-field object. One or more antennas take measurements, in the spatial region, of the microwave or terahertz radiation that reflected from the near-field object. A computer calculates, based on the measurements, an image of the near-field object and depth information regarding the near-field object.
  Type: Grant
  Filed: September 15, 2016
  Date of Patent: June 25, 2019
  Assignees: Massachusetts Institute of Technology, Board of Trustees of Michigan State University
  Inventors: Gregory Charvat, Andrew Temme, Micha Feigin-Almon, Ramesh Raskar, Hisham Bedri
  (A rough illustrative sketch of frequency-swept ranging appears after this listing.)
- Patent number: 10003725
  Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
  Type: Grant
  Filed: January 3, 2018
  Date of Patent: June 19, 2018
  Assignee: Massachusetts Institute of Technology
  Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  (A toy sketch of the time-signature de-shuffling idea appears after this listing.)
- Publication number: 20180131851
  Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
  Type: Application
  Filed: January 3, 2018
  Publication date: May 10, 2018
  Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
- Patent number: 9894254
  Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
  Type: Grant
  Filed: May 10, 2016
  Date of Patent: February 13, 2018
  Assignee: Massachusetts Institute of Technology
  Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
- Publication number: 20170331990
  Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
  Type: Application
  Filed: May 10, 2016
  Publication date: November 16, 2017
  Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
- Publication number: 20170212059
  Abstract: An imaging system images near-field objects with focused microwave or terahertz radiation. Multiple antennas emit microwave or terahertz radiation, such that the radiation varies in frequency over time, illuminates a near-field object, reflects from the near-field object, and travels to a passive aperture. For example, the passive aperture may comprise a dielectric lens or a parabolic reflector. The passive aperture focuses, onto a spatial region, the microwave or terahertz radiation that reflected from the near-field object. One or more antennas take measurements, in the spatial region, of the microwave or terahertz radiation that reflected from the near-field object. A computer calculates, based on the measurements, an image of the near-field object and depth information regarding the near-field object.
  Type: Application
  Filed: September 15, 2016
  Publication date: July 27, 2017
  Inventors: Gregory Charvat, Andrew Temme, Micha Feigin-Almon, Ramesh Raskar, Hisham Bedri
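The volumetric-capture patents above (for example patent 12026350) describe processing the output of multiple depth cameras into a live 3D model of a space. The following is only a rough illustrative sketch of that fusion step, not the patented implementation: it assumes each camera's intrinsics and a camera-to-world extrinsic are already known from a prior calibration, and every function name here is invented for the example.

```python
# Rough sketch (not the patented system): back-project calibrated depth frames
# and merge them into one point cloud expressed in a shared world frame.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into camera-space 3D points (Nx3)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                 # drop pixels with no depth reading

def to_world(points, cam_to_world):
    """Apply a 4x4 camera-to-world transform to Nx3 points."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ cam_to_world.T)[:, :3]

def fuse_depth_cameras(frames):
    """frames: list of (depth_image, intrinsics_dict, 4x4 cam_to_world)."""
    clouds = [to_world(depth_to_points(d, **K), T) for d, K, T in frames]
    return np.vstack(clouds)                  # combined point cloud of the space

# Example with two synthetic cameras (identical poses, purely illustrative):
K = dict(fx=500.0, fy=500.0, cx=160.0, cy=120.0)
depth = np.full((240, 320), 2.0)              # everything 2 m away
cloud = fuse_depth_cameras([(depth, K, np.eye(4)), (depth, K, np.eye(4))])
print(cloud.shape)                            # (2 * 240 * 320, 3)
```

In a live system this fusion would run per frame, which is what lets a remote operator navigate the resulting model in real time.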
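Patent 10330610 and publication 20170212059 describe imaging near-field objects with microwave or terahertz radiation whose frequency varies over time. As a loose illustration of why frequency-swept measurements carry depth information (this is not the patented focusing system, which relies on a passive aperture and multiple antennas), the sketch below simulates a stepped-frequency measurement at a single antenna and recovers a range profile with an inverse FFT; all frequencies and target positions are made-up values.

```python
# Illustrative only: a point reflector at range R contributes a phase of
# exp(-1j * 4*pi * f * R / c) to the measurement at sweep frequency f, so an
# inverse FFT over the swept band turns the measurements into a range profile.
import numpy as np

c = 3e8                                        # speed of light, m/s
n = 256
df = 2e9 / n                                   # assumed 2 GHz sweep (made-up values)
freqs = 24e9 + np.arange(n) * df
targets = [(0.30, 1.0), (0.45, 0.5)]           # (range in meters, reflectivity)

# Simulated frequency-domain measurement at one receiving antenna.
s = sum(a * np.exp(-1j * 4 * np.pi * freqs * r / c) for r, a in targets)

profile = np.abs(np.fft.ifft(s))               # range profile
ranges = np.arange(n) * c / (2 * n * df)       # bin spacing = c / (2 * bandwidth)
print("strongest return near %.2f m" % ranges[np.argmax(profile)])
```

The print statement reports the strongest return near 0.30 m, matching the brighter simulated reflector; wider sweep bandwidth gives finer range bins.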
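Patents 10003725 and 9894254 (and the related publications) describe calibrating an incoherent fiber bundle by timing calibration pulses so that each fiber's arrival time encodes its front-end position, then using that map to de-shuffle the captured image. The toy sketch below illustrates only the de-shuffling idea, in one dimension, under the strong simplifying assumption that arrival time increases monotonically with front-end position; it is not the patented calibration procedure.

```python
# Toy 1-D sketch: if a calibration pulse's arrival time grows with a fiber's
# front-end position, ranking back-end channels by arrival time recovers the
# front-end ordering, which then unscrambles anything seen through the bundle.
import numpy as np

rng = np.random.default_rng(0)
n_fibers = 16
perm = rng.permutation(n_fibers)           # unknown incoherent fiber routing

# Calibration: front-end position i yields arrival time ~ i (plus noise),
# observed by the time-of-flight sensor in back-end channel order.
front_positions = np.arange(n_fibers)
arrival_times = front_positions[perm] + rng.normal(0, 0.05, n_fibers)

# Recovered map: back-end channel -> front-end position (rank of its time).
recovered_front = np.argsort(np.argsort(arrival_times))

# De-shuffle: place each back-end sample at its recovered front-end position.
scene = np.linspace(0.0, 1.0, n_fibers)    # what the front end actually sees
shuffled = scene[perm]                     # what the camera sees at the back end
deshuffled = np.empty(n_fibers)
deshuffled[recovered_front] = shuffled
assert np.allclose(deshuffled, scene)      # original ordering recovered
```

A real bundle would need a 2-D position map and time signatures measured per fiber, but the permutation-recovery step works the same way.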