Patents Assigned to OWL AUTONOMOUS IMAGING, INC.
-
Patent number: 12092743
Abstract: A first light beam transmits from a first location to a region of interest at a time t1a and reflects off at least one object disposed in the region of interest, producing a first reflected light beam. A time of flight (ToF) counter is incremented until the reflected first light beam is received back at the first location, whereupon the ToF counter stops at time t1b. A second light beam transmits from the first location to the region of interest at time t2a, subsequent to time t1b, and reflects off the at least one object to produce a second reflected light beam. The ToF counter is decremented, starting from the first count value, until the reflected second light beam is received back at the first location, whereupon the ToF counter stops at time t2b. A real-time velocity of the object is computed based at least in part on t1a, t1b, t2a, and t2b.
Type: Grant
Filed: October 2, 2020
Date of Patent: September 17, 2024
Assignee: OWL AUTONOMOUS IMAGING, INC.
Inventors: Eugene M. Petilli, Francis J. Cusack, Jr.
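For orientation, here is a minimal sketch of the velocity computation the abstract describes, assuming the usual two-way time-of-flight relation (range = c·Δt/2) and treating the counter start/stop times directly as timestamps. The function name, sign convention, and numeric example are illustrative and are not taken from the patent.

```python
# Illustrative sketch of the up/down ToF velocity computation described in the
# abstract above. Assumes round-trip time converts to range as c * dt / 2 and
# that radial velocity is the range change divided by the time between the two
# transmissions. Names and values are hypothetical.

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(t1a, t1b, t2a, t2b):
    """Estimate radial velocity from two ToF measurements (all times in seconds).

    t1a/t1b: transmit/receive times of the first light beam
    t2a/t2b: transmit/receive times of the second light beam
    Negative result = object approaching; positive = receding.
    """
    range1 = C * (t1b - t1a) / 2.0   # range at the first measurement
    range2 = C * (t2b - t2a) / 2.0   # range at the second measurement
    return (range2 - range1) / (t2a - t1a)

# Example: an object roughly 150 m away, beams 1 ms apart, second echo slightly
# earlier -> approaching at about 30 m/s (printed as ~-30).
print(radial_velocity(0.0, 1.000e-6, 1.0e-3, 1.0e-3 + 0.9998e-6))
```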
-
Publication number: 20240280703
Abstract: A first light beam transmits from a first location to a region of interest at a time t1a and reflects off at least one object disposed in the region of interest, producing a first reflected light beam. A time of flight (ToF) counter is incremented until the reflected first light beam is received back at the first location, whereupon the ToF counter stops at time t1b. A second light beam transmits from the first location to the region of interest at time t2a, subsequent to time t1b, and reflects off the at least one object to produce a second reflected light beam. The ToF counter is decremented, starting from the first count value, until the reflected second light beam is received back at the first location, whereupon the ToF counter stops at time t2b. A real-time velocity of the object is computed based at least in part on t1a, t1b, t2a, and t2b.
Type: Application
Filed: October 2, 2020
Publication date: August 22, 2024
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventors: Eugene M. Petilli, Francis J. Cusack, Jr.
-
Patent number: 12063340
Abstract: Embodiments of systems and methods for multi-aperture ranging are disclosed.
Type: Grant
Filed: October 28, 2022
Date of Patent: August 13, 2024
Assignee: Owl Autonomous Imaging, Inc.
Inventors: Srinath Obla, Eugene M. Petilli, Akash Chintha, Charles F. Gershman, Francis J. Cusack, Jr.
-
Publication number: 20240264003
Abstract: Methods and systems for thermal image sensing are disclosed. A method involves controlling a current (IACT) from a detector of a thermal image sensor with a first pulse width modulated (PWM) signal (PWMACT) for gain control, controlling a current (IREF) from a reference source of the thermal image sensor with a second PWM signal (PWMFEEDBACK) for gain control and for offset correction, wherein the second PWM signal (PWMFEEDBACK) is generated in response to a digital output (DQ) that is fed back from an analog-to-digital conversion circuit, and providing a current (ISUM), which is the sum of the current (IACT) and the current (IREF), to the analog-to-digital conversion circuit.
Type: Application
Filed: February 7, 2024
Publication date: August 8, 2024
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventors: Fatemeh Ataei, Eugene M. Petilli
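A rough behavioral sketch of the current-summing front end described above. The duty-cycle scaling, the one-sample feedback law, and all numeric constants are assumptions made purely for illustration; they are not the circuit defined in the publication.

```python
# Behavioral model (not the actual circuit): the detector and reference currents
# are scaled by their PWM duty cycles and summed into I_SUM for the ADC, and the
# feedback duty cycle is nudged by the fed-back digital output DQ.

def conversion_step(i_detector, i_reference, duty_act, duty_feedback):
    """Return the summed current delivered to the ADC for one sample.

    duty_act:      PWM_ACT duty cycle (0..1), scales the detector current (gain)
    duty_feedback: PWM_FEEDBACK duty cycle (0..1), scales the reference current
                   (gain and offset correction)
    """
    i_act = i_detector * duty_act          # detector branch, gated by PWM_ACT
    i_ref = i_reference * duty_feedback    # reference branch, gated by PWM_FEEDBACK
    return i_act + i_ref                   # I_SUM presented to the ADC

def update_feedback(dq, duty_feedback, loop_gain=0.01, target=0.5):
    """Adjust PWM_FEEDBACK from the normalized digital output DQ (0..1),
    nudging the ADC input toward mid-scale. Purely illustrative control law."""
    return min(1.0, max(0.0, duty_feedback - loop_gain * (dq - target)))
```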
-
Patent number: 11659260
Abstract: A method for image acquisition includes receiving, by an image acquisition computing device, a digitized LiDAR image frame and a thermal image frame of a region of interest from a read-out integrated circuit of an image acquisition device coupled to the image acquisition computing device. The LiDAR image frame and the thermal image frame are processed to detect one or more objects of interest located in the region of interest. The detected one or more objects of interest are correlated between the LiDAR image frame and the thermal image frame. The detected one or more objects of interest are identified based on the correlation between the LiDAR image frame and the thermal image frame. An integrated LiDAR and thermal image acquisition device is also disclosed.
Type: Grant
Filed: March 25, 2020
Date of Patent: May 23, 2023
Assignee: OWL AUTONOMOUS IMAGING, INC.
Inventors: Eugene M. Petilli, Christopher S. Urban, Francis J. Cusack, Jr.
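A schematic sketch of the detect-correlate-identify flow described above, assuming bounding-box detections and using intersection-over-union as the correlation measure. The detection step, the metric, and the threshold are placeholders for illustration rather than details from the patent.

```python
# Sketch of correlating object detections between a LiDAR frame and a thermal
# frame. Detection itself is omitted; detections are assumed to be (x0, y0, x1, y1)
# boxes in a shared image coordinate frame.

def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x1 - x0) * max(0, y1 - y0)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union else 0.0

def correlate(lidar_detections, thermal_detections, threshold=0.5):
    """Pair LiDAR and thermal detections whose boxes overlap strongly."""
    matches = []
    for lbox in lidar_detections:
        best = max(thermal_detections, key=lambda tbox: iou(lbox, tbox), default=None)
        if best is not None and iou(lbox, best) >= threshold:
            matches.append((lbox, best))   # object seen in both modalities
    return matches
```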
-
Publication number: 20230098450
Abstract: Photosensitive semiconducting devices, such as bipolar junction transistors (BJTs), can be built up over a substrate that may include a read-out integrated circuit (ROIC). Semiconducting layers can be deposited over the substrate and bottom electrodes that are on or at the substrate's top surface. The bottom electrodes may be the input pads of the ROIC. A top electrode is deposited over the semiconducting layers. The semiconducting layers can form BJTs between the bottom electrodes and the top electrode. The top electrode and the bottom electrodes are the BJTs' collectors and emitters. The semiconducting layers include a P-type quantum dot layer and an N-type metal oxide layer. The quantum dots act as light sensors for the ROIC because photons absorbed in a semiconducting layer can produce a BJT base current. The BJTs can be formed without requiring a vacuum or patterning of the top electrode.
Type: Application
Filed: September 28, 2022
Publication date: March 30, 2023
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventors: Jacob Eisensmith, Eugene M. Petilli
-
Publication number: 20230008557
Abstract: An imaging apparatus has one or more lenses that share a common optical axis and define an image plane. A splitting optic is disposed to split the light along the optical axis to provide, at the image plane, at least a first copy of an image at a first magnification and a second copy of the image at a second magnification different from the first magnification.
Type: Application
Filed: July 6, 2022
Publication date: January 12, 2023
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventors: Eugene M. Petilli, Georg K. Nadorff
-
Patent number: 11490067
Abstract: Embodiments of systems and methods for multi-aperture ranging are disclosed. An embodiment of a device includes a main lens configured to receive an image from its field of view; a multi-aperture optical component having optical elements optically coupled to the main lens and configured to create a multi-aperture image set that includes a plurality of subaperture images, wherein at least one point in the field of view is captured by at least two of the subaperture images; an array of sensing elements; a ROIC configured to receive the signals from the sensing elements, to convert the signals to digital data, and to output the digital data; and an image processing system, responsive to the digital data output from the ROIC, which is configured to generate disparity values that correspond to at least one point in common between the at least two subaperture images.
Type: Grant
Filed: August 9, 2021
Date of Patent: November 1, 2022
Assignee: OWL AUTONOMOUS IMAGING, INC.
Inventors: Srinath Obla, Eugene M. Petilli, Akash Chintha, Charles F. Gershman, Francis J. Cusack, Jr.
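As a hedged sketch of what a disparity value between two subaperture images can yield, the function below maps disparity to range with the standard stereo relation Z = f·B/d. The focal length, subaperture baseline, and pixel pitch are assumed parameters; the patent's actual disparity-to-range mapping may differ.

```python
# Illustrative disparity-to-range conversion using the classic stereo relation.
# All optical parameters are placeholders, not values from the patent.

def range_from_disparity(disparity_px, focal_length_m, baseline_m, pixel_pitch_m):
    """Return range (m) for a disparity measured in pixels between two subapertures."""
    disparity_m = disparity_px * pixel_pitch_m
    if disparity_m <= 0:
        return float("inf")               # zero disparity -> point at infinity
    return focal_length_m * baseline_m / disparity_m

# Example with placeholder optics: 25 mm focal length, 5 mm subaperture baseline,
# 10 µm pixels, 4 px of disparity -> about 3.1 m.
print(range_from_disparity(4, 0.025, 0.005, 10e-6))
```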
-
Publication number: 20220046219
Abstract: Embodiments of systems and methods for multi-aperture ranging are disclosed. An embodiment of a device includes a main lens configured to receive an image from its field of view; a multi-aperture optical component having optical elements optically coupled to the main lens and configured to create a multi-aperture image set that includes a plurality of subaperture images, wherein at least one point in the field of view is captured by at least two of the subaperture images; an array of sensing elements; a ROIC configured to receive the signals from the sensing elements, to convert the signals to digital data, and to output the digital data; and an image processing system, responsive to the digital data output from the ROIC, which is configured to generate disparity values that correspond to at least one point in common between the at least two subaperture images.
Type: Application
Filed: August 9, 2021
Publication date: February 10, 2022
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventors: Srinath Obla, Eugene M. Petilli, Akash Chintha, Charles F. Gershman, Francis J. Cusack, Jr.
-
Publication number: 20200389606
Abstract: A method for image acquisition includes receiving, by an image acquisition computing device, a digitized LiDAR image frame and a thermal image frame of a region of interest from a read-out integrated circuit of an image acquisition device coupled to the image acquisition computing device. The LiDAR image frame and the thermal image frame are processed to detect one or more objects of interest located in the region of interest. The detected one or more objects of interest are correlated between the LiDAR image frame and the thermal image frame. The detected one or more objects of interest are identified based on the correlation between the LiDAR image frame and the thermal image frame. An integrated LiDAR and thermal image acquisition device is also disclosed.
Type: Application
Filed: March 25, 2020
Publication date: December 10, 2020
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventors: Eugene M. Petilli, Christopher S. Urban, Francis J. Cusack, Jr.
-
Patent number: 10600187
Abstract: A trajectory detection device includes a lens configured to receive an image of a field of view. An array of microlenses is configured to create an array of light field images based on the image. A detector array includes a plurality of photon-sensitive photodetectors. The detector array is configured to generate output signals from each photodetector based on the array of light field images. A controller is configured to integrate the output signals over an integration period. At least a portion of the output signals are modulated at a modulating frequency having a modulating frequency cycle time that is smaller than the integration period. A three-dimensional image of motion in the field of view is generated based on the integration of the modulated output signals.
Type: Grant
Filed: March 20, 2019
Date of Patent: March 24, 2020
Assignee: OWL AUTONOMOUS IMAGING, INC.
Inventor: Eugene M. Petilli
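A toy model of the modulated-integration idea in the abstract, assuming a square-wave gate whose cycle time is much shorter than the integration period. The waveform, sampling scheme, and parameter names are assumptions for this sketch only, not the patent's implementation.

```python
# Illustrative model: a photodetector's sampled output is gated by a modulation
# waveform and then accumulated over the full integration period. The gating
# period (1 / mod_freq_hz) is assumed to be much shorter than the integration
# window, as the abstract describes.

import math

def integrate_modulated(samples, sample_dt, mod_freq_hz):
    """Integrate detector samples after applying a square-wave modulation.

    samples:     photodetector output samples over one integration period
    sample_dt:   time step between samples (s)
    mod_freq_hz: modulation frequency; 1 / mod_freq_hz << integration period
    """
    total = 0.0
    for n, s in enumerate(samples):
        t = n * sample_dt
        gate = 1.0 if math.sin(2 * math.pi * mod_freq_hz * t) >= 0 else 0.0
        total += s * gate * sample_dt    # accumulate only during the "on" half-cycles
    return total
```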
-
Publication number: 20190295264
Abstract: A trajectory detection device includes a lens configured to receive an image of a field of view. An array of microlenses is configured to create an array of light field images based on the image. A detector array includes a plurality of photon-sensitive photodetectors. The detector array is configured to generate output signals from each photodetector based on the array of light field images. A controller is configured to integrate the output signals over an integration period. At least a portion of the output signals are modulated at a modulating frequency having a modulating frequency cycle time that is smaller than the integration period. A three-dimensional image of motion in the field of view is generated based on the integration of the modulated output signals.
Type: Application
Filed: March 20, 2019
Publication date: September 26, 2019
Applicant: OWL AUTONOMOUS IMAGING, INC.
Inventor: Eugene M. Petilli