Patents by Inventor Frederic Garcia
Frederic Garcia has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11215700
Abstract: A method and system for real-time motion artifact handling and noise removal for time-of-flight (ToF) sensor images. The method includes: calculating values of a cross correlation function c(τ) at a plurality of temporally spaced positions or phases from sent (s(t)) and received (r(t)) signals, thereby deriving a plurality of respective cross correlation values [c(τ0), c(τ1), c(τ2), c(τ3)]; deriving, from the plurality of cross correlation values [c(τ0), c(τ1), c(τ2), c(τ3)], a depth map D having values representing, for each pixel, distance to a portion of an object upon which the sent signals (s(t)) are incident; deriving, from the plurality of cross correlation values [c(τ0), c(τ1), c(τ2), c(τ3)], a guidance image (I; I′); and generating an output image D′ based on the depth map D and the guidance image (I; I′), the output image D′ comprising an edge-preserving and smoothed version of depth map D, the edge-preserving being from guidance image (I; I′).
Type: Grant
Filed: March 29, 2016
Date of Patent: January 4, 2022
Assignee: IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A.
Inventors: Cedric Schockaert, Frederic Garcia Becerro, Bruno Mirbach
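The abstract builds on the standard four-phase ToF reconstruction: four samples of the cross correlation c(τ), taken a quarter modulation period apart, give a per-pixel phase and hence a distance, while the amplitude is a common choice of guidance image. A minimal sketch of that textbook computation (function names and the 20 MHz modulation frequency are illustrative assumptions, not taken from the patent):

```python
# Standard four-phase ToF depth reconstruction (illustrative sketch).
# The modulation frequency F_MOD is an assumed example value.
import numpy as np

C = 299_792_458.0      # speed of light, m/s
F_MOD = 20e6           # assumed modulation frequency, Hz

def depth_from_correlations(c0, c1, c2, c3):
    """Per-pixel depth map D from four phase-stepped correlation samples."""
    phase = np.arctan2(c3 - c1, c0 - c2)      # phase offset per pixel
    phase = np.mod(phase, 2 * np.pi)          # wrap into [0, 2*pi)
    return C * phase / (4 * np.pi * F_MOD)    # D = c * phi / (4*pi*f)

def guidance_image(c0, c1, c2, c3):
    """Amplitude image, a common choice of guidance I for edge-aware filtering."""
    return 0.5 * np.hypot(c3 - c1, c0 - c2)
```

The output image D′ would then be obtained by running an edge-preserving filter (e.g. a guided filter) over D with this guidance image.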
-
Publication number: 20210165999
Abstract: A method for head pose estimation using a monocular camera. The method includes: providing an initial image frame recorded by the camera showing a head; and performing at least one pose updating loop with the following steps: identifying and selecting of a plurality of salient points of the head having 2D coordinates in the initial image frame within a region of interest; determining 3D coordinates for the selected salient points using a geometric head model of the head, corresponding to a head pose; providing an updated image frame recorded by the camera showing the head; identifying within the updated image frame at least some previously selected salient points having updated 2D coordinates; updating the head pose by determining updated 3D coordinates corresponding to the updated 2D coordinates using a perspective-n-point method; and using the updated image frame as the initial image frame for the next pose updating loop.
Type: Application
Filed: July 25, 2018
Publication date: June 3, 2021
Inventors: Bruno MIRBACH, Frederic GARCIA BECERRO, Jilliam Maria DIAZ BARROS
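The core of each pose-updating loop is the perspective-n-point (PnP) step: given 3D model points and their tracked 2D image positions, recover the head rotation and translation. A minimal linear (DLT) PnP solver in normalized camera coordinates, shown purely for illustration; a production pipeline would use a robust solver such as OpenCV's solvePnP:

```python
# Minimal linear PnP via direct linear transform (DLT), assuming
# normalized (intrinsics-free) image coordinates and noise-free points.
import numpy as np

def pnp_dlt(pts3d, pts2d):
    """pts3d: (n,3) model points, pts2d: (n,2) normalized image points, n >= 6."""
    n = len(pts3d)
    X = np.hstack([pts3d, np.ones((n, 1))])        # homogeneous 3D points
    A = np.zeros((2 * n, 12))
    for i, (u, v) in enumerate(pts2d):
        A[2 * i, 0:4] = -X[i]
        A[2 * i, 8:12] = u * X[i]                  # u*(p3.X) - p1.X = 0
        A[2 * i + 1, 4:8] = -X[i]
        A[2 * i + 1, 8:12] = v * X[i]              # v*(p3.X) - p2.X = 0
    P = np.linalg.svd(A)[2][-1].reshape(3, 4)      # null vector of A
    s = np.cbrt(np.linalg.det(P[:, :3]))           # fix scale and sign
    P /= s
    U, _, Vt = np.linalg.svd(P[:, :3])
    return U @ Vt, P[:, 3]                         # nearest rotation, translation
```

The loop in the abstract then re-runs this solve each frame on whichever previously selected salient points are re-identified in the updated image.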
-
Patent number: 10883512
Abstract: The invention relates to a fan, in particular for an aircraft cooling unit, comprising a wheel (128) and a shaft line (136) for driving the wheel about an axis (A), said wheel including a hub (138) supporting an annular row of blades (140), and means (142) for connection to said shaft line that are housed inside said hub. The fan is characterized in that the connection means include at least one meltable safety element (158) designed to be broken and to rotationally disengage at least a portion of the connection means from the shaft line when a torque for driving the wheel that is transmitted by the shaft line exceeds a specific threshold.
Type: Grant
Filed: September 9, 2016
Date of Patent: January 5, 2021
Assignee: SAFRAN ELECTRONICS & POWER
Inventors: Fabien Del Rio, Frederic Garcia, Francois Gauharou, Rene Salvador
-
Patent number: 10795006
Abstract: A camera system comprises a 3D TOF camera for acquiring a camera-perspective range image of a scene and an image processor for processing the range image. The image processor contains a position and orientation calibration routine implemented therein in hardware and/or software, which position and orientation calibration routine, when executed by the image processor, detects one or more planes within a range image acquired by the 3D TOF camera, selects a reference plane among the at least one or more planes detected and computes position and orientation parameters of the 3D TOF camera with respect to the reference plane, such as, e.g., elevation above the reference plane and/or camera roll angle and/or camera pitch angle.
Type: Grant
Filed: April 27, 2009
Date of Patent: October 6, 2020
Assignee: IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A.
Inventors: Frederic Garcia, Frederic Grandidier, Bruno Mirbach, Roberto Orsello, Thomas Solignac
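The calibration idea can be sketched in two steps: fit the reference plane to the 3D points of the range image, then read elevation and tilt angles off the plane parameters. The least-squares plane fit below is a generic sketch, and the axis conventions (camera z forward, y pointing down) are assumptions, not taken from the patent:

```python
# Plane fit + camera extrinsics sketch. Conventions: camera at the origin,
# z axis forward, y axis down, so the reference-plane normal points toward -y.
import numpy as np

def plane_fit(points):
    """Least-squares plane through (n,3) points: unit normal n and offset d
    such that n . x = d for points x on the plane."""
    centroid = points.mean(axis=0)
    normal = np.linalg.svd(points - centroid)[2][-1]  # smallest singular vector
    if normal[1] > 0:                                 # orient "up" (y-down convention)
        normal = -normal
    return normal, normal @ centroid

def camera_extrinsics(normal, d):
    """Elevation above the plane and pitch/roll of the camera (at the origin)."""
    elevation = abs(d)                    # origin-to-plane distance (unit normal)
    pitch = np.arcsin(normal[2])          # tilt of the optical axis w.r.t. plane
    roll = np.arctan2(normal[0], -normal[1])
    return elevation, pitch, roll
```

A real implementation would first segment candidate planes (e.g. with RANSAC) and select the reference plane among them before this fit.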
-
Patent number: 10672112
Abstract: A method and system for real-time noise removal and image enhancement of high-dynamic range (HDR) images. The method includes receiving an HDR input image I and operating processing circuitry for (i) applying a first edge-preserving filter (e.g. guided filter) to the input image I, thereby generating a first image component B1 and a first set of linear coefficients αi,1; (ii) applying a second edge-preserving filter (e.g. guided filter) to the input image I, thereby generating a second image component B2 and a second set of linear coefficients αi,2; (iii) generating a plausibility mask P from a combination of the first set of linear coefficients αi,1 and the second set of linear coefficients αi,2, the plausibility mask P indicating spatial detail within the input image I; and (iv) generating an output image O based on first image component B1, the second image component B2 and the plausibility mask P.
Type: Grant
Filed: March 2, 2016
Date of Patent: June 2, 2020
Assignee: IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A.
Inventors: Frederic Garcia Becerro, Cedric Schockaert, Bruno Mirbach
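A hedged sketch of the scheme: two self-guided filters at different smoothing strengths yield components B1 and B2 plus per-pixel linear coefficients, and a mask built from the coefficients blends them. The specific blend rule O = P·B1 + (1 − P)·B2 and the mask formula are illustrative assumptions, not the patented combination:

```python
# Two-scale self-guided filtering with a coefficient-based blending mask.
# The guided filter follows the standard He et al. formulation with the
# image as its own guide; the mask/blend rules are illustrative guesses.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box(img, r):
    """Mean over a (2r+1)x(2r+1) window, edge-padded."""
    padded = np.pad(img, r, mode="edge")
    return sliding_window_view(padded, (2 * r + 1, 2 * r + 1)).mean(axis=(-1, -2))

def self_guided_filter(I, r, eps):
    """Returns the filtered component B and the per-pixel coefficients a."""
    mean_I = box(I, r)
    var_I = box(I * I, r) - mean_I ** 2
    a = var_I / (var_I + eps)            # ~1 on edges/detail, ~0 in flat areas
    b = (1.0 - a) * mean_I
    return box(a, r) * I + box(b, r), a

def enhance_hdr(I):
    B1, a1 = self_guided_filter(I, r=2, eps=1e-4)   # weak smoothing
    B2, a2 = self_guided_filter(I, r=2, eps=1e-1)   # strong smoothing
    P = np.clip(a1 * a2, 0.0, 1.0)                  # "plausible detail" mask
    return P * B1 + (1.0 - P) * B2
```

Flat regions (small coefficients) thus receive the strongly smoothed component, while detailed regions keep the lightly filtered one.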
-
Patent number: 10275857
Abstract: A method for enhancing a depth image of a scene, comprises calculating an enhanced depth image by blending a first filtered depth image with a second filtered depth image or with the original depth image. The blending is achieved by application of a blending map, which defines, for each pixel, a contribution to the enhanced depth image of the corresponding pixel of the first filtered depth image and of the corresponding pixel of either the second filtered depth image or the original depth image. For pixels in the depth image containing no depth value or an invalid depth value, the blending map defines a zero contribution of the corresponding pixel of the second filtered depth image and a 100% contribution of the corresponding pixel of the first filtered image.
Type: Grant
Filed: September 9, 2013
Date of Patent: April 30, 2019
Assignee: IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A.
Inventors: Bruno Mirbach, Thomas Solignac, Frederic Garcia Becerro, Djamila Aouada
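The blending rule in the abstract translates almost directly into code: a per-pixel map M mixes the two filtered images, and pixels whose original depth is missing or invalid (encoded here as NaN, an assumption) are forced to take 100% of the first filtered image. How M is chosen elsewhere in the image is left as an input:

```python
# Per-pixel blending of two filtered depth images, as described in the
# abstract. Invalid depth values are assumed to be encoded as NaN.
import numpy as np

def blend_depth(depth, filtered1, filtered2, M):
    """Enhanced depth = M*filtered1 + (1-M)*filtered2, with M forced to 1
    wherever the original depth value is missing/invalid."""
    M = np.where(np.isnan(depth), 1.0, M)   # invalid pixels: only filtered1 contributes
    return M * filtered1 + (1.0 - M) * filtered2
```

The same function covers the "blend with the original depth image" variant by passing the original image as `filtered2`.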
-
Publication number: 20180266431
Abstract: The invention relates to a fan, in particular for an aircraft cooling unit, comprising a wheel (128) and a shaft line (136) for driving the wheel about an axis (A), said wheel including a hub (138) supporting an annular row of blades (140), and means (142) for connection to said shaft line that are housed inside said hub. The fan is characterized in that the connection means include at least one meltable safety element (158) designed to be broken and to rotationally disengage at least a portion of the connection means from the shaft line when a torque for driving the wheel that is transmitted by the shaft line exceeds a specific threshold.
Type: Application
Filed: September 9, 2016
Publication date: September 20, 2018
Inventors: Fabien DEL RIO, Frederic GARCIA, Francois GAUHAROU, Rene SALVADOR
-
Publication number: 20180067197
Abstract: A method and system for real-time motion artifact handling and noise removal for time-of-flight (ToF) sensor images. The method includes: calculating values of a cross correlation function c(τ) at a plurality of temporally spaced positions or phases from sent (s(t)) and received (r(t)) signals, thereby deriving a plurality of respective cross correlation values [c(τ0), c(τ1), c(τ2), c(τ3)]; deriving, from the plurality of cross correlation values [c(τ0), c(τ1), c(τ2), c(τ3)], a depth map D having values representing, for each pixel, distance to a portion of an object upon which the sent signals (s(t)) are incident; deriving, from the plurality of cross correlation values [c(τ0), c(τ1), c(τ2), c(τ3)], a guidance image (I; I′); and generating an output image D′ based on the depth map D and the guidance image (I; I′), the output image D′ comprising an edge-preserving and smoothed version of depth map D, the edge-preserving being from guidance image (I; I′).
Type: Application
Filed: March 29, 2016
Publication date: March 8, 2018
Inventors: Cedric SCHOCKAERT, Frederic GARCIA BECERRO, Bruno MIRBACH
-
Publication number: 20180053289
Abstract: A method and system for real-time noise removal and image enhancement of high-dynamic range (HDR) images. The method includes receiving an HDR input image I and operating processing circuitry for (i) applying a first edge-preserving filter (e.g. guided filter) to the input image I, thereby generating a first image component B1 and a first set of linear coefficients αi,1; (ii) applying a second edge-preserving filter (e.g. guided filter) to the input image I, thereby generating a second image component B2 and a second set of linear coefficients αi,2; (iii) generating a plausibility mask P from a combination of the first set of linear coefficients αi,1 and the second set of linear coefficients αi,2, the plausibility mask P indicating spatial detail within the input image I; and (iv) generating an output image O based on first image component B1, the second image component B2 and the plausibility mask P.
Type: Application
Filed: March 2, 2016
Publication date: February 22, 2018
Inventors: Frederic GARCIA BECERRO, Cedric SCHOCKAERT, Bruno MIRBACH
-
Publication number: 20150235351
Abstract: A method for enhancing a depth image of a scene, comprises calculating an enhanced depth image by blending a first filtered depth image with a second filtered depth image or with the original depth image. The blending is achieved by application of a blending map, which defines, for each pixel, a contribution to the enhanced depth image of the corresponding pixel of the first filtered depth image and of the corresponding pixel of either the second filtered depth image or the original depth image. For pixels in the depth image containing no depth value or an invalid depth value, the blending map defines a zero contribution of the corresponding pixel of the second filtered depth image and a 100% contribution of the corresponding pixel of the first filtered image.
Type: Application
Filed: September 9, 2013
Publication date: August 20, 2015
Inventors: Bruno Mirbach, Thomas Solignac, Frederic Garcia Becerro, Djamila Aouada
-
Patent number: 9025862
Abstract: A method for matching the pixels (10-1, 10-2) of a first range image of a scene (18) as seen from a first point of sight (14) with pixels (12-1, 12-2) of a second range image of the scene as seen from a second point of sight (16) comprises the following steps: providing the first range image as a grid of source pixels (10), on which the scene is mapped in accordance with a first projection associated with the first point of sight, wherein each source pixel has a point in the scene projected thereon in accordance with the first projection and has associated therewith a range value determined for that point in the scene; providing a grid of target pixels (12) for the second range image and a second projection associated with the second point of sight; and for each one of the target pixels, a) determining which source pixel would have the same point (P1, P2) in the scene projected thereon in accordance with the first projection as the target pixel would have projected thereon in accordance with the second projection […]
Type: Grant
Filed: October 7, 2011
Date of Patent: May 5, 2015
Assignee: IEE International Electronics & Engineering S.A.
Inventors: Frederic Garcia Becerro, Bruno Mirbach
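The geometric machinery behind this kind of pixel matching can be sketched as a reprojection: unproject each source pixel with its range value through the first projection, transform the 3D point into the second camera's frame, and project it onto the target grid. The pinhole intrinsics and the z-depth (rather than radial range) convention below are assumptions for illustration:

```python
# Reprojection of a source depth image into a second camera's view,
# assuming pinhole projections with intrinsic matrices K_src and K_dst
# and a rigid transform (R, t) between the two points of sight.
import numpy as np

def reproject(depth, K_src, K_dst, R, t):
    """Returns per-source-pixel target coordinates (u, v) and the depth
    of each point as seen from the target camera."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3)
    rays = pix @ np.linalg.inv(K_src).T        # unproject pixels to unit-z rays
    pts = rays * depth.reshape(-1, 1)          # 3D points in the source frame
    pts = pts @ R.T + t                        # transform into the target frame
    proj = pts @ K_dst.T                       # project onto the target grid
    return proj[:, :2] / proj[:, 2:3], pts[:, 2].reshape(h, w)
```

The matching step a) of the abstract then amounts to inverting this mapping per target pixel, rather than scattering source pixels forward as done here.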
-
Publication number: 20130272600
Abstract: A method for matching the pixels (10-1, 10-2) of a first range image of a scene (18) as seen from a first point of sight (14) with pixels (12-1, 12-2) of a second range image of the scene as seen from a second point of sight (16) comprises the following steps: providing the first range image as a grid of source pixels (10), on which the scene is mapped in accordance with a first projection associated with the first point of sight, wherein each source pixel has a point in the scene projected thereon in accordance with the first projection and has associated therewith a range value determined for that point in the scene; providing a grid of target pixels (12) for the second range image and a second projection associated with the second point of sight; and for each one of the target pixels, a) determining which source pixel would have the same point (P1, P2) in the scene projected thereon in accordance with the first projection as the target pixel would have projected thereon in accordance with the second projection […]
Type: Application
Filed: October 7, 2011
Publication date: October 17, 2013
Applicant: IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A.
Inventors: Frederic Garcia Becerro, Bruno Mirbach
-
Publication number: 20110205340
Abstract: A camera system comprises a 3D TOF camera for acquiring a camera-perspective range image of a scene and an image processor for processing the range image. The image processor contains a position and orientation calibration routine implemented therein in hardware and/or software, which position and orientation calibration routine, when executed by the image processor, detects one or more planes within a range image acquired by the 3D TOF camera, selects a reference plane among the at least one or more planes detected and computes position and orientation parameters of the 3D TOF camera with respect to the reference plane, such as, e.g., elevation above the reference plane and/or camera roll angle and/or camera pitch angle.
Type: Application
Filed: April 27, 2009
Publication date: August 25, 2011
Applicant: IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A.
Inventors: Frederic Garcia, Frederic Grandidier, Bruno Mirbach, Roberto Orsello, Thomas Solignac