Abstract: The method consists in capturing, while the vehicle is running, images of the road markings that delimit the traffic lanes, and in fully estimating, through an iterative process, the orientation of the camera with respect to the vehicle from the position of two lanes located side by side in the image. The calibration essentially comprises: correcting the position of the lane edges in the image (10, 12); estimating the residual pitch and yaw (16); updating the rotation matrix (18); estimating the residual roll (20); and updating the rotation matrix again (24). These steps are iterated until the corrective angles estimated by each module become negligible (22).
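A minimal sketch, in Python, of the iterative structure this abstract describes. The residual-angle estimators (`estimate_pitch_yaw`, `estimate_roll`) and the lane-edge representation are hypothetical placeholders standing in for the patent's modules (16) and (20); only the iterate-and-update loop is illustrated, not the actual lane-marking geometry.

```python
import numpy as np

def rot_x(a):  # roll
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):  # pitch
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):  # yaw
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def calibrate(lane_edges, estimate_pitch_yaw, estimate_roll,
              tol=1e-5, max_iter=50):
    """Iteratively refine the camera-to-vehicle rotation matrix R."""
    R = np.eye(3)
    for _ in range(max_iter):
        # (10, 12) correct the lane-edge positions with the current R
        corrected = [R @ e for e in lane_edges]
        # (16) estimate residual pitch and yaw from the corrected edges
        d_pitch, d_yaw = estimate_pitch_yaw(corrected)
        # (18) update the rotation matrix
        R = rot_y(d_pitch) @ rot_z(d_yaw) @ R
        # (20) estimate the residual roll, (24) update the matrix again
        d_roll = estimate_roll([R @ e for e in lane_edges])
        R = rot_x(d_roll) @ R
        # (22) stop when the corrective angles become negligible
        if max(abs(d_pitch), abs(d_yaw), abs(d_roll)) < tol:
            break
    return R
```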
Abstract: A method for the interactive adjustment of a parameter of a continuously variable optical lens, in which a subject views an eye chart through a variable lens frame comprising at least one continuously variable optical lens, a modulation is applied to a selected parameter of said continuously variable optical lens around an average value, and said average value is tuned by minimizing the flickering that the modulation makes visible to the subject.
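A hedged sketch of this tuning loop: a parameter (optical power is assumed here for illustration) is modulated around an average value, and the subject's reported flicker is used to step the average value toward the point of minimum visible flicker. `apply_lens_power` and `subject_reports_flicker` are hypothetical callbacks, not part of the patent.

```python
import time

def tune_average(apply_lens_power, subject_reports_flicker,
                 start=0.0, step=0.25, amplitude=0.25, period_s=0.5,
                 max_steps=40):
    avg = start
    best, best_flicker = avg, float("inf")
    for _ in range(max_steps):
        # Apply the modulation around the current average value.
        for sign in (+1, -1, +1, -1):
            apply_lens_power(avg + sign * amplitude)
            time.sleep(period_s / 2)
        flicker = subject_reports_flicker()     # e.g. a 0..10 rating from the subject
        if flicker < best_flicker:
            best, best_flicker = avg, flicker   # improving: keep searching this way
        else:
            step = -step * 0.5                  # overshoot: reverse direction, shrink step
        avg += step
        if abs(step) < 0.01:
            break
    return best                                 # average value with minimum flicker
```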
Abstract: The drone (10) comprises an onboard video camera (14) picking up a sequence of images to be transmitted to a remote control. The user selects a shooting mode, such as a forward or sideways traveling shot, a panning or crane shot, or tracking, which defines a trajectory to be transmitted to the drone. Corresponding setpoint values are generated and applied to a processing subsystem controlling the motors of the drone. Once the drone is stabilized on the prescribed trajectory (38), shooting by the video camera (14) is activated, and the trajectory is stabilized by an open-loop control that avoids the oscillations inherent in a servo control with a feedback loop.
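An illustrative sketch of open-loop setpoint playback for a "forward traveling" shot. The mode name, rates, pitch law and the `send_setpoint` transport are assumptions for the example; this is not the patented controller, only the idea of pre-computing the whole setpoint sequence and playing it back without feedback corrections during the shot.

```python
import math
import time

def forward_traveling_setpoints(duration_s, speed_m_s, rate_hz=50):
    """Yield a pre-computed (pitch, roll, yaw_rate, vz) setpoint sequence."""
    # A small constant nose-down pitch produces the forward motion (assumed law).
    pitch = -math.radians(5.0) * min(speed_m_s / 2.0, 1.0)
    for _ in range(int(duration_s * rate_hz)):
        yield (pitch, 0.0, 0.0, 0.0)    # constant attitude, no feedback corrections

def fly_open_loop(send_setpoint, setpoints, rate_hz=50):
    # The whole trajectory is played back as-is, avoiding the oscillations that
    # a closed-loop servo could introduce while the camera is shooting.
    period = 1.0 / rate_hz
    for sp in setpoints:
        send_setpoint(*sp)
        time.sleep(period)
```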
Abstract: The drone (10) comprises a camera with a hemispherical-field lens of the fisheye type pointing in a fixed direction (?) with respect to the drone body. A capture area (36) of reduced size is extracted from the image formed by this lens (42), the position of this area being a function of the signals delivered by an inertial unit measuring the Euler angles that characterize the attitude of the drone with respect to an absolute terrestrial reference system. The position of this area is dynamically modified in a direction (44) opposite to that of the changes of attitude (38) of the drone detected by the inertial unit. The raw pixel data are then processed to compensate for the geometric distortions introduced by the fisheye lens in the image acquired in the region of the capture area.
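A minimal sketch of shifting the capture window opposite to the drone's attitude changes, assuming an equidistant fisheye model (radius = focal length x angle). The focal length, image size, window size and sign conventions are illustrative assumptions, not values from the patent.

```python
import numpy as np

F_PX = 320.0                          # fisheye focal length in pixels (assumed)
IMG_CENTER = np.array([512.0, 512.0]) # centre of the full fisheye image (assumed)
WINDOW = (640, 360)                   # size of the reduced capture area (assumed)

def capture_window_origin(pitch_rad, roll_rad):
    """Top-left corner of the capture area (36) for the current attitude."""
    # Equidistant fisheye: an angular offset theta maps to a radius F_PX * theta.
    # The window is moved in the direction OPPOSITE to the attitude change, so the
    # extracted view keeps pointing at the same scene (signs depend on the chosen
    # camera/body axis conventions).
    shift = np.array([-F_PX * roll_rad, +F_PX * pitch_rad])
    center = IMG_CENTER + shift
    return center - np.array(WINDOW) / 2.0

def extract_window(fisheye_img, pitch_rad, roll_rad):
    x0, y0 = capture_window_origin(pitch_rad, roll_rad).astype(int)
    crop = fisheye_img[y0:y0 + WINDOW[1], x0:x0 + WINDOW[0]]
    # A real implementation would then remap 'crop' to undo the fisheye distortion
    # (e.g. with a precomputed rectification map); that step is omitted here.
    return crop
```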
Abstract: This headset has two earphones adapted to be introduced inside or near the auditory canal of the user. The earphones are connected by a connection element that has a central band resting on the top of the user's head and two lateral branches extending the band. Each of the branches has, at a proximal end, an articulated connection with the central band; at a distal end, a deformable connection with the respective earphone; and, in a median region, a case that houses electric or electronic components. An internal side of the case, directed towards the user's head, has a bearing face adapted to come into contact with a temporal region of the user's head.
Abstract: The attitude and speed of the drone are controlled by angular commands applied to a control loop (120) controlling the motors of the drone along the pitch and roll axes. A dynamic model of the drone, including in particular a predictive Kalman filter, represents the horizontal speed components of the drone on the basis of the drone mass and drag coefficients, the Euler angles of the drone relative to an absolute terrestrial reference, and its rotation about a vertical axis. The acceleration of the drone along the three axes and its speed relative to the ground are measured and applied to the model so as to estimate (128) the horizontal speed components of the crosswind. This estimate can be used to generate corrective commands (126) that are combined with the angular commands applied to the pitch and roll control loop of the drone.
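A hedged sketch of this kind of wind estimation: a linear Kalman filter whose state is [vx, vy, wx, wy] (drone horizontal speed and wind). The simplified dynamics (tilt-driven thrust, linear drag on the air-relative speed, random-walk wind) and all numeric values are assumptions for the example, not the patent's full dynamic model.

```python
import numpy as np

G, DT = 9.81, 0.02                        # gravity, filter period (assumed)
C_OVER_M = 0.35                           # drag coefficient / mass (assumed)

F = np.array([[1 - DT * C_OVER_M, 0, DT * C_OVER_M, 0],
              [0, 1 - DT * C_OVER_M, 0, DT * C_OVER_M],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
H = np.array([[1.0, 0, 0, 0],             # the ground-speed sensor observes vx, vy only
              [0, 1.0, 0, 0]])
Q = np.diag([0.05, 0.05, 0.01, 0.01])     # process noise: the wind drifts slowly
R = np.diag([0.2, 0.2])                   # ground-speed measurement noise

def predict(x, P, pitch, roll):
    u = DT * G * np.array([-np.tan(pitch), np.tan(roll), 0.0, 0.0])
    x = F @ x + u                         # tilt accelerates the drone, drag brakes it
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, v_ground):
    y = np.asarray(v_ground) - H @ x      # innovation on the measured ground speed
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P                           # x[2:4] is the estimated wind (126/128)
```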
Abstract: The system comprises audio headsets able to apply audio processing operations in response to presets that are common to all the headsets, smart terminals provided with audio stream generation means, and communication means to transmit to the headsets an audio stream and the presets to be applied to this stream. A preset server memorises the presets associated with works. Certain terminals can generate presets and transmit them to the server, whereas other terminals collect these presets and apply them to the corresponding works when those works are reproduced.
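An illustrative sketch of this workflow with in-memory stand-ins (not the actual protocol): some terminals publish a preset for a work, other terminals fetch it when that work is played back and hand it to the headset. The `Preset` fields and the `headset` object are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Preset:
    equalizer: dict               # e.g. {"bass_db": +3, "treble_db": -1} (assumed fields)
    spatializer: str = "off"

@dataclass
class PresetServer:
    _by_work: dict = field(default_factory=dict)

    def publish(self, work_id: str, preset: Preset) -> None:
        self._by_work[work_id] = preset       # memorise the preset associated with a work

    def fetch(self, work_id: str) -> Preset | None:
        return self._by_work.get(work_id)

def play(work_id: str, audio_stream, server: PresetServer, headset) -> None:
    preset = server.fetch(work_id)
    if preset is not None:
        headset.apply_preset(preset)          # presets are common to all headsets
    headset.stream(audio_stream)
```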
Abstract: This headset comprises two earphones (10), each comprising a generally ring-shaped, flexible circumaural or supra-aural pad (16) mounted on a shell (14) receiving a transducer (30). The headset comprises electronic circuitry powered by a battery, and a battery recharging circuit connected to at least one inductive-coupling recharging coil (20). The coil is housed in a pad (16), in a region of the pad that is close to an outer limit plane, on the wearer side, of the envelope (16a, 16b) of the pad, for example near a seam (C) between two portions (16a, 16b) of the envelope.
Abstract: The headset includes active noise control, with an internal ANC microphone (28) placed inside the acoustic cavity (22) and delivering a signal including an acoustic noise component. A digital signal processor DSP (50) comprises a feedback ANC branch (54) applying a filtering transfer function (54, HFB2) to the signal delivered by the ANC microphone, and a mixer (60) for mixing the signal of the feedback branch with an audio signal to be reproduced (M). The headset comprises a movement sensor (64) mounted on one of the earphones. The DSP comprises an anti-saturation module (68) for concurrently analyzing i) the signal delivered by the internal microphone (28) and ii) the signal delivered by the movement sensor (64), and verifying whether or not the current characteristics of these signals fulfill a set of predetermined criteria. Upstream from the feedback ANC filter (54), an anti-saturation filter (70, HFB1) is selectively switched as a function of the result of this verification (a sketch of this switching logic is given after this entry).
Type:
Grant
Filed:
April 21, 2015
Date of Patent:
October 11, 2016
Assignee:
PARROT
Inventors:
Vu Hoang Co Thuy, Benoit Pochon, Phong Hua, Pierre Guiu
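A hedged sketch of the anti-saturation decision described in the ANC-headset abstract above: the internal ANC microphone signal and the movement-sensor signal are analysed jointly, and a milder pre-filter is switched in upstream of the feedback filter when both criteria are met. The thresholds, the RMS-based criteria and the filter coefficients are assumptions for the example, not the patent's values.

```python
import numpy as np
from scipy.signal import lfilter

MIC_RMS_LIMIT = 0.6       # near full scale: risk of saturating the feedback branch (assumed)
MOTION_RMS_LIMIT = 2.0    # m/s^2, strong wearer movement such as walking or running (assumed)

H_FB1_NOMINAL = ([1.0], [1.0])         # pass-through pre-filter (assumed coefficients)
H_FB1_ANTISAT = ([0.5, 0.5], [1.0])    # gentler low-pass pre-filter (assumed coefficients)

def rms(x):
    return float(np.sqrt(np.mean(np.square(x))))

def select_prefilter(mic_frame, accel_frame):
    """Return the HFB1 coefficients to use for the next audio frame."""
    saturating = rms(mic_frame) > MIC_RMS_LIMIT
    moving = rms(accel_frame) > MOTION_RMS_LIMIT
    return H_FB1_ANTISAT if (saturating and moving) else H_FB1_NOMINAL

def feedback_branch(mic_frame, accel_frame, h_fb2):
    # Filter state (zi) handling across frames is omitted in this sketch.
    b1, a1 = select_prefilter(mic_frame, accel_frame)   # anti-saturation filter (70, HFB1)
    pre = lfilter(b1, a1, mic_frame)
    b2, a2 = h_fb2
    return lfilter(b2, a2, pre)                         # feedback ANC filter (54, HFB2)
```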
Abstract: According to a first aspect, the invention relates to an intraocular variable-focus implant comprising a non-conducting liquid with a melting temperature above 0° C., a conducting liquid, a liquid interface formed by the non-conducting and conducting liquids, a first electrode in contact with the conducting liquid, and a second electrode insulated from the conducting liquid, wherein the liquid interface is movable by electrowetting according to a change in a voltage applied between the first and second electrodes.
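For context only: the electrowetting actuation mentioned in this abstract is conventionally described by the Young-Lippmann relation, a standard textbook result and not a formula taken from the patent. It links the contact angle of the conducting liquid on the insulated electrode to the applied voltage, which is what displaces the liquid interface and hence the focal power.

```latex
% Young--Lippmann relation (standard electrowetting result, not from the patent).
% \theta(V): contact angle under voltage V; \theta_0: contact angle at V = 0;
% \varepsilon, d: permittivity and thickness of the insulating layer;
% \gamma: interfacial tension between the conducting and non-conducting liquids.
\cos\theta(V) \;=\; \cos\theta_0 \;+\; \frac{\varepsilon\, V^2}{2\,\gamma\, d}
```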
Abstract: The drone comprises: a vertical-view camera (132) pointing downward to pick up images of the scene of the ground overflown by the drone; gyrometer, magnetometer and accelerometer sensors (176); and an altimeter (174). Navigation means determine position coordinates (X, Y, Z) of the drone in an absolute coordinate system linked to the ground. These means are autonomous, operating without reception of external signals. They include image analysis means adapted to derive a position signal from an analysis of known predetermined patterns (210) present in the scene picked up by the camera, and they implement a predictive-filter estimator (172) incorporating a representation of a dynamic model of the drone, taking as inputs the position signal, a horizontal speed signal, linear and rotational acceleration signals, and an altitude signal (a sketch of such a fusion filter is given after this entry).
Type:
Grant
Filed:
July 3, 2014
Date of Patent:
July 12, 2016
Assignee:
Parrot
Inventors:
Michael Rischmuller, Laure Chevalley, Francois Callou, Etienne Caldichoury
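A hedged sketch of the sensor-fusion idea in the navigation abstract above: a constant-acceleration predictive filter whose state is position and speed, predicted from the accelerometer and corrected by the camera-derived position fix (from the known ground patterns) and the altimeter. The noise values and the simple linear model are assumptions for the example, not the patent's dynamic drone model.

```python
import numpy as np

DT = 0.02                                        # estimator period (assumed)

F = np.eye(6)
F[0:3, 3:6] = DT * np.eye(3)                     # position integrates speed
B = np.vstack([0.5 * DT**2 * np.eye(3), DT * np.eye(3)])
H = np.zeros((4, 6))
H[0, 0] = H[1, 1] = H[2, 2] = H[3, 2] = 1.0      # rows 0-2: camera fix (X, Y, Z); row 3: altimeter (Z)
Q = 1e-3 * np.eye(6)                             # process noise (assumed)
R = np.diag([0.05, 0.05, 0.20, 0.02])            # camera fix and altimeter noise (assumed)

def step(x, P, accel_world, z_measure):
    """One predict/update cycle. accel_world is the gravity-compensated acceleration
    already rotated into the ground frame using the attitude sensors."""
    x = F @ x + B @ np.asarray(accel_world)      # prediction from the dynamic model
    P = F @ P @ F.T + Q
    y = np.asarray(z_measure) - H @ x            # camera-fix and altitude innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(6) - K @ H) @ P    # x[0:3] = (X, Y, Z) in the ground frame
```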