Patents by Inventor Gerard Lacey
Gerard Lacey has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9196054
Abstract: An improved method and a system are disclosed for recovering a three-dimensional (3D) scene structure from a plurality of two-dimensional (2D) image frames obtained from imaging means. Sets of 2D features are extracted from the image frames, and sets corresponding to successive image frames are matched, such that at least one pair of matched 2D features refers to a same 3D point in a 3D scene captured in 2D in the image frames. A 3D ray is generated by back-projection from each 2D feature, and the generated 3D rays are subjected to an anchor-based minimization process for determining camera motion parameters and 3D scene point coordinates, thereby recovering a structure of the 3D scene.
Type: Grant
Filed: November 11, 2011
Date of Patent: November 24, 2015
Assignee: The Provost, Fellows, Foundation Scholars, and Other Members of Board of the College of the Holy and Undivided Trinity of Queen Elizabeth Near Dublin
Inventors: Sofiane Yous, Peter Redmond, Gerard Lacey
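The back-projection step in the abstract can be illustrated with a standard pinhole-camera model. This is a generic sketch of back-projecting a pixel into a 3D ray, not the patented anchor-based minimization; the intrinsics `K` and pose `R`, `t` are hypothetical values chosen for the example:

```python
import numpy as np

def back_project(u, v, K, R, t):
    """Back-project a 2D pixel (u, v) into a 3D ray in world coordinates.

    K: 3x3 camera intrinsics; R, t: world-to-camera pose (x_cam = R @ x_world + t).
    Returns (origin, direction) with a unit-length direction.
    """
    # Camera centre in world coordinates: C = -R^T t
    origin = -R.T @ t
    # Pixel -> normalised ray in the camera frame, then rotate into the world frame
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    direction = R.T @ ray_cam
    return origin, direction / np.linalg.norm(direction)

# Camera at the origin with identity rotation: the principal point
# back-projects to a ray straight down the optical (+Z) axis.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
o, d = back_project(320, 240, K, np.eye(3), np.zeros(3))
```

Matching such rays across frames, then minimising their disagreement, is what yields the camera motion and scene structure.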
-
Patent number: 8924334
Abstract: A system (1) comprises a physical surgical simulator (11) which transmits data concerning physical movement of training devices to an analysis engine (12). The engine (12) automatically generates rules for a rule base (13a) in a learning system (13). The learning system (13) also comprises content objects (13b) and 3D scenario objects (13c). A linked set of a 3D scenario object (13c), a rule base (13a), and a content object (13b) together constitutes a lesson (10). Another simulator (14) is operated by a student and transmits data concerning the student's physical movement of training devices to a verification engine (15). The verification engine (15) interfaces with the rule base (13a) to display the lesson in the manner defined by the lesson rule base (13a). It calculates the performance measures defined in the lesson rule base (13a), records them in a lesson record (18), and adapts the display of the lesson in line with the parameters defined in the lesson rule base (13a).
Type: Grant
Filed: August 12, 2005
Date of Patent: December 30, 2014
Assignee: CAE Healthcare Inc.
Inventors: Gerard Lacey, Donncha Mary Ryan, Derek Cassidy, John Griffin, Laurence Griffin
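The lesson structure (a linked 3D scenario object, rule base, and content object, scored against performance measures) might be modelled as below. The `Rule` and `Lesson` classes, metric names, and thresholds are illustrative assumptions, not the patent's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    metric: str          # e.g. instrument path length, task duration
    threshold: float     # pass if the measured value <= threshold

@dataclass
class Lesson:
    scenario: str                      # stands in for a 3D scenario object
    content: str                       # stands in for a content object
    rule_base: list = field(default_factory=list)

    def score(self, measures):
        """Check each rule in the rule base against the student's measures."""
        return {r.metric: measures.get(r.metric, float("inf")) <= r.threshold
                for r in self.rule_base}

# Hypothetical lesson: pass on path length, fail on duration
lesson = Lesson("suturing_scene", "intro_video",
                [Rule("path_length", 2.5), Rule("duration_s", 120.0)])
result = lesson.score({"path_length": 2.1, "duration_s": 140.0})
```

A verification engine in this shape would record `result` into a lesson record and adapt the display accordingly.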
-
Publication number: 20140147032
Abstract: An improved method and a system are disclosed for recovering a three-dimensional (3D) scene structure from a plurality of two-dimensional (2D) image frames obtained from imaging means. Sets of 2D features are extracted from the image frames, and sets corresponding to successive image frames are matched, such that at least one pair of matched 2D features refers to a same 3D point in a 3D scene captured in 2D in the image frames. A 3D ray is generated by back-projection from each 2D feature, and the generated 3D rays are subjected to an anchor-based minimization process for determining camera motion parameters and 3D scene point coordinates, thereby recovering a structure of the 3D scene.
Type: Application
Filed: November 11, 2011
Publication date: May 29, 2014
Applicant: The Provost, Fellows, and Scholars of the College of the Holy and Undivided Trinity of Queen Elizabeth
Inventors: Sofiane Yous, Peter Redmond, Gerard Lacey
-
Patent number: 8090155
Abstract: A hand washing monitoring system (1) comprising a camera (2) and a processor (4), the processor being adapted to receive images of hand washing activity from the camera. The processor analyses the mutual motion of the hands to determine whether they move in desired poses and, if so, the durations of those poses, and generates a hand washing quality indication according to the analysis. The processor extracts information features from the images and generates feature vectors based on the features, including bimanual hand and arm shape vectors, and executes a classifier with the vectors to determine the poses. The processor uses edge segmentation and pixel spatio-temporal measurements to form at least some of the feature vectors.
Type: Grant
Filed: May 4, 2007
Date of Patent: January 3, 2012
Assignee: Provost Fellows and Scholars of the College of the Holy and Undivided Trinity of Queen Elizabeth Near Dublin
Inventors: Gerard Lacey, David Fernandez Llorca
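The classification step (feature vectors fed to a classifier to determine poses) can be sketched with a minimal nearest-centroid classifier. The 2-D toy features and pose names here are invented for illustration and are far simpler than the bimanual hand and arm shape vectors the patent describes:

```python
import numpy as np

def nearest_centroid_pose(feature_vec, centroids):
    """Assign a feature vector to the pose whose centroid is closest.

    centroids: dict mapping pose name -> mean feature vector for that pose.
    """
    dists = {pose: np.linalg.norm(feature_vec - c)
             for pose, c in centroids.items()}
    return min(dists, key=dists.get)

# Toy 2-D "shape" features for two hand-washing poses (illustrative numbers)
centroids = {"palm_to_palm": np.array([1.0, 0.0]),
             "fingers_interlaced": np.array([0.0, 1.0])}
pose = nearest_centroid_pose(np.array([0.9, 0.2]), centroids)
```

Timing how long the classified pose persists across frames would then give the per-pose durations that feed the quality indication.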
-
Publication number: 20110032347
Abstract: An endoscopy system (1) comprises an endoscope (2) with a camera (3) at its tip. The endoscope extends through an endoscope guide (4) for guiding movement of the endoscope and for measurement of its movement as it enters the body. The guide (4) comprises a generally conical body (5) having a through passage (105) through which the endoscope (2) extends. A motion sensor comprises an optical transmitter (7) and a detector (8) mounted alongside the passage (105) to measure the insertion-withdrawal linear motion of the endoscope and also its rotation by the endoscopist's hand. The system (1) also comprises a flexure controller (10) with wheels operated by the endoscopist. The camera (3), the motion sensor (7/8), and the flexure controller (10) are all connected to a processor (11) which feeds a display.
Type: Application
Filed: April 15, 2009
Publication date: February 10, 2011
Inventors: Gerard Lacey, Fernando Vilarino
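The motion sensor's two outputs map naturally onto insertion depth and shaft rotation. A rough sketch, assuming an optical sensor that reports displacement counts along the shaft axis and around its circumference (the conversion and all parameter values are hypothetical, not taken from the publication):

```python
import math

def endoscope_motion(dx_counts, dy_counts, counts_per_mm, shaft_diameter_mm):
    """Convert optical-sensor displacement counts into endoscope motion.

    dy (along the shaft axis)      -> insertion/withdrawal in mm
    dx (around the circumference)  -> rotation in degrees
    """
    insertion_mm = dy_counts / counts_per_mm
    circumference_mm = math.pi * shaft_diameter_mm
    rotation_deg = (dx_counts / counts_per_mm) / circumference_mm * 360.0
    return insertion_mm, rotation_deg

# 800 counts along the axis at 400 counts/mm -> 2 mm insertion, no rotation
ins, rot = endoscope_motion(dx_counts=0, dy_counts=800,
                            counts_per_mm=400, shaft_diameter_mm=12.8)
```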
-
Publication number: 20090087028
Abstract: A hand washing monitoring system (1) comprising a camera (2) and a processor (4), the processor being adapted to receive images of hand washing activity from the camera. The processor analyses the mutual motion of the hands to determine whether they move in desired poses and, if so, the durations of those poses, and generates a hand washing quality indication according to the analysis. The processor extracts information features from the images and generates feature vectors based on the features, including bimanual hand and arm shape vectors, and executes a classifier with the vectors to determine the poses. The processor uses edge segmentation and pixel spatio-temporal measurements to form at least some of the feature vectors.
Type: Application
Filed: May 4, 2007
Publication date: April 2, 2009
Inventors: Gerard Lacey, David Fernandez Llorca
-
Publication number: 20080147585
Abstract: A system (1) comprises a physical surgical simulator (11) which transmits data concerning physical movement of training devices to an analysis engine (12). The engine (12) automatically generates rules for a rule base (13a) in a learning system (13). The learning system (13) also comprises content objects (13b) and 3D scenario objects (13c). A linked set of a 3D scenario object (13c), a rule base (13a), and a content object (13b) together constitutes a lesson (10). Another simulator (14) is operated by a student and transmits data concerning the student's physical movement of training devices to a verification engine (15). The verification engine (15) interfaces with the rule base (13a) to display the lesson in the manner defined by the lesson rule base (13a). It calculates the performance measures defined in the lesson rule base (13a), records them in a lesson record (18), and adapts the display of the lesson in line with the parameters defined in the lesson rule base (13a).
Type: Application
Filed: August 12, 2005
Publication date: June 19, 2008
Applicant: HAPTICA LIMITED
Inventors: Gerard Lacey, Donncha Mary Ryan, Derek Cassidy, John Griffin, Laurence Griffin
-
Patent number: 6892134
Abstract: Movement of a vehicle such as a boat (30) is guided by dynamically monitoring parameters and generating instructions for the operator. The instructions are at a level to attain a number of sub-goals to reach a goal position (33) from a current position (32). Each sub-goal is attained in a single vehicle manoeuvre, such as straight-ahead or rotation, each manoeuvre being instructable to the operator. The instructions are generated using a space model (12) of the space around the vehicle, an operator model (11) of operator vehicle control characteristics, and a vehicle model (12) of vehicle movement characteristics.
Type: Grant
Filed: September 15, 2003
Date of Patent: May 10, 2005
Assignee: Haptica Limited
Inventors: Gerard Lacey, Shane MacNamara
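The decomposition into single-manoeuvre sub-goals (a rotation, then a straight-ahead run) can be sketched for the simplest case of one goal point. This ignores the space, operator, and vehicle models the patent relies on, and the pose representation is an assumption made for the example:

```python
import math

def manoeuvre_plan(current, goal):
    """Decompose motion from a current pose to a goal position into two
    instructable sub-goals: a rotation, then a straight-ahead run.

    current: (x, y, heading_deg); goal: (x, y).
    Returns [("rotate", signed_degrees), ("straight", distance)].
    """
    x, y, heading = current
    gx, gy = goal
    bearing = math.degrees(math.atan2(gy - y, gx - x))
    turn = (bearing - heading + 180.0) % 360.0 - 180.0  # shortest signed turn
    dist = math.hypot(gx - x, gy - y)
    return [("rotate", round(turn, 1)), ("straight", round(dist, 1))]

# Facing due "north" (90 deg) at the origin, goal at (10, 10):
# turn 45 deg clockwise, then run ~14.1 units straight ahead.
plan = manoeuvre_plan((0.0, 0.0, 90.0), (10.0, 10.0))
```

Chaining several such plans, one per sub-goal, yields the sequence of operator instructions the abstract describes.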
-
Publication number: 20050084833
Abstract: A simulator (1) has a body form apparatus (2) with a skin-like panel (4) through which laparoscopic instruments (5) are inserted. Cameras (10) capture video images of the internal movement of the instruments (5), and a computer (6) processes them. 3D positional data is generated using stereo triangulation and is linked with the associated video images. A graphics engine (60) uses the 3D data to generate graphical representations of internal scenes. A blending function (70) blends real and recorded images, or real and simulated images, to allow demonstration of effects such as internal bleeding or suturing.
Type: Application
Filed: November 9, 2004
Publication date: April 21, 2005
Inventors: Gerard Lacey, Derek Young, Derek Cassidy, Fiona Slevin, Donncha Ryan
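For a rectified camera pair, stereo triangulation of an instrument tip reduces to the classic depth-from-disparity relation Z = fB/d. A minimal sketch with hypothetical calibration values (the publication does not specify its camera geometry):

```python
def stereo_depth(f_px, baseline_mm, x_left_px, x_right_px):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return f_px * baseline_mm / disparity

# f = 700 px, baseline = 60 mm, disparity = 14 px -> depth of 3000 mm
z = stereo_depth(700.0, 60.0, 364.0, 350.0)
```

Repeating this for each matched instrument marker gives the 3D positional data that the graphics engine links with the video images.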
-
Publication number: 20040064249
Abstract: Movement of a vehicle such as a boat (30) is guided by dynamically monitoring parameters and generating instructions for the operator. The instructions are at a level to attain a number of sub-goals to reach a goal position (33) from a current position (32). Each sub-goal is attained in a single vehicle manoeuvre, such as straight-ahead or rotation, each manoeuvre being instructable to the operator. The instructions are generated using a space model (12) of the space around the vehicle, an operator model (11) of operator vehicle control characteristics, and a vehicle model (12) of vehicle movement characteristics.
Type: Application
Filed: September 15, 2003
Publication date: April 1, 2004
Inventors: Gerard Lacey, Shane MacNamara