Patents by Inventor Gilles Pinault
Gilles Pinault has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240023821
Abstract: Described herein is an apparatus (a flexible pulsation sensor (FPS) device) that provides wireless monitoring capability. The FPS device includes an FPS configured to wrap around a measurement target of a conduit, such as a synthetic vascular graft or a vessel of a patient. The FPS device also includes a flexible circuit board fitting including circuit elements. The circuit elements can include a pressure sensor that collects data related to displacement of the FPS related to a pressure of and/or within the measurement target; and a wireless transmitter that transmits the data related to the pressure of and/or within the measurement target wirelessly to an external device.
Type: Application
Filed: September 14, 2021
Publication date: January 25, 2024
Inventors: Steve J.A. MAJERUS, Christian ZORMAN, Hao CHONG, Jonathan BASKIN, Dustin J. TYLER, Gilles PINAULT
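The abstract above describes a sensor-to-radio data path: displacement of the wrapped FPS is read as pressure and transmitted wirelessly. A minimal sketch of that path follows; the calibration constants, packet format, and function names are all invented for illustration and are not taken from the patent.

```python
import struct

# Hypothetical linear calibration from wrap displacement to pressure
# (constants invented for this sketch, not from the patent).
GAIN_MMHG_PER_UM = 0.85
OFFSET_MMHG = 4.0

def displacement_to_pressure(displacement_um):
    """Map sensed FPS displacement (micrometres) to pressure (mmHg)."""
    return GAIN_MMHG_PER_UM * displacement_um + OFFSET_MMHG

def pack_sample(timestamp_ms, pressure_mmhg):
    """Serialise one sample as a compact little-endian packet for a radio link."""
    return struct.pack("<IH", timestamp_ms, int(round(pressure_mmhg * 10)))

# 100 um of displacement -> 89.0 mmHg under the assumed calibration
packet = pack_sample(20, displacement_to_pressure(100.0))
```

A real device would of course use its own measured calibration curve and the framing required by its wireless stack; the point here is only the displacement-to-pressure-to-packet flow.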
-
Patent number: 11513601
Abstract: Described herein is a method for enabling human-to-computer three-dimensional hand gesture-based natural interactions from depth images provided by a range finding imaging system. The method enables recognition of simultaneous gestures from detection, tracking and analysis of singular points of interest on a single hand of a user, and provides contextual feedback information to the user. The singular points of interest of the hand include hand tip(s), fingertip(s), palm center and center of mass of the hand, and are used for defining at least one representation of a pointer. The point(s) of interest is/are tracked over time and analyzed to enable the determination of sequential and/or simultaneous “pointing” and “activation” gestures performed by a single hand.
Type: Grant
Filed: December 15, 2016
Date of Patent: November 29, 2022
Assignee: Sony Depthsensing Solutions SA/NV
Inventors: Gilles Pinault, Utku Salihoglu, Carlos Augusto Pinheiro de Sousa, Julien Thollot, Laurent Guigues
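The abstract above builds the interaction on singular points of interest such as the palm centre (centre of mass) and hand/finger tips. A toy sketch of how such points might be derived from a hand point cloud is given below; the real detection and tracking in the patent are far more involved, and the sample points are invented.

```python
import math

def centroid(points):
    """Centre of mass of the hand point cloud — a crude proxy for the palm centre."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def farthest_point(points, origin):
    """Point farthest from the palm centre — a crude hand-tip/fingertip candidate."""
    return max(points, key=lambda p: math.dist(p, origin))

# Toy (x, y, z) depth points for one hand; real input would come from a range camera.
hand = [(0, 0, 50), (1, 0, 50), (0, 1, 50), (5, 5, 48)]
palm = centroid(hand)
tip = farthest_point(hand, palm)
```

Tracking these points frame-over-frame, as the abstract describes, is what allows “pointing” and “activation” gestures to be read off their trajectories.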
-
Patent number: 9864433
Abstract: Described herein is a method for enabling human-to-computer three-dimensional hand gesture-based natural interactions from depth images provided by a range finding imaging system. The method enables recognition of simultaneous gestures from detection, tracking and analysis of singular points of interest on a single hand of a user, and provides contextual feedback information to the user. The singular points of interest of the hand include hand tip(s), fingertip(s), palm center and center of mass of the hand, and are used for defining at least one representation of a pointer. The point(s) of interest is/are tracked over time and analyzed to enable the determination of sequential and/or simultaneous “pointing” and “activation” gestures performed by a single hand.
Type: Grant
Filed: July 12, 2013
Date of Patent: January 9, 2018
Assignee: SoftKinetic Software
Inventors: Gilles Pinault, Utku Salihoglu, Carlos Souza, Julien Thollot, Laurent Guigues
-
Publication number: 20170097687
Abstract: Described herein is a method for enabling human-to-computer three-dimensional hand gesture-based natural interactions from depth images provided by a range finding imaging system. The method enables recognition of simultaneous gestures from detection, tracking and analysis of singular points of interest on a single hand of a user, and provides contextual feedback information to the user. The singular points of interest of the hand include hand tip(s), fingertip(s), palm centre and centre of mass of the hand, and are used for defining at least one representation of a pointer. The point(s) of interest is/are tracked over time and analysed to enable the determination of sequential and/or simultaneous “pointing” and “activation” gestures performed by a single hand.
Type: Application
Filed: December 15, 2016
Publication date: April 6, 2017
Inventors: Gilles PINAULT, Utku SALIHOGLU, Carlos SOUZA, Julien THOLLOT, Laurent GUIGUES
-
Patent number: 9360944
Abstract: Described herein is a wireless remote control device (100) which can be used on the hand (150) of a user to provide hardware-based control signals which can be associated with gesture-based control signals for enhanced gesture recognition systems. The device (100) comprises a housing (110) having a sensing unit with at least one control button (120, 130, 140) which is capable of generating a control signal for an associated computerized system. A computerized system utilizes information obtained from the control device (100) together with information obtained from a gesture recognition system to resolve any ambiguities due to, for example, occlusion of the hand performing the gesture or the hand being outside the field of view of an imaging system associated with the computerized system, and to trigger interactions within the gesture-based interaction system.
Type: Grant
Filed: January 9, 2013
Date of Patent: June 7, 2016
Assignee: Softkinetic Software
Inventor: Gilles Pinault
-
Patent number: 9081419
Abstract: Described herein is a user interface that provides contextual feedback, controls and interface elements on a display screen of an interactive three-dimensional imaging system. A user interacts with the interface to provide control signals in accordance with those recognized by the system; the interface makes use of at least one POI (point of interest) in a three-dimensional scene that is imaged by the imaging system to provide control signals for the user interface. Control signals are provided by means of gestures which are analyzed in real-time by gesture recognition processes that analyze statistical and geometrical properties of POI motion and trajectories.
Type: Grant
Filed: June 28, 2013
Date of Patent: July 14, 2015
Assignee: Softkinetic Software
Inventors: Juan Carlos Tocino Diaz, Kevin Simons, Gilles Pinault, Xavier Baele, Julien Thollot, David Dal Zot
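The abstract above classifies gestures from statistical and geometrical properties of a POI's motion and trajectory. The sketch below illustrates the general idea with a deliberately crude classifier; the labels, thresholds, and feature choices are invented for the example and are not the patent's method.

```python
import statistics

def classify_trajectory(traj):
    """Crude gesture label from statistical/geometrical trajectory properties.
    traj: list of (x, y) POI positions sampled over time (units arbitrary)."""
    xs = [p[0] for p in traj]
    ys = [p[1] for p in traj]
    spread = statistics.pstdev(xs) + statistics.pstdev(ys)  # how much the POI wanders
    net_dx = xs[-1] - xs[0]                                 # net horizontal travel
    if spread < 1.0:
        return "hold"    # POI nearly static: a steady pointing pose
    if abs(net_dx) > 10.0:
        return "swipe"   # large net horizontal displacement
    return "unknown"
```

A production system would run richer features (velocity, curvature, duration) in real time over a sliding window, but the principle — map trajectory statistics to gesture labels — is the same.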
-
Publication number: 20150153833
Abstract: Described herein is a method for enabling human-to-computer three-dimensional hand gesture-based natural interactions from depth images provided by a range finding imaging system. The method enables recognition of simultaneous gestures from detection, tracking and analysis of singular points of interest on a single hand of a user, and provides contextual feedback information to the user. The singular points of interest of the hand include hand tip(s), fingertip(s), palm centre and centre of mass of the hand, and are used for defining at least one representation of a pointer. The point(s) of interest is/are tracked over time and analysed to enable the determination of sequential and/or simultaneous “pointing” and “activation” gestures performed by a single hand.
Type: Application
Filed: July 12, 2013
Publication date: June 4, 2015
Applicant: SOFTKINETIC SOFTWARE
Inventors: Gilles Pinault, Utku Salihoglu, Carlos Souza, Julien Thollot, Laurent Guigues
-
Publication number: 20140368428
Abstract: Described herein is a wireless remote control device (100) which can be used on the hand (150) of a user to provide hardware-based control signals which can be associated with gesture-based control signals for enhanced gesture recognition systems. The device (100) comprises a housing (110) having a sensing unit with at least one control button (120, 130, 140) which is capable of generating a control signal for an associated computerised system. A computerised system utilises information obtained from the control device (100) together with information obtained from a gesture recognition system to resolve any ambiguities due to, for example, occlusion of the hand performing the gesture or the hand being outside the field of view of an imaging system associated with the computerised system, and to trigger interactions within the gesture-based interaction system.
Type: Application
Filed: January 9, 2013
Publication date: December 18, 2014
Applicant: Softkinetic Software
Inventor: Gilles Pinault
-
Publication number: 20140073423
Abstract: A videogame may include providing body position challenges to one or more players to assume various target body positions at, or within, a predetermined time. Body position challenges may be provided to the one or more players by displaying body position challenge representations that prompt the one or more players to assume body positions and/or execute one or more body movements or gestures corresponding to the body position challenges. The time may be indicated via various mechanisms associated with the display. At a given time or over a given time interval, the videogame system may compare the player's detected body position(s) and/or movement(s) to the body position challenges to determine a degree of match or mismatch based on various criteria. Based on the results of the comparison, the game may generate a score or other feedback for display.
Type: Application
Filed: August 21, 2013
Publication date: March 13, 2014
Inventors: Eric Krzeslo, Gilles Pinault
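The abstract above scores how well a player's detected body position matches a target pose. One simple way to realise such a comparison — a hedged sketch only, with joint names, coordinates, and the tolerance all invented for the example — is to count how many tracked joints land close enough to their targets.

```python
import math

def pose_score(detected, target, tolerance=0.25):
    """Fraction of target joints whose detected position lies within `tolerance`.
    detected/target: dicts mapping joint name -> (x, y) in normalised screen units."""
    hits = sum(
        1 for joint, goal in target.items()
        if joint in detected and math.dist(detected[joint], goal) <= tolerance
    )
    return hits / len(target)

# One matched joint out of two -> score 0.5
target = {"l_hand": (0.2, 0.9), "r_hand": (0.8, 0.9)}
detected = {"l_hand": (0.22, 0.88), "r_hand": (0.5, 0.5)}
```

Evaluating such a score at the challenge deadline, or continuously over the challenge interval, yields the match/mismatch feedback the abstract describes.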
-
Publication number: 20140007022
Abstract: Described herein is a user interface that provides contextual feedback, controls and interface elements on a display screen of an interactive three-dimensional imaging system. A user interacts with the interface to provide control signals in accordance with those recognised by the system; the interface makes use of at least one POI (point of interest) in a three-dimensional scene that is imaged by the imaging system to provide control signals for the user interface. Control signals are provided by means of gestures which are analysed in real-time by gesture recognition processes that analyse statistical and geometrical properties of POI motion and trajectories.
Type: Application
Filed: June 28, 2013
Publication date: January 2, 2014
Applicant: Softkinetic Software
Inventors: JUAN CARLOS TOCINO DIAZ, Kevin Simons, Gilles Pinault, Xavier Baele, David Dal Zot
-
Patent number: 8517834
Abstract: A videogame may include providing body position challenges to one or more players to assume various target body positions at, or within, a predetermined time. Body position challenges may be provided to the one or more players by displaying body position challenge representations that prompt the one or more players to assume body positions and/or execute one or more body movements or gestures corresponding to the body position challenges. The time may be indicated via various mechanisms associated with the display. At a given time or over a given time interval, the videogame system may compare the player's detected body position(s) and/or movement(s) to the body position challenges to determine a degree of match or mismatch based on various criteria. Based on the results of the comparison, the game may generate a score or other feedback for display.
Type: Grant
Filed: February 17, 2009
Date of Patent: August 27, 2013
Assignee: Softkinetic Studios SA
Inventors: Eric Krzeslo, Gilles Pinault
-
Patent number: 8175374
Abstract: The present invention relates to a volume recognition method comprising the steps of: a) capturing three-dimensional image data using a 3D imaging system 3, wherein said image data represent a plurality of points 5, each point 5 having at least a set of coordinates in a three-dimensional space; b) grouping at least some of the points 5 in a set of clusters 6; c) selecting, according to a first set of parameters such as position and size, a cluster 6 corresponding to an object of interest 1 located in range of said imaging system 3; d) grouping at least some of the points 5 of the selected cluster 6 in a set of sub-clusters according to a second set of parameters comprising their positions in the three-dimensional space, wherein each sub-cluster has a centroid 11 in the three-dimensional space; and e) associating a volume 12 to each of at least some of said sub-clusters, wherein said volume 12 is fixed to the centroid 11 of said sub-cluster.
Type: Grant
Filed: April 20, 2007
Date of Patent: May 8, 2012
Assignee: Softkinetic Software
Inventors: Gilles Pinault, Jeremie Roy, Laurent Desmecht, Xavier Baele
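Steps b) through e) of the abstract group 3-D points into sub-clusters and attach a volume to each sub-cluster centroid. The following toy k-means routine illustrates the sub-clustering and centroid computation; it is an illustrative stand-in under invented data, not the clustering algorithm the patent actually claims.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means: partition 3-D points into k sub-clusters, return their centroids.
    (Illustrative stand-in for the patent's sub-clustering; parameters arbitrary.)"""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest current centroid.
        buckets = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            buckets[nearest].append(p)
        # Recompute each centroid as the mean of its bucket (keep old one if empty).
        centroids = [
            tuple(sum(c) / len(b) for c in zip(*b)) if b else centroids[i]
            for i, b in enumerate(buckets)
        ]
    return centroids

# Two well-separated blobs of depth points: the sub-cluster centroids land on each blob,
# and a fixed volume (e.g. a sphere) could then be attached to each centroid, as in step e).
cloud = [(0, 0, 0), (0, 0, 1), (0, 1, 0), (10, 10, 10), (10, 10, 11), (10, 11, 10)]
centres = sorted(kmeans(cloud, 2), key=lambda c: c[0])
```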
-
Publication number: 20100208035
Abstract: The present invention relates to a volume recognition method comprising the steps of: a) capturing three-dimensional image data using a 3D imaging system 3, wherein said image data represent a plurality of points 5, each point 5 having at least a set of coordinates in a three-dimensional space; b) grouping at least some of the points 5 in a set of clusters 6; c) selecting, according to a first set of parameters such as position and size, a cluster 6 corresponding to an object of interest 1 located in range of said imaging system 3; d) grouping at least some of the points 5 of the selected cluster 6 in a set of sub-clusters according to a second set of parameters comprising their positions in the three-dimensional space, wherein each sub-cluster has a centroid 11 in the three-dimensional space; and e) associating a volume 12 to each of at least some of said sub-clusters, wherein said volume 12 is fixed to the centroid 11 of said sub-cluster.
Type: Application
Filed: April 20, 2007
Publication date: August 19, 2010
Applicant: SOFTKINETIC S.A.
Inventors: Gilles Pinault, Jeremie Roy, Laurent Desmecht, Xavier Baele
-
Publication number: 20100210359
Abstract: A videogame may include providing body position challenges to one or more players to assume various target body positions at, or within, a predetermined time. Body position challenges may be provided to the one or more players by displaying body position challenge representations that prompt the one or more players to assume body positions and/or execute one or more body movements or gestures corresponding to the body position challenges. The time may be indicated via various mechanisms associated with the display. At a given time or over a given time interval, the videogame system may compare the player's detected body position(s) and/or movement(s) to the body position challenges to determine a degree of match or mismatch based on various criteria. Based on the results of the comparison, the game may generate a score or other feedback for display.
Type: Application
Filed: February 17, 2009
Publication date: August 19, 2010
Inventors: Eric Krzeslo, Gilles Pinault